TECHTORCH POLAND sp. z o.o., Warszawa, Masovian, Poland
Posted 24 days ago
Job description
Expected technologies:
Azure SQL
Snowflake Data Cloud
Responsibilities:
Design and implement data integrations between internal systems, client environments, and third-party platforms (e.g., ERP, CRM, cloud platforms).
Develop robust, secure, and scalable APIs and connectors for real-time and batch data exchange.
Ensure end-to-end monitoring and reliability of integration flows.
Build and maintain scalable, resilient data pipelines (ETL/ELT) to ingest, clean, transform, and serve data for analytics and machine learning.
Work with structured and unstructured data from various sources, including cloud services, APIs, databases, and flat files.
Design and implement data models that support both operational needs and analytical use cases.
Collaborate with data analysts and scientists to prepare data in usable formats for BI, AI, and ML tools.
Use and optimize modern data stack tools (e.g., Airflow, dbt, Databricks, Fivetran) and work with cloud platforms (AWS, GCP, or Azure).
Develop and maintain CI/CD pipelines for data and integration services.
Ensure data quality, consistency, and lineage across integrated systems.
Implement data security standards (encryption, masking, access controls) and comply with data privacy regulations.
Partner with product managers, business analysts, and software engineers to understand integration requirements and deliver scalable solutions.
Support stakeholders by building documentation and providing visibility into integration health and pipeline performance.
Requirements:
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
4–8 years of experience as a data engineer, integration engineer, or full-stack engineer with a focus on data.
Proven experience with APIs, integration patterns (REST, SOAP, Webhooks), and data streaming technologies (Kafka, Pub/Sub).
Strong skills in Python or Java/Scala; SQL expertise is required.
Hands-on experience with cloud data platforms (Snowflake, BigQuery, Redshift) and cloud infrastructure (AWS, GCP, Azure).
Proficiency with modern ETL/ELT tools (e.g., Apache Airflow, dbt, Fivetran, Informatica, Talend).
Experience working with both batch and real-time data architectures.
Familiarity with containerization (Docker), orchestration tools (Kubernetes), and CI/CD for data.
Experience in enterprise environments (ERP, CRM, SaaS integrations) is a strong plus.
We offer:
At TechTorch, you'll be at the heart of enterprise innovation. Our work is fast-paced, impactful, and technology-first. You’ll collaborate with smart, driven teammates to create solutions that shape how Private Equity-backed businesses operate. If you're excited by integrations, data engineering, and building systems that scale — this is your place.