Data Engineer
Location: Warszawa
Technologies we use
Expected
- Azure SQL
- Snowflake Data Cloud
Your responsibilities
- Design and implement data integrations between internal systems, client environments, and third-party platforms (e.g., ERP, CRM, cloud platforms).
- Develop robust, secure, and scalable APIs and connectors for real-time and batch data exchange.
- Ensure end-to-end monitoring and reliability of integration flows.
- Build and maintain scalable, resilient data pipelines (ETL/ELT) to ingest, clean, transform, and serve data for analytics and machine learning.
- Work with structured and unstructured data from various sources, including cloud services, APIs, databases, and flat files.
- Design and implement data models that support both operational needs and analytical use cases.
- Collaborate with data analysts and scientists to prepare data in usable formats for BI, AI, and ML tools.
- Use and optimize modern data stack tools (e.g., Airflow, dbt, Databricks, Fivetran) and work with cloud platforms (AWS, GCP, or Azure).
- Develop and maintain CI/CD pipelines for data and integration services.
- Ensure data quality, consistency, and lineage across integrated systems.
- Implement data security standards (encryption, masking, access controls) and comply with data privacy regulations.
- Partner with product managers, business analysts, and software engineers to understand integration requirements and deliver scalable solutions.
- Support stakeholders by building documentation and providing visibility into integration health and pipeline performance.
Our requirements
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 4–8 years of experience as a data engineer, integration engineer, or full-stack engineer with a focus on data.
- Proven experience with APIs, integration patterns (REST, SOAP, Webhooks), and data streaming technologies (Kafka, Pub/Sub).
- Strong skills in Python or Java/Scala; SQL expertise is required.
- Hands-on experience with cloud data platforms (Snowflake, BigQuery, Redshift) and cloud infrastructure (AWS, GCP, Azure).
- Proficiency with modern ETL/ELT tools (e.g., Apache Airflow, dbt, Fivetran, Informatica, Talend).
- Experience working with both batch and real-time data architectures.
- Familiarity with containerization (Docker), orchestration tools (Kubernetes), and CI/CD for data.
- Experience in enterprise environments (ERP, CRM, SaaS integrations) is a strong plus.
Optional
- Integration-first mindset with a solid understanding of cross-system data flows.
- Strong communication and collaboration skills across technical and non-technical teams.
- Ability to break down complex data integration challenges into manageable components.
- Detail-oriented, with high standards for data quality and governance.
- Curious, self-driven, and comfortable working in a fast-paced environment.
What we offer
At TechTorch, you’ll be at the heart of enterprise innovation. Our work is fast-paced, impactful, and technology-first. You’ll collaborate with smart, driven teammates to create solutions that shape how Private Equity-backed businesses operate. If you’re excited by integrations, data engineering, and building systems that scale, this is your place.
About Us
TechTorch is a leader in delivering innovative Enterprise Technology solutions, leveraging AI-powered accelerators to drive business success for Private Equity-backed companies. Our team of experts disrupts the system integration space with cutting-edge data and AI solutions. We are currently seeking a Full Stack Data Engineer with a strong focus on data integrations to support the development and scaling of our proprietary accelerators.