Data Engineer • Remote, Poland

Responsibilities:
- Design, build, and maintain robust data pipelines and ETL/ELT processes.
- Work with Python (especially Pandas) and SQL to transform and process structured and unstructured data.
- Use dbt and GitLab for data transformations, version control, and CI/CD workflows.
- Collaborate with analysts, data scientists, and engineers to deliver clean, high-quality data.
- Ensure best practices in data architecture, documentation, and metadata management.
- Work with Snowflake for scalable data warehousing (warehouses, schemas, access control).

Nice to Have:
- Maintain workflows using orchestration tools such as Airflow, Glue, or Dataflow.

Requirements: Python, SQL, dbt, Cloud

Additionally: Sport subscription, Training budget, Private healthcare, International projects.