We are seeking a Middle Data Engineer with proven expertise in AWS, Snowflake, and dbt to design and build scalable data pipelines and modern data infrastructure. You'll play a key role in shaping the data ecosystem, ensuring data availability, quality, and performance across business units.
Requirements:
- 4+ years of experience in Data Engineering roles.
- Experience with the AWS cloud platform.
- Proven experience with Snowflake in production environments.
- Hands-on experience building data pipelines using dbt.
- Python skills for data processing and orchestration.
- Deep understanding of data modeling and ELT best practices.
- Experience with CI/CD and version control systems (e.g., Git).
- Strong communication and collaboration skills.
Must-Have:
- Strong experience with Snowflake (e.g., performance tuning, storage layers, cost management).
- Production-level proficiency with dbt (modular development, testing, deployment).
- Experience developing Python data pipelines.
- Proficiency in SQL (analytical queries, performance optimization).
Nice-to-Have:
- Experience with orchestration tools like Airflow, Prefect, or Dagster.
- Familiarity with other cloud platforms (e.g., GCP or Azure).
- Knowledge of data governance, lineage, and catalog tools.
- Experience working in Agile teams and with CI/CD deployment pipelines.
- Exposure to BI tools like Tableau or Power BI.
Responsibilities:
- Design and build scalable data pipelines using Snowflake and dbt.
- Develop and maintain modern data infrastructure to support business needs.
- Ensure data availability, quality, and performance across various business units.
- Contribute to shaping the overall data ecosystem within the organization.
- Collaborate with cross-functional teams to support data-driven decision making.
Tech stack: Python, AWS, Snowflake, dbt, SQL, Airflow, GCP, Azure, CI/CD.
Additionally:
- International projects.
- Compensation for certifications.
- Flexible working hours and remote work possibility.
- Active tech community.
- Free coffee.
- Modern office.