Snowflake Data Engineer with ELT / ETL workflows @ Square One Resources


Square One Resources • Remote, Poland
Job description

Project Description

We are looking for an experienced Snowflake Data Engineer to join a data-driven project focused on building and optimizing a robust cloud-based data infrastructure. This role involves designing and developing scalable data pipelines and data warehouse solutions within Snowflake, with a strong emphasis on performance, scalability, and reliability. The position offers the opportunity to work in a cross-functional environment with data analysts, data scientists, and other engineering teams to support advanced analytics and reporting use cases.

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or a related field
  • Minimum 4 years of experience in data engineering or data platform development
  • Proven hands-on expertise in Snowflake, including performance tuning, data modeling, and advanced SQL
  • Proficiency in SQL and scripting languages such as Python or Scala
  • Experience with ETL/ELT frameworks and orchestration tools (e.g., dbt, Airflow, Talend)
  • Familiarity with cloud platforms such as AWS, Azure, or GCP
  • Strong understanding of data warehousing concepts, star/snowflake schemas, and data normalization/denormalization
  • Experience working in Agile environments with tools like Jira and Confluence

Nice to Have

  • Snowflake certifications (e.g., SnowPro Core, SnowPro Advanced)
  • Understanding of data governance and privacy regulations such as GDPR and HIPAA
  • Exposure to machine learning workflows and streaming technologies like Kafka or Kinesis
  • Experience with CI/CD practices, version control systems (Git), and infrastructure-as-code tools like Terraform

Responsibilities

  • Design, implement, and optimize Snowflake data pipelines to meet analytical and business intelligence needs
  • Develop and maintain ELT/ETL workflows using tools such as dbt, Apache Airflow, Matillion, or equivalent
  • Model and manage data warehouse and data lake solutions with an emphasis on dimensional modeling and data partitioning best practices
  • Build and maintain secure, governed data environments, including the enforcement of access control and role-based policies
  • Monitor and optimize pipeline performance, ensuring cost-efficiency and high availability
  • Collaborate with cross-functional teams to gather requirements and translate them into scalable data solutions
  • Integrate Snowflake with external data sources such as AWS S3, Azure Data Lake, Kafka, and REST APIs
  • Troubleshoot data pipeline issues, ensuring data quality, lineage, and consistency
  • Automate and manage CI/CD pipelines for deploying and maintaining data infrastructure
  • Stay current with Snowflake features and industry best practices in modern data engineering
