SQUARE ONE RESOURCES sp. z o.o.
Warszawa, Masovian, Poland
30+ days ago
Job description
technologies-expected:
Snowflake Data Cloud
Apache Airflow
Kafka
Python
Talend
about-project:
We are looking for an experienced Snowflake Data Engineer to join a data-driven project focused on building and optimizing a robust cloud-based data infrastructure. This role involves designing and developing scalable data pipelines and data warehouse solutions within Snowflake, with a strong emphasis on performance, scalability, and reliability. The position offers the opportunity to work in a cross-functional environment with data analysts, data scientists, and other engineering teams to support advanced analytics and reporting use cases.
responsibilities:
Design, implement, and optimize Snowflake data pipelines to meet analytical and business intelligence needs
Develop and maintain ELT/ETL workflows using tools such as dbt, Apache Airflow, Matillion, or equivalent
Model and manage data warehouse and data lake solutions with an emphasis on dimensional modeling and data partitioning best practices
Build and maintain secure, governed data environments, including the enforcement of access control and role-based policies
Monitor and optimize pipeline performance, ensuring cost-efficiency and high availability
Collaborate with cross-functional teams to gather requirements and translate them into scalable data solutions
Integrate Snowflake with external data sources such as AWS S3, Azure Data Lake, Kafka, and REST APIs
Troubleshoot data pipeline issues, ensuring data quality, lineage, and consistency
Automate and manage CI/CD pipelines for deploying and maintaining data infrastructure
Stay current with Snowflake features and industry best practices in modern data engineering
requirements-expected:
Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or a related field
Minimum 4 years of experience in data engineering or data platform development
Proven hands-on expertise in Snowflake, including performance tuning, data modeling, and advanced SQL
Proficiency in SQL and scripting languages such as Python or Scala
Experience with ETL/ELT frameworks and orchestration tools (e.g., dbt, Airflow, Talend)
Familiarity with cloud platforms such as AWS, Azure, or GCP
Strong understanding of data warehousing concepts, star/snowflake schemas, and data normalization/denormalization
Experience working in Agile environments with tools like Jira and Confluence