Working as a Senior Data Engineer with Snowflake, you will:
- Design and implement modern data warehousing and analytics solutions using Snowflake.
- Develop ELT/ETL pipelines to ingest and process structured and semi-structured data.
- Optimize Snowflake warehouse performance, manage scaling, and monitor resource usage.
- Implement data models and schemas aligned with business intelligence and analytics needs.
- Collaborate closely with Data Architects, BI Developers, and Analysts.
- Apply best practices for version control, CI/CD, testing, and documentation in data engineering.
- Build data integrations using tools like DBT, Airflow, or similar orchestration platforms.
- Enforce data quality, security, governance, and privacy standards throughout data pipelines.
About Chabre IT Services
Chabre IT Services is a global professional IT services provider, building long-lasting relationships with enterprises. We specialize in the delivery of tailor-made solutions, smart outsourcing, try & hire, and success-fee services. We are a smart IT boutique with unique expertise, turning your ideas into reality.
About our Client
Our client is a global technology and management consultancy specializing in driving digital transformation in the financial services industry. It operates at the intersection of business and technology by combining innovative thinking with unrivalled industry knowledge to deliver end-to-end data-driven solutions and fast-track digital initiatives for banking and payments, capital markets, wealth and asset management, insurance, and the energy sector.
Qualifications
- Have at least 3 years of experience in data engineering, with 2+ Snowflake implementations in production.
- Demonstrate strong proficiency in Snowflake-specific features: virtual warehouses, streams, tasks, Time Travel, and zero-copy cloning.
- Be fluent in SQL and scripting languages such as Python or JavaScript for data workflows.
- Know how to design efficient data warehouse models (e.g., star/snowflake schemas).
- Have experience with data orchestration tools (Airflow, DBT, Prefect, etc.).
- Understand cloud platforms (preferably AWS, Azure, or GCP) and how Snowflake integrates with them.
- Be familiar with DevOps/DataOps principles, Git, and CI/CD for data pipelines.
- Apply best practices for data governance, including role-based access control and metadata management.
- Communicate fluently in English.
We offer
- Rate up to 220,00 PLN/h + VAT
- Remote work
- Subsidy for peripherals in the amount of 500,00 zł
- Working tool (MacBook Pro or Lenovo Legion 5)
- Co-financing of courses related to the position
- Benefits: MultiSport, Medicover