Role overview:
We are seeking a Senior Data Engineer with expertise in Snowflake, DBT, and cloud data engineering to design, build, and optimize scalable data pipelines across Roche's data ecosystem. This role will focus on data integration, transformation, automation, and performance optimization to enable high-quality insights and seamless access to data across commercial and foundational platforms.
Key responsibilities:
- Design, develop, and optimize ETL/ELT data pipelines leveraging DBT, Python, and Snowflake.
- Implement best practices for data modeling, partitioning, and performance tuning in Snowflake.
- Ensure data quality, security, and governance by implementing masking, row-level security, and audit controls.
- Automate data workflows and orchestration using Airflow, Prefect, or similar tools.
- Work closely with data architects, engineers, and business teams to define and implement scalable data solutions.
- Optimize and manage cloud-based data integration across AWS, Snowflake, and the broader Roche ecosystem.
Required skills & experience:
- 8+ years of experience in data engineering, cloud platforms, and large-scale data processing.
- Strong expertise in Snowflake, DBT, Python, and SQL.
- Experience with data lake architectures, batch and streaming data processing, and cloud-native services.
- Knowledge of data governance frameworks, access control, and security best practices.
- Strong analytical and problem-solving skills with a business-driven mindset.
- Experience working in agile, cross-functional teams to deliver high-impact data solutions.