Start date: ASAP / 1 month / flexible
Duration: long term (36 months, with further extensions)
Work model: hybrid, min. 2 days per week from the Wrocław office
Type of cooperation: B2B
Overview
We are looking for a skilled Data Engineer to join a team focused on transforming industrial data into valuable insights. The team processes data from machines and factories to deliver customized data products to clients worldwide. The goal is to help organizations become fully data-driven by unlocking the potential of their data. You’ll work with diverse data sources and technologies in a dynamic and collaborative environment.
Responsibilities
- Design, develop, and maintain data pipelines using Azure Databricks and Informatica Power Center
- Work with large-scale datasets using PySpark and Spark SQL
- Collaborate with stakeholders from both IT and business to understand data needs and deliver solutions
- Implement and manage data workflows, jobs, and cataloging using Unity Catalog
- Ensure data quality, security, and governance using Azure Key Vault and DevOps practices
- Contribute to continuous improvement of data engineering practices and tools
Requirements
- Proficiency in Informatica Power Center
- Strong experience with Azure Databricks (PySpark, Spark SQL, Unity Catalog, Jobs/Workflows)
- Advanced SQL skills
- Hands-on experience with at least one relational DBMS (SQL Server, Oracle, or PostgreSQL)
- Familiarity with Azure DevOps (Repos, Pipelines, YAML)
- Knowledge of Azure Key Vault
- Optional: experience with Azure Data Factory and DBT
- Soft skills: open-minded, engaged, flexible, proactive, and collaborative
- Bonus: experience working in a data mesh environment
We offer
- B2B contract via Experis
- Access to Medicover healthcare
- Multisport card
- E-learning platform for continuous development
- Group insurance