Data Engineer

Devapo, Wrocław, Poland
More than 30 days ago
Job description

We are looking for an experienced Data Engineer responsible for planning, developing, and maintaining cloud environments for our clients.

About Devapo

At Devapo, we focus on continuous self-development and acquiring new knowledge. If you are a fast learner, want to participate in international projects, are a team player, and can work independently — join us!

We provide our clients with more than just code — we want to equip them with tools that allow their businesses to flourish. Our clients’ success is our success, which is why we ensure that everyone who creates Devapo has a long-term goal in mind.

Key Responsibilities:

Design, implement, and maintain scalable and efficient data pipelines in one of the cloud environments (Azure, AWS, GCP) using tools such as Databricks, Glue, Dataflow, or Azure Data Factory

Develop and optimize ETL/ELT processes using cloud-native services (e.g., Azure Data Factory, AWS Glue, GCP Dataflow) and Apache Spark/Databricks

Build Big Data solutions aligned with business and analytical requirements across cloud platforms

Collaborate with Data Science, BI, and development teams to deliver high-quality, well-structured, and performant data

Monitor and improve the performance, reliability, and scalability of data processing systems

Implement robust data governance, security standards, and best practices across cloud environments

Research and evaluate new tools and technologies within the cloud and data engineering ecosystem

Requirements:

Minimum 3 years of experience as a Data Engineer or in a similar role

Hands-on experience with one or more major cloud platforms (Azure, AWS, GCP); deep knowledge of cloud data services such as:

Azure Data Factory, Azure Data Lake, Synapse Analytics (Azure)

AWS Glue, S3, Redshift, Athena (AWS)

GCP Dataflow, BigQuery, Cloud Storage (GCP)

Extensive experience with Databricks and Apache Spark

Proficiency in SQL and experience with relational and columnar databases

Strong programming skills in Python and PySpark

Experience designing and optimizing data pipelines in distributed, cloud-based architectures

Familiarity with Delta Lake or other modern data lake architectures

Solid understanding of data modeling and schema design

What We Offer:

Salary: 17,800–21,500 PLN (B2B contract)

Co-financing for training and certifications, as well as guaranteed time for learning during working hours

Private medical care and a Multisport card

Language classes (English)

Flexible working hours and the possibility of hybrid work (Warsaw)

Team integration meetings and company events

Employee referral program with a bonus

An individually tailored career development path
