Job description

Role: Data Engineer
Location: Poland
Type of work: Remote

Responsibilities:
- Design, develop, and maintain scalable data pipelines using Python, ADF, and Databricks
- Implement ETL processes to extract, transform, and load data from various sources into Snowflake
- Ensure data is processed efficiently and made available for analytics and reporting

Requirements:
- 8+ years of experience in data engineering, with a focus on Python, ADF, Snowflake, Databricks, and ETL processes.
- Strong experience with data modeling, data warehousing, and database design.
- Proficiency in SQL and experience with cloud-based data storage and processing.
- Strong problem-solving skills and the ability to work in a fast-paced environment.
- Excellent communication skills, with the ability to work directly with customers and understand their needs.
- Experience with Agile methodologies and working in a collaborative team environment.
- Certification in Snowflake, Azure, or other relevant technologies is an added advantage.
- Bachelor's degree in Computer Science, Engineering, Information Systems, or an equivalent field.