Our customer is seeking to expand its Data Engineering team to stand up a modern data platform for one of its portfolio companies in the financial services sector. The role requires expertise in Python, SQL (PostgreSQL, MySQL), Airflow, and Snowflake, along with AWS cloud experience.
This project involves managing financial assets owned by the company. It utilizes machine-learning models that operate on data ingested from third-party APIs. The process includes ELT (extract, load, transform), data modeling in Snowflake using DBT, training ML models using AWS SageMaker, running predictions, and storing predictions back in Snowflake.
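The flow described above (extract from third-party APIs, load, transform in the warehouse, score with an ML model, write predictions back) could be sketched, very roughly, as plain Python stubs. Every function and field name below is illustrative, not the customer's actual code; in production the transform step would live in dbt models inside Snowflake and the scoring step would call a SageMaker-hosted model.

```python
# Minimal ELT + scoring sketch. All names are hypothetical stand-ins:
# a dict plays the role of the warehouse (Snowflake in production).

def extract(api_rows):
    """Extract: pull raw records from a third-party API (stubbed as a list)."""
    return list(api_rows)

def load(raw, warehouse):
    """Load: land raw records in the warehouse as-is."""
    warehouse["raw"] = raw
    return warehouse

def transform(warehouse):
    """Transform: model raw data into typed rows (dbt models in production)."""
    warehouse["modeled"] = [
        {"asset": r["asset"], "value": float(r["value"])} for r in warehouse["raw"]
    ]
    return warehouse

def predict(warehouse):
    """Score modeled rows and store predictions back (SageMaker in production)."""
    warehouse["predictions"] = [
        {"asset": r["asset"], "forecast": r["value"] * 1.01}  # toy placeholder model
        for r in warehouse["modeled"]
    ]
    return warehouse

rows = [{"asset": "BOND-1", "value": "100.0"}]
wh = predict(transform(load(extract(rows), {})))
```

In Airflow, each of these stages would typically become a task in a DAG, with the warehouse state held in Snowflake tables rather than passed in memory.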
As a Senior Data Engineer, you will design, build, and maintain scalable data pipelines and architectures for our cloud-based analytical platforms. You will collaborate closely with data scientists, analysts, and software engineering teams to deliver robust, high-quality data solutions that drive business decisions.
Project involvement plans:
Initially 5 months with a possibility of extension. Start in July 2025.
Responsibilities:
- Design, implement, and maintain scalable data pipelines that support business analytics, reporting, and operational needs.
- Collaborate cross-functionally with analysts, engineers, and product teams to translate data requirements into efficient data models and pipelines.
- Ensure reliability and performance of data workflows by proactively monitoring, debugging, and optimizing data processes.
- Drive automation and testing practices in data workflows to maintain high code quality and deployment confidence.
- Contribute to architectural decisions involving cloud infrastructure, data warehousing strategies, and data governance policies.

Required Skills and Qualifications (must have):
Python, SQL, PostgreSQL, MySQL, Apache Airflow, dbt, Snowflake, AWS (S3, Glue, Redshift, Lambda), Kubernetes, data engineering, data modeling, cloud platforms, security, Git, continuous integration, Jira, Confluence.

Tools: Agile, Scrum.

Additionally: International projects, training budget, mentoring program, tech community, free coffee, modern office, kitchen.
Senior Data Engineer • Kraków, Ukraine