Senior Data Engineer (2328) @ N-iX

N-iX, Kraków, Poland
14 days ago
Job description

Our customer is seeking to expand its Data Engineering team to stand up a modern data platform for one of its portfolio companies in the financial services sector. The role requires expertise in Python, SQL (PostgreSQL, MySQL), Airflow, and Snowflake, along with hands-on AWS cloud experience.

This project involves managing financial assets owned by the company. It utilizes machine-learning models that operate on data ingested from third-party APIs. The process includes ELT (extract, load, transform), data modeling in Snowflake using DBT, training ML models using AWS SageMaker, running predictions, and storing predictions back in Snowflake.
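A minimal sketch of this flow as an Airflow DAG (TaskFlow API), with a hypothetical DAG name, schedule, and placeholder task bodies, could look like:

    from datetime import datetime

    from airflow.decorators import dag, task

    @dag(schedule="@daily", start_date=datetime(2025, 7, 1), catchup=False)
    def asset_predictions():  # hypothetical DAG name
        @task
        def extract_load():
            # Extract records from the third-party APIs and load them raw
            # into Snowflake (e.g. via an S3 stage and COPY INTO).
            ...

        @task
        def transform():
            # Run the dbt project that builds the Snowflake models.
            ...

        @task
        def predict():
            # Launch a SageMaker training job or batch transform, then
            # write the resulting predictions back to Snowflake.
            ...

        extract_load() >> transform() >> predict()

    asset_predictions()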

As a Senior Data Engineer, you will design, build, and maintain scalable data pipelines and architectures for our cloud-based analytical platforms. You will collaborate closely with data scientists, analysts, and software engineering teams to deliver robust, high-quality data solutions that drive business decisions.

Project involvement plans:

Initially 5 months with a possibility of extension. Start in July 2025.

Required Skills and Qualifications:

Key Skills:

  • Python
  • Snowflake
  • Airflow
  • DBT data modeling
  • PostgreSQL, MySQL (or similar)

Technical Expertise:

  • Programming Languages: Advanced proficiency in Python for data engineering, data wrangling, and pipeline development.
  • Cloud Platforms: Hands-on experience with AWS (S3, Glue, Redshift, Lambda, etc.).
  • Data Warehousing: Proven expertise with Snowflake, including schema design, performance tuning, data ingestion, and security (see the sketch after this list).
  • Workflow Orchestration: Production experience with Apache Airflow (or a similar orchestrator such as Prefect or Dagster), including authoring DAGs, scheduling workloads, and monitoring pipeline execution.
  • Data Modeling: Strong skills in DBT (Data Build Tool), including writing modular SQL transformations, building data models, and maintaining DBT projects.
  • SQL Databases: Extensive experience with PostgreSQL, MySQL (or similar), including schema design, optimization, and complex query development.
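As a hedged illustration of the Snowflake ingestion point above, the sketch below uses the official snowflake-connector-python package; every identifier (account, user, warehouse, stage, table) is a hypothetical placeholder:

    import snowflake.connector

    # All identifiers are placeholders; credentials would come from a
    # secrets manager in a real pipeline, never hard-coded.
    conn = snowflake.connector.connect(
        account="my_account",
        user="etl_user",
        password="***",
        warehouse="ETL_WH",
        database="RAW",
        schema="THIRD_PARTY",
    )
    try:
        with conn.cursor() as cur:
            # Bulk-load JSON files previously landed in an external S3 stage.
            cur.execute(
                "COPY INTO raw_prices FROM @s3_stage/prices/ "
                "FILE_FORMAT = (TYPE = 'JSON')"
            )
    finally:
        conn.close()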
Additional Competencies:

  • Version Control and CI/CD: Familiarity with Git-based workflows and continuous integration/deployment practices to ensure seamless code integration and deployment processes.
  • Communication Skills: Ability to articulate complex technical concepts to technical and non-technical stakeholders alike.

Tools:

  • JIRA
  • Confluence

Must have:

  • Extensive experience with Python for data analysis
  • Declarative Data Modeling: Experience with modern tools like DBT for streamlined and efficient data modeling.
  • Minimum 5 years of professional experience in production environments, emphasizing performance optimization and code quality.

Responsibilities:

  • Design, implement, and maintain scalable data pipelines that support business analytics, reporting, and operational needs.
  • Collaborate cross-functionally with analysts, engineers, and product teams to translate data requirements into efficient data models and pipelines.
  • Ensure reliability and performance of data workflows by proactively monitoring, debugging, and optimizing data processes.
  • Drive automation and testing practices in data workflows to maintain high code quality and deployment confidence.
  • Contribute to architectural decisions involving cloud infrastructure, data warehousing strategies, and data governance policies.

Requirements: Kubernetes, Python, AWS, Snowflake, Airflow, dbt, data modeling, MySQL, data engineering, cloud platforms, AWS S3, Glue, Redshift, AWS Lambda, security, Apache Airflow, SQL, data models, PostgreSQL, Git, continuous integration, Jira, Confluence

Tools: Agile, Scrum

Additionally: International projects, training budget, mentoring program, tech community, free coffee, modern office, kitchen
