What will you do?
For our Client, a world-class financial institution, we are currently looking for a Data Engineer with Scala and Python skills.
Join a newly formed engineering team working on a strategic data analytics platform for risk management in a global financial environment. You'll contribute to the development of a distributed, cloud-ready solution processing massive volumes of risk and market data.
This initiative is part of a wider transformation program aiming to modernize risk analytics through big data technologies, advanced computation, and cloud-native architecture. You will work alongside teams in Poland, the UK and Asia, with close collaboration between software engineers, risk analysts, and product teams.
The platform combines Spark-based distributed processing, OLAP analytics, and modern APIs, built within a DevOps-first culture. If you enjoy solving complex data engineering challenges in high-stakes environments, this role is for you.
Openness to work in a hybrid model: 2 days per week from the office in Kraków during the initial knowledge transfer period; afterwards, this can be reduced to 2 days per month.
Your tasks
- Design, develop and maintain scalable big data solutions in a hybrid cloud/on-prem architecture
- Build and optimize batch and real-time data processing pipelines using Spark, Python/Java and Scala
- Integrate analytics libraries and APIs to support interactive data querying and risk calculations
- Collaborate with international teams across business, analytics and IT
- Ensure best practices in testing (unit, integration, performance), DevOps and CI/CD
- Participate in incident resolution, monitoring and platform maintenance
- Contribute to architectural decisions and continuous improvement of the system
Your skills
- Strong experience in building distributed data systems (Spark-based)
- Solid programming skills in Scala and Python
- Basic knowledge of Java (ability to review and understand code)
- Familiarity with Spring Boot, microservices, and REST APIs
- Solid knowledge of SQL and RDBMS (e.g., PostgreSQL)
- Experience with Apache Airflow, Linux, Git, Maven
- Comfort with CI/CD pipelines (e.g., Jenkins, GitHub Actions, Ansible)
- Cloud experience: GCP/AWS
- Experience working in Agile/Scrum teams
- Good communication skills and fluency in English (spoken and written)
Nice to have
- Experience with OLAP/data lakehouse tools (e.g., Druid, ClickHouse, Trino)
- Knowledge of stream processing (e.g., Flink, Kafka, Beam)
- Background in financial risk or enterprise-scale analytics platforms
We offer you
- Strategic, long-term project in a global financial environment
- International team, modern tech stack, and high data complexity
- Opportunities for growth in big data, cloud, and analytics
- Working in a highly experienced and dedicated team
- Extra benefit package that can be tailored to your personal needs (private medical coverage, sport & recreation package, lunch subsidy, life insurance, etc.)
- Contract of employment or B2B contract
- Online training and certifications suited to your career path
- Social events
- Access to an e-learning platform
- Ergonomic and functional working space