GCP Data Engineer @ Antal

Antal · Kraków, Poland
20 days ago
Job description

Are you ready to build impactful solutions on a global scale? Join a forward-thinking team that powers critical risk calculations in one of the world's leading financial institutions.

About the Role

We are looking for a talented GCP Data Engineer with a strong Java background to join the STAR platform team. STAR is HSBC’s strategic cloud-native platform designed to generate and deliver risk factor definitions, historical market data, and scenarios for Value at Risk (VaR) and Expected Shortfall (ES) calculations.

The platform leverages data pipelines and microservices, combining both real-time and batch processing to handle large-scale datasets. You’ll be joining a global team of developers within the Global Traded Risk Technology department, working in an open, inclusive, and innovation-driven environment.

Requirements

  • Strong proficiency in Java and Spring Boot
  • Understanding of key software design principles: KISS, SOLID, DRY
  • Hands-on experience building data processing pipelines (preferably with Apache Beam)
  • Experience designing and building RESTful APIs
  • Familiarity with relational and NoSQL databases, especially PostgreSQL and Bigtable
  • Basic knowledge of DevOps and CI/CD tools, including Jenkins and Groovy scripting
  • Experience with integration frameworks and patterns (e.g., Saga, Lambda)
  • Strong problem-solving and analytical skills
  • Excellent communication skills and the ability to thrive in a collaborative team environment

Nice to Have

  • Experience with Google Cloud Platform (GCP) services: GKE, Cloud SQL, Dataflow, Bigtable
  • Familiarity with OpenTelemetry, Prometheus, Grafana
  • Knowledge of Kubernetes, Docker, and Terraform
  • Messaging/streaming experience with Kafka
  • UI experience with Vaadin
  • Exposure to Apache Beam in large-scale data environments
Your Responsibilities

  • Translate complex business requirements into secure, scalable, and high-performance data solutions
  • Design and implement performant data processing pipelines (batch and streaming)
  • Develop REST APIs and data ingestion patterns in a cloud-native architecture
  • Integrate internal systems with a focus on cost optimization and fast data processing
  • Modernize and enhance existing pipelines and microservices
  • Create and maintain solution blueprints and documentation
  • Conduct peer code reviews and provide constructive feedback
  • Promote test-centric development practices, including unit and regression tests
  • Ensure consistent logging, monitoring, error handling, and automated recovery aligned with industry standards
  • Collaborate closely with engineers, analysts, and stakeholders across regions
