Join our team in building a modern, high-impact Analytical Platform for one of the largest integrated resort and entertainment companies in Southeast Asia. This platform will serve as a unified environment for data collection, transformation, analytics, and AI-driven insights—powering decisions across marketing, operations, gaming, and more.
You’ll work closely with Data Architects, Data Engineers, Business Analysts, and DevOps Engineers to design and implement scalable data solutions.
Requirements
- Bachelor's or Master’s degree in Computer Science, Engineering, or a related field.
- 3+ years of experience in DataOps, DevOps, or Data Engineering roles.
- Proficiency in scripting languages (Python, Bash, etc.).
- Strong experience with orchestration tools (e.g., Apache Airflow, Prefect, or Dagster).
- Hands-on experience with cloud platforms (e.g., AWS, GCP, Azure) and cloud-native data tools.
- Familiarity with CI/CD tools (e.g., GitLab CI, Jenkins, CircleCI).
- Knowledge of containerization and orchestration technologies (Docker, Kubernetes).
- Experience with infrastructure-as-code tools (Terraform, CloudFormation).
- Strong understanding of data privacy, security, and compliance practices.
- Experience with modern data warehouses (e.g., Snowflake, Redshift, Yellowbrick) and ETL/ELT tools.
- Understanding of data governance, metadata management, and data cataloging tools.
- Experience collaborating in Agile/Scrum teams and working with version-controlled data models (e.g., via Git).
Nice-to-have skills
- Experience with real-time data processing (e.g., Kafka, Spark Streaming).
- Familiarity with data observability platforms (e.g., Monte Carlo, Datadog, Great Expectations).
- Experience working in regulated industries (e.g., gaming, finance, hospitality).
Responsibilities
- Design, build, and manage CI/CD pipelines for data applications, models, and pipelines.
- Develop and maintain infrastructure-as-code (IaC) for data platform components.
- Automate data quality checks, validation, and monitoring processes.
- Collaborate with data engineers and analysts to optimize data ingestion and transformation pipelines.
- Implement robust logging, alerting, and observability tools for data pipelines.
- Manage orchestration frameworks (e.g., Airflow) and ensure timely execution of workflows.
- Maintain compliance with data governance, privacy, and security policies.
- Support and troubleshoot production data issues and infrastructure outages.
Benefits
- 35 absence days per year for work-life balance
- Udemy courses of your choice
- English courses with a native speaker
- Regular soft-skills training
- Excellence Centers meetups
- Online/offline team-building events
- Business trips