What will you do?
As a Senior DevOps Engineer (GCP), you will design and develop DevOps / CI/CD / DataOps processes and pipelines and ensure seamless integration between development and operations.
Your tasks:
- Design, implement, and maintain Continuous Integration (CI), Continuous Delivery (CD), and Continuous Testing pipelines across all application components, data models, transformations, etc.
- Work out, automate, and streamline workflows and processes, especially those related to delivery and operational efficiency
- Automate infrastructure provisioning and configuration management (see the sketch after this list)
- Investigate and resolve vulnerabilities and deficiencies
- Maintain solution security and compliance with client standards, keeping both up to date in an automated manner
- Collaborate with Data Engineering, QA, and business teams to define and implement DevOps / DataOps best practices
- Mentor and guide junior DevOps engineers, fostering a culture of continuous improvement
- Monitor systems and respond to, resolve, and automate the handling of failures
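For illustration, here is a minimal sketch of the kind of infrastructure automation these tasks involve: a hypothetical Python wrapper that runs `terraform plan` to detect drift between the committed configuration and the live environment. The working directory is a placeholder, not part of any actual project setup.

```python
#!/usr/bin/env python3
"""Hypothetical drift check: run `terraform plan` and report whether
the live infrastructure still matches the committed configuration."""
import subprocess
import sys

# Placeholder working directory; a real pipeline would inject this.
TF_DIR = "infrastructure/gcp"

def detect_drift(tf_dir: str) -> int:
    """Return terraform's exit code: 0 = no changes, 2 = drift detected."""
    # -detailed-exitcode makes `terraform plan` exit with 2 when changes exist.
    result = subprocess.run(
        ["terraform", "plan", "-detailed-exitcode", "-input=false", "-no-color"],
        cwd=tf_dir,
        capture_output=True,
        text=True,
    )
    if result.returncode == 1:
        # Exit code 1 means the plan itself failed (e.g. auth or syntax error).
        print(result.stderr, file=sys.stderr)
        raise RuntimeError("terraform plan failed")
    return result.returncode

if __name__ == "__main__":
    if detect_drift(TF_DIR) == 2:
        print("Drift detected: live infrastructure differs from configuration.")
        sys.exit(2)
    print("No drift: infrastructure matches the committed configuration.")
```

In practice, a check like this would typically run on a schedule from the CI server and alert the team when drift appears.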
Requirements:
- Minimum of 4 years' experience in a DevOps / DevSecOps Engineer or SRE role, ideally for similar GCP-based data solutions (e.g. cloud-based data warehouses / lakes)
- Hands-on experience with DevOps tools like Ansible Tower, Jenkins, IaC (e.g. Terraform), etc. for designing, implementing, and maintaining CI/CD pipelines
- Strong skills in scripting languages like Python, Bash, and Groovy (see the sketch after this list)
- Proficiency with Git as a version control system and its integration with DevOps tools like Ansible Tower, Jenkins, etc.
- Experience working in an Agile environment and toolset
- Strong problem-solving and analytical skills
- Enthusiastic willingness to learn and develop technical and soft skills rapidly and independently, as needs require
- Strong organisational and multi-tasking skills
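As an illustration of the scripting this requirement refers to, here is a minimal, hypothetical post-deployment smoke test of the kind a Jenkins or Ansible Tower stage might call; the health-check URL and response shape are assumptions, not part of the actual stack.

```python
#!/usr/bin/env python3
"""Hypothetical post-deployment smoke test, the kind of small script a
Jenkins or Ansible Tower job would call after a release."""
import json
import sys
import urllib.error
import urllib.request

# Placeholder endpoint; a real pipeline would pass this in as a parameter.
HEALTH_URL = "https://example.internal/service/health"

def check_health(url: str, timeout: float = 5.0) -> bool:
    """Return True if the service reports a healthy status."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            if resp.status != 200:
                return False
            payload = json.loads(resp.read().decode("utf-8"))
            # Assumed response shape: {"status": "ok", ...}
            return payload.get("status") == "ok"
    except (urllib.error.URLError, ValueError):
        return False

if __name__ == "__main__":
    if not check_health(HEALTH_URL):
        print("Smoke test failed: service is not healthy", file=sys.stderr)
        sys.exit(1)
    print("Smoke test passed")
```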
Nice to have:
- Experience in Data Vault modelling and / or usage
- Experience designing, testing, and implementing data ingestion pipelines on GCP Data Fusion, CDAP, or similar tools, including ingestion, parsing, and wrangling of CSV, JSON, XML, etc. formatted data from RESTful & SOAP APIs, SFTP servers, etc.
- Understanding of modern data contract best practices, with experience independently directing, negotiating, and documenting best-in-class data contracts
- Java development, testing, and deployment skills (ideally custom plugins for Data Fusion)
- SQL querying and optimization of complex queries / transformations in BigQuery, with a focus on cost- and time-effective SQL coding and concurrency / data integrity (see the sketch after this list)
- SQL data transformation / ETL / ELT pipeline development, testing, and implementation, ideally in GCP Data Fusion
- Experience with Cloud Composer / Airflow, Cloud Run, Pub/Sub
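To give a flavour of the BigQuery cost focus mentioned above, here is a small sketch that dry-runs a query with the google-cloud-bigquery client to estimate how many bytes it would scan before an ETL step executes it for real; the project, dataset, and query are hypothetical.

```python
"""Hypothetical BigQuery cost check: dry-run a query and report how many
bytes it would scan before letting an ETL step execute it."""
from google.cloud import bigquery

# Placeholder project and query; a real DataOps job would parameterise these.
PROJECT_ID = "example-analytics-project"
QUERY = """
    SELECT customer_id, SUM(amount) AS total
    FROM `example-analytics-project.sales.transactions`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY customer_id
"""

def estimate_scan_bytes(client: bigquery.Client, sql: str) -> int:
    """Dry-run the query: BigQuery validates it and returns the bytes it would process."""
    job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
    job = client.query(sql, job_config=job_config)
    return job.total_bytes_processed

if __name__ == "__main__":
    client = bigquery.Client(project=PROJECT_ID)
    scanned = estimate_scan_bytes(client, QUERY)
    print(f"Query would scan {scanned / 1e9:.2f} GB")
```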
We offer:
- Working in a highly experienced and dedicated team
- Competitive salary and an extra benefit package that can be tailored to your personal needs (private medical coverage, sport & recreation package, lunch subsidy, life insurance, etc.)
- Permanent or B2B contract after a 3-month probation period
- Online training and certifications fit for your career path
- Free online foreign language lessons
- Regular social events
- Access to an e-learning platform
- Ergonomic and functional working space with 2 monitors