The person in this role will be responsible for developing the team and building competencies in Data Platform Administration, with a particular focus on hands-on implementation of solutions at the intersection of DataOps and DevOps. The role combines expertise in CI/CD process automation, containerization, and Infrastructure as Code. We offer the opportunity to work in an international environment where cutting-edge technology meets real-world impact in the development and operation of data platforms.
Warsaw (Remote / Hybrid)
Requirements
- Understanding of DevOps practices, particularly in the automation of: code delivery to environments (build, deployment), code quality verification (static code analysis, unit/integration/regression testing), and environment monitoring
- Hands-on experience with CI/CD and Infrastructure as Code (IaC), including building and maintaining pipelines (e.g., Azure Pipelines, Bitbucket Pipelines, GitHub Actions, GitLab Pipelines, Jenkins, TeamCity, Travis CI) and IaC tools (Azure Resource Manager, Terraform)
- Proficiency in programming languages, especially Python and SQL
- 1+ years of experience with containerization and orchestration technologies (Docker, Kubernetes); familiarity with Azure App Services or AWS Elastic Beanstalk is a plus
- Practical knowledge of Apache Spark and related technologies (e.g., PySpark, Scala, Spark SQL); a minimal PySpark sketch is shown after this list
- Familiarity with data flow management tools such as Delta Lake and MLflow, and with data warehouse integration (e.g., Azure Synapse Analytics)
- Ability to design and implement analytics solutions using Azure Databricks
- Working knowledge of at least one scripting language (e.g., PowerShell, Bash)
- Basic knowledge of Windows/Linux system administration
- Strong sense of ownership, self-reliance, and responsibility in task execution
- Practical command of English at a minimum B2 level (C1+ preferred)
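To illustrate the Spark and Delta Lake requirements above, here is a minimal PySpark sketch that reads raw JSON events, computes a daily aggregate, and writes the result as a Delta table. The paths, column names, and app name are hypothetical placeholders, and the snippet assumes a cluster (for example, Azure Databricks) where PySpark and Delta Lake are available.

```python
# Minimal sketch: raw events -> daily aggregate -> Delta table.
# All paths and column names below are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("orders-daily-aggregate")  # hypothetical job name
    .getOrCreate()
)

# Read raw JSON events from a landing zone (placeholder path).
orders = spark.read.json("/mnt/landing/orders/")

# Aggregate daily revenue per country (placeholder schema).
daily = (
    orders
    .withColumn("order_date", F.to_date("order_timestamp"))
    .groupBy("order_date", "country")
    .agg(F.sum("amount").alias("revenue"))
)

# Write the aggregate as a Delta table, partitioned by date.
(
    daily.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save("/mnt/curated/orders_daily/")
)
```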
Responsibilities
- Designing and implementing automated CI/CD pipelines (build, deploy, test); a minimal pipeline-step sketch follows this list
- Implementing Infrastructure as Code (Azure Resource Manager, Terraform) and provisioning environments from scratch
- Deploying and managing applications in Microsoft Service Fabric
- Implementing Azure Databricks deployments, including ETL flow orchestration and Big Data solutions
- Automating development processes and monitoring environment stability
- Providing technical consultation to clients, analyzing requirements, and supporting business teams
- Administering Windows/Linux systems and supporting development tooling
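As a rough illustration of the pipeline work described above, below is a minimal sketch of a quality-gate step that a CI/CD service (e.g., Azure Pipelines or GitHub Actions) might run during the build/test stage. The tool choices (flake8, pytest) and directory layout (src/, tests/) are assumptions made for the example, not part of the role description.

```python
#!/usr/bin/env python3
"""Hypothetical CI quality-gate step: run static analysis and tests,
and fail the pipeline if either check fails."""
import subprocess
import sys

CHECKS = [
    # Static code analysis over the source tree (placeholder path).
    ["flake8", "src/"],
    # Unit and integration tests with short tracebacks (placeholder path).
    ["pytest", "--tb=short", "tests/"],
]


def main() -> int:
    for cmd in CHECKS:
        print(f"running: {' '.join(cmd)}")
        result = subprocess.run(cmd)
        if result.returncode != 0:
            # A non-zero exit code makes the CI job fail.
            print(f"check failed: {' '.join(cmd)}", file=sys.stderr)
            return result.returncode
    print("all quality checks passed")
    return 0


if __name__ == "__main__":
    sys.exit(main())
```

Keeping the gate as a plain script makes it easy to run the same checks locally and from any of the pipeline services listed in the requirements.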
We offer
- Global projects in clouds: we work with clients from all over the world using modern cloud technologies
- Certification reimbursement: we fund exams and certifications from Microsoft, AWS, Databricks, and Snowflake
- Time to learn: 60 paid hours per year
- Flexible approach: you can choose to work from home or meet at our offices
- Personalized benefits: medical care, subsidized sports packages, language tuition, a new employee referral bonus (up to PLN 15,000), as well as annual and media bonuses
If you are interested in the offer, send us your CV.
I believe that clear and honest communication is the key to successful cooperation. Through it, we build a strong and cohesive team. If you are a candidate who values open and direct communication, then we would love to hear your questions about the company!
Technical competence, initiative, and the ability to innovate and solve problems - these are the qualities every DevOps specialist needs and the things I look for in my team. Our job is to make sure everything runs smoothly and to the highest standard. If you think that's you - join us!