What you’ll do at Jamf:
Business Intelligence at Jamf powers data-driven decision-making across the organisation. As a Data Engineer II, you’ll be responsible not just for building and transforming data, but for owning critical data infrastructure: from ingestion and storage to governance, quality, and consumption by analytics/ML tools. You will partner with analysts, data scientists, product owners, and engineers to ensure that Jamf’s data assets are reliable, high-performance, secure, and scalable.
This role is offered as hybrid. We can only accept applications from candidates who are based in Poland and have sponsorship to live and work in Poland.
Responsibilities:
- Design, build, maintain, and improve the data platform infrastructure (Snowflake environments, Airflow workflows, orchestration, CI/CD pipelines for dbt transformations).
- Develop and maintain Terraform (or equivalent IaC) definitions for provisioning data infrastructure (compute, storage, permissions, networking where needed).
- Automate deployment of data transformations (e.g. dbt CI/CD, staging/production pipelines).
- Ensure data platform availability, reliability, security, and performance (e.g. enforce roles and permissions in Snowflake, monitor resources, optimise concurrency and usage).
- Instrument monitoring, logging, and alerting of data workflows (Airflow/Kubernetes/dbt jobs).
- Collaborate with Data Engineers, Analysts, and Architects to define platform capabilities and set standards and best practices around schema design, governance, version control, and performance.
- Run capacity planning and ensure cost-efficiency and a sound scaling strategy (e.g. Snowflake warehouse sizing and concurrency limits, cluster autoscaling).
- Facilitate onboarding of teams to the data platform: document usage patterns and create templates or utilities (for example dbt macros, shared libraries).
- Participate in architecture reviews and evaluate new platform tooling (e.g. enhancements to orchestration, transformation frameworks, security strategy).
- Troubleshoot critical incidents and participate in incident/post-mortem cycles for platform issues.

Requirements: Python, SQL, IaC, Docker, Kubernetes, AWS, dbt
Tools: Jira, ServiceNow
Additionally: international projects, small teams, Apple equipment, training budget, private healthcare, flat structure, free coffee, bike parking, playroom, shower, free snacks, modern office, no dress code, kitchen.
Data Engineer • Katowice, Poland