Requirements
What makes you a great match
- Proficient in SQL: complex queries, CTEs, window functions, analytics;
- Deep understanding of DWH concepts: ETL, ELT, Data Vault, Kimball, star/snowflake schemas;
- Experience with Airflow, dbt, or other pipeline orchestrators;
- Proficiency in one or more DWH platforms: BigQuery, Snowflake, Redshift, ClickHouse, Vertica, etc.;
- Proficient use of Git, including experience helping colleagues with Git workflows (merge conflicts, rebases, pull requests);
- Knowledge of Python or another scripting language for transformations;
- Understanding of server infrastructure: basic skills in configuring, maintaining, and monitoring resources and controlling load.
Responsibilities
1. DWH architecture and design
- Designing Data Warehouse architecture for current and future business needs;
- Developing schemas with attention to performance, scalability, and data historicity (SCD, snapshots);
- Defining standards and best practices for data storage, transformation, and access;
- Participation in planning cloud migrations.
2. Organization of ETL/ELT processes
- Building, maintaining, and optimizing ETL/ELT pipelines (Airflow, dbt, custom solutions);
- Implementation of incremental updates, CDC, backfill, and reprocessing;
- Control and automation of data lineage, logging, and alerting.
3. Performance optimization
- Deep optimization of queries, tables, and DAGs;
- Implementation of batching, indexes, materialized views, and clustering;
- Server resource management, load monitoring, and balancing;
- Building ETL process performance metrics and running regular audits.
4. Data quality control and reliability
- Implementation of data validation, anomaly detection, and reconciliation;
- Setting up automated tests for ETL processes;
- Managing backup & recovery policies;
- Identifying and eliminating problems with duplicates, null values, and data drift.
5. Integration of new data sources
- Evaluation and connection of external APIs, raw sources, and third-party databases;
- Harmonization of formats, update frequency, and transformation logic;
- Adaptation of database schemas to new sources without disrupting current processes.
6. DevOps and automation
- Automation of deployments, testing, and CI/CD for data;
- Working with Docker, Kubernetes, and cloud infrastructure (GCP, AWS, Azure);
- Working with Terraform;
- Using Git and code review processes to manage pipelines.
7. Mentoring and coordination
- Code review, support, and training of junior/mid-level engineers;
- Maintaining documentation, templates, and onboarding instructions;
- Collaboration with analysts, developers, and the BI team;
- Working with business stakeholders to understand their needs and translate them into technical requirements.
Benefits
Flexible payment options: choose the method that works best for you.
Tax assistance included: we handle part of your taxes and provide guidance on the local setup.
Financial perks: bonuses for holidays, birthdays, work milestones, and more, just to show we care.
Learn & grow: we cover courses and certifications, and offer real opportunities to grow your career with us.
Benefit cafeteria: pick what suits you, from sports and language courses to therapy sessions and more.
Stay connected: from team-building events to industry conferences, we bring people together online, offline, and on stage.
Modern equipment: we provide new laptops along with essential peripherals like monitors and headphones for a comfortable workflow.
Your schedule, your rules: start your day at 9, 10, or even 11; we care about results, not clock-ins.