Senior Data Engineer • Łódź, Poland

Responsibilities:
- Design, build, and maintain scalable data pipelines (ETL/ELT) leveraging Snowflake and Airflow
- Implement optimized schemas, partitioning, and indexing strategies in Snowflake and relational databases
- Develop data processing workflows and automation scripts in Python and SQL; integrate with APIs and microservices
- Ensure scalability, performance, and resilience of pipelines; implement observability for jobs and data flows
- Partner with data scientists and ML engineers to deliver high-quality datasets optimized for AI/ML workloads
- Prepare, transform, and manage datasets for embeddings, RAG workflows, and LLM fine-tuning

Requirements:
- Snowflake, Python, SQL, AWS
- ETL/ELT, CI/CD, Docker, Kubernetes
- NoSQL and vector databases
- Kafka, Kinesis, PySpark

Additionally:
- Sport subscription, training budget, private healthcare
- Flat structure, international projects, in-house trainings
- Modern office, no dress code, playroom, bike parking
- Free coffee, snacks, and beverages