Data Engineer @ RITS Group

RITS Group · Remote, Poland
Posted more than 30 days ago
Job description

We're looking for a Data Engineer to join the Data Engineering team of our NYSE-listed client. The role focuses on integrating the existing data lake, building ETL processes, and working in an AWS cloud environment. The main project stack is Python, AWS, Redshift, Snowflake, Athena, and Kafka.

We offer:

  • Long-term cooperation
  • B2B contract
  • Fully remote job
  • Work in an international environment on ET working hours (at least 6 hours of overlap)
  • Rate: 50-60 USD/h net
  • Multisport card or private healthcare insurance

Requirements:

  • 3+ years of experience in a Data Platform Engineering role
  • Strong software engineering experience, particularly working with Python
  • Strong experience with SQL and database engines such as MySQL, PostgreSQL, SQL Server, Snowflake, Redshift, Presto, etc.
  • Experience building ETL and stream processing pipelines using Kafka, Spark, Flink, Airflow / Prefect, etc.
  • Familiarity with the data science stack, e.g. Jupyter, pandas, scikit-learn, Dask, PyTorch, MLflow, Kubeflow, etc.
  • Experience using AWS / GCP (S3 / GCS, EC2 / GCE, IAM, etc.), Kubernetes, and Linux in production
  • Strong proclivity for automation and DevOps practices
  • Experience with managing increasing data volume, velocity and variety
  • Agile self-starter focused on getting things done
  • Ability to deal with ambiguity
  • Strong communicator
  • Willingness to participate in on-call rotations outside of regular business hours

Nice to have:

  • Development skills in C++, Java, Go, Rust
  • Understanding of TCP/IP and distributed systems
  • Experience managing time series data
  • Familiarity with open source communities
  • Financial Services experience

Responsibilities:

  • Build and run the client's data platform using technologies such as public cloud infrastructure (AWS), Kafka, databases, and containers
  • Develop Tradeweb's data science platform based on open source software and cloud services
  • Build and run ETL pipelines to onboard data into the platform, define schemas, build DAG processing pipelines, and monitor data quality
  • Help develop the machine learning development framework and pipelines
  • Manage and run mission-critical production services

