Why join GFT?
You will work with and learn from top IT experts. You will join a crew of experienced engineers: 60% of our employees are senior level.
Interested in the cloud? You will enjoy our full support in developing your skills: training programs, certifications, and our internal community of experts. We have strong partnerships with the top cloud providers: Google, Amazon, and Microsoft. We are number one in Poland in the number of GCP certificates, and apart from GCP you can also develop in AWS or Azure.
We are focused on development and knowledge sharing. Internal expert communities provide a comfortable environment where you can develop your skillset in areas such as blockchain, Big Data, cloud computing, and artificial intelligence.
You will work in a stable company (32 years on the market) on demanding and challenging projects for the biggest financial institutions in the world.
We offer you:
Social events
Your role:
- Openness to work in a hybrid model (2 days from the office per week)
- Openness to visiting the client's office in Cracow once every two months (for 3 days)
- Design, develop, and maintain scalable data pipelines and solutions using Big Data technologies (e.g., Apache NiFi, Kafka, Spark, Hive, HDFS)
- Implement event-driven and microservices-based architectures in cloud environments, preferably Google Cloud Platform (GCP)
- Automate data flows and build near real-time data pipelines using Apache NiFi and Kafka
- Work with large-scale datasets from structured and unstructured sources, utilizing the Hadoop ecosystem (HDFS, Hive) for data storage and analysis
- Leverage RDBMS and NoSQL databases to support data ingestion, transformation, and querying
- Write clean, maintainable, and efficient code following best practices (DRY, KISS, SOLID, Clean Code)
- Use Python, SQL, and Shell scripting for data processing and orchestration
- Collaborate with cross-functional teams to deliver secure and reliable software solutions aligned with business needs
- Contribute to the development of enterprise-scale data solutions in a fast-paced, agile environment
- Engage in daily communication in English within a global team

Requirements: Apache Spark, Hadoop, Apache Kafka, Apache NiFi, Apache Hive, HDFS, Apache Oozie, SQL, Python, Google SDK, REST API, Linux, GCP, BigQuery, Dataproc, Java, Spring Boot

Additionally: Home office, Knowledge sharing, Life insurance, Sport subscription, Training budget, Private healthcare, International projects, Integration events, English lessons, Platforma Mindgram, Free coffee, Playroom, Free snacks, Free beverages, In-house trainings, In-house hack days, Modern office, Free fruits.
Senior Data Engineer • Wrocław, Poland