Correct Context is looking for a Scala / Spark Big Data Developer for Comscore, based in Poland or the surrounding region.
Comscore is a global leader in media analytics, revolutionizing insights into consumer behavior, media consumption, and digital engagement.
Comscore leads in measuring and analyzing audiences across diverse digital platforms. You'll thrive on cutting-edge technology, play a vital role as a trusted partner delivering accurate data to global businesses, and collaborate with industry leaders like Facebook, Disney, and Amazon, helping empower businesses across the media, advertising, e-commerce, and technology sectors in the digital era.
We offer:
If you don't have all the qualifications but you're interested in what we do and have a solid understanding of Linux, let's talk!
The recruitment process for the Scala / Spark Big Data Developer position has the following steps:
The candidate must have: experience with Scala, Spark, Linux, APIs, and AWS.
Your responsibilities:
- Design, implement, and maintain petabyte-scale Big Data pipelines using Scala, Spark, Kubernetes, and plenty of other tech
- Optimize: Big Data work is very specific, and depending on the process it can be IO- or CPU-bound, so we need to figure out faster ways of doing things; at least an empirical feel for computational complexity helps, because in Big Data even simple operations become costly once multiplied by the size of the dataset (see the sketch after this list)
- Conduct Proof of Concept (PoC) work for enhancements
- Write great, performant Big Data Scala code
- Cooperate with other Big Data teams
- Work with technologies like AWS, Kubernetes, Airflow, EMR, Hadoop, Linux / Ubuntu, Kafka, and Spark
- Use Slack and Zoom for communication

Tools: Jira, Bitbucket, GIT, Jenkins
Additionally: Sport Subscription, Private healthcare, Remote work, Flexible working hours, Free coffee, Playroom, Modern office, No dress code, In-house trainings
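To illustrate the kind of dataset-size-driven optimization mentioned in the responsibilities above, here is a minimal Scala / Spark sketch. The bucket paths, dataset layout, and column names are assumptions made up for this example, not part of the role description; the point is simply that a cheap, narrow filter applied before a wide aggregation shrinks the data that has to be shuffled, which is what dominates cost at petabyte scale.

import org.apache.spark.sql.SparkSession

// Minimal sketch only: paths and column names below are hypothetical.
object PipelineSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("pipeline-sketch").getOrCreate()
    import spark.implicits._

    // Hypothetical event log with columns: userId, site, durationMs
    val events = spark.read.parquet("s3://example-bucket/events/")

    val totals = events
      .filter($"durationMs" > 0)   // narrow, per-partition filter: runs before any shuffle
      .groupBy($"site")            // wide aggregation: its cost scales with the rows shuffled
      .sum("durationMs")

    // Write the (much smaller) aggregate back out
    totals.write.parquet("s3://example-bucket/site-totals/")
    spark.stop()
  }
}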
Big Data Developer • Remote, Poland