Spark Developer

Posted 03 Mar 2022

Capgemini

Katowice (Other)

https://www.capgemini.com/pl-pl/


What is the scope of duties in this role?

• You will implement data processing solutions using modern Big Data and/or Cloud technologies and tools (including streaming, cloud, clustered computing, real-time processing, and advanced analytics);
• You will design and implement software to process large and unstructured datasets (NoSQL, Data Lake architecture);
• You will optimize and test modern Big Data solutions, including in cloud and Continuous Delivery / Continuous Integration environments;
• You will work on migrating and improving the efficiency of existing processes and systems.

We are looking for you if:

• You have commercial experience working on Data Engineering, Big Data and/or Cloud projects using Apache Spark;
• You can code in Scala, Java or Python;
• You know at least one relational or non-relational database system well, along with SQL or a NoSQL query language;
• You have a very good command of English (willingness to learn German would be an advantage).
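For illustration only, below is a minimal sketch of the kind of Spark batch processing the role describes, written in Scala against the standard Spark SQL API. The input path, output path, and field names (userId, eventType) are hypothetical and not part of the posting.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object EventCountExample {
  def main(args: Array[String]): Unit = {
    // Local session for demonstration; on a cluster the master comes from the deployment.
    val spark = SparkSession.builder()
      .appName("EventCountExample")
      .master("local[*]")
      .getOrCreate()

    import spark.implicits._

    // Hypothetical input: JSON events with "userId" and "eventType" fields.
    val events = spark.read.json("/tmp/events.json")

    // Aggregate event counts per user and event type.
    val counts = events
      .groupBy($"userId", $"eventType")
      .agg(count("*").as("eventCount"))

    // Write the result as Parquet to a hypothetical output location.
    counts.write.mode("overwrite").parquet("/tmp/event_counts")

    spark.stop()
  }
}
```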

How to apply?

https://www.capgemini.com/pl-pl/jobs/