The team is building a new machine-learning system that will ingest and analyze large data sets in real time.
The platform will be hosted on AWS, and we will support you in gaining the relevant certifications within the role if you don't already have them.
We’re looking for someone with:
- Java development experience, including building APIs (Kong) and microservices.
- Exposure to distributed big data stacks, e.g. Spark, PySpark, Kafka, Hive, Flume, Hue and Sqoop.
- An understanding of, and/or experience with, data analytics tools such as SAS, SPSS and Tableau.
- The ability to work with different data types, e.g. streaming, real-time, file-based, RDBMS and unstructured data.
- Experience with cloud deployment and management (especially on AWS).
- A strong understanding of CI/CD practices.
- An open mindset and proven ability to innovate and influence.
- A passion for learning (tell us about the self-learning courses and certifications you have completed, e.g. Coursera, Udacity, AWS).