My client requires the services of a number of contract Big Data Engineers for a highly specialised project with expertise across the Hadoop ecosystem.
What you will bring...
- Strong knowledge of data and ETL
- Experience with AWS / Azure
- Experience with Scala and Spark
- Experience with programming in Python, R, and Java or equivalent
- Solid experience with the Hadoop ecosystem and Apache tools (Spark, Kafka, etc.)
- Experience with designing solutions for data acquisition, ingestion, integration, and transformation
- Experience working in highly client-facing roles and contributing to the end-to-end delivery life-cycle of complex, large-scale data lakes
- Experience working with and guiding clients on the best application of big data solutions tailored to their current situations
- Ability to assess emerging trends
- Maintenance and communication of current and future architecture
What's in it for you?
This is a fast-growing organisation offering plenty of career opportunities and the ability to earn an excellent package.
To learn more and access a more complete job description listing the full responsibilities, please apply by sending your CV via the big button below, or contact Geraint for a confidential discussion on (03) 8637 7370.