What you’ll be doing…
- Write and optimise complex queries using both SQL and NoSQL paradigms.
- Work with a variety of data types, e.g. streaming, real-time, file-based, RDBMS, and unstructured data.
- Promote AWS and cloud best practices to maximise compute performance while minimising infrastructure costs.
- Design and develop ‘infrastructure as code’.
- Keep up to date with AWS and other cloud providers.
- Work on software and data engineering for bespoke projects.
What you’ll need…
- Exposure to Hadoop-ecosystem technologies, e.g. Spark, PySpark, Kafka, Hive, Flume, Hue, and Sqoop.
- Ability to write code using Java or Python.
- Experience with data ingestion technologies, and with capturing metadata and data lineage.
- Familiarity with AWS services.
- Experience with ‘infrastructure as code’.
- Experience with shell scripting.
- Exposure to DevOps and Agile practices.
- Experience with modern software practices.
- Experience with reporting and analytics, and/or experience working with analysts and data scientists.
- Experience using productivity and collaboration tools such as JIRA and Confluence in a software delivery environment.
- Postgraduate qualifications, self-learning courses, and certifications (Coursera, Udacity, AWS, etc.) are highly regarded.
What's in it for you?
This is an exciting opportunity to join a growing sector of the banking industry with many new applications and projects on the horizon.
To learn more and have access to a more complete job description listing the full responsibilities, please apply by sending your CV via the big button below or contact Geraint for a confidential discussion on (03) 8637 7370.