Data Engineer

Location: Melbourne
Job Type: Permanent
Specialisation:
Reference: 2900232
Consultant: Geraint Cooper
Email: Email Geraint
My client is embarking on a greenfield project to establish a world-class Data Lake, Analytics and Insights platform, and requires Data Engineers to play a leading role in building out this capability.



What you’ll be doing…
  • Create and maintain optimal data pipeline architecture through automation and service reliability practices
  • Design and build data separation schemes and security policies to meet differing regulatory requirements across multiple cloud environments
  • Ensure architecture meets the functional and non-functional requirements
  • Continually learn and research best practices and tooling for data lake architectures (Kappa, Lambda, etc.)
  • Build automation into every aspect of your deliverables as a core component, not an afterthought
  
What you will bring... 
  • Development experience in Java, Scala or Python
  • Open source data tooling (Apache Kafka, Beam, Flink, Spark, ETL processing, etc.)
  • Knowledge of, or experience with, traditional and MPP Data Warehouses (Exadata, Teradata, Greenplum, Redshift, etc.) and associated ETL tooling (Informatica, etc.)
  • OOP or functional design patterns
  • Databases including Relational, Graph and Document/NoSQL, dimensional models (star, snowflake, etc.), normalisation and SQL development
  • Containers (Docker, Kubernetes)
  • CI/CD, GitHub
  • Traditional RDBMS (MSSQL, Postgres, MySQL etc)
  • Cloud: AWS (S3, EMR, EKS, EC2, Lambda, etc.), Azure and/or Google Cloud Platform


What's in it for you?

This is a fast-growing organisation with plenty of career opportunities and the ability to earn an excellent package.
To learn more and receive a complete job description listing the full responsibilities, please apply by sending your CV via the button below, or contact Geraint for a confidential discussion on (03) 8637 7370.