DevOps Engineer

Grow your skills while using our cutting-edge tools as a DevOps Engineer to develop and support our Tech tooling platforms.

Use your DevOps engineering experience to join us on our data management tooling journey. You will help us build our next-generation data management platform, covering data governance, data structure and lineage harvesting, and data quality, all hosted on our public cloud.

You’ll work on exciting green-field projects, giving you exposure to designing and driving how we manage data in the years to come. By supporting the tech tooling products and ensuring compliance with technical standards, you will help us accelerate the pace of build, test and release cycles through automation. You will use your previous design, build and test framework experience to continuously enhance our tooling platforms, integrate on-premises and cloud applications through APIs, and help run POCs to select and roll out vendor management applications. Along the way, you will develop a deep understanding of our internal data domains and systems.

You are a DevOps professional with a keen interest in agile software development methodologies and experience working with Continuous Delivery, including deploying applications in private or public clouds (AWS preferred). As a quick-thinking problem solver, you can assimilate requirements rapidly to present new design solutions, and you have demonstrated experience with automated testing frameworks (e.g. Robot Framework, Selenium) for quality assurance.

Your technical experience includes:

  • experience in shell and PowerShell scripting, SQL, Git and Bamboo
  • hands-on development and integration skills connecting cloud applications, preferably in Java, JavaScript or Python and MuleSoft
  • design and implementation experience gained on real-world projects
  • willingness to be hands-on as required to get the job done, and to adapt quickly to new technologies
  • a passion for staying current with the latest research and technology
  • strong understanding of data management concepts (governance, lineage, quality) and industry trends

You may also demonstrate some of the following skills:

  • understanding of and experience working with XML, JSON, YAML and Groovy
  • experience using Cucumber and Sonar for application testing and maintenance
  • experience with metadata harvesting tooling, such as Rochade, Informatica EIC or Orion
  • experience with metadata management, glossary and data governance tools such as Collibra, Axon, Informatica MM, Alation or IBM IGC
  • experience in workflow development and standards, e.g. BPMN 2.0
  • experience with ETL technologies such as SSIS, DataStage or Talend
  • experience with data preparation, visualisation & reporting technologies
  • experience working with globally distributed teams.

The Corporate Operations Group (COG) brings together specialist support services including workplace, human resources, market operations and technology. COG's purpose is to drive operational excellence through business-aligned services with a focus on quality, cost and risk. COG comprises the following divisions: Business Improvement and Strategy, Business Services, Human Resources, Market Operations, and Technology.

Find out more about Macquarie careers at

Macquarie understands the importance of diversity and inclusion - our long history of success has come from being different. At Macquarie we value the innovation and creativity that diversity of thought brings. The one thing we all have in common is our focus on high performance. If you're capable, motivated and can deliver, we want you on our team. 

We facilitate a range of flexible working arrangements within our teams. Talk to us about what flexibility may be available. Our Technology Returner program is an opportunity for you to re-integrate yourself into the workforce following an extended professional career break. Find out more and apply at
