Consultant - Data Engineer
CME Group is the world's leading and most diverse derivatives marketplace. But who we are goes deeper than that. Here, you can impact markets worldwide. Transform industries. And build a career shaping tomorrow. We invest in your success and you own it, all while working alongside a team of leading experts who inspire you in ways big and small. Joining our company gives you the opportunity to make a difference in global financial markets every day, whether you work on our industry-leading technology and risk management services, our benchmark products or in a corporate services area that helps us serve our customers better. We're small enough for you and your contributions to be known. But big enough for your ideas to make an impact. The pace is dynamic, the work is unlike any other firm in the business, and the possibilities are endless. Problem solvers, difference makers, trailblazers. Those are our people. And we're looking for more.
To learn more about what a career at CME Group can offer you, visit us at www.wherefuturesaremade.com.
The selected candidate will join the CME Engineering and Execution team in Belfast as a Data Engineer, initially in the Enterprise Data Flow Team. This individual is responsible for leading the technical delivery of systems that must achieve a unique blend of low-latency performance, big data scalability, and rock-solid reliability and integrity, all while undergoing rapid release cycles. Achieving these goals requires an understanding of both the underlying technology and the development, testing, and deployment lifecycle of the applications. The candidate must be able to solve problems creatively, communicate effectively, and lead others to achieve the critical mission of the team.
Although the role initially sits within the EDF Team, the wider Engineering and Execution team also develops a variety of trading and post-trade solutions supporting our exchange and clearing business.
- Hands-on with detailed design and architecture plans for complex, large-scale efforts within a multi-cloud environment.
- Assists with system design, working with the various teams to build fit-for-purpose platforms.
- Works ahead, ensuring the architecture is responsive to evolving needs.
- A team player, assisting the teams as required to achieve delivery milestones.
- Utilizes the expertise of the team to develop architecture through consensus and team approach.
- Works with the enterprise architecture team to understand the evolving enterprise and to make efficient decisions on application architecture and priorities.
- Applies expert knowledge of cloud technologies, the Java language, DBMS, and middleware technologies to independently design and develop key services.
- Participates in code reviews, proactively identifying and mitigating potential issues and defects.
- Defines key metrics driving code optimization and refactoring.
- Understands the data and how it is used in order to help develop functional solutions.
- Takes part in preliminary story review, providing constructive feedback and input on both work effort estimation as well as architecture/design improvements.
- Works with analysts to interpret high-level requirements for complex, large-scale initiatives and decompose them into independent stories and sub-tasks for the team.
- Contributes to continuous improvement efforts by identifying and championing practical means of reducing time to market while maintaining high-quality products (e.g., process improvements and automation opportunities).
- Embraces and enforces CME Group SDLC and information security standards.
- Bachelor's degree (with honours) or equivalent/better strongly preferred; substantial relevant experience may substitute
- Experience with AWS big data services
- Experience with Terraform
- Experience with Python, Java, and Linux
- Experience architecting enterprise software applications
- Experience developing and automating solutions directly related to Continuous Integration/Continuous Delivery (CI/CD) and infrastructure automation
- AWS Certified Data Analytics - Specialty or AWS Certified Big Data - Specialty qualification
- Experience coding in a story-driven, agile environment
- Experience working in the big data space handling both real-time and batch workloads
- Experience with the Hadoop ecosystem using EMR, MapReduce, and/or Spark
- Prior experience working in financial services/exchange space
- Prior experience working with BDD methodologies and automated acceptance criteria
- Prior experience using Confluence, JIRA, or other Atlassian tools