Big Data Engineer

  • Salary: Flexible (dependent on experience)
  • Location: London, England, United Kingdom
  • Contract type: Permanent, full-time
  • Company: Alexander Ash Consulting
  • Updated: 14 Nov 2018

We have been retained by a major multinational financial services client in London who are carrying out their largest-ever technology transformation!

They are currently looking for talented Big Data Engineers/Developers with expertise in Apache tooling (Spark, Hadoop, Kafka) to join a number of agile teams across the organisation.

Core Responsibilities:

• Working as part of a cross-functional agile team, collaborating with others to understand requirements, analysing and refining stories, designing solutions, and implementing and testing them
• Applying Behaviour Driven Development techniques, collaborating closely with users, analysts, developers and other testers
• Writing code and writing it well: be proud to call yourself a programmer, use test-driven development, write clean code and refactor constantly, and make sure we are building the thing right
• Ensuring that the software you build is reliable and easy to support in production. Being prepared to take your turn on call, providing 3rd-line support when needed
• Helping define the architecture of the components you are working on
• Helping your team to build, test and release software with short lead times and a minimum of waste. Working to develop and maintain a highly automated Continuous Delivery pipeline
• Contributing towards a culture of learning and continuous improvement within your team and beyond

Skills & Qualifications:

• Deep knowledge of at least one modern programming language, along with an understanding of both object-oriented and functional programming. Ideally Java, Scala and Python
• Experience of the Hadoop ecosystem and the technologies that comprise it
• A good understanding of Apache Spark/Kafka
• Practical experience of using test-driven development and constant refactoring in a continuous integration environment
• Knowledge of SQL and relational databases – ideally both Hive/Impala/Spark SQL and a traditional RDBMS, such as Oracle
• Experience working in an agile team, practising Scrum, Kanban or XP
• A background in Behaviour Driven Development, particularly experience of how it can be used to define requirements in a collaborative manner, ensure that the team builds the right thing and create a system of living documentation
• An understanding of web technologies, frameworks and tools, for example: HTML, CSS, JavaScript, Angular, Bootstrap, React, D3, Node.js
• An understanding of data science techniques, including experience with technologies such as Pandas, Spark ML Library, R, etc. Experience creating effective data visualisations
• Knowledge gained in Financial Services environments, for example products, instruments, trade lifecycles, regulation, risk, financial reporting or accounting