Software Engineering Specialist - Hadoop / Spark
- Salary: circa £60-85,000 (DOE)
- Location: London, England, United Kingdom
- Contract type: Permanent, full-time
- Company: Alexander Ash Consulting
- Last updated: 17 Jul 2018
Ref: LD250518. Are you an Apache Hadoop or Spark specialist keen to test your skills in the fast-moving Investment Banking sector?
We are currently partnering with a major FinTech provider that is carrying out a huge transformation for one of the largest Investment Banks in the world.
This team is responsible for the client’s entire Information Technology infrastructure, and is engineering and developing new digital software solutions across all of the Bank’s product channels. They are looking for a talented Software Engineer within their Regulatory Technology team who can provide hands-on expertise in Apache Hadoop and Spark.
Regulatory Technology aims to be an industry-leading function that delivers sustainable regulatory compliance through technology automation and competitive operating leverage, creating a safe and controlled operating environment that protects the franchise and its clients.
The successful candidate will:
• Work as part of a highly successful, cross-functional agile delivery team that includes analysts, developers and testers.
• Bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value.
• Be someone who sees software engineering as a ‘team activity’, with a predisposition to open code, open discussion and creating a supportive, collaborative environment.
• Be ready to contribute to all stages of software delivery, from initial analysis right through to production support.
• Have the opportunity to work on challenging problems, building high-performance systems to process large volumes of data, using the latest technologies.
• Lead and mentor others in sharing knowledge, facilitating meetings and workshops, defining new designs and discovering new techniques.
Required Skills / Experience:
• Working knowledge of Apache Hadoop and/or an understanding of both object-oriented and functional programming (ideally Java, Scala and Python)
• Experience of the Hadoop ecosystem and the technologies that comprise it
• Practical experience of using test-driven development and constant refactoring in a continuous integration environment
• Knowledge of SQL and relational databases
• Experience working in an agile team, practicing Scrum, Kanban or XP
• A background in Behaviour Driven Development, particularly experience of how it can be used to define requirements collaboratively, ensure that the team builds the right thing, and create a system of living documentation
• An understanding of web technologies, frameworks and tools
• An understanding of data science techniques
• Experience creating effective data visualisations