A top-tier $30+ billion institutional asset management firm in midtown Manhattan is looking to hire a VP, Data Engineer, Quantitative Research with 5-10 years of design and development experience within the asset management / financial services industry, including Big Data technologies, data flow processes, SQL development, and ELT (Extract, Load, Transform).
The firm is seeking a highly motivated and capable technologist to join the Systematic Strategies team to work on improving the team's research and portfolio management capabilities, with a focus on ELT and data curation.
The VP, Data Engineer will lead the development and management of the team's ELT processes, support the team's research efforts, design and build data stores and transformations, and coordinate team members' work into an integrated, high-quality product. This role offers a unique opportunity to work on a team at the forefront of new initiatives at the firm, with the potential to manage others over time.
- Gather, analyze, and interpret the team's data requirements to develop complete solutions, including data modeling, ELT pipeline management, data quality management, governance, and lineage.
- Design and develop ELT processes based on functional and non-functional requirements using resources in Microsoft Azure, such as Azure Data Factory and Azure Databricks.
- Support the team's research efforts, including large scale data exploration, cleaning, transformation, and feature creation.
- Proactively improve the team's data-related processes.
- Maintain and improve SQL databases.
- Document work for the team and for broader communication.
- Understand and adopt a software development mindset.
- Develop, maintain, and document knowledge of available data.
- Work with the team to implement, maintain, and enhance data governance, data quality, and related policies, aligning with the firm's standards and framework.
- 5-10 years of relevant design and development experience within the asset management / financial services industry, including Big Data technologies, data flow processes, and SQL development.
- Strong SQL and Transact-SQL programming skills, with a minimum of 5 years of experience in MS SQL Server.
- Experience with Python, Spark (PySpark), unstructured data, and NoSQL databases is strongly preferred.
- Good understanding of the Azure platform, with working knowledge of Azure Data Factory and Azure Databricks.
- Working experience with data modeling, relational modeling, and dimensional modeling.
- Familiarity with data extraction and web scraping is preferred.
- Working knowledge of a source control tool such as Git.
- Experience with Apache Kafka is a plus.
- Experience with financial data and alternative data.
- Undergraduate or Master's degree in Computer Science or an engineering-related field, or a technical certificate.