Data Engineer - Big Data

S&P Global
New York, NY, United States
Permanent, Full-time
Segment: S&P Global Market Intelligence
The Role: Data Engineer
Grade: 11
The Team:
The Data Services Engineering, Big Data Tech team is responsible for enabling teams across the enterprise to unlock the insights that the massive amounts of data on our platform provide. Tasked with the overall onboarding, operationalization, administration and maintenance of key big data/data science/machine learning platforms such as Databricks and Domino Data Lab, ours is a fairly new and rapidly expanding team.
The Impact:
You will be part of the team working on several initiatives aligned with our overall goal of providing a top-of-the-line, industry-leading big data platform to our customers across the enterprise. This includes, but is not limited to, automation, implementing best practices and standards, operational support and troubleshooting, performance tuning, reviewing new use cases and making design recommendations, and evaluating new products and tools.
What's in it for you:
A self-starter with the ability to work independently or under general direction, willingness to learn new technologies, strong analytical skills to identify opportunities to improve the onboarding, operationalization, maintenance and administration of big data platforms, and a great team player.
Responsibilities:
  • You will be a key member of the Data Services Team. As a Data/Database Engineer, your responsibilities include, but are not limited to, the following:
  • Participate in the design, development, operationalization and maintenance of products and solutions using cutting-edge, next-generation technologies in the big data space.
  • You will help with the onboarding, setup, configuration and operationalization of different solutions and products within the big data platform, and automate many of these tasks.
  • You will assist in creating dashboards, monitoring frameworks and self-healing capabilities for products across the platform.
  • You will develop automation routines and products to build self-service capabilities for teams to utilize database/big data platforms.
  • You will develop and maintain documentation on various operational and design aspects of the big data platform, and assist in troubleshooting and resolving issues.
  • You will be part of a Scrum team participating in delivering project and operational goals and objectives.
Basic Qualifications
  • Bachelor's or Master's degree, preferably in Computer Science or equivalent, with a minimum of 5 years of relevant experience in the design, development and administration of big data and analytical platforms such as Delta Lake, Databricks, Snowflake and Redshift, along with SQL and NoSQL databases such as Microsoft SQL Server, PostgreSQL, Cassandra, etc.
  • Strong understanding of data movement and replication fundamentals and the technologies behind them. Experience with or exposure to data delivery and streaming technologies such as Apache Kafka, Apache NiFi, Qlik (Attunity) and Apache Spark would be a great plus.
  • Operations experience on database/data lake/big data platforms. Ability to detect and troubleshoot issues, including performance, security, data ingestion and data processing, on large data stores.
  • Strong background in system administration on Linux/Unix platforms and exposure or familiarity with Windows platforms.
  • Strong understanding of and experience with infrastructure systems, networking, systems troubleshooting and performance.
  • Strong understanding of and experience in implementing High Availability and Disaster Recovery solutions, including a basic understanding of and experience with OS clustering and failover mechanisms.
  • Strong hands-on experience writing scripts, programs and automation routines in Java/Python/Shell/PowerShell/Scala, with experience using code repositories and CI tools such as GitHub, Microsoft Azure DevOps and Jenkins.
  • Experience with cloud architecture and components, especially on AWS. Strong understanding of DevOps concepts, procedures and processes, including CI/CD, pipelining and cloud automation using Terraform, Ansible, Jenkins, etc.
Preferred Qualifications:
  • Any experience with front-end development would be a great plus.
  • DevOps experience designing custom code for database systems, data lakes and data delivery pipelines, building Kafka producers/consumers and Spark code, and customizing open-source tools such as Apache NiFi would be a great plus.
  • Understanding of Agile principles and practices; experience on Scrum teams is preferable. Ability to work seamlessly with other teams and contribute to the general success of the team.
About S&P Global Market Intelligence: At S&P Global Market Intelligence, we know that not all information is important; some of it is vital. Accurate, deep and insightful. We integrate financial and industry data, research and news into tools that help track performance, generate alpha, identify investment ideas, understand competitive and industry dynamics, perform valuation and assess credit risk. Investment professionals, government agencies, corporations and universities globally can gain the intelligence essential to making business and financial decisions with conviction.

S&P Global Market Intelligence is a division of S&P Global (NYSE: SPGI), which provides essential intelligence for individuals, companies and governments to make decisions with confidence. For more information, visit .

S&P Global is an equal opportunity employer committed to making all employment decisions without regard to race/ethnicity, gender, pregnancy, gender identity or expression, color, creed, religion, national origin, age, disability, marital status (including domestic partnerships and civil unions), sexual orientation, military veteran status, unemployment status, or other legally protected categories, subject to applicable law.

S&P Global has a Securities Disclosure and Trading Policy ("the Policy") that seeks to mitigate conflicts of interest by monitoring and placing restrictions on personal securities holdings and trading. The Policy is designed to promote compliance with global regulations. In some divisions, pursuant to the Policy's requirements, candidates at S&P Global may be asked to disclose securities holdings. Some roles may include a trading prohibition and remediation of positions when there is an effective or potential conflict of interest. Employment at S&P Global is contingent upon compliance with the Policy.
Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: and your request will be forwarded to the appropriate person.

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group)

Job ID: 265490
Posted On: 2021-11-11
Location: Richmond, Virginia, United States