Bigdata Architect

Cognizant
Melbourne, Victoria, Australia
Permanent, Full-time
Competitive
Bigdata Architect
Position Summary:
We are looking for a Big Data Architect and Engineer who can design and deliver potential use cases for Big Data and cloud applications. You will be responsible for the design, architecture and implementation of complex information systems using various Big Data components, cloud services, batch (ETL/ELT) pipelines and streaming APIs. You will clean, transform and analyse large volumes of raw data from various systems using Spark and Google Cloud to provide ready-to-use data to feature developers and business analysts. You should have knowledge of real-time and batch data processing on Big Data and cloud platforms, and you will collaborate with enterprise and business architects to define Big Data implementations.
Mandatory Skills:
  • Hands-on experience with Spark (Scala), covering the following APIs (see the illustrative sketch after this list):
  • Spark RDD API
  • Spark SQL DataFrame API
  • Spark Streaming API
  • Hands-on programming skills in Scala, Python and shell scripting
  • Exposure to RDBMS and NoSQL stores such as HBase
  • Hands-on experience with version control systems such as Git and Bitbucket
  • Sound knowledge of Spark query tuning and performance optimization
  • Experience working with Cloudera and HDFS
  • Deep understanding of distributed systems (e.g. CAP theorem, partitioning, replication, consistency and consensus)
  • Knowledge of real-time streaming tools such as Kafka is an advantage
  • Knowledge of Google Cloud Platform is an advantage
  • Awareness of Agile methodology and tools such as JIRA is an added advantage
  • Strong communicator
  • Ability to manage stakeholders' expectations.
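The sketch below is a minimal, hypothetical illustration of the kind of Spark (Scala) work listed above: loading raw data with the DataFrame API, aggregating it, and dropping down to the RDD API for a lower-level step. The file path and column names (account_id, amount) are placeholders for the example, not details from this posting.

// Illustrative only: assumes a Spark 2.x/3.x dependency on the classpath.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object TransactionSummary {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("TransactionSummary")
      .getOrCreate()

    // DataFrame API: clean raw records and aggregate per account.
    // Input path and columns are hypothetical.
    val raw = spark.read.option("header", "true").csv("/data/raw/transactions.csv")
    val summary = raw
      .filter(col("amount").isNotNull)
      .withColumn("amount", col("amount").cast("double"))
      .groupBy("account_id")
      .agg(sum("amount").alias("total_amount"), count("*").alias("txn_count"))

    // RDD API: the same data remains accessible as an RDD when finer control is needed.
    val topAccounts = summary.rdd
      .map(row => (row.getAs[String]("account_id"), row.getAs[Double]("total_amount")))
      .sortBy(_._2, ascending = false)
      .take(10)

    topAccounts.foreach(println)
    spark.stop()
  }
}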
Duties and Responsibilities:
  • Provide leadership: drive outcomes with upstream teams, engage with peer engineers and lead them to solutions that enable connectivity, whether through code or system changes on their side or simply new functional accounts.
  • Design and build analytics projects covering tasks such as data ingestion, data transformation, data quality, data engineering and data governance on Big Data platforms.
  • Design, architect and implement complex information systems using various Big Data components, batch (ETL/ELT) pipelines and streaming APIs, including complex data model design.
  • Design and create Scala/Spark jobs for data transformation and aggregation.
  • Process large volumes of structured and semi-structured data and ingest it into the data lake (a streaming ingestion sketch follows this list).
  • Own the full SDLC, including analysis, design, development, implementation, support and enhancement.
  • Understand banking remediation issues and provide solutions to handle them on the Big Data platform.
  • Work on defect fixes, enhancements and production support for different applications.
  • Develop SQL queries to perform data analysis and data validation.
  • Develop migration scripts to deploy code from one environment to another.
  • Provide technical support to business systems analysts and customers to help resolve business incidents/tickets.
  • Research new tools, technologies and practices to improve system efficiency.
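As an illustration of the ingestion duties above, the hypothetical sketch below reads events from Kafka with Spark Structured Streaming, applies a basic data-quality filter and writes the results to a data-lake path. The broker address, topic name, schema and paths are assumptions for the example, and it presumes the spark-sql-kafka connector is available on the classpath.

// Illustrative only: Kafka broker, topic, schema and paths are placeholders.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types._

object EventIngestion {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("EventIngestion").getOrCreate()

    // Hypothetical event schema for the JSON payload on the topic.
    val eventSchema = new StructType()
      .add("event_id", StringType)
      .add("account_id", StringType)
      .add("amount", DoubleType)
      .add("event_time", TimestampType)

    // Read the Kafka topic as a stream and parse the JSON value column.
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "transactions")
      .load()
      .select(from_json(col("value").cast("string"), eventSchema).alias("e"))
      .select("e.*")
      .filter(col("amount").isNotNull) // basic data-quality check

    // Append validated records to the data lake with checkpointing.
    val query = events.writeStream
      .format("parquet")
      .option("path", "/datalake/transactions")
      .option("checkpointLocation", "/datalake/_checkpoints/transactions")
      .outputMode("append")
      .start()

    query.awaitTermination()
  }
}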
Salary Range: >$100,000




