We are sorry!

This job has been closed. You will find below the job description as a reminder. It is no longer possible to apply.

Location: Pune
Salary: Open
Employment Type: Permanent
Sub-industry: IT Solutions
Function: Technology

Company Overview

Opportunity with a US-based analytics company in Pune for a Big Data Engineer (Azure).

Job Description

Title: Big Data Engineer with Cloud
Location: Baner, Pune
Key Responsibilities:
• Selecting and integrating the Big Data tools and frameworks required to provide the requested capabilities
• Implementing ingestion and ETL/ELT processes
• Monitoring performance and advising on any necessary infrastructure changes
Required Experience, Skills and Qualifications:
• Must have hands-on experience with Hadoop tools/technologies such as Spark (strong Spark skills), MapReduce, Hive, and HDFS
• Hands-on expertise with, and an excellent understanding of, big data tools such as Sqoop, Spark Streaming, Kafka, and NiFi
• Proficiency in at least one programming language (Scala, Python, or Java), with 4 years of total experience
• Must have experience with cloud infrastructure such as MS Azure, Data Lake, Databricks, etc.
• Good working knowledge of NoSQL databases (MongoDB, HBase, Cassandra)
• Must have experience designing, developing, and deploying big data projects into production
• Has implemented complex projects dealing with considerable data volumes (TB/PB) in a production environment
• Hortonworks (HDPCA/HDPCD/HDPCD-Spark) or Cloudera certification is an added
advantage
• Bachelor's degree or higher in a quantitative/technical field (e.g. Computer Science,
Statistics, Engineering) and software development experience with proven hands-on
experience in Big Data technologies
• Experience in developing/architecting environments in the Hadoop ecosystem using HDP
and HDF
• Demonstrated strength in data modelling and ETL development
• Experience in designing and implementing an enterprise data lake
• Experience in Big Data Management and Big Data Governance
• Some experience with Kubernetes, Docker containers etc.
Additional Skills:
• Exposure to one or more big data management tools such as Talend, Informatica (IBD), Zaloni Data Platform, etc.
• Exposure to data cleansing/data wrangling tools such as Trifacta, etc.
• Exposure to big data visualization tools such as Tableau, Qlik, etc.