We are sorry!

This job has been closed. You will find the job description below as a reminder; it is no longer possible to apply.

Location: Pune
Salary: Open
Employment Type: Permanent
Industry: Technology/Online
Sub-industry: Enterprise Software
Function: Technology

Company Overview

A US-based product engineering company, best known for its postage meters and other mailing equipment and services.

Job Description

• Analyze, design, build, test, and implement integration technology solutions that meet the specifications of a project or service request
• Develop new data integration solutions with a focus on performance, reliability, durability, data quality, and SLA expectations
• Develop solutions targeting data in one or more of the following: S3, Kafka, Snowflake, Spark, Scala, Python, MongoDB, or DynamoDB
• Help implement and develop new Apache Kafka-based data streaming applications
• Participate in design review sessions and ensure all solutions are aligned to pre-defined architectural specifications
• Work with distributed teams to design and develop frameworks, solution accelerators, proofs of concept, and external customer-facing products
• Evaluate and incorporate new technologies into new and existing frameworks and solutions as applicable
• Collaborate with and mentor members of the team and other co-workers
• Be agile and embrace change

Requirements

Must-haves

• Experience developing new data integration solutions targeting data in one or more of the following: S3, Kafka, Snowflake, Spark, Scala, Python, MongoDB, or DynamoDB
• Hands-on experience working on a Java enterprise technical stack
• Experience implementing and developing Apache Kafka-based data streaming applications
• Strong SQL knowledge
• Experience working with distributed teams to design and develop frameworks, solution accelerators, proofs of concept, and external customer-facing products
• 2+ years' experience in an AWS/ETL development environment, with hands-on experience designing and implementing solutions
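As a concrete illustration of the "strong SQL knowledge" expected in data integration work, a common task is deduplicating change events so that only the latest record per key survives. The sketch below is a hypothetical example, not part of the role description; table and column names are invented, and it uses an in-memory SQLite database purely to keep it self-contained:

```python
import sqlite3

# Hypothetical change-event table: several versions of the same record
# arrive over time, and the pipeline must keep only the newest per id.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, payload TEXT, ts INTEGER)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(1, "v1", 100), (1, "v2", 200), (2, "a", 150)],
)

# ROW_NUMBER() over a per-id window ordered by timestamp descending
# marks the most recent event in each partition with rn = 1.
latest = conn.execute(
    """
    SELECT id, payload FROM (
        SELECT id, payload,
               ROW_NUMBER() OVER (PARTITION BY id ORDER BY ts DESC) AS rn
        FROM events
    ) WHERE rn = 1
    ORDER BY id
    """
).fetchall()
print(latest)  # [(1, 'v2'), (2, 'a')]
```

The same window-function pattern works in Snowflake and most other SQL engines named above.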

Good to have
• Apache Kafka or Confluent Platform, and Kafka Connect
• Effective use of standard enterprise tools to develop or implement technical solutions; experience in one or more of the following preferred:
o Talend
o Informatica Power Center / Power Exchange
• AWS data streaming and big data technologies such as Amazon Kinesis, AWS Lambda, Amazon Simple Queue Service (Amazon SQS), Amazon Simple Notification Service (Amazon SNS), and Amazon Simple Workflow Service (Amazon SWF)
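For context on the Kafka Connect item above, sink connectors of this kind are typically driven by a small configuration fragment rather than bespoke code. The sketch below assumes the Confluent S3 sink connector; the connector name, topic, and bucket are placeholders:

```json
{
  "name": "s3-sink-example",
  "config": {
    "connector.class": "io.confluent.connect.s3.S3SinkConnector",
    "topics": "example-topic",
    "s3.bucket.name": "example-bucket",
    "s3.region": "us-east-1",
    "storage.class": "io.confluent.connect.s3.storage.S3Storage",
    "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
    "flush.size": "1000"
  }
}
```

A fragment like this ties together two of the technologies listed in this posting (Kafka and S3) without any application code.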

Required Qualifications & Skills
• This role requires a talented, self-directed individual with a strong work ethic and the following qualifications: UG - B.Tech/B.E. or PG - M.S./M.Tech from REC or another reputed institute