We are sorry!

This job has been closed. You will find the job description below for reference. It is no longer possible to apply.

Location: Pune
Salary: Open
Employment Type: Permanent
Industry: Technology/Online
Sub-industry: Enterprise Software
Function: Technology

Company Overview

The firm is a global technology company providing commerce solutions that power billions of
transactions. Clients around the world, including 90 percent of the Fortune 500, rely on the accuracy and precision delivered by the firm's solutions, analytics, and APIs in the areas of ecommerce fulfillment, shipping and returns; cross-border ecommerce; office mailing and shipping; presort services; and financing.

Job Description

We are looking for a skilled ETL/Snowflake Developer to join a highly integrated, cross-discipline
agile DataOps team responsible for building and supporting modern ETL services across systems,
applications, and Business Intelligence, leveraging cloud data warehouses, SAP, SFDC, and other
enterprise databases.
The Job
• Analyze, develop, and maintain ETL and CDC pipelines in and out of the data warehouse, with a focus
on automation, performance, reliability, durability, data quality, security, and SLA expectations
• Perform data mapping and system design for ETL/CDC workflow solutions
• Adhere to and promote CI/CD best practices, processes, and deliverables in accordance with
modern standards in a DataOps environment
• Work closely with business analysts to implement integration technology solutions that meet
the specifications of a project or service request
• Develop automated data audit and validation processes (see the sketch after this list)
• Participate in architectural design review sessions and ensure all solutions are aligned to predefined specifications
• Provide production support for Data Warehouse issues such as data load problems and
transformation/translation problems
• Ensure accurate and timely data availability to meet business SLAs, especially around critical time
periods
• Document ETL and data warehouse processes and flows
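To illustrate the automated audit and validation work mentioned above, here is a minimal sketch that compares row counts between a staging table and its warehouse target. It assumes the snowflake-connector-python package; the table, warehouse, database, and credential names are hypothetical placeholders, not details from this role.

import os
import snowflake.connector  # assumes the snowflake-connector-python package

def audit_row_counts(conn, source_table, target_table):
    """Compare row counts between source and target and flag mismatches."""
    cur = conn.cursor()
    try:
        cur.execute(f"SELECT COUNT(*) FROM {source_table}")
        source_rows = cur.fetchone()[0]
        cur.execute(f"SELECT COUNT(*) FROM {target_table}")
        target_rows = cur.fetchone()[0]
    finally:
        cur.close()
    if source_rows != target_rows:
        print(f"AUDIT FAIL: {source_table}={source_rows}, {target_table}={target_rows}")
        return False
    print(f"AUDIT OK: {source_rows} rows in both tables")
    return True

if __name__ == "__main__":
    # Credentials come from the environment; all names here are hypothetical.
    conn = snowflake.connector.connect(
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        warehouse="ETL_WH",    # hypothetical warehouse
        database="ANALYTICS",  # hypothetical database
        schema="PUBLIC",
    )
    try:
        audit_row_counts(conn, "STG_ORDERS", "DW_ORDERS")  # hypothetical tables
    finally:
        conn.close()

In practice a check like this would run as a scheduled step after each load, with the result feeding an alerting channel rather than stdout.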

Requirements

• Ability to develop ETL pipelines and scripts in and out of the data warehouse using a combination of
DataStage, HVR, GoldenGate, SnapLogic, Python, and Snowflake's SnowSQL (a minimal CDC-style sketch appears after this list)
• Experience with different source systems such as SAP, HANA, OLFM, and Salesforce preferred
• Strong SQL skills including stored procedures
• Extensive experience in performance tuning, troubleshooting, and debugging solutions
• Experience supporting Data Warehouse and data transformation issues
• Demonstrated experience providing production support during time sensitive situations
• Experience integrating on-premises infrastructure with public cloud platforms (AWS, Azure, Snowflake)
• Experience with AIX/Linux shell scripting
• Experience performing detailed data analysis (i.e., determining the structure, content, and quality
of the data through examination of source systems and data samples)
• Ability to translate BI, reporting, and data transformation requirements
• Ability to design and outline solutions using cloud-based and on-premises technologies
• Ability to understand data pipelines and modern approaches to automating them
• Actively test and clearly document implementations, so others can easily understand the
requirements, implementation, and test conditions
• Expert in source and target system analysis, with prior experience in project analysis and cost
estimation
• Ability to work effectively in a remote team environment
• Excellent written and verbal communication, time management, and presentation skills
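As a rough illustration of the CDC pipeline work described in the requirements, the sketch below applies a batch of change records from a staging table to a target table using Snowflake's MERGE statement, executed through the snowflake-connector-python package. All object names, the OP change-flag column, and the credentials are hypothetical placeholders rather than anything specified by this role.

import os
import snowflake.connector  # assumes the snowflake-connector-python package

# Hypothetical CDC upsert: src.OP marks each change as insert/update ('I'/'U')
# or delete ('D'); table and column names are placeholders.
MERGE_SQL = """
MERGE INTO DW_ORDERS AS tgt
USING STG_ORDERS_CDC AS src
  ON tgt.ORDER_ID = src.ORDER_ID
WHEN MATCHED AND src.OP = 'D' THEN DELETE
WHEN MATCHED THEN UPDATE SET
    tgt.STATUS = src.STATUS,
    tgt.UPDATED_AT = src.UPDATED_AT
WHEN NOT MATCHED AND src.OP <> 'D' THEN INSERT
    (ORDER_ID, STATUS, UPDATED_AT)
    VALUES (src.ORDER_ID, src.STATUS, src.UPDATED_AT)
"""

def apply_cdc_batch():
    """Run one CDC merge batch and report the number of rows affected."""
    conn = snowflake.connector.connect(
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        warehouse="ETL_WH",    # hypothetical warehouse
        database="ANALYTICS",  # hypothetical database
        schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        cur.execute(MERGE_SQL)
        print(f"Rows affected: {cur.rowcount}")
        cur.close()
    finally:
        conn.close()

if __name__ == "__main__":
    apply_cdc_batch()

A single MERGE keeps the upsert-and-delete logic in one set-based statement, which is generally preferable in Snowflake to row-by-row application of change records.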