Company Overview
Hiring for a leading American technology company best known for its postage meters and other mailing equipment and services, with expansions into e-commerce, software, and other technologies.
This job has been closed. You will find below the job description as a reminder. It is no longer possible to apply.
The Job
• Creating data domains in the big data warehouse and preparing curated datasets for data scientists and BI analysts.
• Developing and executing data quality frameworks.
• You will be working closely with the SME in the US, who will share the business knowledge required to create these domains and datasets.
• Diving deep into big data, using the big data ecosystem to perform data aggregation, data curation, and ETL/ELT.
• Work with a highly integrated, cross-discipline agile team; you will be responsible for building and supporting modern big data engineering solutions leveraging the enterprise data lake across the Commerce Services & Sendtech business units.
• Stay hands-on and up to date with the latest big data analytics technology stack, preferably on AWS Cloud.
Must have:
• Big data ecosystem, with core competency in SQL optimization and efficient joins: Snowflake, Hadoop, Hive, PySpark, Spark, Pig, Redshift
• AWS services for data, big data & serverless: AWS Data Pipeline, EMR, Lambda, S3
• Hands-on experience delivering production solutions, with an understanding of scalability, reliability, uptime, and cost optimization.
• Some exposure to GitLab, Jenkins, and Jira for Agile development & CI/CD
• Participate in design review sessions and ensure all solutions are aligned with pre-defined architectural specifications
• Work with distributed teams to design and develop frameworks, solution accelerators, proofs of concept, and external customer-facing products
• Evaluate and incorporate new technologies into new and existing frameworks and solutions as
applicable
• Collaborate with and mentor members of the team and other coworkers
• Be agile and embrace change
Good to have
• Automation and orchestration of ETL jobs using scripts, pipelines, and workflows.
• AWS data streaming and big data technologies such as Kafka, Amazon Kinesis, AWS Lambda, Amazon Simple Queue Service (Amazon SQS), Amazon Simple Notification Service (Amazon SNS), and Amazon Simple Workflow Service (Amazon SWF)
Qualification & work experience
• UG - B.Tech/B.E. OR PG – M.S. / M.Tech from REC or any other reputed institute
• 1+ years' experience with Snowflake queries, procedures, and functions
• 1-2 years' experience in an AWS/ETL development environment, with hands-on experience designing and implementing solutions.
• 3+ years in data analyst and data engineering roles.
• 2+ years of industry experience working in a production environment