We are sorry!

This job has been closed and is no longer accepting applications. The job description is preserved below for reference.

Location: Pune
Salary: Open
Employment Type: Permanent
Industry: Technology/Online
Sub-industry: Enterprise Software
Function: Technology

Company Overview

Hiring for a leading American technology company best known for its postage meters and other mailing equipment and services, which has since expanded into e-commerce, software, and other technologies.

Job Description

• Must have 6+ years of experience in SQL and big data warehouse development.
• Create logical and physical data models using data modeling tools; create and maintain OLAP/OLTP Snowflake warehouses.
• Provide and enforce best practices and governance standards for normalization and data classification in the delivery of high-quality data solutions for the Enterprise Data Warehouse, business-group-focused BI platforms (data marts), metadata reporting applications, and OLAP systems.
• Conduct modeling sessions with project teams across all business groups to gather and define data requirements for the Enterprise Data Model.
• Lead discussions with stakeholders to transform subscriber data requirements into a robust, canonical logical model and data dictionary, with clearly written business definitions for logical entities and attributes that describe enterprise data in a way that is consumable by a wide business audience.
• Experience with the Snowflake cloud data platform.
• Strong experience building reports using Power BI/Tableau.
• Experience with reporting tools and data sources, including SQL databases, Power BI/Tableau, and data warehouses.
• Experience with MS SQL scripting, PL/SQL, Oracle Database, Teradata (TD), and Snowflake.
• Experience with big data and event-streaming concepts and visualization (Kafka, ELK, Splunk, Hadoop, Hive, Graphite, Prometheus).
• Enforce architectural standards and influence Developers, Technical Leads, Project Managers, and Systems Analysts regarding best practices for Enterprise Data Warehouse design.
• Perform light Data Analysis and Data Profiling as needed.
• Good hands-on experience with batch processing and streaming data analytics (see the PySpark sketch after this list).
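
For concreteness, the minimal PySpark sketch below illustrates the kind of batch and streaming work this description refers to: the same event data is aggregated once as a batch and consumed continuously from Kafka. It is an illustration only, not part of the role definition; the S3 paths, broker address, and topic name are invented placeholders, and the Kafka source additionally assumes the spark-sql-kafka connector package is available.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("warehouse-demo").getOrCreate()

# Batch: aggregate historical events already landed in a staging area.
# All paths below are hypothetical placeholders.
batch_events = spark.read.parquet("s3://example-bucket/staging/events/")
daily_counts = batch_events.groupBy(
    F.to_date("event_ts").alias("event_date")
).count()
daily_counts.write.mode("overwrite").parquet("s3://example-bucket/marts/daily_counts/")

# Streaming: consume the same events from Kafka as they arrive.
# Broker and topic names are hypothetical.
stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
)

# Kafka delivers raw bytes; cast the payload to a string for downstream parsing.
parsed = stream.selectExpr("CAST(value AS STRING) AS payload", "timestamp")

# Land the stream as Parquet; structured streaming requires a checkpoint location.
query = (
    parsed.writeStream.format("parquet")
    .option("path", "s3://example-bucket/streaming/events/")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/events/")
    .start()
)
```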
Required Qualifications & Skills  
• Bachelor's or Master's degree in Statistics, Math, Computer Science, Management Information Systems, Data Analytics, or another quantitative discipline, or equivalent work experience
• At least 5 years of experience in SQL and warehouse development
• Experience with big data and event-streaming concepts and visualization (Kafka, ELK, Splunk, Hadoop)
• Scripting languages (Python, PySpark)
• Familiarity with RESTful APIs and general web services
• Experience with any one ETL tool is a plus (Talend, Informatica, SSIS, Pentaho)
• Database procedural languages (PL/pgSQL, PL/SQL, T-SQL)
• Data Warehouse (Kimball) and OLAP Modeling
• General relational modeling (OLTP, OLAP, 3NF)
• Advanced experience writing ad-hoc SQL queries (Postgres, Microsoft SQL Server, SnowSQL); a sample query appears after this list
• Advanced Excel skills, including functions and pivot tables
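
As a concrete example of the ad-hoc SQL skill listed above, here is a minimal, hedged Python sketch that runs a Kimball-style star-join aggregate on Snowflake via the snowflake-connector-python package. The connection parameters and the table and column names (fact_orders, dim_date, order_amount, etc.) are hypothetical placeholders, not details from this posting.

```python
import snowflake.connector

# Connection parameters are placeholders; in practice they would come from
# a secrets manager rather than being hard-coded.
conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",
    warehouse="ANALYTICS_WH",
    database="EDW",
    schema="MARTS",
)

# A typical Kimball-style ad-hoc query: join a fact table to a date
# dimension and aggregate by month. Table and column names are hypothetical.
SQL = """
    SELECT d.calendar_month,
           SUM(f.order_amount) AS total_sales
    FROM   fact_orders f
    JOIN   dim_date d ON f.date_key = d.date_key
    WHERE  d.calendar_year = 2023
    GROUP BY d.calendar_month
    ORDER BY d.calendar_month
"""

try:
    cur = conn.cursor()
    # execute() returns the cursor, which is iterable row by row.
    for month, total in cur.execute(SQL):
        print(month, total)
finally:
    conn.close()
```

Joining facts to a conformed date dimension rather than filtering on a raw timestamp column is the usual dimensional-modeling choice here, since it keeps calendar logic in one place.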
