We are sorry!

This job has been closed. The job description is provided below for reference; it is no longer possible to apply.

Location: Pune
Salary: Open
Employment Type: Permanent
Sub-industry: IT Solutions
Function: Healthcare R&D

Company Overview

A services company based in Pune specializing in chatbots, Artificial Intelligence, Machine Learning, Deep Learning, Big Data Analytics, Predictive Analytics, Data Mining, Data Science, Decision Science, and Data Visualization.

Job Description

Solutioning –
Conceptualize and develop machine learning solutions, and identify opportunity areas with our clients by demonstrating relevant, credible data mining solutions and paradigms. This will involve:
Design the solution architecture.
Design the solution and choose the framework best suited to the given problem.
Write business-standard code to solve the problem.
Identify the right intents and entities wherever needed.
Customize the code and build on top of the existing framework to address client-specific requirements.
Optimize the code to reduce turnaround time.
Expose APIs as required to interact with other modules such as the chatbot or backend services.
Dockerize the model for deployment in the client's environment.
Provide thought leadership and machine-learning best practices to Abzooba customers.
Identify the specific capabilities, practices, tools, and people the organization will need to develop.
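As an illustration of the intent-and-entity step above, the following is a minimal rule-based sketch in Python. The intent names, patterns, and the date entity here are hypothetical placeholders for demonstration only, not part of any specific chatbot framework; production systems would typically use a trained NLU model instead.

```python
import re

# Hypothetical intents for a chatbot pipeline, each matched by a regex.
INTENT_PATTERNS = {
    "book_appointment": re.compile(r"\b(book|schedule)\b.*\bappointment\b", re.I),
    "check_status": re.compile(r"\b(status|track)\b", re.I),
}

# A single illustrative entity type: relative or weekday dates.
DATE_ENTITY = re.compile(
    r"\b(today|tomorrow|monday|tuesday|wednesday|thursday|friday)\b", re.I
)

def classify(utterance: str) -> dict:
    """Return the first matching intent plus any date entities found."""
    intent = next(
        (name for name, pat in INTENT_PATTERNS.items() if pat.search(utterance)),
        "fallback",  # no pattern matched
    )
    entities = [m.group(0).lower() for m in DATE_ENTITY.finditer(utterance)]
    return {"intent": intent, "entities": entities}

print(classify("Please book an appointment for tomorrow"))
# {'intent': 'book_appointment', 'entities': ['tomorrow']}
```

In practice the resulting classifier would sit behind an API (e.g. a small web service) and be packaged into a Docker image, matching the API-exposure and Dockerization steps listed above.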

Requirements

Required Experience, Skills and Qualifications:

We are looking for an energetic and self-driven analytical professional with the following technical competencies:

At least 3 years of experience working as an analytics professional.
At least 3 years of experience designing, building, and operationalizing large-scale enterprise data solutions and applications using one or more GCP data and analytics services.
Working knowledge of tools such as Docker, Jenkins, Kubeflow, and Kubernetes for automating workflows and managing the configuration of deployed workloads is a must.
Experience working with Cloud Dataproc, Cloud Dataflow, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Storage, Cloud Functions, and GitHub.
Experience working with Google Cloud is an added advantage; Google Professional Cloud Developer certification is a plus.
Experience implementing data services, including storage, integration, and APIs.
Experience writing software in Python is a must.
Strong fundamentals in data structures, algorithms, problem solving, and object-oriented programming.
Exposure to deep learning algorithms and experience with TensorFlow.
Agile project planning and project management skills are a plus.