Senior Data Engineer

Job Description

Uniphore is one of the largest B2B AI-native companies—decades-proven, built for scale, and designed for the enterprise. The company drives business outcomes across multiple industry verticals and enables the largest global deployments.
  
Uniphore infuses AI into every part of the enterprise that impacts the customer. We deliver the only multimodal architecture centered on customers that combines Generative AI, Knowledge AI, Emotion AI, workflow automation and a co-pilot to guide you. We understand better than anyone how to capture voice, video and text and how to analyze all types of data.  
  
As AI becomes more powerful, every part of the enterprise that impacts the customer will be disrupted. We believe the future will run on the connective tissue between people, machines and data: all in the service of creating the most human processes and experiences for customers and employees.   

Position Summary

We're seeking a highly skilled Sr. Data Engineer to join our team and play a crucial role in delivering cutting-edge data-driven solutions.

• The ideal candidate will possess a deep understanding of data engineering principles and proficiency in Python (Scala is good to have), cloud technologies (AWS/GCP/Azure), Databricks, SQL, ETL processes, and data warehousing.

You will collaborate closely with cross-functional teams, including product managers, software engineers, and data scientists, to ensure the seamless integration of machine learning capabilities into our solutions.

Key Responsibilities

Data and Transformation:

• Design and implement ETL pipelines to extract, transform, and load data from various sources, including structured and unstructured data.

• Ensure data integrity, quality, and security throughout the data lifecycle.

• Design, develop, and optimize RAG (Retrieval-Augmented Generation) pipelines to facilitate effective information retrieval and generation in conversational AI systems.

• Collaborate with software engineers to integrate machine learning models into production systems, ensuring scalability, reliability, and performance.

• As a contributor to our Conversational AI efforts, play a pivotal role in architecting, integrating, and deploying state-of-the-art models to power our conversational AI product suite.

Software Engineering

• Strong knowledge and hands-on experience with APIs, scripting, microservice architecture, and SQL.

• Contribute to software development lifecycle (SDLC) methodologies and promote best practices for backend and data engineering.

• Participate in code reviews to ensure code quality, readability, and maintainability.

• Design and develop robust APIs and SDKs (if applicable) for seamless data integration and access.

• Demonstrate strong scripting skills (Python, shell scripting, etc.) to automate processes and workflows.

Data Warehousing And Architecture

• Architect, implement, and maintain robust data warehouse solutions leveraging cloud technologies.

• Ensure data integrity, security, and scalability of the data infrastructure.

• Collaborate with data scientists and analysts to design data models that support business intelligence and advanced analytics.

Agile And Scrum Collaboration

• Work within agile development methodologies, including Scrum, to deliver iterative data solutions.

• Participate regularly in sprint planning, daily standups, and retrospectives to ensure project alignment and continuous improvement.

Qualifications

• Bachelor's degree/BE in Computer Science, Data Science, Engineering, or a related field; master's degree preferred.

• 4+ years of hands-on experience as a Data Engineer, with a proven track record of building and managing data pipelines and data warehouses.

Must-Have Skills

• Advanced proficiency in Python for data manipulation and ETL development.

• In-depth knowledge of cloud services; Databricks experience is good to have.

• Extensive experience with Salesforce data structures and API integration.

• Strong SQL skills and experience designing and optimizing relational databases.

• Expertise in relational databases (MSSQL/MySQL/Postgres).

• Strong understanding of data modeling, data warehousing, and data integration concepts.

• Experience with building and deploying machine learning models in production environments.

• Understanding of web services (REST, SOAP) and Internet architecture principles.

Additional Attributes

• Excellent problem-solving and analytical abilities.

• Strong communication and interpersonal skills.

• Self-driven, proactive, and adaptable to evolving project requirements.

• Team-oriented mindset and ability to thrive in a collaborative environment.

Location preference:

India - Bangalore, India - Chennai

Skills

  • Data Engineering
  • ETL
  • Data Manipulation
  • Python
  • Cloud services
  • REST
  • SOAP

Education

  • Master's Degree
  • Bachelor's Degree

Job Information

Job Posted Date

Nov 20, 2024

Experience

4 to 8 Years

Compensation (Annual in Lacs)

Best in the Industry

Work Type

Permanent

Type Of Work

8 hour shift

Category

Information Technology