
Data Platform Engineer

Job Description

  • Bengaluru, Karnataka, India

Strength in Trust

OneTrust is the trust intelligence cloud platform organizations use to transform trust from an abstract concept into a measurable competitive advantage. Organizations globally use OneTrust to enable the responsible use of data while protecting the privacy rights of individuals, implement and report on their cybersecurity programs, make their social impact goals a reality, and create a speak-up culture of trust. Over 14,000 customers use OneTrust's technology, including half of the Global 2000. OneTrust currently ranks #24 on the Forbes Cloud 100 list of top private cloud companies in the world and employs over 2,000 people across North America, South America, Asia, Europe, and Australia.

The Challenge

As a Data Platform Engineer - Platform, you will work in our Business Technology Data and Analytics function to develop a sound data foundation and processes that scale with the company's growth. You will be the subject matter expert for the entire platform technology stack of the Enterprise Data & Analytics team. You will design and implement frameworks for data pipelines, enabling insights from our Product and Corporate Systems for key partners and decision makers at OneTrust. You will also be responsible for enhancing and maintaining our data warehouse in collaboration with business domain experts, analytics, and engineering teams.

This is a fun role for problem solvers who can intuitively anticipate problems and look beyond immediate issues. It is for people who take the initiative to improve both our software and our development infrastructure. In short, we look for people who take pride in their craft and want to be part of creating and defining the team's operating model and contribution to the company. You will be a self-starter, detail and quality oriented, and passionate about having a huge impact at OneTrust. If this role has your name written all over it, please contact us with a resume so that we can explore further.

Your Mission

You will work closely with other team members, such as data architects and business analysts, to understand what the business is trying to achieve, and you will help build best-in-class data platform solutions and architecture for scale.

  • Partner with data engineers, data architects, domain experts, data analysts, and other teams to design and build data flow frameworks in Snowflake using dbt, Airflow, Fivetran, and other ELT tools (a minimal sketch follows this list)
  • Administer, provide production support for, and maintain the analytics technology ecosystem (data warehouse, ETL tools)
  • Build and automate AWS cloud-native technologies, deploy applications, and provision infrastructure
  • Leverage the right tools for the right job to deliver testable, maintainable, and modern data solutions
  • Own and document data pipelines and data lineage
  • Pursue continuous improvement (e.g. better workflows, automation, efficiency)
  • Design, build, and implement automation frameworks for scale that deliver data with measurable quality within SLAs
  • Identify, document and promote best practices
  • Reduce technical debt over time through root-cause identification and resolution of system problems
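
For illustration only, here is a minimal sketch of the kind of data flow framework described above, assuming a recent Airflow 2.x deployment orchestrating dbt models built in Snowflake. The DAG name, schedule, and project path are hypothetical, and the Snowflake connection is assumed to live in dbt's profiles.yml.

    # Hypothetical sketch: an Airflow DAG that runs and tests dbt models in Snowflake.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_dbt_refresh",       # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        # Build the dbt models; profiles.yml supplies the Snowflake connection.
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /opt/dbt/analytics",
        )

        # Test the freshly built models so data quality stays measurable.
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="dbt test --project-dir /opt/dbt/analytics",
        )

        dbt_run >> dbt_test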

You Are

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field
  • 5+ years of overall experience, including 3+ years on very large-scale data warehouse projects
  • Very strong experience in SQL; knowledge of dimensional modeling; experience supporting data warehouses and scaling, optimizing, and performance-tuning ETL pipelines
  • Strong experience administering databases and ETL tools, plus knowledge of basic Unix administration
  • Experience with Python and with manipulating various data formats for extraction and transformation
  • Knowledge of an ETL/orchestration tool such as Airflow is a big plus
  • Preferred: prior experience with Snowflake and with administering ETL/integration tools (e.g. Informatica, Matillion, Fivetran)
  • Experience with automating AWS cloud-native technologies, deploying applications, and provisioning infrastructure
  • Knowledge of microservices and distributed application architecture, such as containers, Kubernetes, and/or serverless technology
  • Ability to work across multiple areas, such as ETL data pipelines, data modeling and design, and writing complex SQL queries
  • Up to date with industry trends and emerging technologies, driving continuous improvement
  • Hands-on experience with data warehouse technologies (Snowflake, Redshift) and big data technologies (e.g. Hadoop, Hive, Spark)
  • Excellent written and verbal communication and interpersonal skills, able to effectively collaborate with technical and business partners

Skills

  • Data Warehousing
  • ETL Tools
  • Python
  • Kubernetes
  • AWS
  • Data Pipelines
  • Data Modeling

Education

  • Master's Degree
  • Bachelor's Degree

Job Information

Job Posted Date

Apr 26, 2024

Experience

5-10 Years

Compensation (Annual in Lacs)

Best in the Industry

Work Type

Permanent

Type Of Work

8-hour shift

Category

Information Technology
