Job Description

A bit about us:

PulsePoint is a leading technology company that uses real-world data in real-time to optimize campaign performance and revolutionize health decision-making. Leveraging proprietary datasets and methodology, PulsePoint targets healthcare professionals and patients with an unprecedented level of accuracy—delivering unparalleled results to the clients we serve. The company is now a part of Internet Brands, a KKR portfolio company and owner of WebMD Health Corp.

Data Engineer

PulsePoint's Data Engineering team plays a key role in our technology company, which is experiencing exponential growth. Our data pipeline processes over 80 billion impressions a day (>20 TB of data, 220 TB uncompressed). This data is used to generate reports, update budgets, and drive our optimization engines. We do all this while running against extremely tight SLAs, providing stats and reports as close to real time as possible.

The most exciting part about working at PulsePoint is the enormous potential for personal and professional growth. We are always seeking new and better tools to help us meet challenges such as adopting proven open-source technologies to make our data infrastructure more nimble, scalable and robust. Some of the cutting-edge technologies we have recently implemented are Kafka, Spark Streaming, Presto, Airflow, and Kubernetes.

What you'll be doing:

  • Design, build, and maintain reliable and scalable enterprise-level distributed transactional data processing systems for scaling the existing business and supporting new business initiatives

  • Optimize jobs to utilize Kafka, Hadoop, Presto, Spark, and Kubernetes resources in the most efficient way

  • Monitor and provide transparency into data quality across systems (accuracy, consistency, completeness, etc.)

  • Increase accessibility and effectiveness of data (work with analysts, data scientists, and developers to build/deploy tools and datasets that fit their use cases)

  • Collaborate within a small team with diverse technology backgrounds

  • Provide mentorship and guidance to junior team members

Team Responsibilities:

  • Ingest, validate, and process internal and third-party data

  • Create, maintain and monitor data flows in Spark, Hive, SQL and Presto for consistency, accuracy and lag time

  • Maintain and enhance the framework for jobs (primarily aggregate jobs in Spark and Hive)

  • Create different consumers for data in Kafka using Spark Streaming for near-real-time aggregation (see the sketch after this list)

  • Tool evaluation/selection/implementation

  • Backups/Retention/High Availability/Capacity Planning

  • Review and approve database DDL, Hive framework jobs, and Spark Streaming jobs to make sure they meet our standards
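
For illustration, here is a minimal PySpark Structured Streaming sketch of the kind of Kafka consumer described above; the broker address, topic, and event fields are hypothetical assumptions, not PulsePoint's actual code.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F
    from pyspark.sql.types import StructType, StructField, StringType, LongType

    spark = SparkSession.builder.appName("impression-aggregator").getOrCreate()

    # Hypothetical schema for an ad-impression event.
    schema = StructType([
        StructField("campaign_id", StringType()),
        StructField("impressions", LongType()),
        StructField("event_time", StringType()),
    ])

    # Read raw events from a Kafka topic (broker and topic names are assumptions).
    raw = (spark.readStream
           .format("kafka")
           .option("kafka.bootstrap.servers", "kafka:9092")
           .option("subscribe", "impressions")
           .load())

    # Parse the JSON payload and convert the timestamp column.
    events = (raw.selectExpr("CAST(value AS STRING) AS json")
              .select(F.from_json("json", schema).alias("e"))
              .select("e.*")
              .withColumn("event_time", F.to_timestamp("event_time")))

    # Aggregate impressions per campaign in 1-minute windows for near-real-time stats.
    agg = (events
           .withWatermark("event_time", "5 minutes")
           .groupBy(F.window("event_time", "1 minute"), "campaign_id")
           .agg(F.sum("impressions").alias("impressions")))

    # Write the rolling aggregates; in production these would land in a warehouse
    # table rather than the console.
    (agg.writeStream
        .outputMode("update")
        .format("console")
        .start()
        .awaitTermination())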

Technologies We Use:

  • Airflow - job scheduling (see the DAG sketch after this list)

  • Docker - packaged container images with all dependencies

  • Graphite/Beacon - monitoring data flows

  • Hive - SQL data warehouse layer for data in HDFS

  • Kafka - distributed commit-log storage

  • Kubernetes - distributed cluster resource manager

  • Presto - fast parallel data warehouse and data federation layer

  • Spark Streaming - near-real-time aggregation

  • SQL Server - reliable OLTP RDBMS

  • GCP BigQuery - cloud data warehouse
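
As a rough sketch of how the scheduling layer ties these pieces together, the minimal Airflow DAG below submits an hourly aggregate job; the DAG name, schedule, and job path are hypothetical, not actual PulsePoint jobs.

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="hourly_impression_rollup",   # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@hourly",
        catchup=False,
        default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
    ) as dag:
        # Submit a (hypothetical) Spark aggregate job. In practice this could also
        # be a KubernetesPodOperator running a packaged Docker image on the cluster.
        rollup = BashOperator(
            task_id="spark_rollup",
            bash_command="spark-submit /opt/jobs/impression_rollup.py",
        )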

Requirements

  • 5+ years of software engineering experience

  • Fluency in Python; experience with Scala/Java is a huge plus (polyglot programmers preferred!)

  • Hive experience

  • Proficiency in Linux

  • Strong understanding of RDBMSs and SQL

  • Passion for engineering and computer science around data

  • Willing and able to work East Coast U.S. hours (9am-6pm EST)

  • Willingness to participate in 24x7 on-call rotation

  • Knowledge of and exposure to distributed production systems (e.g., Hadoop) is a huge plus

  • Knowledge of and exposure to cloud migration is a plus

Skills

  • Python
  • Java
  • Linux
  • RDBMS
  • Cloud Migration
  • Docker

Education

  • Master's Degree
  • Bachelor's Degree

Job Information

Job Posted Date

Sep 26, 2024

Experience

5-10 Years

Compensation (Annual in Lacs)

₹ Market Standard

Work Type

Permanent

Type Of Work

8 hour shift

Category

Information Technology
