Senior Data Engineer

Job Description

  • Chennai

About Zocket

Founded in 2021, Zocket leverages GenAI to enable businesses to effortlessly create and launch ads in seconds. With Zocket, businesses can manage ads across channels like Facebook, Instagram, Google, and WhatsApp on a single platform.

Zocket's proprietary, world-class tech stack is built with cutting-edge machine learning and artificial intelligence capabilities. Zocket was founded by IIM-graduate, second-time entrepreneurs and has raised $3.5 million to date from marquee investors, including Kalaari Capital.

Zocket's mission is to transform how businesses leverage digital marketing to acquire new customers and grow their business. Zocket's primary markets are the USA, Canada, and India.

Zocket is proud to be building for the globe from India.

Key Responsibilities

  • Build ETL data pipelines
  • Develop data set processes
  • Apply strong analytical skills when working with unstructured datasets
  • Evaluate business needs and objectives
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery
  • Interpret trends and patterns
  • Work with data and analytics experts to strive for greater functionality in our data system
  • Build algorithms and prototypes
  • Explore ways to enhance data quality and reliability
  • Work with the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.

Key Requirements:

  • At least 3-5 years of proven experience as a data engineer, software developer, or in a similar role
  • Bachelor's or Master's degree in data engineering, big data analytics, computer engineering, or a related field.
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
  • Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
  • Experience with Azure and AWS cloud services: EC2, EMR, RDS, Redshift
  • Experience with BigQuery
  • Experience with stream-processing systems: Storm, Spark-Streaming, etc.
  • Experience with languages: Python, Java, C++, Scala, SQL, R, etc.
  • Hands-on experience with Hevo and Presto.
  • Experience creating ad pipelines for ad platforms (Meta platforms are a bonus)


Skills

  • Data Engineering
  • Big Data Analytics
  • Kafka
  • Relational Databases
  • Python
  • Java
  • C++
  • AWS
  • Azure

Education

  • Master's Degree
  • Bachelor's Degree

Job Information

Job Posted Date

May 09, 2024

Experience

3 to 5 Years

Compensation (Annual in Lacs)

Market Standard

Work Type

Permanent

Type Of Work

8-hour shift

Category

Information Technology
