Sr Data Engineer (Snowflake Developer)

Job Description

Scope:

  • Specializes in the architecture, design, and management of data solutions within the Snowflake Data Cloud. 

Our current technical environment:

  • Software: Java, Spring Boot, Gradle, Git, Hibernate, REST APIs, OAuth
  • Application Architecture: Scalable, Resilient, event driven, secure multi-tenant Microservices architecture
  • Cloud Architecture: MS Azure (ARM templates, AKS, HDInsight, Application Gateway, Virtual Networks, Event Hub, Azure AD)
  • Frameworks/Others: Kubernetes, Kafka, Elasticsearch, Spark, Snowflake, MongoDB, Spring Boot, Gradle, Git, Ignite

What you’ll do:

  • Use the comprehensive suite of the Snowflake data platform to design and implement data storage solutions.
  • Make optimal use of Snowflake’s cloud-based infrastructure to ensure efficient data flow and availability.
  • Collaborate effectively with data analysts to support data-driven business decisions across business units.
  • Write and review service descriptions, including relevant measures of service quality, and drive the architecture to deliver on these promises through self-healing, reliable services that require minimal manual intervention.
  • Provide early visibility into technical challenges and mitigate them throughout the project lifecycle.

What we are looking for:

  • Minimum 6-10 years of experience and a Bachelor's degree in computer science, data science, information science, or a related field, or equivalent work experience
  • Must have hands-on experience implementing large-scale data intelligence solutions around the Snowflake data warehouse
  • Must have experience working with Snowflake functions and hands-on experience with Snowflake utilities: stages and file upload features, Time Travel, Fail-safe, stored procedure writing, tasks, Snowpipe, SnowSQL, and Snowpark
  • Focus on ELT: load data into the database and perform transformations in-database
  • Ability to use analytical SQL functions
  • Experience building dimensional data marts, data lakes, and data warehouses
  • Solid experience with streaming services such as Kafka
  • Experience with cloud data warehouse solutions (Snowflake, Azure DW, or Redshift), including data modeling, analysis, and programming
  • Experience with DevOps models utilizing a CI/CD tool
  • Hands-on work in an Azure cloud environment (ADLS, Blob Storage)
  • Advanced SQL queries, scripts, stored procedures, and materialized views
  • Orchestration workflows, Azure Data Factory, and BI tools like Tableau preferred
  • Analyze data models
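Several of the Snowflake features listed above (stages, Snowpipe, tasks, Time Travel, in-database ELT) fit together in one common loading pattern. The sketch below is purely illustrative of that pattern; every object name, URL, and credential is a hypothetical placeholder, not something taken from this posting:

```sql
-- Minimal ELT sketch (all names hypothetical).
-- External stage over an Azure Blob container holding raw files.
CREATE OR REPLACE STAGE raw_stage
  URL = 'azure://myaccount.blob.core.windows.net/raw-data'
  CREDENTIALS = (AZURE_SAS_TOKEN = '...');

-- Snowpipe auto-ingests newly arriving files into a landing table.
CREATE OR REPLACE PIPE orders_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_orders
  FROM @raw_stage/orders/
  FILE_FORMAT = (TYPE = 'JSON');

-- A scheduled task performs the "T" of ELT in-database.
CREATE OR REPLACE TASK transform_orders
  WAREHOUSE = etl_wh
  SCHEDULE = '15 MINUTE'
AS
  INSERT INTO dim_orders
  SELECT raw:order_id::NUMBER,
         raw:amount::FLOAT,
         raw:created_at::TIMESTAMP_NTZ
  FROM raw_orders;

-- Time Travel: query the target table as it was an hour ago.
SELECT COUNT(*) FROM dim_orders AT (OFFSET => -3600);
```

The extract and load steps stay outside the warehouse only up to the stage; all transformation happens in Snowflake itself, which is what the ELT requirement above refers to.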

Skills

  • ETL
  • Database
  • Data Lakes
  • Snowflake Cloud
  • DevOps
  • Azure Data Factory

Education

  • Master's Degree
  • Bachelor's Degree

Job Information

Job Posted Date

Nov 30, 2024

Experience

6 to 10 Years

Compensation (Annual in Lacs)

₹ Market Standard

Work Type

Permanent

Type Of Work

8 hour shift

Category

Information Technology

Copyright © 2022 All Rights Reserved. Saas Talent