Job Description

  • Hyderabad/Bengaluru, India (Hybrid: 3 Days/Week in Office)

Responsibilities:

  • Collaborate with stakeholders to develop a data strategy that meets enterprise needs and industry requirements.
  • Create an inventory of the data necessary to build and implement a data architecture.
  • Envision data pipelines and how data will flow through the data landscape.
  • Evaluate current data management technologies and determine what additional tools are needed.
  • Determine upgrades and improvements to current data architectures.
  • Design, document, build, and implement database architectures and applications; hands-on experience building high-scale OLAP systems is required.
  • Build data models for database structures, analytics, and use cases.
  • Develop and enforce database development standards, backed by solid DB/query optimization capabilities.
  • Integrate new systems and capabilities such as security, performance, scalability, governance, reliability, and data recovery.
  • Research new opportunities and create methods to acquire data.
  • Develop measures that ensure data accuracy, integrity, and accessibility.
  • Continually monitor, refine, and report on data management system performance.

Required Qualifications and Skillset:

  • Extensive knowledge of the Azure and GCP clouds and the DataOps data ecosystem (very strong in one of the two clouds and proficient in the other).
  • Hands-on expertise in systems like Snowflake, Synapse, SQL DW, BigQuery, and Cosmos DB (expertise in any three is a must).
  • Experience with Azure Data Factory, Dataiku, Fivetran, and Google Cloud Dataflow (any two).
  • Hands-on experience with orchestration services/technologies such as Apache Airflow, Cloud Composer, Oozie, Azure Data Factory, and Cloud Data Fusion (expertise in any two is required).
  • Well-versed in data services, integration, ingestion, ELT/ETL, data governance, security, and metadata-driven development.
  • Expertise in RDBMS (relational database management systems): writing complex SQL logic, DB/query optimization, data modelling, and managing high data volumes for mission-critical applications.
  • Strong programming skills in Python and PySpark.
  • Clear understanding of prevailing industry best practices for data.
  • Preference for candidates holding an Azure or GCP architect certification (either one suffices).
  • Strong networking and data security experience.

Awareness of the Following:

  • Understanding of application development (full stack)
  • Experience with open-source tools like Kafka, Spark, Splunk, Superset, etc.
  • Good understanding of the analytics platform landscape, including AI/ML
  • Experience with any data visualization tool, such as Power BI, Tableau, Qlik, or QuickSight

About Us

Gramener is a design-led data science company. We build custom Data & AI solutions that help solve complex business problems with actionable insights and compelling data stories. We partner with enterprise data and digital transformation teams to improve the data-driven decision-making culture across the organization. Our open-standard, low-code platform, Gramex, rapidly builds engaging Data & AI solutions across multiple business verticals and use cases. Our solutions and technology have been recognized by analysts such as Gartner and Forrester and have won several awards.

We Offer You:

  • A chance to try new things and take risks.
  • Meaningful problems you'll be proud to solve.
  • People you will be comfortable working with.
  • A transparent and innovative work environment.

Skills

  • AWS
  • Azure
  • GCP
  • Python
  • Azure Data Factory
  • RDBMS

Education

  • Master's Degree
  • Bachelor's Degree

Job Information

Job Posted Date

Dec 05, 2023

Experience

4 to 8 Years

Compensation (Annual in Lacs)

Best in the Industry

Work Type

Permanent

Type Of Work

8 hour shift

Category

Information Technology
