8-10 years of overall experience, including 6+ years in Data Engineering.
Building and maintaining data pipelines using technologies such as Cloud Data Fusion, Cloud Dataflow, Azure Data Factory, Azure Stream Analytics, or Dell Boomi (any 2).
Storing and processing data using technologies such as BigQuery, Cloud Dataproc, Azure Synapse Analytics, Snowflake, Redshift, or Cosmos DB (any 3).
Managing data storage and access using technologies such as Cloud Storage, Azure Blob Storage, or Azure Data Lake Storage (any 1).
Managing and deploying data infrastructure on GCP or Azure using technologies such as Cloud Composer, Azure Kubernetes Service, or Azure Container Instances (any 2).
Implementing security and access controls for data using technologies such as Cloud Identity and Access Management (IAM), Azure Active Directory, or Azure Private Link.
Well-versed in data services, integration, ingestion, ELT/ETL, and data governance.
Experience with security and metadata-driven development.
Expertise in RDBMS (relational database management systems): writing complex SQL logic, database/query optimisation, and data modelling, with the ability to manage high data volumes for mission-critical applications.
Strong programming skills in Python and PySpark (see the PySpark sketch after this list).
Data pipeline lifecycle management and DataOps (see the Cloud Composer scheduling sketch after this list).
Expertise in Azure DevOps or Google Cloud Build (any 1).
Preference given to candidates holding an Azure or GCP data engineer certification (any 1).
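
To make the Python/PySpark and SQL expectations above concrete, here is a minimal, illustrative sketch of a typical pipeline step; the bucket paths, view name, and column names are hypothetical, not taken from any specific project:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("daily_orders_rollup").getOrCreate()

    # Ingest raw order events (hypothetical bucket path and CSV layout).
    orders = spark.read.option("header", True).csv("gs://raw-bucket/orders/*.csv")
    orders.createOrReplaceTempView("orders")

    # Express the transformation as SQL: daily revenue and order count per region.
    daily = spark.sql("""
        SELECT order_date,
               region,
               SUM(CAST(amount AS DOUBLE)) AS revenue,
               COUNT(*) AS order_count
        FROM orders
        GROUP BY order_date, region
    """)

    # Write partitioned Parquet for downstream consumers such as BigQuery.
    daily.write.mode("overwrite").partitionBy("order_date").parquet(
        "gs://curated-bucket/daily_orders/"
    )

    spark.stop()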
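
Similarly, for the DataOps and Cloud Composer points, a minimal sketch of how such a job could be scheduled as an Airflow DAG in a Composer environment; the DAG id, Dataproc cluster name, region, and script path are hypothetical:

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_orders_rollup",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
        default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
    ) as dag:
        # Submit the PySpark rollup sketched above to a Dataproc cluster.
        run_rollup = BashOperator(
            task_id="run_rollup",
            bash_command=(
                "gcloud dataproc jobs submit pyspark "
                "gs://jobs-bucket/daily_orders_rollup.py "
                "--cluster=etl-cluster --region=us-central1"
            ),
        )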