The candidate should have extensive production experience (2-3 years) with GCP; experience with other cloud platforms would be a strong bonus.
Strong background in data engineering, with 3-6 years of experience in Big Data technologies including Hadoop, NoSQL, Spark, and Kafka (a minimal Spark sketch follows this section).
Exposure to enterprise application development is a must.
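As a rough, non-authoritative illustration of the Spark work described above, the sketch below reads CSV data from Cloud Storage and writes an aggregate back as Parquet, as one might on a Dataproc cluster. The bucket, paths, and the event_type column are hypothetical placeholders, not part of the original posting.

    # Minimal PySpark sketch: read CSVs from Cloud Storage, aggregate, write Parquet.
    # Bucket, paths, and column names are hypothetical placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("events-rollup").getOrCreate()

    # Read raw events from a GCS bucket (works on Dataproc via the GCS connector).
    events = spark.read.csv("gs://example-bucket/events/*.csv",
                            header=True, inferSchema=True)

    # Count events per type and write the result back to GCS as Parquet.
    (events.groupBy("event_type")
           .count()
           .write.mode("overwrite")
           .parquet("gs://example-bucket/rollups/event_counts"))

    spark.stop()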
Roles & Responsibilities:
Able to use GCP managed services effectively, e.g. Dataproc, Dataflow, Pub/Sub, Cloud Functions, Cloud Composer, BigQuery, and Bigtable (at least four of these services); a short usage sketch appears after this list.
Strong experience in Big Data technologies (Hadoop, Sqoop, Hive, and Spark), including DevOps practices.
Good hands-on expertise in either Python or Java programming.
Good understanding of GCP core services such as Cloud Storage, Compute Engine, Cloud SQL, and Cloud IAM.
Good-to-have knowledge of GCP services such as App Engine, GKE, Cloud Run, Cloud Build, and Anthos.
Ability to drive the deployment of customers' workloads into GCP and to provide guidance on cloud adoption models, service integrations, recommendations for overcoming blockers, and technical roadmaps for GCP implementations.
Experience in architecting and designing technical solutions based on industry standards using GCP IaaS, PaaS, and SaaS capabilities.
Extensive, real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technologies.
Act as a subject-matter expert or developer for GCP and become a trusted advisor to multiple teams.
Coach and mentor engineers to raise the technical ability of the team and/or to earn required GCP certifications.
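As a hedged sketch of the managed-service skills named above, the example below publishes a message to Pub/Sub and runs an aggregation in BigQuery using the official Python client libraries (google-cloud-pubsub and google-cloud-bigquery). The project, topic, dataset, and table names are hypothetical placeholders, not part of the original posting.

    # Minimal sketch of two managed services from the list above, using the
    # official Python clients. Project, topic, and table names are hypothetical.
    from google.cloud import bigquery, pubsub_v1

    PROJECT = "example-project"

    # Publish a message to a Pub/Sub topic.
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(PROJECT, "events")
    future = publisher.publish(topic_path, b'{"event_type": "click"}')
    print("Published message id:", future.result())

    # Run an aggregation query in BigQuery.
    bq = bigquery.Client(project=PROJECT)
    query = """
        SELECT event_type, COUNT(*) AS n
        FROM `example-project.analytics.events`
        GROUP BY event_type
    """
    for row in bq.query(query).result():
        print(row["event_type"], row["n"])

In practice these pieces would usually be wired together with Dataflow or Cloud Composer rather than run ad hoc, but the client-library calls are the same.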