Hadoop Engineer, R&D

Job Description

  • Bengaluru, Karnataka, India 

Be part of something revolutionary.

At o9 Solutions, our mission is clear: be the Most Valuable Platform (MVP) for enterprises. With our AI-driven platform — the o9 Digital Brain — we integrate global enterprises’ siloed planning capabilities, helping them capture millions and, in some cases, billions of dollars in value leakage. But our impact doesn’t stop there. Businesses that plan better and faster also reduce waste, which drives better outcomes for the planet, too.

We're on the lookout for the brightest, most committed individuals to join us on our mission. Along the journey, we’ll provide you with a nurturing environment where you can be part of something truly extraordinary and make a real difference for companies and the planet.

About the role...

Manage, install, and configure Hadoop components (Hive, HDFS, Ambari, and NiFi). Design the architecture and integrate it with the o9 product. Carry out Hadoop maintenance and monitoring tasks, manage day-to-day support tickets (Zendesk), and implement Jenkins and Ansible to automate DevOps tasks.

What you’ll do for us…

  • Work with customers/technical consultants to devise and recommend big data solution architecture based on requirements
  • Analyze complex distributed production deployments, and make recommendations to optimize performance
  • Document and present complex architectures to customers' technical teams
  • Work closely with o9 development, DevOps, and project teams at all levels to help ensure the success of projects
  • Help design and implement Hadoop architectures and configurations for customers working with Cloud deployments
  • Write and produce technical documentation and manuals to be provided to customers
  • Keep current with the Hadoop Big Data ecosystem technologies

What you’ll have...

  • Experience: 2+ years of DevOps experience architecting large-scale storage, data center, and/or globally distributed solutions
  • Experience designing and deploying production large-scale Hadoop solutions.
  • Experience designing data queries against data in a Hadoop environment using tools such as Apache Hive, Apache Druid, Apache Phoenix or others.
  • Experience installing, administering and tuning multi-node Hadoop clusters
  • Strong experience implementing software and/or solutions in Cloud (Azure, AWS, GCP)
  • Experience implementing MapReduce jobs
  • Experience architecting data center solutions – properly selecting server and storage hardware based on performance, availability, and ROI requirements
  • Demonstrated experience implementing big data use cases and an understanding of standard design patterns commonly used in Hadoop-based deployments
  • Significant previous work writing to network-based APIs, preferably REST/JSON or XML/SOAP
  • Skills: Strong understanding of various enterprise security solutions such as LDAP and/or Kerberos
  • Strong understanding of network configuration, devices, protocols, speeds and optimizations
  • Strong understanding of Java development, debugging & profiling.
  • Solid background in database administration and design, along with data modeling
  • Excellent verbal and written communications
  • Demonstrable experience using R and its statistical algorithms
  • Characteristics: Ability to understand and translate customer requirements into technical requirements.
  • We really value team spirit: Transparency and frequent communication are key. At o9, this is not limited by hierarchy, distance, or function

Skills

  • DevOps
  • Hadoop
  • Cloud platform
  • REST
  • SOAP
  • R
  • Data Modeling
  • Apache NiFi

Education

  • Master's Degree
  • Bachelor's Degree

Job Information

Job Posted Date

Apr 23, 2024

Experience

2 to 6 Years

Compensation (Annual in Lacs)

Best in the Industry

Work Type

Permanent

Type Of Work

8 hour shift

Category

Information Technology

Copyright © 2022 All Rights Reserved. Saas Talent