Job Description
Conviva is the first and best place to go to understand and optimize digital customer experiences. Our Operational Data Platform harnesses full-census, comprehensive client-side telemetry—capturing every aspect of customer experience and engagement across all devices and linking them to the performance of underlying services, in real time and at a fraction of the cost of alternative solutions. Trusted by industry leaders like Disney, NBC, and the NFL, Conviva revolutionizes how businesses understand customer experience and engagement, maximizing satisfaction, conversion, and revenue.
As Conviva expands, we are building products that provide deep insights into end-user experience for our customers. We are seeking a Principal Engineer to join the Platform and TLB Team as an Individual Contributor.
The vision for the TLB team is to build data processing software that works on terabytes of streaming data in real time. Engineer the next-gen Spark-like system for in-memory computation of large time-series datasets, spanning both the Spark-like backend infrastructure and the library-based programming model. Build a horizontally and vertically scalable system that analyzes trillions of events per day with sub-second latencies. Use the latest big data technologies to build solutions for use cases across multiple verticals. Lead technology innovation and advancement that will have a major business impact for years to come. Be part of a worldwide team building software with the latest technologies and the best software development tools and processes.
What Success Will Look Like:
- Design, build, and maintain the stream-processing and time-series analysis system at the heart of Conviva’s products
- Own the architecture of the Conviva platform
- Build features, enhancements, and new services, and fix bugs, in Scala and Rust on a Jenkins-based pipeline, deploying as Docker containers on Kubernetes
- Own the entire lifecycle of your microservice, including early specs, design, technology choices, development, unit testing, integration testing, documentation, deployment, troubleshooting, and enhancements
- Lead a team to develop a feature or parts of the product
- Adhere to the Agile model of software development to plan, estimate, and ship according to business priority
Who You Are & What You've Done:
- 14+ years of work experience in software development, building data processing products.
- Engineering degree in software engineering or an equivalent field from a premier institute.
- Excellent knowledge of Computer Science fundamentals such as algorithms and data structures. Hands-on experience with functional programming and a solid grasp of its concepts
- Excellent programming and debugging skills. Proficient in writing code in Rust/Scala/Haskell/Erlang that is reliable, maintainable, secure, and performant
- Experience with or knowledge of the actor model of concurrency (Akka in Scala or Actix in Rust) is a big plus. Knowledge of design patterns such as event streaming, CQRS, and DDD for building large microservice architectures is also a big plus
- Experience with big data technologies such as Spark, Flink, Kafka, Druid, and HDFS
- Deep understanding of distributed systems concepts and scalability challenges, including multi-threading, concurrency, sharding, and partitioning
- Experience with or knowledge of the Akka/Lagom framework and/or stream processing technologies such as RxJava or Project Reactor will be a big plus
- Excellent communication skills. Willingness to work under pressure. Hunger to learn and succeed. Comfortable with ambiguity and complexity