Software Developer (Spark/Pyspark)

Job Description

1. Highly proficient in writing clean, efficient Python or Scala code. Python is preferred, given its vast ecosystem of libraries.

2. Experience deploying applications and data pipelines (including AWS Lambda functions and S3 buckets) on AWS or Azure

3. Experience handling and processing large files or large volumes of data. Knowledge of multiprocessing, multithreading, coroutines, etc.

4. Strong in RDBMS - writing complex SQL queries, understanding ORMs, and optimizing SQL queries

5. Experience with NoSQL databases - MongoDB or Cassandra

6. Good understanding of ETL technologies - use of any ETL tool or library in past projects is a plus

7. Experience in event-driven programming - async/await, pub/sub, etc.

8. Experience with tools such as Redis, RabbitMQ, Kafka, etc.

9. Experience with cloud infrastructure on AWS or Azure - compute/storage/networking/databases/security

10. Experience in PySpark - any project using PySpark with large datasets is an advantage.

11. Good knowledge of the software development process - coding standards, Git branching strategy, effort estimation, writing unit tests, proper documentation, etc.

12. Experience with data warehouses is an advantage.

Skills

  • Spark Streaming
  • PySpark
  • AWS
  • Azure
  • RDBMS
  • MongoDB
  • Cassandra

Education

  • Master's Degree
  • Bachelor's Degree

Job Information

Job Posted Date

Oct 13, 2023

Experience

3 to 5 Years

Compensation (Annual in Lacs)

₹ Market Standard

Work Type

Permanent

Type Of Work

8 hour shift

Category

Information Technology

Copyright © 2022 All Rights Reserved. Saas Talent