Job Requirements:
• Ability to work independently on projects and processes under general supervision
• Practical knowledge of applicable work area
• Ability to adapt to and understand new technologies and processes as business requirements evolve
• Knowledge of a variety of the field's concepts, practices, and procedures
• Ability to work constructively and independently under stress and pressure in a fast-paced, multi-tasking environment
• Ability to interact positively and openly with colleagues and external business contacts, with strong verbal and written communication skills
• Knowledge of programming languages and infrastructure-as-code tooling (Python, PySpark, Terraform, AWS CloudFormation)
• Experience with cloud-based data management platform infrastructure, preferably AWS (or GCP/Azure), and typical storage/compute services (Snowflake, Redshift, Azure Synapse, Databricks, open-source table formats such as Iceberg/Delta, etc.)
• Experience building, scaling, and configuring proactive diagnostics and monitoring for data infrastructure
• Knowledge of real-time data streaming technologies such as Apache Kafka and Spark Streaming is desirable
• Knowledge of data orchestration tools such as Airflow or similar (e.g., Dagster, Prefect)
• Knowledge of relevant software development tools, including version control, build processes, debuggers, and test frameworks