Senior Data Engineer

Job Description

What will you do:

  • Build data pipelines to assemble large, complex data sets that meet functional and non-functional business requirements
  • Develop ETL solutions using Python, PowerShell, SQL, SSIS, etc. to load and automate complex datasets in PDF, Excel, flat file, JSON, XML, EDI, and other formats
  • Take full ownership of end-to-end data processes in Azure Cloud environments
  • Work closely with data architect, SMEs and other technology partners to develop & execute data architecture and product roadmap
  • Collaborate independently with backend developers to understand legacy applications and implement their features in a new system.
  • Troubleshoot issues and other operational bottlenecks to support continuous data delivery for various applications.
  • Take the initiative to make changes and improvements, tackle technical debt, and work on new, complex challenges.
  • Implement complex warehouse views, make database design decisions to support UI needs, and optimize scripts that periodically refresh large-volume datasets.
  • Perform code reviews and coach team members
  • Develop reports on the data dictionary, server metadata, and data files, and implement reporting tools as needed.
  • Implement best practices for data updates and development, and troubleshoot performance and other data-related issues across multiple product applications.
  • Keep current on big data and data visualization technology trends; evaluate cloud technologies, work on proofs-of-concept, and make recommendations.

What you bring:

  • 7+ years of data engineering experience working with large data sets and cloud architectures
  • Deep experience building data pipelines using ETL tools and paradigms, and loading data to and from RDBMSs such as Postgres, SQL Server, Oracle, or similar
  • Proficient in cloud services such as Microsoft Fabric, Azure Data Factory, and Azure Data Lake, or related technologies such as AWS, GCP, or Databricks
  • Proficient in using SSDT tools to build SQL Server relational databases, Azure SQL databases, Analysis Services data models, Integration Services packages, and Reporting Services reports
  • Solid experience building data solutions with programming languages such as Python, PowerShell, Spark, and Scala
  • Advanced T-SQL and ETL automation experience
  • Experience working with orchestration tools such as Airflow, and building complex dependency workflows
  • Self-motivated with the ability to work and learn new technology independently
  • Strong problem-solving and troubleshooting skills for data issues, and experience stabilizing big data systems
  • Excellent communication and presentation skills

Bonus points:

  • Deep hands-on experience with cloud data migration, and experience working with cloud analytics platforms like Fabric and Databricks
  • Certification in one of the cloud platforms (AWS/GCP/Azure)
  • Experience with real-time data streaming tools like Kafka, Kinesis, or similar
  • Experience with US healthcare reimbursement-related terminology and data is a plus

Skills

  • Big Data
  • Business Requirements
  • Data Architecture
  • Data Modeling
  • ETL Tools

Education

  • Master's Degree
  • Bachelor's Degree

Job Information

Job Posted Date

Aug 27, 2024

Experience

7 to 10 Years

Compensation (Annual in Lacs)

Best in the Industry

Work Type

Permanent

Type Of Work

8 hour shift

Category

Information Technology

Copyright © 2022 All Rights Reserved. Saas Talent