Job Description:
Design, develop, and implement scalable batch and real-time data pipelines (ETLs) to integrate data from a variety of sources into the Data Warehouse and Data Lake.
Design and implement data model changes that align with warehouse dimensional modeling standards.
Demonstrate proficiency in Data Lake and Data Warehouse concepts and in dimensional data modeling.
Maintain and support all database environments; design and develop data pipelines, workflows, and ETL solutions in both on-premises and cloud environments.
Design and develop SQL stored procedures, functions, views, and triggers.
Design, code, test, document, and troubleshoot deliverables.
Collaborate with others to test and resolve issues with deliverables.
Maintain awareness of and ensure adherence to the company’s standards regarding privacy.
Create and maintain design documents, source-to-target mappings, unit test cases, and data seeding scripts.
Perform data analysis and data quality tests, and create audits for the ETLs.
Perform continuous integration and deployment using Azure DevOps and Git.
Requirements:
ETL (Extract, Transform, Load), Microsoft Azure, Microsoft SQL Server, SSIS, SSRS, Snowflake, Data Warehousing