Azure Data Warehouse Developer

US-PA-Harrisburg

Global Data Consultants

Req #: 3965
Type: Hourly
Overview:

GDC IT Solutions is currently seeking an Azure Data Warehouse Developer to work in a hybrid capacity, with occasional in-office meetings (minimum once per month) at the Pennsylvania Department of Health. Candidates must reside within Pennsylvania and will primarily work remotely. This role supports an enterprise-wide data modernization initiative focused on public health data, cloud transformation, and scalable analytics.

Responsibilities:

* Design, develop, test, and implement scalable data lakes, data warehouses, and ELT solutions using Azure Databricks, Data Factory, Delta Lake, Synapse, and related tools.

* Analyze business requirements and design technical solutions aligned with project goals.

* Produce robust ETL/ELT pipelines using SQL Server Integration Services (SSIS), T-SQL, and other tools.

* Manage Azure DevOps CI/CD pipelines, automate deployments, and maintain release integrity using monorepo strategies.

* Perform data formatting, cleansing, extraction, classification, and quality control.

* Collaborate with business analysts, DBAs, developers, and stakeholders to define data architecture and meet project milestones.

* Participate in technical and process reviews, status reporting, solution testing, and stakeholder presentations.

* Develop and maintain procedural documentation, diagrams, and manuals aligned with SDLC best practices.

* Conduct training and knowledge transfer sessions for system maintenance and operational support.

* Ensure compliance with federal and commonwealth data standards and security protocols.

* Support multiple public health projects including PA NEDSS NextGen, PA LIMS Replacement, and COVID-19 initiatives.

* Track personal and project status through platforms like SharePoint, Daptiv, and PeopleFluent.

Qualifications:

* 5+ years of hands-on experience with Azure (Databricks, Data Factory, Synapse, Delta Lake), Apache Spark, and Python.

* 5+ years designing, implementing, and maintaining business intelligence and data warehouse solutions using SQL Server and Azure Synapse.

* 5+ years of experience developing ETL/ELT pipelines using SSIS and related tools.

* Strong proficiency in T-SQL, including writing queries, scripts, and stored procedures.

* 5+ years of experience as a CI/CD release manager in Azure DevOps, including monorepo pipeline experience.

* Strong understanding of relational and dimensional database design, including star schema, facts, and dimensions.

* Experience in data engineering, file system optimization, APIs, and analytics-as-a-service.

* Advanced knowledge of data modeling standards, data mining, and reporting methodologies.

* Strong documentation and SDLC discipline including technical diagrams, test plans, and peer code reviews.

* Proven ability to communicate technical concepts to both technical and non-technical audiences.

* Bachelor's degree in Computer Science or a related field.