Senior Data Warehouse Engineer

US-CO-Englewood


Req #: 97434
Type: Full-Time Regular

EchoStar

Overview:

Our Technology teams challenge the status quo and reimagine capabilities across industries. Whether through research and development, technology innovation or solution engineering, our team members play a vital role in connecting consumers with the products and platforms of tomorrow.

Responsibilities:

Candidates must be willing to participate in at least one in-person interview, which may include a live whiteboarding or technical assessment session.

Key Responsibilities:

* Monitor and provide operational support for large enterprise data warehouse systems, resolving complex ETL job-related issues and ensuring data quality
* Maintain and optimize scalable batch and streaming data pipelines and data lake solutions on the AWS cloud platform (S3, Glue, EC2, Redshift)
* Lead incident management and root cause analysis meetings to develop operational metrics and drive continuous improvement of production systems
* Collaborate with cross-functional Agile/Scrum teams (Data Scientists, Analysts, DevOps) to implement data transformation logic (ETL/ELT)
* Manage CI/CD pipelines and Infrastructure-as-Code (IaC) using GitLab while exploring Generative AI integration points like Amazon Q for pipeline optimization
* Participate in shift-based working hours and on-call support to ensure the continuous reliability and performance of enterprise data systems

Qualifications:

Education and Experience:

* Bachelor's degree in Computer Science or a related technical field
* 5+ years of professional experience in the operation and production support of large enterprise data warehouses

Skills and Qualifications:

* Advanced proficiency in Shell scripting, Python, and SQL, with demonstrated experience developing, optimizing, and maintaining scalable, production-grade data pipelines

* Strong hands-on experience with AWS data services (EC2, S3, Redshift) and solid understanding of core cloud architecture components including VPC, IAM, CloudWatch, and Data Lake frameworks

* Experience designing and managing workflow orchestration using tools such as Control-M or Apache Airflow to support reliable, automated data processing

* Deep understanding of data modeling concepts, including dimensional and 3NF structures, to ensure high-quality, performant data solutions

* Proficiency with Git/GitLab for version control, peer code reviews, and CI/CD processes, along with working knowledge of Infrastructure-as-Code tools such as Terraform or AWS CloudFormation

* Proven ability to evaluate and communicate findings from Proof-of-Concept (POC) initiatives, with strong analytical and problem-solving skills; familiarity with Generative AI tools (e.g., Amazon Q, Gemini, Databricks Genie Rooms) and their practical application within data engineering workflows