Data Engineer - AI Products

Location: US-CO-Denver

Req #: 95768
Type: Full-Time Regular
EchoStar

Overview:

Our Technology teams challenge the status quo and reimagine capabilities across industries. Whether through research and development, technology innovation, or solution engineering, our team members play a vital role in connecting consumers with the products and platforms of tomorrow.

Responsibilities:

Candidates must be willing to participate in at least one in-person interview, which may include a live whiteboarding or technical assessment session.

Key Responsibilities:

* Design, implement, and optimize scalable, secure data pipelines using Databricks, Spark, and Delta Lake (a brief pipeline sketch follows this list)
* Build robust ingestion and transformation workflows, plus CI/CD pipelines with GitLab, ensuring performance, reliability, and cost-efficiency
* Apply software engineering best practices (SOLID, modular design, version control) to data workflows
* Develop unit tests and reusable testing frameworks, and enforce code quality through reviews and mentorship
* Refactor and maintain existing codebases for scalability, stability, and performance
* Collaborate with cross-functional teams and contribute to Agile ceremonies, shared libraries, documentation, and infrastructure standards supporting our AI Office
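
For illustration, here is a minimal sketch of the kind of ingestion-and-transformation step described above, assuming a Databricks environment where a SparkSession is available. The source path, table name, and column names below are hypothetical placeholders, not details from this posting.

    # Minimal sketch: read raw JSON events, deduplicate within the batch,
    # timestamp, and append to a Delta table. All names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()  # Databricks provides this session

    def ingest_events(source_path: str, target_table: str) -> None:
        raw = spark.read.json(source_path)
        cleaned = (
            raw.dropDuplicates(["event_id"])                   # drop in-batch duplicates
               .withColumn("ingested_at", F.current_timestamp())
        )
        # Append preserves history in Delta; a production pipeline might use
        # MERGE instead for idempotent upserts.
        cleaned.write.format("delta").mode("append").saveAsTable(target_table)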

Qualifications:

Education & Experience:

* Bachelor's degree in Computer Science, Information Systems, or related field (Master's preferred)
* 1-4+ years of data engineering experience building large-scale pipelines with Python, Databricks, Spark, and Delta Lake

Skills and Qualifications:

* Proficient in Python (required) and Scala, with experience writing clean, testable, and modular code
* Strong understanding of data engineering principles, distributed systems, and modern data architecture
* Skilled in CI/CD practices, DevOps workflows, and automation of data pipeline deployments
* Experienced with unit testing (e.g., pytest) and ensuring data quality and pipeline reliability (a minimal test sketch follows this list)
* Familiar with AWS, containerization (Docker), and orchestration tools (ECS, EKS, Kubernetes)
* Strong communication, problem-solving, and collaboration skills; prior experience handling sensitive data is a plus
* Proven ability to apply SOLID principles, design patterns, and best practices to data solutions
* Experience with ML/NLP frameworks and infrastructure-as-code tools (Terraform, CDK, CloudFormation) a plus
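
As a hedged illustration of the unit-testing point above: a minimal pytest sketch for a Spark transformation, assuming a local SparkSession fixture. The dedup_events function and its columns are hypothetical, written only to mirror the pipeline sketch earlier.

    # Minimal pytest sketch for a Spark transformation; runs locally, no cluster.
    import pytest
    from pyspark.sql import SparkSession, functions as F

    @pytest.fixture(scope="session")
    def spark():
        # Small local session so tests run without a cluster
        return SparkSession.builder.master("local[1]").appName("unit-tests").getOrCreate()

    def dedup_events(df):
        # Hypothetical function under test, mirroring the earlier sketch
        return (df.dropDuplicates(["event_id"])
                  .withColumn("ingested_at", F.current_timestamp()))

    def test_dedup_events_removes_in_batch_duplicates(spark):
        df = spark.createDataFrame(
            [(1, "a"), (1, "a"), (2, "b")], ["event_id", "payload"]
        )
        result = dedup_events(df)
        assert result.count() == 2                  # duplicate row dropped
        assert "ingested_at" in result.columns      # timestamp column added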

Visa sponsorship is not available for this role.