ETL Engineer

US-VA-McLean

External

Req #: 5870
Type: Full-Time

Steampunk

Overview:

We are looking for a seasoned ETL Engineer to work with our team and our clients to develop enterprise-grade data pipelines. We want more than just an "ETL Engineer": we are looking for a technologist with excellent communication and customer service skills and a passion for data and problem solving.

Responsibilities:

* Assess and understand ETL jobs and workflows
* Create reusable data pipelines from source to target systems
* Test, validate, and deploy ETL pipelines
* Support reporting, business intelligence, and data science end users through ETL and ELT operations
* Work with data architects to create data models and design schemas for RDBMS, warehouse, and data lake systems
* Key must-have skills: Python and SQL
* Work within an Agile software development lifecycle
* Contribute to the growth of our Data Exploitation Practice!

Qualifications:

* Ability to hold a position of public trust with the US government.
* Bachelor's degree in computer science, information systems, engineering, business, or a scientific or technical discipline
* 2-4 years of industry experience coding commercial software and a passion for solving complex problems
* 2-4 years of direct experience in Data Engineering, with experience in tools such as:
  * ETL tools: Python, Informatica, Pentaho, Talend
  * Big data tools: Hadoop, Spark, Kafka, etc.
  * Relational SQL and NoSQL databases, including Postgres, CloudSQL, and MongoDB
  * Data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
  * AWS cloud services: EC2, EMR, RDS, Redshift (or Azure and GCP equivalents)
  * Data streaming systems: Storm, Spark Streaming, etc.
  * Search tools: Solr, Lucene, Elasticsearch
  * Object-oriented and functional scripting languages: Python, Java, C++, Scala, etc.

* Advanced working knowledge of SQL, including query authoring and optimization, and working familiarity with a variety of relational databases
* Experience with message queuing, stream processing, and highly scalable 'big data' data stores
* Experience manipulating structured and unstructured data for analysis
* Experience constructing complex queries to analyze results using databases or in a data processing development environment
* Experience working in an Agile environment
* Experience supporting project teams of developers and data scientists who build web-based interfaces, dashboards, reports, and analytics/machine learning models
* Experience with the following is required: Postgres, Crunchy Data, CloudSQL, MongoDB, BigQuery, Looker, Datastream, Dataform, and Composer