Lead Data Engineer (ETL/ELT, CI/CD, Databricks)
Location
United States
Posted
4 days ago
Salary
Not specified
Job Description
Overview
CommIT Enterprises, Inc. is seeking a highly skilled Lead Data Engineer to design, develop, and maintain advanced data ingestion pipelines and models that support United States Marine Corps (USMC) logistics situational awareness and decision-making. The team is based in Charleston, SC, but this role can be REMOTE. The position requires expertise in ETL/ELT processes, CI/CD pipeline development, and data modeling to ensure reliable, scalable, and secure data solutions, and it directly supports the USMC by enabling data-driven situational awareness so leaders have timely, accurate, and actionable insights to inform operational decisions.
Established in 2001, CommIT is a Certified Veteran-Owned Small Business (CVOSB) providing innovative technical engineering and data science services. Our enterprise systems support includes the Department of Defense's (DoD) GCSS-MC, CAC2S, and TBMCS-MC, and the Department of Veterans Affairs' (VA) telehealth communications. We offer acquisition management, systems engineering, Agile software development, cloud management, IT modernization, data analytics, cybersecurity, and training, including leading-edge DevSecOps, automated testing, and mobile application development.
Responsibilities
Your essential job functions will include, but may not be limited to:
- Lead the design, development, and maintenance of CI/CD pipelines that enable ingestion teams to seamlessly move ETL/ELT processes through Development, Testing, and Production environments.
- Build and optimize ETL/ELT workflows to ingest, validate, and transform data from diverse sources.
- Apply data modeling techniques to create Silver and Gold Tier data assets that enhance situational awareness and support decision-making.
- Collaborate with cross-functional teams to ensure data pipelines and models align with operational requirements and reporting needs.
- Drive best practices in data engineering, automation, and pipeline reliability to support mission-critical analytics.
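The Silver and Gold Tier assets named above follow the common medallion pattern (raw Bronze data validated into Silver, then aggregated into Gold for dashboards). As a rough conceptual sketch only, in plain Python rather than Databricks/Spark, with all record fields and values hypothetical:

```python
# Minimal medallion-pattern sketch: Bronze (raw) -> Silver (validated)
# -> Gold (aggregated for reporting). Hypothetical logistics records.
from collections import defaultdict

bronze = [  # raw ingested records, possibly malformed
    {"unit": "1st MLG", "part_id": "A-100", "qty": "5"},
    {"unit": "1st MLG", "part_id": "A-100", "qty": "3"},
    {"unit": "2nd MLG", "part_id": "B-200", "qty": "bad"},  # fails validation
]

def to_silver(records):
    """Validate and type-cast raw records (Bronze -> Silver)."""
    silver = []
    for r in records:
        try:
            silver.append({**r, "qty": int(r["qty"])})
        except ValueError:
            pass  # in practice, route rejects to a quarantine table
    return silver

def to_gold(silver):
    """Aggregate per-unit part totals for dashboards (Silver -> Gold)."""
    totals = defaultdict(int)
    for r in silver:
        totals[(r["unit"], r["part_id"])] += r["qty"]
    return dict(totals)

gold = to_gold(to_silver(bronze))
print(gold)  # {('1st MLG', 'A-100'): 8}
```

In a Databricks environment the same shape would typically be expressed as Delta Tables with SQL or PySpark transformations between tiers; this sketch only illustrates the tiering concept.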
Qualifications
Required Experience and Education:
- Master's degree with 6 years of experience (or a Bachelor's degree with 8 years of experience) in Computer Science, Software Engineering, Computer Engineering, Mathematics, or a relevant field. The degree may be substituted with additional relevant industry experience and/or industry-accepted training and certification.
- 5+ years of experience in data engineering with a focus on ETL/ELT and CI/CD.
- Hands-on experience with Databricks, Delta Lake, and GitLab Enterprise.
- Strong proficiency in Python and SQL.
- Experience supporting logistics, defense, or mission-critical environments is a plus.
- Proven experience with ETL and ELT processes for ingesting structured, semi-structured, and unstructured data sources.
- Deep understanding of CI/CD practices, with the ability to design and implement pipelines using Databricks and GitLab Enterprise for Data Ingestion teams.
- Strong background in data modeling, with the ability to build Silver and Gold Tier Tables and Views to support operational dashboards and reporting.
Technical Requirements:
- ETL and ELT methods
- Databricks and Delta Tables
- Python and SQL programming
- CI/CD pipeline development and automation
- GitLab Enterprise for version control and deployment
Security Requirements:
- Secret Clearance
- Security+ Certification
Equal Opportunity Employer:
CommIT Enterprises, Inc. is an Equal Opportunity Employer. Employment decisions are made without regard to race, color, religion, national origin, gender, sexual orientation, gender identity, age, physical or mental disability, genetic factors, military/veteran status or other characteristics protected by law.