We're partners in transformation. We help clients activate ideas and solutions to take advantage of a new world of opportunity. We are a team of 80,000 strong, working with over 6,000 clients, including 80% of the Fortune 500, across North America, Europe and Asia.
Data Engineer
Location
United States
Posted
3 days ago
Salary
$60 - $75 / hour
Job Description
Role Description
This is a contract position based out of Indianapolis, IN.
The role is 100% remote, but candidates must reside in the EST or CST time zones.
Top Skills REQUIRED:
- Data Engineer Expertise
- Snowflake & dbt
- SQL & Python
- Data Warehousing & Data Pipelining Core Understanding
Two squads in total for this work:
Precision Squad:
- 5 Data Engineer consultants focused on switching from Kevel to Google Ad Manager (GAM) and moving from a homegrown ad server into Operative AOS.
- Responsibilities include connecting GAM and AOS, migrating data, transforming data, normalizing data across platforms, and automating flows and reporting.
- Heavy dbt and Snowflake work; some Looker/BI understanding is beneficial.
- Work focused on CDP (Customer Data Platform) after platform selection.
Programmatic Squad:
- Two engineers focused on overhauling the campaign management system.
- Strong communication skills and the ability to write custom APIs alongside dbt and Snowflake work are important.
- Similar technical work to the Precision Squad, but it requires more of a custom-build, problem-solving mentality.
Data Warehousing & Transformation:
- Snowflake — advanced modeling, performance tuning, warehouse optimization, semi-structured handling, curated/Gold layer design.
- dbt (Data Build Tool) — complex transformations, tests, metrics modeling, CI integration, documentation.
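To illustrate the kind of dbt testing and documentation discipline described above, here is a minimal schema file sketch. The model and column names (`gold_campaign_daily`, `campaign_id`, `impressions`) are hypothetical, and the range test assumes the `dbt_utils` package is installed:

```yaml
# models/gold/schema.yml -- illustrative sketch; model/column names are hypothetical
version: 2

models:
  - name: gold_campaign_daily
    description: "Curated Gold-layer daily campaign metrics, normalized across platforms"
    columns:
      - name: campaign_id
        tests:
          - not_null
          - unique
      - name: impressions
        tests:
          - dbt_utils.accepted_range:   # assumes the dbt_utils package
              min_value: 0
```

Tests like these run in CI via `dbt test`, which is how the "tests, CI integration, documentation" duties above typically surface in practice.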
Programming & Data Processing:
Python — core language for:
- API consumption/custom connectors
- Automation jobs
- Data transformations
- Reusable libraries & validation frameworks
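The cross-platform data transformation work above (e.g., aligning records from two ad platforms into one schema) might look like the following sketch. The field names and mapping are illustrative assumptions, not the real GAM/AOS payloads:

```python
# Hypothetical sketch: normalize ad-delivery records from two source systems
# into one common schema before loading to the warehouse. Field names are
# illustrative only.

FIELD_MAP = {
    "gam": {"lineItemId": "campaign_id", "impressionsDelivered": "impressions"},
    "aos": {"order_line_id": "campaign_id", "imps": "impressions"},
}

def normalize(record: dict, source: str) -> dict:
    """Rename source-specific fields to the common schema and tag provenance."""
    mapping = FIELD_MAP[source]
    out = {common: record[src] for src, common in mapping.items() if src in record}
    out["source_system"] = source
    return out
```

A reusable helper like this is what "reusable libraries & validation frameworks" usually means in practice: one tested function instead of per-pipeline ad-hoc renames.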
Data Pipelines & Orchestration:
- Airflow — production DAGs, dependency management, SLAs, retries, backfills, data quality checks, environment-aware configs.
- AWS Data Stack — especially S3, EMR, Lambda, Kinesis, Glue/Athena, IAM basics for secure pipeline operation.
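The backfill and retry mechanics that Airflow provides can be sketched in plain Python (Airflow itself is omitted here to keep the example self-contained; this mimics, rather than uses, its retry policy and logical-date partitioning):

```python
from datetime import date, timedelta
from typing import Callable, Iterator

def daily_partitions(start: date, end: date) -> Iterator[date]:
    """Yield every logical date in a backfill window, inclusive of both ends."""
    d = start
    while d <= end:
        yield d
        d += timedelta(days=1)

def run_with_retries(task: Callable[[date], None], ds: date, retries: int = 3) -> None:
    """Re-run a partition-level task up to `retries` times before giving up,
    similar in spirit to an Airflow task retry policy."""
    for attempt in range(1, retries + 1):
        try:
            task(ds)
            return
        except Exception:
            if attempt == retries:
                raise
```

In a real DAG, the scheduler owns both concerns; the point of the sketch is that backfills are just re-runs over a date range, and retries are bounded re-execution of an idempotent task.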
Ingestion & Integration:
- Fivetran — managing connectors, sync strategies, and multi-system ingestion.
- Semi-structured data — JSON, Parquet, and gzip-compressed files; flattening and metadata-driven ingestion.
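The "flattening" duty above usually means turning nested JSON objects into flat, column-like keys before loading. A minimal sketch:

```python
def flatten(record: dict, prefix: str = "") -> dict:
    """Recursively flatten nested JSON objects into dot-delimited keys,
    a common first step before loading semi-structured data to a warehouse."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, prefix=f"{name}."))
        else:
            flat[name] = value
    return flat
```

In Snowflake specifically, much of this can instead be done in SQL over `VARIANT` columns with `FLATTEN`; a Python version like this is typical when normalizing upstream of the load.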
Data Quality & Governance:
- Validation frameworks — source to target reconciliation, boundary checks, null audits, row-level testing.
- PII protection/access governance — experience with tools like Immuta, Atlan, or tokenization libraries (e.g., Protegrity).
- Documentation discipline — data dictionaries, lineage, flow diagrams, runbooks.
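The validation-framework duties above (source-to-target reconciliation, null audits) can be sketched as a single check. Column and key names here are hypothetical:

```python
def reconcile(source_rows: list[dict], target_rows: list[dict], key: str) -> dict:
    """Source-to-target reconciliation: compare row counts, report keys
    missing from the target, and audit nulls per target column."""
    src_keys = {r[key] for r in source_rows}
    tgt_keys = {r[key] for r in target_rows}
    null_audit = {
        col: sum(1 for r in target_rows if r.get(col) is None)
        for col in (target_rows[0].keys() if target_rows else [])
    }
    return {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "missing_in_target": sorted(src_keys - tgt_keys),
        "null_audit": null_audit,
    }
```

A report like this, run after every migration batch, is the concrete form that "boundary checks, null audits, row-level testing" tends to take.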
Qualifications
- Experience with Data Engineering principles and practices.
- Proficiency in Snowflake and dbt.
- Strong SQL and Python skills.
- Understanding of data warehousing and data pipelining.
Requirements
- Must reside in EST or CST time zones.
- Ability to work remotely.
- Strong communication skills.
- Experience with API writing and custom solutions.
Benefits
- Medical, dental & vision
- Critical Illness, Accident, and Hospital
- 401(k) Retirement Plan – Pre-tax and Roth post-tax contributions available
- Life Insurance (Voluntary Life & AD&D for the employee and dependents)
- Short and long-term disability
- Health Spending Account (HSA)
- Transportation benefits
- Employee Assistance Program
- Time Off/Leave (PTO, Vacation or Sick Leave)