Harness the power of your healthcare data
Senior Data Engineer
Location: United States
Posted: 3 days ago
Salary: Not specified
Seniority: Senior
Job Description
This is a hands-on technical role requiring strong self-management, a proven ability to mentor peers, and comfort working with sensitive healthcare data under strict security requirements. As a scaling technology organization, we value ownership, collaboration, and continuous improvement — you will have a direct hand in shaping the tools, processes, and architecture that power our data platform.
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States.
Key Responsibilities
- Design, build, and maintain ETL and reverse-ETL pipelines between Snowflake and legacy SQL Server systems, orchestrated with Azure Data Factory.
- Develop and optimize Snowflake data warehouse models, ensuring performance, reliability, and high availability.
- Implement and maintain row-level and column-level security policies to protect sensitive healthcare data.
- Partner with the existing SQL Server Data Engineering team to plan and execute the migration of legacy ETL processes and warehouse models to Snowflake and the cloud.
- Build and maintain data transformations using dbt.
- Monitor pipeline health, troubleshoot failures, and ensure uptime and data integrity across all data flows.
- Mentor peers and contribute to engineering best practices, code reviews, and documentation.
- Learn and adopt Sigma as the organization’s BI and reporting tool.
Required Skills and Qualifications
- Bachelor’s degree in Computer Science, Data Engineering, or related field, or equivalent experience.
- 10+ years of experience in data engineering, ETL development, and data warehouse modeling.
- Proven hands-on experience with Snowflake, including architecture, optimization, and security.
- Strong proficiency in SQL and Python.
- Experience with at least one major BI/visualization platform (e.g., Power BI, Tableau, Looker, Sigma).
- Experience with dbt and Azure Data Factory.
- Prior experience working with healthcare data, including familiarity with data sensitivity, HIPAA, and security best practices.
- Experience implementing row-level and/or column-level security in a data warehouse environment.
- Strong self-management skills with the ability to work independently in a fully remote environment.
Preferred Qualifications
- Experience with healthcare claims, eligibility, or related payer/TPA data.
- Experience migrating data infrastructure from SQL Server to Snowflake or other cloud platforms.
- Familiarity with Sigma.
- Experience working in Agile development environments with tools such as Jira.
- Prior experience in a startup or fast-growing technology company.
About SmartLight Analytics
SmartLight Analytics was formed by a group of industry insiders who sought to reduce rising healthcare costs for self-funded employers. Through proprietary data analysis, SmartLight identifies and mitigates wasteful healthcare spending without disrupting employee benefits or requiring behavior changes.