Openly

Premium, straightforward insurance

Senior Data Engineer

Data Engineer · Remote · Senior · Team 201-500 · Since 2017 · H1B Sponsor

Location

United States

Posted

3 days ago

Salary

$140.8K - $158.4K / year

Seniority

Senior

Job Description

  • Architect, build, and maintain high-quality, scalable data pipelines, data models, and data infrastructure across our GCP-based platform.
  • Lead technical design decisions related to data architecture, pipeline orchestration, data modeling, and cloud infrastructure.
  • Develop and optimize complex SQL queries and data transformations in BigQuery and PostgreSQL, ensuring performance, reliability, and correctness.
  • Write production-grade code in Python and/or Go to build and enhance data management frameworks, services, and pipeline tooling.
  • Partner with data science, business intelligence, product, and operations teams to translate business requirements into reliable data solutions.
  • Own and improve the full Software Development Lifecycle (SDLC) for data projects: design, development, testing, CI/CD deployment, monitoring, and maintenance.
  • Leverage streaming and event-driven architectures using Kafka (Aiven/Debezium) to build real-time data pipelines.
  • Utilize and optimize distributed data processing frameworks such as Apache Spark for large-scale data transformations.
  • Mentor and provide technical guidance to junior and mid-level data engineers; conduct code reviews and promote engineering best practices.
  • Share your knowledge within the data engineering team and the broader engineering organization through all-hands presentations, learning hours, domain meetings, and written documentation.

Job Requirements

  • 4+ years of data engineering and data management experience, with a proven track record of delivering complex, production-grade data systems.
  • Strong proficiency in SQL and SQL optimization — including query tuning, indexing strategies, execution plan analysis, and data modeling in BigQuery and PostgreSQL.
  • Expert-level scripting and programming in Python; experience with Go is a strong plus.
  • Deep expertise with Google Cloud Platform (GCP), including BigQuery, GCS, Composer/Airflow, Cloud Functions, Cloud Run, Pub/Sub, and CloudSQL.
  • Proven experience building and operating event-driven and streaming data pipelines using Kafka or similar technologies (Aiven/Debezium experience a plus).
  • Strong understanding of modern data warehouse and Lakehouse architectures, including multi-layered data modeling patterns (bronze/silver/gold or equivalent).
  • Infrastructure as Code (IaC) experience with Terraform to define, manage, and version cloud data infrastructure.
  • Solid understanding of Software Development Lifecycle (SDLC) best practices: CI/CD pipelines, automated testing, code review processes, code repositories, and deployment management.
  • Experience with data replication tools (e.g., Fivetran, Debezium) and understanding of CDC (Change Data Capture) patterns.
  • Ability to independently drive data architecture decisions, translate business requirements into source-to-target data mappings, and deliver working, maintainable solutions.
  • Strong communication skills; able to effectively collaborate with and educate both technical and non-technical stakeholders.
  • Experience mentoring engineers and leading technical initiatives within a team environment.

Benefits

  • Remote-First Culture - We supported #remotelife long before it was a given. We'll keep promoting it.
  • Competitive Salary & Equity
  • Comprehensive Medical, Dental, and Vision Plan Offerings
  • Life and disability coverage including voluntary options
  • Parental Leave - up to 8 weeks (320 hours) of paid parental leave based on meeting eligibility requirements (Birthing parents may be eligible for additional leave through STD)
  • 401K Company Contribution - Openly contributes 3% of the employee's gross income, even if the employee does not contribute.
  • Work-from-home stipend - We provide a $1,500 allowance to spend on setting up your home workspace.
  • Annual Professional Development Fund - Each employee has $2,000 in professional development (PD) funds to spend annually on activities or resources. We want each Openly employee to achieve personal and professional success and to feel supported, confident, and informed about improving their efficiency and productivity.
  • Be Well Program - Employees receive $50 per month to put toward their overall well-being
  • Paid Volunteer Service Hours
  • Referral Program and Reward
