Bestow

Building cutting-edge technology and data solutions for life insurance and annuities.

Staff Data Engineer

Data Engineer · Full Time · Remote · Team 51-200 · Since 2017 · H1B: No Sponsor

Location

United States

Posted

137 days ago

Salary

$190K - $210K / year

Bachelor's Degree · 10 yrs exp · English · Airflow · Amazon Redshift · Apache · AWS · Azure · BigQuery · Cloud · Docker · Google Cloud Platform · GraphQL · gRPC · Python · SQL · Terraform

Job Description

  • Define and drive the technical roadmap for data infrastructure, establishing architectural patterns and standards that scale across the organization
  • Lead the design and implementation of complex, multi-system data architectures that support business-critical operations and enable innovation (data ingestion, export, and delivery)
  • Evaluate and champion adoption of emerging technologies and best practices in data engineering, MLOps, and GenAI
  • Establish data governance frameworks, quality standards, and operational excellence practices across all data workloads
  • Drive cross-functional initiatives that require coordination between data, product, engineering, and business teams
  • Architect enterprise-scale data solutions for transferring data between first- and third-party applications and our data warehouse
  • Design and oversee the development of robust, scalable APIs (REST, GraphQL, gRPC) that enable data access for internal teams and external partners
  • Lead the evolution of event-driven and API-first data architectures that support real-time data sharing and integration
  • Leverage Google Cloud (GCP) tools (Cloud Run, Cloud Functions, Vertex AI, App Engine, Cloud Storage, IAM, etc.) and services (Astronomer - Apache Airflow) to architect and bring enterprise data workloads to production
  • Design resilient, self-healing data systems with comprehensive monitoring, alerting, and automated remediation, and participate in an on-call rotation
  • Lead the evolution of our data platform on Google Cloud (GCP), leveraging advanced services and optimizing for cost, performance, and reliability
  • Define patterns for streaming and batch data architectures that serve diverse use cases
  • Establish best practices for data contracts, API versioning, CI/CD, documentation, and partner integrations
  • Lead MLOps strategy and implementation, establishing patterns for model deployment, monitoring, and governance at scale
  • Architect and oversee Generative AI infrastructure, enabling rapid prototyping while ensuring enterprise-grade security, compliance, and cost management
  • Partner with Data Science leadership to translate research initiatives into production-ready solutions
  • Drive innovation in AI/ML tooling and infrastructure, staying ahead of industry trends
  • Mentor and guide Data Engineers at all levels, conducting design reviews and providing technical feedback
  • Establish engineering standards, documentation practices, and knowledge-sharing processes
  • Participate in hiring and onboarding processes, helping to build a world-class data engineering team
  • Foster a culture of engineering excellence, experimentation, and continuous improvement
  • Partner with product, engineering, and business leaders to align data strategy with organizational goals
  • Communicate complex technical concepts to non-technical stakeholders, building alignment and driving informed decision-making
  • Represent data engineering in cross-functional planning and architecture forums
  • Build strong relationships with external partners and vendors

Job Requirements

  • 10+ years working in a data engineering role that supports incoming/outgoing feeds as well as analytics and data science teams
  • 5+ years of advanced Airflow and Python experience writing production-grade, efficient, testable, and maintainable code
  • 3+ years of experience designing, building, and maintaining production APIs (REST, GraphQL, gRPC) for data access and integration, including API gateway management, rate limiting, authentication/authorization, and versioning strategies
  • 3+ years leading ML/MLOps initiatives, including model deployment, monitoring, and governance at scale
  • 3+ years of hands-on experience with Google Cloud Platform (GCP), including Cloud Run, Cloud Functions, Vertex AI, Cloud Storage, IAM, and other core services
  • Deep expertise with columnar databases (BigQuery, Snowflake, Redshift) and advanced SQL optimization techniques
  • Demonstrated experience with AI coding assistants – AI tools are heavily ingrained in Bestow culture
  • Proven track record designing end-to-end data pipelines in cloud frameworks (such as GCP, AWS, Azure) with requirements from multiple stakeholders
  • Experience with upstream data coordination through data contracts
  • Experience building CI/CD pipelines for data processing using tools such as Docker, CircleCI, dbt, and git
  • Extensive experience with infrastructure as code (Terraform, Pulumi) and GitOps practices
  • Expert-level knowledge of data orchestration frameworks such as Apache Airflow (or similar) to manage SLOs and processing dependencies
  • Experience building streaming/real-time ingestion pipelines
  • Experience creating alerts and monitoring pipelines that contribute to overall data governance
  • Experience with containerization and container orchestration technologies, including cloud architecture and implementation features (single- and multi-tenancy, orchestration, elastic scalability)
  • Deep understanding of standard IT security practices such as identity and access management (IAM), data protection, encryption, and certificate and key management
  • Adaptability to learn new technologies and products as the job demands
  • Proven ability to mentor engineers and lead technical initiatives across teams
  • Nice to have: Familiarity with building tools that draw on Generative AI (GenAI) integrations (enterprise-grade, not simply vibe-coded)

Benefits

  • Competitive salary and equity based on role
  • Policies and managers that support work/life balance, like our flexible paid time off and parental leave programs
  • 100% paid-premium option for medical, dental, and vision insurance
  • Lifestyle stipend to support your physical, emotional, and financial wellbeing
  • Flexible work-from-home policy with fully remote options, as well as a beautiful, state-of-the-art office in Dallas’ Deep Ellum for those who prefer an office setting
  • Employee-led diversity, equity, and inclusion initiatives
