Tech Lead, Data & AI

Data Engineer · Full Time · Remote · Lead · Team 51-200

Location

United States

Posted

23 hours ago

Salary

Not specified

Seniority

Lead

Python · AWS · Apache Airflow · Terraform · Data Lake Architecture · CI/CD · Monitoring · Observability · Cloud-Native Architecture · Production Security · Enterprise Security Standards

Job Description

About Kinetic
Backed by Nationwide, Kinetic is a tech-first Managing General Agent (MGA) focused on leveraging AI-driven technology and expert human support to transform workers' compensation claims management. By utilizing data to create actionable steps from the moment of injury through closure, Kinetic works to reduce workplace injuries, streamline the claims process, and lower premiums for midsize and large businesses in safety-critical industries like parcel delivery, manufacturing, and wholesale/warehousing operations. We also work directly with enterprise clients.

We’re a remote-first team headquartered in New York City, growing fast—10x in the past two years—and driven by accountability, continuous improvement, and collaboration. If you're excited to build a career with impact, we’d love to meet you.

Kinetic was named to Business Insurance's Best Place to Work in Insurance 2025.

The role:

Reporting to the VP of Engineering, this role leads our Data team, which is responsible for developing and maintaining our production data lake and the LLM-based AI solutions that form the foundation of Kinetic's industry-leading workers' compensation insurance products. You will be hands-on, contributing directly to the design, development, and deployment of new data integrations and agentic AI solutions in a Python-focused, cloud-native stack. You will also lead and manage a small team of engineers, ensuring high performance, working alongside a dedicated Product Manager. This role is highly cross-functional and offers deep exposure to, and support of, business units across the organization.

We’re looking for someone who enjoys tackling complex, real-world challenges, thrives in a fast-paced environment, and is eager to learn and adapt as technologies evolve. You’ll contribute to everything from core infrastructure to experimental features, helping deliver high-impact solutions that reduce workplace injuries and improve operational outcomes for our customers. A strong engineering foundation, curiosity, and a collaborative mindset are key to success in this role.

Responsibilities:
  • Own the architectural direction of Kinetic's data platform and AI products, operating as a player-coach who balances rapid experimentation with production reliability while contributing significant hands-on code.
  • Design, develop, and implement robust, cloud-native data and AI solutions, writing clean, well-tested, and maintainable code in a team environment.
  • Participate in architecture and code reviews to ensure high-quality, scalable systems, and implement best practices for security, reliability, and observability.
  • Lead root-cause analysis of complex data and AI system issues, driving durable fixes and continuous improvement.

Basic Qualifications:
  • Bachelor's degree in Computer Science or a related field and at least 5 years of relevant hands-on software engineering experience.
  • At least 2 years' experience leading projects and mentoring engineers, with the ability to manage and grow a small team.
  • Deep experience building and operating Python-based data pipelines and production-grade cloud-native systems (CI/CD, monitoring, long-term maintainability).
  • The working language at Kinetic is English.

Preferred Qualifications:
  • Experience developing and deploying AI features in production software, especially those leveraging LLMs or agentic AI patterns.
  • Experience with AWS, Apache Airflow, and Terraform.
  • Strong testing and documentation practices.
  • Experience with production security practices in the cloud, including building systems that meet enterprise security standards and industry best practices.
  • Experience operating in early-stage or high-growth startup environments where pragmatism and speed matter.

