Tech Lead, Data & AI
Location
United States
Posted
23 hours ago
Salary
Not specified
Seniority
Lead
Job Description
- Own the architectural direction of Kinetic's data platform and AI products, operating as a player-coach who balances rapid experimentation with production reliability while contributing significant hands-on code.
- Design, develop, and implement robust, cloud-native data and AI solutions, writing clean, well-tested, and maintainable code in a team environment.
- Participate in architecture and code reviews to ensure high-quality, scalable systems, and implement best practices for security, reliability, and observability.
- Lead root-cause analysis of complex data and AI system issues, driving durable fixes and continuous improvement.
Job Requirements
- Bachelor's degree in Computer Science or a related field and at least 5 years of relevant hands-on software engineering experience.
- At least 2 years' experience leading projects and mentoring engineers, with the ability to manage and grow a small team.
- Deep experience building and operating Python-based data pipelines and production-grade cloud-native systems (CI/CD, monitoring, long-term maintainability).
- The working language at Kinetic is English.
Preferred Qualifications
- Experience developing and deploying AI features in production software, especially those leveraging LLMs or agentic AI patterns.
- Experience with AWS, Apache Airflow, and Terraform.
- Strong testing and documentation practices.
- Experience with production cloud security practices, including building systems that meet enterprise security standards and industry best practices.
- Experience operating in early-stage or high-growth startup environments where pragmatism and speed matter.