IDEA Public Schools

At IDEA, the Staff Experience Team uses our Core Values to promote human connection and a culture of integrity, respect, and belonging for all Team and Family members.

Data Platform Engineer (26-27)

Data Engineer · Full Time · Remote · Team size 10,001+

Location: United States

Posted: 1 day ago

Salary: $89.6K - $105K / year


Job Description

 

Data Platform Engineer

 

This is a vacancy for the 2026-27 school year, with a target start date of July 1, 2026.

Mission:

The Data Platform Engineer builds and operates IDEA’s data infrastructure on Snowflake, enabling reliable, scalable access to data that supports analytics, reporting, and research across multiple states. This role designs automated ingestion pipelines, optimizes platform performance and cost, and ensures the data platform functions as a production-grade system for downstream teams.

Reporting to the Manager of Data Platform Engineering, this engineer works hands-on with ELT pipelines, infrastructure-as-code, and Snowflake administration while contributing to IDEA’s transition from legacy ETL systems to a modern lakehouse architecture. 

 

Supervisory Responsibilities:

Individual contributor role with no direct reports. Senior engineers may mentor peers and lead technical initiatives. 

 

Location:

This is a full-time remote position based in Texas, with preference given to candidates who live in Austin, El Paso, Houston, Permian Basin (Midland/Odessa), Rio Grande Valley, San Antonio, and Tarrant County (Fort Worth), or who are willing to relocate.

 

Travel Expectations:    

Minimal travel (5–10% annually) for collaboration, training, or critical implementation milestones. 

 

What You’ll Do – Accountabilities

Essential Duties:

  • Design, build, and maintain automated ELT pipelines ingesting data from diverse source systems into Snowflake.
  • Configure and manage cloud-native ingestion tools and custom Python-based pipelines when needed. 
  • Build and maintain Bronze-layer tables with schema evolution handling, audit metadata, and lineage. 
  • Implement ingestion-level validation and monitoring to catch issues early. 
  • Document source configurations, refresh schedules, and troubleshooting procedures. 
  • Partner with Analytics Engineering to ensure ingestion patterns support downstream transformation needs. 
  • Administer Snowflake environments, including databases, schemas, warehouses, access controls, and security settings. 
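To give candidates a concrete feel for the ingestion duties above, here is a minimal, stdlib-only Python sketch of ingestion-level validation with audit metadata of the kind a Bronze-layer load might attach. The `BronzeRecord` shape, field names, and checks are illustrative assumptions for this posting, not IDEA's actual schema or tooling.

```python
import hashlib
import json
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class BronzeRecord:
    """One raw row landed in a Bronze-layer table, plus audit metadata.

    Field names are illustrative assumptions, not IDEA's schema.
    """
    source_system: str
    payload: dict
    loaded_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

    @property
    def payload_hash(self) -> str:
        # A stable hash of the raw payload supports lineage and dedup checks.
        canonical = json.dumps(self.payload, sort_keys=True)
        return hashlib.sha256(canonical.encode()).hexdigest()


def validate_batch(records, expected_min_rows, required_keys):
    """Ingestion-level validation: catch issues before downstream layers see them."""
    errors = []
    if len(records) < expected_min_rows:
        errors.append(
            f"row count {len(records)} below expected minimum {expected_min_rows}"
        )
    for i, rec in enumerate(records):
        missing = required_keys - rec.payload.keys()
        if missing:
            errors.append(f"record {i} missing keys: {sorted(missing)}")
    return errors
```

A batch with a missing required key yields a non-empty error list that a monitor can alert on, which is the "catch issues early" pattern the duties describe.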

 

Additional Duties and Responsibilities:

  • Optimize performance and cost through warehouse sizing, clustering, query analysis, and resource monitoring.
  • Manage Snowflake objects using infrastructure-as-code patterns.
  • Implement security best practices including RBAC, encryption, auditing, and network policies.
  • Evaluate and adopt new Snowflake capabilities as appropriate.
  • Own Terraform-based infrastructure definitions for Snowflake and related platform components.
  • Automate recurring operational tasks such as provisioning, access grants, and environment setup.
  • Build CI/CD pipelines for infrastructure changes with testing and safe deployment practices.
  • Develop reusable templates and modules to accelerate onboarding of new sources and domains.
  • Maintain clear documentation and runbooks for platform operations.
  • Implement monitoring and alerting for pipelines, platform health, and performance.
  • Troubleshoot pipeline failures and platform issues using systematic root-cause analysis.
  • Embed observability (logging, metrics, alerts) into all production pipelines.
  • Collaborate closely with Analytics Engineering, DataOps, and Data Governance partners.
  • Participate in code reviews and design discussions.
  • Share platform knowledge through documentation, mentoring, and team forums.
  • Contribute to retrospectives and continuous improvement efforts.
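As an illustration of the "embed observability into all production pipelines" duty above, here is a hedged, stdlib-only Python sketch of a decorator that logs duration, success, and failure for a pipeline step. The logger name, step names, and log fields are assumptions for demonstration, not a prescribed implementation.

```python
import functools
import logging
import time

logger = logging.getLogger("pipeline")  # illustrative logger name


def observed(step_name):
    """Wrap a pipeline step so every run emits duration and outcome logs."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.monotonic()
            try:
                result = fn(*args, **kwargs)
                logger.info(
                    "step=%s status=ok duration_s=%.3f",
                    step_name, time.monotonic() - start,
                )
                return result
            except Exception:
                logger.exception(
                    "step=%s status=failed duration_s=%.3f",
                    step_name, time.monotonic() - start,
                )
                raise  # re-raise so orchestration can retry or alert
        return wrapper
    return decorator


@observed("load_bronze")
def load_bronze(rows):
    # Placeholder step: in practice this would write rows to Snowflake.
    if not rows:
        raise ValueError("empty batch")
    return len(rows)
```

Because the wrapper re-raises after logging, failures stay visible to the orchestrator for retry or alerting rather than being swallowed by the logging layer.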

 

Knowledge and Skills – Competencies

  • Make Strategic Decisions: This team member uses data, feedback, and insights to inform thoughtful decision-making, while considering the impact on their direct reports and team. They communicate decisions with clear rationale and begin to connect their choices to broader team objectives.
  • Manage Work and Teams: This team member sets clear, measurable goals and regularly reflects on progress, adjusting actions as needed. They prioritize work aligned with their goals using a task management system and consistently meet deadlines through effective time management. 
  • Grow Self and Others: This team member regularly offers affirming and adjusting feedback, maintaining a positive balance that reinforces growth and motivation. They provide transparent, candid performance insights and offer consistent coaching and development aligned with individual goals, supporting both direct reports and cross-functional partners. 
  • Build a Culture of Trust:  This team member proactively builds strong personal and professional relationships with individual stakeholders and regularly seeks feedback to improve their work experience. They create a supportive environment where others feel safe to take risks and learn from mistakes without fear of retribution. 
  • Communicate Deliberately: This team member communicates thoughtfully by anticipating potential misunderstandings and providing necessary context to ensure clarity. They leverage structured communication channels to address challenges, ask meaningful questions, and guide conversations toward solutions, while actively listening to the concerns of others. 

 

Additional Skills:

Required

  • Hands-on experience administering Snowflake or similar cloud data platforms.
  • Strong SQL skills for data extraction, validation, and performance tuning.
  • Experience building and operating automated data pipelines.
  • Proficiency with Python and scripting for automation and operational tooling.
  • Experience operating production systems with monitoring and incident response.
  • Familiarity with infrastructure-as-code and CI/CD concepts.
     

Preferred

  • Experience with cloud-native ingestion tools (e.g., Fivetran, Airbyte).
  • Experience with Terraform or similar IaC tooling.
  • Familiarity with dbt and analytics engineering workflows.
  • Exposure to orchestration, data quality, or observability tools.
  • Experience with education, public sector, or regulated data environments. 

 

Required Education and Experience:

  • Bachelor’s degree in a technical field or equivalent practical experience.
  • 3+ years of experience in data engineering, platform engineering, or related roles.
  • Demonstrated experience building and operating production data systems.
  • Hands-on experience with Snowflake or comparable cloud data warehouses. 

 

Preferred Education and Experience:

  • Snowflake or cloud platform certifications.
  • Experience supporting multi-team data platforms at scale.
  • Strong Python proficiency beyond basic scripting. 

 

Physical Requirements:

  • Prolonged periods working on a computer and in virtual meetings.
  • Ability to travel domestically via car and air to campuses and the state office.
  • Flexibility for occasional evening meetings with distributed stakeholders across time zones.

 

What We Offer:


Compensation & Benefits:

Salaries for people entering this role typically fall between $89,600 and $105,300, commensurate with relevant experience and qualifications and in alignment with internal equity. This role is also eligible for performance pay based on organizational performance and goal attainment.

 

 

Additionally, we offer medical, dental, and vision plans, disability, life insurance, parenting benefits, flexible spending account options, generous vacation time, referral bonuses, professional development, and a 403(b) plan. You can find more information about our benefits at https://ideapublicschools.org/careers/benefits/.

 

* IDEA may offer a relocation stipend to defray the cost of moving for this role, if applicable.

 

 

Application process:

Submit your application online through Jobvite. Please note that applications will be reviewed on an ongoing basis until the position is filled. Applicants are encouraged to apply as early as possible.

 

 

Learn more about IDEA 

Learn more about our Commitment to Core Values here: https://ideapublicschools.org/our-story/#core-values

 

 
