The Standard in Apple Enterprise Management
Data Platform Engineer II
Location
United States
Posted
45 days ago
Salary
$85.1K - $181.7K / year
Seniority
Mid Level
Job Description
Role Description
Business Intelligence at Jamf powers data-driven decision-making across the organization. As a Data Platform Engineer II, you’ll be responsible not just for building & transforming data, but for owning critical data infrastructure: from ingestion and storage, to governance, quality, and consumption by analytics/ML tools. You will partner with analysts, data scientists, product owners and engineers to ensure that Jamf’s data assets are reliable, high-performance, secure, and scalable.
Candidates who live near a Jamf office may be expected to work periodically in the office, or at a collaborative work location with other Jamf employees in their area, for certain events or moments that matter.
What you can expect to do in this role:
- Design, build, maintain, and improve the data platform infrastructure (Snowflake environments, Airflow workflows, orchestration, and CI/CD pipelines for dbt transformations)
- Develop and maintain Terraform (or equivalent IaC) definitions for provisioning data infrastructure (compute, storage, permissions, networking where needed)
- Automate deployment of data transformations (e.g. dbt CI/CD, staging / production pipelines)
- Ensure data platform availability, reliability, security, and performance (e.g. enforce roles and permissions in Snowflake, monitor resources, optimize concurrency and usage)
- Instrument monitoring, logging and alerting of data workflows (Airflow / Kubernetes / dbt jobs)
- Collaborate with Data Engineers / Analysts / Architects to define platform capabilities, set standards & best practices around schema design, governance, version control, and performance
- Run capacity planning and ensure cost-efficiency and a sound scaling strategy (e.g. concurrency limits, Snowflake warehouse sizing, cluster autoscaling)
- Facilitate onboarding of teams to the data platform: document usage patterns, create templates or utilities (for example dbt macros, shared libraries)
- Participate in architecture reviews, evaluate new platform tooling (e.g. enhancements to orchestration, transformation frameworks, security strategy, etc)
- Troubleshoot critical incidents and participate in incident / post-mortem cycles for platform issues
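The monitoring-and-alerting work described above could be sketched, in a toy stdlib-only form, as a table-freshness check that flags SLA breaches (the table names and SLAs here are purely illustrative, not Jamf's actual pipelines):

```python
import logging
from datetime import datetime, timedelta, timezone

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("platform-monitor")

# Hypothetical freshness SLAs per table; in practice these might come
# from dbt source freshness configs or a metadata store.
SLAS = {
    "raw.events": timedelta(hours=1),
    "analytics.daily_revenue": timedelta(hours=24),
}

def stale_tables(last_loaded, now=None):
    """Return tables whose most recent load breaches their freshness SLA.

    last_loaded maps table name -> timezone-aware datetime of last load.
    """
    now = now or datetime.now(timezone.utc)
    breaches = []
    for table, sla in SLAS.items():
        loaded = last_loaded.get(table)
        if loaded is None or now - loaded > sla:
            # A real platform would page or post to a channel here;
            # logging a warning stands in for that alerting hook.
            log.warning("freshness breach: %s (last load: %s)", table, loaded)
            breaches.append(table)
    return breaches
```

A scheduler (Airflow, cron) would run a check like this on an interval and route the breaches to an alerting destination.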
Qualifications
- Minimum of 3 years' experience building data pipelines with Python (Required)
- Minimum of 3 years' experience working with a data warehouse or other cloud-based database technology, with strong proficiency in SQL (Required)
- Experience with Docker / Kubernetes (Required)
- Exposure to Infrastructure-as-Code (IaC) tooling such as Terraform, or to DevOps practices (Required)
- Experience working with dbt (Preferred)
- Strong experience with cloud infrastructure: AWS (EC2, ECR, S3, Glue, RDS, etc.) or an equivalent public cloud provider
- Hands-on experience in CI/CD, version control, unit / integration testing for data pipelines
- Comfortable working in agile teams and mentoring others
- Strong communication skills
- Excellent interpersonal skills
- Excellent organizational skills
- Proven analytical skills
- Ability to explain complex technical concepts in an easy-to-understand, non-technical manner
- Ability to interact effectively with co-workers in a results-driven culture
- Self-starter, energetic multi-tasker, highly motivated team player
- Ability to engage with and establish trust and rapport with all levels of customers and employees
- Agile practitioner experienced in Scrum or Kanban
- General knowledge of Apple products and ecosystems
- Bachelor's degree in Mathematics, Computer Science, or a related field (Required)
- A combination of relevant experience and education may be considered
Requirements
- Participation in ongoing security training is mandatory
- Adhere to established security protocols, handle sensitive data responsibly, and follow data protection practices, including understanding relevant privacy regulations and reporting breaches
- All roles at Jamf require acknowledging the Jamf Code of Conduct, where applicable security and privacy policies can be found
Benefits
- Named a 2025 Best Companies to Work For by U.S. News
- Named a 2024 Best Technology Company to Work For by U.S. News
- Named one of Forbes Most Trusted Companies in 2024
- Named a 2024 Best Companies to Work For by U.S. News
- Opportunity to make a real and meaningful impact for more than 75,000 global customers
- Support for new innovations and OS releases the moment they are made available by Apple
- Work with a small and empowered team where the culture is based on trust, ownership, and respect
- Clear career path that enables you to grow under supportive leadership and management
- Access to the Jamf Engineering blog for insights on innovative projects
- Pay Transparency Range: $85,100 — $181,700 USD