Data Engineering Lead
Location: United States
Posted: 8 hours ago
Salary: Not specified
Job Description
Job Requirements
- Azure Databricks:
  - Expert-level experience managing workspaces, clusters, and job scheduling.
  - Solid understanding of data lakehouse architectures and Delta Lake.
  - Proven experience in performance tuning, Spark optimization, and cost reduction.
- PySpark: Advanced proficiency in the Spark DataFrame API and Spark SQL for large-scale data processing across varied data formats.
- SQL Mastery: Exceptional ability to write, tune, and troubleshoot complex queries.
- PostgreSQL: Hands-on experience with relational database design, indexing, and performance optimization.
- ETL/ELT Frameworks: Proven track record of building scalable data pipelines from scratch.
- Workflow Orchestration: Experience with Apache Airflow for managing complex task dependencies.
- Containerization: Familiarity with Azure Kubernetes Service (AKS) for deploying containerized data services.
- Infrastructure as Code (IaC): Knowledge of Terraform or Bicep for managing Azure resources.
- 10+ years of experience in Data Engineering or Software Engineering.
- 3+ years as a formal technical lead, managing an agile team and delivering end-to-end (E2E) solutions.
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- Strong communication skills with the ability to explain complex technical concepts to non-technical stakeholders.
- Strong problem-solving skills and attention to detail.
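To give a concrete flavor of the "SQL mastery" and "relational database design, indexing, and performance optimization" requirements above, here is a minimal sketch of how adding an index changes a query plan. It uses SQLite (from the Python standard library) purely as a stand-in for PostgreSQL, and the `events` table and its columns are invented for illustration; on PostgreSQL you would use `EXPLAIN`/`EXPLAIN ANALYZE` instead of SQLite's `EXPLAIN QUERY PLAN`.

```python
import sqlite3

# Illustrative only: SQLite stands in for PostgreSQL, and the schema is invented.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, ts TEXT)")
cur.executemany(
    "INSERT INTO events (user_id, ts) VALUES (?, ?)",
    [(i % 100, f"2024-01-{i % 28 + 1:02d}") for i in range(1000)],
)

# Without an index, filtering on user_id forces a full table scan.
plan_before = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM events WHERE user_id = 42"
).fetchall()

# After adding an index, the planner switches to an index lookup.
cur.execute("CREATE INDEX idx_events_user_id ON events (user_id)")
plan_after = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM events WHERE user_id = 42"
).fetchall()

print(plan_before[0][-1])  # full scan of events
print(plan_after[0][-1])   # search using idx_events_user_id
```

The same habit — inspecting the plan before and after a schema or query change — is the core loop behind the performance-tuning work this role describes, whether in PostgreSQL, Spark SQL, or Databricks.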
More Data Engineer Jobs
Data Platform Engineer (26-27)
IDEA Public Schools: At IDEA, the Staff Experience Team uses our Core Values to promote human connection and a culture of integrity, respect, and belonging for all Team and Family members.
The Data Platform Engineer will design, build, and maintain automated ELT pipelines ingesting data into Snowflake, configuring cloud-native tools and custom Python pipelines as necessary. This role also involves administering Snowflake environments, optimizing performance and cost, and managing infrastructure using infrastructure-as-code patterns.
Data Architect – MuleSoft Integration, Migration Specialist
ICF: We are not a typical consulting firm, and our people are not typical consultants.
Data Architect specializing in MuleSoft integration for government contracts.
Data Engineer III
eSimplicity: An engineering firm that delivers high-quality Healthcare IT, Cybersecurity, and Telecommunication solutions.
The role involves developing, expanding, and optimizing data and data pipeline architecture, including maintaining ETL processes and building Proofs of Concept using various cloud technologies. Responsibilities also include operating large-scale data processing pipelines, performing data quality analysis, and building infrastructure for optimal data extraction, transformation, and loading.
Data Architect / MuleSoft Integration & Migration Specialist (Remote)
ICF: ICF is a global advisory and technology services provider, but we're not your typical consultants. We combine unmatched expertise with cutting-edge technology to help clients solve their most complex challenges, navigate change, and shape the future. We can only solve the world's toughest challenges by building a workplace that allows everyone to thrive. We are an equal opportunity employer. Together, our employees are empowered to share their expertise and collaborate with others to achieve personal and professional goals.
The role involves leading the design and implementation of scalable data solutions for a Salesforce Government Cloud Platform deployment, focusing on MuleSoft integration and secure data migration from legacy systems. Responsibilities include developing integration solutions, defining data models, ensuring compliance with federal standards, and leading data migration efforts.