Computer and Information Systems Manager
Location
United States
Posted
4 days ago
Salary
$201.0K - $221.1K / year
Job Description
Role Description
Fox Cable Network Services, LLC seeks a Computer and Information Systems Manager (Manager, Data Engineering) responsible for designing, developing, and maintaining comprehensive Data Lake and data warehouse solutions, and for tackling the challenges of complex, large-scale data drawn from many different source streams. Specific duties include:
- Lead the design, development, and implementation of data architectures, data pipelines, and data systems.
- Manage a team of 3-5 data engineers and provide technical guidance and mentorship.
- Build scalable data infrastructure and understand distributed systems concepts from a data storage and compute perspective.
- Ensure the accuracy and availability of data to customers and understand how technical decisions can impact the business’s analytics and reporting.
- Collaborate with cross-functional teams, business partners, and stakeholders, including senior leadership, to understand business requirements, promote best practices and trade-offs, and provide solutions that meet the organization's goals.
- Develop, democratize, and maintain data as an organizational asset.
- Optimize data processes for efficiency, reliability, and scalability.
- Act as a Subject Matter Expert to the organization for end-to-end data pipelines.
- Ensure quality by performing root cause analysis and troubleshooting of defects.
- Stay current with industry trends and advancements in Data Engineering.
Telecommuting permitted from any location in the U.S.
Qualifications
- Bachelor’s degree in Computer Science, Computer Engineering or related field, plus two (2) years of data engineering or related experience executing multiple large data analytics and engineering projects.
- In lieu of Bachelor’s degree, will accept four (4) years of data engineering or related experience executing multiple large data analytics and engineering projects.
Requirements
- Two (2) years of experience designing and developing Platform Architecture and ETL/Data pipelines.
- Experience developing frameworks and reusable components for AWS cloud data migration and integration.
- Experience analyzing and capturing batch, streaming and near real-time data (ETL) requirements for Enterprise Data Lake and Data Warehouse.
- Experience working with various stakeholders on requirement gathering and on defining project scope, roadmaps, and short- and long-term strategic goals.
- Specific skills required include:
- Programming languages such as Python or Java.
- PySpark or Scala.
- AWS Redshift, Athena, Spectrum, DynamoDB and RDS.
- Amazon Web Services including EC2, Elastic MapReduce (EMR), Glue, Lambda, Kinesis, IAM, ECS, S3, API Gateway and Kafka.
- Agile Methodology, Software Development Life Cycle, Test Strategy and Performance Tuning.
- Databricks, PL/SQL, Shell scripting, Tableau and Looker.
- Jira, Bitbucket, Confluence and Git.
Benefits
- Medical/dental/vision insurance.
- 401(k) plan.
- Paid time off.
- Other benefits in accordance with applicable plan documents.
- Benefits for Union represented employees will be in accordance with the applicable collective bargaining agreement.
Pursuant to state and local pay disclosure requirements, the pay range for this role is $200,970.00-$221,067.00 annually, with the final offer amount dependent on education, skills, experience, and location.