Remitee

Remitee is an international, expanding organization with a vibrant culture that sets us apart. Our work environment is fast-paced and stimulating, offering numerous opportunities for growth and development. If you're a self-starter who thrives in a collaborative and challenging environment, we encourage you to apply. Our company values are fundamental to our daily operations. To succeed here, you'll need to embrace and live them:

  • We build trust (Integrity and Transparency): we inspire through example, fulfill promises, and communicate sincerely.
  • We embrace diversity (Respect and Empathy): we listen and connect, valuing diverse perspectives, and we recognize achievements and efforts.
  • We work as a team (Teamwork): we trust in the synergy that emerges from effort and collaboration, forging authentic bonds by offering opportunities and sharing responsibilities.
  • We focus on what is essential (Simplicity): we simplify complexity, construct effective solutions, and promote simple, accessible communication.
  • We create our best version (Excellence): we act with discipline and perseverance, take care of our physical and mental well-being, and live with passion and purpose in everything we do.

Senior Data Engineer

Data Engineer | Full Time | Remote

Location: United States

Posted: 2 days ago

Salary: Not specified

SQL, Python, Databricks, Distributed Data Processing, Terraform, AWS, Azure, CI/CD, Docker, MongoDB

Job Description


Role Description

Remitee, a rapidly expanding fintech company specializing in international payments, is seeking a Senior Data Engineer to join its Data Team. We are looking for a highly autonomous builder: someone who can take an ambiguous business problem, design the solution end-to-end, and execute it. This includes data modeling, pipeline development, optimization, and, when necessary, infrastructure setup. This is not a DevOps role; however, we value engineers who are not blocked by infrastructure and can deploy what they need to move forward.

Key Responsibilities

  • Own the end-to-end implementation of data solutions, from problem definition to production deployment.
  • Design scalable data models and pipelines that support business-critical use cases.
  • Improve performance, reliability, and cost-efficiency of our Databricks environment.
  • Lead technical decisions around data architecture and implementation patterns.
  • Ensure production-grade quality through testing, code review, observability, and monitoring.
  • Remove blockers independently, including infrastructure setup when necessary.
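
As a toy illustration of the testing and observability responsibilities above, here is a minimal batch quality check in plain Python. All names, fields, and thresholds are invented for the example; a real pipeline would wire the emitted metric into a monitoring system.

```python
# Hypothetical pipeline health check: fail fast when a batch looks wrong,
# and log a metric a monitor could alert on. Fields and thresholds are invented.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline.checks")

def check_batch(rows, min_rows=1, max_null_ratio=0.05):
    """Return True if the batch passes basic volume and completeness checks."""
    if len(rows) < min_rows:
        log.error("empty batch")
        return False
    nulls = sum(1 for r in rows if r.get("amount") is None)
    ratio = nulls / len(rows)
    log.info("null_ratio=%.3f", ratio)  # the metric a dashboard would track
    return ratio <= max_null_ratio

batch = [{"amount": 10.0}, {"amount": None}, {"amount": 5.0}]
ok = check_batch(batch)
print(ok)  # 1/3 of rows are null, above the 5% threshold, so the check fails
```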

What you’ll build / first 90 days:

  • Partner with functional teams to clarify requirements and deliver 1–2 high-impact data solutions, running in Databricks and (when needed) cloud services (e.g., functions/jobs).
  • Review the current data platform (pipelines, tables, costs/performance) and produce a short, actionable improvement plan with clear priorities.
  • Build and productionize core datasets in Databricks across Bronze/Silver/Gold, including documentation, quality checks, and ownership boundaries.
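
As a rough sketch of the Bronze/Silver/Gold layering mentioned above: the minimal in-memory Python below uses lists of dicts where Databricks would use Delta tables and Spark jobs. The record shapes and field names are illustrative assumptions, not Remitee's actual schema.

```python
# Minimal medallion-architecture sketch: Bronze (raw) -> Silver (cleaned) -> Gold (aggregated).
# Plain lists of dicts stand in for Delta tables; all names are illustrative.
from collections import defaultdict

def bronze_ingest(raw_events):
    """Land raw payment events as-is, tagging each record with its source."""
    return [{**e, "_source": "payments_api"} for e in raw_events]

def silver_clean(bronze):
    """Drop malformed rows and normalize types -- the quality-check layer."""
    out = []
    for e in bronze:
        if e.get("amount") is None or e.get("currency") is None:
            continue  # a real pipeline would quarantine these; dropped here for brevity
        out.append({"tx_id": e["tx_id"],
                    "amount": float(e["amount"]),
                    "currency": e["currency"].upper()})
    return out

def gold_aggregate(silver):
    """Business-level rollup: total amount remitted per currency."""
    totals = defaultdict(float)
    for e in silver:
        totals[e["currency"]] += e["amount"]
    return dict(totals)

raw = [{"tx_id": 1, "amount": "100.0", "currency": "usd"},
       {"tx_id": 2, "amount": None, "currency": "usd"},    # fails the quality check
       {"tx_id": 3, "amount": "250.5", "currency": "eur"}]
gold = gold_aggregate(silver_clean(bronze_ingest(raw)))
print(gold)  # {'USD': 100.0, 'EUR': 250.5}
```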

Qualifications

  • Strong SQL skills (query optimization, partitioning, performance tuning).
  • Strong Python skills.
  • Experience building and deploying jobs in Databricks.
  • Solid understanding of distributed data processing.
  • Experience working with large datasets.
  • Ability to independently design and implement end-to-end solutions.
  • Comfort using Bash and cloud CLIs (AWS/Azure) when needed.
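
To illustrate the query-tuning side of the SQL qualification, the small stdlib-only example below shows the same filter going from a full table scan to an index search once an index exists. The table and column names are made up; the exact planner wording varies by SQLite version.

```python
# Query-tuning illustration: compare SQLite's query plan before and after
# adding an index on the filtered column. Table and column names are invented.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE transactions (tx_id INTEGER, currency TEXT, amount REAL)")
con.executemany("INSERT INTO transactions VALUES (?, ?, ?)",
                [(i, "USD" if i % 2 else "EUR", i * 1.5) for i in range(1000)])

def plan(sql):
    """Return the planner's description of how it will execute `sql`."""
    return " ".join(row[-1] for row in con.execute("EXPLAIN QUERY PLAN " + sql))

q = "SELECT SUM(amount) FROM transactions WHERE currency = 'USD'"
before = plan(q)  # a full table SCAN
con.execute("CREATE INDEX idx_tx_currency ON transactions (currency)")
after = plan(q)   # a SEARCH using the new index

print(before)
print(after)
```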

Nice to Have

  • Infrastructure-as-Code experience (Terraform, ARM, CloudFormation).
  • Experience with Docker and CI/CD pipelines.
  • Experience deploying databases or services in cloud environments.
  • Experience with NoSQL databases (e.g., MongoDB).
  • Understanding of Lakehouse architectures.
  • Strong system design fundamentals.
  • Open-source contributions.


