Job Description:
About Dimensional:
Dimensional was built around a set of ideas bigger than the firm itself. With confidence in markets, deep connections to the academic community, and a focus on implementation, we go where the science leads, and continue to pursue new insights, both large and small, that can benefit our clients.
The Technology Department at Dimensional leverages the rapidly evolving state of the art to engineer the platforms that power innovative, research-driven financial and technical products that improve our clients’ financial lives.
As a Senior Python Engineer within the Data Distribution team, you will participate in the management of Dimensional’s enterprise investment data warehouse, which supports Research, Portfolio Management, Trading, and Analytics functions.
You will have the opportunity to understand our clients’ needs, collaborate on the design of solutions, and work with emerging data engineering tools and best practices. In this role, you will design, develop, document, and test multiple application services, focusing on building scalable data platforms and services. You will also expand and optimize our data and data pipeline architecture.
A successful candidate will demonstrate strong technical and analytical ability across multiple tech stacks, as well as a passion for optimizing and building data applications from the ground up.
You may be a fit for this role if you:
Are open-minded, curious, and resourceful.
Lead with vision and purpose to bring about transformational change.
Are passionate about, and stay current with, modern technologies and solutions.
Solve problems systematically and transparently.
Share ideas, solicit and integrate feedback, and design and solve collaboratively.
Demonstrate engineering and security mindsets.
What you might work on:
Build and deliver investment data technology solutions in support of Research, Portfolio Management, Trading, Analytics and Reporting functions.
Formulate, design, develop, test, and deliver data technology solutions with a balanced focus on speed and quality.
Collaborate with business analysts, product owners, and project managers to develop user stories, estimates, and work plans.
Work with minimal supervision, advise business clients and IT management on technology capabilities, and recommend strategies to maximize the benefits of new technologies.
Identify, design, and implement changes to data pipelines at various stages, including data ingestion, data validation and quality control, data integration, storage and management, and data delivery.
Write unit/integration tests, contribute to the engineering wiki, and write detailed documentation.
Build high-performance, scalable data-transfer toolsets that reliably transfer datasets between endpoints within established SLAs.
Focus on data consistency, refresh rates, and caching requirements while keeping data current across a variety of interfaces.
Qualifications:
Bachelor’s degree in engineering, math, computer science, or a related field, or equivalent work experience.
4-5 years of programming experience in Python (open source) or equivalent.
Proficiency in building RESTful APIs and web services.
4-5 years of SQL experience.
Proven track record of applying SOLID principles and Domain-Driven Design to drive successful outcomes.
Experience with high-performance, high-availability data applications, including expertise in performance optimization and tuning.
Experience with automated acceptance testing and ability to write unit-tested, maintainable code.
Strong understanding of cybersecurity risks and demonstrated ability to design and build highly secure applications.
Experience working in a dynamic and interactive team environment to build world-class software implementations.
Knowledge of best practices and IT operations in an always-up, always-available service.
Experience working with both Agile/Scrum and waterfall methodologies, with a software development and integration focus.
Preferred Competencies:
Master’s degree in engineering, math, computer science, or a related field
Proficiency with NoSQL database implementation and optimization
Ability to work with multiple programming languages and platforms is strongly preferred
Financial services industry knowledge or experience
Experience with the following:
Kafka
Airflow
PostgreSQL
Ansible
Elastic Stack
RabbitMQ
Redis
Docker
Okta, OAuth2, PlainID
#LI-Remote
Dimensional offers a variety of programs to help take care of you, your family, and your career, including comprehensive benefits, educational initiatives, and special celebrations of our history, culture, and growth.
It is the policy of the Company to provide equal opportunity for all employees and applicants. The Company recruits, hires, trains, promotes, compensates, and administers all personnel actions without regard to actual or perceived race, color, religion, religious practice, creed, sex, sex stereotyping, pregnancy (which includes pregnancy, childbirth, and medical conditions related to pregnancy, childbirth, or breastfeeding), caregiver status, gender, gender identity, gender expression, transgender identity, national origin, age, mental or physical disability, ancestry, medical condition, marital status, familial status, domestic partnership status, military or veteran status or service, unemployment status, citizenship status or alienage, sexual orientation, status as a victim of domestic violence, status as a victim of stalking, status as a victim of sex offenses, genetic information, political activities or recreational activities, arrest or conviction record, salary history, natural hairstyle or any other status protected by applicable law except as otherwise required or permitted by law or regulation applicable to the Company or its affiliates.