Data Engineer

Employer
BNY Mellon Investment Management
Location
Pittsburgh, USA
Salary
Competitive
Closing date
Oct 10, 2020

Job Function
Other
Industry Sector
Finance - General
Employment Type
Full Time
Education
Bachelors
Mellon is a global multi-specialist manager dedicated to serving our clients with a full spectrum of single and multi-asset investment strategies and solutions. With roots dating back to 1933, Mellon has been innovating across asset classes for generations and has the combined scale and capabilities to offer clients a broad range of solutions. From asset class expertise to broad market exposures, clients drive what we do. We are holistic in approach, client driven and committed to investment excellence. We aim to be a key partner for our clients by delivering customized investment outcomes and best-in-class service.
Role Overview
The Data Engineer is responsible for building and supporting systems to transform, store, and improve processes around data for Mellon Research. This role will focus on the Mellon research data pipeline, warehouse, databases, and BI tooling. He/she will work with business analysts, data scientists, and other data engineers to facilitate ETL/ELT processes that move, clean, and store data. The engineer will also be tasked with creating data accessibility points and tooling to enable reporting insights with ease of use and maintenance in mind. The data engineer is expected to provide input to end-state design and schema while enforcing best practices.
Responsibilities
Design, build, and maintain efficient and progressive data infrastructure for Mellon research across disparate research silos in San Francisco, Boston, and Pune, focusing on creating a transparent data environment.
  • Engage in a variety of tactical projects including but not limited to ETL, storage, visualization, reporting, web-scraping, and dashboard development
  • Support, document, and evolve (re-architect as needed) existing core data stores
  • Utilize ETL tooling to build, template, and rapidly deploy new pipelines for gathering and cleaning data
  • Analyze existing data stores / data marts, then clean and migrate them into a centralized data lake
  • Work with Technology and Research leads to implement central and/or virtualized warehousing solutions
  • Develop APIs for accessing data, for use by business users (i.e., researchers and portfolio managers)
  • Configure Tableau dashboards and reports while serving as SME for end consumers of data
  • Identify and deploy advanced BI tooling on top of datasets including AI/ML/DL techniques and algorithms
  • Assist in the design and development of enterprise data standards and best practices
  • Use modern tooling to focus on progressive technology, expand business capabilities, and improve time to market

Work closely with business analysts, data scientists, and technologists through full project lifecycles, which will provide deep insight on research needs, business processes, and research practices.
  • Gather requirements and analyze solution options
  • Develop solutions and define and execute test plans
  • Define and implement operational procedures
  • Automate the research and review of data quality issues to ensure data accuracy and reliability
  • Resolve data integrity and data validation issues
  • Produce ad-hoc queries and reports for non-standard requests from data scientists and data consumers
  • Become an SME on the full suite of solutions delivered by the Research Data Engineering team, with an eye to identifying, analyzing, and interpreting trends or patterns in order to surface new solution options, define process improvement opportunities, and generate value for our business partners


Qualifications
  • Bachelor's degree or equivalent work experience required
  • 8+ years of experience as a data engineer, software engineer, or similar
  • Strong experience building ETL pipelines and knowledge of ETL best practices
  • Experience with overall data architecture and data routing design
  • Familiarity with data quality control tools and processes
  • Strong communication skills and a keen attention to detail
Technical Qualifications:
Candidate is not expected to have expertise in all technical areas listed but should be highly proficient in several of these, including:
  • SQL, R, Python, Matlab, SSIS, Pentaho/Kettle, Excel, Tableau, MongoDB, Kafka, Hive/Spark, Parquet
  • Experience with CI/CD, containers, and related frameworks: GitLab, Selenium, Docker, Kubernetes
  • Disciplines: Microservice Architecture, Design Patterns
  • Environment Tooling: Agile, JIRA, Confluence
  • Familiarity with RDBMS and/or NoSQL databases and related best practices
Nice to Have Qualifications:
  • Experience working in investment research and/or quantitative finance
  • Advanced Degree or CFA
  • Development experience with R or Python in a data-science or research setting
  • Knowledge/Experience with financial data provider APIs (Bloomberg/FactSet/Datastream/MSCI)
  • Experience in EAGLE PACE Access and Oracle
  • Knowledge/Experience with the following technologies:
    • Symphony (STC)
    • .Net Core
    • Snowflake
    • Dataiku
  • Cloud and distributed computing experience
  • Big Data experience


BNY Mellon is an Equal Employment Opportunity/Affirmative Action Employer.
Minorities/Females/Individuals With Disabilities/Protected Veterans.

Our ambition is to build the best global team - one that is representative and inclusive of the diverse talent, clients and communities we work with and serve - and to empower our team to do their best work. We support wellbeing and a balanced life, and offer a range of family-friendly, inclusive employment policies and employee forums.

Primary Location: United States-Pennsylvania-Pittsburgh
Internal Jobcode: 45276
Job: Asset Management
Organization: Mellon With TOH ADJ-HR13428
Requisition Number: 2007800
