Principal Data Engineer required by our fixed income trading software client in London.
You MUST have the following:
- Strong experience as a Principal Data Engineer, Lead Data Engineer or Senior Data Engineering Architect
- Advanced architecture and design skills
- Expert-level Java or Python
- Experience with GCP or AWS
- Experience in data lake design
- Apache Spark/PySpark for trade analytics
- Kafka for low-latency trading
The following is DESIRABLE, not essential:
- Data engineering within live trading environments
- Cloud strategy definition and implementation
Role: You will join a global team of 11 that reports into the New York-based Senior Data Engineering Manager. He is responsible for people management, communication with other teams and senior management, budgeting and some delivery. You will take responsibility for technical strategy, delegation of work, prioritisation, code and design reviews, the implementation of standards and processes, and some delivery. This team is part of a wider department of about 70 engineers that builds the platforms behind the company's trading applications, data products and post-trade products. You will be tasked with assessing the existing data lake, then defining, designing and delivering its upgrade. Further, the company has moved towards AWS, but this strategy needs to be better defined and implemented. You will collaborate with the cloud team and contribute heavily to the data aspects of this work. You should be able to demonstrate experience of having done this in the past with AWS, GCP or Azure.
The role requires Python or Java, ideally some of both. In the first 12 months, in addition to the cloud work, you will also be involved in implementing standards, putting processes in place and building a high-performing team. As the team is responsible for the environment that allows the analytics and trading services to function, you will need Kafka and Spark experience.
Hybrid working is extremely flexible, as long as you are able to make it into the New York office if required.
Salary: £130k - £170k + 20% Bonus + Share Options + Pension