This firm is a highly respected, technology-centric investment business operating across a broad range of asset classes. Its success is built on a blend of quantitative research, cutting-edge engineering, and scalable data infrastructure. Engineers here play a central role: they design, build, and maintain the platforms that underpin research, trading, and large-scale data analysis.
It’s a collaborative environment where technical ownership is encouraged, engineering craft is valued, and impactful work directly supports sophisticated investment strategies.
Design and build fast, scalable market-data systems used across trading and research groups.
Work hands-on with Python, cloud-native tooling, containerisation, and large-scale data lake technologies.
Partner closely with exceptional quantitative researchers, data engineers and traders.
Influence architectural decisions and continuously refine pipeline performance.
Benefit from strong compensation and long-term career growth within a high-performing engineering organisation.
Design, implement, and maintain high-throughput, low-latency pipelines for ingesting and processing tick-level market data at scale.
Operate and optimise timeseries databases (KDB, OneTick) to efficiently store, query, and manage granular datasets.
Architect cloud-native solutions for scalable compute, storage, and data processing, leveraging AWS, GCP, or Azure.
Develop and maintain Parquet-based data layers; contribute to evolving the data lake architecture and metadata management (a brief sketch of a partitioned Parquet write path follows this list).
Collaborate closely with trading and quant teams to translate data requirements into robust, production-grade pipelines.
Implement monitoring, validation, and automated error handling to ensure data integrity and pipeline reliability (see the validation sketch after this list).
Maintain clear, precise documentation of data pipelines, architecture diagrams, and operational procedures.
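To make the Parquet-based data layer concrete, here is a minimal sketch of a partitioned tick-data write path in Python using pyarrow. The column set, types, and date/symbol partitioning are illustrative assumptions, not a description of the firm's actual layout.

```python
import pyarrow as pa
import pyarrow.parquet as pq

# Illustrative tick schema -- the real column set and types are assumptions.
TICK_SCHEMA = pa.schema([
    ("ts", pa.timestamp("ns")),   # event timestamp
    ("symbol", pa.string()),      # instrument identifier
    ("price", pa.float64()),
    ("size", pa.int64()),
    ("date", pa.date32()),        # partition key, derived from ts upstream
])

def write_tick_batch(batch: dict, root_path: str = "./tick-lake") -> None:
    """Append one batch of ticks to a date/symbol-partitioned Parquet dataset."""
    table = pa.Table.from_pydict(batch, schema=TICK_SCHEMA)
    pq.write_to_dataset(
        table,
        root_path=root_path,                # could equally be an s3:// or gcs:// URI
        partition_cols=["date", "symbol"],  # hive-style partitions enable pruning
    )
```

Partitioning on date and symbol is one common choice because most research queries filter on exactly those columns, letting readers skip irrelevant files entirely.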
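On the monitoring and validation point, a sketch of a simple batch-validation gate. The rules (non-decreasing timestamps, positive prices and sizes, a 1% drop threshold) are illustrative assumptions; real integrity checks would be richer and feed a dead-letter queue plus alerting.

```python
from datetime import datetime

class TickValidationError(ValueError):
    pass

def validate_batch(ticks: list[dict]) -> list[dict]:
    """Filter malformed rows before they reach the data lake.

    Returns rows that pass; raises if the batch as a whole looks unusable.
    All thresholds and rules here are assumptions for illustration.
    """
    if not ticks:
        raise TickValidationError("empty batch")
    good: list[dict] = []
    last_ts = None
    for row in ticks:
        ts = row.get("ts")
        if not isinstance(ts, datetime):
            continue  # in a real pipeline, route to a dead-letter queue
        if last_ts is not None and ts < last_ts:
            continue  # out-of-order tick: quarantine rather than silently store
        if row.get("price", 0) <= 0 or row.get("size", 0) <= 0:
            continue  # non-positive price/size usually indicates feed corruption
        good.append(row)
        last_ts = ts
    if len(good) < 0.99 * len(ticks):  # alert threshold is an assumption
        raise TickValidationError(
            f"dropped {len(ticks) - len(good)} of {len(ticks)} rows"
        )
    return good
```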
• 3+ years of software engineering experience, preferably focused on market-data infrastructure or quantitative trading systems.
• Strong Python expertise with a solid grasp of performance optimisation and concurrency (a short concurrency sketch follows this list).
• Proven experience designing, building, and tuning tick-data pipelines for high-volume environments.
• Strong background in profiling, debugging, and optimising complex data workflows.
• Experience with timeseries databases (KDB, OneTick) and/or performance-critical C++ components.
• Deep understanding of financial markets, trading data, and quantitative workflows.
• Excellent communication skills with the ability to articulate technical solutions to engineers and non-engineers alike.
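On the concurrency point above, a minimal sketch of one common pattern: fanning out I/O-bound per-symbol queries over a thread pool. The function names and the fetch stub are hypothetical; any real KDB/OneTick client call would slot in where indicated.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def fetch_day(symbol: str, date: str) -> int:
    """Hypothetical I/O-bound call, e.g. a timeseries-database query."""
    ...  # replace with a real client call; returns a row count here for brevity
    return 0

def fetch_universe(symbols: list[str], date: str, workers: int = 16) -> dict[str, int]:
    # Threads suit I/O-bound fan-out; CPU-bound work would instead need
    # processes or native extensions because of the GIL.
    results: dict[str, int] = {}
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = {pool.submit(fetch_day, s, date): s for s in symbols}
        for fut in as_completed(futures):
            results[futures[fut]] = fut.result()
    return results
```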