As the first pan-European exchange – spanning Belgium, France, Ireland, the Netherlands, Portugal and the UK – Euronext operates regulated securities and derivatives markets in Amsterdam, Brussels, Lisbon and Paris, as well as regulated securities markets in Ireland and the UK. Euronext recently developed a new event-driven trading platform, Optiq®, that provides a tenfold increase in capacity and average latencies as low as 15 microseconds for order round trips as well as for market data. Underpinning the Optiq platform is a persistence layer that the Euronext development organization built using Confluent Platform, which provides reliable, scalable streaming infrastructure that supports millisecond latencies with no messages lost. “When we started, Kafka was a new technology to us, and one that we had decided to use for a very critical application in our system,” says Philippe Planchon, Architect and Innovative Trading Solutions Director at Euronext. “With Confluent we felt supported in our decision and we knew we had the right level of expertise to get prepared and to help if we encountered any issues. That was a key element in our success.”
Develop a new trading platform for markets across multiple European countries that supports high-volume, high-speed trading and provides clients with access to real-time data.
Use Confluent Platform to implement a reliable, scalable persistence layer for market orders that supports millisecond latencies and billions of messages per day.
After evaluating several streaming platform alternatives, Euronext selected Confluent Platform with Apache Kafka® for the persistence layer of the Optiq multi-market trading platform.
The Euronext development team started with a proof-of-concept prototype: a Kafka-based matching engine for an order book with buyers and sellers. Benchmarks of this prototype, in which data was pushed to multiple consumers via Kafka, showed that it was capable of meeting the platform’s overall high-performance requirements.
“With the first benchmark we did with Kafka, we saw that the capacity to ingest messages up to a rate of one million per second was easily achieved,” says Pujalte. “Moreover, the integration of Kafka with our C++ world was straightforward to implement via the API and library. From the start of this large project, it was easy to see that Kafka was the right choice for us.”
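To make the proof of concept concrete, the matching engine at its core can be sketched as a toy price-time-priority order book. This is a minimal illustrative sketch only, not Euronext's implementation: all names (`Order`, `OrderBook`, `submit`) are hypothetical, and the Kafka persistence layer that the real prototype attached to the engine is omitted.

```python
# Toy price-time-priority matching engine: buys match the lowest-priced
# resting sell, sells match the highest-priced resting buy, ties broken by
# arrival order (FIFO). Illustrative only; names are hypothetical.
import heapq
import itertools
from dataclasses import dataclass, field

_seq = itertools.count()  # global arrival counter for FIFO tie-breaking


@dataclass(order=True)
class Order:
    sort_key: tuple = field(init=False)       # heap priority, set below
    side: str = field(compare=False)          # "buy" or "sell"
    price: float = field(compare=False)
    qty: int = field(compare=False)

    def __post_init__(self):
        # Buys: highest price first (negate); sells: lowest price first.
        key_price = -self.price if self.side == "buy" else self.price
        self.sort_key = (key_price, next(_seq))


class OrderBook:
    def __init__(self):
        self.buys: list[Order] = []   # max-price heap (via negated key)
        self.sells: list[Order] = []  # min-price heap

    def submit(self, order: Order) -> list[tuple[float, int]]:
        """Match against the opposite side; rest any remainder.
        Returns the fills as (price, quantity) pairs."""
        fills = []
        if order.side == "buy":
            book, opposite = self.buys, self.sells
        else:
            book, opposite = self.sells, self.buys
        while order.qty > 0 and opposite:
            best = opposite[0]
            crosses = (order.price >= best.price if order.side == "buy"
                       else order.price <= best.price)
            if not crosses:
                break
            traded = min(order.qty, best.qty)
            fills.append((best.price, traded))  # trade at resting price
            order.qty -= traded
            best.qty -= traded
            if best.qty == 0:
                heapq.heappop(opposite)
        if order.qty > 0:
            heapq.heappush(book, order)  # rest the unfilled remainder
        return fills


book = OrderBook()
book.submit(Order(side="sell", price=10.0, qty=5))
fills = book.submit(Order(side="buy", price=10.5, qty=3))
```

In the real prototype, each submitted order and resulting fill would additionally be published to a Kafka topic for persistence and fan-out to downstream consumers; here the engine runs purely in memory.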
We have been very satisfied with Confluent Platform as the backbone of our persistence engine. The platform has been super reliable. We have stringent requirements for real-time performance and reliability, and we have confirmed – from proof-of-concept to deployment of a cutting-edge production trading platform – that we made the right decision.
We chose event-driven architecture as the core of our platform