Building a Real-Time, Event-Driven Stock Platform at Euronext

Angela Burk

As the head of global customer marketing at Confluent, I tell people I have the best job. Because we provide a complete event streaming platform that is radically changing how companies handle data, I get to work with customers in almost every industry, partner closely with our sales teams, and learn from and be inspired by the event streaming community.

Collectively, we have a great opportunity to share the interesting stories that have emerged from this community and to celebrate its successes. Starting today, we'll use the blog to occasionally do just that.

Euronext, one of the largest stock exchanges in the world (spanning Belgium, France, Ireland, the Netherlands, Norway, Portugal, and the UK), has built a brand new market infrastructure and event-driven trading platform called Optiq with Confluent Platform at the core.

Euronext is a finalist for Customer Project of the Year in the 2019 Computing Technology Product Awards, and we couldn't be more proud.

For a company that can trace its roots back to 1602, reinventing the way it does business was a major undertaking. For mission-critical platforms that support the market capitalization of six countries, it's important to ensure that everyone has access to the same data at the same time—performance and reliability are non-negotiable (no pressure at all!).

That's why Euronext turned to Confluent. Using Confluent Platform, Euronext easily leveraged the power of Apache Kafka® to implement a reliable, scalable persistence layer for market orders that supports millisecond latencies. Euronext was able to replace its market data gateway with one that handles billions of messages per day, sending market data to vendors as well as to Euronext's trading members, which use the information in their trading strategies. Confluent Platform also enables Euronext to build applications that interface with clearinghouses, monitor market latency, perform replication for disaster recovery, and store records in a data warehouse in compliance with regulatory requirements.
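The post doesn't publish Optiq's actual configuration, but a Kafka producer tuned for this kind of use case—durable order persistence with low latency—might look something like the following sketch. All values here are illustrative assumptions about the trade-offs described above, not Euronext's settings:

```properties
# Durability: wait for acknowledgment from all in-sync replicas
# before considering an order persisted
acks=all

# Exactly-once-per-send semantics: avoid duplicate or reordered
# orders on retry
enable.idempotence=true

# Latency: dispatch each record immediately instead of waiting
# to fill a batch
linger.ms=0

# Skip compression to avoid per-batch CPU cost on the hot path
# (an illustrative throughput-vs-latency trade-off)
compression.type=none
```

On the broker side, a topic-level setting such as `min.insync.replicas=2` is commonly paired with `acks=all` so that a write only succeeds when it is replicated, which is how Kafka deployments typically reconcile the reliability and performance requirements the post describes.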


With Confluent Platform as the backbone of its persistence engine, Euronext is now running Optiq in production on all its cash markets. The results are impressive: Optiq provides a tenfold increase in capacity to ingest messages and an average performance latency as low as 15 microseconds.

According to Alain Courbebaisse, CIO at Euronext, “We have stringent requirements for real-time performance and reliability, and we have confirmed—from proof of concept to deployment of a cutting-edge production trading platform—that we made the right decision.”

Help support Euronext’s project of the year and vote! #TechProductAwards

To learn more about Euronext and its event streaming infrastructure, watch the video and read the full case study.

Angela Burk runs Confluent’s global customer marketing function. She joined Confluent from ServiceNow where she built and led a global customer advocacy function for seven years. She’s also led marketing and communications functions at NetApp, Jive, Interwoven (part of HP), Clarify (part of Amdocs), Octel Communications (acquired by Lucent), and NEC. Angela is a California native and graduate of San Jose State University.
