
Event Sourcing & How It Works

Event streams have revolutionized the way businesses handle data. To make efficient use of real-time data streams, companies are moving towards event-driven architectures that organize their applications as a set of loosely coupled services communicating through an event pipeline. Event sourcing is a way to create high-performance software that thrives in this new, dynamic, data-rich environment. Learn how it works, complete with examples and use cases.

Event Sourcing Explained

What is Event Sourcing?

Event sourcing is a design pattern that embraces the nature of real-time, dynamic data. In real life, data is constantly generated as continuous streams, evolving as events happen. Each change to the state of an application is associated with an event, which records the time and nature of a specific change. Event sourcing means capturing and storing these events, instead of working only with static snapshots of our data.

Event sourcing differs from the traditional CRUD model (create, read, update, and delete) of data handling. In the CRUD model, a process must ensure consistency by putting locks on shared data. As a result, the CRUD model performs poorly in data-rich applications with many concurrent users. Event sourcing was invented to solve this problem.

Events are historical facts. They never change. They can be stored in an immutable log using an append-only operation. Once created, events are never updated or locked by any process. With event sourcing, we can create scalable, high-performance systems composed of loosely coupled components that don't block each other. This is the key advantage of event sourcing.
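As a minimal sketch of this idea in Java (the Event record and EventLog class here are illustrative, not taken from any library), the only write operation is an append, and readers receive an unmodifiable copy:

    import java.time.Instant;
    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;

    // An event is an immutable fact: which entity changed, how, and when.
    record Event(String entityId, String type, String payload, Instant occurredAt) {}

    // An append-only log: the only write operation is append(); readers get an
    // unmodifiable copy, so stored events are never updated or locked in place.
    class EventLog {
        private final List<Event> events = new ArrayList<>();

        synchronized void append(Event event) {
            events.add(event);
        }

        synchronized List<Event> readAll() {
            return Collections.unmodifiableList(new ArrayList<>(events));
        }
    }

Because appends never rewrite existing entries, readers and writers never contend for locks on shared records.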

How It Works

Event sourcing works like a version control system inside a running application. Instead of storing snapshots of our data, we store incremental changes to our data in an immutable log.

Technically, event sourcing is a solution to the problem of keeping the domain model synchronized with the data model in a distributed architecture with many concurrent users. We model each change as an event object and store it in an immutable log, perhaps in the form of a Kafka topic. Every event is captured and stored as a permanent record of an incremental change.
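For example, using the Java producer API from Apache Kafka's kafka-clients library, each change can be published as a record on a topic (the account-events topic, the key, and the JSON payload are all illustrative):

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class EventProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("key.serializer",
                    "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer",
                    "org.apache.kafka.common.serialization.StringSerializer");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Keying by entity id keeps all events for one entity in order
                // within a single partition; the record itself is never updated.
                producer.send(new ProducerRecord<>(
                        "account-events",                            // illustrative topic
                        "account-42",                                // entity id as key
                        "{\"type\":\"Deposited\",\"amount\":100}")); // event payload
            }
        }
    }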

The current state of an object may vary from moment to moment, but events are immutable facts signifying that a particular change occurred at a particular time. By storing all events, we capture a data stream that contains the history of our system in rich detail. In this way, our event log becomes the single immutable source of truth about every object in the system.

To maintain the integrity of the event log, the code that derives state from it must be written in a purely functional style, which means programming without "side effects": the log itself is only ever appended to, never modified. This injunction against side effects fits in perfectly with the concept of microservices and with the functional programming styles that have become popular in big data projects for much the same reasons.

The key insight of event sourcing is that it is possible to compute the current state of every object in our application from the event log, but not vice versa. For any entity, we can compute its current state by folding (or replaying) all of its events, in order, on top of a given initial state.
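Here is a minimal sketch of such a fold in Java, using an illustrative BalanceChanged event and an account balance as the state:

    import java.util.List;

    public class Replay {
        record BalanceChanged(long delta) {}  // illustrative event: signed balance change

        // current state = fold of the transition function over the events
        static long replay(long initialBalance, List<BalanceChanged> events) {
            long state = initialBalance;
            for (BalanceChanged event : events) {
                state = state + event.delta();  // pure transition: (state, event) -> state
            }
            return state;
        }

        public static void main(String[] args) {
            List<BalanceChanged> events = List.of(
                    new BalanceChanged(100),   // deposit
                    new BalanceChanged(-30),   // withdrawal
                    new BalanceChanged(50));   // deposit
            System.out.println(replay(0, events));  // prints 120
        }
    }

The transition function is pure: given the same initial state and the same events, a replay always produces the same result.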

Examples and Real-Life Use Cases

There are many use cases for event sourcing. Here are three real-life examples.

    1. An online bank can use event sourcing to store deposits and debits as a sequence of events. The current balance can then be computed by folding the events into the starting balance, exactly as in the replay sketch above. This is perhaps the most familiar example of event sourcing.

    2. An inventory management system can store information about shipped and returned items as a sequence of events. The event log is never overwritten, so the system only needs to be able to read data from the log and append data to the end of the log. We can query the availability of an item by replaying the events that describe shipments and returns.

    3. A fraud detection system can remember all user actions as a sequence of events in an immutable log. This means that every change can be traced to a specific action by a specific user at a specific time. We can run different microservices on the event log and search for a variety of fraud patterns; a minimal consumer sketch follows this list. Complex Event Processing enables such patterns to be found across multiple event streams.
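As a sketch of the fraud case, a service can consume the log independently and scan for a simple pattern (the account-events topic and the Withdrawn event type are illustrative, carried over from the producer sketch above):

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class FraudScanner {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("group.id", "fraud-scanner");      // its own consumer group,
            props.put("auto.offset.reset", "earliest");  // replaying from the start
            props.put("key.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("account-events"));
                while (true) {
                    ConsumerRecords<String, String> records =
                            consumer.poll(Duration.ofSeconds(1));
                    for (ConsumerRecord<String, String> record : records) {
                        // Naive pattern: flag every withdrawal for review. A real
                        // system would parse the JSON and apply richer rules
                        // (thresholds, velocity checks, CEP across streams).
                        if (record.value().contains("\"type\":\"Withdrawn\"")) {
                            System.out.println("flagged: " + record.value());
                        }
                    }
                }
            }
        }
    }

Because the scanner reads the log in its own consumer group, it never interferes with the services that write events or with other readers.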

Benefits of Event Sourcing

Given the ever-growing abundance of real-time data, event sourcing has major benefits over traditional data storage.

Scalability and Performance: Event sourcing is a design pattern for creating applications that perform well and scale to massive amounts of data. One of the key advantages of event sourcing is that writes and reads can be decoupled, as sketched below. In general, event sourcing encourages loose coupling between components, enabling companies to migrate towards a microservices-based architecture.
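As a rough sketch of that decoupling in plain Java (the Event record is illustrative; in practice the write and read sides would be separate services sharing a log such as a Kafka topic), the write side only appends events while the read side folds them into its own query-optimized view:

    import java.util.List;
    import java.util.Map;
    import java.util.TreeMap;

    public class ReadModel {
        record Event(String entityId, long delta) {}  // illustrative event

        public static void main(String[] args) {
            // Write side: only ever appends events to the log.
            List<Event> log = List.of(
                    new Event("account-1", 100),
                    new Event("account-2", 55),
                    new Event("account-1", -30));

            // Read side: independently folds the log into its own
            // query-optimized view, without touching the write path.
            Map<String, Long> balances = new TreeMap<>();
            for (Event e : log) {
                balances.merge(e.entityId(), e.delta(), Long::sum);
            }
            System.out.println(balances);  // {account-1=70, account-2=55}
        }
    }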

Consistency: It is well known that object-oriented programming styles and relational databases do not fit together easily (this is sometimes called the "impedance mismatch" problem). By storing data in an event log, event sourcing creates a consistent data model that fits the domain model.

Resilience: As event sourcing provides a complete history of every change in an application, it can dramatically improve resilience and fault tolerance in data-rich applications. A failed application can simply rewind the event log and rerun as many times as necessary. The results of a rerun are appended to the immutable event log as a new sequence of events.

Audit Trail: Every user action can be stored permanently, so event sourcing is a natural way to create a complete audit trail for compliance and security. In particular, every change in the data can be identified and traced back to a specific event.

Better Software: Event sourcing encourages exploratory programming and improves testing, troubleshooting, and maintenance. Because the data is immutable, different teams can deploy microservices without requiring central coordination and control.

Future-Proofing: Businesses are always evolving and coming up with new ways to use data. It's impossible to predict what we may want to do with our data in five years' time. Event sourcing encourages a style of system architecture that enables organizations to keep their historical data for future applications.

Why Confluent?

Real-world businesses need real-time data.

Confluent is the only complete event streaming platform that encompasses event sourcing, complex event processing, and real-time event streaming at massive scale. Deploy on your own infrastructure, multi-cloud, or serverless in minutes.
