
Event Streaming

Event stream processing (ESP) is a technology that can process a continuous flow of data as soon as an event or change of state happens. By processing single points of data rather than an entire batch, event streaming platforms provide an architecture that enables software to understand, react to, and operate as events occur.

Learn how event stream processing works, its major benefits, and how to get started building event-driven architectures in the free stream processing guide.

Whether in e-commerce, finance, travel, or gaming, every business is inundated with event streams on a day-to-day basis. With customers increasingly expecting responsive interactions and experiences, companies are discovering the importance of event streaming, which allows real-time data to be processed, stored, and acted upon as events occur. Learn how event streaming is revolutionizing the way businesses run with an overview of how event streams work, their benefits, and common use cases.

What is Event Streaming?

Closely related to streaming data, event sourcing, and complex event processing (CEP), event streaming is the continuous flow of data generated with each event, or change of state.

By using event stream processing technologies like Apache Kafka, these events (e.g., a credit card swipe, a server outage, or a social media update) can be processed, stored, analyzed, and acted upon as they're generated in real time.

How Event Streaming Works

Data processing is not new. In previous years, legacy infrastructure was more structured: data came from only a handful of sources, so the entire system could be architected to specify and unify the data and data structures.

Modern data comes in the form of events.

Pretty much every program uses, and responds to events of some kind: the mouse moving, input becoming available, web forms being submitted, bits of JSON being posted to your endpoint, the sensor on the pear tree detecting that a partridge has landed on it. Kafka encourages you to see the world as sequences of events, which it models as key-value pairs.

Applications that analyze and process data streams need to process one data packet at a time, in sequential order. Each data packet includes a source and a timestamp so that applications can work with the stream.

Applications working with data streams will always require two main functions: storage and processing. Storage must be able to record large streams of data in a way that is sequential and consistent. Processing must be able to interact with storage, consume, analyze and run computation on the data.
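Those two functions can be sketched in plain Python. This is a toy illustration of the storage-plus-processing pattern, not how Kafka is actually implemented: an in-memory append-only log stands in for storage, and a sequential consumer with an offset stands in for processing. All names and field choices here are illustrative.

```python
import time
from dataclasses import dataclass
from typing import Any, Callable

@dataclass(frozen=True)
class Event:
    """An event modeled as a key-value pair, stamped with its source and time."""
    key: str
    value: Any
    source: str
    timestamp: float

class EventLog:
    """Storage: events are appended sequentially, in arrival order, never rewritten."""
    def __init__(self):
        self._log = []

    def append(self, key, value, source):
        self._log.append(Event(key, value, source, time.time()))

    def consume(self, handler: Callable[[Event], None], offset: int = 0) -> int:
        """Processing: read events one at a time, in order, starting at an offset."""
        for event in self._log[offset:]:
            handler(event)
        return len(self._log)  # the next offset to resume from

# Usage: record two events, then process them sequentially.
log = EventLog()
log.append("card-1234", {"amount": 42.50}, source="pos-terminal-7")
log.append("card-1234", {"amount": 9.99}, source="web-checkout")

seen = []
next_offset = log.consume(lambda e: seen.append((e.key, e.value["amount"])))
print(seen)          # [('card-1234', 42.5), ('card-1234', 9.99)]
print(next_offset)   # 2
```

Keeping the consumer's position as a simple offset into the log is the same basic idea Kafka uses to let many independent consumers read the same stream at their own pace.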

This also brings up additional challenges and considerations when working with data streams. Many platforms and tools are now available to help companies build streaming data applications.

Batch Processing vs Event Streaming

Legacy batch data processing methods required data to be collected in batch form before it could be processed, stored, or analyzed. Streaming data, by contrast, flows in continuously, so it can be processed in real time without waiting for a batch to accumulate.

Today, data arrives naturally as never-ending streams of events. It comes in every volume and format, from many locations: cloud, on-premises, and hybrid environments.

Given the complexity of today's requirements, legacy data processing methods have become obsolete for most use cases, as they can only process data as groups of transactions collected over time. Modern organizations act on real-time data streams that are accurate to the millisecond. This continuous data offers numerous advantages that are transforming the way businesses run.
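The contrast can be shown with a toy Python example using made-up payment amounts: the batch version yields one answer only after all data has been collected, while the streaming version yields an up-to-date answer after every event.

```python
# Batch: collect everything first, then process the whole group at once.
def batch_total(events):
    collected = list(events)          # wait for the full batch to arrive
    return sum(collected)

# Streaming: act on each event the moment it arrives.
def stream_totals(events):
    running = 0
    for amount in events:             # one event at a time, in order
        running += amount
        yield running                 # an up-to-date answer after every event

payments = [10, 25, 5]                # hypothetical payment amounts
print(batch_total(payments))          # 40 -- one answer, only after all data
print(list(stream_totals(payments)))  # [10, 35, 40] -- an answer per event
```

Both compute the same final total; the difference is when the answer is available, which is exactly what "up-to-the-millisecond" processing buys you.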

Event Streaming Examples and Use Cases

Some common examples of streaming data are real-time stock trades, retail inventory management, and ride-sharing apps.

For example, when a passenger requests a Lyft ride, the application not only knows which driver to match them with, it also knows how long the trip will take based on real-time location data and historical traffic data, and how much it should cost based on both real-time and past data.
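A much-simplified sketch of that kind of matching, with entirely made-up coordinates, speeds, and traffic factors, shows how a live position stream can be blended with historical data:

```python
# Hypothetical live driver positions (km offsets from the rider) and a
# historical traffic factor per hour of day (1.0 = free-flowing, higher = slower).
live_drivers = {"driver-a": (2.0, 1.0), "driver-b": (0.5, 0.5)}
historical_traffic = {8: 1.8, 12: 1.2, 23: 1.0}

def eta_minutes(rider_pos, driver_pos, hour, speed_kmh=30.0):
    """Blend a real-time distance with a historical per-hour traffic factor."""
    dx = rider_pos[0] - driver_pos[0]
    dy = rider_pos[1] - driver_pos[1]
    distance_km = (dx * dx + dy * dy) ** 0.5
    factor = historical_traffic.get(hour, 1.0)
    return distance_km / speed_kmh * 60 * factor

def best_match(rider_pos, hour):
    """Pick the driver with the lowest blended ETA."""
    return min(live_drivers,
               key=lambda d: eta_minutes(rider_pos, live_drivers[d], hour))

print(best_match((0.0, 0.0), hour=8))  # 'driver-b'
```

In a real system the positions would arrive as a continuous event stream and the historical factors would come from accumulated trip data; the blending step, though, looks much like this.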

Event streams play a key part in the world of big data, providing real-time analyses, data integration, and data ingestion.

Event Streaming Use Cases

  • Data pipelines (ETL, integration)
  • Real-time monitoring, metrics, and analytics
  • Event-driven microservices
  • Enterprise-wide event-driven microservices
  • IoT
  • Customer insights and analytics
  • Digital transformation
  • Legacy IT modernization
  • Central nervous system/digital nervous system
How Data Streaming Platforms Can Help

Used by 80% of the Fortune 100, Confluent's data streaming platform helps you set your data in motion, no matter where your data resides.

From real-time fraud detection and financial services to multiplayer games and online social networking, Confluent lets you focus on deriving business value from your data rather than worrying about the underlying mechanics of how data is streamed, integrated, stored, and connected at scale.

์ŠคํŠธ๋ฆฌ๋ฐ ์žฅ์  ๋ฐ ์‚ฌ์šฉ ์‚ฌ๋ก€

๋ฐ์ดํ„ฐ ์ŠคํŠธ๋ฆฌ๋ฐ์˜ ์ด์ 

๋ฐ์ดํ„ฐ ์ˆ˜์ง‘์€ ํผ์ฆ์˜ ํ•œ ์กฐ๊ฐ์— ๋ถˆ๊ณผํ•ฉ๋‹ˆ๋‹ค. ์˜ค๋Š˜๋‚ ์˜ ์—”ํ„ฐํ”„๋ผ์ด์ฆˆ ๋น„์ฆˆ๋‹ˆ์Šค๋Š” ์ผ๊ด„์ ์œผ๋กœ ์ฒ˜๋ฆฌ๋˜๋Š” ๋„๊ตฌ์—๋งŒ ์˜์กดํ•  ์ˆ˜ ์—†์Šต๋‹ˆ๋‹ค. ์ฃผ์‹ ๊ฑฐ๋ž˜ ํ”Œ๋žซํผ, Netflix, ์†Œ์…œ ๋ฏธ๋””์–ด ํ”ผ๋“œ ๋“ฑ ๋ชจ๋“  ๊ฒƒ์ด ์‹ค์‹œ๊ฐ„ ๋ฐ์ดํ„ฐ ์ŠคํŠธ๋ฆผ์— ์˜์กดํ•˜๊ณ  ์žˆ์Šต๋‹ˆ๋‹ค.

์• ํ”Œ๋ฆฌ์ผ€์ด์…˜์€ ์ŠคํŠธ๋ฆฌ๋ฐ ๋ฐ์ดํ„ฐ๋ฅผ ํ™œ์šฉํ•˜๋ฉด์„œ ๋ฐ์ดํ„ฐ๋ฅผ ํ†ตํ•ฉํ•  ๋ฟ๋งŒ ์•„๋‹ˆ๋ผ ์ด๋ฒคํŠธ ๋ฐœ์ƒ ์‹œ ์ฒ˜๋ฆฌ, ํ•„ํ„ฐ๋ง, ๋ถ„์„ํ•˜๊ณ  ์ด์— ๋Œ€์‘ํ•˜๋Š” ์ˆ˜์ค€์œผ๋กœ ๋ฐœ์ „ํ–ˆ์Šต๋‹ˆ๋‹ค. ์ด๋ฅผ ํ†ตํ•ด ์ „ ์„ธ๊ณ„ ์ˆ˜๋ฐฑ๋งŒ ๋ช…์˜ ์‚ฌ์šฉ์ž, ์•ฑ, ์‹œ์Šคํ…œ์œผ๋กœ ํ™•์žฅํ•  ์ˆ˜ ์žˆ๋Š” ์˜ˆ์ธก ๋ถ„์„, ์ƒ์„ฑํ˜• AI, ์›ํ™œํ•œ ์˜ด๋‹ˆ์ฑ„๋„ ์‡ผํ•‘ ๊ฒฝํ—˜๊ณผ ๊ฐ™์€ ์ƒˆ๋กœ์šด ์‚ฌ์šฉ ์‚ฌ๋ก€๊ฐ€ ๋“ฑ์žฅํ•˜๊ฒŒ ๋˜์—ˆ์Šต๋‹ˆ๋‹ค.

In short, any industry that can benefit from continuous, real-time data streams can benefit from a data streaming platform.


์ด๋ฒคํŠธ ์ŠคํŠธ๋ฆฌ๋ฐ๊ณผ Confluent์˜ ์ŠคํŠธ๋ฆผ ๋ฐ์ดํ„ฐ ํ”Œ๋žซํผ์ด ํ•จ๊ป˜ ์‹ค์‹œ๊ฐ„ ๋ฐ์ดํ„ฐ์— ์ƒ๋ช…์„ ๋ถˆ์–ด๋„ฃ์Šต๋‹ˆ๋‹ค. ๋ชจ๋“  ์—…๊ณ„์— ์ด๋ฒคํŠธ ์ŠคํŠธ๋ฆฌ๋ฐ์— ๋Œ€ํ•œ ์‚ฌ์šฉ ์‚ฌ๋ก€๊ฐ€ ์กด์žฌํ•˜์ง€๋งŒ ์‹ค์‹œ๊ฐ„, ๋Œ€๊ทœ๋ชจ๋กœ ๋ฐ์ดํ„ฐ๋ฅผ ํ†ตํ•ฉ, ๋ถ„์„, ๋ฌธ์ œ ํ•ด๊ฒฐ ๋ฐ/๋˜๋Š” ์˜ˆ์ธกํ•  ์ˆ˜ ์žˆ๋Š” ์ด ๊ธฐ๋Šฅ์€ ์ƒˆ๋กœ์šด ์‚ฌ์šฉ ์‚ฌ๋ก€๋ฅผ ์ œ๊ณตํ•ฉ๋‹ˆ๋‹ค. ์กฐ์ง์€ ์Šคํ† ๋ฆฌ์ง€์˜ ๊ณผ๊ฑฐ ๋ฐ์ดํ„ฐ ๋˜๋Š” ๋ฐฐ์น˜ ๋ฐ์ดํ„ฐ๋ฅผ ์ด๋ฒคํŠธ ์ŠคํŠธ๋ฆผ๊ณผ ๊ฒฐํ•ฉํ•˜์—ฌ ๋ฐœ์ƒํ•˜๋Š” ๋ชจ๋“  ์ด๋ฒคํŠธ์— ๋Œ€ํ•ด ์‹คํ–‰ ๊ฐ€๋Šฅํ•œ ์ธ์‚ฌ์ดํŠธ๋ฅผ ์–ป์„ ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.

์ผ๋ฐ˜์ ์ธ ์‚ฌ์šฉ ์‚ฌ๋ก€๋Š” ๋‹ค์Œ๊ณผ ๊ฐ™์Šต๋‹ˆ๋‹ค.

  • ์œ„์น˜ ๋ฐ์ดํ„ฐ
  • ์‚ฌ๊ธฐ ํƒ์ง€
  • ์‹ค์‹œ๊ฐ„ ์ฃผ์‹ ๊ฑฐ๋ž˜
  • ๋งˆ์ผ€ํŒ…, ์˜์—… ๋ฐ ๋น„์ฆˆ๋‹ˆ์Šค ๋ถ„์„
  • ๊ณ ๊ฐ/์‚ฌ์šฉ์ž ํ™œ๋™
  • ๋‚ด๋ถ€ IT ์‹œ์Šคํ…œ ๋ชจ๋‹ˆํ„ฐ๋ง ๋ฐ ๋ณด๊ณ 
  • ๋กœ๊ทธ ๋ชจ๋‹ˆํ„ฐ๋ง: ์‹œ์Šคํ…œ, ์„œ๋ฒ„, ์žฅ์น˜ ๋“ฑ์˜ ๋ฌธ์ œ ํ•ด๊ฒฐ
  • SIEM(Security Information and Event Management): ๋ชจ๋‹ˆํ„ฐ๋ง, ๋ฉ”ํŠธ๋ฆญ ๋ฐ ์œ„ํ˜‘ ํƒ์ง€๋ฅผ ์œ„ํ•œ ๋กœ๊ทธ ๋ฐ ์‹ค์‹œ๊ฐ„ ์ด๋ฒคํŠธ ๋ฐ์ดํ„ฐ ๋ถ„์„
  • ์†Œ๋งค/๋ฌผ๋ฅ˜์ฐฝ๊ณ  ์žฌ๊ณ  ๊ด€๋ฆฌ: ๋ชจ๋“  ์ฑ„๋„ ๋ฐ ์œ„์น˜ ์ „๋ฐ˜์˜ ์žฌ๊ณ  ๊ด€๋ฆฌ, ๋ชจ๋“  ์žฅ์น˜ ์ „๋ฐ˜์—์„œ ์›ํ™œํ•œ ์‚ฌ์šฉ์ž ๊ฒฝํ—˜ ์ œ๊ณต
  • ์ฐจ๋Ÿ‰ ๊ณต์œ  ๋งค์นญ: ์˜ˆ์ธก ๋ถ„์„์„ ์œ„ํ•œ ์œ„์น˜, ์‚ฌ์šฉ์ž ๋ฐ ์š”๊ธˆ ๋ฐ์ดํ„ฐ ๊ฒฐํ•ฉ - ๊ทผ์ ‘์„ฑ, ๋ชฉ์ ์ง€, ์š”๊ธˆ, ๋Œ€๊ธฐ ์‹œ๊ฐ„์„ ๊ธฐ์ค€์œผ๋กœ ํƒ‘์Šน์ž์—๊ฒŒ ๊ฐ€์žฅ ์ ํ•ฉํ•œ ์šด์ „์ž ๋งค์นญ
  • ๋จธ์‹  ๋Ÿฌ๋‹ ๋ฐ A.I: ํ•˜๋‚˜์˜ ์ค‘์•™ ์‹ ๊ฒฝ๊ณ„์— ๋Œ€ํ•œ ๊ณผ๊ฑฐ ๋ฐ ํ˜„์žฌ ๋ฐ์ดํ„ฐ๋ฅผ ๊ฒฐํ•ฉํ•˜์—ฌ ์˜ˆ์ธก ๋ถ„์„์— ๋Œ€ํ•œ ์ƒˆ๋กœ์šด ๊ฐ€๋Šฅ์„ฑ ์ œ์‹œ

์œ ํ˜•์— ์ƒ๊ด€์—†์ด ์ฒ˜๋ฆฌ, ์ €์žฅ ๋˜๋Š” ๋ถ„์„์ด ํ•„์š”ํ•œ ๋ฐ์ดํ„ฐ๊ฐ€ ์กด์žฌํ•˜๋Š” ํ•œ, Confluent๋Š” ๋ชจ๋“  ๊ทœ๋ชจ์˜ ๋ชจ๋“  ์‚ฌ์šฉ ์‚ฌ๋ก€์— ๋Œ€ํ•ด ๋ฐ์ดํ„ฐ๋ฅผ ์ตœ๋Œ€ํ•œ ํ™œ์šฉํ•  ์ˆ˜ ์žˆ๋„๋ก ๋„์›€์„ ๋“œ๋ฆด ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.

Real-world businesses need real-time data.

Why Confluent

Built by the original creators of Apache Kafka, Confluent takes Kafka's stream processing technology to a fully managed, multi-cloud data streaming platform. Easily connect 120+ data sources with enterprise-grade security, performance, and scalability. Stream data across any infrastructure in minutes.