Application architecture is shifting from monolithic enterprise systems to flexible, event-driven approaches. Welcome to the microservices era.
How will you, and your organization, build modern applications on microservices that satisfy your need for scalability, without obscuring what a fundamentally different, observable data flow makes possible?
Not only is a new way of thinking needed, but also a new set of tools and infrastructure. Apache Kafka® is often chosen as the backbone for microservices architectures because it enables many of the qualities that microservices aim for, such as scalability, efficiency and speed.
This three-part online talk series introduces key concepts, use cases and best practices for getting started with microservices. It’s the definitive guide for all things microservices, providing a thorough grounding in the design principles behind microservices, the problems that arise as you grow and how you can leverage an event streaming platform as the foundation for building your modern application architectures.
Microservices, it turns out, need a backbone. Kafka can be just that.
Services come with a problem: they’re not well suited to sharing data. This talk will examine the underlying dichotomy we all face as we piece such systems together, one that today’s tools serve poorly. The solution lies in blending the old with the new, and Apache Kafka plays a central role.
Should you use REST to stitch services together? Is it better to use a richer, brokered protocol? This practical talk will dig into how we piece services together in event-driven systems, how we use a distributed log to create a central, persistent narrative and what benefits we reap from doing so.
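The core idea of the distributed log can be captured in a few lines. The sketch below is a toy in-memory model, not Kafka’s actual API: services append events to a shared, ordered log, and each consumer reads from its own offset, so every service sees the same persistent narrative independently. All names here (`Log`, `Consumer`, the event shapes) are illustrative.

```python
from dataclasses import dataclass, field

@dataclass
class Log:
    """A toy append-only log: the in-memory essence of a Kafka topic."""
    records: list = field(default_factory=list)

    def append(self, event):
        self.records.append(event)
        return len(self.records) - 1  # offset of the new record

@dataclass
class Consumer:
    """Each service tracks its own offset, so it reads the shared
    narrative at its own pace and can replay it from the start."""
    log: Log
    offset: int = 0

    def poll(self):
        events = self.log.records[self.offset:]
        self.offset = len(self.log.records)
        return events

log = Log()
log.append({"type": "OrderCreated", "id": 1})
log.append({"type": "OrderPaid", "id": 1})

billing = Consumer(log)   # a new consumer starts from offset 0
shipping = Consumer(log)  # and is unaffected by other readers

print([e["type"] for e in billing.poll()])  # ['OrderCreated', 'OrderPaid']
log.append({"type": "OrderShipped", "id": 1})
print([e["type"] for e in billing.poll()])  # ['OrderShipped']
```

Unlike a REST call, the event is not addressed to anyone in particular: any number of services can subscribe after the fact and still recover the full history.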
How small can a microservice be? This talk will look at how Stateful Stream Processing is used to build truly autonomous, often minuscule services. With the distributed guarantees of exactly-once processing, event-driven services supported by Apache Kafka become reliable, fast and nimble, blurring the line between business system and big data pipeline.
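To make the stateful idea concrete, here is a minimal hedged sketch in plain Python, not Kafka Streams itself: a tiny service that folds an event stream into per-account balances, its state derived entirely from the stream. In a real Kafka Streams application this state would live in a local store backed by a changelog topic, and exactly-once semantics would make the read-process-write cycle atomic. The event shapes and names are illustrative.

```python
from collections import defaultdict

def process(events):
    """Fold a stream of account events into per-account balances.
    The state is a pure function of the event log, so it can be
    rebuilt at any time by replaying the stream from the start."""
    balances = defaultdict(int)  # local state, keyed by account
    for event in events:
        if event["type"] == "deposit":
            balances[event["account"]] += event["amount"]
        elif event["type"] == "withdrawal":
            balances[event["account"]] -= event["amount"]
    return dict(balances)

stream = [
    {"type": "deposit", "account": "a", "amount": 100},
    {"type": "withdrawal", "account": "a", "amount": 30},
    {"type": "deposit", "account": "b", "amount": 50},
]
print(process(stream))  # {'a': 70, 'b': 50}
```

Because the service carries no state of its own beyond what the stream implies, it can be killed, moved or scaled out freely, which is what lets such services stay so small.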