Read the Forrester Total Economic Impact™ study and find out how much you can save with Confluent Cloud.
You’ll walk away with an understanding of how to modernize your SIEM architecture for higher throughput, lower latency, and more cost efficiency. You’ll also be able to run the demo and explore a series of hands-on labs for yourself and dig into the technical details.
Confluent makes it easy to connect your real-time data.
Simplify data ingestion, integration, and processing on your preferred cloud with a complete platform.
In this IDC Tech Brief, we share our research on streaming data platforms, and the advantages they’re bringing for innovation, improved operational efficiency, ROI, and more.
Your data streaming platform needs to be truly elastic to match customer demand. It should scale up with your business’s peak traffic, and back down as demand shrinks.
This IDC Market Note discusses the main takeaways from the 2022 Kafka Summit in London, hosted by Confluent.
The modern world is defined by speed. Grocery delivery, rideshare apps, and payments for just about anything can happen instantly using a mobile device and its apps. Every action of every consumer creates data, and businesses must make sense of it quickly to take advantage in real time.
In this demo, we’ll show you how to modernize your database and move to the cloud by connecting multi-cloud and hybrid data to Google Cloud SQL in real time.
In this demo, we will show you how to connect on-premises and multi-cloud data to Azure Cosmos DB, process that data in a stream before it reaches Azure Cosmos DB, and connect your Azure Cosmos DB data to any application.
In this demo, we’ll walk through how you can start building a persistent pipeline for continuous migration from a legacy database to a modern, cloud database. You’ll see how to use Confluent and Amazon Aurora to create a bridge across your Amazon cloud and on-prem environments.
In this ebook, you’ll discover how event streaming supports a new vision of data.
The companies most successful in meeting the demanding expectations of today’s customers are running on top of a constant supply of real-time event streams and continuous real-time processing. If you aspire to join the ranks of those capitalizing on data in motion, this is the place to start.
93% of organizations face difficulties exploiting their data in real time. That is the finding of a study conducted by IDC in France among 200 private and public organizations, whose results are presented in this document.
The secret to modernizing monoliths and scaling microservices across your organization? An event-driven architecture.
How do you mobilize your data securely and cost effectively to power a global business in real time?
In this session, we'll explore how to build a serverless, event-driven architecture using AWS Lambda with Kafka. We'll discuss how event-based compute like Lambda can be used to decrease the complexity of running, scaling, and operating stream-based architectures when building new applications.
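As a rough illustration of the Lambda-with-Kafka pattern described above, the sketch below shows a minimal handler for a Kafka event source. The event shape (records grouped by topic-partition, with base64-encoded values) follows AWS's documented format for MSK and self-managed Kafka event sources; the payload fields are illustrative.

```python
import base64
import json

def handler(event, context):
    """Minimal AWS Lambda handler for a Kafka event source.

    Lambda delivers records batched by topic-partition, with each record's
    value base64-encoded, so values are decoded before processing.
    """
    processed = []
    for topic_partition, records in event.get("records", {}).items():
        for record in records:
            payload = json.loads(base64.b64decode(record["value"]))
            # Application logic would go here; we just collect the payloads.
            processed.append((record["topic"], record["offset"], payload))
    return {"batch_size": len(processed)}
```

Because Lambda handles scaling and polling, the application code reduces to the per-record logic, which is the complexity reduction the session discusses.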
Join Kai Waehner, Field CTO at Confluent, for an online talk in which he will explore the latest data in motion & Apache Kafka® use cases for the defence industry.
In this session, we will discuss how Apache Kafka has become the de facto standard for event-driven architecture, its community support, and the scale at which some customers are running it.
In Sencrop’s case, working with IoT at the “edge” means collecting and providing accurate data from the farm fields. Find out how AWS and Confluent Cloud are powering this real-time processing of data for anomaly detection and weather prediction.
This webinar explores use cases and architectures for Kafka in the cybersecurity space, also featuring a very prominent example of combining Confluent and Splunk with Intel’s Cyber Intelligence Platform (CIP).
We’ll discuss the challenges Storyblocks attempted to overcome with their monolithic apps and REST API architecture as the business grew rapidly, and the advantages they gained from using Confluent’s event-driven architecture to power their mission-critical microservices.
During his tenure at Walmart, Suman Pattnaik, Director of Engineering, has built two applications using Kafka that continue to play a critical role in customer satisfaction: real-time inventory and real-time replenishment systems.
Download this white paper to read how Confluent can power the infrastructure necessary to run Autonomous Networks.
This eBook will explain how you can modernize your data architecture with a real-time, global data plane that eliminates the need for point-to-point connections and makes your data architecture simpler, faster, more resilient, and more cost effective.
Join Joseph Morais, Staff Cloud Partner SA, and Braeden Quirante, Cloud Partner SA at Confluent as they discuss Apache Kafka and Confluent.
This demo webinar will provide you with everything you need to get started with the latest capabilities of our cloud-native data streaming platform, Confluent Cloud.
Forrester recently released a Total Economic Impact report that identified more than $2.5M in savings, a 257% ROI, and a payback period of under six months for organizations that used Confluent Cloud instead of open-source Apache Kafka.
This three-part online talk series introduces key concepts, use cases, and best practices for getting started with microservices.
Join us for this webinar to see how Confluent and Databricks enable companies to set data in motion across any system, at any scale, in near real-time.
We’ve got an exciting lineup of sessions designed to get you up to speed on all things Confluent Cloud! You’re sure to gain invaluable insights, no matter how many you’re able to join.
In an interactive panel discussion with data leaders from Allianz Technology, HDI, and Gothaer Versicherung, we will explore what the path toward a central nervous system for data can look like.
As the DoD presses forward with Joint All-Domain Command and Control (JADC2) programs and architectures, the Air Force is working to stand up technology centers that will allow not only for the sharing of data, but for the sharing of data in motion.
Listen to this on-demand online talk to hear how BT's digital strategy is turning it into an event-driven business.
We invite you to join Jesse Miller, our lead Product Manager for Health+, in an upcoming webinar to learn about how Health+ can optimize your deployment, give you the highest level of monitoring visibility, and provide intelligent alerts and accelerated support when you need it.
Migrating, innovating, or building in the cloud requires retailers to rethink their data infrastructure. Confluent and Azure enable companies to set data in motion across any system, at any scale, in near real-time.
This demo webinar will show you how Confluent is the world’s most trusted data streaming platform, with resilience, security, compliance, and privacy built-in by default.
Leverage Confluent Cloud and Google Cloud Platform products such as BigQuery to modernize your data in minutes, setting your data in motion.
Confluent Cloud alleviates the burden of managing Apache Kafka, Schema Registry, Connect, and ksqlDB so your teams can focus on modern app development and deliver immediate value from your real-time use cases.
Join us for a live hands-on lab and learn how you can set your data in motion with Confluent Cloud - a fully managed platform that delivers Apache Kafka and the surrounding toolchain you’ll need for your streaming use cases.
Download this Forrester study to understand the economic benefits of Confluent Cloud.
Introduction to serverless, how it works, and the benefits stateful serverless architectures provide when paired with data streaming technologies.
This webinar offers suggestions for best practices, the kinds of tools you’ll need, and how to get your organization started down a path toward a data mesh.
TDWI webinar "Data Warehouse Modernization for Continuous Intelligence: An Innovative Approach with Media-Industry Customers" with Confluent & Data Reply
The demand for fast results and decision-making has driven financial institutions to adopt real-time event streaming and processing to stay on the competitive edge.
Interested in bringing stream processing to your organization, but unclear on how to get started? Designed to help you go from idea to proof of concept, this online talk dives into a few of the most popular stream processing use cases and workloads to help get you up and running with ksqlDB.
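To ground the ksqlDB discussion, here is a hedged sketch of what getting started can look like: ksqlDB statements are plain SQL submitted to the server's REST API (the `/ksql` endpoint accepts a JSON body with a `ksql` field). The stream and table names (`pageviews`, `views_per_user`) are hypothetical examples, not from the talk.

```python
import json

# Hypothetical statements: declare a stream over an existing Kafka topic,
# then derive a continuously updated table from it.
statements = [
    "CREATE STREAM pageviews (user VARCHAR, url VARCHAR) "
    "WITH (KAFKA_TOPIC='pageviews', VALUE_FORMAT='JSON');",
    "CREATE TABLE views_per_user AS "
    "SELECT user, COUNT(*) AS views FROM pageviews GROUP BY user EMIT CHANGES;",
]

def ksql_request_body(statement: str) -> str:
    """Build the JSON body for a POST to ksqlDB's /ksql endpoint."""
    return json.dumps({"ksql": statement, "streamsProperties": {}})

body = ksql_request_body(statements[0])
```

In practice you would POST `body` to `http://<ksqldb-host>:8088/ksql`; the point of the sketch is that a streaming workload is expressed as a couple of SQL statements rather than custom consumer code.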
Retailers that have embraced the opportunities of the prolonged pandemic are emerging leaner and stronger than before. Hear Lawrence Stoker, Senior Solutions Engineer at Confluent, walk through the data in motion use cases that are re-inventing the retail business.
Hivecell and Confluent deliver on the promise of bringing a piece of Confluent Cloud right to your desk, delivering managed Kafka at the edge, at scale, for the first time.
In 2022, if you want to deliver high-value projects that drive competitive advantage or business differentiation quickly, your best people can’t be stuck in the day-to-day management of Kafka, and your budget is better spent on your core business. By now you know the answer is cloud.
Apache Kafka® was built with the vision to become the central nervous system that makes real-time data available to all the applications that need to use it, with numerous use cases such as stock trading, fraud detection, and real-time analytics.
Today’s data sources are fast-moving and dispersed, which can leave businesses and engineers struggling to deliver data and applications in real time. While this can be hard, we know it doesn’t have to be, because we’ve already made it easy.
Learn more about Confluent Platform 7.0 and how Cluster Linking enables you to leverage modern cloud-based platforms and build hybrid architectures with a secure, reliable, and cost-effective bridge.
To learn more about the E2E Encryption Accelerator and how it may be used to address your data protection requirements, download the Confluent E2E Encryption Accelerator white paper.
This webinar will address the problems with current approaches and show you how you can leverage Confluent’s platform for data in motion to make your data architecture fast, cost-effective, resilient, and secure.
Kafka is now a technology developers and architects are adopting with enthusiasm. And it’s often not just a good choice, but a technology enabling meaningful improvements in complex, evolvable systems that need to respond to the world in real time. But it’s still possible to get it wrong!
To learn more about how you can implement a real-time data platform that connects all parts of your global business, download this free Confluent hybrid and multicloud reference architecture.
Self-driving cars have long ceased to be mere fiction and are being actively tested in smart-city projects. In our live demo, we show how data is correlated in real time to enable context-sensitive decisions.
This webinar will provide you with everything you need to get started with all the latest capabilities available on our cloud-native data streaming platform, Confluent Cloud.
Listen back and view the presentations from the Data in Motion Tour 2021 - EMEA.
We will discuss Confluent’s applicability to SIEM and show an end-to-end demo of Confluent and Confluent Sigma, an open source project built by Confluent for processing streams of SIEM data, demonstrating how to bridge the gap between old-school SIEM solutions and a next-gen architecture.
Differentiating cloud-native, cloud, and cloud services, and lessons learned building a fully managed, elastic, cloud-native Apache Kafka service.
Today, with Confluent, enterprises can stream data across hybrid and multicloud environments to Amazon Redshift, powering real-time analysis while reducing total cost of ownership and time to value.
Kai Waehner, Field CTO at Confluent, will deliver his predictions on the hottest and most important data in motion use cases for 2022.
Confluent hosted a technical thought leadership session to discuss how leading organisations move to real-time architecture to support business growth and enhance customer experience.
The Digital Banking Suite is mimacom's new eBanking solution, which takes banks' digital touchpoints to a new level. It combines innovative features with proven banking standards. With "data in motion," Confluent is an essential component of this solution.
To help organisations understand how data in motion can transform business, watch ‘The Strategic Importance of Data in Motion’, hosted by Tech UK.
Kafka Streams, a scalable stream processing client library in Apache Kafka, defines the processing logic as read-process-write cycles in which all processing state updates and result outputs are captured as log appends.
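The read-process-write cycle described above can be illustrated with a toy sketch (this is not the Kafka Streams API, just a model of its design): each input record is read, the state update is captured as an append to a changelog, and the result is appended to an output log, which is what makes the processing replayable.

```python
# Toy model of a Kafka Streams read-process-write cycle: a running sum
# per key, where every state update and result output is a log append.
input_log = [("alice", 3), ("bob", 5), ("alice", 4)]  # (key, value) records

changelog = []   # state updates captured as log appends
output_log = []  # result outputs captured as log appends
state = {}       # materialized view of the changelog (running sums per key)

for key, value in input_log:                 # read
    state[key] = state.get(key, 0) + value   # process: update state...
    changelog.append((key, state[key]))      # ...and log the new state
    output_log.append((key, state[key]))     # write the result downstream

# Replaying the changelog from scratch rebuilds the state exactly,
# which is how a restarted processor recovers.
rebuilt = {}
for key, value in changelog:
    rebuilt[key] = value
assert rebuilt == state
```

The replay step at the end is the payoff of the log-append design: state is never stored in a form that can diverge from what was logged.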
Confluent offers enterprises a complete and secure deployment of Kafka, making it available wherever apps and data reside.
In this online talk, we will answer the question, 'How much can we do with Kafka in 30 minutes of coding?'
In this webinar, we'll introduce you to Confluent Platform 7.0, which offers Cluster Linking to enable you to leverage modern cloud-based platforms and build hybrid architectures with a secure, reliable, and cost-effective bridge between on-prem and cloud environments.
This comprehensive e-book provides a detailed introduction to Apache Kafka®, the distributed publish-subscribe queue for processing real-time data feeds.
Learn how Confluent Cluster Linking can seamlessly integrate and share data across these environments in real-time by leveraging your current Confluent/Apache Kafka deployments.
In this webinar, we’ll show you how to leverage Confluent Cloud and Google Cloud Platform products such as BigQuery to streamline your data in minutes, setting your data in motion.
Learn how ACERTUS leverages Confluent Cloud and ksqlDB for their streaming ETL, data pre-processing and transformations, data warehouse modernization, and their latest data mesh framework project.
Optimize your SIEM to Build Tomorrow’s Cyber Defense with Confluent
Learn how to break data silos and accelerate time to market for new applications by connecting valuable data from your existing systems on-prem to your AWS environment using Confluent.
Today, with Confluent, enterprises can stream data across hybrid and multicloud environments to Google Cloud’s BigQuery, powering real-time analysis while reducing total cost of ownership and time to value.
A Kafka-based event streaming architecture, designed as intelligent connective tissue, lets real-time data from diverse sources flow continuously through the enterprise and supports the data mesh paradigm.
In this webinar, see how Confluent’s data warehouse modernization solution leverages the Azure Synapse connector to help enterprises create a bridge across your Azure cloud and on-prem environments. We’ll explain how the solution works, and show you a demo!
The world is changing! Organisations are now more globally integrated than ever before and new problems need to be solved. As systems scale and migrate into the cloud, those seeking to infiltrate enterprise systems are presented with new and more frequent opportunities to succeed.
Watch this webinar to hear more about how Generali, Skechers and Conrad Electronics are using Qlik and Confluent to increase Kafka’s value.
The seven most common use cases that OEMs and suppliers in Germany and around the world are already implementing by accessing their data as real-time streams, unlocking legacy data, and integrating data silos. ...and if the data hasn't been deleted, it's streaming to this day!
Companies need access to "data in motion." Meeting the expectations of modern customers requires offering them a seamless real-time experience.
This webinar presents a solution using Confluent Cloud on Azure, Azure Cosmos DB and Azure Synapse Analytics which can be connected in a secure way within Azure VNET using Azure Private link configured on Kafka clusters.
From data collection at scale to data processing in the Cloud or at the Edge—IoT architectures and data can provide enormous advantages through useful business and operational insights.
This in-depth white paper presents the most common use cases for real-time data streaming with cloud-native Apache Kafka® on AWS infrastructure and uses a concrete example to show how a managed real-time architecture can be introduced in the enterprise.
If you too want to profit from data in motion, all you have to do is choose where to start. The e-book "Set Your Data in Motion with Confluent and Apache Kafka®" can help light the way.
Learn about three crucial business use cases for event streaming in retail: increasing revenue through real-time personalization, creating unified omnichannel customer experiences, and boosting operational agility with real-time inventory data.
Learn about three crucial use cases for event streaming in insurance: reducing operating costs through automated digital experiences, personalizing the customer experience, and mitigating risk through real-time fraud and security analytics.
This webinar presents the decision making framework we use to coach our customers toward the most impactful and lowest cost PoC built on Kafka. The framework considers business impact, technology learning, existing resources, technical backgrounds, and cost to ensure the greatest chance of success.
To learn more about event streaming and how it enables greater customer engagement, AI automation, real-time analytics, and much more, download this free e-book from Confluent.
Fully managed cloud services such as S3, DynamoDB, or Redshift are most likely already in use within your company. Now it's time to go fully managed for Kafka as well, with Confluent Cloud on AWS.
Your company needs access to data in motion. The companies most successful at meeting today's demanding customer expectations run on a constant supply of real-time event streams and continuous real-time processing.
Hear how Fortune 500 companies and leading technology providers are driving real-time innovation through the power of data in motion to deliver richer customer experiences and automate backend operations.
Learn the challenges of traditional messaging middleware: hindered innovation, low fault tolerance at scale, ephemeral persistence limiting data usage for analytics, and soaring technical debt and operational costs.
Explore new ways that your organization can thrive with a data-in-motion approach by downloading the new e-book, Harness Data in Motion Within a Hybrid and Multicloud Architecture.
In this eBook from Confluent and AWS, discover when and how to deploy Apache Kafka on your enterprise to harness your data, respond in real-time, and make faster, more informed decisions.
Confluent is pioneering a new category of data infrastructure focused on data in motion, designed to be the intelligent connective tissue enabling real-time data, from multiple sources, to constantly and securely stream across any organization.
The cloud, as we all know, offers the perfect solution to many challenges. Many organisations are already using fully-managed cloud services such as AWS S3, DynamoDB, or Redshift. This creates an opportunity to implement fully-managed Kafka with ease using Confluent Cloud on AWS.
Learn how teams around the world continue building innovative, mission-critical applications fueled by data in motion. This 4-part webinar series will provide you with bite-sized tutorials for how to get started with all the latest capabilities available on the platform.
In 10-15 minutes, our Kafka experts show you how quickly you can set your data in motion: from creating on-demand clusters to generating data via real-time stream processing to account management, we walk through the most important steps.
This webinar will cover how you can protect your Kafka use cases with enterprise-grade security, reduce your Kafka operational burden and instead focus on building real-time apps that drive your business forward, and pursue hybrid and multi-cloud architectures with a data platform.
In this short, 20-minute session you’ll gain everything you need to get started with development of your first app based upon event-driven microservices.
If you're interested in a fully managed cloud service that cuts costs even further and accelerates time to value, we recommend the following white paper: Using Confluent Cloud Cost-Effectively.
Discover how Homepoint uses Confluent and Azure to Speed up Loan Processes
Watch this session to learn how to streamline infrastructure, increase development velocity, unveil new use cases, and analyze data in real-time.
Establish event streaming as the central nervous system of your entire business, perhaps starting with a single use case and eventually architecting a system around event-driven microservices or delivering net-new capabilities like streaming ETL or a comprehensive customer 360.
Discover how to fuel Kafka-enabled analytics use cases—including real-time customer predictions, supply chain optimization, and operational reporting—with a real-time flow of data.
Confluent’s platform for data in motion unifies silos and sets data in motion across an organization. Learn how this empowers developers to build the kinds of real-time applications that make their organizations more competitive and more efficient.
In this free ebook, you'll discover three key use cases for event streaming at insurance companies.
In this free ebook, you'll discover three crucial uses of event streaming for retail.
Join Confluent and Imply at this joint webinar to explore use cases for how Apache Kafka® integrates with Imply to bring data in motion and real-time analytics to life.
To learn more about how event streaming can increase customer engagement, AI automation, real-time analytics, and more, download this free ebook from Confluent.
By shifting to a fully managed, cloud-native service for Kafka, you can unlock your teams to work on the projects that make the best use of your data in motion.
In a world where real-time analytics, cloud, event streaming, and Kafka are hot topics, how does "data in motion" come into play? What are the core ideas behind it, and why is it a big deal for companies going through digital transformation?
Confluent Platform completes Kafka with a set of enterprise-grade features and services. Confluent Platform can reduce your Kafka TCO by up to 40% and accelerate your time to value for new data in motion use cases by 6+ months. Learn how Confluent Platform drives these outcomes for our customers.
Apache Kafka is the foundation of modern data architectures, but the open-source technology alone doesn’t offer everything enterprises need. Confluent offers a complete and secure enterprise-grade distribution of Kafka and makes it available everywhere your apps and data reside.
The cloud, as we all know, offers the perfect solution to many challenges. Fully managed cloud services such as S3, DynamoDB, or Redshift are most likely already in use. Now it's time to go fully managed for Kafka as well, with Confluent Cloud on AWS.
In this webinar, we'll introduce you to Confluent Platform 6.2, which offers Health+, a new feature that includes intelligent alerting and cloud-based monitoring tools to reduce the risk of downtime, streamline troubleshooting, surface key metrics, and accelerate issue resolution.
Data lakes have long been used to store historical information from diverse systems in raw format so that, through various kinds of processing (SQL, ETLs, etc.), we can get the most value out of the data.
If you want your organization to capture the full value of event-driven architectures, it's not enough to integrate Apache Kafka® and wait for people to join the party.
In this session, we learn why Confluent goes beyond being simply "Apache Kafka on steroids" to being a full platform.
Unlocking data with stream processing is a necessity, and staying focused on your projects is essential. Come discover why Confluent Cloud is the only cloud-native Kafka solution that lets you use Kafka at the speed of your data.
Tired of wrestling with all these technical details to manage your company's event streams? Do you need a Kafka platform that is ready to use, scalable at will, and offers infinite retention?
This two-part series provides an overview of what Kafka is, what it's used for, and the core concepts that enable it to power a highly scalable, available, and resilient real-time event streaming platform.
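One of those core concepts is that records are assigned to partitions by hashing their key, so all records with the same key land on the same partition and keep their relative order. A minimal sketch of that idea (Kafka's default partitioner actually uses murmur2; md5 is used here only as a dependency-free stand-in):

```python
import hashlib

def partition_for(key: bytes, num_partitions: int) -> int:
    """Stable key-to-partition mapping: the same key always maps to the
    same partition, which is what preserves per-key ordering in Kafka.
    md5 stands in for Kafka's murmur2 hash in this sketch."""
    digest = int.from_bytes(hashlib.md5(key).digest()[:4], "big")
    return digest % num_partitions
```

Because the mapping is deterministic, scaling consumers is simple: each consumer owns some partitions and sees every record for the keys hashed to them, in order.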
This technical report answers these questions for Confluent Cloud, outlining our approach to TCO and ROI, sharing explicit customer examples, and offering lessons learned along the way.
By entrusting Kafka infrastructure and operations to a fully managed service such as Confluent Cloud, you free your best people to focus on more essential projects, and you save money.
This white paper answers these questions for Confluent Cloud by describing our approach to TCO and ROI, sharing explicit customer examples, and passing along lessons learned on the way.
This white paper answers these and other questions for Confluent Cloud by describing our approach to TCO and ROI, presenting concrete customer examples, and sharing insights from our own experience.
This paper presents Apache Kafka’s core design for stream processing, which relies on its persistent log architecture as the storage and inter-processor communication layers to achieve correctness guarantees.
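The persistent-log design the paper describes can be sketched in a few lines: an append-only sequence of records where producers append and every consumer reads from its own offset, so one log serves as both durable storage and the communication channel between processors. This is a toy model of the idea, not Kafka's implementation.

```python
class Log:
    """A minimal append-only log: records are only ever appended, and
    readers address history by offset, so the same log doubles as storage
    and as processor-to-processor communication."""

    def __init__(self):
        self._records = []

    def append(self, record) -> int:
        self._records.append(record)
        return len(self._records) - 1  # offset of the new record

    def read(self, offset: int):
        return self._records[offset:]  # every record at or after offset

log = Log()
log.append("a")
log.append("b")

# Two consumers at different offsets see the same immutable history,
# which is what makes replay (and correctness guarantees) possible.
assert log.read(0) == ["a", "b"]
assert log.read(1) == ["b"]
```

Because history is immutable and addressed by offset, a processor that crashes can resume from its last committed offset and recompute exactly what it missed.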
We live in a world of exponential data growth, and businesses are increasingly built around events - the real-time data in a company.
This white paper explores the potential benefits and relevance of deploying Confluent with the Istio service mesh.
If you'd like more information on how Confluent Cloud accelerates application development, unblocks your people, and frees up budget, download the free technical e-book Running Kafka in 2021: A Cloud-Native Service, or the technical briefing.
For more details on how Confluent Cloud accelerates application development, unblocks your staff, and frees up your budget, download the free technical eBook Running Kafka in 2021: A Cloud-Native Service.
To find out how Confluent Cloud accelerates application development, frees up your teams' time, and recovers budget, download the technical e-book Running Kafka in 2021: A Cloud-Native Service.
To learn more about how Confluent Cloud accelerates application development, relieves staff, and protects your budget, download the free tech e-book Running Kafka in 2021.
This ebook explains how insurers can use Google Cloud and Confluent event streaming to create a modern insurance experience while keeping costs and risk under control.
In this e-book, we explain how insurers can use Google Cloud with Confluent event streaming to deliver a modern insurance experience while retaining control over costs and risk.
During this session you’ll see a pipeline built with data extraction from MongoDB Atlas, real-time transformation with ksqlDB, and simple loading into Snowflake.
This ebook explains how retailers can use Google Cloud with Confluent's complete event streaming platform for Apache Kafka.
In this e-book, we explain how retailers can use Google Cloud with Confluent's comprehensive event streaming platform for Apache Kafka.
Leveraging Confluent’s fully managed, cloud-native service for Apache Kafka®, DriveCentric has been able to successfully transform and grow their business within a rapidly changing market.
In this e-book, we explain how financial institutions can use Google Cloud together with Confluent event streaming to build a digital foundation for transforming the customer experience, increasing efficiency, and accelerating growth.
By the end of this talk, you will be able to: • put words to the fundamental difficulties of our systems • determine the role of streaming in your architectures • present concrete streaming use cases
In this presentation, you will discover the pitfalls of an entirely state-based design, what alternatives are available to you, and how to implement them to meet the needs of modern systems.
The ASAPIO Connector for Confluent allows true application-based change data capture, along with full database access. This webinar will showcase an SAP- and Confluent-certified solution to enable real-time event streaming for on-prem SAP data.
In this webinar, discover how we address this need with our Topic as a Service platform at Adeo.
In this Online Talk we will discuss some of the key distinctions between Confluent and traditional message oriented middleware. We will go into detail about the architecture of Confluent and how it enables a new level of scalability and throughput.
Listen back and view the presentations from the Confluent Streaming Event Series in Europe 2020
Kai Waehner gives an overview of the trends in "data in motion" and the resulting use cases for companies in every industry. A webinar not to be missed!
The retail industry is in upheaval, because customer expectations now mean one thing: real time! Both brick-and-mortar retail and e-commerce rely on "data in motion" with Confluent and Apache Kafka.
No domain is as geared toward large data volumes as the Internet of Things, so the triumph of "data in motion" in this space is no surprise. Through several use cases, Marcus Urbatschek shows how scalable environments are created and flexible architectures implemented.
Even among mid-sized companies, data is increasingly processed in motion. In this talk, BAADER shows how this was implemented successfully.
Banken und Finanzdienstleister stehen vor Architekturherausforderungen, die über Jahrzehnte gewachsen sind. Mit “Data in Motion” können Silos aufgebrochen und innovative Services ermöglicht werden.
In this white paper, you’ll learn about five Kafka elements that deserve closer attention, either because they significantly improve upon the behavior of their predecessors, because they are easy to overlook or to make assumptions about, or simply because they are extremely useful.
With Confluent, you can start streaming data into MongoDB Atlas in just a few easy clicks. Learn how to bring real-time capabilities to your business and applications by setting data in motion.
In this Online Talk you will learn:
Real-time ETL with Apache Kafka® doesn’t have to be a challenge. Join this webinar to see how Confluent Cloud makes it simple, with out-of-the-box source and sink connectors and SQL-based stream processing, all fully managed on a complete platform for data in motion.
This webinar, in cooperation with HiveMQ and Computerwoche, shows, among other things, how companies connect factories, products, and services in real time, and why edge computing must be thought of holistically.
How now-marketing is changing the future of marketing and sales
In this webinar, Dan Rosanova, Group Product Manager at Confluent, will cover:
Learn about the benefits of leveraging a cloud-native service for Kafka, and how you can lower your total cost of ownership (TCO) by 60% with Confluent Cloud while streamlining your DevOps efforts. Priya Shivakumar, Head of Product, Confluent Cloud, will share two short demos.
Citizens expect responsive, personalized, and efficient service from government agencies, in real time. This two-pager explains what real-time citizen service can look like.
Microservices have become a dominant architectural paradigm for building systems in the enterprise, but they are not without their tradeoffs.
This three-part online talk series introduces key concepts, use cases, and best practices for getting started with microservices.
In this 30-minute session, top Kafka experts will show you everything you need to get started quickly with real-time data movement, from on-demand cluster creation and data generation through to real-time stream processing and account management.
View this webinar with Confluent and Microsoft experts to:
Stream processing is a data processing technology used to collect, store, and manage continuous streams of data as it’s produced or received. Also known as event streaming or complex event processing (CEP), stream processing has grown exponentially in recent years due to its powerful...
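The idea above can be made concrete with a minimal sketch in plain Python (no Kafka required): each event is handled the moment it arrives, and a running aggregate is updated incrementally rather than recomputed over stored data in batch. The event shape and field names here are purely illustrative.

```python
from collections import defaultdict

def process_stream(events):
    """Consume a (potentially unbounded) sequence of events one at a
    time, maintaining a running count per user -- the core idea of
    stream processing, as opposed to a batch job over data at rest."""
    counts = defaultdict(int)
    for event in events:  # in a real system this loop never ends
        counts[event["user"]] += 1
        yield event["user"], counts[event["user"]]

# Simulated click events arriving over time
clicks = [{"user": "alice"}, {"user": "bob"}, {"user": "alice"}]
results = list(process_stream(clicks))
print(results)  # [('alice', 1), ('bob', 1), ('alice', 2)]
```

A real stream processor (Kafka Streams, ksqlDB, Flink) adds the parts this sketch omits: durable state, time windows, and fault-tolerant scaling.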
Responsive, relevant, timely, insightful. Agencies are asking a lot of their data these days and treating it as a strategic asset. It’s a big job and a big change for agencies, which have been dealing with disconnected data silos, legacy applications and practices, and under-resourced data operations for decades. Making that shift from data as a passive to an active asset takes some work, but it pays off. In this report, you’ll learn how to use event streaming to process, store, analyze, and act on both historical and real-time data in one place. You'll also explore: • Data access and management challenges agencies are facing and how to address them • How the CDC tracked COVID test events to maximize value from COVID testing • Best practices on data analysis and productivity
The Fourth Industrial Revolution (also known as Industry 4.0) is the ongoing automation of traditional manufacturing and industrial practices, using modern smart technology. Event streaming with Apache Kafka plays a key role in processing massive volumes of data in real time in a reliable, scalable, and flexible way, integrating with various legacy and modern data sources and sinks.
In this talk, we are going to observe the natural journey companies undertake to become real-time, the possibilities it opens for them, and the challenges they will face.
Government agencies understand the need to augment traditional SIEM systems. And, with this knowledge comes the pressure to do so in a way that is better, faster, and cheaper than before.
This 60-minute online talk is packed with practical insights: you will learn how Kafka fits into a data ecosystem that spans a global enterprise and supports use cases for both data ingestion and integration.
If you work in a company that could benefit from automation, IoT, and real-time data (or already does), read on. Streaming data is the heart of Industry 4.0.
To succeed, insurance companies must unify data from all their channels that may be scattered across multiple legacy systems as well as new digital applications. Without the ability to access and combine all this data in real time, delivering a truly modern insurance experience while assessing fast-changing risks can be an uphill battle. Our eBook explains how event streaming, an emerging technology for analyzing event data in real time, can help insurers compete with their insuretech peers. You will learn how combining event streaming from Apache Kafka® and Confluent with Google Cloud can help you.
To succeed, retailers must unify data scattered across point-of-sale, e-commerce, ERP, and other systems. Without integrating all of this data in motion—and making it available to applications in real time—it’s almost impossible to deliver a fully connected omnichannel customer experience.
Banking customers today demand personalized service and expect real-time insight into their accounts from any device—and not just during “business hours.” Financial institutions trying to meet those expectations have intense competition from each other as well as fintech startups...
Most insurance companies today are somewhere along the spectrum of digital transformation, finding new ways to use data while staying within the confines of strict regulatory complexity and capital requirements. But only a few insurtech leaders and innovative startups have really tapped into real-time streaming data as the architecture behind these efforts. In this free ebook, learn about three pivotal insurance business uses for event streaming: reducing operating costs with automated digital experiences, personalizing the customer experience, and mitigating risks with real-time fraud and security analytics.
Every one of your customer touch points, from an actual purchase to a marketing engagement, creates data streams and opportunities to trigger automations in real time.
In this ebook, you’ll learn about the adoption curve of event streaming and how to gain momentum and effect change within your organization. Learn how to wield event streaming to convert your enterprise to a real-time digital business, responsive to customers and able to create business outcomes in ways never before possible.
In this ebook, we cover five of the more common use cases Confluent has supported, with real-world customer examples and insights into how your organization can make the leap. You’ll get insight into how event streaming can help with use cases such as customer 360° and website clickstream analysis, legacy IT modernization, a single view of the business, next-gen apps, and real-time analytics. These are just a handful of the ways we’ve witnessed forward-thinking companies integrate event streaming into the core of their business models.
In this ebook, you’ll learn about the profound strategic potential in an event streaming platform for enterprise businesses of many kinds. The types of business challenges event streaming is capable of addressing include driving better customer experience, reducing costs, mitigating risk, and providing a single source of truth across the business. It can be a game changer.
We used to talk about the world’s collective data in terms of terabytes. Now, according to IDC’s latest Global Datasphere, we talk in terms of zettabytes: 138 ZB of new data will be created in 2024—and 24% of it will be real-time data. How important is real-time streaming data to enterprise organizations? If they want to respond at the speed of business, it’s crucial. In this digital economy, having a competitive advantage requires using data to support quicker decision-making, streamlined operations, and optimized customer experiences. Those things all come from data.
Banks and financial institutions are looking toward a future in which most business is transacted digitally. They’re adding new, always-on digital services, using artificial intelligence (AI) to power a new class of real-time applications, and automating back-office processes.
The IDC Perspective on Confluent Platform 6.0 is here. In it, you can read IDC’s view on the importance of event streaming for enterprises today, along with key recommendations, actions, and highlights of Confluent Platform 6.0.
Learn how companies will leverage event streaming, Apache Kafka, and Confluent to meet the demands of a real-time market, rising regulations, customer expectations, and much more in 2021.
In this 30-minute session, hear from top Kafka experts who will show you how to easily create your own Kafka cluster and use out-of-the-box components like ksqlDB to rapidly develop event streaming applications.
Hands-on workshop: Using Kubernetes, Spring Boot, Kafka Streams, and Confluent Cloud to rate Christmas movies.
Read IDC’s take on Confluent Platform 6.0 here, including IDC’s perspective on the importance of event streaming for today’s enterprises.
This IDC Perspective looks at the highlights of Confluent Platform 6.0 and explains the importance of event streaming for the modern enterprise.
This ebook presents the five most common use cases for event streaming, including real-world customer examples and best practices for transformation within the enterprise.
This ebook presents the typical adoption curve of event streaming in the enterprise and shows examples of how change can be implemented in the organization step by step.
In this ebook, you’ll learn how much strategic potential an event streaming platform holds for companies of any size.
In this ebook, you’ll learn about the adoption curve of event streaming and the best ways to drive its adoption and implement this change within your organization.
In this ebook, we present five use cases commonly addressed with Confluent, with real-world customer examples and ideas on how your organization can achieve this transformation.
In this ebook, you’ll discover the immense strategic potential that event streaming offers businesses of all kinds.
Learn how Apache Kafka, Confluent, and event-driven microservices ensure real-time communication and event streaming for modernized deployment, testing, and continuous delivery.
In live demos, we explain the added value that audit logs deliver in Confluent Platform 6.0.
Building on the overview webinar, we take a detailed look at further new features of Confluent Platform 6.0.
If you’re a leader in a business that could or does benefit from automation, IoT, and real-time data, don’t miss this white paper. The lifeblood of Industry 4.0 is streaming data, which is where event streaming comes in: the real-time capture, processing, and management of all your data in order to drive transformative technology initiatives.
Building and scaling event-driven applications is a real challenge, as the sources of event data often span multiple data centers, clouds, microservices, and highly distributed environments.
In this two-hour spooktacular workshop with Bruce Springstreams, learn about event-driven microservices with Spring BOOOOt and Confluent Cloud.
Building on the overview webinar, we take a detailed look at three new features of Confluent Platform 6.0.
For financial services companies, digital technologies can solve business problems, drastically improve traditional processes, modernize middleware and front-end infrastructure, improve operational efficiency, and most importantly, better serve customers.
Hear from Intrado’s Thomas Squeo, CTO, and Confluent’s Chief Customer Officer, Roger Scott, to learn how Intrado future-proofed their architecture to support current and future real-time business initiatives.
These technologies open up a range of use cases for financial services organisations, many of which will be explored in this talk.
Confluent Cloud enabled the company to get started quickly, minimize operational overhead, and reduce engineering effort.
In this Online Talk Henrik Janzon, Solutions Engineer at Confluent, explains Apache Kafka’s internal design and architecture.
This online talk covers the key features of the latest release, Confluent Platform 6.0, including many components of Project Metamorphosis.
The IDC Perspective on Confluent Platform 6.0 is here, and in it, you can read IDC’s lens on the importance of event streaming to enterprise companies today.
In this talk, we are going to show some example use cases that Data Reply developed for some of its customers and how Real-Time Decision Engines had an impact on their businesses.
View the recordings and slides from Kafka Summit 2020, the premier event for those who want to learn about streaming data.
Confluent is happy to announce that we will be providing new early release chapters of Kafka: The Definitive Guide v2 every month until the completion of the new e-book in Summer 2021.
In this webinar, we take a hands-on approach to these questions and walk through setting up a simple application written in .NET to a Confluent Cloud based Kafka cluster. Along the way, we point out best practices for developing and deploying applications that scale easily.
Confluent implements layered security controls designed to protect and secure Confluent Cloud customer data, incorporating multiple logical and physical security controls that include access management, least privilege, strong authentication, logging and monitoring, vulnerability management, and bug bounty programs.
Replace the mainframe with new applications using modern and less costly technologies. Stand up to the dinosaur, but keep in mind that legacy migration is a journey. This session will guide you to the next step of your company’s evolution!
This ENTERPRISE MANAGEMENT ASSOCIATES® (EMA™) eBook will show how, with fully managed cloud-based event streaming, executives, managers, and individual contributors gain access to real-time intelligence and the enterprise will achieve unprecedented momentum and material gain.
How Michelin future-proofed its IT infrastructure for the years to come.
Databases represent some of the most successful software that has ever been written and their importance over the last fifty years is hard to overemphasize. Over this time, they have evolved to form a vast landscape of products that cater to different data types, volumes, velocities, and query characteristics. But the broad definition of what a database is has changed relatively little.
Learn how Accor revolutionized its system infrastructure around event streaming with Apache Kafka and Confluent.
Multiply the value of real-time data for your business with Google Cloud and Confluent.
Event streaming: from technology to a completely new business paradigm.
Learn how Nexthink reinvigorated employee engagement and experience with event streaming.
The next generation of banking system management through Apache Kafka and Confluent: discover the experience of BNP Paribas.
Event Streaming Paradigm: rethink data as not stored records or transient messages, but instead as a continually updating stream of events.
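As a rough illustration of that paradigm shift, the sketch below (hypothetical account events, plain Python) treats the immutable event log as the source of truth and derives the "stored records" view by folding over it:

```python
# Hypothetical account events; in Kafka these would live in a topic.
event_log = [
    {"account": "A", "type": "deposit",  "amount": 100},
    {"account": "A", "type": "withdraw", "amount": 30},
    {"account": "B", "type": "deposit",  "amount": 50},
]

def materialize(events):
    """Fold the immutable event stream into current state.
    The stored records become a derived view; the log is the truth."""
    balances = {}
    for e in events:
        delta = e["amount"] if e["type"] == "deposit" else -e["amount"]
        balances[e["account"]] = balances.get(e["account"], 0) + delta
    return balances

print(materialize(event_log))  # {'A': 70, 'B': 50}
```

Because state is derived, any consumer can replay the log to rebuild its own view, which is what makes the stream, rather than the snapshot, the primary representation of the data.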
Why the use of Apache Kafka and event streaming represents a revolution for businesses.
You know the fundamentals of Apache Kafka. You are a Spring Boot developer working with Apache Kafka. You have chosen Spring Kafka for the integration. You implemented your first producer, consumer, and maybe some Kafka Streams, and it's working... Hurray! You are ready to deploy to production. What can possibly go wrong?
Many companies collect and store their data in various data centers and use multiple business applications and services to access, analyze, and act on it. Managing this mountain of data from disparate sources is very difficult. All too often, the methods used are inefficient and produce poor results.
We face the challenge of making decisions using data distributed across disparate, heterogeneous environments, which makes taking the right actions complicated and complex. Extracting data from different environments (cloud, VPC, and on-premises) is hard to manage, and it is inefficient and ineffective if you want to produce reliable results that bring value to the organization.
Learn how NAV (Norwegian Work and Welfare Department) are using Apache Kafka to distribute and act upon events. NAV currently distributes more than one-third of the national budget to citizens in Norway or abroad. They are there to assist people through all phases of life within the domains of work, family, health, retirement, and social security. Events happening throughout a person’s life determines which services NAV provides to them, how they provide them, and when they offer them.
This four-part webinar series provides an overview of what Kafka is, how it is used, and the core concepts that enable it to power a highly scalable, available, and resilient real-time event streaming platform.
The fast-growing world of stream processing can be daunting, with new concepts to master such as different types of time semantics, aggregations, change logs, and frameworks. KSQL is an open source, Apache 2.0 licensed streaming SQL engine based on Apache Kafka that simplifies all of this and makes stream processing available to everyone, without writing any source code.
Another new security-related feature in Confluent Platform 5.4 is structured audit logs. Of course, everything in Kafka is a log, but Kafka does not log what Kafka does with Kafka, only what is written to its topics.
In the world of online streaming providers, real-time events are becoming the new standard, driving innovation and a new set of use cases to react to a quickly changing market. We explain how, starting from simple media player heartbeats, Data Reply fueled a diverse set of near-real-time use cases and services for its customer, from blocking concurrent media streams to recognizing ended sessions and trending content.
Large enterprises, government agencies, and many other organisations rely on mainframe computers to deliver the core systems managing some of their most valuable and sensitive data. However, the processes and cultures around a mainframe often prevent the adoption of the agile, born-on-the web practices that have become essential to developing cutting edge internal and customer-facing applications.
A company's journey to the cloud often starts with the discovery of a new use case or need for a new application. Deploying Confluent Cloud, a fully managed cloud-native streaming service based on Apache Kafka, enables organisations to revolutionise the way they build streaming applications and real-time data pipelines.
In our digital age of big data and IoT, with quintillions of bytes of data produced every day, it is critically important for companies to have the right data ready at the right time, in any application, whether in the cloud or on-premises.
Today, enterprise data very often resides in different applications, each with its own, overly heterogeneous logic for representing and managing that data. When this scattered data needs to be consumed for centralized use, extracting it from such disparate sources becomes hard to manage and extremely inefficient.
Do you manage a Kafka cluster? Have you mastered the basics but want to take your data streaming further? In this webinar for advanced Kafka users, we discuss the processing capabilities offered by Kafka Streams and ksqlDB through examples and use cases, as well as the best practices to apply when taking on a project or initiative with these technologies.
Harness the potential of your data by coupling advanced API management solutions with a Kafka event architecture. Opening access to the right data in real time, from anywhere and at the exact moment it is needed, has become a major challenge for CIOs in the digitization of interconnections.
Maximizing the potential of data by coupling API management solutions with a Kafka event architecture, and enabling access to the right data in real time, from anywhere and at the exact moment it is needed, have become a major challenge for CIOs within their...
Apache Kafka® is a streaming platform that unifies critical business events from every part of a company into a kind of central nervous system, capturing all relevant activity as streams of event data.
In the second part of the Deep Dive Sessions, we discuss building a logical cluster across multiple regions that covers all of your HA SLAs (Bronze, Silver, Gold). We also cover the great potential of onboarding all of your use cases onto a multi-tenant cluster for your organization.
Learn how Apache Kafka and Confluent help the gaming industry leverage real-time integration, event streaming, and data analytics for seamless gaming experiences at scale.
Apache Kafka is an open source event streaming platform. It is often used to complement or even replace existing middleware to integrate applications and build microservice architectures. Apache Kafka is already used in various projects in almost every bigger company today: understood, battle-tested, highly scalable, reliable, real-time.

Blockchain is a different story. This technology is in the news a lot, especially in connection with cryptocurrencies like Bitcoin. But what is the added value for software architectures? Is blockchain just hype that adds complexity? Or will it be used by everybody in the future, like a web browser or mobile app today? And how does it relate to an integration architecture and an event streaming platform?

This session explores use cases for blockchains and discusses different alternatives such as Hyperledger, Ethereum, and a Kafka-native tamper-proof blockchain implementation. Different architectures are discussed to understand when blockchain really adds value and how it can be combined with the Apache Kafka ecosystem to integrate blockchain with the rest of the enterprise architecture and build a highly scalable and reliable event streaming infrastructure.

Speakers: Kai Waehner, Technology Evangelist, Confluent; Stephen Reed, CTO, Co-Founder, AiB
This use case shows how easy it is to provision realistic, high-performance environments in Confluent Cloud in a short time and to start analyses immediately.
In this presentation, Lyndon Hedderly, Team Lead of Business Value Consulting at Confluent, will cover how Confluent works with customers to measure the business value of data streaming.
Developing a streaming solution against a self-managed Kafka cluster can be awkward and time-consuming, largely due to security requirements and configuration red tape. It is beneficial to use Confluent Cloud in the early stages to make quick progress. Creating a cluster in Confluent Cloud is super easy and lets you concentrate on defining your Connect sources and sinks and fleshing out the streaming topology on your laptop. It also shows the client how easy it is to swap out the self-managed Kafka cluster for Confluent Cloud.
Without any coding or scripting, end users leverage their existing spreadsheet skills to build customized streaming apps for analysis, dashboarding, condition monitoring, or any kind of real-time pre- and post-processing of Kafka or ksqlDB streams and tables.
Join Kai Waehner, Technology Evangelist at Confluent, for this session which explores various telecommunications use cases, including data integration, infrastructure monitoring, data distribution, data processing and business applications. Different architectures and components from the Kafka ecosystem are also discussed.
Adapting to the real-time demands of mission-critical applications is only possible with an architecture that scales elastically.
In this webinar we want to share our experience on how the Swiss Mobiliar, the biggest Swiss household insurance enterprise, introduced Kafka and led it to enterprise-wide adoption with the help of AGOORA.com.
Serving machine learning models for real-time prediction poses challenges in both data engineering and data science. How do you build a modern pipeline that makes continuous predictions? In a supervised setting, how do you combine tracing with performance tracking? How do you capture feedback to trigger reactive retraining? In this talk, we will build a concrete pipeline proposal together, one that accounts for the exploration and monitoring phases in a real-time context. The ingredients: an event log, a notebook platform, and a few other surprises straight from the cloud.
With Confluent Platform 5.5, we make it even easier for developers to get started with Kafka and begin building event streaming applications, regardless of their preferred programming language or the underlying data formats used in their applications.
Retailers can now anticipate customer purchase intent, instantly reorder inventory in response to a sale, and onboard new stores in a fraction of the time. And that is just the tip of the iceberg... In this talk, Carsten presents several ideas around the Apache Kafka streaming platform in retail and demonstrates some of them live.
Join this Online Talk, to understand how and why Apache Kafka has become the de-facto standard for reliable and scalable streaming infrastructures in the finance industry.
This document provides an overview of Confluent and Snowflake’s integration, a detailed tutorial for getting started with the integration, and unique considerations to keep in mind when working with these two technologies.
TCO, or total cost of ownership, is the purchase price of an asset plus its operating costs. A comprehensive TCO assessment should factor in time, manpower, and other costs across an entire organization over time.
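The definition reduces to simple arithmetic; the figures below are illustrative only, not actual pricing:

```python
def total_cost_of_ownership(purchase_price, annual_operating_cost, years):
    """TCO = up-front purchase price + operating costs over the period."""
    return purchase_price + annual_operating_cost * years

# Illustrative numbers: a $100k purchase with $40k/year to operate
tco = total_cost_of_ownership(purchase_price=100_000,
                              annual_operating_cost=40_000,
                              years=3)
print(tco)  # 220000
```

A fuller assessment would fold engineering time, downtime risk, and opportunity cost into the operating-cost term rather than the sticker price alone.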
This COMPUTERWOCHE white paper introduces streaming platforms, shows their business value, and presents a range of applications.
Adjusting to the real-time needs of your mission-critical apps is only possible with an architecture that scales elastically. Confluent re-engineered Apache Kafka into an elastically scalable, next-gen event streaming platform that processes real-time data wherever it lives - making it accessible for any budget or use case.
Join Unity, Confluent and GCP to learn how to reduce risk and increase business options with a hybrid cloud strategy.
Mainframe offloading with Apache Kafka and its ecosystem can be used to keep a more modern data store in real-time sync with the mainframe. At the same time, it is persisting the event data on the bus to enable microservices, and deliver the data to other systems such as data warehouses and search indexes.
This white paper reports the results of benchmarks we ran on a 2-CKU multi-zone dedicated cluster and shows the ability of a CKU to deliver the stated client bandwidth on AWS, GCP, and Azure clouds.
Explore the use cases and architecture for Apache Kafka®, and how it integrates with MongoDB to build sophisticated data-driven applications that exploit new sources of data.
Experts from Confluent and Attunity share how you can: realize the value of streaming data ingest with Apache Kafka®, turn databases into live feeds for streaming ingest and processing, accelerate data delivery to enable real-time analytics, and reduce skill and training requirements for data ingest.
Get answers to: How would you use Apache Kafka® in a microservice application? How do you build services over a distributed log and leverage the fault tolerance and scalability that come with it?
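One way to picture "building a service over a distributed log" is offset-based consumption: a consumer records how far it has read, so a restarted instance resumes exactly where the old one stopped. The toy in-memory log below stands in for a Kafka topic partition; all names are illustrative.

```python
# Toy in-memory "log" standing in for a Kafka topic partition.
log = ["order-1", "order-2", "order-3", "order-4"]

class Service:
    """Consumes the log from its last committed offset, so a restart
    resumes where it left off -- the fault-tolerance property a
    distributed log gives microservices."""
    def __init__(self):
        self.committed_offset = 0
        self.processed = []

    def poll(self, log, max_records=2):
        batch = log[self.committed_offset:self.committed_offset + max_records]
        for record in batch:
            self.processed.append(record)   # handle the record
        self.committed_offset += len(batch) # commit after processing

svc = Service()
svc.poll(log)                        # processes order-1, order-2
offset = svc.committed_offset

restarted = Service()                # simulate a crash and restart
restarted.committed_offset = offset  # offset recovered, not the data
restarted.poll(log)
print(restarted.processed)  # ['order-3', 'order-4']
```

Scalability comes from the same mechanism: partitions of the log can be assigned to multiple instances, each tracking its own offsets independently.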
Get an introduction to Apache Kafka® and how it serves as a foundation for streaming data pipelines and applications that consume/process real-time data streams. Part 1 in the Apache Kafka: Online Talk Series.