In about 13 minutes, this demo will showcase how to use Confluent as a streaming data pipeline between operational databases. We’ll walk through an example of capturing change data in real time from a legacy database such as Oracle and streaming it to a modern cloud-native database like MongoDB using Confluent.
Along the way, we’ll cover:
- Streaming and merging customer data from an Oracle database with credit card transaction data from RabbitMQ.
- Performing stream processing with ksqlDB aggregates and windowing to build a list of customers with potentially stolen credit cards.
- Loading the results into MongoDB Atlas with the fully managed MongoDB Atlas Sink Connector for further analysis.
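To give a feel for the stream-processing step, here is a minimal ksqlDB sketch of a windowed aggregation that flags suspicious cards. The stream, table, and column names (`transactions`, `customers`, `txn_id`, and the 2-minute window and threshold of 3) are illustrative assumptions, not the demo’s actual schema:

```sql
-- Hypothetical schema: 'transactions' is a stream fed from RabbitMQ,
-- 'customers' is a table materialized from the Oracle CDC feed.
-- Count each customer's transactions in a short tumbling window;
-- an unusually high count suggests a potentially stolen card.
CREATE TABLE possible_stolen_cards AS
  SELECT c.customer_id,
         COUNT(t.txn_id) AS txn_count
  FROM transactions t
  INNER JOIN customers c ON t.customer_id = c.customer_id
  WINDOW TUMBLING (SIZE 2 MINUTES)
  GROUP BY c.customer_id
  HAVING COUNT(t.txn_id) > 3
  EMIT CHANGES;
```

Because the result is a continuously updated table, the MongoDB Atlas Sink Connector can stream each change straight into Atlas as it happens.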
By the end of this demo, we’ll have covered everything you need to build your first streaming data pipeline.
- GitHub repository for demo: https://github.com/confluentinc/demo-database-modernization
- Streaming data pipelines solution page: https://www.confluent.io/streaming-data-pipelines