
Online Talk

Show Me How: Build Streaming Data Pipelines for Cloud Databases

Watch Now

Available On-demand

Data pipelines do the heavy lifting of helping organizations integrate, transform, and prepare data for downstream systems in operational use cases. However, legacy databases and ETL pipelines hold organizations back as real-time data streaming becomes business critical.

This Show Me How will walk through the story of a bank that uses an Oracle database to store customer information and RabbitMQ as the message broker for credit card transaction events. Their goal is to perform real-time analysis on credit card transactions to flag fraudulent transactions and push these to MongoDB, their new cloud database that powers their in-app mobile notifications.

During this session, we'll show you step by step how to:

  • Connect data sources to Confluent Cloud using Confluent’s fully managed Oracle CDC and RabbitMQ source connectors. We’ll also use a fully managed sink connector to load aggregated, transformed data into MongoDB Atlas (see the first sketch after this list).
  • Process and enrich data in flight using ksqlDB to merge multiple data streams, generating a unified view of customers and their credit card activity in order to flag fraudulent transactions (see the second sketch after this list).
  • Govern your data pipelines using Schema Registry and Stream Lineage (see the third sketch after this list).
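
To make the connector step concrete, here is a minimal sketch of what creating the fully managed connectors could look like from a ksqlDB session. The connector names, hosts, credentials, topics, and most property values below are illustrative placeholders rather than the session's actual configuration; check each connector's Confluent Cloud documentation for the authoritative property list.

    -- Oracle CDC source: snapshot the customer table, then stream row changes to Kafka.
    -- Property names are illustrative; verify against the OracleCdcSource docs.
    CREATE SOURCE CONNECTOR oracle_customers WITH (
      "connector.class"       = 'OracleCdcSource',
      "oracle.server"         = '<oracle-host>',
      "oracle.port"           = '1521',
      "oracle.username"       = '<user>',
      "oracle.password"       = '<password>',
      "table.inclusion.regex" = '.*CUSTOMERS',
      "kafka.auth.mode"       = 'KAFKA_API_KEY',
      "kafka.api.key"         = '<api-key>',
      "kafka.api.secret"      = '<api-secret>'
    );

    -- RabbitMQ source: drain the credit card transaction queue into a Kafka topic.
    CREATE SOURCE CONNECTOR rabbitmq_transactions WITH (
      "connector.class"   = 'RabbitMQSource',
      "rabbitmq.host"     = '<rabbitmq-host>',
      "rabbitmq.username" = '<user>',
      "rabbitmq.password" = '<password>',
      "rabbitmq.queue"    = 'credit_card_transactions',
      "kafka.topic"       = 'credit_card_transactions',
      "kafka.auth.mode"   = 'KAFKA_API_KEY',
      "kafka.api.key"     = '<api-key>',
      "kafka.api.secret"  = '<api-secret>'
    );

    -- MongoDB Atlas sink: load the flagged, aggregated results into the cloud database.
    CREATE SINK CONNECTOR mongodb_fraud_sink WITH (
      "connector.class"     = 'MongoDbAtlasSink',
      "topics"              = 'possible_fraud',
      "connection.host"     = '<atlas-host>',
      "connection.user"     = '<user>',
      "connection.password" = '<password>',
      "database"            = 'bank',
      "collection"          = 'fraud_alerts',
      "kafka.auth.mode"     = 'KAFKA_API_KEY',
      "kafka.api.key"       = '<api-key>',
      "kafka.api.secret"    = '<api-secret>'
    );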
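
The processing step could then look roughly like the following ksqlDB statements. The topic names, column names, and fraud threshold are assumptions for illustration only; the session builds its own variant of this logic.

    -- Raw credit card transaction events landed by the RabbitMQ connector.
    CREATE STREAM transactions (
      user_id     BIGINT,
      card_number VARCHAR,
      amount      DOUBLE
    ) WITH (
      KAFKA_TOPIC  = 'credit_card_transactions',
      VALUE_FORMAT = 'JSON'
    );

    -- Customer reference data captured from Oracle by the CDC connector.
    CREATE TABLE customers (
      id         BIGINT PRIMARY KEY,
      first_name VARCHAR,
      last_name  VARCHAR,
      email      VARCHAR
    ) WITH (
      KAFKA_TOPIC  = 'oracle_customers',
      VALUE_FORMAT = 'JSON'
    );

    -- Join the two feeds into a unified customer view and flag
    -- windows of unusually high spend as possible fraud.
    CREATE TABLE possible_fraud WITH (KAFKA_TOPIC = 'possible_fraud') AS
      SELECT c.id, c.first_name, c.last_name, c.email,
             COUNT(*)      AS txn_count,
             SUM(t.amount) AS total_spend
      FROM transactions t
      JOIN customers c ON t.user_id = c.id
      WINDOW TUMBLING (SIZE 2 HOURS)
      GROUP BY c.id, c.first_name, c.last_name, c.email
      HAVING SUM(t.amount) > 5000
      EMIT CHANGES;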
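
Governance is largely point-and-click in Confluent Cloud: Stream Lineage renders the end-to-end topology of the pipeline in the UI, and Schema Registry takes over once topics carry a schema-aware format. As a small illustration of the latter (assuming the transactions stream sketched above), re-serializing a stream as Avro makes ksqlDB register and version the schema automatically:

    -- ksqlDB registers the Avro value schema with Schema Registry
    -- (under the subject 'transactions_avro-value'), so downstream
    -- consumers and sink connectors get enforced, versioned schemas.
    CREATE STREAM transactions_avro WITH (
      KAFKA_TOPIC  = 'transactions_avro',
      VALUE_FORMAT = 'AVRO'
    ) AS
      SELECT * FROM transactions EMIT CHANGES;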

We’ll have a Q&A to answer any of your questions. Register today and learn to build your own streaming data pipelines.

Additional Resources

Confluent Cloud Demo

Join a live demo of Confluent Cloud, the industry's only fully managed, cloud-native event streaming platform powered by Apache Kafka.

Kafka Microservices

Explore key concepts, use cases, and best practices for bringing powerful real-time streaming capabilities to microservices architectures.

E-book: Microservices Customer Stories

See how five organizations across different industries leveraged Confluent to build a new class of event-driven microservices.