
Online Talk

Show Me How: Build Streaming Data Pipelines from SQL Server to MongoDB

Watch Now

Available On-demand

Data pipelines do the heavy lifting of helping organizations integrate, transform, and prepare data for downstream systems in operational use cases. But legacy databases and rigid batch-based pipelines hold organizations back as real-time data streaming becomes a business-critical technology. Streaming pipelines have become essential for businesses to serve modern consumers.

During this hands-on workshop, we'll guide you through the journey of an ecommerce company that started with siloed data spread across multiple environments. See how integrating and processing real-time customer order and clickstream data across various sources enabled them to unlock Customer 360 and build hyper-personalized campaigns so customers get faster, better experiences.

You’ll learn how to:

  • Connect data sources and sinks to Confluent Cloud using Confluent’s fully managed SQL Server CDC (Change Data Capture) Source Connector and MongoDB Atlas Sink Connector (a connector sketch follows this list).
  • Process data streams using ksqlDB to join and enrich customer data, generating a unified Customer 360 view (see the ksqlDB sketch below).
  • Govern data using Schema Registry, Stream Lineage, and Stream Catalog (see the Schema Registry sketch below).
  • Share ready-to-use data products securely in one click with teams and external organizations.
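
As a rough companion to the connector step above, the sketch below shows the general shape of creating a fully managed connector on Confluent Cloud programmatically. The API path, environment and cluster IDs, credentials, and config keys are illustrative assumptions rather than verbatim from this talk; check the Confluent Cloud Connect API and SQL Server CDC Source Connector documentation for the exact names.

```python
# Hypothetical sketch: creating a fully managed SQL Server CDC source connector
# on Confluent Cloud over HTTP. Endpoint path, IDs, and config key names are
# illustrative -- verify them against the current Confluent Cloud docs.
import requests

ENV_ID = "env-xxxxx"          # placeholder environment ID
CLUSTER_ID = "lkc-xxxxx"      # placeholder Kafka cluster ID
URL = (f"https://api.confluent.cloud/connect/v1/environments/"
       f"{ENV_ID}/clusters/{CLUSTER_ID}/connectors")

connector = {
    "name": "sqlserver-orders-cdc",                 # illustrative connector name
    "config": {
        "connector.class": "SqlServerCdcSource",    # fully managed CDC source
        "kafka.api.key": "<kafka-api-key>",
        "kafka.api.secret": "<kafka-api-secret>",
        "database.hostname": "sqlserver.example.com",
        "database.port": "1433",
        "database.user": "cdc_reader",
        "database.password": "<db-password>",
        "database.dbname": "ecommerce",
        "table.includelist": "dbo.orders,dbo.customers",  # key name may differ
        "output.data.format": "AVRO",
        "tasks.max": "1",
    },
}

resp = requests.post(
    URL,
    json=connector,
    auth=("<cloud-api-key>", "<cloud-api-secret>"),  # Confluent Cloud API key
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```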
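
For the ksqlDB step, here is a minimal sketch of a stream-table join that enriches an orders stream with customer attributes, submitted to ksqlDB's /ksql REST endpoint. The stream, table, and column names are hypothetical; adapt them to your own topics and schemas.

```python
# Hypothetical sketch: submitting a ksqlDB statement to the /ksql REST endpoint
# to enrich an orders stream with customer data for a Customer 360 view.
# Stream, table, and column names are made up for illustration.
import requests

KSQLDB_ENDPOINT = "https://<ksqldb-host>:443/ksql"   # your ksqlDB API endpoint

ENRICH_STATEMENT = """
CREATE STREAM orders_enriched AS
  SELECT o.order_id,
         o.amount,
         c.customer_id,
         c.name,
         c.email
  FROM orders o
  JOIN customers c
    ON o.customer_id = c.customer_id
  EMIT CHANGES;
"""

resp = requests.post(
    KSQLDB_ENDPOINT,
    json={"ksql": ENRICH_STATEMENT, "streamsProperties": {}},
    auth=("<ksqldb-api-key>", "<ksqldb-api-secret>"),
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```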
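
For the governance step, a minimal sketch of registering an Avro value schema with Schema Registry using the confluent-kafka Python client. The subject name and schema fields are hypothetical, and Stream Lineage and Stream Catalog are typically worked with from the Confluent Cloud UI or their own APIs.

```python
# Hypothetical sketch: registering an Avro schema with Confluent Schema Registry
# via the confluent-kafka Python client. Subject name and fields are illustrative.
from confluent_kafka.schema_registry import Schema, SchemaRegistryClient

sr_client = SchemaRegistryClient({
    "url": "https://<schema-registry-endpoint>",
    "basic.auth.user.info": "<sr-api-key>:<sr-api-secret>",
})

order_schema = Schema(
    schema_str="""
    {
      "type": "record",
      "name": "Order",
      "fields": [
        {"name": "order_id", "type": "string"},
        {"name": "customer_id", "type": "string"},
        {"name": "amount", "type": "double"}
      ]
    }
    """,
    schema_type="AVRO",
)

# Register the schema under the subject for the orders topic's value.
schema_id = sr_client.register_schema("orders-value", order_schema)
print(f"Registered schema id: {schema_id}")
```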

Don’t miss our live Q&A! Register today and get started building your own streaming pipelines.

Additional Resources

Confluent Cloud Demo

Join a live demo of Confluent Cloud, the industry's only fully managed, cloud-native event streaming platform built on Apache Kafka.
Kafka Microservices

Explore the key concepts, use cases, and best practices for bringing powerful real-time streaming capabilities to microservices architectures.
e-book: Microservices Customer Stories

See how five organizations across a range of industries used Confluent to build a new class of event-driven microservices.