Getting Started with KSQL Recipes
To accelerate your stream processing projects, we've pre-built a series of common use cases and KSQL code samples. Much like a recipe, each one provides the steps to get you started.
Level Up Your KSQL
Whether you are brand new to KSQL or ready to take it to production, now you can dive deep into core KSQL concepts: streams and tables, enriching unbounded data, aggregations, scalability and security configuration, and more.
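The stream/table duality at the heart of KSQL comes down to two statements. The sketch below is a minimal illustration; the pageviews topic, its format, and its columns are assumptions, not part of the recipes above:
-- Register a stream over an existing Kafka topic (topic name and columns are assumed)
CREATE STREAM pageviews (viewtime BIGINT, userid VARCHAR, pageid VARCHAR)
WITH (KAFKA_TOPIC='pageviews', VALUE_FORMAT='JSON');
-- Derive a continuously updated table by aggregating the unbounded stream
CREATE TABLE pageviews_per_user AS
SELECT userid, count(*)
FROM pageviews
GROUP BY userid;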
Use Cases and Examples
Streaming ETL
Apache Kafka is a popular choice for powering data pipelines. KSQL makes it simple to transform data within the pipeline, readying messages to land cleanly in another system.
CREATE STREAM vip_actions AS
SELECT userid, page, action
FROM clickstream c
LEFT JOIN users u ON c.userid = u.user_id
WHERE u.level = 'Platinum';
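For the join to run, the source stream and lookup table must first be registered over their Kafka topics. A minimal sketch, in which the topic names, formats, and column types are assumptions:
-- Assumed source declarations for the join above; names and types are illustrative
CREATE STREAM clickstream (userid VARCHAR, page VARCHAR, action VARCHAR)
WITH (KAFKA_TOPIC='clickstream', VALUE_FORMAT='JSON');
CREATE TABLE users (user_id VARCHAR, level VARCHAR)
WITH (KAFKA_TOPIC='users', VALUE_FORMAT='JSON', KEY='user_id');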
Anomaly Detection
KSQL is a good fit for identifying patterns and anomalies in real-time data. By processing the stream as data arrives, you can identify and surface out-of-the-ordinary events with millisecond latency.
CREATE TABLE possible_fraud AS
SELECT card_number, count(*)
FROM authorization_attempts
WINDOW TUMBLING (SIZE 5 SECONDS)
GROUP BY card_number
HAVING count(*) > 3;
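Because possible_fraud is continuously maintained, an application can subscribe to it with a continuous query and react as soon as a card crosses the threshold. A sketch (note that newer ksqlDB versions require an EMIT CHANGES clause on such queries):
-- Continuous query: emits a row each time a card's count in the current window changes
SELECT * FROM possible_fraud;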
Monitoring
Kafka’s ability to provide scalable, ordered message delivery combined with stream processing makes it a common solution for monitoring and alerting on log data. KSQL lends a familiar SQL syntax for tracking, understanding, and managing alerts.
CREATE TABLE error_counts AS
SELECT error_code, count(*)
FROM monitoring_stream
WINDOW TUMBLING (SIZE 1 MINUTE)
WHERE type = 'ERROR'
GROUP BY error_code;
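The same pattern extends naturally to alerting. As a sketch, the statement below derives an alert table from the same stream; the threshold of 100 errors per minute is an assumed value chosen for illustration:
-- Hypothetical alert table; the 100-errors-per-minute threshold is an assumption
CREATE TABLE error_alerts AS
SELECT error_code, count(*)
FROM monitoring_stream
WINDOW TUMBLING (SIZE 1 MINUTE)
WHERE type = 'ERROR'
GROUP BY error_code
HAVING count(*) > 100;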