
A (Stream Processing) Recipe for Thankfulness

Earlier this year, we introduced Stream Processing Use Case Recipes to help folks tackle real-world use cases through Apache Kafka® and ksqlDB quickstarts and tutorials. These recipes cover a wide range of topics and industries, ensuring that there’s something for almost everyone!

Today we’re introducing a new recipe that covers another common use case: survey response analysis. And to honor the upcoming US holiday, it just made sense to make this recipe Thanksgiving-themed.

If you’ve ever found yourself needing to analyze survey responses in real time so that your company could make a well-informed decision, or if you’re trying to bake a last-minute pie for Friendsgiving that will appeal to as many people in your friend group as possible, then this use case recipe is for you!

Read on to see how this recipe for real-time survey analysis comes together and how you, too, can bake up something sweet!

From the recipe book

“Please rate your service with us today.”

“How likely are you to recommend us to a friend?”

Regardless of the questions being asked or who the target audience is, surveys are an opportunity for businesses to connect with their customers, users, and employees to get a better idea of how they’re doing on the whole. The data and insights that can be collected through a simple survey are endless. 

It should come as no surprise that survey responses, like many other datasets, are more valuable when analyzed closer to when they’re recorded. Insights from a survey can go stale, and the responses themselves have the potential to lose value the longer they sit around waiting to be analyzed. Think about it: If your customer had a poor experience with your product or service, you’d want to know about it as soon as possible; it’s one of the reasons you’d implement a feedback survey in the first place. But that immediate feedback is practically useless if it’s analyzed hours or even days after the interaction. Wouldn’t it be better to understand and, more importantly, act on the survey responses in real time?

Preheat the oven

Before we dive into any baking, it’s always a good idea to make sure that your environment is set up properly. In this case, that means having a Confluent Cloud account and ksqlDB cluster ready to go.

Ingredients

For this recipe, we need some data to serve as input. Now, there are a ton of different ways that your business could be gathering survey results, so we won’t dive too much into the implementation details. For now, it’s safe to assume that the survey results are being collected and kept in some sort of data store. 

When you need to move data from a data store into Kafka, your first thought should be to use a Kafka Connect source connector. In the context of data ingestion, Kafka Connect is a configuration-driven tool that programmatically polls data sources and writes the data into Kafka.

If we were running an internal employee survey through ServiceNow, for example, the survey results and information on the respondents might be stored in a couple of ServiceNow tables. To get that data into a Confluent Cloud Kafka cluster, use the fully managed ServiceNow Source Connector.
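In ksqlDB, standing up that connector is a single statement. The configuration below is only a sketch: the property names and placeholder values are illustrative, so check them against the fully managed connector's documentation before running anything.

  CREATE SOURCE CONNECTOR survey_responses_source WITH (
    'connector.class'     = 'ServiceNowSource',
    'kafka.api.key'       = '<my-kafka-api-key>',
    'kafka.api.secret'    = '<my-kafka-api-secret>',
    'servicenow.url'      = 'https://<my-instance>.service-now.com',
    'servicenow.table'    = '<survey-results-table>',
    'servicenow.user'     = '<username>',
    'servicenow.password' = '<password>',
    'kafka.topic'         = 'survey-responses',
    'output.data.format'  = 'JSON',
    'tasks.max'           = '1'
  );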

If you don’t have a dataset to use just yet, don’t worry. The recipe offers up some convenient sample data that you can insert directly into a Kafka topic using ksqlDB later on.
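For example, once the survey-responses stream from the instructions below is in place, a few made-up rows can be inserted straight from the ksqlDB editor (the values here are invented for illustration):

  INSERT INTO survey_responses (respondent_id, survey_id, response)
    VALUES ('p01', 'pie-2022', 'pumpkin');
  INSERT INTO survey_responses (respondent_id, survey_id, response)
    VALUES ('p02', 'pie-2022', 'pecan');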

Instructions 

If you’d like to follow along, the survey analysis recipe has everything you need to bring all of the ingredients together. Or, you can read on for a summary of how it happens.

  • Create two Kafka topics, survey-responses and survey-respondents.
  • [Optional] If using a Source Connector to fetch your responses and respondents datasets, execute the CREATE SOURCE CONNECTOR command in the ksqlDB console.
  • Next, measure out the survey-respondents and survey-responses data into a table and a stream, respectively. 
  • Mask the respondent data. Set aside.
  • Gently add the masked respondent data to the response data. Stir to combine.
  • Using functions, form into desired shapes. Note from the chef: There are so many ways to transform your data in ksqlDB, so feel free to get creative here!
  • Place in the ksqlDB editor to execute. Bake indefinitely. (A condensed ksqlDB sketch of these steps follows this list.)
  • Enjoy your real-time survey analysis!
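Condensed into ksqlDB, the whole bake might look something like the sketch below. The column names and schemas are illustrative stand-ins rather than the recipe's exact statements, so treat this as a taste test, not the full recipe.

  -- Measure the respondents into a table and the responses into a stream.
  CREATE TABLE survey_respondents (
    respondent_id VARCHAR PRIMARY KEY,
    name          VARCHAR,
    team          VARCHAR
  ) WITH (KAFKA_TOPIC = 'survey-respondents', VALUE_FORMAT = 'JSON');

  CREATE STREAM survey_responses (
    respondent_id VARCHAR KEY,
    survey_id     VARCHAR,
    response      VARCHAR
  ) WITH (KAFKA_TOPIC = 'survey-responses', VALUE_FORMAT = 'JSON');

  -- Mask the respondent data. Set aside.
  CREATE TABLE masked_respondents AS
    SELECT respondent_id, MASK(name) AS name, team
    FROM survey_respondents;

  -- Gently fold the masked respondents into the responses.
  CREATE STREAM enriched_responses AS
    SELECT r.respondent_id AS respondent_id, m.name, m.team, r.survey_id, r.response
    FROM survey_responses r
    JOIN masked_respondents m ON r.respondent_id = m.respondent_id;

  -- Shape to taste: a continuously updating tally per survey answer.
  CREATE TABLE survey_results AS
    SELECT survey_id, response, COUNT(*) AS total
    FROM enriched_responses
    GROUP BY survey_id, response;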

That’s a pretty good recipe. But how do we make it even better?

Adjust to taste

If you’ve ever followed a recipe at home, you probably know that most recipes are more or less loose guidelines. As long as you follow the basic gist, you generally have some flexibility to mix in new flavors to suit your tastes. It’s exactly the same with these stream processing use case recipes.

With that in mind, I had some ideas for how to spice up this survey analysis recipe and make it useful for me personally.

I’m an avid baker: from sourdough to pastries and everything in between, I love trying new recipes and baking up treats for friends and family. But as the holiday season approaches, I have a problem… What kind of pie(s) should I make for Thanksgiving to appeal to the most people?

Off to the farmer’s market

Because we all know that every recipe is better with farm-fresh produce, the biggest change I made to the recipe was where I got my data. In the past, if I wanted to poll my friends for their preferences on anything, I’d reach out to each of them individually and ask. But that’s boring and inefficient. 

Inspired by the recipe, and knowing that there is no shortage of questions that I could use to poll my friends, I created an interactive survey-issuing bot using Telegram. This survey bot takes the place of a connector in my pipeline and writes data directly to Kafka for me. All of my survey questions are maintained in a Kafka topic for the Telegram bot to consume. From there, my friends can view and respond to the surveys on their phones through a conversation with the Telegram bot.

When my friends are done responding to the survey, the information is collected, formatted nicely according to a schema, and produced back into Kafka using a bit of Python. And of course, with the survey response data in Kafka, I’m exactly where the original recipe starts off, and I can dive right into the analysis.
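If you're curious what that last step might look like, here's a minimal sketch using the confluent-kafka Python client. The topic name, message shape, and connection placeholders are illustrative rather than the bot's actual code; the real thing lives in the GitHub project linked below.

  import json
  from confluent_kafka import Producer

  # Hypothetical response collected by the Telegram bot.
  response = {"respondent_id": "p01", "survey_id": "pie-2022", "response": "pumpkin"}

  # Confluent Cloud connection details are placeholders.
  producer = Producer({
      "bootstrap.servers": "<bootstrap-server>",
      "security.protocol": "SASL_SSL",
      "sasl.mechanisms": "PLAIN",
      "sasl.username": "<api-key>",
      "sasl.password": "<api-secret>",
  })

  # Key by respondent so each friend's answers land in the same partition.
  producer.produce(
      "survey-responses",
      key=response["respondent_id"],
      value=json.dumps(response),
  )
  producer.flush()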

If you’re curious to learn more about the bot, take a look at the project on GitHub.

Serving it up, real time

All that’s left to do is issue the Thanksgiving pie survey, and the pipeline I built will do the rest! As soon as my friends’ responses flow into Kafka, I can monitor the results in real time and view the aggregate results to see what pies my friends prefer.
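Checking on the bake is just a push query against the results table. Using the hypothetical names from the sketch above, it would look like this:

  SELECT survey_id, response, total
    FROM survey_results
    EMIT CHANGES;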

And all that adds up to some very tasty pies at my next Friendsgiving!

Conclusion

As this blog post illustrates, Stream Processing Use Case Recipes are a great way to dive into the world of stream processing. They provide a solid foundation for you to expand on a variety of common use cases and make them your own. Even if you’re not a chef, it doesn’t take much effort to start with a plain recipe for real-time survey analysis and dress it up into a fun Telegram application that an entire friend group can be thankful for.

For more on Apache Kafka and its fun applications, take a look at the other Stream Processing Use Case Recipes and the resources on the Confluent blog.

About the author

Danica began her career as a software engineer in data visualization and warehousing with a business intelligence team, where she served as the point person for standards and best practices in data visualization across her company. In 2018, Danica moved to San Francisco and pivoted to backend engineering with a derivatives data team responsible for building and maintaining the infrastructure that processes millions of financial market data points per second in near real time. Her first project on this team involved Kafka Streams, and she never looked back. Danica now works as a Developer Advocate with Confluent, where she helps others get the most out of their event-driven pipelines.

Outside of work, Danica is passionate about sustainability, increasing diversity in the technical community, and keeping her many houseplants alive. She can be found on Twitter, tweeting about tech, plants, and baking @TheDanicaFine.
