
12 Days of Tech Tips

For the first 12 days of December, Confluent shared a daily tech tip related to managing Apache Kafka® in the cloud. These tips make it easier for you to get started with Kafka, ramp up on KSQL, and move applications to the cloud. We’ve collected all the tips here, so if your New Year’s resolution is learning to use Kafka or its ecosystem—bookmark this page!

Day 1

One of the easiest ways to get started with Apache Kafka is by…not installing Kafka at all. You can get started in three minutes and nine seconds by creating a cluster on Confluent Cloud.

Tech Tip 1

Day 2

Once you have access to your cluster, the next logical step for Java developers is to learn the APIs by writing a small application. Confluent Cloud is secured by default, so your first application will include a secured connection to the Kafka cluster…Let us show you how easy it is!

Tech Tip 2
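As a sketch of what that secured connection looks like, the snippet below builds the client properties a producer or consumer needs to reach a Confluent Cloud cluster, which is secured with SASL_SSL by default. The bootstrap endpoint and the API key/secret are placeholders; substitute the values issued for your own cluster.

```java
import java.util.Properties;

public class CloudClientConfig {

    /**
     * Builds the properties a Kafka client needs to reach a Confluent Cloud
     * cluster. Confluent Cloud authenticates clients over SASL_SSL using an
     * API key and secret as the PLAIN username and password.
     */
    public static Properties build(String bootstrapServers, String apiKey, String apiSecret) {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrapServers);
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"" + apiKey + "\" password=\"" + apiSecret + "\";");
        return props;
    }

    public static void main(String[] args) {
        // Placeholder endpoint and credentials; substitute your own.
        Properties props = build("pkc-12345.us-west-2.aws.confluent.cloud:9092",
                "MY_API_KEY", "MY_API_SECRET");
        System.out.println(props.getProperty("security.protocol")); // prints SASL_SSL
        // Add serializer settings, then hand props to new KafkaProducer<>(props).
    }
}
```

The same properties work for consumers and admin clients; only the serializer/deserializer settings differ.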

Day 3

Once you’ve learned the basic APIs, perhaps it is time to create a data pipeline that transforms events and loads them into an external system—in this case, Google BigQuery. With KSQL and a BigQuery connector, this doesn’t take long.

Tech Tip 3
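As a sketch of the KSQL side of such a pipeline, the statement below (all stream and column names are illustrative) derives a cleaned-up, transformed stream that a BigQuery sink connector could then read from:

```sql
-- Illustrative names: filter and transform a raw pageviews stream into a
-- new stream whose topic a BigQuery sink connector can be pointed at.
CREATE STREAM pageviews_enriched AS
  SELECT userid, pageid, UCASE(region) AS region
  FROM pageviews
  WHERE region IS NOT NULL;
```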

Day 4

If you liked the pipelines example, you might wonder how to run a KSQL server with your Confluent Cloud cluster. This is quite easy when you know about a few simple parameters.

Tech Tip 4
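A minimal sketch of what those parameters look like in a KSQL server properties file, with placeholder endpoint and credentials—essentially the same security settings a Java client uses, plus a listener for the KSQL REST interface:

```properties
# ksql-server.properties (illustrative values)
bootstrap.servers=pkc-12345.us-west-2.aws.confluent.cloud:9092
listeners=http://0.0.0.0:8088
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="MY_API_KEY" password="MY_API_SECRET";
```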

Day 5

But why stop there? Apache Kafka and the Confluent Platform include many more components, and you probably want to run all of them—Confluent Schema Registry, connectors, Confluent REST Proxy and more. Our Terraform automation will help you spin all of them up.

Tech Tip 5

Day 6

And if you use Kafka Streams to build event-driven microservices, you will want to connect those to Confluent Cloud, too.

Tech Tip 6
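A Kafka Streams application uses the same SASL_SSL settings as any other client, plus a couple of Streams-specific properties. The sketch below uses a placeholder endpoint, credentials, and application id; note that the internal topics Kafka Streams creates must use replication factor 3, since Confluent Cloud requires replicated topics.

```java
import java.util.Properties;

public class StreamsCloudConfig {

    /**
     * Kafka Streams settings for Confluent Cloud. The application id names
     * the app's consumer group and internal topics; replication.factor
     * controls the internal topics Streams creates on the cluster.
     */
    public static Properties build(String bootstrapServers, String apiKey, String apiSecret) {
        Properties props = new Properties();
        props.put("application.id", "my-streams-app"); // illustrative app id
        props.put("bootstrap.servers", bootstrapServers);
        props.put("replication.factor", "3"); // required replication on Confluent Cloud
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"" + apiKey + "\" password=\"" + apiSecret + "\";");
        return props;
    }

    public static void main(String[] args) {
        // Placeholder values; these properties would be handed to
        // new KafkaStreams(topology, props).
        Properties props = build("pkc-12345.us-west-2.aws.confluent.cloud:9092",
                "MY_API_KEY", "MY_API_SECRET");
        System.out.println(props.getProperty("replication.factor")); // prints 3
    }
}
```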

Day 7

If all goes well, then you’ve developed an app or two and have run them against Confluent Cloud. This means that you are now working toward a production rollout. For production, you’ll want to know about your cluster’s capabilities and limitations, so make sure you read the list.

Tech Tip 7

Day 8

If your production rollout was successful, you want to keep an eye on your cloud usage—what’s your network throughput? What’s your storage usage? Are you close to hitting any limits? You don’t want to be surprised when you hit a limit, or when you look at your next bill.

Tech Tip 8

Day 9

Going to production sometimes means you need extra security. For that, you may decide to run your Confluent Cloud cluster within your corporate VPC. You can use VPC peering to connect the Confluent Cloud VPC to your own.

Tech Tip 9

Day 10

If your organization already uses Kafka, you may need to stream some data from your on-prem cluster to Confluent Cloud so you can use this data in your cloud applications.

Tech Tip 10
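One common way to do this is with Confluent Replicator, which runs as a Kafka Connect source connector and copies topics from one cluster to another. A minimal sketch of a Replicator config—the connector name, topic, and endpoints are placeholders, and the destination gets the usual Confluent Cloud SASL_SSL credentials:

```json
{
  "name": "replicate-to-cloud",
  "config": {
    "connector.class": "io.confluent.connect.replicator.ReplicatorSourceConnector",
    "topic.whitelist": "orders",
    "src.kafka.bootstrap.servers": "onprem-broker:9092",
    "dest.kafka.bootstrap.servers": "pkc-12345.us-west-2.aws.confluent.cloud:9092",
    "dest.kafka.security.protocol": "SASL_SSL",
    "dest.kafka.sasl.mechanism": "PLAIN",
    "dest.kafka.sasl.jaas.config": "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"MY_API_KEY\" password=\"MY_API_SECRET\";",
    "key.converter": "io.confluent.connect.replicator.util.ByteArrayConverter",
    "value.converter": "io.confluent.connect.replicator.util.ByteArrayConverter"
  }
}
```

The byte-array converters copy records verbatim, so the replicated topic in the cloud is byte-for-byte identical to the on-prem original.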

Day 11

You may decide to use your new data replication capabilities to assist in a larger-scale cloud migration effort—2019 is a great year to migrate applications and data to the cloud!

Tech Tip 11

Day 12

Bridging to the cloud is more than just lifting and shifting data from on-prem to the cloud. It is about integrating on-prem and cloud datastores and data processing. Learn how to use Kafka Connect and KSQL to bridge between two ecosystems of data and applications.

Tech Tip 12

With that, enjoy the holidays and have a productive 2019! If you have additional tips to share with new Kafka users, please let us know in the comments below.
