
The PipelineDB Team Joins Confluent

Some years ago, when I was at LinkedIn, I didn’t really know what Apache Kafka® would become but had an inkling that the next generation of applications would not be islands disconnected from one another, or lashed together with irregular, point-to-point bindings. When we founded Confluent, we took the radical approach of viewing data—and the infrastructure that supported it—as a series of real-time streaming events rather than something kept in static, sedentary data repositories. In the process, we created a new way for developers to think about creating modern applications that lean on a backbone of data in flight.

In four short years, Confluent has grown from an idea into an industry pioneer. And today, event streaming is changing the trajectory and velocity of business with more than 60 percent of the Fortune 100 rearchitecting their core data infrastructure around an event streaming platform. The catalyst behind this movement wasn’t necessarily just the technology but rather a community of developers who imagined what was possible and adopted this paradigm shift with us.

Our belief is that as companies collect their data in an event streaming platform—one that integrates real-time data across different parts of the company—it is only natural that they also need to process those event streams, joining and summarizing them on the fly. Whether you’re building a fraud detection system or a ride-sharing system, modern applications require data to be joined and summarized to serve their individual needs, with the summarized views continuously updated as new events arrive. That is why we created KSQL, which has since become one of the most popular tools in the Kafka ecosystem.
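As a sketch of what such a continuously updated view looks like in KSQL (the stream, topic, and column names below are hypothetical, chosen only for illustration):

```sql
-- Register an existing Kafka topic of ride events as a KSQL stream.
-- (Topic name and columns are illustrative.)
CREATE STREAM rides (ride_id VARCHAR, city VARCHAR, fare DOUBLE)
  WITH (KAFKA_TOPIC='rides', VALUE_FORMAT='JSON');

-- A continuously updated table: as new ride events arrive on the topic,
-- the per-city counts and fare totals are updated automatically,
-- with no batch job or polling required.
CREATE TABLE rides_per_city AS
  SELECT city, COUNT(*) AS ride_count, SUM(fare) AS total_fare
  FROM rides
  GROUP BY city;
```

The resulting table is itself backed by a Kafka topic, so downstream applications can subscribe to the changing summary rather than re-querying a database.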

We continue to challenge ourselves to help developers expand what is possible and create more value with Confluent. To that end, we’re announcing that the PipelineDB team will be joining Confluent. As you may know, PipelineDB has piqued the interest of many by showing what is possible with a unique take on integrating streaming datasets into Postgres—one of the world’s most popular data-at-rest technologies, and one that underpins the systems built by many of our customers as well as the industry at large. Its underlying approach of continuously updated views over streaming data matches our vision of how developers build summarizations and joins on event streams with KSQL. Instead of pushing data into yet another database to sit idle—waiting to be queried—why not explore how that approach could be applied to event streams in Kafka? That is when both teams got serious about working together.

The PipelineDB team brings with them a wealth of experience spanning both databases and stream processing. They also share our vision of a future powered by event streaming, and I very much look forward to seeing the results of their contributions as we complete the event streaming platform journey we started all those years ago.

Neha Narkhede is co-founder, chief technology and product officer at Confluent, which offers an event streaming platform based on Apache Kafka. Prior to founding Confluent, Neha led streams infrastructure at LinkedIn, where she was responsible for LinkedIn’s streaming infrastructure built on top of Apache Kafka and Apache Samza. She is one of the initial authors of Apache Kafka, and a committer and PMC member on the project.

