Confluent is the underlying infrastructure that lets us orchestrate our business logic. To be honest, our team mostly takes Kafka for granted these days. We’re not worried about outages, and we know we have an architecture that is easy to work with and will scale as Ladder continues to grow.
Apply machine learning to accelerate the life insurance underwriting process, enabling customers to apply for and activate policies within minutes if approved
Use Confluent to deploy a reliable, scalable, and low-maintenance event-driven architecture that streams data to machine learning models in near real time
Obtaining life insurance was—and in many cases still is—a weeks-long, arduous process involving in-person medical exams, paper forms, phone interviews, and brokers. Ladder has eliminated all of that with an online, direct-to-consumer, full-stack approach to life insurance that is powered by data and AI. In just five minutes, eligible customers can apply, get approved, and activate their policy for immediate coverage.
To improve the customer experience, make life insurance easier to get, and streamline the underwriting process, Ladder relies on a continuous flow of data from third-party providers to its AI underwriting engine. From the start, Ladder designed its data architecture around Apache Kafka® for these essential data flows. More recently, however, the company’s explosive growth—more than quadrupling year over year—began to put a strain on this architecture and the team that supports it. To improve scalability and reliability while reducing administrative overhead, Ladder transitioned from self-managed Kafka to Confluent.