
Digitizing Customer Experience in the Travel Industry

Written by Amanda Gilbert

Background: Elevating travel (and vacation) experiences in real time 

Watching “Last Breath” recently (a 2019 documentary on Netflix), I learned that saturation divers on a dive job are transported to their worksite in a diving bell, which is held in place by a dive support vessel (in this story, the Bibby Topaz). When the dynamic positioning system on the Bibby Topaz failed during a heavy storm, the support vessel lost the ability to control its location, wreaking havoc for the divers below. While I really do love the documentary, there’s a reason I reference it here. Much like a diving bell and its support vessel, the applications that power customer experience are only as useful and reliable as the infrastructure that supports them. I also bring it up because today we’re talking about cruise lines and their practical approach to data streaming: relevant data, served in real time to the applications cruise lines run, creates memorable experiences for their customers. Anything short of real time is not just inconvenient, it’s unacceptable.

The Amazon Effect (and the widespread disruption that followed) had implications far beyond retail; the company moved the bar for acceptable inconvenience. Gone are the days of calling a taxi in advance: I can run out the door an hour before my flight and be in a car on the way to the airport in a matter of minutes. A website taking seconds, let alone minutes, to load information is all but an urban legend today. Application users want their data now, and they want it to be accurate, relevant, and consistent. They also want customer service representatives (real or in bot form) to be privy to all of their information from the moment the conversation starts (just the right amount of “human”).

Because we have become unaccustomed to inconvenience in every aspect of our lives, we are especially allergic to it when we are on vacation. I would hate to be a hotel employee telling a customer on vacation that their room isn’t ready, and I would hate even more to be a cruise line employee when a customer’s booked excursion reservation doesn’t show up in the system. On cruises, guests depend on the staff on board and on mobile applications to communicate their needs effectively. When the need involves time off the boat, after long days at sea with less-than-reliable Wi-Fi (more on that later), the bar for acceptable inconvenience is on the seafloor.

Navigating choppy waters: Legacy systems and the resulting challenges

The typical legacy systems onboard cruise ships host data that is served to cruisers’ devices via mobile and web applications. These systems come from companies like Oracle (the Fidelio platform) and SAP (the Hybris platform), and they run alongside homegrown applications. For example, there are reservation systems for both cruise reservations and onboard reservations (daily activities, the spa, etc.), and these systems often span both ship and shore. Before the smartphone revolutionized personal devices, booking daily activities required a trip to the activities desk, where long wait times dug into pool time. The ship, limited in resources and IT personnel, often relied heavily on shore-based systems and their data. Onboard systems needed to be lean, and they required an efficient way to communicate with their shore counterparts that didn’t consume large amounts of network bandwidth.

There is added complexity in the hybrid nature of these architectures. Cruise lines (many of which operate fleets of ~40 ships) want to provide a “seamless” experience to guests across very different technical landscapes. They also need to account for updates wherever they happen (on board or on shore), and customers expect that their consistent point of access (the website, mobile application, and/or customer service line) reflects the latest information. Those of us who work in data integration know that this is not always the case: the systems that power the mobile application are usually different from the ones customer service agents access, and this can create an inconsistent experience for cruise lines and their customers.

Running lean systems in the days of traditional relational databases and monolithic backend applications meant limited capabilities at the application level on ships. Onboard systems often shared limited storage and processing capacity. The introduction of microservices and database software that used hardware more efficiently brought new capabilities to onboard applications, but one challenge remained: ship-to-shore communication. Not only did applications on ships need to communicate back to systems on shore for certain functionality, they also needed to share data back and forth as customers updated reservations, boarded and disembarked the ship, and more. Cruise lines could achieve this only through batch data movement, consuming large amounts of bandwidth and introducing delays between systems.

Typically, the batch data transfer had to occur while the ships were docked, due to the bandwidth limitations at sea. Further, cruise lines often share already limited bandwidth with customers who view Wi-Fi as a necessity in their day-to-day lives. Any system that relied on data transfer (like onboard reservations) was therefore unable to serve real-time information to customers and internal stakeholders. To combat these issues, companies at the forefront of innovation have turned to event-driven architecture to serve their customers in real time.

Smooth seas: The case for data streaming

Data streaming unlocks new customer experience capabilities across all industries, and the cruise and travel industry is no exception. Imagine for a moment that you and your significant other go on a cruise. As an app user and consumer, it makes no difference to you whether you make a reservation request at sea or at port, or which backend systems support that reservation. Your tolerance for delay in the synchronization of these systems is low. This is where Confluent’s cloud-native platform, built on top of Apache Kafka®, comes in, providing a central nervous system for reservations across an entire fleet of ships (40+ ships in many cases) and back at shore. Although exact architectures vary based on the specific technical landscape of each cruise line, there are a few key consistencies:

  1. Source databases on legacy technology provided by companies like SAP and Oracle that serve as the system of record for reservation and customer data on and off the ship.

  2. Limited bandwidth (according to Carnival, the average speed on cruise ships is 10-20 Mbps) that has a tendency to drop, leaving ships and their customers dependent on onboard technology.

  3. A fleet of microservices, accessed through an API gateway, backing the customer-facing front-end application layer.

There are many common technologies used within the cruise industry to capture reservation information and streamline customer experience. Many of these companies are Confluent partners and therefore have strong interoperability with Kafka. Oracle has a suite of cruise-specific technologies built on top of Oracle Fidelio, and SAP Hybris provides a suite of products for billing, commerce, sales, marketing, and customer service. RDBMS technologies like PostgreSQL, MySQL, and SQL Server are also common systems of record. For low-latency retrieval, cruise lines turn to NoSQL databases like Couchbase. Analytics technologies also appear on the shore side, with platforms like Snowflake and Databricks frequently used.
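
As a sketch of that interoperability, here is one way a reservation database like PostgreSQL might be wired into Kafka: registering a Debezium CDC source connector with a self-managed Kafka Connect worker over its REST API. The connector name, hostnames, table names, and credentials below are hypothetical placeholders, not a specific cruise line’s configuration; a real deployment would also configure converters, Schema Registry, and error handling.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterReservationCdcConnector {
    public static void main(String[] args) throws Exception {
        // Hypothetical connector definition: stream row-level changes from the
        // reservation tables into Kafka topics prefixed with "ship01".
        String connector = """
            {
              "name": "reservation-cdc",
              "config": {
                "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
                "tasks.max": "1",
                "database.hostname": "reservations-db",
                "database.port": "5432",
                "database.user": "cdc_user",
                "database.password": "<secret>",
                "database.dbname": "reservations",
                "topic.prefix": "ship01",
                "table.include.list": "public.reservations,public.customers"
              }
            }
            """;

        // Register the connector with the Connect worker's REST API.
        HttpResponse<String> response = HttpClient.newHttpClient().send(
            HttpRequest.newBuilder(URI.create("http://connect:8083/connectors"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(connector))
                .build(),
            HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```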

When building an architecture for ship/shore replication, any system that is introduced needs to be lean, using limited ship resources effectively. The payloads sent between ship and shore also need to be optimized so they don’t hog limited network bandwidth. Finally, whatever processing or transformations happen in real time need to serve data to the customer-facing applications, meaning they need access to customer-specific data and must complete without apparent delay to the user. Furthermore, it is critical to create curated data products optimized for the various applications that consume them. For instance, analytics teams require a very different view of the reservation data than the users of the mobile applications.
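
To illustrate what a lean payload can look like, here is a minimal sketch of a ship-to-shore reservation event that carries only identifiers and changed fields. The type and field names are hypothetical; in practice this contract would typically live as an Avro or Protobuf schema in Schema Registry, so payloads stay compact and versions stay managed.

```java
// A deliberately small event: IDs and changed fields only, so a ship-to-shore
// sync of thousands of updates consumes minimal bandwidth. Field names are
// illustrative, not a real cruise line's schema.
public record ReservationUpdate(
    String reservationId,   // key into the shore-side system of record
    String customerId,      // lets consumers join with customer data downstream
    String status,          // e.g., BOOKED, MODIFIED, CANCELLED
    long updatedAtEpochMs   // when the change was captured on the ship
) {}
```

Next, let’s take a closer look at the tech stack that powers these cruise lines by walking through an example end to end.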

The technical solution

Let’s first take a look at the core components on each ship. 

  • Once data changes are made to the system of record, they are captured in real time with self-managed CDC connectors, and the resulting events are logged to Kafka topics. Although Kafka can process gigabytes per second of throughput, cruise lines typically send much smaller payloads across the wire between ship and shore, while using Kafka in both locations to process reservation updates and application communications. Those who process and analyze CDC data from their shore-side reservation systems can process thousands of rows per second.

  • Kafka Streams powers event-driven microservices that join change data for reservations with pertinent customer information, providing a view of the data that cruisers can access through their mobile application (see the sketch after this list).

  • On the ship, Kafka needs to be self-managed, so cruise lines turn to technologies like Confluent for Kubernetes to consistently deploy and manage their onboard clusters. Central platform teams manage these ship clusters and monitor them centrally with technologies like Health+.
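
To make the join concrete, here is a minimal Kafka Streams sketch. The broker address, the topic names (reservation-updates, customer-profiles, reservation-view), and the string-concatenation joiner are hypothetical simplifications for brevity; a production service would use Avro or Protobuf serdes and a proper domain model.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;

import java.util.Properties;

public class ReservationEnricher {
    public static void main(String[] args) {
        // Hypothetical shipboard broker address and string serdes for brevity.
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "reservation-enricher");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "shipboard-kafka:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // CDC events from the reservation system of record, keyed by customer ID.
        // Both topics must be co-partitioned on that key for the join to work.
        KStream<String, String> reservations = builder.stream("reservation-updates");
        // Customer profiles as a changelog-backed table, also keyed by customer ID.
        KTable<String, String> customers = builder.table("customer-profiles");

        // Join each reservation change with the latest customer profile and
        // publish the enriched view for the mobile application to read.
        reservations
            .join(customers, (reservation, customer) -> reservation + "|" + customer)
            .to("reservation-view");

        new KafkaStreams(builder.build(), props).start();
    }
}
```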

Next, let’s check out the systems on shore. 

  • On shore, the cluster can either be hosted in a self-managed capacity (to mirror the onboard setup as closely as possible), or cruise lines can turn to a fully managed service like Confluent Cloud to sit alongside a suite of cloud-based SaaS applications and lower their operational overhead and failure points. Cruise lines that use Confluent Cloud can also offload the management of Kafka Connect with fully managed connectors for integrating external systems, as well as fully managed stream processing technologies like ksqlDB and Flink.

  • To replicate data between ship and shore clusters, cruise lines can choose between Confluent Replicator and Cluster Linking, depending on topic structure and synchronization needs (see the consumer sketch after this list).

  • Finally, since there are typically no Kafka admins on the ships themselves, centralized platform teams export metrics to a centralized monitoring solution, leaning on technologies like Confluent Health+, Prometheus, Grafana, and Datadog. This piece of the architecture is critical, given the number of ships in a typical fleet and the need to monitor them alongside the shore cluster.
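
As a sketch of how a shore-side service might pick up the replicated stream, the consumer below assumes a mirror topic named ship01.reservation-view on a Confluent Cloud cluster; the bootstrap address, credentials, and topic name are hypothetical placeholders.

```java
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.List;
import java.util.Properties;

public class ShoreReservationSync {
    public static void main(String[] args) {
        // Hypothetical Confluent Cloud connection settings for illustration.
        Properties props = new Properties();
        props.put("bootstrap.servers", "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092");
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username=\"<API_KEY>\" password=\"<API_SECRET>\";");
        props.put("group.id", "shore-reservation-sync");
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // With Cluster Linking, the shore cluster carries a mirror of the
            // ship topic under the same (or a prefixed) name.
            consumer.subscribe(List.of("ship01.reservation-view"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    // Hand the enriched reservation to shore systems of record,
                    // customer service tooling, or the analytics pipeline.
                    System.out.printf("customer=%s reservation=%s%n", record.key(), record.value());
                }
            }
        }
    }
}
```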

The reference architecture below shows all the components discussed above in greater detail.

Reference architecture: digitizing customer experiences in travel

Sailing into the future: How data streaming transforms customer experiences

With a modern data streaming platform, cruise lines unlock a number of operational and customer experience benefits. Not only are they able to provide a more modern suite of customer-facing applications onboard, they also unlock the ability to synchronize their shipboard applications with those that serve customers on shore. They’re able to cultivate real-time insights and serve their customers better, providing a consistent experience for cruisers no matter which system they use to create reservations. From a marketing perspective, cruise lines can curate specific recommendations for customers based on a combination of analytical insights and customer-specific information. The new generation of cruise ships brings sophisticated technology that meets customers’ real-time expectations.

Learn more about this topic in the Kafka Summit London session: Seamless Guest Experience.

  • Amanda Gilbert is a Staff Solutions Engineer at Confluent and a Confluent Cloud subject matter expert. She has been working in the data engineering space since graduating from college in 2014. She is passionate about building event-driven architectures that utilize existing technologies, limit technical debt, and create performant applications. Amanda lives in Baltimore, MD, where she enjoys playing poker, hiking, traveling, and reading in her free time.

