
Digital Transformation in Style: How Boden Innovates Retail Using Apache Kafka

Written by Matt Simpson

As a clothing retailer with more than 1.5 million customers worldwide, Boden is always looking to capitalise on business moments to drive sales. For example, when the Duchess of Cambridge is spotted wearing one of our dresses, we need to be ready to respond rapidly to the increased interest and potential surge in demand that this moment generates.

This type of scenario—among numerous others that underscore the importance of near-real-time responses—has led Boden to embark on its own digital transformation journey with event streaming and Apache Kafka®. The batch-oriented, legacy monoliths that we relied on for many years after boden.com launched in 1999 are not well suited to enabling the responsiveness that our business now needs.

As the systems we had in place for catalog-driven sales struggled to keep pace with the company’s growth and new omnichannel approach, we envisioned a new IT architecture based on microservices and event streaming that would enable us to address two pressing needs. The first was integration: Some of our systems (order management, for example) had hundreds of point-to-point integrations, and there was no way to safely carve out these systems and replace them. The second need centred on timeliness: As we shifted more from catalog to online, the traditional data-warehouse-oriented sales reports we ran overnight began to look slower and slower. We wanted to see what was happening on the web right now, not tomorrow morning.

Shortly after deciding to launch our event streaming journey with Kafka, we made another key decision: not to self-manage Kafka. We are primarily an AWS cloud and on-premises Microsoft business, so the idea of spinning up Linux boxes to run Kafka ourselves was daunting and risky. To mitigate that risk, we identified Confluent Cloud. Going with Confluent Cloud helped us focus on integrating microservices without coupling them and enabled us to decouple core systems containing multiple domains of data from third-party systems, ultimately speeding up our development processes. It also enabled us to move to an event-driven architecture that gives us unlimited retention and a direct connection to our new data platform, with no concerns about the complexities of the underlying infrastructure or how to scale it.

Selecting and completing our first projects with Kafka

As a first proof-of-concept (POC) project, we built a retail stock service that would be used to provide up-to-date views of stock levels as they changed due to online orders and in-store purchases. For retailers, stock is really the lifeblood of the business. You can have the best website and the best marketing in the world, but if you can’t deliver customer orders because your stock is not managed correctly, you’re not going to retain customers and grow your business. The POC gave us great insight into how we wanted to set up our microservice architecture, but it also highlighted why stock wouldn’t be the best place to start our transformation: it was too embedded in the as-is world, and we knew it would be better to start with something that carried less legacy impact and change.

Instead, we selected a product domain service as the initial pilot. Because we were in the process of implementing a product information management (PIM) system, we wanted the rich data now available from that system to reach downstream systems and integrations, such as our website and marketplace selling channels, as quickly as our digital team could enhance product information.

Our PIM system is great at information management, but it’s not great at making that information highly available to downstream applications. We needed a way to expose changes in product data from the PIM as events that could then be consumed by multiple downstream systems. We also needed to avoid our old habit of point-to-point (P2P) integrations, with specific contracts between our PIM and other systems, where any change to a downstream system or to the PIM interface forces changes to all of those P2P integrations as well. By generating standard product change events, any system, including our analytics platform, can get near-real-time product information by subscribing to the events. On top of this, we can ensure compatibility (forward and/or backward) by maintaining versions of the event schema within the Confluent Schema Registry.
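
To make this concrete, here is a minimal sketch of what publishing a product change event through Schema Registry could look like with the confluent-kafka Python client. The topic name, schema, and field names are hypothetical illustrations, not our actual contracts:

```python
from confluent_kafka import Producer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer
from confluent_kafka.serialization import SerializationContext, MessageField

# Hypothetical versioned schema for a product change event; Schema Registry
# enforces the configured (backward/forward) compatibility rules on updates.
PRODUCT_CHANGED_SCHEMA = """
{
  "type": "record",
  "name": "ProductChanged",
  "namespace": "com.example.product",
  "fields": [
    {"name": "product_id", "type": "string"},
    {"name": "name", "type": "string"},
    {"name": "attributes", "type": {"type": "map", "values": "string"}}
  ]
}
"""

schema_registry = SchemaRegistryClient({"url": "https://schema-registry.example:8081"})
serializer = AvroSerializer(schema_registry, PRODUCT_CHANGED_SCHEMA)
producer = Producer({"bootstrap.servers": "broker.example:9092"})

event = {"product_id": "B0001", "name": "Summer Dress", "attributes": {"colour": "navy"}}

# Serialize against the registered schema and publish; any subscriber
# (website, marketplace feed, analytics) can consume the same event.
producer.produce(
    topic="product.changed",
    key="B0001",
    value=serializer(event, SerializationContext("product.changed", MessageField.VALUE)),
)
producer.flush()
```

Because the serializer validates each event against the registered schema version, an incompatible change to the event shape is rejected before it ever reaches consumers.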

These product events, coupled with our rich PIM system, will allow us to constantly tweak and enhance our product-selling attributes and have those changes show across various channels very quickly, where previously such changes often took days to propagate. So now when the Duchess of Cambridge wears Boden, we can highlight this on our website, allowing customers and prospects to search for and find the outfit she wore.

As we built our initial product domain service, we were also implementing a new cloud data platform, Snowflake, to empower Boden to be a data-driven business with access to near-real-time data, handling both larger datasets and faster-occurring data such as web clicks. We used a combination of clickstreams and ksqlDB to aggregate and organise the raw data from our website and feed it into the Snowflake platform, where it can be matched to our traditional BI data for analysis of web sessions against orders, giving our business analysts access to data such as customer journeys, drop-offs, and buying preferences. This allows us to better serve our customer base and improve their Boden experience.
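
As an illustration of this kind of pipeline, a clickstream aggregation can be defined as ksqlDB statements and submitted over ksqlDB’s REST API. The stream, topic, and column names below are hypothetical, and the step that sinks the aggregated topic into Snowflake (via a connector) is omitted:

```python
import json
import urllib.request

# Hypothetical ksqlDB statements: declare the raw click stream, then build a
# windowed per-session aggregate that downstream sinks (e.g., Snowflake) can consume.
KSQL = """
CREATE STREAM clickstream (session_id VARCHAR, page VARCHAR, ts BIGINT)
  WITH (KAFKA_TOPIC='web.clicks', VALUE_FORMAT='JSON');

CREATE TABLE session_activity AS
  SELECT session_id,
         COUNT(*) AS page_views,
         COUNT_DISTINCT(page) AS distinct_pages
  FROM clickstream
  WINDOW TUMBLING (SIZE 30 MINUTES)
  GROUP BY session_id;
"""

# Submit the statements to ksqlDB's /ksql endpoint.
request = urllib.request.Request(
    "http://ksqldb.example:8088/ksql",
    data=json.dumps({"ksql": KSQL, "streamsProperties": {}}).encode("utf-8"),
    headers={"Content-Type": "application/vnd.ksql.v1+json; charset=utf-8"},
)
with urllib.request.urlopen(request) as response:
    print(response.read().decode("utf-8"))
```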

The overall accessibility of all this data enables us to analyse more of it as it occurs, even aggregating and enhancing it as it streams, so that we can unlock insights previously not even considered!

What’s next?

Building on the momentum from our initial successes, we are now establishing standards for our business events and architecture patterns for our domain services, including new concepts like aggregation services, which can consume multiple domain events and provide aggregate views of data (e.g., product, price, image, and stock) to downstream consumers, such as third parties and websites.
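
A minimal sketch of such an aggregation service, assuming hypothetical topic names and a naive in-memory view, might look like this: one consumer subscribes to several domain topics and folds each event into a combined per-product record that downstream consumers can read:

```python
import json
from confluent_kafka import Consumer

# Naive in-memory aggregate view keyed by product id; a real service would
# use a persistent store and expose the combined view over an API.
aggregate_view: dict[str, dict] = {}

consumer = Consumer({
    "bootstrap.servers": "broker.example:9092",
    "group.id": "product-aggregation-service",
    "auto.offset.reset": "earliest",
})
# Hypothetical domain topics for product, price, and stock events.
consumer.subscribe(["product.changed", "price.changed", "stock.changed"])

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error() or msg.key() is None:
        continue
    product_id = msg.key().decode("utf-8")
    event = json.loads(msg.value())
    # Fold the domain event into the combined view for this product.
    record = aggregate_view.setdefault(product_id, {})
    record[msg.topic()] = event
```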

This also gives us a way to abstract the complexities of our legacy systems and iteratively replace them, without needing to rebuild hundreds of P2P integrations or take on the technical debt of building integrations to other legacy systems that are themselves slated for replacement.

This brings its own challenges, because many “off-the-shelf” systems are not event driven. However, the important principle for us was not to fall back to P2P integrations but to build a new type of broker service, or service layer, that can consume our business events and “own” the specific interface needs of the consuming third-party application.
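
A sketch of that broker-service idea, assuming a hypothetical third-party REST endpoint and event shape, could be as simple as a consumer that translates the shared business event into the one interface this particular application understands, keeping the P2P knowledge out of our domain services:

```python
import json
import urllib.request
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "broker.example:9092",
    "group.id": "thirdparty-product-broker",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["product.changed"])  # the shared business event

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    event = json.loads(msg.value())
    # Translate the generic event into the one shape this third party expects;
    # the endpoint and payload fields below are hypothetical.
    payload = {"sku": event["product_id"], "title": event["name"]}
    request = urllib.request.Request(
        "https://thirdparty.example/api/products",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)
```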

We have also focussed on education and preparing our IT department for the next big wave of change. On the education front, we’re hosting Kafka 101 talks led by Confluent engineers and our tech leads. These talks are designed to ensure that our teams understand event-driven architectures and microservice principles: why they are important, where the use cases are, and how Confluent fits.

Digital transformation in a challenging economy

Our CTO encourages us to continually question traditional retail and technology approaches and architectures. Our vision is to blend industry-leading, off-the-shelf applications with build-your-own approaches that enable Boden to remain at the forefront of our industry, all while retaining and attracting talented engineers and IT professionals.

In the current economic climate, that vision remains intact, with Boden confident that businesses that are still able to invest in and advance their technology can leap ahead of the competition even before the economy fully recovers. By responding rapidly to market demand and changes in customer retail requirements, Boden can continue to delight our customers.

The capability of our technology platform to enable this is key to our business vision and strategy. Allowing global price adjustments to be set and made available on our websites and third-party marketplaces, and making stock information visible across our estate and to third parties as stock changes occur, are just a couple of examples of how we empower Boden’s growth.

We are still in the process of building this out as we continue to learn and adjust our architecture patterns and standards. However, our vision is being realised, and I expect the pace of change to increase rapidly as we see these new technologies deliver both the business outcomes and the engineering capability needed to be successful.

Where should you start?

For teams just getting started with Kafka, my advice is to roll your sleeves up and have a go at just building something out. One of the great advantages of Confluent Cloud is that you can build out solutions really easily. Mistakes are part of the process; in retrospect, we probably tried to do more with ksqlDB than we should have at the start, but we learned from it. The idea is to make a start and build upon it. That’s what Boden’s founder did in 1991, when he launched the company with just eight menswear products.

Take time out to truly understand event-driven and microservice architectures, and focus on things like domain-driven design. An event-driven architecture enabling this level of responsiveness was not a requirement of our corporate vision back in 1991, but it is today, and I’m excited to be part of the team making it happen.

To learn more, you can check out Streaming Audio, hosted by Tim Berglund, where I talk about scaling microservices using Kafka. To get started on your Kafka journey, you can also sign up for Confluent Cloud with the promo code CL60BLOG and receive an additional $60 of free Confluent Cloud usage.* If you’d like, you can also check out a live demo of Confluent Cloud.

  • Matt Simpson is a solution architect for the market-leading retailer Boden. He is helping Boden with a significant transformation involving the replacement of all legacy systems using a mix of industry-leading, off-the-shelf systems and in-house solutions. Across these, Matt and his team are implementing a microservice and event-driven architecture. He has spent the last five years in enterprise architecture across finance and retail industries, specialising in data architecture with nearly 20 years in BI.

