
Unlocking Industry 4.0: Rise of Smart Factory with Data Streaming

Written by Kai Waehner

In the era of Industry 4.0, the fusion of smart, autonomous systems with digitization is propelling manufacturing and supply chain operations to new levels of efficiency. Industry 4.0, also known as the Fourth Industrial Revolution, marks the normalization of the smart factory. The foundation for this revolution lies in the convergence of the Internet of Things (IoT), cloud computing, analytics, and AI. Two manufacturing giants, Siemens and Brose, have emerged as pioneers in this transformative journey.

Embracing real-time transformation: Siemens’ and Brose’s Kafka adoption

In an interactive discussion leveraging the Confluent Lightboard Studio, Siemens and Brose shared how they initially embraced open source Apache Kafka® to usher in real-time data transformation within their production facilities. As these companies sought to streamline the flow of data from factory floors to business units, the limitations of manual processes became evident. Operating internationally—Siemens in around 200 countries and Brose in over 20—both companies recognized the need for a more efficient and seamless management of their IoT data.

Traditionally, manufacturing environments rely on data historians: tools that collect, store, and retrieve time-series data from industrial equipment, such as production process parameters and machine performance metrics. A comprehensive record of time-stamped data helps optimize manufacturing operations through predictive maintenance and process optimization.
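The core job of a data historian can be sketched in a few lines: store time-stamped samples and retrieve them by time range. The class and field names below are illustrative, not any vendor's API.

```python
from bisect import bisect_left, bisect_right
from dataclasses import dataclass


@dataclass
class Sample:
    timestamp: float  # seconds since epoch
    machine_id: str
    value: float      # e.g., a spindle temperature reading


class DataHistorian:
    """Toy historian: stores time-stamped samples, retrieves by time range."""

    def __init__(self):
        self._samples: list[Sample] = []  # kept sorted by timestamp

    def record(self, sample: Sample) -> None:
        self._samples.append(sample)
        self._samples.sort(key=lambda s: s.timestamp)

    def query(self, start: float, end: float) -> list[Sample]:
        # Binary search over the sorted timestamps for an inclusive window.
        keys = [s.timestamp for s in self._samples]
        return self._samples[bisect_left(keys, start):bisect_right(keys, end)]
```

A time-range query like `historian.query(90.0, 120.0)` is exactly the access pattern predictive maintenance tooling runs against such a store.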

The heart of data streaming is the ability to share real-time data with anyone, regardless of the technology or communication protocol used. Data streaming in manufacturing shop floors enhances data historians. The enablement of real-time updates and continuous monitoring fosters flexibility, scalability, and integration of data sources for adaptive, interconnected systems. This approach ensures timely and accurate insights, supporting improved decision-making and operational efficiency.
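The difference from batch processing is that each event is handled the moment it arrives. As a minimal sketch (plain Python, not a Kafka client; the machine names and threshold are made up), a streaming-style monitor can raise an alert on the very reading that crosses a limit instead of waiting for the next batch run:

```python
from typing import Iterable, Iterator


def monitor(readings: Iterable[tuple[str, float]],
            threshold: float) -> Iterator[str]:
    """Process each reading as it arrives and yield an alert immediately
    when a limit is exceeded, instead of scanning a batch later."""
    for machine_id, temperature in readings:
        if temperature > threshold:
            yield f"ALERT {machine_id}: {temperature:.1f} exceeds {threshold:.1f}"


# In production the iterable would be a live consumer; here it is a list.
alerts = list(monitor([("press-01", 71.0), ("press-02", 93.4)], threshold=90.0))
```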

The decision to move beyond self-managed open source Kafka was strategic for both companies, but they still recognized the value of an intermediary layer for data streaming. Siemens and Brose eventually transitioned to Confluent's offerings, equipped with fully managed connectors.

Confluent provided a comprehensive solution that extends beyond the capabilities of open source Kafka. With Confluent, the complexities of integrating and managing diverse data sources were not only simplified but optimized, aligning seamlessly with the companies' global operations.

Why Siemens made a cloud-based pivot in its data streaming application

For Siemens, the shift to Confluent Cloud was not just strategic but a necessity. Stefan Baer, service manager at Siemens Digital Industries, shared that Siemens’ legacy sales system caused blockages and delays in customer orders. Manual integration of data changes took weeks, involving cumbersome batch processing.

Baer explained the processes required for the data to be “generated on the producing system, packed into a zip file, [and] transferred to the middleware.” After which, the team had to conduct four additional steps, all to transfer the data. 

While the team first addressed this issue via Apache Kafka, Siemens eventually moved to Confluent Cloud. “We had this Siemens cloud-first strategy, which we needed to take into consideration,” Baer said. “The other important part is we did a total cost of ownership analysis with the colleagues from Confluent, and the result was that the Confluent Cloud had the best [cost-benefit] ratio for us.”

Siemens evolved its data streaming application, incorporating new functionalities like Kafka Streams apps and ksqlDB. This not only reduced the load on source systems but significantly improved product master data update times. However, integrating legacy systems and ensuring security remained challenges, albeit ones significantly alleviated by Confluent Cloud.
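The kind of continuous aggregation ksqlDB expresses with `WINDOW TUMBLING (SIZE ...) ... GROUP BY` can be illustrated without a cluster. The sketch below is plain Python, not the Kafka Streams or ksqlDB API; it groups timestamped machine events into fixed tumbling windows and counts events per machine per window:

```python
from collections import defaultdict


def tumbling_window_counts(events, window_size_s):
    """Count (machine_id, timestamp) events per machine per fixed tumbling
    window, mimicking a continuous windowed aggregation: each event lands
    in exactly one window aligned to window_size_s boundaries."""
    counts = defaultdict(int)
    for machine_id, ts in events:
        window_start = ts - (ts % window_size_s)  # align to window boundary
        counts[(machine_id, window_start)] += 1
    return dict(counts)
```

Running such aggregations inside the streaming platform is what offloads the source systems: downstream consumers read the pre-computed table instead of re-querying the system of record.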

Brose: the automotive supplier’s transformation into a data-driven company

The journey for automotive supplier Brose started in 2019 as part of its Future Pro Program. The initiative's aim was to transform via large-scale renewal and restructuring. “In this phase a lot of different IT solutions were established,” said Sven Matuschzik, head of IT platforms and databases at Brose. “There the vision of being a data-driven company was born.”

Over the years, Brose successfully implemented its first use cases on a robust on-premise data streaming platform, that is, computing infrastructure installed and operated on the shop floor. In building that platform, Brose faced challenges in managing a diverse range of machines and equipment. Matuschzik said, “The machine landscape looks sometimes partially very heterogeneous because we are talking about a life cycle of 10 plus years for a machine.”

Although “a lot of other organization hurdles are still there… we have defined a lot of different fields of action to work on this in a continuous new way,” shared Matuschzik. Brose’s strategy of “anything is connected to anything” introduced complexities that required meticulous handling. Zero trust policies require checkpoints and firewalls at each “zone” of data streaming. End-to-end monitoring of machines, connected to IoT and open source data streaming, also requires extensive quality assurance. Matuschzik added that still, “at the end with a good partner, for example, Confluent, you can find solutions and [continue] to develop.”

For Matuschzik, communication in handling data across multiple departments and 45 factories is Brose’s key to success. Brose transitioned away from a centralized SAP database. “Now, with IoT and also with [our] vision to being a data-driven company, we are getting more and more decentralized platforms,” Matuschzik shared.

The Fourth Industrial Revolution’s next frontier: Data sharing and monetization

Even if a consumer does not read data in real time, platforms like Confluent retain the collected information for later consumption. With Confluent, companies can natively share streamed data internally and with external partners, confident in its secure encryption.
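Retention for late consumers boils down to an append-only log that any consumer can replay from its own offset. The sketch below is a toy model in plain Python, not Kafka's actual client API:

```python
class RetainedLog:
    """Toy append-only log: producers append events, and consumers replay
    from any offset later, mimicking how a streaming platform retains
    events for consumers that are not reading in real time."""

    def __init__(self):
        self._events = []

    def append(self, event) -> int:
        self._events.append(event)
        return len(self._events) - 1  # offset assigned to the stored event

    def read_from(self, offset: int) -> list:
        # A late consumer replays everything from its chosen position.
        return self._events[offset:]
```

Because each consumer tracks its own offset, a newly attached analytics job can read the full history while live dashboards keep reading only the newest events.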

For Matuschzik, “data sharing [and] data monetization is also the next big thing next to artificial intelligence. Tesla and the Chinese OEMs are requesting more and more data that we have to share with them.”

Maintaining data quality is paramount in this transition, and organizations are working on improving data filtering and quality control. Security, especially in an industrial environment, remains critical. With data mesh and a zero trust approach, data streaming solutions aim to ensure data security at each point in the communication chain.

The manufacturers’ commitment to robust data filtering, quality control, and security frameworks underscores their transformative journeys. As more industries recognize the strategic value of real-time data sharing and processing, the indispensable role of data streaming solutions becomes evident.

Siemens and Brose have harnessed the full potential of data streaming in IoT and can now explore new avenues for sharing data securely. Of course, challenges remain, though the commitment of these pioneering organizations to overcoming them is an inspiring testament to the future of Industry 4.0.

Learn more about the data streaming adoption of Siemens and Brose and general use cases in the manufacturing industry: 

  • Kai Waehner is Field CTO at Confluent. He works with customers across the globe and with internal teams like engineering and marketing. Kai’s main area of expertise lies within the fields of Data Streaming, Analytics, Hybrid Cloud Architectures, Internet of Things, and Blockchain. Kai is a regular speaker at international conferences such as Devoxx, ApacheCon and Kafka Summit, writes articles for professional journals, and shares his experiences with new technologies on his blog: www.kai-waehner.de. Contact: kai.waehner@confluent.io / @KaiWaehner / linkedin.com/in/kaiwaehner.

