The rise of enterprise integration

Jan. 9, 2025
How our business landscape calls for an architectural rethink.

The real world is deeply interconnected, but businesses still aren’t – witness the global impact of the recent CrowdStrike outage, as well as the Panama Canal drought and other disruptions to prime shipping routes. Only 12% of businesses have integrated data systems, and most operate at a macro level of data integration, with an incomplete picture of events. Business processes are still in silos, subsidiaries across the globe are still in silos, and, more importantly, the data is all still in silos.

This article explains how an event-driven approach to integration can empower business leaders by enabling integration down to the micro level, connecting IoT devices, SaaS applications, legacy applications, and mobile devices so they can seamlessly exchange events in real time.

Tackling Data at the Enterprise Level

A large multi-national enterprise typically comprises thousands of applications, with data inputs spanning hybrid/multi-cloud environments, IoT/mobile devices, and distributed operations.

These data events happen in real time – a customer places an order online, a supplier updates available inventory, a passenger scans a boarding pass, or a sensor detects a sudden temperature change. These events are inherently asynchronous, and each triggers follow-on actions that ripple through departments and operations across the enterprise.

The vast volumes of information in play contain data critical to an enterprise’s day-to-day operations and underscore a dramatic shift in how increasingly globalized business systems are integrated.

The Old Integration Ways Cannot Match the Needs of Today’s Real-Time Business World

Traditional methods such as iPaaS and ESB are still used to “knit” together data across this complex web of disparate systems, but they struggle to keep pace with today’s data-driven landscape. Synchronous, point-to-point solutions weren’t built to handle the ever-growing volumes of real-time data coursing through modern enterprises to an ever-increasing number of applications. The result is a tangle of connectivity that is difficult to understand, fragile, unable to absorb bursts, and generally not robust to failures and outages. That, in turn, means ever-longer application response times and a poor user experience.

Organizations need to adopt an “event-driven” approach to simplify integration, become more real-time, and become more connected.

Event-Driven Data Needs Event-Driven Integration

Event-driven integration offers a new approach to connecting systems: sharing events in real time, the instant they occur.

At its core, whenever an event occurs within a system, a message is published to a central hub called an event broker. Other systems subscribe to this broker, or to a network of event brokers called an event mesh, receiving messages in real time and reacting accordingly. In other words, every system is on the same page. This on-demand, “data as a service” approach unlocks flexibility and scalability that are far more dynamic, responsive, and real-time than traditional methods allow.
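To make the publish/subscribe idea concrete, here is a minimal in-process sketch in Python. It is a toy, not a production broker – real event brokers (Solace PubSub+, Kafka, MQTT brokers, and the like) run as networked services with persistence, filtering, and delivery guarantees – and all the names in it (EventBroker, the topic strings) are illustrative.

```python
from collections import defaultdict
from typing import Callable

class EventBroker:
    """Toy in-process event broker: publishers and subscribers share
    only topic names, never direct references to each other."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        # Register a handler to be invoked for every event on this topic.
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Fan the event out to every subscriber of this topic.
        for handler in self._subscribers[topic]:
            handler(event)

broker = EventBroker()
# Two independent systems react to the same business event.
broker.subscribe("orders/created", lambda e: print("inventory saw:", e))
broker.subscribe("orders/created", lambda e: print("billing saw:", e))
broker.publish("orders/created", {"order_id": 42, "item": "sensor-kit"})
```

The property that matters is the decoupling: the publisher of orders/created has no idea who is listening, so new consumers can be added without touching it.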

The Analyst Community is Already There 

Leading analysts now recognize event-driven integration as a key component in optimizing the real-time movement of business-critical data. Gartner identified this trend in the evolution of IT systems architecture, heralding an “event native” mindset that shifts the view of IT systems from “data (centric) custodians” to the “nervous system” of an enterprise. More to the point, it treats “data in motion”, rather than “data at rest”, as the real source of effective decision-making.

It should be noted that event-driven integration works perfectly well alongside an organization’s existing Integration Platform as a Service (iPaaS), essentially augmenting the iPaaS with an event-driven platform. With the two working in tandem, IT teams can migrate the right information flows incrementally, phasing key business processes into the event-driven model over time.
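As a hedged sketch of what “augmenting the iPaaS” might look like (the handler name and topic here are hypothetical, and it reuses the toy EventBroker from the earlier sketch): an existing iPaaS flow keeps its synchronous contract while an adapter republishes the same payload as an event for new consumers.

```python
# Hypothetical adapter: the existing iPaaS flow posts a webhook here.
# We acknowledge it synchronously, exactly as before, but also
# republish the payload onto the event broker so that new,
# event-driven consumers can react without touching the legacy flow.
def ipaas_webhook_handler(payload: dict) -> dict:
    broker.publish("ipaas/inventory/updated", payload)  # new event path
    return {"status": "accepted"}                       # unchanged sync reply
```

Flows can then migrate one at a time: once every downstream consumer of a flow subscribes to the topic instead of waiting on the iPaaS, that flow has become event-driven.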

IDC noted the same dynamic in “IDC Market Glance - Connectivity Automation, 2Q24” (Shari Lava and Andrew Gens, June 2024): “Another trend that is helping fuel the growth of event-driven architecture (EDA) is the potential to utilize EDA in conjunction with iPaaS to split queuing, avoid bottlenecks, and manage workload and data traffic asynchronously. With an event broker layer, an organization can also ensure they have visibility into the state of queued messages even if there is a sending error.”

Turning Integration Inside Out

An “event-driven” approach means rethinking and re-architecting integration. 

Over the past 20 years, integration has been fairly centralized, built around monolithic applications with many dependencies. Current approaches place integration components in the data path – i.e., at the core. Integration, whether real-time or batch, requires connectivity and transformation, and centralized approaches couple connectors, transformations, mappings, and potentially transactional context into one deployable runtime through which all data must flow synchronously. As a result, the integration component can become a bottleneck, suffering many of the same problems as monolithic applications.
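A caricature of that centralized shape, with stubs standing in for the connectors and the transformation (all names are hypothetical): everything is coupled into one synchronous call path, so the slowest step sets the pace for the whole flow.

```python
import time

def fetch_from_source(record: dict) -> dict:
    time.sleep(0.1)  # stand-in for blocking connector I/O
    return record

def transform(raw: dict) -> dict:
    return {**raw, "normalized": True}  # stand-in for mapping logic

def deliver_to_target(mapped: dict) -> None:
    time.sleep(0.1)  # stand-in for blocking connector I/O

def centralized_integration(record: dict) -> None:
    # Connectivity, transformation, and delivery live in one runtime,
    # and every record flows through it synchronously: if any step
    # stalls, the entire path stalls with it.
    deliver_to_target(transform(fetch_from_source(record)))

centralized_integration({"order_id": 7})
```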

Event-driven integration turns this approach inside out.

It moves integrations and connectors to the edge, with decentralized, real-time data flows and events in the middle. The result is an application and integration architecture that is more agile, scalable, robust, and real-time – much like event-driven microservices.
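Here is the same flow turned inside out, again reusing the toy EventBroker from earlier (topic names illustrative): each connector becomes a thin component at the edge, the only thing in the middle is the event stream, and each stage can scale or fail independently.

```python
# Edge connector: its only job is to turn source data into events.
def source_connector(record: dict) -> None:
    broker.publish("records/raw", record)

# Transformation is just another subscriber that republishes.
broker.subscribe(
    "records/raw",
    lambda raw: broker.publish("records/normalized", {**raw, "normalized": True}),
)

# Target connector: consumes the normalized stream at the far edge.
broker.subscribe("records/normalized", lambda rec: print("target received:", rec))

source_connector({"order_id": 7})
```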

Today’s use cases demand an integration architecture that can handle traffic bursts and slow or offline consumers without impairing performance, scale with growing numbers of producers, consumers, and data volumes, and guarantee delivery even to temporarily unavailable consumers. It must also let you easily plug in new processing components and distribution mechanisms without disturbing the overall design, and easily adopt technologies you had not even imagined before, such as Gen AI agents and LLMs.
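For instance, giving a new component – say, a hypothetical Gen AI agent that needs live enterprise context – access to the order stream is a single subscription, with no change to any producer (again building on the toy EventBroker sketch):

```python
# Hypothetical new consumer: a context feed for a Gen AI agent. It
# taps the existing stream without any change to the order producers.
order_context: list[dict] = []
broker.subscribe("orders/created", order_context.append)
# A real agent would feed a window of order_context into an LLM prompt.
```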

Going Inside Out Unlocks Transformative Business Insights 

Arriving at an optimal event-driven integration implementation does not happen overnight. It’s an evolutionary process that can be measured in four transformative milestones:

· Breaking down data silos

Traditional systems often create data silos, hindering accessibility and informed decision-making. Event-driven integration democratizes information and makes it widely available, allowing businesses to readily access the data they need, when they need it.

· Dealing with unexpected bursts in traffic

The decoupling of components serves as a “shock absorber” that gracefully handles unexpected bursts of traffic, such as sudden spikes in demand that vastly increase order volumes (see the sketch after this list). This decoupling also yields a more robust infrastructure, one that tolerates failures and outages and prevents them from cascading.

· Nurturing innovation at warp speed

Event-driven integration streamlines the process of integrating new applications and services, empowering businesses to innovate faster and stay ahead of the curve. If you used event-driven integration today, I guarantee it would be faster and easier to give your Gen AI applications real-time enterprise context to work from, because you would simply add these new components as subscribers to your existing event distribution architecture.

· Keeping the user’s finger on the pulse

Real-time data integration provides users with a more consistent and up-to-date view of information, leading to a smoother and more efficient experience.
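Here is a minimal sketch of the “shock absorber” idea from the list above, using a plain Python queue as a stand-in for a broker-side buffer: the producer emits a burst instantly, a deliberately slow consumer drains it at its own pace, and nothing is dropped.

```python
import queue
import threading
import time

buffer = queue.Queue()  # stand-in for a broker-side queue

def slow_consumer() -> None:
    # Drains events at its own pace, oblivious to the burst upstream.
    while True:
        buffer.get()       # receive the next event
        time.sleep(0.005)  # stand-in for slow downstream processing
        buffer.task_done()

threading.Thread(target=slow_consumer, daemon=True).start()

start = time.time()
for i in range(200):             # a sudden burst of orders...
    buffer.put({"order_id": i})  # ...enqueues instantly, never blocking
print(f"burst absorbed in {time.time() - start:.3f}s")

buffer.join()                    # every event is eventually processed
print("burst fully drained")
```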

Event-Driven Integration in the Real World: Organizations That Have ‘Thought Out’ of the Traditional Integration Box

Word is getting out: organizations across industries – financial services, manufacturing, retail, and more – are beginning to embrace event-driven use cases.

Heineken, as part of its strategy to become the world’s most connected brewer, implemented an event-driven strategy in which production line events trigger real-time inventory updates and automatic order fulfillment for distributors – at a scale spanning the 350 global and local beer and cider brands it sells in more than 190 countries.

Leading German grocery chain EDEKA leverages an event-driven approach to modernize its supply chain and merchandise management systems, replacing synchronous batch updates between siloed systems with real-time data sharing. Powered by a continuous event-driven flow of information, EDEKA now provides a better shopping experience for its customers.

A recent IDC InfoBrief found that 82% of survey respondents have either begun adding, or plan to add, two to three event-driven use cases in the near future.

Rethinking Integration Architecture to Meet Today’s Needs

Increasing data volumes and connectivity levels, amid shifting consumption models and customer expectations, are changing how large organizations must architect critical information flows across their businesses. Traditional integration approaches can no longer support companies that must meet customer, employee, and supplier requirements in real time.

Event-driven integration offers multinational businesses a path to modernizing their integration strategy and empowering them to be more adaptable, scalable, and robust in a landscape that will only continue to become more digitized and real-time.

About the Author

Shawn McAllister | Chief Technology Officer & Chief Product Officer at Solace.

Shawn McAllister is the Chief Technology Officer & Chief Product Officer at Solace. He is responsible for the strategy and delivery of the Solace PubSub+ Event Streaming and Management Platform, leading a team of incredibly talented engineers and architects in this endeavor.

Before joining Solace, Shawn led software, hardware, and test engineering teams at Newbridge Networks (later Alcatel Canada), where he was responsible for developing features on ATM and Ethernet switches and the 7750 Multiservice IP Router. Shawn holds a Bachelor of Mathematics from the University of Waterloo, majoring in Computer Science and Combinatorics/Optimization.