The worldwide growth of event-driven architecture (EDA) is a win-win scenario; just look at the facts. In a recent IDC Infobrief, 93% of companies that have deployed EDA across multiple use cases said EDA has met or exceeded their expectations, and 82% of IT leaders plan to apply EDA to 2-3 new use cases within the next 24 months.
Use cases are growing as more organizations realize the value of decoupling applications, which makes them more responsive to user requests, reduces runtime dependencies, and ultimately gives enterprises the freedom to operate in real time.
But more event streaming use cases mean ever-greater data volumes. They also mean more brokers, projects, and products being used to stream events: an organization could be running open source Apache Kafka for one use case, a commercial platform such as Confluent for another, and Amazon MSK elsewhere.
While it is good news to see data processed in real time, many mature EDA adopters are still not realizing the best return on their investment. Often a stream is consumed just once, a one-to-one exchange that leaves its reuse value untapped. Or an organization reaches a stage of maturity with multiple streaming use cases but is left with a tangled web of brokers, clusters, topics, and schemas woven on top of each other, making it almost impossible to spot where returns could be improved.
These are growing pains, typical of any evolving IT standard, and EDA is no exception.
The Pivotal Role of an Event Portal
The real value of EDA lies in being able to easily discover and then reuse all existing real-time data assets, and that can only be done if they are properly audited, managed, and governed. This requires an event management tool, such as an event portal, that will discover the real-time data assets and provide a single source of truth to manage event streams more effectively.
Here are three common pain points for organizations that find themselves on the path to event-driven maturity, and how the introduction of an event portal can help overcome them.
1. Poor visibility into your event streaming estate – you need a portal with a view
Application decoupling is great at runtime, but the resulting lack of visibility on both the publish and subscribe sides can cause issues when making changes to existing applications: producers don't know who their consumers are, and consumers don't know who the publishers are.
For example, how do you know who the downstream consumers are when you need to notify them of impending changes, or even of an application being decommissioned? And from the other side, a consumer, the person using the application daily, may need an extra attribute added to the data but has no idea whom to contact, because they can't see who produces it.
An event portal offers a single window into an event-streaming ecosystem. It provides a native discovery agent to scan, for instance, a Kafka cluster and its schema registry to produce a visual representation of every topic and its schema(s) across multiple versions – and, importantly, who its consumers are.
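For a sense of what that first discovery pass involves at the broker level, here is a minimal sketch using the confluent-kafka Python client to enumerate topics and consumer groups; the broker address is a placeholder, and a real portal's discovery agent goes much further, correlating schema versions and consumer relationships on top of this raw inventory.

```python
# Minimal sketch: enumerate topics and consumer groups on a Kafka
# cluster, roughly the first pass a discovery agent would make.
# The broker address is a placeholder; assumes confluent-kafka >= 2.x.
from confluent_kafka.admin import AdminClient

admin = AdminClient({"bootstrap.servers": "localhost:9092"})

# Cluster metadata: every topic and its partition count.
metadata = admin.list_topics(timeout=10)
for name, topic in sorted(metadata.topics.items()):
    print(f"topic={name} partitions={len(topic.partitions)}")

# Consumer groups registered with the cluster -- the "who are the
# consumers" half of the picture.
groups = admin.list_consumer_groups().result(timeout=10)
for group in groups.valid:
    print(f"consumer group={group.group_id} state={group.state}")
```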
Case in point – getting a health check on event streaming
Freeus, LLC, is a global manufacturer of mobile medical alerts and personal safety systems. After decomposing its monolithic applications into event-driven microservices that communicate with each other via Apache Kafka, the company found itself struggling to understand and manage its infrastructure and information flows.
Specifically, they needed to see the microservices affected by a given change before deploying a new feature or function to ensure it wouldn’t bring that system down – even for as few as five seconds.
Using an event portal, Freeus can now automatically scan its system and visualize a complete map of endpoints and event streams instead of manually diagramming event streams between microservices. Today the Freeus development team can deploy new services much more quickly, shipping a new release every two weeks in 2023 versus just four releases in all of 2022.
2. Limited sharing and reuse – event streaming must live up to its hype
Let’s extrapolate the poor-visibility issue from one application to the entire enterprise.
For instance, if you've been using Kafka for several years, especially across many departmental or application-specific use cases, your Kafka cluster is now a treasure trove of real-time data. But data is most valuable at the moment it’s produced. As Forrester puts it: “Data is, without a doubt, valuable. But when stored in vaults and locked down, it is not.” Siloed event streaming data means other departments, decision-makers, customers, and partners don't know about it, so it's not being shared or reused to its full potential.
Typically, developers don't have anywhere to go to find this treasure trove. Some have resorted to documenting what exists in Confluence or wiki pages, SharePoint sites, or Word documents. Noble intentions, but without automated discovery and updates, this information quickly goes stale.
Again, this is where an event portal makes a difference, providing a perpetually up-to-date catalog detailing all topics, event streams, schemas, and pub/sub interfaces for each application, along with owners, points of contact, and changes for each managed EDA entity. This expedites development by letting developers easily share, discover, and reuse any existing Kafka or event streaming asset, both inside and outside the organization.
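To illustrate the raw material such a catalog is built from, the sketch below lists every subject and its versions in a Confluent-compatible schema registry, again via the confluent-kafka Python client; the registry URL is an illustrative assumption, and a portal layers owners, contacts, and change history on top of an inventory like this.

```python
# Minimal sketch: enumerate schema subjects and their versions from a
# Confluent-compatible schema registry -- the raw inventory a catalog
# is built on. The URL is a placeholder assumption.
from confluent_kafka.schema_registry import SchemaRegistryClient

registry = SchemaRegistryClient({"url": "http://localhost:8081"})

for subject in registry.get_subjects():
    versions = registry.get_versions(subject)
    latest = registry.get_latest_version(subject)
    print(f"subject={subject} versions={versions} "
          f"type={latest.schema.schema_type}")
```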
Case in point – opening whole new opportunities to do more with your data
For a real-world example of this in action, look at the Federal Aviation Administration and its SWIM (System Wide Information Management) infrastructure that distributes real-time information to FAA systems across the United States.
Through secure gateways, external partners of the FAA can tap into the flow of events. Airlines and other industry partners that need the SWIM data get it in real time, without perpetually requesting updates. Whether it’s the availability of gates at an airport, the position of planes in the sky, or the weather for a region, external partners have the latest information without needing to ask.
3. Inability to effectively secure and govern event streams
The decentralized and dynamic nature of event-driven systems introduces unique security challenges. Event streaming platforms do include access control mechanisms, but a common trade-off is that developers err on the side of being too permissive in the name of agility. If streams are not visible, properly cataloged, and kept current, data security, governance, and compliance all become problems.
This issue only worsens with time as applications and data usage evolve: every producer or consumer added to the system creates a new potential vulnerability.
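To make the trade-off concrete, the sketch below shows the narrow alternative to a wildcard grant: one principal allowed to read one topic via Kafka's admin API. The topic name, principal, and broker address are hypothetical placeholders, and in practice a portal would drive this kind of policy from its catalog rather than from ad hoc scripts.

```python
# Minimal sketch: replace a "read everything" habit with a narrow ACL
# that allows one principal to read one topic. Topic, principal, and
# broker address are illustrative placeholders; assumes a cluster with
# an authorizer enabled and confluent-kafka >= 1.9.
from confluent_kafka.admin import (
    AclBinding, AclOperation, AclPermissionType, AdminClient,
    ResourcePatternType, ResourceType,
)

admin = AdminClient({"bootstrap.servers": "localhost:9092"})

binding = AclBinding(
    ResourceType.TOPIC, "orders", ResourcePatternType.LITERAL,
    "User:orders-consumer", "*",          # principal and host
    AclOperation.READ, AclPermissionType.ALLOW,
)

# create_acls() returns one future per binding; result() raises if the
# broker rejected the grant.
for acl, future in admin.create_acls([binding]).items():
    future.result()
    print(f"granted: {acl}")
```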
Become proactive, not reactive – let the software do the talking
Using an event portal designed to give organizations visibility into and control over their event streaming can turn security from reactive to proactive. Users can organize systems into application domains, create and import payload schema definitions in a variety of formats including AsyncAPI, define event interactions between applications and microservices, and create events and associated topic addresses using proven topic-structure practices.
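For reference, an AsyncAPI payload definition of the kind mentioned above can be as small as the following sketch, which describes one hypothetical channel and the event it carries; the channel address and field names are illustrative, not drawn from any particular system.

```yaml
# Minimal AsyncAPI 2.6 document describing one hypothetical channel
# and its event payload; all names are illustrative placeholders.
asyncapi: '2.6.0'
info:
  title: Orders Service
  version: '1.0.0'
channels:
  orders/created:
    subscribe:
      message:
        name: OrderCreated
        payload:
          type: object
          properties:
            orderId:
              type: string
            amount:
              type: number
```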
Ultimately the portal lets teams govern who can access which resources, create and track every version of each EDA object as it evolves from cradle to grave, and actively promote new versions through development, staging, and production environments. With the right visibility, administrators can ensure security, governance, and compliance with internal policies and government regulations.
Unlock the Power of EDA
The success of real-time streaming and event-driven architecture has brought with it the need to address the rapidly growing complexity of event streaming estates. A single event portal that spans multiple brokers is the solution, helping organizations discover, govern, and manage the lifecycle of their real-time event streams across the enterprise, and it will only become more important as more organizations embrace EDA as a foundational platform.
Jonathan Schabowsky is the Field CTO at Solace, where he helps companies capitalize on event-driven architecture to make the most of their microservices and deploy event-driven applications into platform-as-a-service (PaaS) environments in the cloud and on premises. He is an expert at architecting large-scale, mission-critical enterprise systems, with over a decade of experience designing, building, and managing them in domains such as air traffic management (FAA), satellite ground systems (GOES-R), and healthcare.
Based on that experience with the practical application of EDA and messaging technologies, and some painful lessons learned along the way, Jonathan conceived and has helped spearhead Solace’s efforts to create powerful new tools that help companies more easily manage enterprise-scale event-driven systems, including the company’s new event management product: PubSub+ Event Portal.
Jonathan is highly regarded as a speaker on event-driven architecture, having presented at SpringOne, Kafka Summit, and API Specs conferences. He holds a BS in Computer Science from Florida State University, and in his spare time he enjoys spending time with his family and skiing the world-class slopes of Utah, where he lives.