How Integrators can Rise with the Digital Tide

March 9, 2023
Edge computing can lay the foundation for business success in a new age of cloud computing, data analysis and emerging services

This article originally appeared in the March 2023 issue of Security Business magazine.

Digital transformation has unfolded steadily over the past 30 years: first from paper to digital, then from analog to Ethernet. Now the final stage appears to have arrived: hardware to cloud.

Each phase brought an eventual paradigm shift to the integration business, and this third phase is no exception. Integrators are now looking at cloud-based and subscription services as the new frontier of the security business, and one of the most critical issues they must address is building a robust infrastructure to support these emerging services.

That infrastructure can be found in the form of edge computing, and once it is fully established, it can be integrated with traditional hardware – such as sensors and cameras – to truly harness the power of data and intelligence.

The Rise of the Virtualized Environment

Over the past decade, the sheer speed at which systems and civilization interact has become, frankly, scary. We have reached a level of interaction that seems to have put us in a cloud frenzy. This digital tide has become a tempest, much of it without any controls or standards beyond those defined by NIST and other regulatory organizations. It sounds scary, but it is also one of the most exciting times to see how humans will interact with this virtualized environment.

Security is one industry that has found it difficult to embrace these technology shifts, especially when they grow beyond the control of a human “overseer.” Unfortunately, this puts those in control of our industry in diametric opposition to the rest of the digital tide.

The world is no longer one-dimensional, and because of that, it is impossible to define threats and vulnerabilities with old-fashioned, cloak-and-dagger logic alone. That tradecraft is still incredibly important to have; but anyone who cannot rely on correlated event data from inference at the edge, tied to secure cloud infrastructure, is fooling themselves or others into believing they can define those threats.

It is in this converged and interconnected world that the security professional now lives. There is no choice but to address it or simply become a dinosaur.

“As customers leverage network and computing assets, multiple sensors and management systems, real-time data and metadata, and emerging analytics, the next evolutionary step is to provide data-driven insights with applicability in security operations and throughout the enterprise,” explains Tom Gates, director of sales and marketing at surveillance radar manufacturer Observation Without Limits. “The mission is to enable customers to move from being reactive to predictive, which enables them to make their businesses more resilient from top to bottom.”

The digital tide has become a tsunami tied to distributed edge infrastructure. The growth of edge computing has made it possible to classify data as critical or non-critical, which has brought greater efficiency in detection, prevention, and response.

“The digital tide is really about sensor fusion,” says PK Gupta, CEO and founder of Megh Computing. “We can now acquire disparate fields of metadata from multiple sources stored over time to identify correlated events and tie them to outcomes required for our customers.”

Edge Computing: The Foundation of the Digital Transformation

Cloud computing has come to define the modern tech era. Everything from enterprise data analysis to gaming runs on the cloud, yet traditional offerings may not be sufficient for much longer. As digitization continues to ramp up, the need for the distributed edge grows. Edge computing is not an alternative to the cloud but an extension of it; the shift to the edge has already begun, and it could revolutionize cloud computing.

Edge computing is the deployment of computing and storage resources at the location where data is produced. This ideally puts compute and storage at the same point as the data source at the network edge. For example, a small enclosure with several servers and some storage might be installed atop a wind turbine to collect and process data produced by sensors within the turbine itself.

In another example, a railway station might place a modest amount of compute and storage within the station to collect and process data from myriad track and rail-traffic sensors. The results of any such processing can then be sent to another data center for human review, archiving, and merging with other data for broader analytics.

Why is edge computing important? Computing tasks demand suitable architectures, and the architecture that suits one type of task does not necessarily fit all. Edge computing has emerged as a viable and important architecture for deploying computing and storage resources closer to – or ideally, in the same physical location as – the data source.

Distributed computing models are not new, but decentralization can be challenging – demanding high levels of monitoring and control that are easily overlooked when moving away from a traditional centralized computing model. Edge computing has become relevant because it offers an effective solution to emerging network problems associated with moving the enormous volume of data that today’s organizations produce and consume. It is not just a problem of amount; it is also a matter of time – as applications depend on processing and responses that are increasingly time sensitive.

Edge computing addresses three principal network limitations: bandwidth, latency, and congestion.

Bandwidth is the amount of data that a network can carry over time. All networks have limited bandwidth, and the limits are more severe for wireless communication. Latency is the time needed to send data between two points on a network. It can delay analytics and decision-making and reduce a system’s ability to respond in real time. Finally, the volume of data involved with tens of billions of devices can overwhelm the internet, causing high levels of congestion and forcing time-consuming data retransmissions. Network outages can exacerbate congestion and even sever communication to some users, rendering the Internet of Things useless for the duration.
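
To put rough numbers on the bandwidth and latency problem, the sketch below compares shipping raw sensor data to a distant cloud against sending only an edge-processed summary. All of the figures (payload sizes, link speed, round-trip time) are illustrative assumptions, not benchmarks:

```python
# Back-of-the-envelope comparison: shipping raw sensor data to a distant
# cloud versus forwarding an edge-processed summary. Figures are illustrative.

def transfer_seconds(payload_mb: float, bandwidth_mbps: float, rtt_ms: float) -> float:
    """Time to move a payload: serialization delay plus one round trip."""
    return (payload_mb * 8) / bandwidth_mbps + rtt_ms / 1000

raw_mb = 500.0      # assumed raw camera/sensor data per analysis window
summary_mb = 2.0    # assumed size after local filtering and aggregation

cloud = transfer_seconds(raw_mb, bandwidth_mbps=100, rtt_ms=80)
edge = transfer_seconds(summary_mb, bandwidth_mbps=100, rtt_ms=80)

print(f"raw data to cloud: {cloud:6.2f} s")   # ~40 s on a 100 Mbps link
print(f"edge summary:      {edge:6.2f} s")    # well under a second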

By deploying servers and storage where the data is generated, edge computing can operate many devices over a much smaller and more efficient LAN where ample bandwidth is used exclusively by local data-generating devices, making latency and congestion virtually nonexistent. Edge computing can also help optimize network performance by measuring performance for users across the internet and then employing analytics to determine the most reliable, low-latency network path for each user’s traffic.
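
That last idea can be reduced to a simple probe: time a connection to each candidate endpoint and steer traffic toward the fastest one. The sketch below is a minimal version of this; the hostnames are placeholders, and a real deployment would average repeated measurements rather than trust a single connect:

```python
# Minimal sketch: pick the lowest-latency endpoint by timing TCP connects.
# The endpoint addresses are placeholders, not real infrastructure.
import socket
import time

CANDIDATES = [("edge-a.example.net", 443), ("edge-b.example.net", 443)]

def connect_ms(host: str, port: int, timeout: float = 2.0) -> float:
    """Return the TCP connect time in milliseconds, or infinity on failure."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return (time.monotonic() - start) * 1000
    except OSError:
        return float("inf")

best = min(CANDIDATES, key=lambda ep: connect_ms(*ep))
print("lowest-latency endpoint:", best)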

Use-Case Benefits of Edge Computing

Edge computing addresses the vital infrastructure challenges of bandwidth limitations, excess latency, and network congestion, but several additional benefits can also make the approach appealing in specific use-case scenarios:

Manufacturing: An industrial manufacturer deployed edge computing to monitor manufacturing, enabling real-time analytics and machine learning at the edge to find production errors and improve quality. Edge computing supported the addition of environmental sensors throughout the manufacturing plant, providing insight into how each component is assembled and stored. The manufacturer now makes faster and more accurate business and operations decisions.

Farming: Sensors enable a farming operation to track water use and nutrient density and to determine the optimal harvest time. Data is collected and analyzed to gauge the effects of environmental factors and to continually improve the crop-growing algorithms, ensuring crops are harvested in peak condition.

Workplace safety: Edge computing can combine and analyze data from on-site cameras, employee safety devices and various other sensors to help businesses oversee workplace conditions or ensure that employees follow established safety protocols. This is particularly relevant to remote or unusually dangerous workplaces, such as construction sites or oil rigs.

Healthcare: The healthcare industry has dramatically expanded the amount of patient data collected from devices, sensors, and other medical equipment. That enormous data volume requires edge computing to apply automation and machine learning to the data, ignoring “normal” readings and flagging problem data so patients can avoid health incidents in real time (a minimal sketch of this kind of filtering appears after this list).

Transportation: Autonomous vehicles require and produce anywhere from 5 to 20 TB of data per day, gathering information about location, speed, vehicle condition, road conditions, traffic conditions and other vehicles. The data must be aggregated and analyzed in real time while the vehicle is in motion, which requires significant onboard computing. In essence, each autonomous vehicle becomes an “edge.”

Retail: These businesses produce enormous data volumes from surveillance, stock tracking, sales data, and other real-time business details. Edge computing can help analyze this diverse data to identify business opportunities, such as an effective endcap or campaign, predict sales, and optimize vendor ordering.

Remote locations: Edge computing is useful where connectivity is unreliable, or bandwidth is restricted because of a site’s environmental characteristics. Examples include oil rigs, ships at sea, remote farms, forests, deserts, and other remote locations. By processing data locally, the amount of data to be sent can be vastly reduced, requiring far less bandwidth or connectivity time than might otherwise be necessary.

Privacy and legal compliance: Data’s journey across national and regional boundaries can pose additional problems for data security, privacy, and other legal issues. Edge computing can be used to keep data close to its source and within the bounds of prevailing data sovereignty laws, such as GDPR. Raw data can be processed locally, obscuring or securing any sensitive data before sending anything to the cloud or primary data center.
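
The healthcare and privacy entries above share one pattern: filter at the source, and strip anything sensitive before data leaves the site. The sketch below illustrates that pattern in miniature; the vital-sign thresholds, field names, and identifier list are assumptions made for the example, not a clinical or regulatory standard:

```python
# Minimal sketch of edge-side filtering and redaction: in-range vitals are
# dropped locally; out-of-range readings are stripped of identifiers before
# anything is forwarded to the cloud. Thresholds and fields are illustrative.

NORMAL_RANGES = {"heart_rate": (50, 110), "spo2": (92, 100)}

def needs_attention(reading: dict) -> bool:
    """True if any monitored vital falls outside its assumed normal range."""
    return any(
        not (lo <= reading[name] <= hi)
        for name, (lo, hi) in NORMAL_RANGES.items()
        if name in reading
    )

def redact(reading: dict) -> dict:
    """Remove direct identifiers before the reading leaves the site."""
    return {k: v for k, v in reading.items() if k not in ("patient_name", "ssn")}

readings = [
    {"patient_name": "A. Patient", "heart_rate": 72, "spo2": 98},
    {"patient_name": "B. Patient", "heart_rate": 134, "spo2": 90},
]

to_forward = [redact(r) for r in readings if needs_attention(r)]
print(to_forward)  # only the abnormal, de-identified reading is sent upstream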

Edge Computing Deployment Best Practices

Although edge computing has the potential to provide compelling benefits across a multitude of use cases, the technology is far from foolproof. Beyond traditional network limitations, there are several key considerations that can affect the adoption of edge computing:

Defining its capability: Deploying an infrastructure at the edge can be effective, but the scope and purpose of the edge deployment must be clearly defined – even an extensive edge computing deployment serves a specific purpose at a predetermined scale using limited resources and few services.

Connectivity: Edge computing overcomes typical network limitations, but even the most forgiving edge deployment requires some minimum level of connectivity. It is critical to design an edge deployment that accommodates poor or erratic connectivity and to consider what happens at the edge when connectivity is lost (a store-and-forward sketch appears after this list).

Security: IoT devices are notoriously insecure, so it is vital to design an edge computing deployment that will emphasize proper device management, such as policy-driven configuration enforcement, as well as security in the computing and storage resources. This includes factors such as software patching and updates, as well as data encryption at rest and in flight.

Data lifecycles: The perennial problem with today’s data glut is that so much of it is unnecessary. Consider a medical monitoring device: the problem data is critical, and there is little point in keeping terabytes of normal patient data. A business must decide which data to keep and which to discard, and data that is retained must be protected in accordance with business and regulatory policies.
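
The store-and-forward approach mentioned under Connectivity can be sketched in a few lines: buffer events on local disk while the uplink is down, then replay them in order once it returns. The buffer path and the send_upstream() transport below are placeholders, not a prescribed design:

```python
# Minimal store-and-forward sketch: when the uplink is down, events are
# appended to a local buffer file and flushed once connectivity returns.
import json
from pathlib import Path

BUFFER = Path("/var/spool/edge-buffer.jsonl")  # placeholder location

def send_upstream(event: dict) -> bool:
    """Placeholder transport; return False to simulate a dead uplink."""
    return False

def record(event: dict) -> None:
    """Try to send; on failure, append the event to the local buffer."""
    if not send_upstream(event):
        with BUFFER.open("a") as f:
            f.write(json.dumps(event) + "\n")

def flush() -> None:
    """On reconnect, replay buffered events in order, keeping any that fail."""
    if not BUFFER.exists():
        return
    pending = [json.loads(line) for line in BUFFER.read_text().splitlines()]
    remaining = [e for e in pending if not send_upstream(e)]
    BUFFER.write_text("".join(json.dumps(e) + "\n" for e in remaining))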

Planning the Best Edge Computing Environment for a Customer

Edge computing is a straightforward idea that might look easy on paper, but developing a cohesive strategy and implementing a sound deployment can be challenging.

The first vital element of any successful technology deployment is the creation of a meaningful business and technical edge strategy. This strategy is not about picking vendors or gear; it is about understanding the “why”: building a clear picture of the technical and business problems the organization is trying to solve, such as overcoming network constraints or observing data sovereignty.

Such strategies might start with a discussion of just what the edge means, where it exists for the business and how it should benefit the organization. Edge strategies should also align with existing business plans and technology roadmaps – for example, if the business seeks to reduce its centralized data center footprint, then edge and other distributed computing technologies might be a good fit.

As the project moves closer to implementation, it is important to evaluate hardware and software options carefully. There are many vendors in the edge computing space, including ADLINK Technology, Cisco, Amazon, Dell EMC and HPE. Each product offering must be evaluated for cost, performance, features, interoperability, and support. From a software perspective, tools should provide comprehensive visibility and control over the remote edge environment.

The actual deployment of an edge computing initiative can vary dramatically in scope and scale, ranging from some local computing gear in a battle-hardened enclosure atop a utility pole, to a vast array of sensors feeding a high-bandwidth, low-latency network connection to the public cloud. No two edge deployments are the same, and it is these variations that make edge strategy and planning so critical to project success.

Maintenance and Monitoring

An edge deployment demands comprehensive monitoring. Remember that it might be difficult or even impossible to get IT staff to the physical edge site, so these deployments should be architected to provide resilience, fault-tolerance, and self-healing capabilities.

Monitoring tools must offer a clear overview of the remote deployment, enable easy provisioning and configuration, offer comprehensive alerting and reporting, and maintain security of the installation and its data. Edge monitoring often involves an array of metrics and KPIs, such as site availability or uptime, network performance, storage capacity and utilization, and compute resources.
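
A lightweight heartbeat that reports a few of those KPIs is often the starting point for edge monitoring. The sketch below uses only the Python standard library; the metric names and the selection of KPIs are illustrative assumptions:

```python
# Minimal health-heartbeat sketch for an edge site, standard library only.
# Metric names and the KPI selection are illustrative assumptions.
import json
import os
import shutil
import time

BOOT = time.monotonic()

def heartbeat() -> dict:
    """Collect basic site KPIs: process uptime, disk utilization, CPU load."""
    disk = shutil.disk_usage("/")
    return {
        "uptime_s": round(time.monotonic() - BOOT, 1),
        "disk_used_pct": round(100 * disk.used / disk.total, 1),
        "load_1m": os.getloadavg()[0],  # POSIX only
        "ts": time.time(),
    }

print(json.dumps(heartbeat()))  # ship this to the central monitoring tool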

No edge implementation would be complete without a careful consideration of edge maintenance. This falls into four key areas:

1. Security: Physical and logical security precautions are vital and should involve tools that emphasize vulnerability management and intrusion detection and prevention. Security must extend to sensor and IoT devices, as every device is a network element that can be accessed or hacked, presenting a bewildering number of possible attack surfaces.

2. Connectivity: Provisions must be made for access to control and reporting – even when connectivity for the actual data is unavailable. Some edge deployments use a secondary connection for backup connectivity and control.

3. Management: The remote and often inhospitable locations of edge deployments make remote provisioning and management essential. IT managers must be able to see what is happening at the edge and be able to control the deployment when necessary.

4. Physical maintenance: IoT devices often have limited lifespans, requiring routine battery and device replacements. Gear fails and eventually needs repair or replacement, so practical site logistics must be part of the maintenance plan.

About the Author

Pierre Bourgeix | CEO of ESI Convergent

Pierre Bourgeix is Chief Executive Officer of ESI Convergent, a management consulting firm focused on helping companies assess and define the use of people, processes, and technology within the physical security and cybersecurity arenas. He has spent 30 years as a global security consultant and innovator through his experience with Rand Corporation, U.S. State Department, ADT/Tyco Security, HySecurity, Wallace International, SecureState, and Boon Edam.

https://www.esiconvergent.com
