Interoperability, Sense and Sensibility: The Foundation of a Safe & Smart City

Aug. 16, 2019
STE July/August 2019 Cover Story: The use of sophisticated sensor technology and unified communication spurs rapid expansion of the smart city ecosystem

Back in the 1960s, Hanna-Barbera Productions got a good deal right with its futuristic Jetsons cartoon series, about 50 years before smart city technology became a reality.

Although the flying car is currently in park, mainstream autonomous vehicles are coming soon. Advanced Driver Assistance Systems (ADAS) are now so popular that they are base features in dash cams, and many new-model vehicles already display “time-to-green-signal” and congestion alerts as part of a suite of Vehicle-to-Everything (V2X) interoperability services already here in Safe Cities.

Robots help us vacuum indoors and perform outdoor lifesaving missions via Unmanned Aerial Systems (UAS). The smartwatch is now a necessary lifestyle accessory and is the delivery point for Safe City citizen-focused programs like Wireless Emergency Alerts (WEA). Based upon the success of the Amber and Silver Alerts and using the same notification system, the Blue Alert, now active in 34 states, provides the means to speed the apprehension of violent criminals who kill or seriously injure local, state or federal law enforcement officers.

IDC’s Worldwide Semiannual Smart Cities Spending Guide, a comprehensive regional and global spending forecast for smart city initiatives for the period spanning 2019 through 2023, predicts that more than 50% of all worldwide outlay on smart city projects will be directed toward smart grids, infrastructure upgrades like 5G, intelligent transport projects and public safety initiatives driven by data analysis.

Where are the Smart City Standards? 

Unfortunately, like many other industries, safe/smart cities have chosen to forge ahead in the hope that each of the six device categories/steps (read further) will be deployed using that category’s current standards.

The Open Geospatial Consortium (OGC), with support from the US Department of Homeland Security (DHS), is currently leading a process to create a Smart City Interoperability Reference Architecture (SCIRA).

But let’s first concern ourselves with selecting the best-of-best, most easily standardized six areas of our safe/smart city solution. In order to support multi-service networks, it is sensible for a municipality to use the same communications infrastructure for a wide variety of current smart city applications, such as smart lighting, waste management or smart parking solutions, as well as applications that will only emerge at some point in the future.

Six Steps to Successfully Specifying Safe, Smart and Interoperable Sensors

There are numerous types of Safe/Smart City solutions and devices, adapted for different cities, with multiple service-specific platforms, creating a level of fragmentation that results in all kinds of interoperability challenges. Technology must, therefore, be adapted to operate efficiently within a connected smart city ecosystem to more fully realize its value.

1.   Sensor data types – what data will be processed and streamed from the sensor (AKA a network’s “edge” device) to the acquisition platform (cloud, video management system, physical security information management system, or security enterprise information management system). Examples include IP video cameras (video, audio, metadata, analytics process stream), acoustic sensors (natural language, acoustic signature, sound level), LiDAR (object detection, recognition, size and elevation via point cloud), wireless access points (WAP: MAC address, ESN, location-based services) and thermal imaging (temperature range, object detection).

What’s trending? Use of all the above in varying quantities to improve detection and lower the average sensor cost, AKA “Differential Security™.” For example, transportation agencies are substituting LiDAR for IP video cameras to detect vehicles in restricted lanes, mixed classes of vehicles, wrong-way drivers, average speed and pedestrian behavior. WAPs are increasingly performing “double duty” to detect occupancy indoors and crowd density outdoors. Acoustic sensors are growing in use for gunshot detection; however, AI-based weapons detection from companies like Athena Security now offers an early-detection alternative and can be deployed on most IP video cameras.
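
To make step 1 concrete, here is a minimal sketch, in Python, of how a specification team might catalog each edge device and the data it streams to the acquisition platform. The device names, field names and destinations are hypothetical and not drawn from any vendor’s SDK.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class EdgeSensor:
    """Catalog entry for one edge device and the data it streams upstream."""
    name: str                    # e.g., "intersection-cam-01" (hypothetical)
    sensor_type: str             # "ip_camera", "acoustic", "lidar", "wap", "thermal"
    data_streams: List[str] = field(default_factory=list)   # what is actually transmitted
    destination: str = "vms"     # "cloud", "vms", "psim" or "seim"

# Example catalog mirroring the data types listed above
catalog = [
    EdgeSensor("intersection-cam-01", "ip_camera",
               ["video", "audio", "metadata", "analytics"], "vms"),
    EdgeSensor("plaza-lidar-02", "lidar",
               ["object_detection", "classification", "point_cloud"], "cloud"),
    EdgeSensor("garage-wap-07", "wap",
               ["mac_address", "esn", "location"], "psim"),
]

for s in catalog:
    print(f"{s.name}: {s.sensor_type} -> {s.destination} ({', '.join(s.data_streams)})")
```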

2.   Infrastructure – Cloud, Server, Microcomputer, Self-contained

What’s trending? At ISC West 2019, SIW interviewed Verkada and Camcloud, both launching IP video camera-to-cloud solutions offering analytics processing, with Verkada also restreaming different resolutions to meet multiple user requirements, from the cloud to mobile platforms and display workstations. SIW also interviewed Athena Security, whose camera analytics app detects weapons; the detection is processed by a cloud service that automatically dispatches alerts.

3.    Processing requirements, location (edge, edge neural network, cloud, mixed)

What’s trending? At CES 2019, SIW visited Ambarella, which demonstrated the importance of leveraging the most powerful and thermally efficient sensor AI processor, so that multiple data visualizations are possible: colorizing potential threats and objects, and decoding not only vehicle license plates in real time but also vehicle make, model, speed and number of passengers.
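
As a design aid for step 3, here is a minimal sketch of how a designer might decide where each analytic runs. The latency and bandwidth thresholds are illustrative assumptions, not vendor or carrier rules.

```python
def place_processing(latency_budget_ms: float, uplink_mbps: float,
                     stream_mbps: float, model_fits_on_edge: bool) -> str:
    """Pick a processing location for one analytic: edge, edge neural network, cloud or mixed.
    Thresholds are illustrative only."""
    if latency_budget_ms < 100 and model_fits_on_edge:
        return "edge neural network"     # real-time alerts must stay local
    if stream_mbps > uplink_mbps * 0.8:
        return "edge"                    # the uplink cannot carry the raw stream
    if latency_budget_ms < 500:
        return "mixed"                   # detect at the edge, classify in the cloud
    return "cloud"

# A license plate/threat-colorization analytic with an 80 ms budget on a 20 Mbps uplink
print(place_processing(latency_budget_ms=80, uplink_mbps=20,
                       stream_mbps=8, model_fits_on_edge=True))   # -> "edge neural network"
```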

4.    Sensor power – low-power battery (see 0G), Power over Ethernet (up to 100W), solar, and regenerated power (as from electric vehicles)

What’s trending? At CES 2019, Ring introduced the latest version of its Stick Up Cam, capable of processing outdoor motion alarms for three weeks (tested by this author, not compensated by the vendor) before a battery recharge. 802.3bt’s higher 100W power level permits WAP devices to become power sources for many sensors, reducing cabling cost and labor.
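
A quick budget check makes step 4 tangible. The sketch below uses nominal per-port sourcing limits for the PoE standards mentioned above; verify the actual limits against the switch or injector datasheet before specifying, and treat the attached loads here as made-up examples.

```python
# Nominal per-port PSE sourcing limits in watts; 802.3bt Type 4 is often marketed as "100W PoE".
# Verify against the actual switch or injector datasheet before specifying.
POE_PSE_LIMITS_W = {
    "802.3af": 15.4,
    "802.3at": 30.0,
    "802.3bt-type3": 60.0,
    "802.3bt-type4": 90.0,
}

def fits_on_port(standard, attached_loads_w, margin=0.8):
    """True if the summed sensor loads fit within the port budget with 20% headroom."""
    return sum(attached_loads_w) <= POE_PSE_LIMITS_W[standard] * margin

# A WAP acting as a power source for a camera (12W), a LiDAR unit (18W) and an acoustic sensor (4W)
print(fits_on_port("802.3bt-type4", [12.0, 18.0, 4.0]))   # True: ~34W fits a 90W port with headroom
print(fits_on_port("802.3at", [12.0, 18.0, 4.0]))         # False: exceeds a 30W port
```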

5.    Connectivity performance requirements: data packet size, number of devices, latency requirements (time to reply)

What’s trending? If you are a Verizon or AT&T partner, you’re already specifying your sensors as requiring one of three network slice identifiers: mMTC, eMBB or cMTC (read on for descriptions). If you’re not, you will be identifying “traditional” bandwidth consumption measurements, unless you use a sensor-to-cloud solution, like the Verkada architecture, that transmits only low-payload video metadata until the real streams are requested.
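
Either way, step 5 boils down to arithmetic: packet size times device count times reporting rate. The sketch below shows that calculation with purely illustrative numbers; the 20% protocol overhead is an assumption, not a measured figure.

```python
def required_uplink_mbps(devices, packet_bytes, packets_per_sec, overhead=1.2):
    """Aggregate uplink throughput in Mbps, assuming 20% protocol overhead."""
    bits_per_sec = devices * packet_bytes * 8 * packets_per_sec * overhead
    return bits_per_sec / 1_000_000

# 500 parking sensors, each sending a 24-byte report every 30 seconds
print(round(required_uplink_mbps(500, 24, 1 / 30), 4))   # ~0.0038 Mbps: low-power/mMTC territory
# 40 cameras, each pushing 4 KB metadata packets 10 times per second
print(round(required_uplink_mbps(40, 4096, 10), 2))      # ~15.73 Mbps: broadband territory
```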

6.    Authentication and sensor encryption:

  • Mainstream Single Sign-On (SSO) via providers including Okta, OneLogin, Google business apps and others
  • Multifactor Authentication (MFA), such as standard two-factor authentication options supporting both SMS text and authenticator apps for mobile devices
  • Video data encrypted at rest using modern RSA and AES encryption standards (see the sketch following this list)
  • Video data encrypted in transit using HTTPS/TLS over the default HTTPS port, 443
  • Outbound connections restricted to cloud services
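
Here is a minimal sketch of what “encrypted at rest” and “encrypted in transit” look like from the sensor side, using Python’s cryptography and requests packages. The key handling, camera ID and cloud endpoint are illustrative assumptions, not a production key-management design or any vendor’s API.

```python
import os
import requests
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# At rest: authenticated AES-256-GCM encryption of a recorded video segment
key = AESGCM.generate_key(bit_length=256)   # in production, fetch from a KMS/HSM, never hard-code
aesgcm = AESGCM(key)
nonce = os.urandom(12)                      # must be unique per encrypted object
video_segment = b"...raw H.264 segment bytes..."        # placeholder for recorded footage
ciphertext = aesgcm.encrypt(nonce, video_segment, b"camera-id:cam-17")
stored_object = nonce + ciphertext          # what actually lands on disk or in object storage

# In transit: event metadata posted over HTTPS (TLS, default port 443); certificate
# verification stays enabled so the sensor only talks to the intended cloud service
resp = requests.post("https://cloud.example.invalid/api/v1/events",   # hypothetical endpoint
                     json={"camera": "cam-17", "event": "motion"},
                     timeout=5, verify=True)
```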

What’s trending? End-to-end, simple, yet sophisticated sensor-to-cloud platforms. Users are finally realizing that when VMS and PSIM solution providers claiming “open platforms” rely on a specific sensor manufacturer for “cyber hardening,” it seems to be code for “trust us, we’ll protect you as long as you buy our IP video cameras with a specific version of our VMS.”

Interoperability Through Infrastructure

The “superglue” of safe and smart cities is also one mode of interoperability, or simply put, getting the Internet of Things to work together. That’s no easy task, but last year we reported how smart cities will leverage network slicing, meaning how interoperability through performance is achieved with 5G, Fixed Wireless Access and Wi-Fi. If the data doesn’t get to its recipient, arrives late, interferes with other sensor data transmission or does not allow the user to scale and meet demands, the city’s IoT tech is less useful, less sensible.

This model defines combinations of three factors: the sensor quantity within a region, city or building’s geographic area, the size of the data payload, and the performance requirements. Even putting Safe/Smart City use cases aside, this model is one of the most comprehensive design tools for the security consultant, security integrator and program manager. Mapping these simple combinations onto network slices gets the value engineering task done concurrently with system design: more sensible. The three slice types, and a simple selection sketch, follow.

1.    mMTC (massive Machine Type Communications – many sensors; scalability for Safe/Smart Cities)

  • Sensor quantity – Massive IoT
  • Payload – typically Low, but reserves intermittent high payloads
  • Excellent reliability and performance for use cases like Safe City Video Surveillance, gunshot detection, vehicle presence sensors and Traffic Intersection safety sensors (LiDAR)

2.    eMBB (enhanced Mobile Broadband – AKA streaming video, audio and metadata content)

  • Sensor quantity – Average
  • Payload – High
  • Average reliability, with paid-for improved performance, for use cases like cloud-based streaming of live events, event security, traffic/transportation and video surveillance monitoring

3.    cMTC (critical Machine-Type Communications)

  • Sensor quantity – Low – Average
  • Payload – Low
  • Ultra-high reliability, ultra-low latency, mission-critical, life safety communications like Autonomous Vehicles, Vehicle-to-Vehicle (V2V), Vehicle-to-Infrastructure (V2I), incident event image snapshots, short event video clips, First Responder Communications and Wearable Sensor Status
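
As promised above, here is a minimal slice-selection sketch that maps a use case’s device count, payload and latency needs onto one of the three slice types. The numeric thresholds are illustrative assumptions, not carrier specifications.

```python
def pick_slice(devices, payload_kb, latency_ms, life_safety):
    """Rough slice selection for a safe/smart city use case; thresholds are illustrative only."""
    if life_safety or latency_ms <= 10:
        return "cMTC"        # ultra-reliable, ultra-low latency: V2V/V2I, first responder comms
    if payload_kb >= 100 and devices < 10_000:
        return "eMBB"        # high-payload streaming from a moderate number of devices
    return "mMTC"            # many devices, small or intermittent payloads

print(pick_slice(devices=50_000, payload_kb=0.1, latency_ms=2_000, life_safety=False))  # mMTC
print(pick_slice(devices=200,    payload_kb=500, latency_ms=300,   life_safety=False))  # eMBB
print(pick_slice(devices=40,     payload_kb=2,   latency_ms=5,     life_safety=True))   # cMTC
```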

Interoperability Through Cloud and Unified Mobile Platform

Let’s look briefly at the other end of our simplified sensor-to-platform safe/smart city model. If you walk through many Emergency Operations Centers, you’ll find most applications are either in the agency’s own cloud or are displayed through a cloud service provider. The cloud can be a sensible portal with comforting “retro” features, even with some of the camera-to-cloud “plug ‘n play” solutions. One solution provider can create a video wall of high-resolution IP video camera streams from its cloud service, with a virtual control menu that would lead you to believe it is a customized screen from a Video Management System (VMS) costing far more in license fees.

The sensible aspect of this interoperability approach is that there is a virtually infinite combination of displays, at a given stream resolution, available for these end-user “seats.”

Interoperability Through Sensor Data Stream Mirroring

The sensibility and simplicity of choosing a sensor with high processing power, capable of streaming multiple instances of video, audio and metadata content, is persuasive. Leave it to the sensor, whether a camera, LiDAR device or acoustic microphone: the easiest path to interoperability is simply to receive a new data stream.

The City of Houston faced the challenge of deploying a complex visible and thermal imaging video surveillance solution to monitor 1.5 million fans attending its “Super Bowl LIVE” event in the downtown central business district. This 15-day event would be monitored by at least four operations and command centers, and the participants around the project meeting table did not want to over-complicate things. Cue Houston homeland security law enforcement professional Jack Hanagriff, who directed Verizon, then in the middle of deploying a 5G network for the event, to verify that the IP video cameras and network could handle the additional video streams. Not only did they perform well, but a thermal imaging camera I specified at the time, as project solutions architect, wound up saving a life by identifying a fallen attendee at one of the fan fest’s concerts.

The Fastest Way to Get Systems to Work Together Is to Find Your Digital Twin

At CES 2019 this past January, Microsoft’s Azure platform was used to simulate public safety and operations challenges in Los Angeles. The CDC is now using digital twin simulations and global synergies to combat a potential health emergency arising from the combination of increased homelessness and rodent infestation. A safe and smart city’s digital twin will not only allow the city to predict slow-moving threats but also determine its readiness to react in real time to catastrophic weather events and earthquakes and, of course, to gauge cyber-readiness.

Your safe and smart city, stadium, campus or building’s digital twin is a virtual model built from data collected by cameras or other IoT sensors like LiDAR, a UAS or a cyber process. Scenarios are built using output from AI processes and training data from other successful digital twin models. This method is most popular with cybersecurity attack scenarios that feed the operation of AI-based cybersecurity intrusion detection solutions. The digital “twinning” applications enjoying the most growth in safe/smart cities are transit and transportation improvements, electrical grid efficiencies and event security.

From Zero to 5G in Two Years

Powering safe/smart cities is one of the most ambitious use cases for the Internet of Things (IoT), and now, using wireless networks to connect large numbers of low-power objects, ranging from vehicle presence/parking sensors to smartwatches and electricity meters, is commonplace.

These small but powerful gems inexpensively convey small amounts of data over long ranges without sacrificing quality, and they can stay powered for weeks or even years, thanks to years of battery research and pairing with the most efficient low-power processors.

Whereas other networks aim to collect and transmit as much data as possible, as quickly as possible, the 0G or ZeroG communications approach delivers small packets of information at regular intervals, giving end-users in smart/safe cities only the critical information they need.

The software-based wireless 0G network listens to devices without the need to establish and maintain a network connection, eliminating signaling overhead. With network and computing complexity managed in the cloud, energy consumption and costs of connected devices are dramatically reduced, Sigfox says. Just as important, the low power requirements can also dramatically cut battery requirements for IoT devices. Some of these devices are ballistic sensors, listening for the energy signature of an automatic weapon and its caliber. Others are parking and vehicle-presence sensors, sitting flush in the pavement at a dedicated turning lane with a traffic-light protected interval, waiting for multiple vehicles to trigger the signal at a busy intersection.
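
The design constraint behind 0G is easy to see in code: pack a sensor report into a tiny fixed payload and send it at a regular interval, with no session to establish or tear down. The sketch below is illustrative; the 12-byte ceiling mirrors Sigfox’s published uplink maximum, and the field layout is an assumption, not a Sigfox API.

```python
import struct

def encode_report(device_id, battery_pct, occupied, temp_c):
    """Pack a vehicle-presence report into a tiny fixed payload for a 0G-style uplink."""
    payload = struct.pack(">IBBh", device_id, battery_pct, int(occupied), int(temp_c * 10))
    assert len(payload) <= 12      # Sigfox publishes a 12-byte uplink maximum
    return payload

report = encode_report(device_id=1042, battery_pct=87, occupied=True, temp_c=31.5)
print(len(report), report.hex())   # 8 bytes, small enough to send at a regular interval
```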

In facilities management, the 0G network can connect IoT devices that track ambient factors such as temperature, humidity, and occupancy. Doing so helps managers leverage occupancy data to adjust the amount of space a company needs to rent, reducing overhead costs. Operating as a backup solution to ensure connectivity during a broadband network outage, 0G networking built into a WAP or router could allow service providers to access hardware even when the primary network is down.

“The 0G network does not promise a continuation of these services,” noted Christophe Fourtet, CSO and co-founder of Sigfox, “but it can provide access to the necessary information to solve challenges associated with outages.”

In a more critical example from the home and commercial building security market, sophisticated burglars could use cellular and Wi-Fi jammers to block a security system’s access to the network, so even though alarms were issued, the monitoring service might never receive them, Fourtet said. But the 0G network can send an alert to the alarm system provider even if the primary connection has been jammed or blocked, he said.

At the Special Olympics World Games Abu Dhabi 2019, iWire, LITE-ON and Sigfox worked together to incorporate a panic button for the athletes’ use in case of emergencies; in fact, the system was used to locate a lost athlete and return them to the Games without incident, a great safe/smart event story.

About the author: With 30 years of security industry experience, Steve Surfaro is Chairman of the ASIS Security Applied Sciences Council. He is also Standards Team Leader for the DHS Video Quality in Public Safety Group. Steve is published in a wide range of security publications and delivers an average of 100 industry-accredited sessions each year. He is an author of and contributor to the Digital Video Handbook, a DHS S&T publication. Steve recently received the Roy N. Bordes Council Member Award of Excellence from ASIS International.

About the Author

Steve Surfaro

Steve Surfaro is Chairman of the Public Safety Working Group for the Security Industry Association (SIA) and has more than 30 years of security industry experience. He is a subject matter expert in smart cities and buildings, cybersecurity, forensic video, data science, command center design and first responder technologies. Follow him on Twitter, @stevesurf.