What You Need to Know About the Building Blocks for an Extensible Smart City

Cities around the world have begun embracing the Internet of Things and cutting-edge technologies to operate more effectively and efficiently. But for every city that successfully begins this transformation, there are many more that are unsure where to begin.


Austin Ashe, the General Manager for Intelligent Cities at Current by GE, joined Sameer Sharma, Intel’s GM of Smart Cities, to discuss how smart cities can be productive today while preparing for tomorrow. Their joint webinar, “Building Blocks for an Extensible Smart City,” covered how to design a digital infrastructure to accommodate existing and future technologies, the biggest trends impacting smart cities and advice on how to get started.


Check out the recap below to see what happened during their conversation, or watch the full on-demand webinar now.


What Makes a Smart City?

The webinar kicks off at 2:00 with a refresher on how all the building blocks of a smart city revolve around data. More specifically, ubiquitous data. Today, cities don’t have access to data relating to all populations, areas and subsets within their limits, and this incomplete view makes it difficult to act. The ideal data is ubiquitous, open and actionable, cross-departmental, owned by the city and available in real time.


Then, at 6:25, Ashe discusses how streets and sidewalks act as a nervous system within a city. Putting a finger on the pulse of this system shows how the city is working and what can be done better. Doing this with roadways through digital infrastructure provides insight into vehicular and pedestrian traffic, parking, public safety, bicycle use, environmental issues and more.


An all-in-one platform for managing smart city data can help departments more easily access, understand and apply information. Ashe noted at 11:22 that Current’s platform consolidates the functions of many devices, from inductive loops and in-ground parking sensors to acoustic sensors and CCTV cameras, into one elegant device that fits on a streetlight.


Ashe provides a closer look at Current’s CityIQ™ platform at 13:07. Using over 30 sensor capabilities, the platform can pull in real-time data to the cloud. Partners and developers can then access that data to generate insights for use cases, predictive modeling and better outcomes across departments.


What Technology Trends Are Impacting Intelligent Cities?

At 15:54, see how edge computing and machine learning are already affecting consumer devices—think of facial or voice recognition in smartphones or tablets. They are making waves with data collection as well. Current’s CityIQ nodes use artificial intelligence and machine learning to “see” what’s happening on a street. Ashe shows an example of how computer vision analytics works and the variety of data CityIQ nodes can extract at 18:58.


Sensor talk continues at 19:33. Sensors can do more than see; they can hear as well. Acoustical machine learning can detect sounds such as gunshots, sirens, explosions, alarms and more. These capabilities are made possible with Intel’s edge computing technology. At 20:10, Sameer Sharma covers the journey that Intel and Current have taken to develop and launch the sensors. It comes back to edge computing, which places compute, storage and data analysis closer to where events happen, such as a camera tracking traffic patterns and parking.


The talk of edge computing continues at 25:30 when Sharma touches on some of the key drivers behind this technological trend. By reducing how far data must travel, edge computing gives organizations greater control over their data, decreases costs and increases security, and it offers low latency that can’t be achieved with a centralized cloud. Other advantages include resilience in scenarios where connectivity may be unreliable, more contextual awareness and a better overall experience for users. The improved information security is paramount. In Portland, Oregon, for example, the CityIQ nodes perform real-time metadata extraction to capture the traffic data pertinent to the city’s studies without storing the footage.
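
The metadata-only approach described above can be sketched in a few lines of code. This is purely an illustration, not CityIQ’s actual API: the event fields and the stand-in detection input are hypothetical, and the point is simply that only derived metadata leaves the node while the raw frame is discarded.

```python
from dataclasses import dataclass

@dataclass
class TrafficEvent:
    """Metadata extracted at the edge; raw pixels are never retained."""
    timestamp: float
    object_class: str   # e.g. "car", "pedestrian", "bicycle"
    bbox: tuple         # (x, y, width, height) in frame coordinates

def extract_metadata(frame, detections, timestamp):
    """Convert detections into anonymized events, then discard the frame.

    `detections` stands in for the output of an on-node computer-vision
    model (a list of (class_name, bbox) pairs). Only the metadata is
    returned; the frame itself is dropped and never transmitted.
    """
    events = [TrafficEvent(timestamp, cls, bbox) for cls, bbox in detections]
    del frame  # the video data never leaves the node
    return events

# Example: two simulated detections from one frame
events = extract_metadata(
    frame=b"\x00" * 1024,  # placeholder for raw image bytes
    detections=[("car", (10, 20, 80, 40)), ("pedestrian", (120, 30, 20, 50))],
    timestamp=1700000000.0,
)
print([e.object_class for e in events])  # → ['car', 'pedestrian']
```

A downstream parking or traffic study would consume only these events, which is why no personally identifying footage ever needs to be stored.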


The vast amount of data being collected and analyzed at the edge can be used to make those edge devices more effective and autonomous. At 29:00, Sharma highlights the many ways the resulting artificial intelligence can positively impact municipalities, including creating smarter grids, implementing autonomous vehicles, fostering better resident engagement and more. These advancements aren’t far off in the future—many can be realized today.


Artificial intelligence is essential for data analytics, especially when considering intelligent environments. AI will move from providing operational analytics to more advanced capabilities such as predictive and prescriptive analytics. Sharma explains this data analytics journey in depth at 30:30. He then segues into a brief overview of three approaches to AI: cognitive reasoning, machine learning and deep learning. He goes deeper into machine learning at 33:40, including how deep learning takes it one step further. Deep learning involves two phases: training and inference. Training requires a huge data set to produce an accurate neural network, and the model improves as more data is added. Edge devices that process massive amounts of information are made smarter and more efficient through this process.
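
The training/inference split Sharma describes can be illustrated with a toy model. This is a generic gradient-descent sketch on a one-weight linear model, not Intel’s or Current’s pipeline; real deep learning replaces the single weight with millions of parameters, but the two phases work the same way.

```python
# Toy illustration of training vs. inference: fit y = w * x by gradient descent.
def train(samples, lr=0.01, epochs=200):
    """Training: adjust the weight using a labeled data set."""
    w = 0.0
    for _ in range(epochs):
        for x, y in samples:
            pred = w * x
            w -= lr * (pred - y) * x  # gradient of squared error w.r.t. w
    return w

def infer(w, x):
    """Inference: apply the trained weight to new, unseen input."""
    return w * x

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # noiseless samples of y = 2x
w = train(data)
print(round(infer(w, 10.0), 2))  # → 20.0
```

Training is the expensive, data-hungry step; inference is cheap, which is why it can run on edge devices once a model has been trained elsewhere.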


Learn More About CityIQ

Want a snapshot of where things are today? At 36:45, Sharma goes over some deep learning breakthroughs affecting modern technology, specifically image recognition and speech recognition. Advancements in these areas took major leaps recently, enabling systems like CityIQ to apply intelligence and analytics wherever possible. You can check out video clips of CityIQ’s machine learning and deep learning evolution at 39:35.


Cities across the U.S. have begun the transition to smart city technology with the help of CityIQ. You can see for yourself at 41:30. Smart city technology is enhanced with Intel’s OpenVINO™ Toolkit, which delivers computer vision and deep learning capabilities from the edge to the cloud. You can learn more about Intel’s AI portfolio at 46:45.


Ashe takes over at 47:56 to recap a few of the desirable outcomes made possible by smart city technology. For example, computer vision can detect where cars are parked and where they are not, potentially allowing cities to provide on-street parking guidance to drivers through applications and reduce congestion by as much as 30%. Other outcomes include optimizing routes for public transportation, increasing safety for pedestrians and bicyclists, better serving the homeless population and making city planning more accessible for people with disabilities.
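
A minimal sketch of how occupancy data might drive parking guidance follows. The spot layout (a position along the street plus an occupied flag) and the function are hypothetical, intended only to show how metadata from sensors could be turned into a driver-facing recommendation.

```python
# Hypothetical sketch: route a driver to the nearest open parking spot
# using occupancy metadata (spot position along the street, occupied flag).
def nearest_open_spot(spots, driver_position):
    """Return the position of the closest unoccupied spot, or None."""
    open_spots = [pos for pos, occupied in spots if not occupied]
    if not open_spots:
        return None
    return min(open_spots, key=lambda pos: abs(pos - driver_position))

spots = [(10, True), (25, False), (40, True), (60, False)]
print(nearest_open_spot(spots, driver_position=35))  # → 25
```

Guiding drivers directly to an open spot is what cuts the circling that contributes to congestion.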


Many questions from the audience centered on personal privacy. Ashe addressed this at 56:00, explaining that CityIQ uses optical sensors but does not actually bring in camera feeds. These edge devices can identify an object of interest, such as a person or car, but extract only the metadata: that a person is there, not who that person is or any other personal information about them. The sensors also filter out information from above ground level and focus only on the public domain.


Data security is just as important to cities. At 58:38, Sharma says that smart cities undergo a comprehensive review of security practices before implementation to ensure that information is locked down and kept in accordance with government regulations without hampering its use.


What Ashe and Sharma discussed in the webinar just scratches the surface of smart cities. Data-driven urban areas have the potential to unlock positive outcomes for citizens, tourists, businesses and entrepreneurs, and app developers. Municipalities that look forward and strive to embrace smart city technology can experience a brighter future with the help of Current’s CityIQ nodes and the Intel processing that powers them.



WATCH NOW