
AI: from sensor data to smart monitoring

(Image credit: Razum / Shutterstock)

Sensor technology is evolving and becoming more precise. The benefits of this for businesses are clear: better ground resolution, for example, produces more accurate data. However, the data itself presents a challenge. Just as the technology is advancing, so too is the volume of data produced, and that data must be managed appropriately if it is to inform business processes.

The solution to navigating the data lies in automation and artificial intelligence (AI). Automation involves a series of algorithms designed to carry out actions automatically based on defined inputs. AI is a key part of this process, as it ensures the automated processes remain intuitive and adaptable. The result is that data-driven procedures can be implemented without human interaction, which reduces the pressure on businesses by freeing up a significant amount of time.

Making sense of sensor data

Sensor technology is used to identify objects. This presents a challenge: many variables can influence the data set, so a purely automated approach can struggle. The issue is that algorithms cannot be explicitly designed to cover every variable.

Solutions to this issue are based on AI systems developed to better manage the large volumes of data produced by sensor technology. Real-life applications include lane departure warning systems and traffic sign recognition.

There are three types of results from applying AI to sensors and the data they produce. First, there is object classification, which determines whether an object is present in an image. Then there is object recognition, in which one or more objects are detected in an image and their positions are determined. Lastly, there is object segmentation, which traces the actual outlines of the objects.
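
As a rough illustration, these three result types can be pictured as increasingly detailed data structures. The sketch below is a hypothetical example in Python; the class and field names are illustrative and not taken from any particular library.

```python
from dataclasses import dataclass
from typing import Tuple

import numpy as np


@dataclass
class Classification:
    """Image-level result: which object class is present, and how likely."""
    label: str
    probability: float


@dataclass
class Detection:
    """Object recognition: class plus position as a bounding box (x_min, y_min, x_max, y_max)."""
    label: str
    probability: float
    box: Tuple[float, float, float, float]


@dataclass
class Segmentation:
    """Object segmentation: a per-pixel mask tracing the actual outline of the object."""
    label: str
    probability: float
    mask: np.ndarray  # boolean array with the same height and width as the image
```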

However, the complexity of the process increases from object classification through to segmentation, so more intuitive approaches must be used that suitably address the impact of multiple variables on the data sets.

The role of neural networks 

AI is an umbrella term for technology that enables machines to perceive their environment and learn to adapt to the changes occurring there. Within AI, Machine Learning is a subset that uses statistical methods to enable machines to improve their performance without being explicitly programmed. Deep Learning, in turn, is a subset of Machine Learning that can process huge amounts of data using Deep Neural Networks (DNNs).

These neural networks are key to object classification in sensor technology. Although multiple neural networks may be used, they all have certain properties in common that influence how they learn to interpret and act on sensor data. 

First, all neural networks must be trained. To classify, recognize or segment objects, a network must be taught how to treat different object classes. It is trained on known images in which the objects are appropriately labeled, until it can correctly classify, recognize, and segment all of them.
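
In code, this training stage typically looks like the loop below. This is a minimal sketch in PyTorch using randomly generated stand-in data and a deliberately tiny network; it is not a production model, only an illustration of how labeled examples are fed to a network repeatedly until its classifications are acceptable.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Stand-in for a labeled training set: random 64x64 RGB "images", each paired
# with a class index. In practice these would be real sensor images labeled by hand.
num_classes = 5
images = torch.rand(256, 3, 64, 64)
labels = torch.randint(0, num_classes, (256,))
loader = DataLoader(TensorDataset(images, labels), batch_size=32, shuffle=True)

# A deliberately tiny classifier standing in for a real deep neural network.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(3 * 64 * 64, 256),
    nn.ReLU(),
    nn.Linear(256, num_classes),   # one output score per object class
)
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(10):                        # repeat until performance is acceptable
    for batch_images, batch_labels in loader:
        optimizer.zero_grad()
        scores = model(batch_images)           # forward pass: raw class scores
        loss = loss_fn(scores, batch_labels)   # how far from the human-provided labels?
        loss.backward()                        # propagate the error back through the network
        optimizer.step()                       # adjust the weights to reduce it
```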

Neural networks are not infallible. The result of a neural network analysis is expressed as a percentage probability that an object has been recognized correctly. The higher the probability, the surer the AI is of a given statement. However, since 100 percent is rarely achieved, it is important to decide on a probability threshold that is high enough for the results to be used in business processes.
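
Applying such a threshold can be as simple as the following sketch (again in PyTorch, with an arbitrary 90 percent cut-off chosen purely for illustration); predictions below the threshold are handed back for human review rather than acted on automatically.

```python
import torch

def accept_prediction(scores: torch.Tensor, threshold: float = 0.9):
    """Turn raw class scores into a probability and apply a business-defined cut-off.

    Returns (class_index, probability) if the network is sure enough,
    otherwise None so the case can be routed to a human reviewer.
    """
    probabilities = torch.softmax(scores, dim=-1)
    probability, class_index = probabilities.max(dim=-1)
    if probability.item() >= threshold:
        return class_index.item(), probability.item()
    return None

# Example: scores for three object classes. With a 0.9 threshold this prediction
# (roughly 89 percent confidence) is rejected and escalated to a human.
print(accept_prediction(torch.tensor([0.2, 3.1, 0.4]), threshold=0.9))
```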

Different neural network models have different properties. For example, some network models can be trained very quickly but deliver lower probability percentages when it comes to object analysis. As a rule, the more complex and time-consuming the training of a neural network, the more reliable it is likely to be. Therefore, organizations implementing the technology must decide whether they want to prioritize speed or reliability.

Getting to the point (cloud) 

Different neural network models are also required depending on whether the sensor produces images, videos or point clouds. A point cloud is a set of data points in space representing a 3D shape or object, typically created using laser scanning technology. Each point carries coordinates and additional attributes, and the resulting data can be integrated with other data sources or used to create 3D models.
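
In its simplest form, a point cloud is little more than a large table of per-point records, as in the minimal sketch below (the attribute columns, such as intensity and colour, are illustrative; real scanners vary in what they record).

```python
import numpy as np

# A minimal in-memory point cloud: each row is one laser return, holding x, y, z
# coordinates plus illustrative per-point attributes (return intensity and RGB colour).
points = np.array([
    # x,    y,   z,   intensity, r,   g,   b
    [12.4, 3.1, 1.7, 0.82,      120, 118, 110],
    [12.5, 3.1, 1.7, 0.79,      122, 119, 111],
    [12.5, 3.2, 1.8, 0.91,      130, 126, 115],
])

xyz = points[:, :3]          # the geometry used to build 3D models
attributes = points[:, 3:]   # data that can be joined with other sources

# A simple geometric query: the axis-aligned bounding box of the scanned points.
print(xyz.min(axis=0), xyz.max(axis=0))
```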

There are special neural network models for point clouds that address the specific requirements of the data. For example, in a research project by the UK's Network Rail, a point cloud of a route section approximately 90 km long was created, and 18 object classes were recognized within it.
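
One common design for such models, shown in the sketch below, applies the same small network to every point and then pools over all points so the result does not depend on point order (a PointNet-style approach). This is a generic illustration, not the model used in the Network Rail project; the 18-class output size simply mirrors the figure mentioned above.

```python
import torch
from torch import nn

class PointCloudClassifier(nn.Module):
    """PointNet-style sketch: a shared per-point MLP followed by an order-invariant
    max-pool over all points, then a final layer that produces class scores."""

    def __init__(self, num_classes: int = 18):
        super().__init__()
        self.per_point = nn.Sequential(       # the same weights are applied to every point
            nn.Linear(3, 64), nn.ReLU(),
            nn.Linear(64, 128), nn.ReLU(),
        )
        self.classifier = nn.Linear(128, num_classes)

    def forward(self, points: torch.Tensor) -> torch.Tensor:
        # points: (batch, num_points, 3) - unordered x, y, z coordinates
        features = self.per_point(points)             # (batch, num_points, 128)
        global_feature = features.max(dim=1).values   # pool away the point dimension
        return self.classifier(global_feature)        # (batch, num_classes)

model = PointCloudClassifier()
scores = model(torch.rand(4, 1024, 3))   # four clouds of 1,024 points each
print(scores.shape)                      # torch.Size([4, 18])
```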

The main challenge presented by this project was in the registration of data, as several scan journeys had to be connected to get usable elements. The quality of object classification by the neural network varied for individual object classes according to the geometric properties of the object. For example, electricity pylons and overpasses were easily identified by the neural network, but distinctions between surfaces, for example wall types, were unreliable. 

Point clouds can also be used within a hybrid system, where the technology is combined with other resources such as computer-aided design (CAD) systems and other forms of mapping. This is particularly valuable in combination with neural networks, facilitating automatic surveying and the determination of land use from above. Hybrid systems are more reliable for tasks such as detecting surfaces than point cloud technology alone.

Decision support for better outcomes

All these AI-based technologies offer different benefits depending on the tasks they address. However, what object classification, point clouds, and hybrid models have in common is that they are underpinned by the supportive function of AI.

AI helps humans make key decisions faster and with more insight. In practice, this is a best-of-both-worlds scenario: it preserves human reasoning about problems that may have social, political or business contexts, while augmenting the decision-making process with data-driven insight into what the sensor is picking up.

Sensors produce vast quantities of data, and within that data lie actionable insights that can be used across industries to enhance business processes. However, far too much data is produced for humans alone to deliver these insights on a timely basis; unlocking them requires an AI approach, using neural networks to augment human decision-making. Once implemented, this technology will drive changes across business and government, with pioneers realizing the maximum potential of their sensor investments.

Dean McCormick, Head of Smart Monitoring Solutions, Hexagon’s Safety, Infrastructure & Geospatial division
