
Choosing optimised technology for the Internet of Things

The dreams and promises of the Internet of Things (IoT) are many, and so are the predictions of its shining future. There is an underlying expectation that the Internet of Things will change how we use and behave on the internet, since it is predicted to connect 10 times as many “things” to the internet by 2020. From bracelets to cars to smart homes, the lines between hardware, software, and mobile will blur to accommodate all these new data-driven connected items.

IoT will (eventually) be used everywhere, and in many areas the benefits will be substantial. Collecting data on how a person exercises and how their body reacts can contribute to better health, not only for that person but also as research material for others. Many jobs can begin to be automated, such as driving various vehicles or controlling other machinery. Our homes already contain a lot of automation, such as machines for washing, cleaning, and cooking. Support and servicing of these machines will be simplified when the first level of troubleshooting can be done remotely, and updates with improvements can be applied automatically.

For consumers, this all sounds glorious, and we can’t wait to test, try, and consume this new revolution. The opportunity is huge, but for companies there are still uncertainties. Questions remain to be answered, such as how to accurately assess the business opportunities and how to build a technology stack (the layers of hardware, software applications, operating platforms, and networks) that supports current and future IoT applications and devices.

Transitioning to an optimised architecture

To optimise for IoT, companies need to transition from traditional architecture to a more optimised one. Elements of current technology stacks may need to be redesigned so they can support billions of interdependent processing events per year from millions of products, devices, and applications. Since networked devices are always on, companies must be able to react to customer and system requests in real time; agile software development and delivery will therefore become a critical competency.

Seamless connectivity will also become a must-have, as will collaboration across IT and business units, which have traditionally been siloed. Moreover, companies must be able to securely and efficiently collect, analyse, and store the data emerging from these refined IT architectures. Truly taking advantage of newly connected devices calls for real-time data capture and analysis.

Devices generate data on user actions, behaviour, habits, needs, and preferences; the more data, the better the user experience and the more satisfied the customer. Based on the data, the system needs to perform smart analysis, and then an action needs to occur based on that analysis. The complexity is that all of this needs to happen within a fraction of a second.

We say “complexity” because a new generation of technology cannot rely on traditional and legacy software. As mentioned, the technology needs to be optimised for the Internet of Things. So, what are the demands on this new generation of software? This new generation of software, which we call intelligent software, needs to:

  • Handle large amounts of data (terabytes of data)
  • Handle data in real-time
  • Do real-time analysis
  • Act on the data as it is received
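The last two requirements can be made concrete with a small sketch. The example below is purely illustrative (the device names and the threshold-alert rule are invented, not taken from any real product): it analyses each reading the moment it arrives and acts immediately, rather than batching data for later processing.

```python
def process_stream(readings, threshold=30.0):
    """Act on each reading as it arrives instead of batching:
    collect an alert the moment a value crosses the threshold."""
    alerts = []
    for device_id, value in readings:
        if value > threshold:                   # real-time analysis per event
            alerts.append((device_id, value))   # act immediately on this event
    return alerts

# A simulated stream of (device, value) readings arriving one by one.
stream = [("fridge-1", 4.0), ("oven-2", 180.0), ("fridge-1", 5.0)]
print(process_stream(stream))  # [('oven-2', 180.0)]
```

In a real system the loop body would run inside the platform's event pipeline, but the principle is the same: analysis and action happen per event, within the latency budget of a fraction of a second.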

We are rapidly getting to the point where wireless networks have the capacity to handle very large volumes of devices. To handle large amounts of data in real time, we need extremely fast software platforms that store only one copy of the data needed. This requires a next-generation platform that is super-fast and can meet real-time demands. To perform real-time analysis, and to act on the spot based on that analysis, a smart system with very high performance is required.

In-memory platforms

In-memory computing is one approach worth considering. By bundling an in-memory database with a web application framework into a single server, developers can quickly build a cooperating suite of fast micro-applications.
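As a minimal sketch of the idea, not any vendor's actual API, the example below uses Python's built-in SQLite in-memory mode to stand in for an in-memory database living in the same process as the application. The table, the device names, and the two "endpoints" are all invented for illustration.

```python
import sqlite3

# An in-memory database in the same process as the application:
# no network hop, no separate database server.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE readings (device_id TEXT, value REAL)")

def record(device_id: str, value: float) -> None:
    """One micro-application endpoint: write a sensor reading."""
    db.execute("INSERT INTO readings VALUES (?, ?)", (device_id, value))

def average(device_id: str) -> float:
    """Another endpoint: real-time analysis over the shared in-process data."""
    (avg,) = db.execute(
        "SELECT AVG(value) FROM readings WHERE device_id = ?", (device_id,)
    ).fetchone()
    return avg

record("thermostat-1", 21.5)
record("thermostat-1", 22.5)
print(average("thermostat-1"))  # 22.0
```

Because both functions operate on the same in-process database, they cooperate without serialising data over a wire, which is the property the bundled-server design is after.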

Reducing the distance between hardware, software, and data results in an extremely fast software platform with very high performance, since it eliminates data transfers between the application and the database, as well as transformations between different data formats. This is possible because the application can access data in the database as fast as its own internal temporary data. The data is never moved; it resides in the database at all times, and the application directly accesses the data managed by the database management system. There is no need to keep a local copy of the data in the application.
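The "no local copy" point can be sketched with a thin handle whose reads go straight to database-managed storage. This is an illustrative model only (the `Account` class and its schema are invented), again using SQLite in-memory as a stand-in for the in-memory engine:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
db.execute("INSERT INTO accounts VALUES (1, 100.0)")

class Account:
    """A thin handle: attribute reads go straight to database-managed
    storage, so there is no stale local copy to synchronise."""

    def __init__(self, account_id: int):
        self.id = account_id  # only the key is held locally

    @property
    def balance(self) -> float:
        (value,) = db.execute(
            "SELECT balance FROM accounts WHERE id = ?", (self.id,)
        ).fetchone()
        return value

acct = Account(1)
db.execute("UPDATE accounts SET balance = 250.0 WHERE id = 1")
print(acct.balance)  # 250.0 -- the handle sees the update immediately
```

Contrast this with the conventional pattern of copying a row into an application object: that copy must be re-fetched or invalidated whenever the row changes, which is exactly the transfer-and-transform overhead the paragraph above describes eliminating.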

The “local” state of a change in progress is no longer a separate Java or C# object on the heap, but rather a consequence of the isolation and atomicity of the ACID engine managing the joint heap/database. As long as the log(s) are secured on disk, the database image can calmly be hibernated to disk at its own pace using asynchronous writes. Checkpoints and recovery work the same way as with an existing legacy database.
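The log-then-checkpoint discipline can be reduced to a toy model. The sketch below is a deliberately simplified write-ahead-log engine (the class and its storage are invented, and a Python list stands in for a log file secured on disk): every change is appended to the log before the in-memory image is updated, so the image can be written out lazily and recovery simply replays the log.

```python
import json

class MiniLogDB:
    """A toy write-ahead-log engine: changes are logged before being
    applied to the in-memory image; recovery replays the log."""

    def __init__(self):
        self.image = {}  # the in-memory database image
        self.log = []    # stands in for a log file secured on disk

    def put(self, key, value):
        # 1. Secure the change in the log first (durability).
        self.log.append(json.dumps({"key": key, "value": value}))
        # 2. Then apply it to the in-memory image.
        self.image[key] = value

    def recover(self):
        # Rebuild the image by replaying the log, as after a crash.
        image = {}
        for record in self.log:
            entry = json.loads(record)
            image[entry["key"]] = entry["value"]
        return image

db = MiniLogDB()
db.put("sensor-1", 20.5)
db.put("sensor-1", 21.0)
print(db.recover())  # {'sensor-1': 21.0}
```

Real engines add checkpointing (persisting the image so the log can be truncated) and concurrency control, but the ordering shown here, log first, image second, is what lets the image be flushed asynchronously without risking committed data.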

In conclusion

IoT will play a major role in our lives. However, it requires new technology that is able to collect, analyse, and respond to millions of transactions in real time.

It therefore requires an extremely fast system. Since traditional systems cannot handle this kind of workload, we need to look into in-memory platforms.

Stefan Edqvist, senior operations officer, Starcounter
Image Credit: Chesky / Shutterstock
