The 2020 Vision of IoT
With the arrival of private IoT networks, 5G, and long-range radio technology, the year 2020 has all the necessary ingredients to be pivotal for Industry 4.0 and the Internet of Things. With so many interesting developments underway, we want to share our perspective on what is to come and how it will affect businesses in the foreseeable future.
Here at Anchormen, we’ve seen a lot of activity lately. With customers ranging from industrial manufacturing to chemicals, and from government utilities to universities, everyone is eager to see where the technology goes from here. As a data consultancy, we find the developments in IoT very intriguing because, eventually, connected things (devices that can be connected to an IoT network) will generate more data than we humans can:
FRAMINGHAM, Mass., June 18, 2019 – The number of devices connected to the Internet, including the machines, sensors, and cameras that make up the Internet of Things (IoT), continues to grow at a steady pace. A new forecast from International Data Corporation (IDC) estimates that there will be 41.6 billion connected IoT devices, or “things,” generating 79.4 zettabytes (ZB) of data in 2025. Source
Everyone working in the field can tell you that these massive amounts of data are both a challenge and an opportunity. On one hand, handling large amounts of data is now a very manageable task, and it allows us to generate insights and make predictions. This is especially true in Cloud environments with solutions such as the Data Hub, which make these efforts very effective while keeping costs at a profitable level. On the other hand, just because we’ve gotten better at it doesn’t necessarily mean we are ready for the mountains of data that objects interconnected through a worldwide network can transmit. Time, resources, and processing capabilities are still a constraint.
As we speak, hundreds of companies are looking into ways of opening up their existing environments so they can measure, transport, and collect data in a centralized platform with the goal of building long-term datasets, or in layman’s terms, constructing an IoT network. This is still a herculean task for many, but why? That is what I aim to discuss in this article.
The IoT Architecture
In order to understand the issues, we first need to see what a typical IoT architecture looks like.
A typical IoT architecture is built in three distinct layers. Most people will only ever deal with the Business Layer, as it is the one that manages the whole IoT system, including applications, business and profit models, as well as users’ privacy.
Cloud Computing is quite well established nowadays as well, but actually starting to measure at the “Things” Layer and transporting these measurements through some kind of Transport Layer (with or without performing any Edge Computing*) is a relatively new concept. The reason is that these two layers are not on the Cloud; although it’s not certain where exactly they reside, they are usually closer to the physical consumer than to the Cloud. Together, these layers are often referred to as Fog Computing.
At the Fog Computing level, many things are happening behind the scenes. A social movement called “The Makers Movement” is working on democratizing this knowledge by building all kinds of products and stimulating a learning-by-doing culture in its ranks. Thanks to them, we are seeing a lot of new developments.
*Edge Computing – a distributed computing paradigm which brings computation and data storage closer to the location where it is needed, to improve response times and save bandwidth.
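To make the idea concrete, here is a minimal sketch (in Python, with hypothetical function and field names) of edge-style aggregation: instead of transmitting every raw reading, the device summarizes a window of measurements locally and sends only the summary, saving bandwidth.

```python
from statistics import mean

def summarize_window(readings):
    """Aggregate a window of raw sensor readings into one compact
    summary message, so only a fraction of the data leaves the device."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "avg": round(mean(readings), 2),
    }

# Sixty raw temperature readings (one per second) become a single message.
raw = [21.0, 21.2, 20.9, 21.4, 21.1, 21.3] * 10
summary = summarize_window(raw)
print(summary)  # one message sent upstream instead of sixty
```

Whether such aggregation runs on the thing itself or on a nearby gateway depends on the use-case, but the trade-off is the same: less raw detail upstream in exchange for lower bandwidth and power use.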
Previously, developing new things was quite hard. Creating just an individual PoC thing or a small batch of the same thing was difficult and costly: development and production were only cost-effective if one produced large quantities, usually on the order of thousands, following a long, complex, and therefore expensive development process. Thanks to “The Makers Movement”, development is no longer the exclusive domain of a few specialized companies but can be done by a much wider audience. Furthermore, producing just a few things, or small batches, has become possible as well, thanks to more standardized processes, 3D printing, etc.
Going forward, at the Transport Layer there are several emerging tools that aim to solve some problems of existing technology when it comes to transporting data. Every thing needs to transport its measurements, either through connecting wires or via wireless communication. Which brings us to the other big development in IoT – wireless transport.
For things at a fixed location in close proximity, a wired connection such as an Ethernet/LAN IP interface is not a bad option. However, it can be difficult or downright intrusive to put new wiring into an existing environment. Wired networks also consume quite a lot of power, meaning that a permanent power source is a must, which leads to even more wires. This is why so many companies are opting for a wireless transport solution.
Wireless makes it easier to add measuring things to a new or existing environment, especially when the environment is geographically dispersed or already heavily built up. So far so good. But what’s the catch? Unfortunately, wireless transport has two main hurdles:
- Telecom costs – the cost of sending a message when using traditional M2M (machine-to-machine) protocols like GPRS can be quite burdensome. Most of these M2M protocols are offered by the traditional telecom operators, who charge per message (or bundle of messages). The more measurements with more things, the higher these transport fees get.
- Power – most wireless protocols are relatively power-hungry, meaning that every thing needs to have a power source. A battery is always an option, but batteries drain quite rapidly, and replacing the batteries of many things on a regular basis is a costly operation in itself, making it harder to justify the business value.
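Both hurdles can be put into numbers with a back-of-the-envelope model. The sketch below uses entirely hypothetical figures (device counts, fees, and current draws are illustrative, not quotes from any operator or datasheet), but it shows how per-message fees and duty-cycled transmit current scale with fleet size.

```python
def yearly_telecom_cost(devices, msgs_per_day, fee_per_msg):
    """Total yearly transport fees when the operator charges per message."""
    return devices * msgs_per_day * 365 * fee_per_msg

def battery_life_days(capacity_mah, sleep_ma, tx_ma, tx_seconds_per_day):
    """Rough battery life: average daily current draw vs. battery capacity."""
    tx_hours = tx_seconds_per_day / 3600
    avg_ma = sleep_ma + tx_ma * tx_hours / 24  # duty-cycled transmit current
    return capacity_mah / avg_ma / 24          # capacity in hours -> days

# Hypothetical fleet: 500 devices, 24 messages/day, EUR 0.01 per message.
print(yearly_telecom_cost(500, 24, 0.01))  # roughly EUR 43,800 per year

# Hypothetical radio: 2400 mAh battery, 0.01 mA sleep current,
# 40 mA while transmitting, 48 seconds of airtime per day.
print(round(battery_life_days(2400, 0.01, 40, 48)))  # thousands of days
```

The second figure illustrates why low-power radios matter: with a transmit duty cycle this small, the sleep current dominates, and battery life stretches into years instead of hours.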
The good news is that there are new radio technologies around that can solve these two issues. Some of them are already available for use or are likely to become mainstream in 2020.
Of particular interest are the so-called LPWAN (Low-Power Wide Area Network) technologies such as LoRa, NB-IoT, and SigFox. All three combine very low power consumption with the ability to cover wide distances. This makes it possible to power devices for years rather than hours, and to increase the range from hundreds of meters to hundreds of kilometers away from the receiving gateway device (records are over 500 kilometers in rural areas with LoRa). Depending on the use-case, one could easily cover huge swaths of land with self-deployed gateway devices, something that is impossible to do over WiFi.
When the relevant area isn’t easy to cover, or the things are constantly on the move around the world, NB-IoT (NarrowBand-IoT) is also an option. Although this technology is mostly offered by telecom providers, as NB-IoT is part of the 5G standard from the 3GPP specification, the expected costs for data transport are still significantly lower than with the current M2M models. Telecom providers will keep charging for data transport, as NB-IoT uses the licensed 5G bands that they have bought or leased and need to earn back. One benefit for them is that ‘adding’ NB-IoT to a 5G network is largely a software matter that doesn’t involve a lot of additional work or costs. Most likely they will push NB-IoT as a better alternative to their existing M2M networks so that, in the long run, they can decommission the older 2G/3G networks and save on operating costs. NB-IoT can already be used in some countries through some of the large telecom providers even before their full roll-out of 5G, because it can be incorporated into upgraded 4G networks (albeit without the benefits, bandwidth, and speed of the 5G licensed bands).
Thanks to the lower transport costs and the new power-friendly wireless technologies, adding things that measure, and subsequently transporting these measurements, has become much easier and more cost-efficient for companies. More and more use-cases are becoming feasible, and the effort needed to start measuring and collecting data increasingly pays off.
What remains to be seen is how effective these new tools are at transporting data to a centralized platform. Many parties, including the Public Cloud providers, are offering solutions for such centralized environments. What these Cloud providers have in common with many other big vendors is that they all offer IoT solutions which receive MQTT (MQ Telemetry Transport) messages, so it would seem that MQTT will be the new standard protocol and main backbone for transporting data from things to data platforms in the future.
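As a sketch of what such an MQTT message might look like, a thing typically publishes a small JSON payload to a hierarchical topic. The topic structure and field names below are hypothetical illustrations, not any platform’s actual conventions; each Cloud provider defines its own.

```python
import json

def build_message(site, device_id, measurement):
    """Compose an MQTT-style topic and JSON payload for one measurement.
    Topic hierarchy and field names are illustrative, not a platform spec."""
    topic = f"factories/{site}/devices/{device_id}/telemetry"
    payload = json.dumps(measurement, separators=(",", ":"))
    return topic, payload

topic, payload = build_message("rotterdam", "sensor-042",
                               {"temperature_c": 21.4, "ts": 1577836800})
print(topic)    # factories/rotterdam/devices/sensor-042/telemetry
print(payload)  # compact JSON keeps the radio message small

# With a client library such as paho-mqtt, this would then be sent with
# something like: client.publish(topic, payload, qos=1)
```

Keeping the payload compact matters here, because LPWAN radios often limit messages to a few dozen bytes; in practice many deployments go further and use binary encodings instead of JSON.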
In the next blog post, we will dive deeper into the promise of Predictive Maintenance, as this is the holy grail for most Industry 4.0 activities and the next logical step once an IoT network is set up. More and more data arises from the use and maintenance of machines. Structural analysis of this data can help predict when a machine or a system will malfunction. This enables smart maintenance planning as well as a myriad of other A.I. solutions which can improve organizational efficiency and minimize operational risk and costs.