5 Considerations for Building a Scalable IoT System
DornerWorks
There are a lot of details and considerations that go into building an Internet of Things (IoT) infrastructure that can scale. Most people immediately think of scaling their servers, whether by moving to bigger machines or to a cluster of smaller ones, but there is much more to consider to meet increasing demand. For instance, as the number of devices increases, so does the need for a streamlined way to maintain and troubleshoot them.
Tools must be developed so that system administrators can do routine tasks with thousands of devices. Monitoring must scale when the number of devices grows, so that there’s a way to alert on and investigate impending issues, faulty hardware or even malicious attacks on your infrastructure.
And all along the way a developer is faced with a litany of small decisions that may, unbeknownst to them, make future growth easier or much harder. Below are a few ways that IoT infrastructures can be made to scale.
Serving every client, web and mobile alike, from a single shared API removes redundant code from individual solutions and makes it easier to maintain, test, and add features. Separating the API from the rest of the web application, often done through the use of Single Page Applications (SPAs), then gives system architects the ability to scale the high-load portions of the site independently.
Because the API is under load from both web app and mobile app users it will likely be the first place a load balancer and parallel servers will be required. In turn, developers must consider the many facets of a growing number of servers, such as how to federate login functionality and the use of session management over multiple servers.
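One common way to handle sessions across a pool of parallel servers is to make them stateless: instead of storing session data on whichever server handled the login, the server issues a signed token that any peer can verify with a shared secret. The sketch below, with a hypothetical secret and payload layout, shows the idea using an HMAC signature; it is an illustration of the stateless-session approach, not a production implementation.

```python
import base64
import hashlib
import hmac
import json

# Hypothetical secret distributed to every server in the pool.
SECRET = b"shared-cluster-secret"

def issue_token(user_id):
    """Sign a session payload so any server can verify it without shared state."""
    payload = base64.urlsafe_b64encode(json.dumps({"user": user_id}).encode()).decode()
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "." + sig

def verify_token(token):
    """Return the user id if the signature checks out, else None."""
    payload, _, sig = token.partition(".")
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if hmac.compare_digest(sig, expected):
        return json.loads(base64.urlsafe_b64decode(payload))["user"]
    return None
```

Because verification needs only the shared secret, a load balancer can route each request to any server in the pool, and servers can be added or removed without migrating session state.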
Additionally, there are situations where a user may wish to generate certain reports or perform some sort of computation on the data present. A typical request and wait approach to building a web application will fall short if the user’s request is resource intensive. Not only will web servers get overloaded at scale, but expecting end users to simply sit and stare at a spinner in their browser makes for a terrible User Experience (UX).
Scalable IoT infrastructures offload this data crunching and intensive work to specialized servers sometimes called web workers. The number of servers in this pool can dynamically grow, but their sole purpose in life is to be constantly doing the hard work.
To work well, they would work from a single queue of tasks that need to be done. As a new request comes in or a new set of data is stored, the computational work to be done would be loaded into the queue. As a web worker becomes available, this work would be completed and the resultant data made available. Because the work isn't completed on demand but through a queue system, it is inherently asynchronous and easily scalable.
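The queue-and-worker pattern described above can be sketched in a few lines. This minimal in-process version uses threads and a shared queue as stand-ins for a pool of worker servers and a message broker; the job names and the "expensive computation" are placeholders.

```python
import queue
import threading

# A single shared queue of pending jobs; web servers enqueue, workers drain.
tasks = queue.Queue()
results = {}

def worker():
    """Pull jobs off the shared queue until a None sentinel arrives."""
    while True:
        job = tasks.get()
        if job is None:
            break
        job_id, data = job
        results[job_id] = sum(data)  # stand-in for an expensive computation
        tasks.task_done()

# The pool scales by simply starting more workers (threads here,
# separate servers in a real deployment).
pool = [threading.Thread(target=worker) for _ in range(3)]
for t in pool:
    t.start()

# Request handlers enqueue work and return immediately.
tasks.put(("report-1", [1, 2, 3]))
tasks.put(("report-2", [10, 20]))
tasks.join()  # in production the client would poll for results instead

for _ in pool:
    tasks.put(None)
for t in pool:
    t.join()
```

The key property is that producers never wait on consumers: the web tier's only job is to enqueue, so a burst of requests grows the queue rather than overloading the servers.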
What happens when a large number of devices connect to the network simultaneously, or all happen to send data upstream at the same moment?
Servers can be overwhelmed, data can be lost and services break down. However, if communications paths are made asynchronous they can more easily scale to increasing data. This can be accomplished by using communication protocols that break up the sending and receiving processes, such as MQTT or message queues.
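The decoupling that MQTT or a message queue provides can be illustrated with a toy in-process broker: publishers append to a topic and return immediately, and consumers drain the topic at their own pace. The class below is a hypothetical stand-in for a real broker, meant only to show how the send and receive halves are broken apart.

```python
from collections import defaultdict, deque

class Broker:
    """Toy stand-in for an MQTT broker or message queue: publishers and
    subscribers never talk to each other directly."""

    def __init__(self):
        self.queues = defaultdict(deque)

    def publish(self, topic, payload):
        # The device returns immediately; delivery happens later.
        self.queues[topic].append(payload)

    def drain(self, topic):
        # A consumer processes buffered messages at its own pace.
        while self.queues[topic]:
            yield self.queues[topic].popleft()

broker = Broker()
# A burst of simultaneous device reports lands in the queue, not on a server.
broker.publish("sensors/temp", {"device": "d1", "c": 21.5})
broker.publish("sensors/temp", {"device": "d2", "c": 19.0})
readings = list(broker.drain("sensors/temp"))
```

In a real deployment the broker is a separate, durable service, so a surge of publishes buffers in the broker instead of overwhelming the consumers downstream.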
If you have built an infrastructure like this, you also know that there are times when you need more synchronous behavior. For instance, turning on an IoT end device like a lamp cannot be something that happens asynchronously “at some time in the future.” Or for a more industrial example, slow-to-execute commands sent to remote factory equipment could result in lost revenue, especially when added up over the course of months or years.
With this in mind, it may make sense to add more immediate or synchronous-like communication paths next to the asynchronous ones. They may not be truly synchronous, but they can be made to mimic this through low-latency, highly-durable channels and protocols. And the data that can quickly swell and overwhelm an infrastructure, such as telemetry data, can be kept on its own, asynchronous path.
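Splitting traffic onto those two paths can be as simple as a routing decision at the point of ingress. The sketch below assumes a hypothetical message format with a `type` field; commands go straight to a handler on the low-latency path, while bulky telemetry is queued for asynchronous processing.

```python
def route(message, command_handler, telemetry_queue):
    """Send commands down an immediate path; queue bulky telemetry.

    `command_handler` stands in for a low-latency channel (e.g. a direct
    call or a high-priority topic); `telemetry_queue` for the async path.
    """
    if message.get("type") == "command":
        command_handler(message)          # synchronous-like, low latency
    else:
        telemetry_queue.append(message)   # asynchronous bulk path

handled = []
pending = []
route({"type": "command", "op": "lamp_on"}, handled.append, pending)
route({"type": "telemetry", "c": 21.5}, handled.append, pending)
```

Keeping the two paths physically separate also means a flood of telemetry cannot delay a time-sensitive command stuck behind it in the same queue.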
Part of this process of managing a large number of devices is bringing them all online quickly and efficiently. It’s important that each device is uniquely identified and authenticated but also that they are immediately allowed to securely interact with the rest of the infrastructure before being fully provisioned – two requirements that are often at odds. With careful planning and the use of modern cryptography this is achievable but must be considered before going into full production.
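One cryptographic pattern that reconciles those two requirements is a derived per-device key: each unit's secret is computed from a fleet-wide key and the device's unique ID, so a new device can prove its identity via challenge-response before any per-device record exists on the server. The sketch below uses HMAC for key derivation and hypothetical names throughout; it illustrates the approach, not the article's specific scheme.

```python
import hashlib
import hmac
import secrets

# Hypothetical fleet-wide provisioning key, injected at the factory.
FLEET_KEY = b"factory-provisioning-key"

def derive_device_key(device_id):
    """Derive a per-device secret from the fleet key and the unique device ID."""
    return hmac.new(FLEET_KEY, device_id.encode(), hashlib.sha256).digest()

def authenticate(device_id, challenge, response):
    """Server-side check of a challenge-response from a newly seen device."""
    expected = hmac.new(derive_device_key(device_id), challenge,
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

# The device holds only its own derived key and answers a server challenge.
device_id = "unit-0042"
challenge = secrets.token_bytes(16)
response = hmac.new(derive_device_key(device_id), challenge,
                    hashlib.sha256).digest()
```

A compromised device leaks only its own key, and the server needs no pre-loaded per-device database, which is what lets onboarding scale to thousands of units.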
Pushing computation out to the devices themselves also has the natural quality that it scales with the computational burden of the infrastructure: more IoT devices means more data, but also more computational resources. Taking this into consideration early in the design of an IoT infrastructure, through the selection of appropriately sized hardware and the enablement of Over-the-Air (OTA) updates, will give developers and maintainers the capability they need to scale the operation when it's needed.
A working product is always better than the concept for a perfect product that doesn’t exist. As long as growth is manageable or you have an expert technology partner who can guide you through to the next stage, even a system’s scalability can scale.
Written by Eric Bigoness, Chief IoT Engineer at DornerWorks