Why Interdisciplinary IoT Battery Optimization Is Crucial
Tasmin Lockwood
There are millions of IoT devices in the world, which means many, many thousands of failing (or soon-to-fail) batteries. Why is it that batteries still suck despite our dependence on them?
IoT technology and the market for it are both growing at a rapid pace. The batteries powering many of these devices are failing to keep up. The industry is desperate for a simpler, easier-to-use, more portable IoT battery optimization toolchain.
IoT battery optimization is often an afterthought instead of a core component of a product’s journey from concept to consumer. Software updates can fix power inefficiencies later, but why should they have to do so? Revisiting a product post-launch is costly in both time and reputation.
Power-hungry products are a nuisance for users as well. Low-power wide-area network (LPWAN) devices, for example, are often placed in remote locations to monitor temperature and other environmental variables, and it doesn’t bode well for a manufacturer when their batteries are unreliable and must be replaced often.
An IoT device is essentially worthless if it doesn’t have power, since the purpose of most IoT devices is to generate and transmit sensor data.
For example, there’s no point in using sensors to track infrastructure health—where the data informs maintenance schedules and flags potential points of failure—if the sensor has an unreliable battery lifespan. If you can’t be sure how long the sensor will last, how can you trust it to monitor infrastructure health reliably? Then there are the added maintenance costs of deploying engineers to replace batteries.
Plus, imagine how inconvenient it would be to keep changing batteries on an IoT system that claimed to have a "10-year lifespan," as some LPWAN deployments do.
It’s understandable that battery life isn't optimized throughout the IoT stack. Legacy equipment that analyzes different elements of the build requires specialist training to interpret. Moreover, the hardware is often made immobile by how enmeshed it is in an IoT network's architecture; removing devices, replacing batteries, reassociating those devices, and then finally testing them generally costs far more than scrapping a system and replacing its hardware entirely.
To compound the problem, engineers sometimes assume that adjacent engineering teams or hardware vendors are responsible for optimization. That deflection of responsibility becomes a major problem when businesses invest so much of their "digital transformation" budgets in IoT initiatives.
A culture change is long overdue. Software and hardware developers, CTOs, and product managers must collaborate and take responsibility for optimizing their own contribution to a device.
Some developers use continuous deployment (CD) to push firmware updates to devices over the air (OTA) without needing to retrieve them physically. But that capability—and the battery optimization it enables—really should be built into IoT architectures from the start rather than bolted on later.
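To make that concrete, here's a minimal sketch of a device-side OTA check that only applies new firmware when the battery can afford it. Everything in it (the helper functions, the threshold, the manifest format) is an illustrative assumption, not any particular platform's API:

```python
# Hypothetical device-side OTA update check. The helpers below are
# placeholders for whatever your platform actually provides; they are
# not a real vendor API.

CURRENT_VERSION = "1.4.2"
MIN_BATTERY_PERCENT = 40  # don't risk an update, or a brick, on a low battery


def read_battery_percent() -> int:
    """Placeholder: read the fuel gauge or an ADC on real hardware."""
    return 82


def fetch_manifest() -> dict:
    """Placeholder: ask the CD backend which firmware version is current."""
    return {"version": "1.5.0", "url": "https://updates.example/fw/1.5.0.bin"}


def download_and_flash(url: str) -> None:
    """Placeholder: stream the image to the inactive partition, verify, reboot."""
    print(f"would download and flash {url}")


def ota_check() -> None:
    manifest = fetch_manifest()
    if manifest["version"] == CURRENT_VERSION:
        return  # nothing new; go straight back to sleep
    if read_battery_percent() < MIN_BATTERY_PERCENT:
        return  # defer the update until the battery can handle it
    download_and_flash(manifest["url"])


if __name__ == "__main__":
    # On real hardware this would run on a timer or after a deep-sleep wake-up.
    ota_check()
```

The point of baking this in from day one isn't the code itself, which is trivial, but the architecture around it: a partition scheme that tolerates a failed flash, a backend that knows which devices run which firmware, and a power budget that accounts for the update traffic.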
It’s generally hard to estimate battery life because each product can have several modes of operation. Power consumption varies dramatically between a device in sleep mode and one running at its maximum capabilities.
Lifespan is affected by three things: environment, application, and the battery itself. This includes factors such as temperature, mobility, activity, load, battery leakage, and density.
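To see how wide that spread is, here's a back-of-the-envelope duty-cycle estimate. All of the numbers (cell capacity, current draws, wake time, self-discharge) are illustrative assumptions, not measurements from any real device:

```python
# Rough battery-life estimate for a duty-cycled sensor node.
# Every figure here is an assumption chosen for illustration only.

BATTERY_CAPACITY_MAH = 2400.0    # assumed cell capacity
SLEEP_CURRENT_MA = 0.005         # 5 µA in deep sleep (assumed)
ACTIVE_CURRENT_MA = 45.0         # sampling plus radio transmit (assumed)
ACTIVE_SECONDS_PER_HOUR = 2.0    # wake, sample, transmit, sleep (assumed)
SELF_DISCHARGE_PER_YEAR = 0.02   # ~2% of capacity lost to leakage per year (assumed)

HOURS_PER_YEAR = 24 * 365

# Average load current, weighted by how long the device spends in each state.
active_fraction = ACTIVE_SECONDS_PER_HOUR / 3600.0
avg_load_ma = (ACTIVE_CURRENT_MA * active_fraction
               + SLEEP_CURRENT_MA * (1.0 - active_fraction))

# Treat self-discharge as an extra constant drain on the cell.
leakage_ma = BATTERY_CAPACITY_MAH * SELF_DISCHARGE_PER_YEAR / HOURS_PER_YEAR

years = BATTERY_CAPACITY_MAH / (avg_load_ma + leakage_ma) / HOURS_PER_YEAR

print(f"average load current: {avg_load_ma:.3f} mA")
print(f"estimated lifespan:   ~{years:.1f} years")
```

Shift just a few of those assumptions (a colder environment, a chattier application, a leakier cell) and the estimate moves by years, which is exactly why lifespan has to be measured for each product rather than read off a datasheet.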
These wide-spanning factors mean lifespan is different for every product. Because IoT systems are so versatile, different products are exposed to different environments and therefore experience different problems. This reinforces the need for a new, innovative toolchain that considers, monitors, and helps prevent all potential points of failure.
Optimization methods must be made simpler and more accessible so that everyone involved in the development process can understand readings and use dedicated tools. Otherwise, IoT adoption may stagnate as a result of failed deployments, performance issues, bad user experience, and unreliability in general.
The whole team—from designers to developers to project managers to, ultimately and most importantly, the customer—must adopt a battery-optimizing mindset. IoT battery optimization can no longer be an afterthought. Devs must work together to raise awareness among the broader team of battery optimization possibilities and challenges. They must consider how their work throughout product development and maintenance cycles affects the overall outcome of the IoT initiative over a multi-year horizon.
Although batteries may still suck right now, the industry has begun to pick up the pace of innovation. For example, Swedish startup Qoitech has created a tool called "Otii" that condenses problematic legacy equipment and simplifies hard-to-understand readings for IoT improvements, bug fixes, and performance enhancements. It's unclear if there is anything else like this right now—a tool that tries to fix battery issues from the inside out. We need to foster a rich ecosystem of such toolchains to power our IoT networks efficiently.
Developers often focus on battery optimization 'hacks' after the product is pushed to market. For any kind of progress, however, battery optimization needs to be factored into overall system performance analyses from prototyping to deployment and on through maintenance cycles. Neither design nor engineering nor deployment is individually responsible for IoT battery optimization, and yet it ultimately affects every single stakeholder.