
Interoperability Considerations and Assessments for IoT Products

Intertek

Last Updated: November 25, 2024

As IoT products connect and work together, interoperability is critical. Understanding the tests, standards and protocols that address functionality and security is important to achieving this goal.

Lack of Guidelines

There are no set regulations or guidelines that focus solely on interoperability. Instead, each manufacturer is left to evaluate its own product's functionality and safety in a connected environment. That means identifying the aspects that impact interoperability, building a test plan, conducting evaluations, analyzing the data and making any necessary adjustments. How can one achieve this?

Considerations

Start by considering the elements that may impact interoperability. For example:

  • Other devices on the network, including their software, origins and reliability
  • Access control, via the network and other devices
  • Possible disruptions to the connected ecosystem
  • Default or hard-coded credentials
  • Unclear paths for updating legacy firmware
  • Open ports (see the port-check sketch after this list)
  • Interference
  • Other devices and/or networks with known cybersecurity issues
  • Device performance within the intended environment
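As one concrete example, the open-ports item above can be checked in a few lines of Python using only the standard library. The device address and port list below are placeholders; a real assessment would use a dedicated scanner and cover UDP as well.

```python
import socket

# Hypothetical device address on the test network; replace with the
# address of the device under test.
DEVICE_IP = "192.168.1.50"

# Ports commonly exposed by IoT devices (telnet, SSH, HTTP, MQTT, etc.).
COMMON_PORTS = [21, 22, 23, 80, 443, 1883, 8080, 8883]

def scan_open_ports(host, ports, timeout=1.0):
    """Return the subset of `ports` that accept a TCP connection."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            if sock.connect_ex((host, port)) == 0:  # 0 means connected
                open_ports.append(port)
    return open_ports

if __name__ == "__main__":
    found = scan_open_ports(DEVICE_IP, COMMON_PORTS)
    print(f"Open ports on {DEVICE_IP}: {found or 'none'}")
```

Any port that shows up here and is not required by the product's feature set is a candidate for closing before interoperability testing even begins.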

Testing

Interoperability testing goes beyond placing devices in a room once. The most effective approach is to test devices at various points throughout the development cycle in a real-world setting, recreating a live environment similar to the one where the device will be used.

Start the testing process by identifying potential issues based on intended use and environment. Then identify the necessary assessments and build a test plan that includes a mixture of automated and manual testing. Plans should include objectives, resources and processes, along with a detailed workflow for completing the assessments.
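As a rough illustration, a test plan can be captured in a machine-readable form so that automated stages can be driven from it directly. The sketch below uses a plain Python dictionary; the field names and stages are illustrative, not a formal schema.

```python
# A minimal sketch of a machine-readable test plan. The objectives,
# resources and stages shown are illustrative placeholders.
test_plan = {
    "objective": "Verify device interoperability on a congested home network",
    "resources": ["device under test", "Wi-Fi router", "companion app",
                  "traffic generator", "packet capture tool"],
    "process": [
        {"step": "baseline", "type": "manual",
         "description": "Pair the device on an otherwise idle network"},
        {"step": "load", "type": "automated",
         "description": "Replay recorded traffic from 20 simulated devices"},
        {"step": "recovery", "type": "automated",
         "description": "Drop the network and verify the device reconnects"},
    ],
}

# Print the workflow so manual testers and automation share one source of truth.
for stage in test_plan["process"]:
    print(f"{stage['step']:>10} ({stage['type']}): {stage['description']}")
```

Keeping the plan in one structured document like this makes it easier to mix the automated and manual assessments without losing track of coverage.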

Evaluations

When making a test plan, consider the following evaluations:

  • Simulation/Automation Testing: Emulate real environments and usage scenarios to evaluate scale, security and reliability while accounting for other devices, traffic, interference, data loads and other concerns. This assesses a device without real boards or servers and allows specific test conditions to be set up to help identify issues and errors (a minimal sketch follows this list).
  • Usability Considerations: Account for end use and consider human interactions. Testing includes running assessments for usability in a connected environment, which helps ensure the product meets consumer expectations and needs. It considers interoperability with other products, networks and the IoT infrastructure.
  • Performance: Performance testing is straightforward: validate performance across networks in a simulated real-world environment. Tools such as JMeter, or crowdsourcing a large open alpha or beta test, can identify weak points (a simple load-test stand-in appears below).
  • Benchmark Testing: Benchmark testing evaluates a product's performance against similar products already on the market. It shows how a product compares with competitors and helps identify the changes needed to stay competitive.
  • Regression Testing: Regression testing helps make sure previously developed (and tested) software continues to perform once it is altered, interacts with other software or gains new features. It helps safeguard performance during updates, enhancements and configuration changes. It can be automated (see the pytest sketch below) and may lead to additional testing.
  • Security Evaluations: It's important to evaluate a connected product for security issues, ensuring it keeps data secure and does not infect other devices. In the U.S., ANSI/UL 2900 was adopted for cybersecurity purposes. It assesses vulnerabilities, software weaknesses and malware through risk management, testing and requirements for security risk controls in a product's architecture and design.
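For the simulation item, here is a minimal sketch in Python: it emulates a number of virtual devices reporting into a shared hub, with random delays standing in for network jitter. The device count, reading count and jitter values are arbitrary placeholders, not recommendations.

```python
import queue
import random
import threading
import time

# Shared "hub" that collects readings from all simulated devices.
hub = queue.Queue()

def virtual_device(device_id, readings=50):
    """Emulate one device sending readings with simulated network jitter."""
    for i in range(readings):
        time.sleep(random.uniform(0, 0.01))  # jitter stands in for interference
        hub.put((device_id, i))

# Run 25 virtual devices concurrently; no real boards or servers involved.
threads = [threading.Thread(target=virtual_device, args=(d,)) for d in range(25)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Verify the hub received every reading from every simulated device.
expected = 25 * 50
received = hub.qsize()
print(f"received {received}/{expected} readings")
assert received == expected, "message loss under simulated load"
```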
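As a simple stand-in for a dedicated load tool like JMeter, the next sketch fires concurrent HTTP requests at a hypothetical status endpoint on the device under test and summarizes success rate and worst-case latency. The endpoint URL, worker count and request count are all assumptions.

```python
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

# Hypothetical status endpoint on the device under test.
ENDPOINT = "http://192.168.1.50/status"

def timed_request(_):
    """Issue one request and return (success, elapsed seconds)."""
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(ENDPOINT, timeout=5) as resp:
            ok = resp.status == 200
    except OSError:  # covers URLError, HTTPError and socket timeouts
        ok = False
    return ok, time.perf_counter() - start

# Fire 100 requests from 20 concurrent workers and summarize the results.
with ThreadPoolExecutor(max_workers=20) as pool:
    results = list(pool.map(timed_request, range(100)))

successes = [latency for ok, latency in results if ok]
print(f"success rate: {len(successes)}/{len(results)}")
if successes:
    print(f"worst latency: {max(successes):.3f}s")
```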
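Finally, a minimal regression-test sketch runnable with pytest. FakeDevice is a hypothetical stand-in for a real test harness; in practice the fixture would connect to actual hardware or a simulator, and the tests would exercise previously verified behavior after each change.

```python
# test_regression.py: run with `pytest` after each firmware or software change.
import pytest

class FakeDevice:
    """Illustrative stand-in for the device under test."""
    def __init__(self):
        self.paired = True

    def send(self, command):
        # A real harness would transmit the command to the device.
        return {"get_temperature": 21.5}.get(command)

@pytest.fixture
def device():
    return FakeDevice()

def test_existing_command_still_works(device):
    # Behavior verified in an earlier release must keep working after
    # updates, new features or configuration changes.
    assert device.send("get_temperature") is not None

def test_pairing_survives_update(device):
    assert device.paired
```

Because these tests are automated, they can run on every build, catching interoperability regressions long before devices reach a live environment.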

Data

Once evaluations are complete, you must collect, review, analyze and save the relevant data. A product may require additional fine-tuning, testing and analysis, or it may be ready for the next stage of development, including production and distribution. However, even if a product is ready to launch, interoperability assessments are not over.
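Here is a minimal sketch of that collect-and-review step, assuming results are gathered as simple pass/fail records: it writes them to a CSV file for later analysis and flags whether another round of testing is needed. The rows and file name are illustrative.

```python
import csv
from datetime import datetime, timezone

# Illustrative evaluation results; a real run would gather these from
# the automated and manual assessments in the test plan.
results = [
    {"test": "port_scan", "outcome": "pass", "detail": "no unexpected ports"},
    {"test": "load_100_requests", "outcome": "fail", "detail": "3 timeouts"},
]

# Save the data so it can be reviewed and compared across test rounds.
with open("interop_results.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["timestamp", "test", "outcome", "detail"])
    writer.writeheader()
    stamp = datetime.now(timezone.utc).isoformat()
    for row in results:
        writer.writerow({"timestamp": stamp, **row})

# Flag whether another round of fine-tuning and testing is needed.
needs_rework = any(r["outcome"] == "fail" for r in results)
print("additional testing required" if needs_rework else "ready for next stage")
```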

Ongoing Considerations

Once a product is on the market, it still requires security and interoperability evaluations. Manufacturers and developers should therefore plan on issuing updates, upgrades and patches on a regular basis to address security and performance concerns. This helps protect against developments in the industry: new software platforms, emerging viruses, malware and other threats, and new technology from competitors that may disrupt existing ecosystems.
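One small piece of that ongoing work is giving the device a way to discover available updates. The sketch below polls a hypothetical update manifest and compares versions; the URL and JSON fields are assumptions, and a real implementation would verify a cryptographic signature before installing anything.

```python
import json
import urllib.request

# Hypothetical update manifest published by the manufacturer.
MANIFEST_URL = "https://updates.example.com/device-x/manifest.json"
INSTALLED_VERSION = (1, 4, 2)  # placeholder firmware version

def latest_version():
    """Fetch the manifest and return the published version as a tuple."""
    with urllib.request.urlopen(MANIFEST_URL, timeout=10) as resp:
        manifest = json.load(resp)
    return tuple(int(p) for p in manifest["version"].split("."))

if latest_version() > INSTALLED_VERSION:
    print("security/performance update available: schedule installation")
else:
    print("firmware is current")
```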

Need Help Identifying the Right IoT Solution?

Our team of experts will help you find the perfect solution for your needs!

Get Help