Building a complex IoT solution for healthcare, Part 3: best testing practices

13 Sep 2022 Oleksii Yerokhin, Dmytro Arkhanhelskyi, Kirill Zakharenkov

We’d like to bring your attention to the third article in our series dedicated to the best healthcare IoT practices. This time we focus on the peculiarities of IoT healthcare testing. The market for IoT healthcare monitoring devices is expected to nearly double between 2020 and 2025.

The healthcare IoT industry is well known for its numerous challenges. To name a few, there are strict quality demands, device connectivity standards, and data protection regulations. Failing to address these challenges may lead to disaster. For example, a failure to secure protected medical data may cost healthcare companies up to $16 million under HIPAA data protection regulations. A solid and reliable IoT system testing approach is a vital part of the strategy that protects companies from such problems.

IoT testing should prove that all components of every device in a workflow integrate with each other in a true production environment. We are happy to share some tips and insights on testing IoT healthcare systems, drawing on our extensive experience in developing complex IoT solutions for medical businesses. As a company that promotes quality and innovation, ABCloudz applies advanced testing practices and creates custom QA strategies to ensure that the IoT solutions we deliver run safely and reliably.

IoT testing services for a large healthcare project

We continue exploring our large IoT healthcare project, in which the ABCloudz team developed a system for an asthma and COPD management device. In our previous articles, we provided insights on project organization and the parallel development of software and firmware. Now we will dive deep into system testing and cover two distinct testing stages: app testing and BLE device integration testing.

App testing

Application testing is a complex process that cannot fully simulate the true end-user experience. That is because end users who discover an issue with an app rarely stop to investigate or report it. They simply drop the app and, in most cases, never get back to using it again.

That’s why the ABCloudz team approaches app testing with extreme caution and applies a range of manual and automated testing practices.

Here’s our general checklist of the aspects that should be covered when testing any healthcare app:

• Compatibility of an app across platforms, OS, and browsers
• Compatibility of in-app features
• Graphical user interface (GUI) quality
• Network compatibility
• App navigation quality
• Security vulnerabilities
• App performance
• App installation/uninstallation
• Memory usage
• Battery consumption

Our approach covered complex test cases that included many variables and factors that couldn’t be handled automatically.

App testing included the following:
• Functional testing (integration and system review)
• User acceptance tests
• Regression testing
• Post-release testing
• GUI (graphical user interface) testing

The image below illustrates our general workflow for different manual testing types:

1. In our manual testing practice, we created various test scenarios. Once the scenarios were created, a reporter prepared documentation dividing them into test cases.
2. A reviewer examined this documentation to approve the quality of scenarios and cases.

According to our quality demands, a great test case should:
– Test only one thing at a time
– Provide clear and measurable results
– Be format-consistent with other test cases
– Consider preconditions
– Be close to real user experience
– Avoid prescribing exact values (use approximate, non-specific numbers instead)

3. After the test case review, we undertook test planning.
4. When all test cycles were prepared, we proceeded with test execution and bug creation.
5. The reports with detected bugs were then sent to developers, who fixed them.
6. Finally, QA specialists examined and verified these bug fixes.
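
To show how these test case criteria translate into code, here is a purely hypothetical TestNG sketch of an automated check that verifies its preconditions, tests exactly one thing, and produces a clear, measurable result. The class, data, and assertions are ours for illustration and are not taken from the project code base.

```java
import org.testng.Assert;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Test;

// Purely illustrative example; the class, data, and checks below are hypothetical
// and not taken from the actual project code base.
public class PuffHistoryTestCase {

    private boolean userSessionActive;
    private int recordedPuffs;

    @BeforeMethod
    public void verifyPreconditions() {
        // Precondition: an active user session and an empty puff history.
        userSessionActive = true;   // stands in for a real login / setup step
        recordedPuffs = 0;
        Assert.assertTrue(userSessionActive, "Precondition failed: no active user session");
    }

    @Test
    public void newPuffEventIsAddedToHistory() {
        // The test checks exactly one thing: registering a puff adds one history entry.
        recordedPuffs++;            // stands in for the real "register puff" action

        // Clear, measurable result: the history contains exactly one entry.
        Assert.assertEquals(recordedPuffs, 1, "Puff history should contain exactly one new entry");
    }
}
```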

We also applied automated testing practices. Mobile app development cycles call for repeated testing, with the same test often running many times over. Automation can also deliver more accurate and reliable results compared to manual testing alone.

The automated approach helped us with:
• Regression testing
• GUI testing
• Unit testing
• API testing
• Load tests
• Performance testing
• Security testing
• Functional testing

Most practices on this list are rather common, and you can learn more about them in our materials on the QA process and techniques.

As for custom practices for automated IoT app testing, let’s focus on mobile device integration functional testing.

Custom framework for automated mobile device tests

While developing an IoT solution that involves a mobile app, it is vital to make sure that the app receives all signals and requests from the device. If the app doesn’t receive these requests or fails to respond to them properly, it becomes simply unusable. That’s why we need to run as many mobile device tests as possible. Handling such a number of tests manually would take too much time and effort, so we created a custom framework that automates most test scenarios. It enables us to tune test scenarios and environmental factors to the slightest detail so that all possible mobile app interactions are covered. Unsurprisingly, we applied this custom framework while testing the mobile app for the COPD and asthma management device.

Here’s a scheme that illustrates our custom framework.


We used CI/CD configurations to configure, schedule, and automate tests created with the TestNG testing framework. The tests connected to a puff device through the SocketController Java class. The puff device is our custom device that emulates IoT hardware; it was assembled from parts of other IoT and sensor devices we had worked with in previous projects. Its role was to deliver simple hardware commands to a mobile device.
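
The actual SocketController implementation is project-specific, but a minimal sketch of the idea could look like the code below. The host, port, command format, and method names are assumptions made for illustration only.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

import org.testng.Assert;
import org.testng.annotations.Test;

// Hypothetical sketch of a SocketController-style helper; the host, port,
// command format, and method names are illustrative assumptions, not the
// project's actual implementation.
public class SocketController implements AutoCloseable {

    private final Socket socket;

    public SocketController(String host, int port) throws IOException {
        socket = new Socket(host, port);
    }

    // Sends a plain-text command to the puff device emulator and returns its reply.
    public String send(String command) throws IOException {
        OutputStream out = socket.getOutputStream();
        out.write((command + "\n").getBytes(StandardCharsets.UTF_8));
        out.flush();
        BufferedReader in = new BufferedReader(
                new InputStreamReader(socket.getInputStream(), StandardCharsets.UTF_8));
        return in.readLine();
    }

    @Override
    public void close() throws IOException {
        socket.close();
    }
}

// Minimal TestNG test that drives the emulated puff device through the controller.
class PuffDeviceSmokeTest {

    @Test
    public void deviceAcknowledgesButtonPress() throws IOException {
        try (SocketController controller = new SocketController("192.168.0.42", 5000)) {
            String reply = controller.send("PRESS_BUTTON");
            Assert.assertEquals(reply, "OK", "Puff device should acknowledge the command");
        }
    }
}
```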

To establish SocketController connectivity with the puff device, we had to implement custom WiFi configurations. That’s why our system administrators and firmware developers created the WiFi API controller that quickly changed WiFi configurations according to test cases and SocketController connectivity needs.
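
We cannot show the real Wi-Fi API controller, but here is a rough, hedged sketch of how a test could request a network profile switch before talking to the puff device. The endpoint, payload, and class name are assumptions.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Hypothetical sketch of asking the Wi-Fi API controller to switch configurations
// before connecting to the puff device. The URL, JSON payload, and endpoint name
// are assumptions for illustration only.
public class WifiConfigClient {

    private final HttpClient http = HttpClient.newHttpClient();
    private final String baseUrl;

    public WifiConfigClient(String baseUrl) {
        this.baseUrl = baseUrl;
    }

    // Requests the network profile required by a particular test case.
    public boolean applyProfile(String profileName) throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/wifi/profile"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(
                        "{\"profile\":\"" + profileName + "\"}"))
                .build();
        HttpResponse<String> response =
                http.send(request, HttpResponse.BodyHandlers.ofString());
        return response.statusCode() == 200;
    }
}
```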

Once the puff device received a test command, it transferred it to the mobile (iOS or Android) device through the Web API service, as if someone had pushed the hardware button. To produce an output, the mobile device sent a request to the database, while the Web API served as an intermediary data exchange service between the two. Finally, the Web API connected to the Appium driver, which returned information on the mobile device’s response to the TestNG environment. Once TestNG received this information, it recorded it in the form of test reports.
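
On the verification side, the Appium driver checks how the mobile app reacted to the emulated hardware command. A minimal sketch of such a check, assuming an Android device, a local Appium server, and an illustrative UI element id, might look like this:

```java
import java.net.URL;

import io.appium.java_client.android.AndroidDriver;
import org.openqa.selenium.By;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.remote.DesiredCapabilities;
import org.testng.Assert;
import org.testng.annotations.Test;

// Hypothetical sketch of the verification step on the mobile side. The Appium
// server URL, capabilities, and UI element id are illustrative assumptions.
public class PuffEventUiTest {

    @Test
    public void puffEventAppearsInTheApp() throws Exception {
        DesiredCapabilities caps = new DesiredCapabilities();
        caps.setCapability("platformName", "Android");
        caps.setCapability("appium:automationName", "UiAutomator2");
        caps.setCapability("appium:deviceName", "test-phone");

        AndroidDriver driver = new AndroidDriver(new URL("http://127.0.0.1:4723/wd/hub"), caps);
        try {
            // After the emulated hardware button press, the app should display a new event.
            WebElement lastEvent = driver.findElement(By.id("last_puff_event"));
            Assert.assertTrue(lastEvent.isDisplayed(), "New puff event should be visible in the app");
        } finally {
            driver.quit();
        }
    }
}
```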

These tests could be launched at any time, so our automated testing specialists received data on mobile device responses in record time. This custom testing practice enabled us to test mobile device functionality up to 50% faster than traditional practices.

BLE device integration testing

The next must-have task is testing the device’s BLE connectivity with the software. Even a minor BLE connection issue can interrupt the work of the entire system. Therefore, it is vital to establish a way to test device connectivity in a broad range of situations.

The main focus of this stage is creating a custom framework that runs automated tests and is designed specifically to interact with a particular device. While working on the COPD and asthma management device, we created a Java framework that connects seamlessly with test drivers and runs hundreds of tests that allow us to examine the device’s BLE connectivity.

Check out our BLE device integration testing principle on the scheme below:

• We used a Java test framework that runs hundreds of automated device connectivity tests.
• The framework connected over a COM port to a dev kit with custom firmware, which served as a bridge to the device.
• The dev kit transferred the test commands to our device, which reacted to them and produced outputs (functional responses).
• The outputs were recorded and transferred back to the test framework through the dev kit, where we could review the results of all the tests.
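
As an illustration of this COM-port bridge, here is a hedged sketch of what the host-side serial communication could look like, using the open-source jSerialComm library. The port name, baud rate, and command/response format are our assumptions, not the project’s actual protocol.

```java
import com.fazecast.jSerialComm.SerialPort;
import java.nio.charset.StandardCharsets;

// Hypothetical sketch of the COM-port bridge to the dev kit, built with the
// jSerialComm library. Port name, baud rate, and message format are assumptions.
public class DevKitBridge {

    private final SerialPort port;

    public DevKitBridge(String portName) {
        port = SerialPort.getCommPort(portName);       // e.g. "COM3" on Windows
        port.setBaudRate(115200);
        port.setComPortTimeouts(SerialPort.TIMEOUT_READ_BLOCKING, 2000, 0);
    }

    // Sends one BLE test command through the dev kit and returns the device response.
    public String runTest(String command) {
        if (!port.openPort()) {
            throw new IllegalStateException("Unable to open " + port.getSystemPortName());
        }
        try {
            byte[] request = (command + "\r\n").getBytes(StandardCharsets.US_ASCII);
            port.writeBytes(request, request.length);

            byte[] response = new byte[256];
            int read = port.readBytes(response, response.length);
            return read > 0 ? new String(response, 0, read, StandardCharsets.US_ASCII).trim() : "";
        } finally {
            port.closePort();
        }
    }
}
```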

By applying this custom practice, we eliminated the risk of a failed BLE connection between the device and the software. Moreover, because the technique works against the development kit, we were able to use it long before we received the actual hardware prototype from the client.

Conclusions

We have covered the most useful insights on the custom practices the ABCloudz team applied to deliver an excellent COPD and asthma management system. Some hardware and firmware testing practices were handled on the client’s side with a specially manufactured device that used needles to emulate user interactions with the device.

In this article, we primarily focused on our custom practices that stand out from the common QA routine. These are the solutions that help customers to save time by running multiple test scenarios quickly. These practices are only available to customers who have our custom devices in their possession.

Currently, we are working on additional IoT testing practices that may help us provide our clients with extra testing features and run additional test scenarios. We will definitely share more information about these innovations in the future, so make sure to subscribe to our blog!

As always, please contact us if you need a complex IoT solution delivered in the shortest time and with the highest quality!

Ready to start the conversation?