Our client is a healthcare testing and biotechnology company that quickly grew from a small antigen development lab into a large producer of healthcare tests distributed across the U.S. The client developed a new custom device for some of the most rapid COVID-19 tests. It includes a pipette for collecting blood and a small test cassette into which the blood sample is placed. Fifteen minutes after the patient’s blood is placed into the cassette, pink or red stripes become visible on its surface. These stripes allow the patient to determine their disease status.

However, the customer needed an efficient mechanism for decoding these stripes and integrating the device into a digital network. The customer decided to go with a mobile app that scans test results, interprets them, and delivers the results to the user in record time.

The ABCloudz team possesses the required expertise in developing secure healthcare solutions and a clear understanding of the client’s unique test logic, so we were selected as the technology partner for building this app. The most significant challenge was building custom test interpretation algorithms tailored to the client’s tests. In addition, we had to develop an image recognition algorithm capable of scanning tests despite issues like device shaking or poor illumination.

6 fast steps for test decoding

The ABCloudz team of specialists immediately swung into action to produce the most efficient test decoding process.

As a result, the entire process was divided into 6 consecutive stages, each solving a corresponding test decoding challenge:

1. Mitigating the camera shake
2. Illumination correction
3. Correcting phone tilt
4. Defining the test area
5. Translating the test into a color range understandable to a machine
6. Decoding the test results

Here’s a more detailed description of each step.

1. Mitigating the camera shake

Typically, an image recognition app may have trouble processing test results if the smartphone camera shakes. To mitigate the impact of shaking hands or anything else that prevents the camera from focusing on the test, the ABCloudz team implemented an accelerometer-based algorithm. Its calculations continuously determine and update the phone’s position along the three coordinate axes. The app uses these measurements to estimate the error caused by camera shake and significantly reduce or eliminate it. This custom algorithm solves a frequent problem that spoils the quality of image recognition apps.
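
The exact calculations behind the client’s stabilization logic are proprietary, but the underlying idea can be illustrated with a minimal C++ sketch: keep a short rolling window of accelerometer readings for the three axes and allow a scan only while the motion variance stays below a threshold. The window size, threshold value, and the AccelSample structure below are illustrative assumptions, not the production implementation.

```cpp
#include <cmath>
#include <deque>

// Hypothetical accelerometer reading along the three coordinate axes.
struct AccelSample { double x, y, z; };

// Keeps a short rolling window of readings and reports whether the phone
// is steady enough to capture the test cassette.
class ShakeGate {
public:
    explicit ShakeGate(std::size_t window = 30, double threshold = 0.05)
        : window_(window), threshold_(threshold) {}

    void push(const AccelSample& s) {
        samples_.push_back(s);
        if (samples_.size() > window_) samples_.pop_front();
    }

    // True when recent motion variance is below the threshold on every axis,
    // i.e. the shake-induced error is small enough to scan.
    bool isSteady() const {
        if (samples_.size() < window_) return false;
        return variance([](const AccelSample& s) { return s.x; }) < threshold_ &&
               variance([](const AccelSample& s) { return s.y; }) < threshold_ &&
               variance([](const AccelSample& s) { return s.z; }) < threshold_;
    }

private:
    template <typename Axis>
    double variance(Axis axis) const {
        double mean = 0.0;
        for (const auto& s : samples_) mean += axis(s);
        mean /= samples_.size();
        double var = 0.0;
        for (const auto& s : samples_) var += std::pow(axis(s) - mean, 2);
        return var / samples_.size();
    }

    std::deque<AccelSample> samples_;
    std::size_t window_;
    double threshold_;
};
```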

2. Illumination correction

The ABCloudz experts built an illumination correction algorithm to make sure the app decodes test results properly even in poor lighting. Before processing the test, the app checks the on-screen image for the number of black pixels. If that number is within the normal range, the app allows the test result to be scanned. If the number of black pixels is higher than allowed, the app prompts the user to turn on the flash for better illumination. If the number is extremely high, the app asks the user to try scanning the test in a better-lit place.
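
A minimal sketch of this three-way check using OpenCV in C++ is shown below. The specific darkness value and the two ratio thresholds are illustrative assumptions; the app’s real cut-offs are the client’s own.

```cpp
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>

enum class LightingAdvice { Ok, TurnOnFlash, MoveToBrighterPlace };

// Counts near-black pixels in the camera frame and maps their share to the
// three cases described above. Thresholds are illustrative assumptions.
LightingAdvice checkIllumination(const cv::Mat& frameBgr,
                                 int darkValue = 40,
                                 double flashRatio = 0.25,
                                 double tooDarkRatio = 0.60) {
    cv::Mat gray;
    cv::cvtColor(frameBgr, gray, cv::COLOR_BGR2GRAY);

    // Pixels darker than darkValue are treated as "black".
    cv::Mat darkMask;
    cv::threshold(gray, darkMask, darkValue, 255, cv::THRESH_BINARY_INV);
    double darkShare = static_cast<double>(cv::countNonZero(darkMask)) / gray.total();

    if (darkShare < flashRatio)   return LightingAdvice::Ok;           // scan allowed
    if (darkShare < tooDarkRatio) return LightingAdvice::TurnOnFlash;  // ask for flash
    return LightingAdvice::MoveToBrighterPlace;                        // too dark overall
}
```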

3. Correcting phone tilt

Our specialists integrated the OpenCV C++ library to ensure the most efficient test decoding. It provides algorithms for comparing a reference QR code sample with the QR code captured by the device. The app processes and compares the two images to determine the tilt of the device, estimates the error caused by this tilt, and accounts for it when processing the image of the scanned area with maximum accuracy. This solution can be integrated into any other image recognition app to make image decoding more reliable regardless of the conditions.
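
One common way to express this comparison with OpenCV is to detect the QR code’s corners in the captured frame, map them onto the corners of an ideal, front-facing QR square, and warp the frame with the resulting perspective transform. The sketch below illustrates that approach; the reference side length and the use of cv::QRCodeDetector are assumptions about how the step could be built, not a description of the client’s exact code.

```cpp
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <opencv2/objdetect.hpp>
#include <vector>

// Estimates the perspective distortion caused by phone tilt by comparing the
// QR code found in the frame with an ideal, untilted QR square, then warps
// the frame so the cassette appears as if photographed straight on.
// The reference size (qrSide) is an illustrative assumption.
bool correctTilt(const cv::Mat& frameBgr, cv::Mat& rectified, float qrSide = 200.0f) {
    cv::QRCodeDetector detector;
    std::vector<cv::Point2f> corners;
    if (!detector.detect(frameBgr, corners) || corners.size() != 4)
        return false;  // no QR code found, tilt cannot be estimated

    // Corners of the "default" (untilted) QR code sample.
    std::vector<cv::Point2f> reference = {
        {0.0f, 0.0f}, {qrSide, 0.0f}, {qrSide, qrSide}, {0.0f, qrSide}};

    // Perspective transform from the tilted view back to the frontal view.
    cv::Mat h = cv::getPerspectiveTransform(corners, reference);
    cv::warpPerspective(frameBgr, rectified, h, frameBgr.size());
    return true;
}
```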

4. Defining the test area

The ABCloudz experts applied OpenCV algorithms so that once the user points the camera at the test area of the cassette, the test results are captured. The key is to keep the test area visible to the camera, so our specialists introduced a screen frame as a user interface component. Once the user takes a photo of the test area, the app’s ColorMap algorithm converts its main colors, gray with red stripes, into a green-blue color range that is easier to decode. The app’s algorithms then compare different parts of the resulting color array by their RGB values to identify the stripes that show the test results.
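
As a rough illustration of this step, the C++ sketch below crops the image to the on-screen frame and remaps it with one of OpenCV’s built-in color maps. The ROI rectangle and the choice of COLORMAP_OCEAN are assumptions made for the example; the client’s actual mapping from gray-and-red to a green-blue range is their own.

```cpp
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>

// Crops the rectified frame to the cassette's test window and remaps its
// gray-and-red colors into a green-blue range, making the result stripes
// easier to separate from the background.
cv::Mat extractTestArea(const cv::Mat& rectifiedBgr, const cv::Rect& screenFrame) {
    // Keep only the part of the image inside the UI frame the user aligned.
    cv::Mat roi = rectifiedBgr(screenFrame).clone();

    // Collapse to intensity, then remap into a green-blue palette.
    cv::Mat gray, mapped;
    cv::cvtColor(roi, gray, cv::COLOR_BGR2GRAY);
    cv::applyColorMap(gray, mapped, cv::COLORMAP_OCEAN);
    return mapped;
}
```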

5. Translating the test into a color range understandable to a machine

Decoding the stripes may be challenging if the colors are rendered poorly. To solve this, the ABCloudz team built an algorithm that compares the color of the control test stripe with the colors of the surrounding stripes, which lets the app determine the level of contrast between them. A contrast below 10% corresponds to a high level of brightness, a contrast of 10–30% to medium brightness, and a contrast above 30% to a low level of brightness. The app uses this classification to eliminate shadows on the image and ensure the highest possible definition. As a result, the image of the test area becomes perfectly readable for the app, which provides for the most accurate image recognition possible.
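
The classification itself can be sketched in a few lines of C++ with OpenCV. How the stripe and background regions are located is outside this snippet; the two rectangles are assumed to come from the previous steps, and the relative-contrast formula is an illustrative interpretation of the thresholds described above.

```cpp
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <algorithm>
#include <cmath>

enum class Brightness { High, Medium, Low };

// Compares the control stripe with its surroundings and maps the contrast to
// the bands described above: below 10% -> high, 10-30% -> medium, above 30% -> low.
Brightness classifyBrightness(const cv::Mat& testAreaBgr,
                              const cv::Rect& controlStripe,
                              const cv::Rect& surrounding) {
    cv::Mat gray;
    cv::cvtColor(testAreaBgr, gray, cv::COLOR_BGR2GRAY);

    double stripeMean     = cv::mean(gray(controlStripe))[0];
    double backgroundMean = cv::mean(gray(surrounding))[0];

    // Relative difference between the control stripe and its neighbourhood.
    double contrast = std::abs(stripeMean - backgroundMean) /
                      std::max(backgroundMean, 1.0);

    if (contrast < 0.10)  return Brightness::High;
    if (contrast <= 0.30) return Brightness::Medium;
    return Brightness::Low;
}
```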

6. Decoding test results

Finally, the ABCloudz team of experts built a software algorithm that fully understands the test logic. Once the app provides a high-definition image of the stripes, the algorithm instantly decodes it. This decoding tells whether the tested patient is ill and even determines the stage of the disease.
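
The client’s interpretation rules are proprietary, but the general shape of this last step for a lateral-flow cassette can be sketched as follows: the control stripe must be visible for the test to be valid, and the presence of the test stripe marks a positive result. The StripeReading structure and the intensity cut-off below are illustrative assumptions only hinting at how stripe intensity might feed into grading the stage of the disease.

```cpp
#include <string>

enum class TestResult { Invalid, Negative, Positive };

// Illustrative reading produced by the earlier image-processing steps.
struct StripeReading {
    bool controlVisible;   // control stripe detected
    bool testVisible;      // test stripe detected
    double testIntensity;  // 0.0-1.0, relative intensity of the test stripe
};

TestResult interpret(const StripeReading& r) {
    if (!r.controlVisible) return TestResult::Invalid;  // scan must be repeated
    return r.testVisible ? TestResult::Positive : TestResult::Negative;
}

std::string describe(const StripeReading& r) {
    switch (interpret(r)) {
        case TestResult::Invalid:  return "Invalid test, please rescan";
        case TestResult::Negative: return "Negative";
        case TestResult::Positive:
            return r.testIntensity < 0.3 ? "Positive (faint line)" : "Positive";
    }
    return "Unknown";
}
```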

Rapid, groundbreaking results

Thanks to our efforts, the client received an efficient image recognition app for precise and rapid decoding of COVID-19 test results.

One of the most remarkable things about the app is that most of the stages described above are processed instantly, so the user receives precise COVID-19 test results in less than 15 minutes. The app also saves all test results in a secure, HIPAA- and GDPR-compliant database, which enables users, whether patients or doctors, to track the recovery process.

Get successful image recognition innovations

ABCloudz has all the expertise required to build highly efficient image recognition apps. With our custom development solutions, your software will capture and decode text or images regardless of illumination, definition, tilt, or any other factors. Contact us to start building your own image recognition app now. Our engineers have solutions for any client and are ready to deliver state-of-the-art image recognition quality.

Ready to start the conversation?