While wearables are at the forefront of medical applications for mobile technology, COVID-19 has sent researchers looking at older technologies like the humble smartphone for remote diagnosis.
Microsoft Research has been working on telehealth applications via a smartphone for over a decade, but the current pandemic presents new opportunities that make the approach more relevant than it was even a year ago.
Telehealth has held promise in healthcare since the advent of video, but smartphones, better cameras, and artificial intelligence might produce a more workable answer at a time when it's difficult for people to visit medical facilities due to the pandemic.
As the Centers for Disease Control and Prevention (CDC) noted in June, telehealth can help provide patients with care while minimizing the risk of patients passing on SARS-CoV-2 — the virus that causes COVID-19 — to doctors and nurses.
Microsoft points out that atrial fibrillation (AFib) is a tell-tale sign of heart disease. If it's detected early enough, doctors can advise more patients on how to reduce the risk of having a stroke.
Using a smartphone that can do the analysis on the device has the potential to knock down existing barriers to physiological sensing, which otherwise requires a patient to visit a medical center for measurements from scarce medical equipment.
With the aid of artificial intelligence, smartphones can address the mismatch between access to medical hardware and the need for continual assessments of a person’s heart.
As Microsoft explains, the smartphone can play a role in the "measurement of very subtle changes in the appearance of the body across time", which are often imperceptible even to a doctor's eye. This would allow a medical professional to acquire key physiological information about a person directly from that person's smartphone.
How it works: when ambient light in a room hits your body, some is absorbed and some is reflected. A smartphone camera can pick up this reflected light, and the changes in pixel intensities over time can be used to infer the underlying physiological signals, like changes in pulse and respiration.
“Using optical models grounded in our knowledge of these physiological processes, a video of a person can be processed to determine their pulse rate, respiration, and even the concentration of oxygen in their blood,” Microsoft said.
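To make the optical principle concrete, here is a minimal, hypothetical Python sketch of how a pulse rate could be recovered from such pixel-intensity changes: average the green-channel intensity of a skin region over time, then find the dominant frequency in the heart-rate band. This is an illustration of the general idea only, not the team's actual pipeline; the frame data, crop, and frame rate are assumed inputs.

```python
# Illustrative sketch only: a simple remote-photoplethysmography (rPPG) style
# pulse estimate from video frames, not the method described in the paper.
# Assumes frames are RGB numpy arrays of a cropped skin region at a known frame rate.
import numpy as np

def estimate_pulse_bpm(frames, fps):
    """Estimate pulse rate (beats per minute) from a sequence of RGB frames."""
    # Average the green channel over each frame; blood volume changes
    # modulate reflected light most strongly at green wavelengths.
    signal = np.array([frame[:, :, 1].mean() for frame in frames], dtype=np.float64)

    # Remove the slow-varying baseline (lighting drift, motion) with a moving average.
    signal = signal - np.convolve(signal, np.ones(31) / 31, mode="same")

    # Find the dominant frequency within a plausible heart-rate band (0.7-4 Hz).
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0  # convert Hz to beats per minute

# Example with synthetic frames pulsing at 1.2 Hz (~72 bpm), sampled at 30 fps
fps = 30
t = np.arange(0, 20, 1.0 / fps)
fake_frames = [np.full((8, 8, 3), 128 + 5 * np.sin(2 * np.pi * 1.2 * ti), dtype=np.uint8)
               for ti in t]
print(round(estimate_pulse_bpm(fake_frames, fps)))  # roughly 72
```

In practice, the research goes well beyond this kind of simple frequency analysis, which is why the team turned to neural networks.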
To achieve this, Microsoft Research, the University of Washington, and OctoML worked together to create a video-based optical cardiopulmonary vital sign measurement system that can run entirely on a smartphone.
At the heart of the system is a convolutional attention network, a neural network design that's adept at processing images. The network, called MTTS-CAN, enables real-time cardiopulmonary measurements on smartphones with "state-of-the-art accuracy".
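For readers wondering what "convolutional attention" looks like in practice, the hypothetical PyTorch sketch below shows the general idea: a 1x1 convolution produces a spatial mask that reweights the feature map, so the network concentrates on the regions of the frame (such as skin) that actually carry the physiological signal. It illustrates the concept only and is not the published MTTS-CAN architecture; the layer sizes are arbitrary placeholders.

```python
# Illustrative sketch of a convolutional attention mask, loosely in the spirit
# of attention-gated convolutional networks; not the actual MTTS-CAN code.
import torch
import torch.nn as nn

class ConvAttentionBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.features = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        # 1x1 convolution producing a single-channel spatial attention mask
        self.attention = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, x):
        feats = torch.tanh(self.features(x))
        # Sigmoid mask highlights informative regions; normalize so the
        # average mask value per frame stays constant.
        mask = torch.sigmoid(self.attention(feats))
        mask = mask / (mask.mean(dim=(2, 3), keepdim=True) + 1e-8)
        return feats * mask

# Example: a batch of 4 frames, 32 feature channels, 36x36 resolution
block = ConvAttentionBlock(channels=32)
out = block(torch.randn(4, 32, 36, 36))
print(out.shape)  # torch.Size([4, 32, 36, 36])
```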
Microsoft and its partners detail the key challenge for telehealth in its current state, and in light of the additional pressures from COVID-19, in a new paper.
"Despite the longstanding promise of telehealth, it is difficult to provide a similar level of care on a video call as during an in-person visit. The physician can diagnose a patient based on visual observations and self-reported symptoms; however, in most cases they cannot objectively assess the patient's physiological state," the researchers state.
"Ubiquitous sensing could help transform how telehealth is conducted, and it could also contribute to establishing telehealth as a mainstream form of healthcare," they added.