The same devices used to take selfies are being repurposed and commercialized for quick access to information needed for monitoring patient health. A fingertip pressed against a phone’s camera lens can measure a heart rate. The microphone, kept by the bedside, can screen for sleep apnea.

In the best of this new world, the data is conveyed remotely to a medical professional for the convenience and comfort of the patient — all without the need for costly hardware.

But using smartphones as diagnostic tools remains a work in progress. Although doctors and their patients have found some real-world success, experts said their overall potential remains unfulfilled and uncertain.

Smartphones come packed with sensors capable of monitoring a patient’s vital signs. They can help assess people for concussions, watch for atrial fibrillation and conduct mental health wellness checks, to name a few of the nascent applications.

Eager companies and researchers are tapping into phones’ built-in cameras and light sensors; microphones; accelerometers, which detect body movements; gyroscopes; and even speakers. The apps then use artificial intelligence software to analyze the collected sights and sounds, creating an easy connection between patients and physicians. In 2021, more than 350,000 digital health products were available in app stores, according to a Grand View Research report.

“It’s very hard to put devices into the patient home or in the hospital, but everybody is just walking around with a cellphone that has a network connection,” said Andrew Gostine, a physician and CEO of the sensor network company Artisight. Most Americans own a smartphone, including more than 60 percent of people 65 and over, according to the Pew Research Center. The pandemic has also made people more comfortable with virtual care.

The makers of some of these products have sought clearance from the Food and Drug Administration to market them as medical devices. Others have been designated as exempt from the regulatory process, placed in the same clinical classification as a Band-Aid. But the agency’s approach to AI- and machine learning-based medical devices is still being adjusted to reflect software’s adaptive nature.

Ensuring accuracy and clinical validation is crucial to securing buy-in from health-care providers. And many tools still need fine-tuning, said Eugene Yang, a clinical professor of medicine at the University of Washington.

Judging these new technologies is difficult because they rely on algorithms built by machine learning and artificial intelligence to collect data, rather than the physical tools typically used in hospitals. So researchers cannot “compare apples to apples” with medical industry standards, Yang said. Failure to build in such assurances can undermine the technology’s goals of easing costs and access because a doctor still must verify results, he added.

Big tech companies such as Google have heavily invested in the area, catering to clinicians and in-home caregivers, as well as consumers. Currently, Google Fit app users can check their heart rate by placing their finger on the rear-facing camera lens or track their breathing rate using the front-facing camera.

Google’s research uses machine learning and computer vision, a field within AI based on information from visual inputs such as videos or images. So instead of using a blood pressure cuff, for example, the algorithm can interpret slight visual changes to the body that serve as proxies and biosignals for blood pressure, said Shwetak Patel, director of health technologies at Google and a professor of electrical and computer engineering at the University of Washington.
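The article doesn’t disclose Google’s actual algorithms, but the fingertip-on-camera heart rate check described earlier rests on a well-known idea: blood flow causes tiny periodic brightness changes in the skin, and counting those oscillations yields beats per minute. The sketch below is a deliberately simplified illustration of that principle, assuming the camera frames have already been reduced to a list of average brightness values; it is not Google Fit’s implementation.

```python
import math

def estimate_heart_rate(brightness, fps):
    """Rough pulse estimate from per-frame average brightness values."""
    # Remove the steady (DC) component so the pulse oscillation sits around zero.
    mean = sum(brightness) / len(brightness)
    signal = [b - mean for b in brightness]
    # Each upward zero-crossing marks one cardiac cycle.
    cycles = sum(1 for a, b in zip(signal, signal[1:]) if a < 0 <= b)
    duration_s = len(brightness) / fps
    return cycles / duration_s * 60  # beats per minute

# Synthetic stand-in for real camera data: a 1.2 Hz (72 bpm) brightness
# oscillation sampled at 30 frames per second for 10 seconds.
fps = 30
frames = [100 + 2 * math.sin(2 * math.pi * 1.2 * i / fps + 0.3)
          for i in range(300)]
print(round(estimate_heart_rate(frames, fps)))  # → 72
```

Real photoplethysmography pipelines add filtering and motion rejection; this toy version shows only why a camera can stand in for a dedicated pulse sensor.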

Google is also investigating the effectiveness of its smartphone’s built-in microphone for detecting heartbeats and murmurs and using the camera to preserve eyesight by screening for diabetic eye disease, according to information the company published in 2022.

The tech giant recently bought Sound Life Sciences, a Seattle start-up with an FDA-cleared sonar technology app. It uses a smart device’s speaker to bounce inaudible pulses off a patient’s body to identify movement and monitor breathing.

Binah.ai, based in Israel, is also using the smartphone camera to calculate vital signs. Its software studies the region around the eyes and analyzes the light reflecting off blood vessels back to the lens, company spokesperson Mona Popilian-Yona said.
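The sonar approach comes down to basic echo timing: a pulse leaves the speaker, reflects off the chest, and returns to the microphone, so the round-trip delay reveals distance, and small changes in that distance over time trace breathing. The snippet below illustrates only that timing relationship; the actual app’s signal processing is far more involved.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly room temperature

def echo_distance(delay_s):
    """Distance to a reflecting surface from a round-trip echo delay."""
    # The pulse travels to the chest and back, so halve the total path.
    return SPEED_OF_SOUND * delay_s / 2

# An echo arriving 3.5 ms after emission implies a surface ~0.6 m away;
# millimeter-scale changes in that figure over time indicate breathing.
print(round(echo_distance(0.0035), 2))  # → 0.6
```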

Applications even reach into disciplines such as optometry and mental health:

  • With the microphone, Canary Speech uses the same underlying technology as Amazon’s Alexa to analyze patients’ voices for mental health conditions. The software can integrate with telemedicine appointments and allow clinicians to screen for anxiety and depression using a library of vocal biomarkers and predictive analytics, said Henry O’Connell, the company’s CEO.
  • Australia-based ResApp Health got FDA clearance in 2022 for an iPhone app that screens for moderate to severe obstructive sleep apnea by listening to breathing and snoring. SleepCheckRx, which will require a prescription, is minimally invasive compared with sleep studies now used to diagnose sleep apnea.
  • Brightlamp’s Reflex app is a clinical decision support tool for helping manage concussions and vision rehabilitation, among other things. Using an iPad’s or iPhone’s camera, the mobile app measures how a person’s pupils react to changes in light. Through machine learning analysis, the imagery gives practitioners data points for evaluating patients. Brightlamp sells directly to health-care providers and is being used in more than 230 clinics. Clinicians pay a $400 standard annual fee per account, which is not covered by insurance. The Defense Department has an ongoing clinical trial using Reflex.
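The pupil measurement Reflex performs can be pictured with two standard pupillary-light-reflex metrics: how much the pupil constricts after a light stimulus, and how quickly it begins to respond. The sketch below computes both from a series of pupil-diameter measurements; the data, the 10 percent response threshold and the function itself are illustrative assumptions, not Brightlamp’s method.

```python
def pupil_metrics(diameters_mm, fps, light_on_frame):
    """Constriction amplitude (%) and response latency (s) from a
    pupil-diameter trace sampled at a fixed frame rate."""
    # Baseline: average diameter before the light stimulus.
    baseline = sum(diameters_mm[:light_on_frame]) / light_on_frame
    trough = min(diameters_mm[light_on_frame:])
    amplitude_pct = (baseline - trough) / baseline * 100
    # Latency: first frame after onset where the diameter falls more
    # than 10% below baseline.
    latency_s = None
    for i in range(light_on_frame, len(diameters_mm)):
        if diameters_mm[i] < baseline * 0.9:
            latency_s = (i - light_on_frame) / fps
            break
    return amplitude_pct, latency_s

# Synthetic trace at 60 fps: a 4 mm pupil constricting to 3 mm
# after the light comes on at frame 30.
trace = [4.0] * 30 + [4.0 - 0.1 * k for k in range(1, 11)] + [3.0] * 20
amp, lat = pupil_metrics(trace, fps=60, light_on_frame=30)
```

Here the pupil constricts 25 percent from baseline; clinicians compare such numbers against normative ranges when evaluating concussion.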

In some cases, such as with the Reflex app, data is processed directly on the phone — rather than in the cloud, Brightlamp CEO Kurtis Sluss said. By processing everything on the device, the app avoids running into privacy issues, as streaming data elsewhere requires patient consent.

But algorithms need to be trained and tested by collecting reams of data, and that is an ongoing process.

Researchers, for example, have found that some computer vision applications, including some for heart rate and blood pressure monitoring, can be less accurate for darker skin. Studies are underway to find better solutions.

“We’re not there yet,” Yang said. “That’s the bottom line.”

This article was produced by Kaiser Health News, a program of the Kaiser Family Foundation, an endowed nonprofit organization that provides information on health issues to the nation.
