In our eyes, Google’s software sees heart attack risk


By looking at the human eye, Google’s algorithms were able to predict whether someone had high blood pressure or was at risk of a heart attack or stroke, Google researchers said recently, opening a new opportunity for artificial intelligence in the vast and lucrative global health industry.

The algorithms didn’t outperform existing medical approaches such as blood tests, according to the study, published in the journal Nature Biomedical Engineering. The work needs to be validated and repeated on more people before it gains broader acceptance, several outside physicians said.

But the new approach could build on doctors’ current abilities by providing a tool that people could one day use to quickly and easily screen themselves for health risks that can contribute to heart disease, the leading cause of death worldwide.

“This may be a rapid way for people to screen for risk,” Harlan Krumholz, a cardiologist at Yale University who was not involved in the study, wrote in an email. “Diagnosis is about to get turbo-charged by technology. And one avenue is to empower people with rapid ways to get useful information about their health.”

Google researchers fed images scanned from the retinas of more than 280,000 patients across the United States and United Kingdom into Google’s intricate pattern-recognizing algorithms, known as neural networks. Those scans helped train the networks on which telltale signs tended to indicate long-term health dangers.

Medical professionals today can look for similar signs by using a device to inspect the retina, drawing the patient’s blood or assessing risk factors such as their age, gender, weight and whether they smoke. But no one taught the algorithms what to look for: Instead, the systems taught themselves, by reviewing enough data to learn the patterns often found in the eyes of people at risk.
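The "taught themselves" idea can be sketched in a few lines: given only inputs and outcome labels, a model fits its own decision rule, and no one hand-codes which features matter. The toy below (a plain logistic regression on made-up data, not Google's pipeline) plants signal in two of five features and lets training discover them.

```python
# Toy sketch of learning from labeled examples (illustrative data only).
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))                    # 200 hypothetical patients, 5 features
true_w = np.array([2.0, 0.0, -1.5, 0.0, 0.0])    # only features 0 and 2 carry signal
y = (X @ true_w + rng.normal(scale=0.5, size=200) > 0).astype(float)

w = np.zeros(5)                                  # logistic regression, gradient descent
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w)))               # predicted probabilities
    w -= 0.1 * X.T @ (p - y) / len(y)            # step against the average gradient

# The trained weights concentrate on the informative features, even though
# the training loop was never told which ones those were.
print(w.round(2))
```

The same logic, scaled up from five numeric features to raw retinal pixels and from logistic regression to deep neural networks, is what lets the system discover its own telltale signs.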

The true power of this kind of technological solution is that it could flag risk with a fast, cheap and noninvasive test that could be administered in a range of settings, letting people know if they should come in for follow-up.

The research, one of an increasing number of conceptual health-technology studies, was conducted by Google and Verily Life Sciences, a subsidiary of Google’s parent Alphabet.

The idea that people’s eyes might reveal signs of underlying cardiovascular disease isn’t as outlandish as it might seem. Diabetes and high blood pressure, for example, can cause changes in the retina.

Krumholz cautioned that an eye scan isn’t ready to replace more conventional approaches. Maulik Majmudar, associate director of the Healthcare Transformation Lab at Massachusetts General Hospital, called the model “impressive” but noted that the results show how tough it is to make significant improvements in cardiovascular risk prediction. Age and gender are powerful predictors of risk, without the need for any additional testing.

Google’s algorithms approached the accuracy of current methods but were far from perfect. When presented with retinal images from two different people, one who suffered a major adverse cardiac event such as a heart attack or stroke within five years of the photo and one who did not, the algorithms correctly picked the patient who fell ill 70 percent of the time.

Similar deep-learning technologies have exploded in the past five years and are widely used today in systems such as Google’s image search and Facebook’s facial recognition. They are also showing promise in other arenas of health, including by looking for signs of cancer in the X-ray scans reviewed by radiologists.

The Google researchers used similar machine-learning methods in 2016 to look for diabetic retinopathy, an eye disease that is a major cause of blindness. This time, they also used a machine-learning technique known as “soft attention” to help pinpoint which parts of the image were most instrumental in driving the algorithms’ prediction. One vulnerability of many neural networks today is that it’s often unclear how or why they reach a given conclusion, a “black box” problem that could undermine doctors’ or patients’ trust in the results.
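The core of soft attention is simple to sketch: each region of the image gets a relevance score, a softmax turns those scores into weights, and the same weights that pool the features for the prediction double as a saliency map showing which regions mattered. The example below is a generic illustration with random numbers, not Google's model:

```python
# Minimal soft-attention sketch over image patches (illustrative values only).
import numpy as np

rng = np.random.default_rng(0)
patch_features = rng.normal(size=(16, 8))   # 16 image patches, 8-dim features each
relevance = rng.normal(size=8)              # hypothetical learned scoring vector

scores = patch_features @ relevance         # one relevance score per patch
weights = np.exp(scores - scores.max())
weights /= weights.sum()                    # softmax: nonnegative, sums to 1

pooled = weights @ patch_features           # attention-weighted summary for prediction
print(int(weights.argmax()))                # index of the most influential patch
```

Because the weights are part of the forward pass rather than a post-hoc explanation, reading them off costs nothing extra, which is what makes the technique attractive for peeking inside the black box.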

The idea that the hallmarks of disease could be detected through computational analysis has been alluring to engineers. DeepMind, the London-based AI-development firm bought by Google in 2014 that often operates autonomously, released research earlier this month showing similar algorithms could help detect signs of glaucoma and other eye diseases.

Apple late last year launched a heart study tied to its Apple Watch to see if it could detect and alert people to irregular heart rhythms that could be a sign of atrial fibrillation, a leading cause of stroke.


