In our eyes, Google’s software sees heart attack risk

By looking at the human eye, Google’s algorithms were able to predict whether someone had high blood pressure or was at risk of a heart attack or stroke, Google researchers said recently, opening a new opportunity for artificial intelligence in the vast and lucrative global health industry.

The algorithms didn’t outperform existing medical approaches such as blood tests, according to the study describing the findings, published in the journal Nature Biomedical Engineering. The work needs to be validated and repeated on more people before it gains broader acceptance, several outside physicians said.

But the new approach could build on doctors’ current abilities by providing a tool that people could one day use to quickly and easily screen themselves for health risks that can contribute to heart disease, the leading cause of death worldwide.

“This may be a rapid way for people to screen for risk,” Harlan Krumholz, a cardiologist at Yale University who was not involved in the study, wrote in an email. “Diagnosis is about to get turbo-charged by technology. And one avenue is to empower people with rapid ways to get useful information about their health.”

Google researchers fed images scanned from the retinas of more than 280,000 patients across the United States and United Kingdom into the company’s intricate pattern-recognizing algorithms, known as neural networks. Those scans helped train the networks on which telltale signs tended to indicate long-term health dangers.

Medical professionals today can look for similar signs by using a device to inspect the retina, drawing the patient’s blood or assessing risk factors such as their age, gender, weight and whether they smoke. But no one taught the algorithms what to look for: Instead, the systems taught themselves, by reviewing enough data to learn the patterns often found in the eyes of people at risk.

The true power of this kind of technological solution is that it could flag risk with a fast, cheap and noninvasive test that could be administered in a range of settings, letting people know if they should come in for follow-up.

The research, one of an increasing number of conceptual health-technology studies, was conducted by Google and Verily Life Sciences, a subsidiary of Google’s parent Alphabet.

The idea that people’s eyes might reveal signs of underlying cardiovascular disease isn’t as outlandish as it might seem. Diabetes and high blood pressure, for example, can cause changes in the retina.

Krumholz cautioned that an eye scan isn’t ready to replace more conventional approaches. Maulik Majmudar, associate director of the Healthcare Transformation Lab at Massachusetts General Hospital, called the model “impressive” but noted that the results show how tough it is to make significant improvements in cardiovascular risk prediction. Age and gender are powerful predictors of risk, without the need for any additional testing.

Google’s algorithms approached the accuracy of current methods but were far from perfect. When presented with retinal images from two different people, one who suffered a major adverse cardiac event such as a heart attack or stroke within five years of the photo and one who did not, the algorithms correctly picked the patient who fell ill 70 percent of the time.
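That 70-percent figure describes a pairwise ranking test, often reported in medical statistics as the c-statistic: out of all pairs in which one patient had an event and the other did not, how often does the model assign the higher risk score to the patient who fell ill? A minimal sketch of that computation, with made-up risk scores and outcomes (the function name and data are illustrative, not from the study):

```python
def pairwise_accuracy(scores, outcomes):
    """Fraction of (event, non-event) patient pairs in which the model
    ranks the patient who had the event higher -- the c-statistic.
    Ties count as half credit, a common convention."""
    positives = [s for s, o in zip(scores, outcomes) if o]       # had an event
    negatives = [s for s, o in zip(scores, outcomes) if not o]   # did not
    pairs = [(p, n) for p in positives for n in negatives]
    correct = sum(1.0 for p, n in pairs if p > n)
    correct += sum(0.5 for p, n in pairs if p == n)
    return correct / len(pairs)

# Four hypothetical patients: risk scores and whether each had an event.
print(pairwise_accuracy([0.9, 0.8, 0.3, 0.2], [1, 0, 1, 0]))  # 0.75
```

A model that guessed randomly would score about 0.5 on this test, so 0.7 is meaningfully better than chance, while still short of a reliable diagnostic.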

Similar deep-learning technologies have exploded in the past five years and are widely used today in systems such as Google’s image search and Facebook’s facial recognition. They are also showing promise in other arenas of health, including by looking for signs of cancer in the X-ray scans reviewed by radiologists.

The Google researchers used similar machine-learning methods in 2016 to look for diabetic retinopathy, an eye disease that is a major cause of blindness. This time, they also used a machine-learning technique, known as “soft attention,” to help pinpoint which parts of the image were most instrumental in driving the algorithms’ prediction. One vulnerability of many neural networks today is that it’s often unclear how or why they reach their conclusions - a “black box” problem that could undermine doctors’ or patients’ trust in the results.
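The core idea behind soft attention is to have the network assign each image region a relevance score, convert those scores into weights that sum to one (a softmax), and pool the regions by those weights; the weights themselves then serve as a heat map of which regions drove the prediction. A minimal sketch of the mechanism, not the study’s actual model; the inputs are hypothetical scalar features per region:

```python
import math

def soft_attention(region_features, region_scores):
    """Pool image regions by a softmax over learned relevance scores.
    Returns the pooled value and the attention weights, which form
    an interpretable map of where the model 'looked'."""
    m = max(region_scores)                           # subtract max for numerical stability
    exps = [math.exp(s - m) for s in region_scores]
    total = sum(exps)
    weights = [e / total for e in exps]              # non-negative, sum to 1
    pooled = sum(w * f for w, f in zip(weights, region_features))
    return pooled, weights

# Three hypothetical regions; equal scores give equal attention.
pooled, weights = soft_attention([1.0, 2.0, 3.0], [0.0, 0.0, 0.0])
```

Because the weights are an explicit part of the computation, they can be overlaid on the retinal image to show which anatomy influenced the output, which is what makes the technique a partial answer to the “black box” problem.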

The idea that the hallmarks of disease could be detected through computational analysis has long been alluring to engineers. DeepMind, the London-based AI-development firm that Google bought in 2014 and that operates largely autonomously, released research earlier this month showing that similar algorithms could help detect signs of glaucoma and other eye diseases.

Apple late last year launched a heart study tied to its Apple Watch to see if it could detect and alert people to irregular heart rhythms that could be a sign of atrial fibrillation, a leading cause of stroke.
