A Key Detail in Your Retina Could Indicate How Healthy Your Brain Is

Alzheimer’s is an insidious brain disease marked by a slow mental decline that can develop unnoticed for decades before symptoms arise – but hidden signs of the condition might be detectable long before then.

New research suggests that thinning of a person’s retina – the light-sensitive tissue that lines the back of the eye – in middle age is linked to their cognitive performance in both childhood and adulthood.

While much more research is needed, the team behind this new study says the findings might one day pave the way towards a simple eye test that could help predict a person’s risk for conditions such as Alzheimer’s disease, the most common form of dementia.

“Given we haven’t been able to treat advanced Alzheimer’s, and that the global prevalence of the disease is increasing, being able to identify people in the preclinical stage, when we may still have the chance to intervene, is really important,” says health researcher Ashleigh Barrett-Young from the University of Otago, New Zealand.

People with Alzheimer’s often live with visual impairments that might contribute to mental confusion, disorientation, and social withdrawal – all symptoms that, along with memory loss, disrupt the daily lives of millions of people living with the disease worldwide.

This is not the first time, though, that scientists have suggested the eyes could be a window into the brain. Over a decade ago, researchers found amyloid-beta proteins, the hallmark of Alzheimer’s, in the retinas of people with the disease, and subsequent eye imaging studies revealed Alzheimer’s patients had thinner retinas, too.  

A 2018 study also found strong links between Alzheimer’s disease and three common eye conditions, including glaucoma and macular degeneration.

While such observed associations are intriguing, the risk factors for Alzheimer’s are many and varied, so for now, any links between Alzheimer’s and eye health are still under intense investigation.

In the new study, researchers crunched data from the long-running Dunedin Study, which has followed more than 1,000 people born at a single New Zealand hospital in the early 1970s since birth.

Five decades later, Barrett-Young and colleagues selected for their analysis a subgroup of 865 participants who had undergone eye scans at the age of 45, along with a battery of neuropsychological tests in early childhood and adulthood, as part of the Dunedin Study.

The thickness of two parts of the retina – the retinal nerve fiber layer and the ganglion cell layer – was measured on the scans.

Analysis showed that participants with thinner retinal layers scored lower on cognitive performance tests, both as adults and back when they were children.

However, no association was found between retinal thinning and an overall decline in cognitive performance between childhood and middle age – the kind of change that might indicate something is afoot in the brain.

Thinner retinal nerve fiber layers at 45 were linked to a decline in processing speed since childhood, but that might simply be a sign of general aging rather than anything specific to Alzheimer’s disease.

“The findings suggest that [retinal thickness] could be an indicator of overall brain health,” says Barrett-Young, who led the study.

Whether an eye test for predicting a disease as complex and insidious as Alzheimer’s will ever be possible remains unknown. However, a number of previous studies – including some involving people with dementia, not just healthy adults – have suggested retinal thinning may precede cognitive decline and dementia diagnoses.

But this is a relatively new field, and results have been mixed. More research is needed to tease apart the order of events: whether retinal thinning actually precedes Alzheimer’s onset, whether the changes are secondary symptoms of the disease, or whether they simply reflect aging or other lifestyle factors. All are possibilities.

Despite the uncertainties, researchers clearly think investigating retinal thinning as a biomarker of cognitive change is worth pursuing, given what’s been found so far and the growing burden of Alzheimer’s disease.

Since routine eye tests are far less expensive than the brain imaging scans commonly used to investigate brain health, they could offer a cost-effective way to monitor for changes over time – if future studies stack up.

“In the future,” says Barrett-Young, “these findings could result in [artificial intelligence] being used to take a typical optical coherence tomography scan, done at an optometrist, and combine it with other health data to determine your likely risk for developing Alzheimer’s.”

The study was published in JAMA Ophthalmology.