How do you intuitively know that you can walk on a footpath and swim in a lake? Researchers from the University of Amsterdam have discovered unique brain activations that reflect how we can move our bodies through an environment. The study not only sheds new light on how the human brain works, but also shows where artificial intelligence is lagging behind. According to the researchers, AI could become more sustainable and human-friendly if it incorporated this knowledge about the human brain.

When we see a picture of an unfamiliar environment -- a mountain path, a busy street, or a river -- we immediately know how we could move around in it: walk, cycle, swim or not go any further. That sounds simple, but how does your brain actually determine these action opportunities?

PhD student Clemens Bartnik and a team of co-authors show how we make estimates of possible actions thanks to unique brain patterns. The team, led by computational neuroscientist Iris Groen, also compared this human ability with a large number of AI models, including ChatGPT. "AI models turned out to be less good at this and still have a lot to learn from the efficient human brain," Groen concludes.

Viewing images in the MRI scanner

Using an MRI scanner, the team investigated what happens in the brain when people look at various photos of indoor and outdoor environments. The participants used a button to indicate whether the image invited them to walk, cycle, drive, swim, boat or climb. At the same time, their brain activity was measured.

"We wanted to know: when you look at a scene, do you mainly see what is there -- such as objects or colors -- or do you also automatically see what you can do with it," says Groen. "Psychologists call the latter 'affordances' -- opportunities for action. Imagine a staircase that you can climb, or an open field that you can run through."

Unique processes in the brain

The team discovered that certain areas in the visual cortex become active in a way that cannot be explained by the visible objects in the image. "What we saw was unique," says Groen. "These brain areas not only represent what can be seen, but also what you can do with it." The brain did this even when participants were not given an explicit action instruction. "These action possibilities are therefore processed automatically," says Groen. "Even if you do not consciously think about what you can do in an environment, your brain still registers it."

The research thus demonstrates for the first time that affordances are not only a psychological concept, but also a measurable property of our brains.

What AI doesn't understand yet

The team also compared how well AI algorithms -- such as image recognition models or GPT-4 -- can estimate what you can do in a given environment. They were worse at predicting possible actions. "When trained specifically for action recognition, they could somewhat approximate human judgments, but the human brain patterns didn't match the models' internal calculations," Groen explains.

"Even the best AI models don't give exactly the same answers as humans, even though it's such a simple task for us," Groen says. "This shows that our way of seeing is deeply intertwined with how we interact with the world. We connect our perception to our experience in a physical world. AI models can't do that because they only exist in a computer."

AI can still learn from the human brain

The research thus touches on larger questions about the development of reliable and efficient AI. "As more sectors -- from healthcare to robotics -- use AI, it is becoming important that machines not only recognize what something is, but also understand what it can do," Groen explains. "For example, a robot that has to find its way in a disaster area, or a self-driving car that can distinguish a bike path from a driveway."

Groen also points out the sustainable aspect of AI. "Current AI training methods use a huge amount of energy and are often only accessible to large tech companies. More knowledge about how our brain works, and how the human brain processes certain information very quickly and efficiently, can help make AI smarter, more economical and more human-friendly."

Read more: Affordances in the brain: The human superpower AI hasn't mastered

Insulin resistance detected by the routine triglyceride-glucose (TyG) index can flag people with early Alzheimer's who are four times more likely to show rapid cognitive decline, according to new research presented at the European Academy of Neurology (EAN) Congress 2025 [1].

Neurologists at the University of Brescia reviewed records for 315 non-diabetic patients with cognitive deficits, including 200 with biologically confirmed Alzheimer's disease. All subjects underwent an assessment of insulin resistance using the TyG index, with three years of clinical follow-up. When patients were stratified by TyG index, those in the highest tertile of the mild cognitive impairment AD subgroup deteriorated far more quickly than their lower-TyG peers, losing more than 2.5 points on the Mini-Mental State Examination per year (hazard ratio 4.08, 95% CI 1.06-15.73). No such link appeared in the non-AD cohort.

"Once mild cognitive impairment is diagnosed, families always ask how fast it will progress," said lead investigator Dr. Bianca Gumina. "Our data show that a simple metabolic marker available in every hospital laboratory can help identify more vulnerable subjects who may be suitable candidates for targeted therapy or specific intervention strategies."

While insulin resistance has been linked to the onset of Alzheimer's disease, its role in how quickly the condition progresses has received less attention. This study aimed to fill that gap by focusing on its impact during the prodromal mild cognitive impairment (MCI) stage, when patients follow highly variable trajectories. The researchers used the TyG index, which offers a low-cost, routinely available surrogate for insulin resistance, to explore whether metabolic dysfunction could help predict the pace of cognitive decline after diagnosis.
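The TyG index described above follows a standard published definition: the natural logarithm of fasting triglycerides (mg/dL) times fasting glucose (mg/dL), divided by two. A minimal Python sketch of that calculation is below; note that the tertile cutoffs the Brescia team used to define "highest third" are not given in the article, so no risk threshold is assumed here.

```python
from math import log

def tyg_index(triglycerides_mg_dl: float, glucose_mg_dl: float) -> float:
    """Triglyceride-glucose (TyG) index.

    Standard definition: ln(fasting triglycerides [mg/dL]
    x fasting glucose [mg/dL] / 2).
    """
    return log(triglycerides_mg_dl * glucose_mg_dl / 2)

# Illustrative values: fasting triglycerides 150 mg/dL, fasting glucose 100 mg/dL
print(round(tyg_index(150, 100), 2))  # prints 8.92
```

Both inputs come from an ordinary fasting lipid panel and glucose test, which is why the article describes the index as a low-cost, routinely available surrogate for insulin resistance.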

In AD specifically, insulin resistance is believed to impair neuronal glucose uptake, promote amyloid accumulation, disrupt the blood-brain barrier, and fuel inflammation -- pathways that are less relevant or differently regulated in other neurodegenerative diseases.

"We were surprised to see the effect only in the Alzheimer's spectrum and not in other neurodegenerative diseases," Dr. Gumina noted. "It suggests a disease-specific vulnerability to metabolic stress during the prodromal window, when interventions may still change the trajectory."

The researchers at the University of Brescia, led by Professor Padovani and Professor Pilotto, found that high TyG was also associated with blood-brain barrier disruption and cardiovascular risk factors, yet it showed no interaction with the APOE ε4 genotype, indicating that metabolic and genetic risks may act through distinct pathways [2].

Identifying high-TyG patients could refine enrolment for anti-amyloid or anti-tau trials and prompt earlier lifestyle or pharmacological measures to improve insulin sensitivity. The researchers are currently investigating whether TyG levels also track with neuroimaging biomarkers to aid earlier detection and stratification.

"If targeting metabolism can delay progression, we will have a readily modifiable target that works alongside emerging disease-modifying drugs," concluded Dr. Gumina.

References:

  1. Gumina B., Galli A., Tolassi C. et al. The Triglyceride-Glucose Index as Predictor of Cognitive Decline in Alzheimer's Spectrum Disorders. Presented at the European Academy of Neurology (EAN) Congress 2025; 23 June 2025; Helsinki, Finland.
  2. Padovani A., Galli A., Bazzoli E., et al. (2025). The role of insulin resistance and APOE genotype on blood-brain barrier integrity in Alzheimer's disease. Alzheimer's & Dementia. Advance online publication. https://doi.org/10.1002/alz.14556
Read more: The common blood test that predicts how fast Alzheimer's hits
