A new understanding of how the human brain controls the hands

Understanding how the brain controls certain actions, such as picking up a knife in the correct way, is important for many reasons. One of these is the development of brain-computer interfaces that may help people with artificial limbs control them using their minds.

Yet how the human brain controls our hands to correctly grasp 3D objects, such as tools, is not well understood. In a recent study, my colleagues and I wanted to find out whether we could use signals from specific parts of the brain to distinguish whether people were handling tools correctly—for example grasping a knife by the handle rather than the blade.

Now, we've published the first investigation into whether the human brain automatically processes 3D objects in terms of how we grasp them for use. As we will see, our new discovery could have important implications for health and society, too.

Tool use, such as using a knife, is considered a defining feature of our species. Its emergence is believed to be a critical step in the evolution of primates, and some even take it to mark the appearance of Homo sapiens. The unique human ability to design, manufacture and use tools is unsurpassed across the animal kingdom.

Most studies that investigate the brain mechanisms behind tool use record brain activity while people view images of tools or hands, not while people perform actual movements with tools. Recording brain activity during tool use is challenging, because the space inside a magnetic resonance imaging (MRI) scanner is small and participants need to stay very still.

But perceiving an image is quite different to acting on a 3D object. Even though you can recognize tools in images, you'd never try to grasp or use a picture of one.

Seeing images of tools activates different regions of the brain from those activated when we see other kinds of objects, such as chairs. Until now, it was assumed this was an evolutionary trait that developed to optimize the processing of the hand actions associated with tools.

We used an MRI scanner to collect brain imaging data while participants interacted with 3D objects. Using a technique called functional MRI (fMRI), we measured brain activity by extracting the pattern of blood flow changes in certain brain regions.
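For readers curious about what "extracting a pattern" looks like in practice, here is a minimal sketch using the open-source nilearn library. The file names and the region mask are hypothetical placeholders for illustration, not our actual data.

```python
# Sketch: extracting fMRI activity patterns from one brain region.
# "func_run1.nii.gz" (a 4D fMRI time series) and "hand_area_mask.nii.gz"
# (a binary region-of-interest mask) are hypothetical placeholder files.
from nilearn.maskers import NiftiMasker

# The masker keeps only voxels inside the region of interest and
# standardizes each voxel's signal over time.
masker = NiftiMasker(mask_img="hand_area_mask.nii.gz", standardize=True)

# Result: a (time points x voxels) array of blood-flow-related signals,
# i.e. one spatial pattern of activity per scanning time point.
patterns = masker.fit_transform("func_run1.nii.gz")
print(patterns.shape)
```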

To allow real hand actions to be studied in the confined space of the MRI scanner, we used a one-of-a-kind set-up for presenting real 3D tools and other objects. Our participants lay in the dark on a custom-built bed with a revolving table mounted above their waist, so we could show them 3D objects and they could grasp them.

Plastic tools

We designed and 3D-printed everyday kitchen tools, such as a plastic knife, a pizza cutter and a spoon, as well as a set of 3D-printed bars that served as non-tool control objects.

We scanned the brains of 20 volunteers at the Norfolk and Norwich University Hospital in two separate sessions. In the first session, participants were asked to grasp the 3D tools and other 3D objects correctly or incorrectly. The same participants returned to the scanner for a second session in which they simply looked at pictures of tools, hands and other objects (no hand movements were performed).

We studied brain activity while participants viewed pictures of tools and hands to identify the brain areas that respond to images of hands.

We then used machine learning to decode brain activity, testing whether we could predict from it whether people had grasped a tool by its handle or not. This matters because knowing not to grasp an object such as a knife by its blade is critical to successful tool use.
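To give a flavour of this decoding step, below is a minimal sketch of the kind of cross-validated analysis involved, using the scikit-learn library. The activity patterns and grasp labels are random stand-ins for illustration, not our actual dataset.

```python
# Sketch: decoding grasp type (handle vs blade) from brain activity patterns.
# X is a hypothetical (trials x voxels) array of patterns from one region;
# y is 1 if the tool was grasped by the handle, 0 if grasped incorrectly.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.standard_normal((80, 500))   # stand-in for 80 trials x 500 voxels
y = rng.integers(0, 2, size=80)      # stand-in grasp labels per trial

# A linear classifier is trained and tested on independent splits of trials;
# accuracy reliably above 50% would mean the region's activity carries
# information about how the object was grasped.
scores = cross_val_score(LinearSVC(), X, y, cv=5)
print(f"Mean decoding accuracy: {scores.mean():.2f}")
```

The key design choice in analyses like this is cross-validation: the classifier is always tested on trials it has never seen, so above-chance accuracy cannot simply reflect memorization.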

In contrast to what most scientists might have expected, we could predict whether a tool was grasped correctly from signals in the brain areas that respond to pictures of hands, but not from the visual areas that respond to pictures of tools.

Importantly, the signals from the areas of the brain that process images of hands could only be used to predict hand actions with tools but could not predict matched actions with the control 3D bar objects. This suggests the visual hand areas are specially attuned for actions with tools.

Hand control

This discovery changes our fundamental understanding of how the brain controls our hands.

The emergence of hand-held tools marked the beginning of a major separation between humans and our closest primate relatives. Our findings could help us understand the brain regions that specifically evolved in the human lineage.

In the future, we could use the activity of visual hand areas to improve brain-driven interfaces and prosthetics, or to predict the recovery of people with brain injury. Our research could even help people without limbs to control prosthetic limbs with their minds.

I'm fascinated by how the brain controls the body and interacts with the world. In the future, I hope to expand these findings by further studying the mechanisms involved in using 3D objects—in both healthy individuals and people with neurological conditions.

Provided by The Conversation

This article is republished from The Conversation under a Creative Commons license. Read the original article.
