
Researchers begin to hone cybernetics by mapping grip planning in the brain.

For designers of brain-computer interfaces, whose goal is to translate neural patterns into commands for a prosthetic device, it may be important to know that the emerging plan to execute a ‘power grip’, for example, may look different when the object is a hammer versus a soda can. Because the information is distributed across many neurons in a local network, many of these object-action representations can coexist.

Many groups have looked at encoding of different grips and different hand positions. Typically what’s studied is the relationship between a single object and a grip associated with it. What had not been done before is to investigate how the brain can formulate different grips on the same object or the same grip on different objects.

Now, a study from researchers at Brown University has found that neurons in the area of the brain responsible for planning grasping motions retain information about the object to be gripped as they formulate the movement plan. The collective neural activity therefore looks different when the same grip is executed on one object versus another, which may help the brain generate unique patterns when similar actions are performed in different contexts. The findings show that the brain has many ways to formulate a grip command, each influenced by what is being gripped and each producing a distinct pattern of neural activity.

The team state that researchers now have a firmer grasp on the way the brain formulates commands for the hand to grip an object. They go on to add that the advance could lead to improvements in future brain-computer interfaces that provide people with severe paralysis a means to control robotic arms and hands using their thoughts.  The study is published in the Journal of Neuroscience.

The current study recorded and analyzed neural activity in the ventral premotor cortex of three trained rhesus macaques as they participated in a series of grip tasks. Over a span of about five seconds, the researchers would present one of two different objects, show a red or yellow light to signal which of two grips to use, and then flash a green light to signal that the grip should begin. From the analysis, the researchers were able to observe how the patterns of neural activity changed at each stage of each task.

The team used an analysis technique they developed, called SSIMS, which can accurately detect patterns of activity in collections of neurons without relying on any assumptions about what the brain is trying to do. The patterns can be distinguished by differences in their activity and clustered by their similarities, without the researchers imposing their own interpretation of events in the task.
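To give a feel for this kind of label-free analysis, here is a minimal sketch in Python. It is not the authors' SSIMS pipeline (which works on spike-train similarity rather than simple firing rates): it simulates population spike counts for four hypothetical object-grip conditions and groups the trials by similarity alone, with plain k-means standing in for the clustering step. All numbers and condition names are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated stand-in data: mean firing rates for 40 neurons under four
# hypothetical object-grip conditions (not recordings from the study).
n_neurons, trials_per_condition = 40, 25
conditions = ["obj1-grip1", "obj1-grip2", "obj2-grip1", "obj2-grip2"]
mean_rates = {c: rng.uniform(2, 20, n_neurons) for c in conditions}

trials, labels = [], []
for c in conditions:
    for _ in range(trials_per_condition):
        trials.append(rng.poisson(mean_rates[c]))  # per-trial spike counts
        labels.append(c)
X = np.array(trials, dtype=float)

def kmeans(X, k, iters=25):
    # Farthest-point initialisation, then standard Lloyd updates.
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(1) for c in centers], axis=0)
        centers.append(X[d.argmax()])
    centers = np.array(centers)
    for _ in range(iters):
        assign = ((X[:, None] - centers[None]) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if (assign == j).any():
                centers[j] = X[assign == j].mean(0)
    return assign

assign = kmeans(X, 4)

# If the four conditions really produce distinct population patterns, each
# cluster should be dominated by a single condition (purity close to 1).
purity = sum(
    max(sum(1 for i in range(len(labels))
            if assign[i] == j and labels[i] == c) for c in conditions)
    for j in range(4)
) / len(labels)
print(f"cluster purity: {purity:.2f}")
```

The point of the sketch is the workflow, not the algorithm: the trial labels are never shown to the clustering step, yet the grouping that emerges can be checked against them afterwards, which mirrors the assumption-free spirit the researchers describe.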

The findings show that neurons in the ventral premotor cortex follow patterns that differentiate objects and actions. The neurons began to show distinct, identifiable patterns of activity as soon as the object was presented, before the animal knew how it was supposed to grasp it. By the time grips were actually made, the patterns had become so distinct that all four object-grip combinations could be identified with about 95 percent accuracy. This allowed the researchers to subdivide the neural activity patterns into groups corresponding to the basic grips and objects.
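A decoding accuracy like the one reported is typically measured by training a classifier on some trials and testing it on held-out ones. The sketch below shows the idea with a simple nearest-centroid decoder on simulated data; the study's actual decoder, recordings, and accuracy are not reproduced here, and the condition names are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated stand-in data for four hypothetical object-grip combinations.
n_neurons = 40
combos = ["obj1-grip1", "obj1-grip2", "obj2-grip1", "obj2-grip2"]
mean_rates = {c: rng.uniform(2, 20, n_neurons) for c in combos}

def draw_trials(n):
    X, y = [], []
    for c in combos:
        for _ in range(n):
            X.append(rng.poisson(mean_rates[c]))  # noisy spike counts
            y.append(c)
    return np.array(X, dtype=float), y

X_train, y_train = draw_trials(30)
X_test, y_test = draw_trials(20)

# One centroid per object-grip combination, from training trials only.
centroids = {c: X_train[[i for i, l in enumerate(y_train) if l == c]].mean(0)
             for c in combos}

# Classify each held-out trial by its nearest centroid.
pred = [min(combos, key=lambda c: np.linalg.norm(x - centroids[c]))
        for x in X_test]
accuracy = float(np.mean([p == t for p, t in zip(pred, y_test)]))
print(f"decoding accuracy: {accuracy:.2f}")
```

When the four combinations produce well-separated population patterns, even this simple decoder identifies held-out trials with high accuracy, which is the sense in which distinct patterns translate into identifiable object-grip combinations.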

The results of the study demonstrate that objects have a significant effect on the evolution of the grip plan. That the brain can produce a variety of activity patterns and still arrive at an appropriate grip plan suggests that it is flexible enough to handle a wide variety of object contexts and can do so with a local network of neurons.

The lab state that the plan to grip an object evolves well in advance of actual execution. Early interpretation of grip planning, including accounting for the distinctive form that plans take in the context of different objects, could allow a brain-computer interface decoder to deliver a motion command to a prosthesis more quickly and accurately, informed by what is to be gripped.

For the future, the team are continuing experiments to determine how well the findings generalize to a wider variety of objects, and how much the structure of the experiments, and the training, affects the neural patterns. The researchers conclude that they are optimistic the findings could ultimately have direct application to improving brain-computer interface design and performance for patients with severe paralysis.

Source:  Brown University


A trajectory visualization shows how neural patterns associated with planning grips of different objects converged and diverged as the experimental task proceeded. Credit: Donoghue Lab/Brown University.





