Researchers hone brain-computer interfaces by mapping how grip plans vary in the brain.


For designers of brain-computer interfaces, whose goal is to translate neural patterns into commands for a prosthetic device, it may be important to know that the emerging plan to execute a ‘power grip’, for example, may look different when the object is a hammer versus a soda can. Because the information is distributed across many neurons in a local network, many of these object-action representations can coexist.

Many groups have studied the encoding of different grips and hand positions, typically examining the relationship between a single object and the grip associated with it. What had not been done before was to investigate how the brain formulates different grips on the same object, or the same grip on different objects.

Now, a study from researchers at Brown University has found that neurons in the area of the brain responsible for planning grasping motions retain information about the object to be gripped as they make their movement plan. The collective neural activity therefore looks different when the same grip is executed on one object versus another, which may help the brain generate distinct activity patterns when similar actions are performed in different contexts. The findings show that the brain has many ways to formulate a grip command, each influenced by the object to be gripped and each producing a distinct pattern of neural activity.

The team state that researchers now have a firmer grasp on the way the brain formulates commands for the hand to grip an object. They add that the advance could lead to improvements in future brain-computer interfaces that give people with severe paralysis a means to control robotic arms and hands using their thoughts. The study is published in the Journal of Neuroscience.

The current study recorded and analyzed neural activity in the ventral premotor cortex of three trained rhesus macaques as they performed a series of grip tasks. Over a span of about five seconds, the researchers would present one of two different objects, then show a red or yellow light to signal which of two different grips to use, and finally flash a green light to signal that the grip should begin. The researchers could then observe how the patterns of neural activity changed at each stage of each task.

The team used an analysis technique they developed, called SSIMS, which can accurately detect patterns of activity in collections of neurons without relying on any assumptions about what the brain is trying to do. Activity patterns are distinguished by their differences and clustered by their similarities, without the researchers imposing their own interpretation of the events in the task.
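The article does not give the details of SSIMS, but the core idea it describes, grouping trials purely by the similarity of their population activity, with no condition labels supplied, can be sketched in a few lines. The simulated spike counts, cluster rule, and all numbers below are illustrative assumptions, not the study's algorithm or data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in data: 40 trials of binned spike counts from 20 neurons,
# drawn from two hidden conditions (stand-ins for two object-grip
# contexts). All parameters are invented for illustration.
base = rng.uniform(2, 6, size=20)
shifted = base + rng.uniform(2, 5, size=20)
trials = np.vstack([rng.poisson(base, size=(20, 20)),
                    rng.poisson(shifted, size=(20, 20))]).astype(float)

# Step 1: pairwise dissimilarity between trials. No condition labels are
# used -- the analysis sees only how alike the population responses are.
diff = trials[:, None, :] - trials[None, :, :]
dist = np.sqrt((diff ** 2).sum(axis=2))

# Step 2: let groups emerge from similarity alone: seed with the two most
# dissimilar trials, then attach each trial to its nearer seed.
i, j = np.unravel_index(np.argmax(dist), dist.shape)
labels = (dist[j] < dist[i]).astype(int)

within = dist[labels[:, None] == labels[None, :]].mean()
between = dist[labels[:, None] != labels[None, :]].mean()
print(f"mean within-cluster distance:  {within:.1f}")
print(f"mean between-cluster distance: {between:.1f}")
```

Because the two hidden conditions drive the population activity apart, the trials separate into groups without anyone telling the analysis what the conditions were, which is the label-free spirit the researchers describe.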

The findings show that neurons in the ventral premotor cortex follow patterns that differentiate objects and actions. The neurons began to show distinct, identifiable patterns of activity as soon as the object was presented, before the animal knew how it was supposed to grasp that object. By the time grips were actually made, the patterns had become so distinct that all four object-grip combinations could be identified with about 95 percent accuracy. This allowed the researchers to subdivide the neural activity patterns into groups corresponding to the basic grips and objects.
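The 95 percent figure comes from the study's own decoding analysis. As a hedged illustration of how such an accuracy number is typically obtained, here is a toy nearest-centroid decoder run on simulated firing rates for four hypothetical object-grip combinations; the data, the decoder, and the resulting accuracy are assumptions, not the study's method or result:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy illustration: decode which of four object-grip combinations a trial
# came from using population firing rates. All values are invented.
n_neurons, n_trials = 30, 25
means = rng.uniform(1, 10, size=(4, n_neurons))   # one mean pattern per combo
X = np.vstack([rng.poisson(m, size=(n_trials, n_neurons)) for m in means])
y = np.repeat(np.arange(4), n_trials)

# Hold out 40 trials, fit one centroid per combination on the rest, and
# classify each held-out trial by its nearest centroid.
idx = rng.permutation(len(y))
train, test = idx[:60], idx[60:]
centroids = np.array([X[train][y[train] == k].mean(axis=0) for k in range(4)])
d = ((X[test][:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
accuracy = (d.argmin(axis=1) == y[test]).mean()
print(f"held-out decoding accuracy: {accuracy:.0%}")
```

When the four conditions produce well-separated population patterns, as the study reports for the late (grip-execution) phase, even a simple decoder like this reaches high held-out accuracy.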

The results of the study demonstrate that objects have a significant effect on the evolution of the grip plan. That the brain can produce a variety of activity patterns and still arrive at an appropriate grip plan suggests that it is flexible enough to handle a wide variety of object contexts and can do so with a local network of neurons.

The lab state that the plan to grip an object evolves well in advance of actual execution. Early interpretation of grip planning, including accounting for the distinctive form that plans take in the context of different objects, could allow a brain-computer interface decoder to deliver a motion command to a prosthesis more quickly and accurately by using information about what is to be gripped.
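The latency argument can be made concrete with a hypothetical sketch (not the study's decoder): if planning-phase activity already identifies the grip, a decoder can commit to a command before the go cue rather than waiting for movement. Below we simulate a decoder confidence trace that ramps up as the plan forms; the timings, the ramp, and the threshold are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

dt = 0.05                       # hypothetical 50 ms decoding steps
t = np.arange(0.0, 3.0, dt)     # 3 s trial; the "go" cue arrives at 2.0 s
go_cue = 2.0

# Simulated confidence in the correct grip: it grows while the plan
# forms, well before the go cue. The ramp is an invented stand-in for
# real planning-phase decoding.
confidence = np.clip(0.2 + 0.5 * t + rng.normal(0.0, 0.03, t.size), 0.0, 1.0)

threshold = 0.9                 # commit once the decoder is confident enough
commit_idx = int(np.argmax(confidence >= threshold))
commit_time = float(t[commit_idx])
print(f"decoder commits at {commit_time:.2f} s, "
      f"{go_cue - commit_time:.2f} s before the go cue")
```

The design point is simply that a threshold on plan-phase confidence lets the command reach the prosthesis with a head start, which is the speed advantage the lab describes.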

The team are continuing experiments to determine how well the findings generalize to a wider variety of objects, and how much the structure of the experiments and the training affects the neural patterns. The researchers conclude that they are optimistic the findings could ultimately have direct application to improving brain-computer interface design and performance for patients with severe paralysis.

Source: Brown University


A trajectory visualization shows how neural patterns associated with planning grips of different objects converged and diverged as the experimental task proceeded. Credit: Donoghue Lab/Brown University.
