A research team from the University of Houston has created a human-machine interface that allowed a man to grasp a bottle and other objects with a prosthetic hand, powered only by his thoughts. The technique, demonstrated with a 56-year-old man whose right hand had been amputated, uses non-invasive brain monitoring to determine which parts of the brain are involved in grasping an object.
With that information, the researchers built an algorithm, or brain-machine interface, that decoded the subject’s intentions and allowed him to successfully grasp objects, including a water bottle and a credit card. Using a multi-fingered bionic hand fitted to his residual limb, the subject grasped the selected objects 80 percent of the time.
Previous studies involving either surgically implanted electrodes or myoelectric control, which relies on electrical signals from muscles in the arm, have shown similar success rates, according to the researchers. The team state that their non-invasive method offers several advantages: by measuring brain activity via scalp electroencephalogram, or EEG, it avoids the risks of surgically implanting electrodes. Myoelectric systems are also not an option for everyone, because they require that neural activity in the muscles relevant to hand grasping remain intact. The study is published in Frontiers in Neuroscience.
The current study is the first to demonstrate EEG-based brain-machine interface control of a multi-fingered prosthetic hand for grasping by an amputee. The work could also lead to the development of better prosthetics and cybernetics. Beyond demonstrating that prosthetic control is possible using non-invasive EEG, the researchers state that the study offers a new understanding of the neuroscience of grasping and will be applicable to rehabilitation for other types of injuries, including stroke and spinal cord injury.
The study subjects, five able-bodied, right-handed men and women in their 20s, as well as the amputee, were tested using a 64-channel active EEG system, with electrodes attached to the scalp to capture brain activity. Activity was recorded in multiple areas, including the motor cortex and regions known to be involved in action observation and decision-making, and occurred between 50 and 90 milliseconds before the hand began to grasp. The team explain that this provided evidence that the brain predicted the movement, rather than reflecting it.
Current upper-limb neuroprosthetics restore some degree of functional ability but fail to approach the ease of use and dexterity of the natural hand, particularly for grasping movements, the researchers state, noting that work with invasive cortical electrodes has been shown to allow some hand control, but not at the level necessary for all daily activities. The team add that the inherent risks of the surgery required to implant electrodes, along with uncertainty about the long-term stability of the recorded signals, remain concerns.
The current study has shown that it is feasible to extract detailed information on intended grasping movements toward various objects, in a natural and intuitive manner, from multichannel scalp EEG signals. Until now, this was thought to be possible only with brain signals acquired invasively, inside or on the surface of the brain.
The researchers first recorded brain activity and hand movement in the able-bodied volunteers as they picked up five objects, each chosen to elicit a different type of grasp: a soda can, a compact disc, a credit card, a small coin and a screwdriver. The recorded data were then used to build decoders that translated neural activity into motor signals, which successfully reconstructed the grasping movements.
The team then fitted the amputee subject with a computer-controlled neuroprosthetic hand and asked him to observe and imagine himself controlling the hand as it moved and grasped the objects. The subject’s EEG data, along with information about prosthetic hand movements gleaned from the able-bodied volunteers, were used to build the algorithm.
The researchers state that additional practice, along with refining the algorithm, could increase the success rate to 100 percent.
Source: University of Houston
Michelle Petersen is the founder of Healthinnovations, having worked in the health and science industry for over 21 years, including tenure within the NHS and at Oxford University. Healthinnovations is a publication that has reported on, influenced, and researched current and future innovations in health for the past decade.
Michelle has been picked up as an expert writer for the publisher Informa’s Clinical Trials community, as well as being listed as a blog source by the world’s leading medical journals, including the acclaimed Nature-Springer journal series.
Healthinnovations is currently indexed by the trusted Altmetric and PlumX metrics systems as a blog source for published research globally. Healthinnovations is also featured in the world-renowned BioPortfolio (BioPortfolio.com), the life science, pharmaceutical and healthcare portal.
Most recently, Texas A&M University covered The Top 10 Healthinnovations series on their site, with distinguished Professor Stephen Maren calling the inclusion of himself and his team on the list a reflection of “the hard work and dedication of my students and trainees”.
Michelle Petersen’s copy was used in the highly successful marketing campaign for the mega-hit film ‘Jumanji: The Next Level’, starring Jack Black, Karen Gillan, Kevin Hart and Dwayne ‘The Rock’ Johnson. Michelle Petersen’s copywriting was part of the film’s coverage by the Republic TV network, which has been the most-watched English-language TV channel in India since its inception in 2017.
An avid campaigner in the fight against child sex abuse and trafficking, Michelle is a passionate humanist striving for a better quality of life for all humans by helping to provide traction for new technologies and techniques within healthcare.