Life-changing injuries and lifelong disabilities can be extremely challenging and overwhelming, with researchers working around the clock to improve life for newly and long-term disabled people. A brain-computer interface (BCI) links the brain to artificial intelligence (AI), recording brainwaves to enable communication or to control a neuroprosthesis. The technology is now widely used; however, there is still considerable room for improvement, with key biological and engineering problems remaining to be solved. Regarding communication by individuals with impaired function, these hurdles include low-quality recordings by home users, low translation speed, limited translation accuracy, and the need to adapt applications to the user. Now, a study from researchers at the University of California, San Francisco develops an algorithm that can turn brain activity into text in real time. The team states they use electrocorticography (ECoG), an electrical monitoring technique that records activity in the cerebral cortex via electrodes placed directly on the exposed brain, to detect and decode neural patterns into text while the person speaks aloud. The open-source study is published in the journal Nature Neuroscience.
Previous studies show that even though BCIs have developed rapidly over the past decade, they still do not achieve large-scale translation of brainwaves to text in real time. For instance, when a BCI is paired with a virtual keyboard to produce text, the word rate can be limited to that of one-finger typing. More recently, direct decoding of limited spoken speech from disabled patients using BCIs has yielded either isolated syllables or 4–8 words, and in the case of continuous speech only 40% accuracy has been achieved. The current study investigates whether ECoG can be used to decode brainwaves into text from people speaking aloud using vocabularies of roughly 250 words.
The current study involved four participants who already had electrodes implanted in their brains to monitor their epilepsy. The participants were asked to repeat a set of 30–50 sentences multiple times while their neural activity was tracked via ECoG. Results show that after the data were fed through a machine-learning algorithm, the average word error rate was as low as 3%. The findings also show that decoding improves with transfer learning, a technique in which a model applies what it has learned on one task to a different task, thereby improving its performance.
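The study's actual decoder is a neural network trained on ECoG recordings, but the transfer-learning idea itself can be illustrated with a toy sketch. In the hypothetical NumPy example below (all data and model names are invented for illustration, not the study's method), a simple linear "decoder" is pretrained on simulated data from one participant and then fine-tuned on a second, similar participant using far fewer samples and training steps than training from scratch would need:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(w_true, n=200, noise=0.05):
    """Simulate (features, target) pairs; a crude stand-in for ECoG data."""
    X = rng.normal(size=(n, w_true.size))
    y = X @ w_true + noise * rng.normal(size=n)
    return X, y

def train(X, y, w_init, lr=0.1, steps=50):
    """Fit a linear model by gradient descent, starting from w_init."""
    w = w_init.astype(float).copy()
    for _ in range(steps):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def mse(X, y, w):
    return float(np.mean((X @ w - y) ** 2))

d = 10
w_a = rng.normal(size=d)                # participant A's true mapping
w_b = w_a + 0.1 * rng.normal(size=d)    # participant B: similar, not identical

X_a, y_a = make_data(w_a, n=200)        # plenty of data for A
X_b, y_b = make_data(w_b, n=40)         # much less data for B

w_pre = train(X_a, y_a, np.zeros(d))                 # pretrain on A
w_scratch = train(X_b, y_b, np.zeros(d), steps=5)    # B from scratch, few steps
w_transfer = train(X_b, y_b, w_pre, steps=5)         # B warm-started from A

print("scratch MSE: ", mse(X_b, y_b, w_scratch))
print("transfer MSE:", mse(X_b, y_b, w_transfer))
```

Because participant B's mapping is close to A's, the warm-started model reaches a much lower error in the same five steps, mirroring the study's observation that pretraining on other participants reduces the data needed for a new one.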
The lab explains that their system required only about 40 minutes of data per participant for transfer learning, as opposed to the millions of hours normally required, to achieve a level of accuracy not seen before. They go on to stress that the system is not yet ready for use with severely disabled patients, as it currently relies on the participant speaking aloud, albeit from a limited vocabulary.
The team concludes that they have developed an algorithm able to translate brain activity into text using ECoG data. For the future, the researchers hope their system could one day translate brainwaves without audible speech, aiding patients who are unable to speak or type.
Source: The Guardian
Michelle Petersen is the founder of Healthinnovations and has worked in the health and science industry for over 21 years, including tenure within the NHS and at Oxford University. Healthinnovations is a publication that has reported on, influenced, and researched current and future innovations in health for the past decade.
Michelle has been picked up as an expert writer for the publisher Informa's Clinical Trials community, as well as being listed as a blog source by the world's leading medical journals, including the acclaimed Springer Nature journal series.
Healthinnovations is currently indexed by the trusted Altmetric and PlumX metrics systems as a blog source for published research globally. Healthinnovations is also featured in the world-renowned BioPortfolio (BioPortfolio.com), the life science, pharmaceutical and healthcare portal.
Most recently, Texas A&M University covered The Top 10 Healthinnovations series on their site, with distinguished Professor Stephen Maren calling the inclusion of himself and his team on the list a reflection of "the hard work and dedication of my students and trainees".
Michelle Petersen's copy was used in the highly successful marketing campaign for the mega-hit film 'Jumanji: The Next Level', starring Jack Black, Karen Gillan, Kevin Hart and Dwayne 'The Rock' Johnson. Her copywriting was part of the film's coverage by the Republic TV network, the most-watched English-language TV channel in India since its inception in 2017.
An avid campaigner in the fight against child sex abuse and trafficking, Michelle is a passionate humanist striving for a better quality of life for all humans by helping to provide traction for new technologies and techniques within healthcare.