Study develops 4-D glasses that allow wearers to be 'touched' by a movie.
A person perceives the world around them through multiple senses. Concurrent impulses of signals in different sensory modalities, such as a flash and a beep, can readily be merged into a single, static event. However, real-world dynamic events often generate signals with different onsets and durations across the brain and the senses; it is hoped that mapping these areas in the brain will enable the development of 4-D or multisensory electronic systems. Now, a study led by researchers at UC San Diego has developed a pair of 4-D glasses that allows wearers to be physically 'touched' by a movie when they see a looming object on the screen, such as an approaching spacecraft. The team states that their device was developed in conjunction with a study to map brain areas which integrate the sight and touch of a looming object and aid in their understanding of the perceptual and neural mechanisms of multisensory integration. The open-access study is published in the journal Human Brain Mapping.
Previous studies show that real-world objects approaching or passing by an observer often generate visual, auditory, and tactile signals with different onsets and durations. Prompt detection and avoidance of an impending threat depend on precise binding of looming signals across the brain and senses. The current study uses a multisensory apparatus integrating a direct-view wide-field screen with flexible air hoses to deliver spatially aligned looming visual and tactile stimuli near the face with a varying temporal offset.
The current study assesses the synchrony between a looming ball simulated in virtual reality and an air puff delivered to the same side of the face. Results show that when the onset of the ball's movement and the onset of the air puff were nearly simultaneous, the air puff was perceived as completely out of sync with the looming ball. Data findings show that with a delay of 800 to 1,000 milliseconds between the two onsets, the stimuli were perceived as one, as if an object had passed near the face, generating a little wind.
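The timing relationship above can be summarized as a simple decision rule. The sketch below is purely illustrative and is not the study's analysis code: it classifies a visual-to-tactile onset delay using the roughly 800 to 1,000 millisecond window reported in the article; the window constant and the function name are assumptions introduced here for illustration.

```python
# Hypothetical illustration of the reported perceptual-sync window.
# The 800-1000 ms bounds come from the article; everything else
# (names, structure) is invented for this sketch.

SYNC_WINDOW_MS = (800, 1000)  # delay range perceived as one event

def perceived_as_one_event(delay_ms: float) -> bool:
    """Return True if an air puff delayed by `delay_ms` after the onset
    of the looming ball falls inside the reported sync window."""
    low, high = SYNC_WINDOW_MS
    return low <= delay_ms <= high

# A near-simultaneous puff (0 ms delay) was perceived as out of sync,
# while a ~900 ms delay matched the moment the ball "reached" the face.
print(perceived_as_one_event(0))    # False
print(perceived_as_one_event(900))  # True
```

The key point the rule captures is that perceived simultaneity does not mean physical simultaneity: the tactile stimulus must lag the visual onset by roughly the time the looming object takes to "arrive" at the face.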
In experiments using functional Magnetic Resonance Imaging (fMRI), the group delivered tactile-only, visual-only, tactile-visual out-of-sync, and tactile-visual in-sync stimuli to either side of the subject's face in randomized events. Results show that more than a dozen brain areas responded more strongly to lateralized multisensory stimuli than to lateralized unisensory stimuli, and the response was further enhanced when the multisensory stimuli were in perceptual sync.
The team surmises that their study demonstrates the perceptual and neural mechanisms of multisensory integration near the face. For the future, the researchers state that this work provides a solid scientific foundation for developing the next generation of multisensory entertainment systems and media, such as 4-D film.
Source: UC San Diego