"You know, I know this steak doesn't exist. I know that when I put it in my mouth, the Matrix is telling my brain that it is juicy and delicious. After nine years, you know what I realise? Ignorance is bliss." – Cypher, The Matrix
As its name suggests, multisensory XR combines stimulation of any number of the five human senses to enhance the user's virtual experience.
Hypothetically speaking, a person navigating a multisensory virtual environment will not only see that reality but hear, smell, feel and even taste it. (Quite honestly, they may never leave it…)
Philosophical considerations aside, The Matrix represents a touchstone for multisensory technology: an XR system in which virtual worlds are as rich and complex as our own – if not ones that surpass and replace it entirely. Admittedly, it's doubtful we will live to see such a simulation. (Unless… we're already… No, we're kidding.)
Sci-fi and fantasy rarely offer any technical explanations for their rarefied concepts – in The Matrix we just get red pills and spinal probes. On the other hand, modern science is beginning to put these ideas into practice with real, testable results that raise the question, "how far down the rabbit hole can we go?"
The scope and application of multisensory XR are not limited to closed-off virtual systems. The technology is just as viable for augmented reality, in which the user may feel, hear or smell virtual objects as continuations of their natural environment.
A (very) brief history of multisensory XR
In the 1960s, American filmmaker and cinematographer Morton Leonard Heilig created the first multisensory VR system – the Sensorama. It used a 3D stereoscopic colour display, stereo sound, a mechanical motion seat, fan-powered wind and aromas to simulate motorcycle rides through New York. There was no "freedom" within the Sensorama, save for leaving it: the pre-recorded ride played out the same way every time.
In 1965, computer scientist Ivan Sutherland wrote his seminal paper, "The Ultimate Display", in which he characterised the display of the future as "a looking glass into a mathematical wonderland". "The ultimate display", Sutherland concluded, "would be a room within which the computer [could] control the existence of matter", foreshadowing the emergence of interactive virtual reality systems.
By the late 1980s, NASA had developed its Virtual Interface Environment Workstation (VIEW), complete with a head-mounted display, tactile "DataGloves" and a "DataSuit". These allowed the computer to track the user's orientation and inputs within the virtual system.
The VIEW provided 3D audio-visual simulations, tactile environments and voice recognition software.
Where is multisensory XR today?
On the XR front, each of the five senses poses unique challenges. At the hardware level, every human faculty requires some degree of reverse engineering before it can be approximated virtually.
Integrating the results into a seamless, embodied experience that doesn't leave the user completely bewildered is not trivial – to put it mildly. Fusing several senses into a single multimodal human-computer interface (HCI) poses significant design challenges if end users are to be satisfied.
Research and development in virtual optics (sight) and audition (sound) is significantly more advanced than in haptics (touch), olfaction (smell) and gustation (taste). But as we'll see, advances in these areas are forthcoming and show promising signs for the future of XR.
Olfactory XR
In recent years, olfactory research has made significant breakthroughs. In 2022, US VR company OVR received the AUREA award for its ION sensory unit, a device that re-creates scents for users in virtual reality.
According to OVR, the ION uses nanoparticles that can be activated in a matter of milliseconds and is carefully calibrated, via "optical engineering algorithms", to reproduce thousands of aromas "emanating" from virtual objects.
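To make the idea concrete, here's a minimal sketch of how a scent engine might trigger aromas from virtual objects. Everything in it – the device API, the falloff curve, the names – is our own assumption for illustration; OVR has not published its algorithms in this form.

```python
# Hypothetical sketch: decide which aroma to release, and how strongly,
# based on the user's distance to virtual scent sources. The device API
# (device.emit) and all names are illustrative, not OVR's actual SDK.
import math
from dataclasses import dataclass

@dataclass
class ScentSource:
    aroma_id: str            # e.g. "pine_forest" – the cartridge channel to fire
    position: tuple          # (x, y, z) of the virtual object
    radius: float            # distance (m) beyond which the scent is imperceptible

def scent_intensity(user_pos: tuple, source: ScentSource) -> float:
    """Return a 0..1 burst intensity that falls off with distance."""
    d = math.dist(user_pos, source.position)
    if d >= source.radius:
        return 0.0
    return (1.0 - d / source.radius) ** 2

def update(user_pos, sources, device):
    # Called once per frame. Short, repeated bursts (rather than a
    # continuous stream) are what make millisecond-level switching
    # between aromas plausible.
    for s in sources:
        level = scent_intensity(user_pos, s)
        if level > 0.05:
            device.emit(s.aroma_id, level, duration_ms=50)
```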
OVR claims that its olfactory VR technology can be applied to rehabilitation – specifically exposure therapy – by letting patients revisit trauma-associated aromas in a controlled virtual setting.
Gustatory XR
One of the least advanced areas of study, gustation is a complicated affair – not least because we taste in combination with our olfactory sense (an HCI nightmare!).
At the National University of Singapore, a team of researchers created a "digital lollipop" capable of producing the four main taste profiles: salty, bitter, sweet and sour.
The lollipop consists of two metal plates, between which the user inserts their tongue. By passing weak alternating currents through its electrodes and making subtle shifts in temperature, the metal confection dupes the user's tastebuds.
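In code, the control logic might look something like the sketch below – a simple lookup from taste profile to stimulation parameters. The driver object and every parameter value here are hypothetical placeholders, not the team's published calibration data.

```python
# Illustrative taste-to-stimulation lookup, loosely modelled on the
# digital lollipop's approach: each basic taste maps to a combination of
# electrical current through the tongue electrodes and a small thermal
# shift. All values are placeholders for illustration only.
TASTE_PROFILES = {
    "sour":   {"current_ua": 180, "freq_hz": 50, "temp_delta_c": 0.0},
    "salty":  {"current_ua": 40,  "freq_hz": 50, "temp_delta_c": 0.0},
    "bitter": {"current_ua": 80,  "freq_hz": 50, "temp_delta_c": -1.0},
    "sweet":  {"current_ua": 0,   "freq_hz": 0,  "temp_delta_c": 2.0},
}

def stimulate(tongue_interface, taste: str, duration_ms: int = 500):
    """Drive a (hypothetical) tongue-plate interface for one taste burst."""
    p = TASTE_PROFILES[taste]
    tongue_interface.set_current(p["current_ua"], p["freq_hz"])   # electrical cue
    tongue_interface.set_temperature_delta(p["temp_delta_c"])     # thermal cue
    tongue_interface.hold(duration_ms)
```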
But what about the sensation of eating itself? Chew the fat on this: a paper published in 2016 detailed the workings of an "Electric Food Texture System". Its authors, Arinobu Niijima and Takefumi Ogawa, describe the system as follows:
"We use a photo-reflector to measure motion of the user's lower jaw for 'bite detection', electromyography sensors to measure food texture for 'food texture database', and a medical electrical stimulator for 'electric stimulation'."
The "electric stimulation" is made possible by electrodes attached to the masseter muscle, which is used for chewing. By adjusting the frequency of the currents, the device can reproduce sensations of hardness or chewiness as the user bites down on their phantom lunch.
Haptic XR
Haptic technology is a little longer in the tooth than its olfactory and gustatory counterparts. Early examples were built into aircraft controls to feed aerodynamic vibrations back to the pilot. Since the 1990s, commercially available haptic suits and gloves using force feedback and vibration have been marketed for entertainment purposes.
Broadly speaking, there are two methods of creating haptic feedback. Cutaneous stimulation uses electrical signals to stimulate the user's nerves through the skin, whereas force feedback is a mechanical method that exploits the user's perception to create "illusions" of contact – and, in other cases, limits the movement of the body to communicate virtual thresholds to the user. The sketch below contrasts the two.
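Here, roughly, is how an XR runtime might route a single contact event to both feedback families. The device APIs, field names and thresholds are invented for illustration.

```python
# Hypothetical dispatcher: one virtual contact event feeds both the
# cutaneous path (nerve/skin stimulation for texture) and the force
# path (mechanical resistance for hard boundaries).
from dataclasses import dataclass

@dataclass
class ContactEvent:
    body_part: str        # e.g. "palm", "fingertip"
    penetration_m: float  # how far the hand has pushed into the virtual object
    texture_hz: float     # surface vibration content, for cutaneous cues

def route_feedback(event: ContactEvent, skin_driver, exo_driver):
    # Cutaneous path: stimulate through the skin to render texture
    # and light touch, scaled by contact depth.
    skin_driver.vibrate(event.body_part,
                        freq_hz=event.texture_hz,
                        amplitude=min(1.0, event.penetration_m * 50))
    # Force path: mechanically resist motion once the hand would pass
    # through the virtual surface, communicating a hard threshold.
    if event.penetration_m > 0.005:
        exo_driver.lock(event.body_part, stiffness=0.8)
    else:
        exo_driver.release(event.body_part)
```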
Teslasuit (no relation to the carmaker) currently produces a complete haptic suit using "electro muscle stimulation" and "transcutaneous electrical nerve stimulation", enabling full-body haptic feedback with VR devices.
Dexta Robotics, on the other, err… hand, has created the world's first commercially available force feedback gloves for applications in education, training, medicine and more. The gloves use variable force and stiffness to give a realistic, haptic quality to the virtual environment.
Conclusion
Use cases for multisensory XR are so broad, we could only touch on a few in this blog. However, with the anticipated arrival of the "internet of senses", or tactile internet, and the delivery of multisensory experiences over ultra-low-latency networks like 5G and 6G, the potential applications for remote services will become even greater.
Curious about multisensory XR? You can buy XR devices in our online store and get a taste of what the technology can do today. We sell XR headsets and glasses by leading manufacturers such as Microsoft, Magic Leap and mōziware.