Will The Metaverse Ever Feel Real? (Part 2)
In part 1 of “Will the Metaverse Ever Feel Real?”, we saw how sight and hearing are being incorporated into the metaverse to create ever greater realism. Developers have brought those two senses closer to replicating real-life experiences than the other three, yet they are still struggling to make them fully convincing. So, how can touch, smell, and taste be brought into the virtual world? Is it even possible? Some of the innovations being worked on might surprise you. Read on to find out more.
Metaverse sense of touch
While sights and sounds have received plenty of attention, it’s probably fair to say that the most exciting advances in metaverse technology right now concern the sense of touch. A number of companies have been set up over the past few years aiming to offer effective metaverse solutions for physical sensations. One of those is Japanese hardware firm H2L, whose UnlimitedHand came out in 2015. This wrist wearable had sensors that could measure muscle movement and provide feedback based on the actions performed, producing the feeling of an object’s weight and even pain. Through electrical stimulation, users could feel the trigger of a gun in their hands or the strings of a guitar. H2L’s CEO, Emi Tamaki, who herself has a condition that limits her freedom of movement, wants to offer others like her the chance to go beyond their physical limitations in the metaverse, and her ambition is to release humans from all physical, spatial, and even temporal constraints by 2029. While UnlimitedHand seems to have faded from view somewhat, several other haptic technologies might be able to take us toward Tamaki’s dream, though perhaps not as soon as 2029.
Researchers at Carnegie Mellon University have been working on a very specific piece of haptic technology, which they have successfully added to a Quest 2 headset. Using 64 transducers placed just above the mouth, it uses ultrasound to produce sensations on the user’s lips and tongue, so they can have the feeling of kissing or drinking, for example. Though the current prototype looks clunky, it adds another layer to haptic technology that could eventually bring increased realism. Some think it could enhance sexual experiences in the metaverse; indeed, there are those who believe it will eventually be possible to have sex in the metaverse, opening up a world of possibilities for experimentation.
Of course, alongside research groups such as the one at Carnegie Mellon, big players like Meta are also busy creating wearable technologies. Meta is working on a set of haptic gloves that contain sensors to turn them into a controller, as well as ridged pads that create physical sensations. Meanwhile, other companies are working on technologies that aim to free the user from having to don a wearable device at all. The University of Glasgow has developed a system called Aerohaptics, whereby users stand around a volumetric display that uses a Leap Motion sensor and precise jets of air to stimulate feelings in the hands, fingers, and wrists. Multiple people can interact with this display at the same time without any wearables being involved, and its makers suggest it could be used to enhance teleconferences or to help surgeons train to perform operations. The infrastructure required is quite large and bulky, but its principles could be carried into other systems to enhance metaverse experiences in the future.
Another group working on touch technology without a wearable is California tech start-up Emerge. It has engineered a tabletop panel, about the size of a 13” laptop, that sits horizontally on a flat surface and produces ultrasonic waves reaching up to three feet above the device to stimulate the hands held over it. Like Aerohaptics, it ties the user to one spot, but the lack of a wearable could make it feel more realistic.
The problem of being tied to one spot is a difficult one to overcome and, while these wearable-free technologies might provide a little extra sensation, other more ambitious projects offer a fuller-body effect while also allowing more freedom of movement. The OWO haptic jacket, for example, covers the entire torso and can give the wearer the sensation of being hugged by a loved one, bitten by an insect, or stabbed by an enemy in a computer game. Users have to wear it directly against their skin to get the physical effects, which are produced by electrical impulses, and it connects to a mobile app that lets the wearer control which parts of the body are affected and to what extent. Unwanted sensations can be switched off, and those with a lower pain threshold can adjust the feelings that come through the device to suit their sensitivity level. It also has an 8-hour battery life, giving people time to become deeply immersed in the experience or game they are involved in. The makers hope that by giving users the feeling of gun recoil or the sensation of falling, they can make metaverse experiences more exciting and high-stakes, as players will actually want to avoid getting shot or crashing their car.
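To make the idea of per-zone control concrete, here is a minimal sketch of how such a comfort profile could be modeled, assuming a simple zone-and-intensity scheme. The zone names, fields, and clamping logic are invented for illustration and do not represent OWO’s actual app or API.

```python
# Illustrative sketch only: a made-up configuration model for a torso haptic
# garment, inspired by the capabilities described above. It is NOT the real
# OWO app or API; zone names, fields, and methods are assumptions.

from dataclasses import dataclass, field

@dataclass
class ZoneSettings:
    enabled: bool = True      # unwanted sensations can simply be switched off
    max_intensity: int = 100  # 0-100, capped for users with a lower pain threshold

@dataclass
class HapticProfile:
    zones: dict[str, ZoneSettings] = field(default_factory=lambda: {
        "chest": ZoneSettings(),
        "abdomen": ZoneSettings(),
        "upper_back": ZoneSettings(),
        "lower_back": ZoneSettings(),
    })

    def scaled_intensity(self, zone: str, requested: int) -> int:
        """Clamp a game-requested intensity to the wearer's comfort settings."""
        settings = self.zones[zone]
        if not settings.enabled:
            return 0
        return min(requested, settings.max_intensity)

# Example: a game requests a strong 'hit' on the chest, but the wearer has
# capped that zone at 40% for comfort.
profile = HapticProfile()
profile.zones["chest"].max_intensity = 40
print(profile.scaled_intensity("chest", 90))  # -> 40
```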
The idea of a haptic jacket is quite amazing, but the company Teslasuit (no relation to the car maker) has gone one step further and developed a full-body haptic suit called TESLASUIT. Unfortunately, it is currently more of a metaverse solution for businesses than for individuals, as it costs a whopping $12,999. That price buys a host of functions that metaverse companies and others will find incredibly useful. It features three systems: haptic feedback, motion capture, and biometrics. These can be used to monitor human behavior and improve performance, making it well suited to metaverse training. Like many of the other solutions, it uses electro-muscle stimulation to create physical sensations, and it adds transcutaneous electrical nerve stimulation to simulate even more lifelike feelings than most other haptic technologies. Through this, users can repeat VR training, for example in sports, the military, logistics, and many other industries, to build muscle memory, correct technique, and achieve deep learning through incredibly immersive experiences. Its 14 inertial measurement units, each comprising an accelerometer, a gyroscope, and a magnetometer, track the user’s position and movements, while heart rate and breathing are also measured to give extremely detailed feedback.
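To picture the kind of data such a suit streams, here is a minimal sketch of a single capture frame combining 14 IMU samples with basic biometrics, assuming a simple per-frame layout. The field names are illustrative guesses, not the actual TESLASUIT SDK.

```python
# Illustrative sketch only: a conceptual data model for one frame of full-body
# capture from a suit with 14 IMUs plus biometrics, as described above.
# Field names are assumptions, not a real vendor API.

from dataclasses import dataclass
from typing import List, Tuple

Vector3 = Tuple[float, float, float]

@dataclass
class IMUSample:
    accelerometer: Vector3   # linear acceleration, m/s^2
    gyroscope: Vector3       # angular velocity, deg/s
    magnetometer: Vector3    # magnetic field, used to correct heading drift

@dataclass
class SuitFrame:
    timestamp_ms: int
    imus: List[IMUSample]      # one sample per IMU (14 in total)
    heart_rate_bpm: float      # biometrics captured alongside motion
    breathing_rate_bpm: float

    def validate(self) -> None:
        assert len(self.imus) == 14, "expected one sample from each of the 14 IMUs"

# A single (made-up) frame: the same resting sample repeated for all 14 IMUs.
frame = SuitFrame(
    timestamp_ms=0,
    imus=[IMUSample((0.0, 0.0, 9.8), (0.0, 0.0, 0.0), (0.2, 0.0, 0.4))] * 14,
    heart_rate_bpm=72.0,
    breathing_rate_bpm=14.0,
)
frame.validate()
```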
Companies can use this advanced system to implement metaverse training experiences that adapt to the trainee for a personalized experience. Its power was demonstrated in 2019, when one rugby player ‘tackled’ another from 100 miles away: combined with 5G, the suit let the player feel the full force of the tackle, even staggering back from the shock of the ‘hit’. It has also been applied to high-risk training environments such as electrical safety training, where trainees receive physical sensations if they make mistakes, simulating real consequences and improving the learning experience. These use cases demonstrate the potential of the TESLASUIT and, even though it is extremely expensive right now, it is a big step towards bringing realistic touch and genuinely physical sensations into the virtual world.
Metaverse sense of smell
While smell machines have been used in cinemas since the 1960s, smell is another sense that has proved difficult to integrate into the metaverse. Nevertheless, some companies are trying. Metaverse tech start-up Hypnos Virtual has developed Scentscape, a system that enhances VR experiences by introducing a neuroscience-based data stream called Bio-Media. Scentscape releases bio-aromas as the user moves through the metaverse, whether in a game or another experience, much as a musical soundtrack plays throughout a movie, directing the mood as they go. AI helps the system deliver the right scents at the right time. On a large scale, it can be used in cinemas to provide the smell of the sea or a forest as characters journey through a landscape, but it can also be applied to smaller, more personal experiences like those an individual might have in the virtual world.
Another metaverse company, OVR Technology, has also created a system to bring smells into the metaverse. OVR’s scent designers and scientists have built a cartridge that attaches to a VR headset to provide more realistic experiences. It works by releasing nanoparticles of scent in millisecond increments to authentically stimulate the user’s sense of smell. The founders of OVR believe that smell is the pivotal sense for making a genuine connection with reality in the metaverse, claiming that it triggers more than 70% of the memories and emotions we have on a daily basis. And because smell also plays a huge role in forming our sense of taste, it can help to bring that sense into the metaverse as well. OVR Technology currently uses its scent cartridges to offer therapeutic experiences for human well-being, more immersive training for high-risk industries like firefighting and aviation, and other VR experiences. The cartridge can be added to any VR headset and, in theory, used with any VR content via its universal software plugin. With such innovations, we are getting closer and closer to giving the metaverse user an all-around sensory experience that truly feels like real life.
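As a rough illustration of how content might drive millisecond scent bursts, here is a minimal sketch of a scene schedule that fires timed releases, assuming a simple event list. The class and function names are invented and do not reflect OVR Technology’s actual plugin.

```python
# Illustrative sketch only: how a VR scene might schedule short scent bursts
# through a headset-mounted cartridge. Names are invented for illustration.

import time
from dataclasses import dataclass

@dataclass
class ScentBurst:
    scent: str        # e.g. "pine_forest", "sea_air", "smoke"
    duration_ms: int  # bursts are released in millisecond increments
    intensity: float  # 0.0-1.0

def play_scene(events: list[tuple[float, ScentBurst]]) -> None:
    """Fire each scent burst at its scheduled offset (in seconds) into the scene."""
    start = time.monotonic()
    for offset_s, burst in sorted(events, key=lambda e: e[0]):
        delay = offset_s - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)
        # In a real system this line would drive the cartridge hardware.
        print(f"release {burst.scent} for {burst.duration_ms} ms at {burst.intensity:.0%}")

# A hypothetical beach scene: sea air on entry, sunscreen a few seconds later.
play_scene([
    (0.0, ScentBurst("sea_air", 300, 0.6)),
    (4.0, ScentBurst("sunscreen", 150, 0.3)),
])
```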
Metaverse sense of taste
When it comes to replicating an authentic sense of taste in the metaverse, you might think this would be completely impossible. However, in this world of innovation where metaverse start-ups are appearing everywhere, there are even teams working on systems to emulate taste in the virtual realm. In 2020, a team at Meiji University in Tokyo succeeded in developing a machine that could stimulate an authentic sense of taste without the subject ingesting any food. To do so, they created a device with five different electrolyte gels, one for each of the main taste sensations of salty, sweet, bitter, sour, and umami, set in separate tubes. When the gels are electrically stimulated, they are released in different amounts to recreate the taste of different types of food as the subject licks the device. At the moment, the gels are delivered through a rather clunky handheld unit, but this again shows the potential: if flavors can be artificially created, then in theory they can be administered to people as they interact with food and drink in the metaverse, further enhancing the realism of the experience.
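As a conceptual illustration of the five-gel approach, here is a minimal sketch that treats a flavor as a blend of the five basic taste channels, each clamped to a 0–1 stimulation level. The channel weights in the example are rough guesses for illustration, not measured data from the Meiji University device.

```python
# Illustrative sketch only: a flavor expressed as a blend of the five basic
# taste channels, in the spirit of the five-gel device described above.

BASIC_TASTES = ("salty", "sweet", "bitter", "sour", "umami")

def flavor(**weights: float) -> dict[str, float]:
    """Return a 0.0-1.0 stimulation level for each of the five gel channels."""
    blend = {taste: 0.0 for taste in BASIC_TASTES}
    for taste, level in weights.items():
        if taste not in blend:
            raise ValueError(f"unknown taste channel: {taste}")
        blend[taste] = max(0.0, min(1.0, level))  # clamp to the valid range
    return blend

# A hypothetical 'miso soup' profile: mostly umami and salt, a hint of sweetness.
miso_soup = flavor(umami=0.8, salty=0.6, sweet=0.2)
print(miso_soup)
```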
Final thoughts for part 2
Remarkably, there is now technology that can replicate each of the senses in the metaverse. However, bringing these disparate technologies together into something that resembles the real world could take decades. Still, the more realistic these experiences become, the more potential applications they will have, and the more people will be drawn into these worlds. At the moment, metaverse solutions, for business and otherwise, are a poor reflection of real life, but we should not be too concerned about that. Being able to tell metaverse experiences apart from real-world ones matters, and if we ever reach a stage where the two appear the same to all five senses, what’s to stop us falling into the virtual world and never leaving it?
Ultimately, the metaverse may never truly feel like the real world. But why would we want it to? Maintaining a good connection to the real world and the people physically around us is important, and it may be too tempting for some to fully disconnect from these if the metaverse feels too real. It might be enough for the metaverse to feel like a very realistic game, rather than ever feeling exactly like real life – somewhere which can be used to enrich our experience of life instead of completely replacing it.
How are the senses of touch, taste, and smell being incorporated into the metaverse?
Touch, smell, and taste are being incorporated into the metaverse through various technologies and innovations. Companies are working on haptic technologies, such as wearable gloves, wrist wearables, and tabletop panels, to provide physical sensations and simulate touch in the virtual world. Similarly, scent machines and cartridges are being developed to introduce smells into the metaverse. Emulating taste is also being explored through devices that stimulate the different taste sensations without the user ingesting any food.
What are some other haptic technologies being developed to enhance the sense of touch in the metaverse?
Several companies and institutions, including Meta, the University of Glasgow, and California tech start-up Emerge, are working on haptic technologies to enhance touch in the metaverse. Meta is developing haptic gloves with sensors and ridged pads to create physical sensations. The University of Glasgow’s Aerohaptics system uses precise jets of air and a Leap Motion sensor to stimulate feelings in the hands, fingers, and wrists without wearables. Emerge has engineered a tabletop panel that produces ultrasonic waves to stimulate the hands held over it.
Author: Rafał Siejca
Rafal has over twenty years of corporate experience, including roles at Millennium Bank, Comarch, and leading software teams at PZU, one of Europe’s largest insurance companies. As one of Poland’s few true VR experts with a decade of experience, he ensures timely, high-quality project delivery as CEO and CTO.