Virtual reality experiences have come a long way in recent years, and modern VR headsets like the Meta Quest 2, Pico 4, and Vive Pro 2 offer far more realism than we’ve ever had access to before. Some VR games and applications feel so real that the brain is fooled – we feel genuine fear, physical sensations included, when standing on a virtual ledge, for example. These visceral reactions are so marked that they have spawned a whole host of YouTube videos of people tearing off their headsets or screaming at VR horror games. Extreme though these reactions may be, such powerful effects have allowed companies to implement VR training successfully and game designers to use virtual reality to enhance storytelling. However, even in the most immersive metaverse experiences, we remain aware of being in a simulation. Will metaverse technologies ever take us beyond this feeling to a place where the virtual feels truly real? Perhaps, but first the metaverse will have to appeal to all the senses. So, how close are we to a metaverse that feels real?
This article is split into two parts, each looking at the technologies metaverse innovators are using to bring the virtual world to life for the different senses. Part 1 deals with sight and hearing, as these appear to be the easiest to replicate, while Part 2 considers the solutions being built for touch, taste, and smell.
Metaverse Sense of Sight
It might be easy to assume that the visual side of the metaverse is solved – after all, we already have VR environments that look great. However, many metaverse experiences are too cartoonish, and even the most sophisticated cannot quite mimic the real world yet. While platforms like Facebook Horizons have settled for avatars that look like computer game characters from the 90s, some companies are trying to build realistic ones that look and move just like humans. Vogue, for example, is working on an app that will let users experience digital fashion on hyper-real avatars. The hope is that this will appeal to the detail-oriented nature of the fashion industry and give users entirely new ways to enjoy fashion without it looking like an oversimplified video game. Although these avatars still don’t look exactly like real people, they are far more realistic than those seen in worlds like Decentraland or The Sandbox, and they will no doubt keep improving as the technology develops.
But realistic avatars alone will not make the metaverse feel real; convincing environments are required as well. In this regard, video game companies may be leading the way. The 3D game engine Unreal Engine is being used by metaverse companies that want to build ultra-realistic virtual worlds. One such world is Victoria VR, a decentralized autonomous organization (DAO) run on the Web3 ethos of giving users control over what happens in the world. Currently in alpha testing, this blockchain-based metaverse looks strikingly realistic and will offer users the chance to buy land and build however they like, free of the physical limitations of the real world.
Tech company Nvidia has also set up its own platform to facilitate a more real-looking metaverse. Nvidia Omniverse is a free platform for creating and operating metaverse applications. Like Unreal Engine, it produces incredible visuals that really bring the metaverse to life. Using digital twin technologies, users can create remarkably realistic simulations of buildings and even entire cities. In fact, Nvidia is now working to create a digital twin of the entire planet.
Another realistic virtual world is being created by Mazer, whose metaverse platform will feature a series of futuristic, cyberpunk-styled cities straight out of science fiction. The first of these, Cyber Tokyo, is set to launch in early 2023 and will contain thousands of NFT penthouse apartments that holders can use for almost anything they want, all while feeling like they’re in a real apartment in an immersive city of the future.
Visually, the metaverse spans a whole range of standards, from the cartoonish look of Facebook Horizons to the hyper-realistic worlds built with Unreal Engine. However, even the very best of these still contain something the human mind detects as virtual rather than real. Combined with convincing input for the other senses, though, these environments might finally feel real.
Metaverse Sense of Hearing
The sound quality in the metaverse is already at a very high level and is probably one of the most advanced areas of this technology. Nevertheless, it can always be improved. One of the main obstacles to overcome is how and where sounds appear in the virtual environment in relation to the user. To help with this, Meta has recently finished work on three new AI models to make sounds in the metaverse more realistic. Named Visual-Acoustic Matching, Visually-Informed Dereverberation, and VisualVoice, each tackles a different problem. Visual-Acoustic Matching modifies audio so that it matches the space it appears to come from, producing acoustics that reflect the shape and size of the virtual environment. Visually-Informed Dereverberation does the reverse, using visual cues to strip reverberation from a recording and recover the dry sound. VisualVoice handles audio-visual speech separation, making things like virtual meetings more seamless as participants move between and interact with different individuals or groups.
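Meta’s models learn these transformations with AI, but the underlying acoustic idea – that a room’s character can be applied to a dry signal by convolving it with that room’s impulse response – can be sketched plainly. The following is an illustrative Python sketch of that principle, not Meta’s code; the function name and signature are hypothetical:

```python
def apply_room(dry, impulse_response):
    """Convolve a dry mono signal with a room impulse response.

    The impulse response encodes how a (virtual) room reflects sound;
    longer, denser responses sound like larger, more reverberant spaces.
    Returns a list of length len(dry) + len(impulse_response) - 1.
    """
    out = [0.0] * (len(dry) + len(impulse_response) - 1)
    for i, sample in enumerate(dry):
        for j, reflection in enumerate(impulse_response):
            # Each input sample triggers a scaled, delayed copy of the
            # room's reflection pattern.
            out[i + j] += sample * reflection
    return out
```

A real engine would use fast convolution and measured or simulated impulse responses per room, but the effect on the listener is the same: the dry sound takes on the acoustics of the space.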
Mazer is also working to provide a realistic sense of hearing in the metaverse. Once again, the key is locating sounds correctly in relation to the user, so that they don’t feel as if they are being produced directly beside the user’s ears. Incorporated into the Mazer metaverse platform is spatialized positional sound technology, which locates each sound precisely within the world, making the user’s experience of different noises very close to how they would hear them in the real world. The platform also offers the ability to creatively manipulate the sound waves to accommodate the listener’s needs, for example, if they have specific hearing difficulties. Combined with outstanding visuals, these kinds of innovations can certainly make the metaverse feel much more real.
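Mazer’s implementation isn’t public, but the core idea of positional sound – attenuating each source by distance and panning it by direction relative to the listener – can be sketched in a few lines. This is a minimal illustrative example (all names are hypothetical, and real engines add head-related transfer functions, occlusion, and reverb on top):

```python
import math

def spatialize(source, listener, facing_deg=0.0, ref_dist=1.0):
    """Return (left_gain, right_gain) for a mono source in a 2D world.

    Combines inverse-distance attenuation with constant-power panning,
    the two basic ingredients of positional audio.
    """
    dx = source[0] - listener[0]
    dy = source[1] - listener[1]
    dist = math.hypot(dx, dy)

    # Inverse-distance law: sources beyond ref_dist get quieter.
    gain = ref_dist / max(dist, ref_dist)

    # Azimuth relative to where the listener is facing:
    # 0 = straight ahead, +90 = hard right, -90 = hard left.
    azimuth = math.degrees(math.atan2(dx, dy)) - facing_deg

    # Constant-power pan: perceived loudness stays constant
    # as the source sweeps between the ears.
    pan = max(-1.0, min(1.0, azimuth / 90.0))
    angle = (pan + 1.0) * math.pi / 4.0   # 0 .. pi/2
    return gain * math.cos(angle), gain * math.sin(angle)
```

A sound directly ahead comes out equally in both channels; one to the listener’s right drives mostly the right channel; and doubling the distance halves the gain, which is what makes a sound feel located in the world rather than glued to the headphones.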
Final Thoughts for Part 1: Senses of Sight and Hearing in the Metaverse
Many virtual technologies have tried to replicate the natural human experience, but none has managed to emulate it perfectly. Sight and hearing are the two senses the metaverse has come closest to mimicking, though they are still some way from being reproduced so faithfully that we forget we are in a virtual environment. If the other three senses can be incorporated too, the metaverse may finally feel real. Check out Will The Metaverse Ever Feel Real? Part 2 to find out how metaverse businesses are trying to achieve this.
How realistic are the avatars in the metaverse currently?
The avatars in the metaverse vary in realism. While some platforms still have cartoony avatars, companies are working on creating more realistic ones that resemble humans. However, the avatars are not yet identical to real people but are continuously improving with advancing technology.
What role do video game companies play in creating realistic environments in the metaverse?
Video game companies are at the forefront of developing convincing environments in the metaverse. Game engines like Unreal Engine and platforms such as Nvidia Omniverse facilitate the creation of ultra-realistic virtual worlds. These technologies enable the development of visually stunning and immersive metaverse experiences.
Will the metaverse ever reach a point where it feels completely real?
While the metaverse has made significant advancements in replicating the senses of sight and hearing, it has not yet achieved a level where it feels indistinguishable from reality. To reach that point, advancements are needed in replicating the senses of touch, taste, and smell. Part 2 of the article explores the solutions being developed for these senses, which will play a crucial role in making the metaverse feel truly real.