Over the past couple of years there has been a huge increase in the number of companies investing in the metaverse, and that investment is only expected to grow, with some analysts suggesting that metaverse-related revenues could run into the trillions of dollars by the end of the decade. However, while there are already many metaverse experiences, platforms, and worlds, without the right hardware the metaverse will never become the immersive virtual world that people are predicting. For this to happen, the current hardware will have to become smaller, lighter, and more comfortable. At the moment, smart glasses seem to be the technology most likely to meet these requirements. In this article, we will look at what smart glasses technology is currently available and where it could take us, to see whether the metaverse will ever reach its full potential.
A technology in its infancy
Current metaverse platforms like Decentraland and The Sandbox are mostly experienced through ordinary PC screens or virtual reality headsets. They are more about world building and 3D experiences than truly integrated metaverse solutions, and they can feel clunky and far from fully immersive. Of course, they are early iterations and constitute an important stage in the development of the metaverse for entertainment as well as business. Other virtual reality experiences that hint at the metaverse, such as VR training and shopping, can also only be accessed through VR headsets like the Oculus Quest 2. While these headsets are far more sophisticated and effective than they were even just a few years ago, they are still quite bulky and cannot be worn comfortably for extended periods.
On the other side of the coin is augmented reality (AR), where images are superimposed over real-world scenes or objects to create an enhanced version of them. At the moment, users can access these experiences on their smartphones as well as through some of the more advanced smart glasses on the market. To reach the level of interaction that would constitute the metaverse, these two technologies would need to be combined. This brings us to mixed reality (MR), which blends the 3D digital overlays of AR with the digital interaction of VR to provide a seamless combination of the physical and virtual worlds. MR has already been used in areas such as healthcare, where surgeons have been guided by holographic overlays highlighting bone and blood vessels during reconstructive surgeries.
Amazing as such innovations are, they still require fairly heavy-duty hardware. Once we have devices that can take us beyond this – those that can be worn comfortably throughout the day – we can say that the metaverse is truly upon us. Such devices would have to be small enough and light enough to allow the user to behave normally and to feel unencumbered, while also being powerful enough to project complex MR images over long periods. This is where smart glasses come in. But just how close are current smart glasses to offering us these kinds of capabilities?
How effective are our current smart glasses?
Google Glass brought the concept of smart glasses to the mainstream back in 2013, but it had very limited success and met with a largely negative public reception. Around the same time, Facebook acquired Oculus and went on to release the Oculus Rift headset, which let people enjoy VR experiences and play VR games. Microsoft followed with the HoloLens in 2016, delivering a much more mixed reality experience. Meanwhile, Apple was developing its own VR and AR solutions, releasing ARKit for its devices in 2017, which became quite popular in the education and art industries.
Facebook took a big step forward in 2021 when it teamed up with Ray-Ban to create Ray-Ban Stories. These smart glasses act more like smartphones, letting users take photos and videos, answer calls, and listen to music. This is still a long way from the kind of MR experience that would constitute the metaverse, but it is an important milestone along the way. Since then, things have started to move more quickly, with Facebook's Project Aria research glasses being used to map out 3D spaces in the real world.
While the Facebook x Ray-Ban collaboration has grabbed the headlines, other smart glasses are also available, such as the Nreal Air AR glasses. These provide an overlay through which you can access your phone, watch movies, and play online games. While the Nreal Air leans more towards being a fashion item, Lenovo's ThinkReality A3 glasses have a more practical application. They are aimed at workers, allowing them to streamline their workflows through 3D visualization and to offer remote assistance and guidance. Because they are a lightweight pair of glasses, employees can take their workspace with them anywhere: they could be sitting in a café viewing a display of multiple virtual monitors that no one else can see, or visualizing workflows in a warehouse, for example.
Although devices like these have some great benefits, they also reveal a few of the problems facing smart glasses designers right now. Firstly, there is little scope for developers to add new functionality by building their own apps or environments: these glasses are built for specific uses and are limited by the need to be tethered to a smartphone. Battery life is another big issue, with the Nreal glasses lasting only a few hours before needing a recharge. Privacy is a wider concern affecting all of these devices and the metaverse in general; with any wearable device, people are going to be nervous about their data and conversations being at risk, and companies will need to find ways to calm these worries and provide genuine protection to users. So, while these kinds of gadgets are fun and interesting, they aren't really metaverse experiences yet. However, there are new innovations in the pipeline with extremely exciting prospects.
Smart glasses of the future
Smart glasses generally work via a miniature projector embedded in the frame, which beams light into the lenses; optical waveguides then bend that light so the image appears across the wearer's field of view. While this aspect of the technology is already quite effective, many areas still need to be improved to achieve metaverse functionality. Fortunately, there are some exciting projects in the works.
Meta recently announced its ambitious Project Nazare. Still in its early stages, with Zuckerberg suggesting a five-to-ten-year timeline for completion, it aims to use neural-interface technology from CTRL-labs (acquired by Facebook in 2019). This harnesses electromyography (EMG) to convert subtle neural signals into virtual actions. Rather than the handheld controllers that current headsets use, these glasses will be paired with wrist wearables that also provide haptic feedback. The devices track the user's intended actions by detecting the nerve signals that run along the arm, allowing things like typing on a virtual keyboard. They can even learn your movements and the kinds of mistakes you regularly make, eventually using this data to automatically correct typos.
When CTRL-labs announced this technology back in 2018, they described how it could eventually take us beyond physical movement altogether, with users simply thinking about what they want to type and the technology doing the rest, without a finger being moved. Unlike Elon Musk's Neuralink, which Zuckerberg has suggested is too invasive for people to trust, this approach doesn't 'read your mind' as such; it simply analyzes the nerve impulses travelling along your arm. Most people would feel safer not having something implanted into their brain, and though smart glasses still raise security concerns, Meta has a whole department dedicated to its 'neuroethics program', which could offer people the peace of mind they need to use these devices without fearing for the safety of their data.
At the moment, no company can offer smart glasses that deliver adequate metaverse solutions for business, but we might not have to wait too long. Back in the 90s, when everyone was using dial-up, we never could have imagined what the internet would let us do today, or how fast things would develop. Meta and other developers still have plenty of problems to solve, but if smart glasses can be successfully combined with wrist wearables, they will offer something that can be worn comfortably all day, paving the way for true metaverse integration. The technology will have been perfected when it can pass what Zuckerberg calls the “visual Turing test”: when the display in our smart glasses feels like a real physical environment and our brain fully believes that it is. This sense of presence in the virtual world is essential, and until we have it, the metaverse won't truly reach its full potential. Nevertheless, this new wave of metaverse solutions for business and entertainment is coming, and it won't be long before we reach the fully integrated metaverse that so many are hoping for.
Read also: Metaverse 101: What Is It, What Will It Look Like, And How Will It Develop