The metaverse is set to create a whole new virtual universe for people to live, work, and play in – a place full of excitement and new opportunities. But alongside these benefits, the metaverse will also open up new avenues for criminal activity, posing fresh problems for law enforcement and challenging the legal system to update its procedures to deal with unfamiliar crimes. How well we handle this may determine whether the metaverse becomes a safe place to conduct business or descends into a lawless free-for-all. In this Mazer blog post, we will look at some of the contemporary threats the metaverse might pose to both businesses and individual users, and what might be done to protect people from them.
The metaverse will be populated with avatars: virtual representations of real-world users. As such, it may be possible to interfere with them, for example by damaging or destroying them. Would this be considered murder? Certainly not in the real-world sense – after all, the user would still be alive. But the individual may have spent money building the avatar, so could destroying it count as damage to a person’s property? What about things like virtual real estate (see our Mazer blog about real estate in the metaverse)? Will damage to virtual property be treated like vandalism? Why shouldn’t it be? These questions and many more will have to be settled if the metaverse is to run smoothly.
Some believe that crimes in the metaverse will never resemble physical acts such as murder or assault, and will instead be dealt with more like verbal, mental, or emotional abuse. Of course, virtual bullying has been around for years now, and people who spread extreme or hateful views are often banned from major social media platforms like Facebook and Twitter. But at what point will metaverse crimes become something that can be dealt with in a real-world court of law? That may depend on how far we go with the metaverse and whether we reach the point where avatars are treated more like physical people.
Avatars as people
As already mentioned, avatars are virtual beings that inhabit the metaverse. Each one is controlled by its physical owner, whose actual body is not under threat (unless they walk into something while wearing their headset, for example). That being the case, it would seem odd at best to give avatars the same protections as people. Indeed, in many games a person’s avatar is routinely ‘killed’ in the virtual world only to respawn moments later. However, there have already been cases of sexual harassment in the virtual world (as discussed in our previous Mazer blog on online security in the metaverse), and with haptic technologies improving every day, it may one day become possible to physically injure someone through the metaverse. These are certainly things we would want to protect people against, and this is when the law could be extended to cover such eventualities. The fact is, however, we don’t know exactly where the metaverse will take us, so it’s likely to be a question of learning as we go and adapting to whatever new changes occur.
Consequences for mental health
Sometimes, it is the mental anguish connected with assault and other violent crimes that stays with victims longest and has the most serious effects on their lives. In previous Mazer blog posts, we’ve covered how virtual training in the metaverse can help medical professionals learn how to perform surgeries, and how virtual environments can be used to treat certain mental health issues. If the metaverse can be used for such purposes, then it follows that acts of violence within the metaverse can have equally long-lasting psychological effects. Cyberbullying has already shown this to be true. Add a virtual avatar into the mix, and the potential for mental trauma is even higher. The adults of today didn’t grow up with virtual avatars. But imagine a child with an avatar they have bonded with over many years – one they have bought outfits for, taken to virtual shops and events, played as in virtual games, and inhabited to hang out with friends. Any violation of that avatar could affect the human behind it far more seriously, perhaps more than we can imagine given our current level of connection to the virtual world. So, if virtual crimes can affect us as badly as physical crimes, why not treat them the same way, with the same trials and punishments?
One big problem with crime in the metaverse will be how to catch and prosecute people effectively. At the moment, users can be booted or banned from online communities, but it’s much harder to track down and prosecute the individual in real life. Online criminals can be very sophisticated, often more so than those trying to protect users against them. They have clever ways of concealing themselves from other users, and catching them can be tremendously difficult. For example, with the development of NFTs (see our Mazer post on NFTs in the metaverse) meaning that users can own unique items in the metaverse, theft of virtual objects is now possible. Of course, NFTs can be tracked, so if someone steals one, it can be traced to their virtual wallet. However, hackers know how to cover their tracks and hide what they have stolen, making it hard to prove where it’s gone and who now has it. Even if an item could be traced, many people working in the legal system don’t even know what NFTs are. And how is an NFT to be valued? Right now, some NFTs are going for huge amounts of money, but their value is highly volatile, rising and falling with current trends. Needless to say, theft in the metaverse creates a whole host of problems for the legal system.
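As a rough illustration of why on-chain tracking both helps and falls short, here is a minimal Python sketch (the names and data are hypothetical, not a real blockchain API) that replays a token’s public transfer log to find its latest holder. Note how the trail simply ends at whatever wallet the thief moved the item to:

```python
from dataclasses import dataclass

@dataclass
class Transfer:
    token_id: str
    from_wallet: str
    to_wallet: str
    block: int  # position in the chain; later blocks happen later

def trace_current_holder(token_id: str, transfers: list[Transfer]) -> str:
    """Replay a token's transfer log in block order and return its latest holder."""
    events = sorted(
        (t for t in transfers if t.token_id == token_id),
        key=lambda t: t.block,
    )
    if not events:
        raise ValueError(f"no transfers recorded for token {token_id}")
    return events[-1].to_wallet

transfers = [
    Transfer("nft-42", "mint", "alice", block=100),
    Transfer("nft-42", "alice", "mallory", block=250),        # the theft
    Transfer("nft-42", "mallory", "mixer-wallet", block=251), # covering tracks
]
print(trace_current_holder("nft-42", transfers))  # mixer-wallet
```

The ledger tells us unambiguously *where* the token went, but nothing about *who* controls “mixer-wallet” in the real world – which is exactly the gap investigators face.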
Imagine you catch a virtual criminal. How should they be punished? If somebody ‘murders’ your avatar, what would be a reasonable reaction to that? Users can be kicked off platforms, but it’s easy for them to come back under a different guise, a problem that has plagued large-scale online games such as Call of Duty, where widespread cheating led to the game’s developers being heavily criticized. Activision recently introduced the Ricochet Anti-Cheat system, which has vastly improved the situation, but with large industries having grown up around hacks and cheats, cheat developers aren’t likely to give up easily.
Another issue is who to turn to if somebody commits a crime against you in the metaverse. If you enter a metaverse world like Decentraland, you could encounter people from all over the world, but you’ve got no idea who they actually are or where they live. So, say they harass, abuse, or scam you – who do you report it to? The platform? That leaves you with the same problems outlined earlier – the user might be banned but could simply log back in with a different account. Outside of the virtual world, who do you report the crime to? The desire to regulate the metaverse to make it safe for everyone involved may also work against some of its biggest advantages. Many people are excited about the decentralised nature of the metaverse, but the need to prosecute cyber criminals and protect virtual property might mean that centralisation cannot be avoided. How much of a negative effect this might have on people’s desire to join the metaverse remains to be seen, but it could certainly deter some.
Meta’s new CTO, Andrew Bosworth, seems to think that we simply have to accept a certain level of negative behaviour in the metaverse. He has openly admitted that VR environments can be toxic and, although he has stated Meta’s commitment to high levels of safety within the metaverse, he has also suggested that it will be very hard to control and moderate what people say and do in this space. Not exactly comforting, but maybe he’s just being realistic. Maybe entering the metaverse will involve some level of risk, but the benefits will be worth it, and, in the meantime, we must do what we can to minimize the threats and to punish those who perpetrate them.
Prevention or punishment
Such difficulties suggest that prevention might be better than punishment in the metaverse, at least for now. Meta has introduced a personal boundary around its avatars, which means that no one can come within 4 feet (about 1.2 metres) of another avatar. Safeguards like this will need to be put in place by developers as they build the metaverse; without them, it won’t become the immersive virtual world everyone is hoping for, and businesses and individual users will never trust it enough to invest, possibly leading to its collapse.
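At its core, a safeguard like Meta’s boundary is just a distance check between avatar positions, enforced every frame. A minimal Python sketch (the 1.2-metre figure comes from the paragraph above; the function name is our own):

```python
import math

PERSONAL_BOUNDARY_M = 1.2  # roughly the 4-foot boundary described above

def violates_boundary(pos_a, pos_b):
    """True if two avatar positions (x, y, z in metres) are closer than allowed."""
    return math.dist(pos_a, pos_b) < PERSONAL_BOUNDARY_M

# Avatars 0.5 m apart trigger the safeguard; avatars 2 m apart do not.
assert violates_boundary((0, 0, 0), (0.5, 0, 0))
assert not violates_boundary((0, 0, 0), (2.0, 0, 0))
```

In a real engine the check would feed into movement code that halts or deflects an approaching avatar before the threshold is crossed, rather than merely flagging the violation afterwards.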
Understandably, businesses looking to move into the metaverse will have big concerns about security. Companies as large as Facebook and JP Morgan have already built metaverse experiences, so they clearly consider it a safe space to do so, but the technology is still in its infancy, and no one knows what dangers might emerge. For the moment at least, the metaverse for business has done enough to protect its clients’ data. For example, on XR Wizards’ Mazer platform, user data is end-to-end encrypted, meaning that it is protected from interference by other users and cannot be seen even by XR Wizards themselves – it’s fully private. Such protections are going to be essential in the metaverse of the future, but who knows what other security measures metaverse businesses will need to take to ensure that their data, and that of their customers and clients, remains safe.
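To make “end-to-end” concrete: the defining property is that the decryption key exists only on the users’ devices, so the platform in the middle stores ciphertext it cannot read. Mazer’s actual scheme isn’t described here, so the following is only a toy illustration of the principle using a one-time pad from the Python standard library – not production cryptography:

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """One-time-pad style XOR; a toy illustration, not production crypto."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meeting in the virtual boardroom at 3pm"
key = secrets.token_bytes(len(message))  # shared only between the two endpoints

ciphertext = xor_bytes(message, key)     # all the platform ever stores
assert xor_bytes(ciphertext, key) == message  # only key-holders can recover it
```

Real end-to-end systems replace the pre-shared pad with a key-exchange protocol so two devices can agree on a key without the server ever learning it, but the core guarantee is the same: no key on the server, no plaintext on the server.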
It’s clear that there is still a lot to work out when it comes to metaverse crimes. A lot will depend on how secure developers can make their virtual worlds and how savvy users are as they navigate these metaverse spaces. In truth, we don’t know what new crimes might be committed in the metaverse, so the legal system is going to have to be flexible and pragmatic in its approach if we are to see the metaverse for business being taken as a serious proposition. In the future, as metaverse crimes become more sophisticated, cybersecurity companies may have their hands full developing technologies to help metaverse businesses and other users protect themselves from cybercrimes. Or maybe the metaverse will become so realistic that we will have to start treating it more like the real world.
Read also: The Role Of Blockchain In The Metaverse