- The metaverse has opened the door to more severe forms of online abuse, mostly targeting women and minors.
- Several users of metaverse platforms, including Facebook’s, have reported online sexual abuse, while others say their children are being exposed to inappropriate content.
The age of the metaverse is well and truly upon us. Some of the world’s biggest companies, including Meta (formerly Facebook) and Microsoft, are investing fortunes to lead the new wave. Users are also flocking to metaverse platforms, some of which are built on the blockchain. But are they safe? Has online bullying taken on a new level of aggression and reach? Are women and minors safe in the metaverse? Recent reports indicate that we are a long way from a safe metaverse.
It’s easy to see why the metaverse is growing so rapidly. We now live in an age where online interactions have become the norm. We shop online, with Amazon now four times bigger than Walmart. Dating has moved online too, with Tinder becoming ubiquitous globally. And with the COVID-19 pandemic, meetings and other interactions have followed, through FaceTime, Zoom and the rebirth of Skype. The metaverse is an extension of all of these, with gaming being the entry point for most users.
But it’s not safe yet, as one Canadian woman recently recounted to the New York Times. Chanelle Siggens is an avid fan of Population One, a virtual reality game played with Facebook’s Oculus VR headset. As she waited for the action to begin, another player’s avatar approached hers, began groping it, and then simulated ejaculating onto it.
Her pleas for the abuse to stop fell on deaf ears.
He shrugged as if to say: ‘I don’t know what to tell you. It’s the metaverse — I’ll do what I want.’ Then he walked away.
An unsafe metaverse
Online abuse is nothing new. In its traditional form it has already proven deadly, with many people taking their own lives after being bullied online. In the metaverse, however, it is amplified: through VR headsets, and in some cases even full-body suits that relay physical sensations, the abuse feels far more ‘real.’
Siggens explained:
When something bad happens, when someone comes up and gropes you, your mind is tricking you into thinking it’s happening in the real world. With the full metaverse, it’s going to be so much more intense.
This aggravation hits women and minors especially hard, much as in everyday life. Several women have come forward to document the abuse they endure in virtual reality worlds, even forming support groups to help them cope.
Callum Hood, the head of research at the Center for Countering Digital Hate, recently spent weeks monitoring digital interactions on VRChat, a metaverse-like VR game. In one 11-hour period, he recorded more than 100 problematic incidents, some involving users below the age of 13. In some cases, the offenders made sexual or violent threats against these minors.
VRChat is unsafe because its developers and Facebook have failed to put basic measures in place to keep abusive users off the service. They have created a safe haven for abusers while at the same time inviting minors into the metaverse.
What’s worse for users is that those who should stamp out these bad actors are either unwilling or unable to do so.
Take Facebook, for instance. In March this year, Andrew Bosworth, the executive set to take over as CTO next year, conceded in an internal memo to employees that moderating how users act in the metaverse “at any meaningful scale is practically impossible.”
Right now, all a user can do is block and report the culprit and hope that the platform takes further action. Titania Jordan, an executive at Bark, a platform that uses AI to monitor children’s devices for safety, believes that most culprits end up back on the platforms, where they go on to abuse even more people.
She told the New York Times:
V.R. is a whole other world of complexity. Just the ability to pinpoint somebody who is a bad actor and block them indefinitely or have ramifications so they can’t just get back on, those are still being developed.
Facebook has committed to making the metaverse safe for everyone. As one of the biggest players in the space with its Oculus VR headsets, its actions will shape the direction other companies take. Mark Zuckerberg himself has pledged to bring more safety to the virtual world.
Kristina Milian, a spokesperson for the company, says that Meta is working with policymakers and experts to deliver on this pledge.
She stated:
We don’t allow content that attacks people based on race, ethnicity, national origin, religious affiliation, sexual orientation, caste, sex, gender, gender identity, and serious disease or disability.