
Investigation of VRChat finds rampant child grooming and other safety issues

"Predatory and toxic behavior has no place on the platform,” the game's developer said.


One of the more popular VR apps you can download through Steam and Meta’s Oculus Quest Store has a child safety problem. If you’re unfamiliar with VRChat, the app styles itself as “the future” of social virtual reality. “Our vision for VRChat is to enable anybody to create and share their own social virtual worlds,” the game’s developer says on its Steam store page. With some understanding of Unity, players can create their own social spaces and avatars. That means you can see a lot of creativity on display in VRChat, but there’s also a dark side to it, as the BBC found out.

Posing as a 13-year-old girl, BBC researcher Jess Sherwood said she entered a virtual strip club where she saw adult men chase a child while telling them to remove their clothes. Sherwood frequently saw condoms and sex toys on display in the rooms she visited, and on one occasion she even saw a group of adult men and minors simulating group sex. She also saw instances of grooming.

"It's very uncomfortable, and your options are to stay and watch, move on to another room where you might see something similar, or join in — which, on many occasions, I was instructed to do," she said.

"Predatory and toxic behavior has no place on the platform,” VRChat told the BBC. The developer added it was “working hard to make itself a safe and welcoming place for everyone.” Part of the problem stems from the fact nearly anyone can download and play VRChat. For instance, to download the app from the Oculus Quest Store, all you need is a Facebook account. Sherwood created a fake profile to set up her account and access VRChat, and users of all ages can mingle freely without age gating.

When we contacted the company, a spokesperson for VRChat told Engadget that user safety was its top priority. "It is likely that, if it were reported, much of the content that you described would be removed immediately," they said. "Likewise, the users you described were acting in a way that would almost certainly lead to the termination of their accounts had they been reported to our Trust and Safety team."

The spokesperson added that VRChat includes a number of tools for reporting harassment. The company's Trust and Safety team has the ability to use metadata to track down problematic users. Depending on the severity of someone's actions, that team can hand out suspensions and permanent bans.

Sherwood isn’t the first person to notice VRChat has a child safety problem. While the game has a “Very Positive” rating on Steam, the presence of predatory adults is something you see referenced frequently in both positive and negative reviews. “Enjoyable social VR slowly being ruined by horny degenerates and ddosers,” said one player with more than 2,300 hours spent in the game.

“The amount of ‘people’ around the age of 30 attempting to do stuff like flirting all the way to trying to have e-sex with clearly underage users is alarming,” said another player.

VRChat isn’t the only metaverse app dealing with what amounts to a harassment problem. At the start of February, Meta added a Personal Boundary feature to Horizon Worlds to give users the ability to prevent people from entering their personal space. More recently, Microsoft took the dramatic step of removing AltspaceVR’s social hubs.

Update 3:06PM ET: Added more information from VRChat.