For as long as Internet socialisation has existed, children have suffered at the hands of adults. In the early 2000s, it was chat rooms; in the 2010s, social media; in the future, it could be Facebook's Metaverse.
As reported by The Washington Post, the newly launched Metaverse appears to be the next-gen platform for child predators. But what is Facebook, or now Meta, doing to protect children in its virtual worlds?
Children are everywhere in Facebook's Metaverse
In the Washington Post article, a user of Facebook's Horizon Worlds platform discovered a virtual environment mostly populated by kids. Children as young as nine run around the VR world, swearing and annoying adults in the game.
Of course, children aren't supposed to be in Horizon Worlds. Despite its extremely child-friendly aesthetic, Facebook's Metaverse is strictly for adults aged 18 and above. However, just like on any social platform, children will find their way in.
Some children gleefully brag about using their parents’ VR headsets to access Horizon Worlds. Others simply know how to say they're 18 on an unverified Internet platform; who hasn't lied their way onto a platform they're too young to join?
Experts in online child protection claim that Facebook's VR platform is woefully unprepared for the inevitable wave of predators on its service. As with any emerging platform, predators will arrive, and Horizon Worlds isn't ready for them.
Horizon Worlds is a predator’s hunting ground
The current iteration of Horizon Worlds does absolutely nothing to protect children from being manipulated by adults. While long-running services like Roblox, Discord and other platforms used by children offer parental controls and safety features for child users, Horizon Worlds tells users to deal with it themselves, forcing individuals to mute or block others through clunky menus.
The platform’s lack of abuse preparation was made glaringly obvious earlier this week. In the first few months of launch, Horizon Worlds has suffered heavily from virtual gropers running in front of women and inappropriately “touching” their digital chests. In response, Meta made it so that avatars can no longer touch.
This “fix it when people notice it’s broken” attitude is exactly why experts are certain the platform will suffer heavily from predators. After all, the past often repeats itself. Sarah Gardner, vice president of external affairs at Thorn, a non-profit that combats child sexual abuse material (CSAM), explained that the platform is poised for exploitation.
“[Child predators] are often among the first to arrive,” Gardner said. “They see an environment that is not well protected and does not have clear systems of reporting. They’ll go there first to take advantage of the fact that it is a safe ground for them to abuse or groom kids.”
Toxicity was expected
Toxicity in a fully immersive online platform was expected, not just by consumers, but by Facebook as well. In leaked reports, Meta executives allegedly admitted that its Metaverse would be impossible to moderate effectively.
However, Facebook's complete failure to plan for an obvious wave of child users is a glaring oversight. For a company built on almost two decades of social media development, it can only be seen as highly irresponsible.