‘Roblox’ enabled an underage girl’s sexual exploitation, lawsuit claims
Content warning: this article contains references to suicide and sexual exploitation.
The girl, who is now 13, began playing Roblox when she was “nine or ten years old”. There she came into contact with multiple adult men, who encouraged her to sign up to Discord, Snapchat and Instagram to communicate with them.
The lawsuit, which was filed in San Francisco Superior Court, also targets Discord Inc, Snapchat parent company Snap and Instagram parent Meta. Both Snap and Meta are currently facing dozens of similar lawsuits.
According to the lawsuit, Discord did not verify the girl’s age, despite requiring users to be at least 13 years old. A Discord spokesperson said the company has a “zero-tolerance policy for anyone who endangers or sexualizes children,” but did not comment directly on the lawsuit to Reuters.
The lawsuit claims that the group of men encouraged the girl to drink alcohol, abuse prescription drugs and send sexually explicit images of herself. The girl was also allegedly persuaded to send an unspecified amount of money to one of the men.
As a result of her experience, the girl went on to suffer severe mental health problems, leading to suicide attempts and hospitalisation, the lawsuit claims.
The plaintiffs claim that the companies involved in the case failed to prevent minors from using their platforms, and that Snapchat and Instagram encouraged addiction in children. They are seeking unspecified damages.
Roblox has been accused of failing to protect its young user base in the past. Last year, People Make Games investigated the dangers to young people on the platform, and how Roblox’s business model takes advantage of the enthusiasm of young and aspiring developers.
The post ‘Roblox’ enabled an underage girl’s sexual exploitation, lawsuit claims appeared first on NME.