Last week it was reported that a lawsuit had been filed against gaming giant Roblox and leading messaging platform Discord.
The court action, which accuses them of facilitating child predators and of misleading parents into believing the platforms are safe for their children, centers on a 13-year-old plaintiff who was targeted by a predator on both platforms.
The complaint, filed in San Mateo, states that the plaintiff joined Roblox and Discord in 2023 after his father researched the platforms and believed them to be safe for children.
Attorneys said the plaintiff’s parents “learned the truth [of this] only after it was too late.”
In 2024, the parents discovered their child had been groomed into sending explicit pictures and videos of himself to a 27-year-old man. The situation quickly escalated to the point that the plaintiff and his family had to move across the country after the predator discovered the boy’s location through Roblox.
The suit alleges that both Roblox and Discord know how easily predators can use their platforms to groom and manipulate children into sending explicit material, yet have failed to provide adequate safety measures to protect minors from such exploitation.
Anna Marie Murphy, an attorney with California-based law firm Cotchett, Pitre and McCarthy, said:
“Both Roblox and Discord, we allege, were negligent in the services they provided to our client, a 13-year-old boy. The predator used a function on Roblox called a whisper function that allowed him, as a complete stranger, to be in the game with our client and send a direct message. That was the first contact, and there was a request for a naked picture.”
Anapol Weiss, the other firm joining the case, explained more in a press release outlining its reasons for joining the suit:
“Roblox’s expansive and unsupervised ecosystem created an environment where predators and harmful content thrived and continue to thrive.”
The suit also alleges that the companies’ lax safety standards led to the plaintiff sending explicit material in exchange for “Robux,” Roblox’s in-game currency, stating:
“Roblox’s success and continued growth has hinged on its constant assurances to parents that its app is safe for children. In reality, Roblox is a digital and real-life nightmare for children.”
“What happened is far from an isolated event. The plaintiff is but one of countless children whose lives have been devastated because of Roblox and Discord’s misrepresentations and defectively designed apps,” Murphy commented.
This is not the first time these companies have been sued for allegedly enabling predators to exploit minors on their platforms. In 2022, they were named in a lawsuit filed in San Francisco Superior Court, where the companies were accused of misrepresenting their platforms as safe for children while allowing predators to exploit a young girl.
A report by corporate investigation firm Hindenburg Research, released in October 2024 and referenced in the Anapol Weiss statement, further alleged that Roblox has continued to prioritize profit and attractiveness to investors over the online safety of underage users, exposing children to an “X-rated hellscape, grooming, pornography, violent content and abusive speech” despite its claims of working to clean up the platform.
As this case unfolds, it highlights the ongoing concerns about child safety in online gaming and messaging platforms, emphasizing the need for stronger protective measures and parental vigilance in the digital age.
We don’t just report on threats—we remove them
Cybersecurity risks should never spread beyond a headline. Keep threats off your devices by downloading Malwarebytes today.