
Roblox refutes underage gambling allegations in class action lawsuit

Roblox has issued a detailed response to a class-action lawsuit accusing the online gaming platform of facilitating gambling by underage users through unauthorised third-party websites.

The lawsuit, brought by parents on behalf of their children, alleges that these websites exploit Roblox’s virtual currency, Robux, to enable gambling-like activities that violate consumer protection and child safety laws.

Roblox firmly denies the allegations, characterising the claims as legally and factually baseless. In its court filings, the company maintains that the disputed activities occurred outside its platform in violation of its robust terms of service.

Roblox asserts that it actively works to combat unauthorised uses of its platform, including actions against the named third-party sites — RBXFlip, Bloxflip, and RBLXWild.

At the core of the lawsuit are allegations that third-party websites use Robux to simulate gambling mechanics, enticing minors to participate.

These platforms allegedly allow users to wager Robux on games of chance, with the potential to win or lose virtual currency or items. Plaintiffs claim that Roblox profits indirectly by selling Robux, knowing the currency might be used on such sites.

Roblox counters these claims by emphasising its lack of affiliation with the third-party websites. The company states that users must leave the Roblox platform to access these sites and that such activities breach its terms of service.

Specifically, Roblox prohibits users from exchanging Robux outside its ecosystem and requires users to agree to these restrictions upon creating an account.

The company also highlighted its measures to disrupt the unauthorised activities. Since 2019, Roblox has employed technological safeguards and issued cease-and-desist letters to operators of third-party gambling sites.

Most recently, Roblox this month sent a legal demand to the operators of Bloxflip, ordering them to stop using Roblox branding and to cease accessing Roblox’s platform entirely.

Roblox’s affirmative defences

Roblox’s legal defence includes invoking Section 230 of the Communications Decency Act, which shields online platforms from liability for third-party content. The company argues that it cannot be held responsible for actions taken by users or third parties on external platforms.

Roblox further claims that the plaintiffs lack clean hands, asserting that the minors voluntarily participated in the alleged misconduct on third-party sites that the company says are beyond its control.

Moreover, Roblox underscores its clear delineation between Robux and real-world currency. The company’s terms of use specify that Robux is a virtual currency without monetary value and cannot be legally converted outside its platform.

Roblox asserts that, by adhering to these terms, it has acted within the law.

Despite its defences, Roblox acknowledges the challenges of policing a platform with millions of daily users, many of whom are minors.

According to its recent data, Roblox boasts 79.5 million daily active users, nearly half of whom are under 18. These demographics, coupled with the platform’s popularity among young players, create unique vulnerabilities.

In its filings, Roblox detailed its proactive measures to detect and mitigate policy violations, including tools to identify and suspend accounts engaged in unauthorised transactions. However, the company admits that third-party sites continually attempt to evade detection by exploiting loopholes.

The plaintiffs in the case argue that Roblox has a moral and legal obligation to protect its young user base from harmful external influences. They allege that the company’s current measures are insufficient and seek damages and the introduction of stronger enforcement policies.

Legal experts note that the case could have broader implications for online platforms managing virtual economies. If Roblox is found liable, it could set a precedent for increased regulatory scrutiny of platforms that cater to minors.

Roblox responds to criticism

Roblox is introducing additional measures to enhance safety for its youngest and most at-risk users. The latest updates include remote parental controls, content-specific labels, and restrictions on direct messaging.

After unveiling a set of parental controls in October, Roblox is now enabling parents to manage their children’s accounts remotely through their own devices and accounts. These controls allow parents to set daily screen time limits and review their children’s friends list.

To use these features, parents must link their Roblox accounts to their child’s account. Additionally, new content labels will categorise games based on their features, such as violence or crude language, rather than just age recommendations.

Children under nine will automatically have access only to games with minimal or mild levels of such content. However, parents can customise which labels their child can view and interact with.

Roblox is also implementing age-based restrictions on communication features. Players under 13 will be limited to in-game chats and will not have access to direct messaging outside of games.

A new default setting will further restrict younger users from public broadcast messages within games. Moreover, Roblox is changing its default settings to block players under 13 from accessing games that lack content ratings.

Unrated games, which often lack details about their content, such as levels of violence or sensitive material, will not be searchable or playable by children under 13. To address this, developers must complete a content questionnaire for each game they want to make available to younger players.

This process will provide Roblox with the information needed to assign appropriate ratings. Developers have until 3 December to submit this information, or their games will be marked as unrated and inaccessible to preteens.
