Overseas markets have long lacked effective age verification measures for minor users. Because regulators must balance the protection of minors’ personal information against the effectiveness of verification mechanisms, legislation in this area remains at an exploratory stage worldwide. Recently, the U.S. Federal Trade Commission postponed its deadline for deciding on a new mechanism proposed by the ESRB that would obtain parental consent using “privacy-preserving facial age estimation” technology.

Turning to Europe, the United Kingdom’s Online Safety Act (the “Act”) has been approved by the House of Lords and is awaiting Royal Assent. The Act covers a broad range of issues, including minimizing the risk that children encounter harmful and age-inappropriate content, removing illegal content such as child sexual abuse material (CSAM), tackling fraudulent and scam advertising, and introducing principles-based age verification requirements for certain online services.

The UK government has stated that the Act aims to protect both children and adults from harmful or illegal online content. Its regulatory scope covers companies with a “significant” number of UK users, as well as services that are accessible to UK users and pose online risks. This captures internet platforms hosting user-generated content, such as game companies, dating websites, social media platforms, and adult websites.
Under the Act, all regulated services owe a duty of care in respect of illegal content. Where a service is likely to be accessed by children, it also owes a duty to protect children from harm. Services involving children must carry out risk assessments covering three categories of content harmful to children (primary priority content, priority content, and other harmful content), with obligations varying by user scale and service functionality; the objective criteria for these categories will be set out in secondary legislation.

Section 31 of the Act provides that, for services or functionalities unsuitable for children, internet service providers may restrict access or use by minors only where they have put in place age verification or other age assurance systems or processes. Section 32 further sets out the requirements for conducting children’s access assessments.

For game companies, foreseeable compliance obligations under the Act include conducting children’s risk assessments to determine whether a game, and specific in-game features, are suitable for minors. Developers must also design age assurance processes (such as age verification mechanisms) to establish a player’s age or age range and provide age-appropriate gaming services.
Although the Act does not provide detailed objective guidance on specific age assurance mechanisms, such requirements are expected to be clarified through secondary legislation and Ofcom codes of practice. Based on current regulatory priorities, age assurance will likely be used to restrict specific age groups’ access to certain high-risk features, such as voice chat or age-restricted content deemed inappropriate for younger players, as sketched below.
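To make the feature-gating idea concrete, the following sketch shows one way a game service might map an age assurance result to feature availability. It is purely illustrative: the Act is technology-neutral and prescribes no particular scheme, and the age bands, feature names, and thresholds here are hypothetical assumptions, not drawn from the Act or any Ofcom guidance.

```typescript
// Hypothetical sketch of age-based feature gating. The bands, features,
// and thresholds are illustrative assumptions, not regulatory requirements.

type AgeBand = "under13" | "13to15" | "16to17" | "adult" | "unknown";

interface Feature {
  name: string;
  minBand: AgeBand; // least-restricted band allowed to use this feature
}

// Bands ordered from most to least restricted, so they can be compared.
const bandOrder: AgeBand[] = ["under13", "13to15", "16to17", "adult"];

const features: Feature[] = [
  { name: "voiceChat", minBand: "16to17" },            // high-risk social feature
  { name: "userGeneratedContent", minBand: "13to15" },
  { name: "ageRestrictedStore", minBand: "adult" },
];

function canAccess(band: AgeBand, feature: Feature): boolean {
  // Fail closed: if age assurance produced no usable result,
  // treat the player as a child and disable the gated feature.
  if (band === "unknown") return false;
  return bandOrder.indexOf(band) >= bandOrder.indexOf(feature.minBand);
}

// Example: a player whose assured age band is 13-15.
for (const f of features) {
  console.log(`${f.name}: ${canAccess("13to15", f) ? "enabled" : "disabled"}`);
}
```

The “fail closed” default, which treats an unknown result as a child, mirrors the Act’s child-safety orientation; any real mapping of bands to features would need to follow the eventual secondary legislation and Ofcom guidance.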

The Act also strengthens enforcement on social media platforms against illegal content and content harmful to children, including: pornographic content; content that promotes, encourages, or provides instructions for suicide, self-harm, or eating disorders (even where it falls below the criminal threshold); content depicting or encouraging serious violence; and bullying content.
The Act designates Ofcom as the regulatory authority, with enforcement powers including the ability to fine non-compliant internet service providers up to £18 million or 10% of global annual turnover, whichever is higher. Where providers fail to comply with Ofcom enforcement notices concerning specific child safety duties or child sexual exploitation and abuse, Ofcom may also pursue criminal liability against senior managers.
As the cost of accessing online content continues to decrease, and with the proliferation of social media, short-form and long-form video platforms, and streaming services, minors are increasingly exposed to harmful content. In the gaming sector specifically, the growing complexity of game content, the high degree of social interaction, and the prevalence of user-generated content have led regulators to recognize that traditional pre-release age rating systems alone are insufficient to fully protect minors.
Whether through the ESRB’s exploration of facial age estimation or the implementation of the UK Online Safety Act, expanding the protection of minors through age verification is likely to become a key focus of regulatory efforts worldwide in the coming years.
