Online Safety Act · Age Verification · UK Regulation

New Trends in Game Expansion into the UK Market: Age Verification Design Requires Attention


March 24, 2026

Summary

Age verification for minors has long remained underdeveloped globally due to the need to balance privacy protection with verification effectiveness. While U.S. regulators continue exploring solutions such as privacy-preserving facial age estimation, the United Kingdom has taken a more concrete step through the passage of the Online Safety Act. The Act introduces broad obligations for online services, including risk assessments, age assurance mechanisms, and enhanced protections for minors against harmful and illegal content. For game companies, this implies new compliance requirements such as child risk assessments, age verification systems, and functional restrictions based on age groups. With significant penalties for non-compliance, the UK framework signals a shift toward stronger regulatory enforcement. Age verification is likely to become a central compliance focus for global digital platforms in the coming years.

In overseas markets, effective age verification measures for minor users have long been lacking. Due to the need to balance the protection of minors’ personal information with the effectiveness of verification mechanisms, legislative developments in this area remain at an exploratory stage worldwide. Recently, the U.S. Federal Trade Commission postponed its deadline for deciding on a new mechanism proposed by the ESRB to obtain parental consent using “privacy-preserving facial age estimation” technology.

Turning to Europe, the United Kingdom’s Online Safety Act (the “Act”) has now been approved by the House of Lords and is awaiting Royal Assent. The Act covers a broad range of issues, including minimizing the risk of children being exposed to harmful and age-inappropriate content, removing illegal content such as child sexual abuse material (CSAM), criminalizing fraudulent and scam advertisements, and introducing principled requirements for age verification for certain online services.

The UK government has stated that the Act aims to protect both children and adults from harmful or illegal online content. The regulatory scope of the Act includes companies with a “significant” number of users in the UK, or whose services are accessible to UK users and involve online risks. This includes internet platforms with user-generated content, such as game companies, dating websites, social media platforms, and adult websites.

Under the Act, all regulated services owe a duty of care in respect of illegal content. Where a service is likely to be accessed by children, it also has a duty to protect children from harm. Such services must carry out risk assessments against three categories of content harmful to children (primary priority / priority / harmful), with the scope of assessment depending on user scale and service functionality. The objective criteria for these categories will be set out in secondary legislation.

Section 31 of the Act provides that, for services or functionalities unsuitable for children, internet service providers may deny access or use by minors only where they have put in place age verification or other age assurance systems or processes. Section 32 sets out the requirements for conducting child access assessments.

For game companies, foreseeable compliance obligations under the Act include conducting child risk assessments to determine whether games and specific in-game features are suitable for minors. Developers must also design age assurance processes (such as age verification mechanisms) to identify players’ age or age range and provide age-appropriate gaming services.

Although the Act does not provide detailed objective guidance on specific age assurance mechanisms, such requirements are expected to be clarified through secondary legislation. Based on current regulatory priorities, age assurance mechanisms will likely be used to restrict access to certain high-risk features for specific age groups, such as voice chat or age-restricted content deemed inappropriate for younger players.

The Act also strengthens enforcement against illegal content on social media platforms, particularly content harmful to children, including:

pornographic content; content that promotes, encourages, or provides instructions for suicide, self-harm, or eating disorders (even if below the criminal threshold); content depicting or encouraging serious violence; and bullying content.

The Act designates Ofcom as the regulatory authority. Ofcom will have enforcement powers, including the ability to impose fines of up to £18 million or 10% of global annual turnover, whichever is higher, on non-compliant internet service providers. Where providers fail to comply with Ofcom enforcement notices concerning specific child safety duties or child sexual exploitation and abuse, Ofcom may also pursue criminal liability against senior executives.

As the cost of accessing online content continues to decrease, and with the proliferation of social media, short-form and long-form video platforms, and streaming services, minors are increasingly exposed to harmful content. In the gaming sector specifically, the growing complexity of game content, the high degree of social interaction, and the prevalence of user-generated content have led regulators to recognize that traditional pre-release age rating systems alone are insufficient to fully protect minors.

Whether through the ESRB’s exploration of facial age estimation or the implementation of the UK Online Safety Act, the expansion of minor protection through age verification is likely to become a key focus of regulatory efforts worldwide in the coming years.

