SYDNEY, Dec 9: In late November, Australia’s federal parliament enacted significant legislation prohibiting social media access for individuals under 16 years of age. While specifics are still unclear—including which platforms will be impacted and how the ban will be enforced—the government has indicated that trials of age verification technologies will be integral to its implementation. Currently, video games and gaming platforms do not fall under Australia’s social media ban; however, by examining China’s extensive use of age verification to regulate minors’ gaming habits, we can better understand potential enforcement issues.
In China, stringent regulations cap online gaming for anyone under 18 at just one hour on select days, highlighting the considerable challenges of compliance and privacy protection.
‘Spiritual opium’: the reality of video games in China
China boasts a substantial video game industry, with major companies like Tencent influencing the global gaming arena. However, the consumption of video games among young people remains a contentious topic in the country. Video games are often viewed through the lens of addiction and harm, frequently referred to as “spiritual opium.” This perspective frames gaming as a potential danger to the physical, mental, and social well-being of youth.
Many Chinese parents hold this view, seeing video games as an impediment to academic achievement and social development. Such parental concerns have led to strict regulations on children’s gaming, garnering widespread approval among families.
In 2019, a law was enacted limiting gaming time for under-18s to 90 minutes on weekdays and three hours on weekends, with a nightly curfew prohibiting play between 10 PM and 8 AM. A 2021 amendment tightened this further, allowing just one hour of play, from 8 PM to 9 PM, on weekends and holidays. In 2023, the regulatory framework expanded to cover livestreaming platforms, video-sharing sites, and social media, requiring these services to implement anti-addiction systems.
Enforcement strategies
To comply with these regulations, leading gaming companies in China have adopted a variety of mechanisms. Many games now feature age-verification systems that require players to input their real names and IDs for confirmation.
Some have even implemented facial recognition technology, raising privacy concerns. Concurrently, mobile device manufacturers, app developers, and app stores have introduced "minor modes" that restrict access by time of day, exempting only apps approved by parents.
A report from the China Game Industry Research Institute indicated that over 75% of minors now limit their gaming to less than three hours per week, and officials claim the problem of "internet addiction" among youth has been mitigated. However, enforcing these policies remains fraught with practical challenges and ethical concerns.
Does it work?
Despite stringent regulations, many young players in China find ways to circumvent the rules. A recent study found that over 77% of minors evaded real-name verification by creating accounts under older relatives’ or friends’ names. Additionally, a burgeoning black market for gaming accounts has emerged on e-commerce platforms, allowing minors to rent or purchase accounts to bypass restrictions.
Reports have indicated that minors are even outsmarting facial recognition systems by using images of older individuals, highlighting the shortcomings of tech-based enforcement. Furthermore, these restrictions have inadvertently exposed minors to scams by game-account sellers; in one reported case, nearly 3,000 minors lost over 86,000 yuan (approximately A$18,500).
Insights for Australia from China’s experience
The Chinese case suggests that failing to understand young people’s motivations for consuming media might drive them to bypass restrictions. A parallel situation could easily arise in Australia, undermining the intended effects of the social media ban. Prior to the law’s passage, many experts warned that outright bans enforced through questionable technology could be invasive and ineffective, potentially increasing online risks for youth.
Instead, Australian researchers and policymakers should collaborate with tech platforms to foster safer online environments by implementing age-appropriate content filters, parental controls, and screen time management tools, alongside comprehensive safety-by-design strategies.
Such measures would empower families while allowing young people to maintain vital social connections and engage in important recreational activities, ultimately promoting healthier online behaviors without compromising their privacy or personal freedom. (The Conversation)
(PTI)