Protecting Kids From Immersive Tech Could Lead to Over-Censorship
Attempts to protect children's safety in the two-dimensional realm of online social media could adversely impact the 3D world of augmented and virtual reality, according to a report released Tuesday by a Washington, D.C., technology think tank.
Legislative efforts, like the Kids Online Safety and Privacy Act (KOPSA), which has passed the U.S. Senate and is now before the House of Representatives, could lead to harmful censorship of AR/VR content, maintained the report by the Information Technology & Innovation Foundation.
If KOPSA becomes law, AR/VR platforms may be forced to ramp up enforcement in the same manner as traditional social media platforms, the report explained.
By giving the FTC authority to deem content on these platforms harmful, it continued, the agency may over-censor content on AR/VR platforms, or the platforms themselves may censor content to avoid liability, which could include content pertinent to children's education, entertainment, and identity.
"One of our fears that we have with KOPSA is that it opens the door for potential over-censorship by giving the FTC [Federal Trade Commission] power to decide what qualifies as harmful," said the report's author, Policy Analyst Alex Ambrose.
"It's another way for a political party to decide what's harmful," she told TechNewsWorld. "The FTC could say content like environmental protection, global warming, and climate change is anxiety-inducing. So we need to completely get rid of anything related to climate change because it could lead to anxiety in children."
Over-Censorship Can Be Avoided
Andy Lulham, COO of VerifyMy, an age and content verification provider based in London, acknowledged that the specter of over-censorship looms large in discussions about online regulation. "But I firmly believe this fear, while understandable, is largely misplaced," he told TechNewsWorld. "Well-crafted government regulations are not the enemy of free expression, but rather its guardian in the digital age."
Lulham maintained that the key to regulation lies in the approach. "Blanket, heavy-handed regulations risk tipping the scales towards over-censorship," he said. "However, I envision a more nuanced, principle-based regulatory framework that can enhance online freedom while protecting vulnerable users. We've seen examples of such balanced approaches in privacy regulations like GDPR."
The GDPR (General Data Protection Regulation), which has been in effect since 2018, is a comprehensive data protection law in the European Union that regulates how companies collect, store, and use the personal data of EU residents.
"I strongly believe that regulations should focus on mandating robust safety systems and processes rather than dictating specific content decisions," Lulham continued. "This approach shifts the responsibility to platforms to develop comprehensive trust and safety strategies, fostering innovation rather than creating a culture of fear and over-removal."
He asserted that transparency will be the linchpin of effective regulation. "Mandating detailed transparency reports can hold platforms accountable without resorting to heavy-handed content policing," he explained. "This not only helps prevent overreach but also builds public trust in both the platforms and the regulatory framework."
"Furthermore," he added, "I advocate for regulations requiring clear, accessible appeal processes for content removal decisions. This safety valve can help correct inevitable mistakes and prevent unwarranted censorship."
"Critics might argue that any regulation will inevitably lead to some censorship," Lulham conceded. "However, I contend that the greater threat to free expression comes from unregulated spaces where vulnerable users are silenced by abuse and harassment. Well-designed regulations can create a more level playing field, amplifying diverse voices that might otherwise be drowned out."
Good, Bad, and Ugly of AR/VR
The ITIF report noted that conversations about online safety often overlook AR/VR technologies. Immersive technologies foster social connection and stimulate creativity and imagination, it explained, and play, imagination, and creativity are all essential to children's development.
The report acknowledged, however, that properly addressing the risks children face with immersive technologies is a challenge. Most existing immersive technologies are not made for children under 13, it continued. Children explore these adult-designed spaces, exposing them to age-inappropriate content and fostering habits and behaviors that can harm their mental and social development.
Addressing these risks will require a combination of market innovation and thoughtful policymaking, it added. Companies' design decisions, content moderation practices, parental control tools, and trust and safety strategies will largely shape the safety environment in the metaverse.
It conceded, however, that public policy interventions are necessary to tackle certain safety threats. Policymakers are already addressing children's safety on "2D" platforms such as social media, leading to regulations that may affect AR/VR tech, ITIF noted.
Before enacting those regulations, the report recommended policymakers consider AR/VR developers' ongoing safety efforts and ensure that these tools maintain their effectiveness. When safety tools are insufficient, it continued, policymakers should focus on targeted interventions to address proven harms, not hypothetical risks.
"Most online services are working to remove harmful content, but the sheer amount of that content online means that some of it will inevitably slip through the cracks," Ambrose said. "The issues we see on platforms today, like the incitement of violence, vandalism, and spreading harmful content and misinformation, will only continue on immersive platforms."
"The metaverse is going to thrive on massive amounts of data, so we can assume that these issues will be pervasive, maybe even more pervasive than what we see today," she added.
Safety by Design
Lulham agreed with the report's contention that companies' design decisions will shape the safety environment of the metaverse.
"In my view, the decisions companies make regarding online safety will be pivotal in creating a secure digital environment for children," he said. "The current landscape is fraught with risks, and I believe companies have both the responsibility and power to reshape it."
He maintained that user interface design is the first line of defense to protect kids. "Companies prioritizing intuitive, age-appropriate designs can fundamentally alter how children interact with online platforms," he explained. "By crafting interfaces that naturally guide users towards and educate them on safer behaviors, we can significantly reduce harmful encounters."
Content moderation is at a critical juncture, he added. "The volume of content demands a paradigm shift in our approach," he observed. "While AI-powered tools are essential, they're not a panacea. I argue that the future lies in a hybrid approach, combining advanced AI with human oversight to navigate the fine line between protection and censorship."
Parental control tools are often overlooked but crucial, he maintained. These shouldn't be mere add-ons but core features designed with the same attention as the main platform. "I envision a future where these tools are so intuitive and effective that they become integral to family digital life," he said.
He contended that trust and safety strategies will differentiate thriving platforms from faltering ones. "Companies adopting a holistic approach, integrating robust age verification, real-time monitoring, and transparent reporting, will set the gold standard," he declared. "Regular engagement with child safety experts and policymakers will be non-negotiable for companies serious about protecting young users."
"In essence," he continued, "I see the future of online safety for children as one where 'safety by design' isn't just a buzzword but the fundamental principle driving all aspects of platform development."
The report noted that children, as drivers of the metaverse, play a crucial role in the market adoption of immersive technologies.
Ensuring that innovation can flourish in this nascent field while creating a safe environment for all users of AR/VR technology will be a complex challenge, it acknowledged. Parents, corporations, and regulators all have roles to play in balancing privacy and safety concerns while creating engaging and innovative immersive experiences.