Oct 7, 2024 - 16:38

Protecting Children from Immersive Technology Might Lead to Excessive Censorship
Efforts to ensure children's safety on two-dimensional online platforms, such as social media, could have unintended consequences for the immersive world of augmented and virtual reality, according to a report released Tuesday by a Washington, D.C.-based tech think tank.

Legislative measures like the Kids Online Safety and Privacy Act (KOPSA), which has already passed the Senate and is now under review in the House, may result in over-censorship of AR/VR content, the Information Technology & Innovation Foundation (ITIF) warned in the report.

Should KOPSA become law, AR/VR platforms might be forced to enforce content moderation similarly to traditional social media platforms, the report suggested.

By granting the Federal Trade Commission (FTC) the authority to determine what content is harmful, the bill risks prompting either the FTC or the platforms themselves to censor excessively in order to avoid liability, potentially affecting content important to children's education, entertainment, and identity, the report elaborated.

“One of the concerns we have with KOPSA is that it may pave the way for over-censorship by allowing the FTC to decide what is harmful,” said Policy Analyst Alex Ambrose, who authored the report.

Ambrose expressed further concerns in an interview with TechNewsWorld, stating, "It gives political powers the ability to determine what's harmful. For instance, the FTC might claim that content about environmental protection, global warming, or climate change causes anxiety and could decide to eliminate such content because it could cause distress in children."

Avoiding Over-Censorship
Andy Lulham, COO of VerifyMy, a London-based age and content verification company, recognized the concern of over-censorship but believes it is often overstated. "While the fear is understandable, it is largely misplaced," Lulham told TechNewsWorld. "Well-crafted regulations are not an enemy to free speech but a protector in the digital age."

Lulham stressed that the method of regulation is key. “Overly broad, heavy-handed regulations can result in excessive censorship,” he said. "However, a more balanced, principle-driven regulatory approach can preserve online freedoms while safeguarding vulnerable users. We've seen such balanced efforts in the case of privacy regulations like the GDPR."

The General Data Protection Regulation (GDPR), in force since 2018, is a broad data protection law in the European Union that dictates how businesses gather, store, and use personal information from EU citizens.

“I believe that regulations should concentrate on enforcing robust safety measures and systems rather than specifying individual content decisions,” Lulham added. “This shifts the focus onto platforms to develop sound trust and safety protocols, promoting innovation rather than fostering fear of content removal.”

Lulham also emphasized the importance of transparency. “Mandating transparency reports can help hold platforms accountable without resorting to heavy-handed censorship,” he explained. “This would not only prevent overreach but also build public trust in the platforms and regulatory processes.”

Moreover, he suggested that regulations should include clear and accessible appeals processes for content removals. “This can help fix mistakes and prevent unnecessary censorship.”

“Some critics might argue that regulation will inevitably result in some level of censorship,” Lulham acknowledged. “However, I believe the greater threat to free expression comes from unregulated environments where vulnerable users are subjected to abuse. Well-designed regulations can create a fairer environment, giving voice to those who might otherwise be drowned out.”

The Good, Bad, and Ugly of AR/VR
The ITIF report pointed out that online safety discussions often overlook AR/VR technologies. These immersive platforms encourage social interaction, creativity, and imagination—all vital for children's development, the report said.

However, it also acknowledged that addressing the risks to children posed by immersive tech is complex. Most AR/VR platforms are not designed for users under 13, and children often navigate adult-oriented spaces, which exposes them to inappropriate content and potentially harmful habits.

Solving these issues will require a combination of market innovation and careful policy decisions, the report added. Companies' choices in design, moderation practices, parental control tools, and trust and safety strategies will be crucial in shaping a safe metaverse environment.

Nonetheless, it admitted that public policy will still be needed to address specific risks. Policymakers are already focusing on children's safety on two-dimensional platforms, like social media, and these regulations could extend to AR/VR platforms, ITIF noted.

Before introducing such regulations, the report urged lawmakers to consider the safety efforts AR/VR developers have already made and, where those tools prove insufficient, to focus on addressing demonstrated harms rather than hypothetical threats.

“Most platforms are working to eliminate harmful content, but the sheer volume makes it inevitable that some will slip through,” Ambrose noted. “Issues like violence incitement, vandalism, and the spread of harmful content or misinformation are likely to persist in immersive platforms.”

“Since the metaverse will thrive on massive data usage, these problems could be even more pervasive than they are today,” she added.

Designing for Safety
Lulham echoed the report's view that companies' design choices will be critical in creating a safer metaverse.

“In my opinion, the safety decisions made by companies will be pivotal in ensuring a secure digital space for children,” he said. “The current online landscape is filled with risks, and I believe businesses have the power and responsibility to reshape it.”

He highlighted user interface design as the first line of defense for protecting children. “Companies that prioritize intuitive, age-appropriate designs can significantly alter how children interact with digital platforms,” he said. "By designing interfaces that naturally guide users toward safer behaviors, harmful interactions can be greatly reduced."

Content moderation, he argued, is also at a critical point. “The vast amount of content requires a shift in approach,” he observed. "While AI-powered tools are helpful, they aren’t a complete solution. A hybrid model—combining advanced AI with human oversight—may strike the right balance between protection and censorship."

Lulham also emphasized the importance of parental control tools. These tools should be central features of platforms, not add-ons. "I envision a future where these tools are so user-friendly and effective that they become a core part of digital family life," he said.

He also noted that trust and safety strategies will distinguish successful platforms from failing ones. “Companies that integrate comprehensive age verification, real-time monitoring, and transparent reporting will set the standard,” he said. “Collaboration with child safety experts and regulators will be essential for companies serious about protecting young users.”

“In summary,” Lulham concluded, “the future of online safety for children should be based on the principle of 'safety by design,' which must drive all aspects of platform development.”

The ITIF report emphasized that children will play a pivotal role in the metaverse's adoption.

Balancing innovation and safety in this evolving space will be a challenge, but parents, companies, and regulators all have important roles in protecting privacy and safety while fostering engaging and creative immersive experiences.
