
NetFamilyNews.org

Kid tech intel for everybody


The missing piece in US child online safety law

September 7, 2023 By Anne

“Helpdesk” by DALL-E & me

US kids and parents need a toll-free number to call or text for help getting harmful online content taken down. After studying the kinds of help that youth and parents in Europe, the UK, Australia and New Zealand already have, we at The Net Safety Collaborative piloted a proof of concept for such a service – with independent evaluation – in the last decade.

Now, with so many state and federal laws aimed at protecting children passed or in the pipeline, it's past time our policymakers took up this missing piece of the child online protection toolbox and required it by law. It would help every party to the harmful-but-legal content challenge: kids, parents and schools, as well as platform content moderators.

What US kids need and deserve

My friend and colleague, Prof. Sameer Hinduja of the Cyberbullying Research Center, just published two must-read blog posts about all the legislation, the first one looking at the problems with what has been proposed or passed so far and the second offering six components for comprehensive law that “would have the greatest positive impact” for young Internet users.

I agree that all six components are critically needed, and I respectfully suggest a 7th crucial one – or maybe a second piece to his 5th element that calls for the establishment of “an industry-wide, time-bound response rate when formal victimization reports (with proper and complete documentation and digital evidence) are made to a platform.”

This second part would help platforms attain that response rate: an independent source of help that supports platform content moderators by confirming harm called in by kids and caregivers, thereby reducing guesswork and "false positives" in the platform's abuse-reporting system. False-positive reports plague platforms because they're not actionable – content that doesn't appear to violate platform rules or "community standards," that's mis-reported, or that abuses the abuse system itself (users reporting someone they're trying to get in trouble or kicked off the platform). In short, a helpline can provide the "real world" context that platform content moderators can never have on their own – and without which they usually can't see how traumatizing particular content is for a child.

Real life context

By talking with the child, their parent or a teacher, a helpdesk gets that offline context and can confirm for the platform that the content is harmful. It just needs to be a “trusted flagger” or trusted partner of the platform – an external, statutory part of the overall abuse-reporting system – in order to be of real help to content moderators as well as kids. Help goes both ways when a helpline is also a trusted flagger.

In Europe, “trusted flaggers” are now codified in law, in the Digital Services Act that just went into effect for the world’s largest platforms. Researchers in Australia and Switzerland recently looked into whether trusted flaggers can help and found that they “can indeed reduce the spread of harmful content.”

Research grounding

There is US-based research confirming that this kind of help is needed. Researchers at the University of New Hampshire studied the two main options Americans have for Internet "help-seeking": abuse reporting and the police. Looking at "11 different types of technology-facilitated abuse," they found "very low rates of reporting" (7.3% to platforms and 4.8% to police, respectively). They also found that "only 42.2% said the website did something helpful and only 29.8% found police helpful." The authors clearly state that better help is needed in both cases. But as you can see, I'm suggesting another independent third party that would fill gaps neither apps nor cops can truly fill. Police can't really help with content that's "awful but lawful," platforms lack that offline context – for example, what's happening in a peer group at school – and a helpline is all about meeting exactly those needs.

This is not to say that there aren't already trusted flaggers in the US – nonprofit helper organizations and hotlines that have relationships with certain platforms. It's just that there's no transparency around who they are, who they help and what platforms they work with. "Some flaggers are more equal than others," as researchers put it in the Yale Journal of Law & Technology. A law that sets up a central source of help all kids can find, and requires support from all platforms and transparency about their separate and mutual work, would fix that problem.

What the law might include

Ok, so to summarize: Ideally, a US helpline law would require a centralized go-to source of Internet help familiar to all US kids, parents and educators. It would include:

  • A call/text center that both houses and refers to expertise in child development, mental healthcare, children's digital practices and interests, K-12 school culture and other key aspects of US kids' everyday lives (ideally with young people as agents, interns or on call to help adult agents)
  • Emergency referrals to law enforcement (knowing when to call 911) and the National Center for Missing & Exploited Children’s CyberTipline, as well as other forms of support for children, including in social services and specialized hotlines and helplines for vulnerable groups
  • Either prompt action for users or a prompt explanation of why their reports can't be actioned, based on information provided by platform moderators
  • An office that either qualifies other organizations as trusted flaggers, as do the “digital services coordinators” established in Europe’s DSA, or publishes a list of Internet industry trusted flaggers and tracks industry compliance with this law
  • An office that coordinates relationships with platform contacts or content-moderation managers and maintains a continuously updated confidential list of those contacts
  • A guarantee that all vulnerable groups are served by the trusted flaggers on that list
  • A requirement that platforms include in their transparency reports the number of reports received from the Internet Help Center and all trusted flaggers, and the percentage of them actioned
  • A requirement that platforms algorithmically promote the Internet Help Center's and specialized helplines' contact information (as platforms have long done with the Suicide Prevention Lifeline)
  • Deep knowledge of industry community standards, rules and terms of service, to ensure that only violating content is escalated to platforms.
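To make two of the ideas in the list above concrete – the "time-bound response rate" and the transparency-report metric on the percentage of reports actioned – here is a minimal sketch in Python of what a trusted-flagger report record and those checks could look like. All the names and the 24-hour window are hypothetical illustrations; neither this post nor any current law specifies a data format.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone
from typing import Optional

# Hypothetical response window; an actual law would set the number.
RESPONSE_WINDOW = timedelta(hours=24)

@dataclass
class FlaggerReport:
    """One escalation from the helpline (a 'trusted flagger') to a platform."""
    report_id: str
    platform: str
    rule_violated: str                     # the platform rule the content breaks
    evidence_urls: list = field(default_factory=list)
    offline_context: str = ""              # e.g. what's happening in a peer group at school
    filed_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    resolved_at: Optional[datetime] = None # set when the platform acts or explains

    def within_sla(self) -> bool:
        """True if the platform responded (or could still respond) in time."""
        end = self.resolved_at or datetime.now(timezone.utc)
        return end - self.filed_at <= RESPONSE_WINDOW

def actioned_percentage(reports: list) -> float:
    """The transparency-report metric: share of reports resolved within the window."""
    if not reports:
        return 0.0
    on_time = sum(1 for r in reports if r.resolved_at and r.within_sla())
    return 100.0 * on_time / len(reports)
```

The point of the sketch is the pairing of fields: the offline context the helpline supplies travels with the evidence, and the filed/resolved timestamps are exactly what a regulator would need to audit compliance with a statutory response window.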

The above may not fully solve the sheer scale problem that content moderation represents – probably nothing will – but research shows it will help young Internet users in this country more than they’re being helped right now. Algorithmic moderation is getting better and better at preemptively easing the demands on post-facto content moderation so that maybe, just maybe, more human moderators can be devoted to working with human helpline agents on content that is legal but traumatizing to kids. Let’s make this happen for our kids. It’s time, don’t you think?

Related links

  • “On Trusted Flaggers” in the Yale Journal of Law and Technology
  • “Trusted Flagger Programmes: Guidelines and Best Practice” from the UK Council for Internet Safety – includes principles and expectations of platforms as well as trusted flaggers (much of the guidance our helpline pilot used is represented in this 2-page document)
  • The “trusted flaggers” part of Europe’s Digital Services Act
  • “Who’s Afraid of the DSA?” – details in Tech Policy Press about which companies must be in compliance now, and how
  • Human help needed: “Research has shown that tools based on artificial intelligence struggle to detect online harmful content. Authors of such content [such as teen-age harassers] are aware of the detection tools, and adapt their language to avoid detection,” researchers report.
  • The Crimes Against Children Research Center on why help is needed beyond reporting to platforms and police
  • Lessons we learned from piloting a social media helpline for US schools

Filed Under: Law & Policy, Risk & Safety, Youth

Reader Interactions

Comments

  1. Lesley Podesta says

    September 7, 2023 at 3:28 pm

    This is so thoughtful Anne and you’ve nailed it. Really impressive work


Trackbacks

  1. What child online safety really needs, senators - NetFamilyNews.org says:
    February 2, 2024 at 12:14 pm

    […] about its work with them and no legal requirement that they do so or how. I provide more details here and […]

  2. Sharing Diigo Links and Resources (weekly) | Another EducatorAl Blog says:
    September 10, 2023 at 8:22 pm

    […] The missing piece in US child online safety law – NetFamilyNews.org […]



Copyright © 2025 ANNE COLLIER. ALL RIGHTS RESERVED.