NetFamilyNews.org

Kid tech intel for everybody

If anything needs to go viral, it’s this…

July 2, 2021 By Anne

…the message, “Report it. Don’t share it.”  The “it” in this public awareness campaign Facebook just launched is child sexual abuse material (CSAM), the accurate term for what is typically called “child pornography” in the United States.

Thankfully, it’s extremely unlikely you’ll ever see content like this. “The prevalence of this content on our platform is very low,” Facebook researchers report, “meaning sends and views of it are very infrequent.” In the first quarter of this year (the latest figure available), the prevalence figure was 0.05%, meaning that, “of every 10,000 views of content on Facebook, we estimate no more than 5 of those views” are CSAM, according to Facebook’s transparency report. Put another way, compared with the 5.5 million pieces of bullying and harassment content that moderation teams “actioned,” 812,000 pieces of CSAM were removed from the Facebook platform, and 98.1% of those pieces were removed before anyone reported them, according to the same report.
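
If it helps to see the arithmetic behind that prevalence metric, here is a minimal, purely illustrative sketch in Python (not anything from Facebook or its transparency report) converting the 0.05% prevalence figure into the “views per 10,000” framing the report uses:

    # Illustrative arithmetic only -- not Facebook code; figures are the ones cited above.
    def views_per_10k(prevalence_percent: float, sample_views: int = 10_000) -> float:
        """Estimated number of violating views in a sample of `sample_views` total views."""
        return prevalence_percent / 100 * sample_views

    print(views_per_10k(0.05))  # 0.05% of 10,000 views -> 5.0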

Why is it crucial for social media users, as well as platforms, to report this content? Because every share and view re-victimizes the child depicted in the image or video. The sharing has to stop.

Sharing = re-victimizing

It’s probably hard for anyone reading this post to understand why any social media user needs to hear “report it, don’t share it.” It’s because even people who have no intention of harming a child share this material. More than 90% of this content is the same as or very similar to previously reported content, Facebook researchers found, meaning most of it is re-shared copies of material that has already been reported, not newly posted content. Senseless, right? So much (re-)victimization of children comes from users sharing this content onward. So the researchers had to know why: what were the intentions behind this sharing?

[Image: In-app user education message (easier-to-read version here)]

They needed to understand this for better prevention and intervention, they write.

On the intervention side, the goal was to provide more context to go with the reports the moderation teams send to NCMEC (the National Center for Missing and Exploited Children, to which US companies are required by federal law to report CSAM found on their servers). More context means more information to help law enforcement find and help victims faster. On the prevention side, the goal was to refine the language in pop-ups and other educational messages in apps so they really connect with the users who have no intention of harming children: people who may just be appalled or in shock and want to share their outrage, and who wouldn’t share the content if they knew it re-victimizes children.

Maps to sexting typology

The research turned up a whole spectrum of intentions, both malicious (intent to harm children) and nonmalicious, and the result was a “taxonomy of intent.” I’ll let you click to this page if you’d like to see the full range, with definitions and examples at the bottom. What’s interesting to me is how the malicious-nonmalicious taxonomy maps to the youth sexting typology that the Crimes Against Children Research Center published in 2011 as guidance for law enforcement. The CCRC’s two categories are “aggravated” sexting (involving an adult, or abusive elements such as sexual abuse, extortion, threats, interpersonal conflicts, or the creating/sharing of images without the knowledge or consent of the minor depicted) and “experimental” sexting (produced by the minor to share with an established dating partner, to “create romantic interest in other youth, or for reasons such as attention-seeking,” but with “no criminal behavior beyond the creation or sending of images, no apparent malice and no lack of consent by the minor depicted”).

Hard-working collaborators

Research insights like these are being shared and acted on in an important industry collaboration called the Technology Coalition, whose 22 member companies, big and small, work across different sectors and levels of the Internet to eradicate child sexual abuse online. I’d like to see this kind of cross-industry collaboration working as effectively against online bullying, harassment and hate speech for vulnerable users of all ages, but at least we have a proven model for that in the Tech Coalition.

As for collaboration across whole sectors – industry, government and NGOs – watch a remarkable TED Talk by Julia Cordua, CEO of Thorn, which is building technology that connects those dots “so we can swiftly end the viral distribution of abuse material and rescue children faster.” The video paints the whole picture – the problem, what needs to be done, what is being done and who’s working on it – in a little over 13 minutes. As of this writing, it has gotten nearly 1.8 million views.

Related links

  • A year ago, the Technology Coalition announced “Project Protect: A plan to combat online child sexual abuse,” after “in-depth consultation with more than 40 experts on CSEA [child sexual exploitation and abuse] around the globe.” With its investment in independent research, its forum for experts and its sharing of all that’s learned with the public, this is the outstanding model I refer to above.
  • Two of the experts Facebook Research worked with on the study leading up to this campaign: Prof. Ethel Quayle at the University of Edinburgh and Ken Lanning at NCMEC.
  • The Crimes Against Children Research Center at the University of New Hampshire was one of the first academic centers in the world to publish research on youth online risk (its first Youth Internet Safety Survey was published in 1999).
  • Thorn CEO Julia Cordua’s 2019 TED Talk, “How we can eliminate child sexual abuse material from the Internet”

Filed Under: Research, Risk & Safety, sexting, sexual exploitation Tagged With: CCRC, Crimes Against Children Research Center, Facebook, TechCoalition, Technology Coalition
