
NetFamilyNews.org

Kid tech intel for everybody


Behind the scenes of safety & free speech in social media

April 14, 2016 By Anne

In a must-read article for anyone interested in safety and free speech online, some of the social media industry’s most seasoned content moderators – the apps’ and sites’ safety managers and free-speech decisionmakers – have gone public for the first time.

“The Secret Rules of the Internet”

“Their stories reveal how the boundaries of free speech were drawn during a period of explosive growth for a high-stakes public domain, one that did not exist for most of human history,” write Catherine Buni and Soraya Chemaly, the authors of the piece in TheVerge.com.

Content moderators’ stories – and this article – reveal a lot more, including how hard it is for human beings, much less the software that supports their work, to decide for users representing all ages, languages, cultures and countries…

  • What “safe” means for a community that includes children and many other protected classes
  • What “free speech” means online and whether it’s different there
  • What’s “newsworthy” – what violent or graphic content should be allowed to stay visible because its viewing could change the course of history
  • When content has crossed the line from artistic to harmful

You’ll see how all this works behind the apps and services billions of us use – the sheer scale of the work and the toll it can take on the mental health of the people doing it. And that last point is so important.

“YouTube’s billion-plus users upload 400 hours of video every minute. Every hour, Instagram users generate 146 million “likes” and Twitter users send 21 million tweets,” Buni and Chemaly write. “The moderators of these platforms — perched uneasily at the intersection of corporate profits, social responsibility, and human rights — have a powerful impact on free speech, government dissent, the shaping of social norms, user safety, and the meaning of privacy.”

Many moving parts

It’s important, I strongly feel, for us individual users to understand not only how unprecedented this work, these decisions, and their impacts are, but also how essential it is not to look at any single aspect of this picture – safety or speech rights or newsworthiness – in isolation. They’re all vitally important to all of us, and what we do about each has bearing on the rest of this strange, unfamiliar, constantly changing phenomenon that we all – users, moderators and policymakers alike – are part of. What we do in the name of protection rights affects participation and expression rights. We can’t forget that when we’re setting rules or writing laws. Everything from wily workarounds to serious harm can come from the unintended consequences of policymaking that fails to factor in research and the whole picture.

Work in progress

Content moderation, or community management, is also a work in progress – a global one. It’s “not a cohesive system, but a wild range of evolving practices spun up as needed, subject to different laws in different countries, and often woefully inadequate for the task at hand,” Buni and Chemaly write. And it encompasses a vast range of approaches, from 4chan’s to Facebook’s to those of a month-old messaging app (do read the article for those stories).

So very human

The biggest takeaway of all, though, is the human factor. The real contribution of “The Secret Rules of the Internet” is that it sheds light on the very human work going on behind what we too often think of as technology.

Related links

  • “The global free speech experiment for participants of all ages” (May 2013)
  • “Flawed early laws of our new media environment” (Dec. 2013)
  • “Proposed ‘rightful’ framework for Internet safety” (July 2014)
  • “Digital citizenship’s missing piece” (Sept. 2015)
  • “Counter speech: New online safety tools with huge potential” (Dec. 2015)
  • “Our humanity, not our tech, is the key to fixing online hate” (Feb. 2016)

Filed Under: Law & Policy, Literacy & Citizenship, Social Media Tagged With: Catherine Buni, community moderators, content moderation, Facebook, Soraya Chemaly
