
NetFamilyNews.org

Kid tech intel for everybody


How ‘crowdreporting’ could actually be a bad thing

April 26, 2013 | By Anne

It could help increase the visibility of the very content people want deleted. Here, in a guest post for NetFamilyNews, is an account by Maureen Kochan, our director of community at ConnectSafely.org, of how that happens:

By Maureen Kochan

Many Facebook users have come across questionable content on the site at some point. Chances are they reported it and moved on. Sometimes, though, a page or group is so offensive that organized campaigns spring up to get it taken down. Yet when it comes to reporting bad content on Facebook, more reports might not be better.

Take one example that came to our attention recently. A user contacted us about a Facebook page that she and many others wanted removed. In her message she acknowledged that the campaign against the page was probably making it all the more visible: a lot of people were visiting the page to view and discuss the content, as well as to report it.

Facebook ultimately took the page down, though it had 227,000 “likes.” Of course no one can know how much the “anti” campaign increased the page’s reach, but the saying “any press is good press” came to mind as I followed what happened with the offending page. There is no question that public outrage over a piece of content in social media only increases the attention it gets.

A similar situation came up several days later. An acquaintance sent us an online petition aimed at getting Facebook to remove a pro-dog-fighting group. By the time the petition, which contained a disturbing picture of an injured dog, reached me, Facebook had already removed the group. But the group’s name – and the disturbing image – lived on through the petition, which wasn’t hosted on Facebook. When I saw it, the petition had 134,000 signatures and 334,000 shares. (It closed several days later with 259,000 signatures and 589,000 shares.)

So the people behind that page and that group got even more mileage out of the reactions to their exploits – in the case of the pro-dog-fighting group, long after its content had been removed from Facebook. It’s also worth noting that, while Facebook removed the content in both cases, that doesn’t always happen. A Facebook page or group can be offensive without violating the site’s terms of use (e.g., when Facebook considers the content free speech). So piling on reports may not get the page or group removed and may instead help it reach a wider audience.

Certainly, using social media to stage a public protest against something in social media seems logical. But we might consider whether the outcome will differ from the way protests worked in the past, when they were typically staged after the fact, about something that couldn’t be instantly clicked to or conveniently viewed on the very same page. Being able to see what’s being protested can be informative, but it also means the viewer is giving attention to – essentially giving power to – the offending content. That may have been partly true in the past, but not to the same degree as now.

And then there’s the view of people who grew up with social media. Here’s what an Australian high school student recently told ConnectSafely co-director Anne Collier when asked whether pages depicting violence, hate, misogyny, etc., should be taken down: “Nobody’s forcing you to look at or like that page. You can block it too, so that you never have to look at it if you don’t want to.” Another student told her, “Free speech is important,” adding that it’s better to allow people to “express their displeasure” with and on an offensive page than to require a service to delete it. A third said that if it promotes violence, it should be taken down, but cautioned adults to remember that it isn’t just a social-media thing: “People say awful things to each other in person, and [online] is just another place where that happens.” Helpful perspective from the people who know social media well.


Filed Under: Risk & Safety, Social Media | Tagged With: abuse reporting, protest, social advocacy, social change, Social Media



Copyright © 2025 ANNE COLLIER. ALL RIGHTS RESERVED.