NetFamilyNews.org

Kid tech intel for everybody


DIY community care: 1 sign of a new Net safety era

July 11, 2019 By Anne

[Image: neighborhood watch sign]

It’s a sign that we’re in a whole new era in online safety now: young users increasingly taking things into their own hands. You might call it “DIY Internet safety.” It’s not all good news, but it’s not all bad either (and DIY isn’t all there is to the new era – keep reading…).

There are both positive and negative aspects to a piece of in-depth reporting on the subject at Buzzfeed News. What Buzzfeed describes seems mostly upside, starting with the headline: “TikTok Has a Predator Problem. A Network of Young Women Is Fighting Back.” I mean, it’s exciting to hear that users are helping younger peers by engaging in online community policing, right?

The good stuff

It does seem really positive at first glance. These online safety mentors understand the limits of what social media content moderation can do “at scale” (e.g., Facebook has more than 1.5 billion daily active users; YouTube gets more than 500 hours of video uploaded by users every minute). These DIY protectors aren’t waiting around for platforms to take action. They’re pressuring newer platforms like TikTok to up their safety game; modeling self and community care for younger users (many under 13 on TikTok); teaching young users what inappropriate behavior looks like; gathering evidence (screenshots) of inappropriate comments and messages from what appear to be predatory older users; and making lists and “outing” those users on other platforms, such as Instagram and YouTube.

So, given all that, what could the downside be? The fact that self-appointed community police – or digital vigilantes – don’t always accurately identify those who abuse the rules or other people. And when they do make false accusations, in social media they often make them very publicly. Even if they do that with the best of intentions, people get hurt.

When it’s not so good

Agency plus education is good; agency all by itself can go either way. As Buzzfeed reporter Ryan Broderick put it, it can be a cross-platform “free-for-all where young users weaponize dubious screenshots, defamatory callout videos, and whisper campaigns via group chats to deliver vigilante justice at dizzying speeds.” So if someone’s innocent, reputations can get destroyed at dizzying speeds too. 

To lock in their effectiveness, these well-intentioned groups of DIY protectors need a set of standards or a code of ethics for taking action (for example, see comedian Franchesca Ramsey’s 6 rules for calling people out online here). Developing best practices, or a code of ethics, for DIY community care would make a great lesson in new media literacy, right?

[The only part of the Buzzfeed piece I struggle with is Broderick’s throwaway phrase, “In an era when the failure of social media giants to police their platforms….” Yes, it’s failure in the eyes of many, but we can’t forget that, in the eyes of many others, the platforms police too much or with too much bias; and in the eyes of still others, it should not be up to corporations to be cops, censors, or arbiters of free speech. All of which complicates content moderation. This is not an excuse; it’s a reality.]

Other signs we’re seeing

In any case, do you too see that we’re in a new era for online safety? Because…

  • Media’s complex and shape-shifting. Checking assumptions about human behavior is hard enough when the behavior’s happening in physical spaces; it’s harder in media environments. We’re also seeing that no one person, expert group or organization can create safety for everyone – not even giant organizations with huge revenue like Facebook and YouTube (the bigger the platform, the harder the problem is to solve). Why? For one thing, because so far it’s really hard for machine learning algorithms to keep up with the fast-changing, highly innovative behavior of young human beings; algorithms need a whole lot of examples to “learn,” and by the time they’ve been fed examples of new speech and behaviors, youth speech and behaviors have already changed.
  • We’re seeing that the solution’s complex too. It needs all perspectives in the room – those of the people who write the algorithms, of the many different user groups (mentors, students, beneficiaries both resilient and vulnerable), of corporate executives, law enforcement, policymakers and caregivers (from parents to mental healthcare practitioners). Each perspective is crucial to problem solving, algorithm writing, safety feature design and policy making. More and more safety advocates are calling for collaborative rather than adversarial approaches to problem-solving – a defining characteristic of the “digital citizenship” on more and more minds around the world.
  • Media peer-mentoring is very new – even newer than adults mentoring kids. Historically, the online safety field hasn’t been about teaching young people how to stay safe and keep each other safe online. And it has only just begun encouraging adults to mentor rather than control and surveil children’s media use. Having teens and young adults modeling and teaching safe social norms in media will reinforce older adults’ efforts to work and play with their children in digital environments. [Here are some great resources for adults’ media mentoring from the American Library Association.]

So we’re starting Phase 2 of this multi-phase, global social experiment around optimizing social media for all its users. Besides DIY safety, other signs include last year’s big data wakeup call; the growing public discussion about “recommendation engines” and rabbit holes (see sidebar); and new forms of user care, such as Facebook’s development of a content moderation “appeals” board. What are some other examples? Feel free to put them in a comment or tweet. It’s still the earliest of early days in our new media era. So there’s a whole lot of work to do, here in the Petri dish.

SIDEBAR: Speaking of ‘rabbit holes’

So about “recommendation engines” and rabbit holes. “Like all social media platforms, TikTok is optimized for engagement,” Buzzfeed reporter Ryan Broderick writes, using algorithms that “learn” what you like and show you more and more of it. “It also reacts in real time, delivering an endless stream of similar videos, even if you aren’t logged in,” he adds. 

We’re not just talking about TikTok here. Those learning algorithms that deliver more and more content like what you just watched, liked or shared – also called “recommendation engines” or “recommendation systems” – are just part of social media. If you keep clicking on what they turn up for you, you’re going down that rabbit hole. For young YouTube user Caleb Cain, who was recently profiled by New York Times writer Kevin Roose, it was “an alt-right rabbit hole,” as Caleb put it.
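To make that feedback loop concrete, here’s a deliberately simplified sketch in Python – a toy model, not any platform’s actual algorithm (the topic names are invented for illustration) – of how a feed optimized for engagement can narrow around whatever a user last clicked:

```python
from collections import Counter

# Hypothetical content topics, for illustration only
CATALOG = ["dance", "comedy", "politics"]

def recommend(history, n=5):
    """Toy engagement-optimized feed: rank topics by past clicks and
    fill most of the feed with the current favorite topic."""
    clicks = Counter(history)
    ranked = sorted(CATALOG, key=lambda t: clicks[t], reverse=True)
    # 4 of 5 slots go to the top topic, 1 slot to the runner-up
    return [ranked[0]] * (n - 1) + [ranked[1]]

history = ["politics"]        # one click on a "politics" video...
for _ in range(10):
    feed = recommend(history)
    history.append(feed[0])   # ...and the user taps the dominant item

print(Counter(history))       # prints Counter({'politics': 11})
```

After a single initial click, every subsequent feed is dominated by the same topic, so the user never sees anything else – the “rabbit hole” in miniature. Real recommendation systems are vastly more sophisticated, but the reinforcing loop between what you engage with and what you’re shown next is the same basic idea.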

To his credit, Caleb climbed back up. I’m not being political here; what I’m saying is that he was being fed ever more extreme and conspiratorial content, and he had the intelligence to look for alternative views. This 26-year-old man honestly wanted to learn, not be indoctrinated or have his biases confirmed (Caleb tells his own story about this, as a new YouTube creator, here).

The story is a powerful example of three things about our media environment that are good to keep in mind: 1) how we don’t necessarily just go down the rabbit holes, though irresponsible reporting (not Roose’s) would suggest otherwise; 2) how long it can take to come back up, and that it takes inquiring minds and courage to do so, to credit Caleb; and 3) how we need critical thinking not just about what we see and hear in today’s media but also about the algorithms behind what we see and hear. (The story also describes how really smart creators figure out how to game the recommendation system!)

On that 3rd point, as an adviser to several platforms, I can tell you that they’re certainly applying critical thinking to this issue. We need to as well – and to help our children do so (both the Roose and Broderick pieces are great teaching tools). Applying critical thinking is a matter of our children’s safety and wellbeing, not just their media literacy training. This too is a sign of the new era of safety – safety from the inside out (critical thinking, empathy and resilience), not just the outside in.

Related links

  • Caleb Cain courageously telling his story about going down “the alt-right rabbit hole” over the course of about 5 years, coming back up, and using that experience to help peers avoid what he went through. His channel, Faraday Speaks, now has more than 20,000 subscribers.
  • About safety from the inside out 
  • My last post about TikTok and how, if policymakers want to regulate social media, they really need to think geopolitically  

 


Filed Under: Risk & Safety, Social Media, Youth Tagged With: Caleb Cain, Franchesca Ramsey, Kevin Roose, rabbit holes, Ryan Broderick, TikTok


