NetFamilyNews.org

Kid tech intel for everybody

Take-aways from the ‘Facebook Files’

October 9, 2021 By Anne


What a week it has been, right? At least for those of us who follow and/or use social media. There was the naming of whistleblower Frances Haugen on “60 Minutes” Sunday night, US time; the hours-long outage of all of Facebook’s products Monday; Haugen’s testimony on Capitol Hill Tuesday; a report that hackers were offering for sale the public data of 1.5 billion people scraped from Facebook; and the reportedly massive Twitch breach. Am I missing anything?

As for the Facebook-related news, if it interested you, I’m sure you’ve seen plenty of the extensive coverage all over the world. So I’ll cut right to some questions I have (Einstein did say it’s better to “listen to the person who has the questions,” not the answers), then propose two major solutions.

  • Is it possible that policymakers would base policy on the views of one person, a data scientist who, at least in congressional testimony, doesn’t distinguish between focus groups and peer-reviewed research – or on (leaked) product research that wasn’t peer-reviewed and didn’t use a nationally representative sample (of teen Instagram users)?
  • In developing regulation, will policymakers factor in what academic research has found about social media’s positive effects as well as negative ones (see, for example, research by Mizuko Ito et al., Candice Odgers and Victoria Rideout, whose latest study found that 43% of teens said using social media made them feel better when struggling mentally, while 17% said it made them feel worse)?
  • Is there a risk now that platforms will choose not to find out the negative impacts of their products on vulnerable groups? Wouldn’t it be better if they did their (non-scientific) product research on both positive and negative effects and published those findings as well as what they’re doing about the negative effects?
  • Self-critical social comparison and body image struggles have long been part of adolescence (and being human) – how do Instagram and social media in general contribute to or increase this social problem?
  • Since research shows that social media use is highly contextual, will policymakers factor in research on how it fits into young people’s lives and how they use it, not just on its impacts, positive or negative? (Clearly, Sen. Richard Blumenthal didn’t know that “finstas” are not an Instagram product but rather teens’ own adaptation of Instagram for their own purposes.) Will lawmakers learn about how much kids play, communicate, create and hang out vs. just consume their media?
  • How is this a Big Tobacco moment – or not? Are social media users consumers the way cigarette smokers are? And, yes, tobacco use has social elements, but wasn’t the focus of policymakers on physical effects, and aren’t we now in a very different digital world, where our data is the source of revenue, and we’re entrusting it to the corporations using it?
  • What actually needs to change or be regulated – engagement and recommendation algorithms? What companies do with user data? How product research is done? Corporate transparency levels? The balance of human and algorithmic content moderation and levels tied to size of user base? How much content moderation users of different languages and cultures in distant countries receive?
  • Are lawmakers willing to consider new models for regulation being applied in other countries?
  • In terms of how people use Facebook all over the world, has the company outgrown its 20th-century ad-based business model?

Proposing 2 solutions

Much of the discussion of the past week has focused on regulation. So I’ll propose two solutions that haven’t gotten a lot of discussion and deserve more. I’ll list them first, then explain. Facebook should…

  • Acknowledge that it’s no longer just a corporation and structure itself accordingly.
  • Build out a network of independent help services for Internet users around the world.

In a commentary unusually focused on solutions, law professor Kate Klonick pointed to “opportunities for reform” internal to Facebook and social media companies – changes such as greater transparency, rethinking the “user engagement” metric and adding more content moderators. Few people would argue with those. The problem is, they’re not enough.

Content moderation AND user help

Klonick wrote, “To put this all in perspective, in the United States there is roughly one law enforcement officer for every 500 people. Facebook has 2.8 billion global monthly active users; that means just 1.3 people working in safety and security for every 100,000 users.” What she doesn’t say is that police officers have or can get context for problems they’re called in to address. Content moderators can’t. Neither can algorithms.

Neither adding more human moderators nor tweaking algorithms is likely ever to catch up with harmful content and behavior in social media. That’s because what happens on platforms is highly contextual to ever-changing offline life, speech, behavior and social norms. In terms of speech, moderators who see only what’s happening on a platform will never have enough context about a given comment to make fail-proof moderation decisions – is it parody, sarcasm, “just a joke,” a cruel joke? Algorithms for deleting content that violates community rules have to be “fed” tons of data to make good moderation “decisions,” and the offline-world speech and behavior reflected in that data is nuanced and keeps changing, so it’s unlikely an algorithm can ever catch up (quite apart from the privacy issues of providing it with all that data). And there are different norms and definitions all over the world to factor in. The exceptions, such as child sexual abuse material, are much easier to moderate because almost every country in the world agrees it is criminal activity.

Context is the missing piece in the content moderation – or, better, user care – mix. Internet helplines provide the platforms with the context they need to delete highly contextual harmful content such as harassment, hate speech and bullying. And that’s only an extremely beneficial side effect of the care and support helplines provide users. There’s a whole network of Internet helplines in Europe doing this, one that the European Commission helped set up well over a decade ago. There’s also user care provided by the eSafety Commissioner’s Office in Australia and by NetSafe in New Zealand. Ideally, every country should have one, including the US. An Internet helpline in the US should be independent of both industry and government but could be partially funded by both (and by individuals), as is the National Center for Missing & Exploited Children. Each country’s helpline needs to be structured and funded as appropriate for its own context.

Reorganization needed

Finally, picking up on my last question, I suggest Facebook needs to organize itself differently, in keeping with the role it has come to play. Because of its penetration into the everyday lives of people all over the world, their dependence on it as individuals and businesses, how much of their data it handles, the infrastructure it provides in many countries and so much more, Facebook has – in practice, on the ground – evolved away from being merely a corporation. Yet it still identifies and acts as a publicly traded company. What it has become is historically unprecedented – part company, part utility, part social institution – so it should be neither structured nor regulated based on existing models. I believe this misalignment is a big reason for the growing cognitive dissonance, if not outrage, in the public discussion. Facebook’s business is very personal, to many people in many ways (including Frances Haugen, who said she lost a friendship to misinformation). There is nothing unethical about profit, an ad-based business or publicly traded corporations. But under that model, safety is a cost center, not a profit center, and Facebook is organized to maximize user growth, tech innovation and advertising in order to maximize profit for its shareholders.

So I think it’s only logical to say that, in order to have credibility when it tells the public and policymakers it does not prioritize profit over people, Facebook needs to see itself and act as a social institution as much as a corporation and organize itself accordingly. Direct knowledge of its impacts on vulnerable people in every culture and political system where it has a presence needs to be folded into product development, acquisitions and every management decision. The company needs a chief safety officer in the “C suite” – an office that has real power, doesn’t fall under marketing or lobbying, supports the kind of investigative work that journalists do and has sufficient budget to contribute to helpline operations around the world.

And that’s a wrap on only the first week of a whistleblower’s work. It feels like the ground is shifting – let’s see how much.

Related links

  • Added later (Sept. 27): On the subject of Facebook as a social institution, Adrienne LaFrance, executive editor of The Atlantic, goes even further to say “Facebook is not merely a website, or a platform, or a publisher, or a social network, or an online directory, or a corporation, or a utility. It is all of these things. But Facebook is also, effectively, a hostile foreign power.” I wouldn’t go that far. But I do feel a social institution of this magnitude and with this many resources should not be run as a publicly traded corporation or by one individual.
  • Prof. Kate Klonick’s October 1 commentary in the New York Times: For background, Klonick “spent months embedded at Facebook” observing the company’s creation of its (now spun off) Oversight Board. Her account is in the Yale Law Journal.
  • “Facebook’s own data is not as conclusive as you think about teens and mental health,” by Anya Kamenetz at NPR
  • Regulation does appear more likely in the US now, because of unusual bipartisan agreement on Capitol Hill (in The Guardian)
  • A thoughtful piece on “Parenting after the Facebook Files”
  • “Could platforms design for second chances?”
  • My 2020 Medium piece offering more detail on helplines and the middle layer they represent between people and helpers on the ground (in the US, 9-1-1 and specialized hotlines for people in crisis) and content moderators in the cloud – and my related post here with thoughts on regulation

Disclosure: I serve on the Trust & Safety advisories of Facebook, Snapchat, Twitter, Yubo and YouTube, and the nonprofit organization I founded and run, The Net Safety Collaborative, has received funding from some of these companies.

