NetFamilyNews.org

Kid tech intel for everybody

Where did my Twitter go? And other end-of-2022 notes

December 18, 2022, by Anne

What a year it has been. And what a week. Or two, almost. I’ll start with the latest, because it was a telling cap-off to 2022:

On December 8, three of us – Eirliani Abdul Rahman, a survivor of and activist against child sexual abuse; Lesley Podesta, representing the Young & Resilient Research Center at Western Sydney University; and I – resigned from Twitter’s Trust and Safety Council, a voluntary group of some 100 nonprofit organizations around the world that Twitter had formed in 2016 to help the platform keep its users as safe as possible.

Why did we resign? Because of the many signs that safety on Twitter has tanked since Elon Musk took over the platform, including:

  • First and foremost, the first independent data on the rise in hate speech since the takeover was published, sourced from the Center for Countering Digital Hate and the Anti-Defamation League. They found that hate speech against Black Americans and gay men had jumped 202% and 58%, respectively, and that antisemitic tweets were up 61% in the two weeks following Elon Musk’s takeover – even as Twitter claimed that safety remained a “top priority” (see also this to advertisers). But also…
  • The layoffs and resignations of thousands of Twitter employees, including those who moderated harmful content – among them half to two-thirds of the moderators working to mitigate child sexual abuse material (CSAM), according to Channel News Asia – as well as the reportedly inhumane way the layoffs were handled. None of this bodes well for the safety of the humans who use the platform.
  • Twitter’s mass reinstatement of accounts that had violated its Rules against harmful content, including “Abuse/Harassment,” “Hateful Conduct” and threats of “Violence.”
  • Twitter’s lack of communication with its Trust and Safety advisers, from the Musk takeover to last Monday, when Twitter disbanded the Council. The silence was unprecedented. Patricia Cartes, the Twitter employee who was responsible for our advisory’s design (she resigned in 2018), told me with sadness that dismissing the Council signaled “the end of checks and balances for Twitter safety.”

Almost immediately after the three of us resigned, our Twitter feeds were swamped with vitriol, veiled threats and mis(or dis)information, including a claim or belief that we were employees of, not voluntary advisers to, Twitter. So, they claimed, we were somehow responsible for harm to children on the platform. Just to give you a feel for what this looked like…

Hate but also love

One person with 1 million followers tweeted that we should be jailed, naming (or “tagging”) us in their tweet. Musk retweeted the tweet that targeted us to his 122 million followers. Another Twitter user, one with 426,000+ followers, tagged us and Musk in a tweet saying, among other things, that we “should not be able to walk away,” and Musk responded with “Indeed. Shame on them!” His tweet got 77,000 likes and nearly 8,000 retweets. Within 48 hours we were getting threats in email and on other platforms as well as on Twitter (the ADL explains here how this digital-age-style “stochastic harassment” works). Now we could add direct personal experience of Twitter hate to the bulleted list above. But many, many people have experienced much worse, so here’s why I’m telling you this: It needs to be clear that…

This was the owner and CEO of a major social media platform not only doing nothing to correct the misinformation and vitriol on his platform but reinforcing and spreading it by responding to, and tagging us in, that misinformation. Intentionally or unintentionally, Musk was weaponizing his followers – even as his company was claiming that Twitter was making safety a “top priority.” San Francisco-based startup adviser Jonathan Howard tweeted the “reasoning” behind all this, with a “touch” of sarcasm, way back on November 7: “people don’t like the hate speech right? so next we lock like 90% of the content moderation team out. really flood the place with n-words, while saying nothing’s changed about the policy (cuz we didn’t change *the policy*, see where we’re going? ehhh?).”

Within a few days, Twitter informed the Council that it was moving its meetings with members up from December 14 and 15 (two meetings, to cover all the members’ time zones) to Monday the 12th. Then, within an hour of the meeting’s start time, Twitter suddenly ended the whole Council with a three-paragraph email. No meeting at all. The silence on Twitter’s side was now permanent. Sixteen members of the ex-Council issued a joint statement about Twitter’s action on the 13th.

But here’s the thing: I loved Twitter. I was lucky, I see now. I hadn’t experienced the “cesspool” I’d heard so many people call it. It became that – in my feed, at least – almost overnight. I’d joined way back in 2008 with the help of a tech educator, following everybody she followed. That kernel grew into a respectful professional community I loved: nearly 10,000 researchers, online safety advocates, educators and journalists. I learned a great deal from and with them and never felt unsafe – until I resigned from Twitter’s own safety advisory.

And friends and strangers whom I can’t name, for their own protection, reached out to us with love, on and off Twitter. One very kind person I can name is A.H. (@a_h_reaume), because her support came in a public thread on the platform that corrected all the mis/disinformation. It’s a very visual representation of our experience and a perfect example of the giant overlap between media literacy and safety. A.H.’s timely support was such a relief in the middle of a hate speech storm. Please check it out.

And so much more happened in 2022

Twitter is quite the outlier, with social media platforms now trending toward greater safety. Examples of the trend include greater regulation: Europe’s Digital Services Act and Digital Markets Act have entered into force, the UK’s Online Safety Bill appears close to passage, and California’s Age-Appropriate Design Code Act was signed by Gov. Gavin Newsom in September. The US federal legislation called the Kids Online Safety Act has gained momentum lately but hasn’t yet passed, with dozens of civil society groups warning that the bill could actually reduce kids’ and teens’ safety by “encouraging more data collection on minors and preventing access to topics such as LGBTQ issues,” the digital rights organization Fight for the Future reports. Here is what the Family Online Safety Institute has to say about KOSA.

Also on the regulatory front, national online protection regulators in three countries have even started something historically unprecedented: a trans-national regulators network, which I wrote about last month.

Another top-of-mind topic in child safety and regulatory circles this year has been age verification and authentication for child online safety, though not without controversy. Rightly, I feel, civil liberties advocates worry about children’s data privacy if companies acquire and store minors’ identity data. Fortunately, in support of data minimization, at least one young company, Yoti, which has partnered with a number of Internet companies, has developed authentication technology that deletes a person’s data immediately upon age estimation.
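To make “data minimization” concrete: the idea is that only a yes/no age signal ever leaves the check, while the image and the raw estimate are discarded on the spot. Here’s a minimal sketch of that general pattern – my own illustration, not Yoti’s actual system, and every name in it is hypothetical:

```python
from dataclasses import dataclass


@dataclass
class AgeCheckResult:
    over_threshold: bool  # the only datum the requesting platform ever sees


def estimate_age_from_image(image_bytes: bytes) -> float:
    """Stand-in for a vendor's age-estimation model (hypothetical)."""
    return 21.0  # a real system would run an ML model here


def check_age(image_bytes: bytes, threshold: float = 18.0) -> AgeCheckResult:
    estimated = estimate_age_from_image(image_bytes)
    result = AgeCheckResult(over_threshold=(estimated >= threshold))
    # Data minimization: neither the image nor the raw age estimate is
    # stored or returned -- only the boolean crosses the trust boundary.
    del image_bytes, estimated
    return result
```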

Then there was the sadness of tens of thousands of tech company layoffs so close to the end of the year, which I wrote about last month – certainly not just Twitter’s, but also Meta’s, Stripe’s, Intel’s, Robinhood’s, Lyft’s, Snap’s and Shopify’s, with hiring freezes at Apple and Amazon.

What’s ahead

We’ll want to watch what happens, not just with Twitter, but also with the Twitter alternatives more and more people are migrating to, certainly people in my professional network:

Post and Spill (the latter started by former Twitter employees) represent the current centralized model (top-down, centrally controlled), while Mastodon is a more seasoned alternative. Mastodon went live in 2016 as part of “the fediverse” – a blend of “federated” and “universe” – meaning it’s decentralized: a network of interconnected servers (TechCrunch does a great job of explaining it). I think it’s a fascinating experiment and, well, I’ve joined Mastodon (here). It works a lot like Twitter and is infinitely more civil than my experience on the latter of late. The people I’ve encountered there have been very gracious in helping this newcomer.
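That federated design is visible even in Mastodon’s public API: each server exposes its own public timeline, which (on servers that leave it open) anyone can read without an account. A minimal sketch in Python – mastodon.social is just one example server among thousands:

```python
import json
import urllib.request

# Any Mastodon server exposes its local public timeline at this endpoint;
# mastodon.social is just one of many independent servers in the fediverse.
URL = "https://mastodon.social/api/v1/timelines/public?limit=3"

with urllib.request.urlopen(URL) as response:
    posts = json.load(response)

for post in posts:
    # Accounts federated from other servers appear as user@server --
    # the decentralized design made visible.
    print(post["account"]["acct"], "-", post["created_at"])
```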

One of the interesting subjects that appeared in a few cryptic tweets tagging us Council resignees the week before last was a light-touch argument between Musk and former Twitter CEO Jack Dorsey about the future of content moderation. Musk’s vision appears to be “reach, not speech” – reducing the reach of negative tweets rather than deleting them – in other words, de-boosting and de-monetizing them. There’s nothing inherently wrong with that, but it’s an old model.
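In ranking terms, “reach, not speech” just means multiplying a post’s distribution score down instead of removing the post. A toy illustration – emphatically not Twitter’s actual ranking code, with made-up thresholds and penalties:

```python
def distribution_score(base_score: float, toxicity: float) -> float:
    """De-boost rather than delete: past a toxicity threshold the post
    stays up, but its reach shrinks (all numbers here are hypothetical)."""
    if toxicity > 0.8:
        return base_score * 0.05  # barely surfaced, and de-monetized
    if toxicity > 0.5:
        return base_score * 0.5   # reduced reach
    return base_score             # full reach


# A borderline post keeps half its reach instead of being taken down.
print(distribution_score(base_score=10.0, toxicity=0.6))  # 5.0
```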

Dorsey’s is, to me, more plausible going forward: I’d simply call it decentralized content moderation. In the immediate future, we’ll have more and more moderation by communities themselves, or a hybrid of central moderation and server-based moderation such as on Discord and Twitch. More exciting is what’s likely at least five years down the line: individual users moderating for themselves. Not all by themselves – games, platforms, apps, communities will still need to support them in this – but individuals will be given their own machine-learning algorithm for content moderation, one that they can “teach,” choosing the data they want to feed it themselves, based on their own values. Parents will help children do this, and valuable family conversations will happen around this important activity.
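To make “teaching your own algorithm” concrete, here’s a deliberately tiny sketch of my own – nobody’s product, just a naive Bayes text filter that learns from posts its owner labels “show” or “hide”:

```python
import math
from collections import Counter


class PersonalModerator:
    """A toy per-user content filter: the user supplies labeled examples
    ('show' / 'hide'), and the filter learns word frequencies from them."""

    def __init__(self):
        self.word_counts = {"show": Counter(), "hide": Counter()}
        self.label_counts = Counter()

    def teach(self, text: str, label: str) -> None:
        # The user's own labels are the training data: their values, their filter.
        self.label_counts[label] += 1
        self.word_counts[label].update(text.lower().split())

    def wants_to_see(self, text: str) -> bool:
        # Naive Bayes with add-one smoothing over the two labels.
        vocab = set(self.word_counts["show"]) | set(self.word_counts["hide"])
        scores = {}
        for label in ("show", "hide"):
            total = sum(self.word_counts[label].values())
            score = math.log(self.label_counts[label] + 1)
            for word in text.lower().split():
                score += math.log(
                    (self.word_counts[label][word] + 1) / (total + len(vocab) + 1)
                )
            scores[label] = score
        return scores["show"] >= scores["hide"]


# Usage: each user (or a parent and child together) teaches their own instance.
mod = PersonalModerator()
mod.teach("great research thread on media literacy", "show")
mod.teach("you people should be jailed", "hide")
print(mod.wants_to_see("new media literacy research"))  # True
```

A real system would be far more capable than this, but the key property is the same: the training data comes from the user – or from a parent and child together – rather than only from a central policy team.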

Yeah, I know this sounds pretty blue-sky (Dorsey actually founded a startup of that name). But I believe it’s the direction in which content moderation is headed. Nonprofit organizations, and hopefully eventually for-profit ones as well, will share with each other the code they need to provide their stakeholders (Internet users) with these customizable algorithms. That and, I hope, the build-out of the layer of user care that is still largely missing, at least in the United States – the middle layer between companies in the cloud and help on the ground, which Europe calls Internet helplines – are what lie ahead for a better, safer Internet. Because platform content moderation will never be enough to keep users safe. For more on that, see the last two bulleted items on this page of our site SocialMediaHelpline.com.

Sending love and wishing you and yours the happiest of holidays and a great 2023!

Related links

  • Academic research on the rise of hate on Twitter since it changed hands: “Musk Monitor: Under Musk, Hate Speech Is Rising” from The Fletcher School at Tufts University (added after this post was published)
  • My new Mastodon account
  • And the drama on Twitter continues, as the Washington Post reports, with Musk saying he’ll abide by a Twitter poll he ran on the platform for 12 hours. The question was whether he should step down as head of Twitter, and 57.5% said “yes.” If he does step down, he’ll still be Twitter’s owner, of course, and it’ll be interesting to see who would want to be the CEO cleaning up the chaos of the past 7+ weeks.
  • My interview for MediaCentar Sarajevo (readable with the help of Google Translate) and a sampler of news coverage of our resignation in five other countries: NPR, Grid, Business Insider, i24 in Germany, Channel News Asia in Singapore and Ars Technica – I haven’t figured out how to link you to my interviews with the BBC’s World Service and Radio 4.
  • Statement from the Center for Democracy and Technology, our fellow former member of the Trust and Safety Council, condemning Twitter’s dissolution of its Council
  • “A Political Theory of King Elon Musk,” by New York Times columnist Ross Douthat (picking up on his final sentence, though, it’s beginning to look like it’s not so “good to be king”)
  • The New York Times’s coverage of the independent data on Twitter hate speech, the article that sparked this whole experience for Eirliani, Lesley and me early this month
  • Had to add this (much later, in March 2023): Asked by Embedded this week whether his Twitter experience has changed since Elon Musk took over (Oct. 27, 2022), millennial writer and artist John Paul Brammer said, “It’s definitely worse. It feels like Twitter is a mangy animal with rabies shambling around a public park, foaming at the mouth, having periodic spasms and waiting to die. But it hasn’t yet. I will never leave Twitter. My account will be buried with it.”



Comments

  1. Adrienne Katz says

    January 6, 2023 at 5:48 am

    We are so sorry to hear of the hounding of the council and the way it was amplified. Your guidance and expertise are much valued and we will be a willing audience wherever you post.

    • Anne says

      January 6, 2023 at 10:33 am

      Thank you for your kind words, Adrienne, as well as for your beautiful work.

  2. R&D Collier says

    December 25, 2022 at 6:28 pm

Oh my! What you have shared is indeed alarming and also confirms your great expertise. Thank you for all your collaborative work globally. We look forward to learning more about Mastodon. And we share a phrase from a poem, now a hymn, by the late Peter Henniker Heaton that provides comfort in these challenging times: “God sets the pace.”
    Love
    Your Michigan Fans

    • Anne says

      January 6, 2023 at 10:32 am

      Thank you so much, Michigan fans!


Trackbacks

  1. Future safety: Content moderators and digital grassroots justice - NetFamilyNews.org says:
    January 31, 2023 at 11:04 am

    […] them content moderators, were laid off in extremely inhumane ways, which is one reason why I had to resign from Twitter’s Trust & Safety Council last […]

  2. Elon Musk Is Why We Need Librarians | From the Editor – School Library Journal – JENI MEDIA says:
    January 25, 2023 at 3:09 am

    […] What’s ahead? “In the immediate future, we’ll have more and more [content] moderation by communities themselves, or a hybrid of central moderation and server-based moderation such as on Discord and Twitch,” says Collier. Further out, machine learning will enable individual users and organizations to customize algorithms to facilitate choice in the data that they consume. “That and, I hope, the build-out of the missing layer of user care are what lie ahead for a better, safer internet,” she predicts. […]

  3. Mental health 2023, Part 1: Youth on algorithms - NetFamilyNews.org says:
    January 3, 2023 at 4:50 pm

    […] Where did my Twitter go? And other end-of-2022 notes […]



