
NetFamilyNews.org

Kid tech intel for everybody


Twitch, Yubo & online safety innovation with an ancient ‘tool’

October 22, 2018 By Anne 1 Comment

As geeky as “content moderation” sounds, we’re hearing about it more and more these days – in podcasts, at conferences, in books and in hearings. People all over the U.S. and world are talking about how to make social media safer, and content moderation by the platforms is an important part of that.

But what we’re not hearing or talking about enough is another part of the social media safety equation: social norms, a tool we humans have been using to shape behavior for millennia. There’s some really interesting innovation going on around social norms right now, and in a sector of social media where you might least expect to see it: live-streamed video, projected to be more than a $70 billion market in two years but already mainstream social media for teens and young adults, according to Pew Research. I’ll give you two examples.

Parents, you’re very likely to have heard of Twitch, the 800-pound gorilla of streaming platforms, based in San Francisco. The other example, Yubo, a fast-growing startup based in Paris, may be familiar to parents in Europe, but I’ll get to that in a minute. Both are innovating in really interesting ways, blending human norms and tech tools. Both…

  • Innovate around user safety because it’s “good business”
  • Have live video in their favor as they work on user safety
  • Empower their users to help.

To explain, here’s what they’re up to….

Starting with the startup

Yubo, an app that now has more than 15 million users in more than a dozen countries, reached its first million within 10 months without any marketing (it was then called Yellow). The app’s all about communication—chat via live video-streaming. Somehow, the focus of early news coverage was on one Tinder-like product feature, even though the app’s safety features have long included a “firewall” that keeps 13-17-year-old users separate from adults. Virtually all news coverage has been alarmist, especially in Europe, encouraging parents and child advocates to expect the worst.

So Yubo doubled down, with a mix of high- and low-tech “tools.” One of them is genuinely novel, and it relies on an element of surprise: near-real-time intervention when a user violates a community rule (the rules are: no bullying, hate speech, underwear, nudity or violence).

When a violation happens—maybe a user joins a chat in their underwear—the user almost immediately gets a message like the one in the screenshot at left.

It’s one of those sudden disruptors that stops the conversation flow and changes the dynamics, helping chatters learn the community standards by seeing them enforced in real time. Yubo can do that because it takes a screenshot of all chat sessions every 10 seconds; so between users’ own reports of annoying behavior and algorithms running in the background that detect and escalate it, the app can respond very quickly. If the violation doesn’t stop, moderators will either disable the chat mid-stream or suspend the user’s account.
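That warn-then-escalate loop can be sketched roughly as follows. This is a hypothetical Python sketch, not Yubo’s actual code: `classify_frame` stands in for the image classifiers described below, and the `notify`/`suspend` callbacks stand in for the app’s messaging and account systems.

```python
from dataclasses import dataclass

# The five community rules named in the post
RULES = {"bullying", "hate_speech", "underwear", "nudity", "violence"}

@dataclass
class ChatSession:
    user_id: str
    warnings: int = 0
    active: bool = True

def classify_frame(frame: dict) -> set:
    """Stand-in for an image classifier run on each 10-second screenshot;
    returns whichever rule violations it detects."""
    return frame.get("labels", set()) & RULES

def moderate(session: ChatSession, frame: dict, notify, suspend) -> None:
    """One pass of the moderation loop: warn on first offense,
    disable the chat / suspend the account if the behavior continues."""
    violations = classify_frame(frame)
    if not violations:
        return
    if session.warnings == 0:
        # First offense: immediate in-chat warning (the element of surprise)
        notify(session.user_id,
               "Community rule violated: " + ", ".join(sorted(violations)))
        session.warnings += 1
    else:
        # Violation continues: take the session down and suspend the user
        session.active = False
        suspend(session.user_id)
```

In a real deployment the same loop would also be fed by user reports, not just the background classifier, so either signal can trigger the warning.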

You wouldn’t think an app communicating with users in real time is that unusual, but Yubo says it’s the only social video app that intervenes with young users in this way, and in more than 20 years of writing about kids and digital media, I haven’t seen another service that does.

Teaching algorithms

The other innovative part of this is the algorithms, and the way Yubo uses them to detect problems proactively for faster response (Twitch doesn’t even have algorithms running proactively in the background, they told me). Proactive is unusual. For various reasons—including the way platforms are used—content moderation has always been about 99% reactive, dependent on users reporting problems and never even close to real-time. Bad stuff comes down later—sometimes, without explanation.

Knowing how hard it is for algorithms to detect nudity and underwear, I asked Yubo COO Marc-Antoine Durand how that works on his app. “We spent nearly a year labeling every reported piece of content to create classifiers. Every time we had a report, a dedicated team had to classify the content into multiple categories (ex: underwear, nudity, drug abuse, etc.). With this, our algorithms can learn and find patterns.”

It takes a whole lot of data—a lot of images—to “teach” algorithms what they’re “looking for,” Durand acknowledged. An app that’s all live-streamed video, all the time, very quickly amasses a ton of data for an algorithm to chew on. And Yubo has reached 95% accuracy in the nudity category (its highest priority) with a dedicated data scientist feeding (and tweaking) the algorithms full-time, he said. “We had to make a distinction between underwear and swimsuits,” he added. “We trained our algorithms to know the difference by detecting the presence of sand, water, a swimming pool, etc.”
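As a toy illustration of that context trick: the same “skin exposure” signal gets labeled differently depending on whether beach or pool context is present. All feature names here are hypothetical, and a real system would use trained image classifiers rather than hand-written rules; this only shows the logic Durand describes.

```python
# Hypothetical feature labels an upstream vision model might emit
BEACH_CONTEXT = {"sand", "water", "swimming_pool"}

def classify_exposure(features: set) -> str:
    """Label an image given its detected features (toy sketch)."""
    if "skin_exposure" not in features:
        return "ok"            # nothing to flag
    if features & BEACH_CONTEXT:
        return "swimsuit"      # same signal, but beach/pool context present
    return "underwear"         # no context: this is the category to escalate
```

In practice the distinction is learned from the labeled reports described above, not coded by hand, but the training data effectively teaches the model this same contextual rule.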

Not that nudity’s a huge problem on the app—only 0.1-0.2% of users upload suggestive content as profile photos on any given day—but Yubo says it prioritizes the nudity and underwear categories anyway.

But algorithms, high-frequency screenshots and the element of surprise aren’t the only tools in the toolbox. Probably because live-streamed video hasn’t come up much on the Hill or in news about “content moderation” (yet), we don’t hear much about how live video itself—and the social norms of the people using it—are safety factors too. But anyone who uses Skype or Google Hangouts can understand that socializing in video chat is almost exactly like socializing in person, where it’s extremely unusual for people to show up in their underwear. If someone does on Yubo, and others in the chat have a problem with it, the person typically gets reported because it’s just plain annoying, and moderators can suspend or delete their account. The every-10-seconds screenshots are backup, so moderators can act quickly on anything not reported. On top of that, Yubo provides moderation tools to users who open chats—to the “streamer” of the “live.”

Twitch communities’ social norms

Twitch leverages social norms too. It empowers streamers, or broadcasters, with tools to maintain safety in chat during their live streams. With 44 billion minutes streamed per month by 3.1 million streamers (according to TwitchTracker.com), the company really has to harness the safety power of social norms across millions of communities.

“We know through statistics and user studies that, when you have a lot of toxicity or bad behavior online, you lose users,” said Twitch Associate General Counsel Shirin Keen at the CoMo Summit, one of the first conferences on the subject. “Managing your community is core to success on Twitch.” So because streamer success is Twitch’s bread and butter, the platform has to help streamers help their viewers feel safe.

Twitch’s phenomenal growth has been mostly about video gamers streaming their videogame play. But the platform’s branching out—to the vastness of IRL (for “In Real Life”), its biggest new non-gaming channel. And just like in real real life, each community develops its own social norms. The Twitch people know that. The platform lets streamers shape their own norms and rules—lets them appoint their own moderators and provides them with moderation tools for maintaining community standards.
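In code terms, that kind of per-channel delegation might look something like the sketch below. It is a minimal illustration under my own assumptions—the names and structure are hypothetical, not Twitch’s actual API: the streamer appoints mods, and streamer or mods set channel-specific rules that filter chat.

```python
from dataclasses import dataclass, field

@dataclass
class Channel:
    """A stream channel whose owner sets its own norms (hypothetical sketch)."""
    streamer: str
    mods: set = field(default_factory=set)
    banned_terms: set = field(default_factory=set)

    def appoint_mod(self, actor: str, user: str) -> None:
        # Only the streamer can delegate moderation power
        if actor != self.streamer:
            raise PermissionError("only the streamer appoints mods")
        self.mods.add(user)

    def set_rule(self, actor: str, term: str) -> None:
        # The streamer or an appointed mod can add a channel-specific rule
        if actor != self.streamer and actor not in self.mods:
            raise PermissionError("not a moderator of this channel")
        self.banned_terms.add(term.lower())

    def allow_message(self, text: str) -> bool:
        """Filter chat against this channel's own norms, not a global list."""
        return not any(t in text.lower() for t in self.banned_terms)
```

The design point is that the rule set lives on the channel, not the platform: each of those millions of communities ends up with its own filter, shaped by its own culture.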

So can you see how safety innovation is as much about humans as it is about tech—what we find acceptable or feel comfortable with in the moment and in a certain social setting? Take “emotes,” for example. They’re a way users can express their emotions in otherwise dry, impersonal text chat. They’re also part of channel culture and community-building. So are “bits,” colorful little animated graphics that viewers can buy to cheer a streamer on from the chat window. And these graphical elements help moderators see at a glance, in a fast-moving chat stream, when things are good or going south. Twitch told me it has even just launched “Creator Camp,” aimed at training streamers and their “mods” (channel moderators) on how to create their unique community and its unique culture—“all without bug spray, sunburns, and cheesy sing-alongs (ok, maybe a few sing-alongs),” Twitch tells its streamers.

So two vastly different live video companies are experimenting with something we’re all beginning to see, that…

      • Live video makes social media a lot like in-person socializing
      • Social media, like social anything, is affected by social norms
      • Social norms help make socializing safer
      • People stick around and have more fun when they feel safe.

The bottom line being that safety—in the extremely participatory business of social media—is good business.

[Disclosure: I am an Internet safety adviser to Yubo and other social media services, including Facebook, Twitter and Snapchat, but not Twitch.]

Related links

  • A more detailed version of this post can be found at Medium.com
  • Live-stream video metrics: “Two-thirds (67%) of consumers globally have streamed live video,” according to a June 2018 report.
  • The power of social norms in online safety: I’ve written a lot about it in this blog. A few examples: “Kids deserve the truth about cyberbullying” (2011); “Zooming in on social norms” (2014); and “Core concerns: ‘Blue Whale’ & the social norms research” (2017)
  • All negative, all the time not helpful: At a conference last June, psychology professor Sonia Livingstone, a leading researcher in the field of youth online risk, called the current debate about “tech addiction” and “screen time” impossibly “monolithic. We get complete exclusion of any kind of consideration of the opportunities that technologies bring.” She’s no apologist for technology, she said, but she suggested the importance of taking a “critical stance” toward the uniformly negative narrative we’re seeing in tech news. Even professionally neutral researchers are calling for more balance in the public discussion about online wellbeing.
  • A “day” in the life of a content mod: a talk at MIT by volunteer content moderator at Twitch and grad student Claudia Lo, illustrating how it’s proactive and reactive and so much more than that binary (and this shows how important transparency is for anyone talking about regulation)
  • Other tools and measures Yubo uses, a post in my blog a year ago before the app rebranded (and, more recently, Yubo’s own perspective)
  • About content moderation on Reddit: insights from Nate Mathias, PhD
  • Pioneering journalism on the subject at Wired from Adrian Chen in 2014 and from Catherine Buni and Soraya Chemaly in The Verge’s award-winning “Secret Rules of the Internet” in 2016
  • “Twitch and Shout” – in-depth radio program on Twitch from public radio’s On the Media
  • “Post No Evil”: RadioLab’s remarkable in-depth report on content moderation at Facebook (some of the best reporting I’ve seen so far on the subject)
  • “There is no magic bullet for moderating a social media platform,” Techdirt’s podcast about the first CoMo conference
  • Players in the spotlight: “Games Are Taking a Back Seat to Players on Video Game Streaming Sites” at NPR.org

Disclosure: As a nonprofit executive, I’ve advised companies such as Google, Facebook and Yubo (though not Twitch) on youth online safety for a number of years. The ideas expressed here—informed by that work, as well as 20+ years of writing about youth and digital media—are entirely my own.


Filed Under: Risk & Safety, social norms, video, Youth Tagged With: algorithms, live-streamed video, Marc-Antoine Durand, Shirin Keen, Twitch, video games, video streaming, Yubo

