As geeky as “content moderation” sounds, we’re hearing about it more and more these days – in podcasts, at conferences, in books and in hearings. People all over the U.S. and world are talking about how to make social media safer, and content moderation by the platforms is an important part of that.
But what we’re not hearing or talking about enough is another part of the social media safety equation: social norms, a tool we humans have been using to shape behavior for millennia. There’s some really interesting innovation going on around social norms right now, and in a sector of social media where you might least expect to see it: live-streamed video, projected to be more than a $70 billion market in two years but already mainstream social media for teens and young adults, according to Pew Research. I’ll give you two examples.
Parents, you’re very likely to have heard of Twitch, the 800-pound gorilla of streaming platforms, based in San Francisco. The other example, Yubo, a fast-growing startup based in Paris, may be familiar to parents in Europe, but I’ll get to that in a minute. Both are innovating in really interesting ways, blending human norms and tech tools. Both…
- Innovate around user safety because it’s “good business”
- Have live video in their favor as they work on user safety
- Empower their users to help.
To explain, here’s what they’re up to….
Starting with the startup
Yubo, an app that now has more than 15 million users in more than a dozen countries, reached its first million within 10 months without any marketing (it was then called Yellow). The app’s all about communication—chat via live video-streaming. Somehow, the focus of early news coverage was on one Tinder-like product feature, even though the app’s safety features have long included a “firewall” that keeps 13-to-17-year-old users separate from adults. Virtually all news coverage has been alarmist, especially in Europe, encouraging parents and child advocates to expect the worst.
So Yubo doubled down on safety, with a mix of high- and low-tech “tools.” One of them is genuinely unusual, and it definitely involves an element of surprise: nearly real-time intervention when a user violates a community rule (the rules are no bullying, hate speech, underwear, nudity or violence).
When the violation happens—maybe a user joins a chat in their underwear—the user almost immediately gets a message like the one in the screenshot to the left.
It’s one of those sudden disruptors that stops the conversation flow and changes the dynamics, helping chatters learn the community standards by seeing them enforced in real time. Yubo can do that because it takes a screenshot of all chat sessions every 10 seconds; so between users’ own reports of annoying behavior and algorithms running in the background that detect and escalate it, the app can respond very quickly. If the violation doesn’t stop, moderators will either disable the chat mid-stream or suspend the user’s account.
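To make that concrete, here is a minimal sketch of what such an intervention loop could look like—purely illustrative, not Yubo’s actual implementation. The session object and the classify, warn_user and disable_chat hooks are hypothetical stand-ins:

```python
import time

SCREENSHOT_INTERVAL = 10  # seconds, per the cadence described above

def moderate_live_chat(session, classify, warn_user, disable_chat):
    """Hypothetical intervention loop: periodic screenshots plus user
    reports feed a classifier; a first violation triggers a warning,
    a repeat violation disables the chat."""
    strikes = 0
    while session.is_live():
        frame = session.capture_screenshot()
        violation = classify(frame)                # e.g. "underwear", "nudity", or None
        if violation is None:
            violation = session.pop_user_report()  # user reports are the other signal
        if violation:
            strikes += 1
            if strikes == 1:
                warn_user(session.user, violation)  # the near-real-time message
            else:
                disable_chat(session)               # or escalate to account suspension
                return
        time.sleep(SCREENSHOT_INTERVAL)
```

The point of the sketch is the blend: automated detection and human reports feed the same loop, and the first response is a norm-setting message rather than an immediate ban.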
You wouldn’t think an app communicating with users in real time is that unusual, but Yubo says it’s the only social video app that intervenes with young users in this way, and in more than 20 years of writing about kids and digital media, I haven’t seen another service that does.
Teaching algorithms
The other innovative part of this is the algorithms, and the way Yubo uses them to detect problems proactively for faster response (Twitch doesn’t even have algorithms running proactively in the background, they told me). Proactive is unusual. For various reasons—including the way platforms are used—content moderation has always been about 99% reactive, dependent on users reporting problems and never even close to real-time. Bad stuff comes down later—sometimes, without explanation.
Knowing how hard it is for algorithms to detect nudity and underwear, I asked Yubo COO Marc-Antoine Durand how that works on his app. “We spent nearly a year labeling every reported piece of content to create classifiers. Every time we had a report, a dedicated team had to classify the content into multiple categories (ex: underwear, nudity, drug abuse, etc.). With this, our algorithms can learn and find patterns.”
It takes a whole lot of data—a lot of images—to “teach” algorithms what they’re “looking for,” Durand acknowledged. An app that’s all live-streamed video, all the time, very quickly amasses a ton of data for an algorithm to chew on. And Yubo has reached 95% accuracy in the nudity category (its highest priority) with a dedicated data scientist feeding (and tweaking) the algorithms full-time, he said. “We had to make a distinction between underwear and swimsuits,” he added. “We trained our algorithms to know the difference by detecting the presence of sand, water, a swimming pool, etc.”
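As a rough illustration of that labeling-then-learning workflow (and only that—this is not Yubo’s pipeline, and the features here are synthetic stand-ins for real image embeddings), a sketch might look like this:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Example label set drawn from the categories Durand mentions
CATEGORIES = ["ok", "underwear", "swimsuit", "nudity", "drug_abuse"]

# Stand-ins for feature vectors extracted from reported screenshots;
# a real system would use image embeddings, not random numbers.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 128))            # 2,000 "labeled" screenshots
y = rng.integers(0, len(CATEGORIES), 2000)  # moderator-assigned labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))

# At serving time, context cues (sand, water, a pool) could be added as
# features to help separate "swimsuit" from "underwear"; low-confidence
# predictions would be escalated to human moderators rather than acted
# on automatically.
probs = clf.predict_proba(X_test[:1])[0]
if probs.max() < 0.9:
    print("escalate to human review")
```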
Not that nudity’s a huge problem on the app—only 0.1-0.2% of users upload suggestive content as profile photos on any given day—but Yubo says it prioritizes nudity and underwear anyway.
But algorithms, high-frequency screenshots and the element of surprise aren’t the only tools in the toolbox. Probably because live-streamed video hasn’t come up much on the Hill or in news about “content moderation” (yet), we don’t hear much about how live video itself—and the social norms of the people using it—are safety factors too. But it’s not hard for anyone who uses Skype or Google Hangouts to understand that socializing in video chat is almost exactly like socializing in person, where it’s extremely unusual for people to show up in their underwear, right? If someone decides to in Yubo, and others in the chat have a problem with that, the person typically gets reported because it’s just plain annoying, and moderators can suspend or delete their account. The every-10-second screenshots are backup, so moderators can act quickly on anything not reported. On top of that, Yubo provides moderation tools to the users who open chats, the “streamers” of the “lives.”
Twitch communities’ social norms
Twitch leverages social norms too. It empowers streamers, or broadcasters, with tools to maintain safety in chat during their live streams. With 44 billion minutes streamed per month by 3.1 million “streamers” (according to TwitchTracker.com), the company really has to harness the safety power of social norms in millions of communities.
“We know through statistics and user studies that, when you have a lot of toxicity or bad behavior online, you lose users,” said Twitch Associate General Counsel Shirin Keen at the CoMo Summit, one of the first conferences on the subject. “Managing your community is core to success on Twitch.” So because streamer success is Twitch’s bread and butter, the platform has to help streamers help their viewers feel safe.
Twitch’s phenomenal growth has been mostly about video gamers streaming their videogame play. But the platform’s branching out—to the vastness of IRL (for “In Real Life”), its biggest new non-gaming channel. And just like in real real life, each community develops its own social norms. The Twitch people know that. The platform lets streamers shape their own norms and rules—lets them appoint their own moderators and provides them with moderation tools for maintaining community standards.
So can you see how safety innovation is as much about humans as it is about tech—what we find acceptable or feel comfortable with in the moment and in a certain social setting? Take “emotes,” for example. They’re a way users can express their emotions in otherwise dry, impersonal text chat. They’re also part of channel culture and community-building. So are “bits,” colorful little animated graphics that viewers can buy to cheer a streamer on from the chat window. And these graphical elements help moderators see at a glance, in a fast-moving chat stream, when things are good or going south. Twitch told me it has even just launched “Creator Camp,” aimed at training streamers and their “mods” (channel moderators) on how to create their unique community and its unique culture, “all without bug spray, sunburns, and cheesy sing-alongs (ok, maybe a few sing-alongs),” Twitch tells its streamers.
So two vastly different live video companies are experimenting with something we’re all beginning to see, that…
- Live video makes social media a lot like in-person socializing
- Social media, like social anything, is affected by social norms
- Social norms help make socializing safer
- People stick around and have more fun when they feel safe.
The bottom line being that safety—in the extremely participatory business of social media—is good business.
[Disclosure: I am an Internet safety adviser to Yubo and other social media services, including Facebook, Twitter and Snapchat, but not Twitch.]
Related links
- A more detailed version of this post can be found at Medium.com
- Live-stream video metrics: ”Two-thirds (67%) of consumers globally have streamed live video,” according to a June 2018 report.
- The power of social norms in online safety: I’ve written a lot about this in this blog. A few examples are: “Kids deserve the truth about cyberbullying” (2011); “Zooming in on social norms” (2014); and “Core concerns: ‘Blue Whale’ & the social norms research” (2017)
- All negative, all the time not helpful: At a conference last June, psychology professor Sonia Livingstone, a leading researcher in the field of youth online risk, called the current debate about “tech addiction” and “screen time” impossibly “monolithic. We get complete exclusion of any kind of consideration of the opportunities that technologies bring.” She’s no apologist for technology, she said, but she suggested the importance of taking a “critical stance” toward the uniformly negative narrative we’re seeing in tech news. Even professionally neutral researchers are calling for more balance in the public discussion about online wellbeing.
- A “day” in the life of a content mod: a talk at MIT by Claudia Lo, a volunteer content moderator at Twitch and grad student, illustrating how moderation is both proactive and reactive, and much more than that binary (it also shows how important transparency is for anyone talking about regulation)
- Other tools and measures Yubo uses: a post on this blog from a year ago, before the app rebranded (and, more recently, Yubo’s own perspective)
- About content moderation on Reddit: insights from Nathan Matias, PhD
- Pioneering journalism on the subject at Wired from Adrian Chen in 2014 and from Catherine Buni and Soraya Chemaly in The Verge’s award-winning “Secret Rules of the Internet” in 2016
- “Twitch and Shout” – in-depth radio program on Twitch from public radio’s On the Media
- “Post No Evil”: RadioLab’s remarkable in-depth report on content moderation at Facebook (some of the best reporting I’ve seen so far on the subject)
- “There is no magic bullet for moderating a social media platform,” Techdirt’s podcast about the first CoMo conference
- Players in the spotlight: “Games Are Taking a Back Seat to Players on Video Game Streaming Sites” at NPR.org
Disclosure: As a nonprofit executive, I’ve advised companies such as Google, Facebook and Yubo (not including Twitch) on youth online safety for a number of years. The ideas expressed here—informed by that work, as well as 20+ years of writing about youth and digital media—are entirely my own.