Twitch, Yubo & online safety innovation with an ancient ‘tool’

As geeky as "content moderation" sounds, we're hearing about it more and more these days – in podcasts, at conferences, in books and in hearings. People all over the U.S. and around the world are talking about how to make social media safer, and content moderation by the platforms is an important part of that.

But what we’re not hearing or talking about enough is another part of the social media safety equation: social norms, a tool we humans have been using to shape behavior for millennia. There’s some really interesting innovation going on around social norms right now, and in a sector of social media where you might least expect to see it: live-streamed video, projected to be more than a $70 billion market in two years but already mainstream social media for teens and young adults, according to Pew Research. I’ll give you two examples.

Parents, you're very likely to have heard of Twitch, the 800-pound gorilla of streaming platforms, based in San Francisco. The other example, Yubo, a fast-growing startup based in Paris, may be familiar to parents in Europe, but I'll get to that in a minute. Both are innovating in really interesting ways, blending human norms and tech tools. Both…

  • Innovate around user safety because it’s “good business”
  • Have live video in their favor as they work on user safety
  • Empower their users to help.

To explain, here’s what they’re up to….

Starting with the startup

Yubo, an app that now has more than 15 million users in more than a dozen countries, reached its first million within 10 months without any marketing (it was then called Yellow). The app's all about communication—chat via live video-streaming. Somehow, the focus of early news coverage was on one Tinder-like product feature, even though the app's safety features have long included a "firewall" that keeps 13-17 year-old users separate from adults. Virtually all news coverage has been alarmist, especially in Europe, encouraging parents and child advocates to expect the worst.

So Yubo doubled down, with a mix of high- and low-tech "tools." One of them is surprisingly unusual, and it definitely involves an element of surprise: nearly real-time intervention when a user violates a community rule (the rules are no bullying, hate speech, underwear, nudity or violence).

[Yubo screenshot]

When the violation happens—maybe a user joins a chat in their underwear—the user almost immediately gets a message like the one in the screenshot to the left.

It’s one of those sudden disruptors that stops the conversation flow and changes the dynamics, helping chatters learn the community standards by seeing them enforced in real time. Yubo can do that because it takes a screenshot of all chat sessions every 10 seconds; so between users’ own reports of annoying behavior and algorithms running in the background that detect and escalate it, the app can respond very quickly. If the violation doesn’t stop, moderators will either disable the chat mid-stream or suspend the user’s account.
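
For the technically curious, here's a rough, hypothetical sketch (in Python) of how that flow could be wired together: classifier scores from a 10-second screenshot and any user reports feed one decision about whether to warn the chatter or hand things to a moderator. The names, threshold and categories are mine, not Yubo's; they only illustrate the logic described above.

```python
from dataclasses import dataclass
from typing import Optional

ESCALATION_THRESHOLD = 0.9  # assumed confidence cutoff, not a figure Yubo has published

@dataclass
class ModerationDecision:
    action: str             # "none", "warn_user", or "disable_or_suspend"
    reason: Optional[str]   # which community rule appears to be broken

def decide(scores: dict, user_reports: list, prior_warnings: int) -> ModerationDecision:
    """Combine classifier scores from a 10-second screenshot with user reports."""
    # Pick the rule the background algorithm is most confident was violated.
    top_rule, top_score = max(scores.items(), key=lambda kv: kv[1])
    flagged = top_score >= ESCALATION_THRESHOLD or bool(user_reports)

    if not flagged:
        return ModerationDecision("none", None)
    if prior_warnings == 0:
        # First offense: the near-real-time warning message described above.
        return ModerationDecision("warn_user", top_rule)
    # The behavior didn't stop after a warning, so a moderator steps in.
    return ModerationDecision("disable_or_suspend", top_rule)

# Example: a screenshot scores high for "underwear" and one viewer has reported it.
print(decide({"underwear": 0.95, "nudity": 0.02}, ["viewer_report"], prior_warnings=0))
```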

You wouldn’t think an app communicating with users in real time is that unusual, but Yubo says it’s the only social video app that intervenes with young users in this way, and in more than 20 years of writing about kids and digital media, I haven’t seen another service that does.

Teaching algorithms

The other innovative part of this is the algorithms, and the way Yubo uses them to detect problems proactively for faster response (Twitch doesn’t even have algorithms running proactively in the background, they told me). Proactive is unusual. For various reasons—including the way platforms are used—content moderation has always been about 99% reactive, dependent on users reporting problems and never even close to real-time. Bad stuff comes down later—sometimes, without explanation.

Knowing how hard it is for algorithms to detect nudity and underwear, I asked Yubo COO Marc-Antoine Durand how that works on his app. “We spent nearly a year labeling every reported piece of content to create classifiers. Every time we had a report, a dedicated team had to classify the content into multiple categories (ex: underwear, nudity, drug abuse, etc.). With this, our algorithms can learn and find patterns.”
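
To make that a little more concrete, here's a small, hypothetical sketch of what such a labeling workflow might look like in code. The category names come from Durand's quote; the function names, fields and CSV file are placeholders I made up, not Yubo's actual pipeline.

```python
import csv

# Categories from the quote above, plus a "none" bucket for reports that
# turn out not to break any rule. Illustrative only, not Yubo's real list.
CATEGORIES = {"underwear", "nudity", "drug_abuse", "violence", "hate_speech", "none"}

def label_report(report_id: str, image_path: str, reviewer_label: str) -> dict:
    """One labeled example produced by the dedicated review team."""
    if reviewer_label not in CATEGORIES:
        raise ValueError(f"unknown category: {reviewer_label}")
    return {"report_id": report_id, "image": image_path, "label": reviewer_label}

def append_to_training_set(example: dict, path: str = "training_labels.csv") -> None:
    """Accumulate labeled reports; months of these become classifier training data."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([example["report_id"], example["image"], example["label"]])

# Example: a reviewer decides a reported screenshot shows underwear.
append_to_training_set(label_report("r_1024", "screenshots/r_1024.jpg", "underwear"))
```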

It takes a whole lot of data—a lot of images—to “teach” algorithms what they’re “looking for,” Durand acknowledged. An app that’s all live-streamed video, all the time, very quickly amasses a ton of data for an algorithm to chew on. And Yubo has reached 95% accuracy in the nudity category (its highest priority) with a dedicated data scientist feeding (and tweaking) the algorithms full-time, he said. “We had to make a distinction between underwear and swimsuits,” he added. “We trained our algorithms to know the difference by detecting the presence of sand, water, a swimming pool, etc.”
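
Here's a deliberately simplified, hypothetical illustration of that swimsuit-versus-underwear distinction: a skin-exposure signal is combined with scene cues like sand, water or a pool before anything gets flagged. The cue names and thresholds are invented for illustration; Yubo's real system learns these patterns from its labeled data rather than from hand-written rules like these.

```python
def classify_exposure(signals: dict) -> str:
    """signals: detector confidences in [0, 1] for each visual cue."""
    exposure = signals.get("skin_exposure", 0.0)
    swim_context = max(signals.get("sand", 0.0),
                       signals.get("water", 0.0),
                       signals.get("swimming_pool", 0.0))
    if exposure < 0.5:
        return "ok"
    # High exposure plus a strong swim-scene cue reads as a swimsuit, not underwear.
    return "swimsuit" if swim_context > 0.6 else "underwear"

print(classify_exposure({"skin_exposure": 0.8, "sand": 0.9}))  # -> "swimsuit"
print(classify_exposure({"skin_exposure": 0.8}))               # -> "underwear"
```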

Not that nudity's a huge problem on the app—only 0.1-0.2% of users upload suggestive content as profile photos on any given day—but Yubo says they prioritize nudity and underwear anyway.

But algorithms, high-frequency screenshots and the element of surprise aren't the only tools in the toolbox. Probably because live-streamed video hasn't come up much on the Hill or in news about "content moderation" (yet), we don't hear much about how live video itself—and the social norms of the people using it—are safety factors too. But it's not hard for anyone who uses Skype or Google Hangouts to understand that socializing in video chat is almost exactly like socializing in person, where it's extremely unusual for people to show up in their underwear, right? If someone decides to do that in Yubo, and others in the chat have a problem with it, the person typically gets reported because it's just plain annoying, and moderators can suspend or delete their account. The every-10-second screenshots are a backup so moderators can act quickly on anything not reported. On top of that, Yubo provides moderation tools to users who open chats, to the "streamer" of the "live."

Twitch communities’ social norms

Twitch leverages social norms too. It empowers streamers, or broadcasters, with tools to maintain safety in chat during their live streams. With 44 billion minutes streamed per month by 3.1 million "streamers" (according to TwitchTracker.com), the company really has to harness the safety power of social norms in millions of communities.

"We know through statistics and user studies that, when you have a lot of toxicity or bad behavior online, you lose users," said Twitch Associate General Counsel Shirin Keen at the CoMo Summit, one of the first conferences on the subject. "Managing your community is core to success on Twitch." So because streamer success is Twitch's bread and butter, the platform has to help streamers help their viewers feel safe.

Twitch’s phenomenal growth has been mostly about video gamers streaming their videogame play. But the platform’s branching out—to the vastness of IRL (for “In Real Life”), its biggest new non-gaming channel. And just like in real real life, each community develops its own social norms. The Twitch people know that. The platform lets streamers shape their own norms and rules—lets them appoint their own moderators and provides them with moderation tools for maintaining community standards.

So can you see how safety innovation is as much about humans as it is about tech—what we find acceptable or feel comfortable with in the moment and in a certain social setting? Take "emotes," for example. They're a way users can express their emotions in otherwise dry, impersonal text chat. They're also part of channel culture and community-building. So are "bits," colorful little animated graphics that viewers can buy to cheer a streamer on from the chat window. And these graphical elements help moderators see at a glance, in a fast-moving chat stream, when things are good or going south. Twitch told me they've just launched "Creator Camp," aimed at training streamers and their "mods" (channel moderators) on how to create their unique community and its unique culture, "all without bug spray, sunburns, and cheesy sing-alongs (ok maybe a few sing-alongs)," Twitch tells its streamers.

So two vastly different live video companies are experimenting with something we’re all beginning to see, that…

      • Live video makes social media a lot like in-person socializing
      • Social media, like social anything, is affected by social norms
      • Social norms help make socializing safer
      • People stick around and have more fun when they feel safe.

The bottom line: safety, in the extremely participatory business of social media, is good business.

[Disclosure, I am an Internet safety adviser to Yubo and other social media services, including Facebook, Twitter and Snapchat but not including Twitch.]
