The company behind Rounds – a video hangout app for mobile and Web – has decided to keep the socializing just among friends. Citing its “young user base” (it says 70% of its users are under 25, though the app has yet to catch on at my son’s high school), Tel Aviv-based Rounds Entertainment announced this week that the less-than-3-month-old app was “retiring” its “Meet New People” feature “to focus on longer, more meaningful video conversations between real friends and family.” “The phenomenon of random video chatting a different stranger every five seconds is over,” said CEO Dani Fishel, probably referring to Chat Roulette (which hasn’t turned up in the news for a while, but see this from 2010).

That sounds like a very smart move, since people, regardless of age, can relax when they’re among friends – when they feel safe. The app, which sounds like a video Instagram and which used to be available only on the Web as a Facebook app, will logically take off even more as a mobile app, simply because phones go everywhere. Because of those Facebook ties (the FB version is called Video Chat Rounds), the new security measure requires that Rounds hangout participants be Facebook friends before they can begin video chatting. According to SheBytes, “Rounds for Facebook had an estimated user base of 1 million users per day with most users averaging about 40 minutes on the app.”
And then there’s Vine
Could Vine’s problems with user-generated pornography be a reason why Rounds was so proactive? The age rating for the video app Vine, which lets anyone create and share six-second video clips, just got increased to 17+, up from 12+, in the Apple app store, TechCrunch reports – though that’s not much of a barrier (it just requires users to confirm they’re 17 or older before they download it), unless iPhone parental controls are set to restrict app downloads by age (see eHow). Other improvements include an accompanying warning of “frequent/intense sexual content or nudity” and the ability to report offending videos or block the people who send them, PCMag.com reports. And TheVerge.com reports that Twitter, which acquired Vine last fall, now blocks porn-related search terms in Vine, such as “#porn” or “#nude.” At the bottom of its article, PCMag flags eight other video apps that “ran afoul of Apple’s policies.”
Perspective for parenting
We now live in a user-generated media environment. Regulating that is not like regulating media; it’s like regulating the content of hundreds of millions of lives updated in real time, as they’re lived (and that’s beside the challenge of regulating speech and upholding the freedom of it at the same time). “Almost every photo and video service that depends on its users to generate content faces the dilemma of screening out content that younger users and others may find objectionable. In most cases, filters and user reports keep the most violent or pornographic content off of services’ main pages,” the Washington Post reports, but obviously they’re not foolproof. For children, after checking the age rating, the next thing to look for is whether an app limits sharing by default (like Rounds) or at least lets users themselves limit sharing and restrict who can “friend” or follow them to friends and family. Also look at the Terms of Service to see what the app doesn’t allow. Or just set those iPhone parental controls (see Coolmomtech.com, though this would probably be a bit of overkill for teens).
Even so, in any digital interaction, safety can’t be 100% guaranteed because of three factors:
- The fluid nature of both technology and humanity in life and user-produced media. On the human side, sometimes friends have arguments, relationships change and break up, and we don’t always know online “friends” (who could be relatives or anybody’s second cousin) as well as friends known in offline life. On the tech side, there’s the copy-anywhere nature of digital media and then the “Net effect,” where all the formerly separate parts of our lives (personal, work, school, family) can show up in one digital place, stay visible virtually forever, and be searchable (based on early work by social media researcher danah boyd – see this).
- Safety is social. The digital media and tech are social, so safety and privacy are as well. For example, when a photo or video involves a bunch of people (or just more than one), safety is shared, sometimes a negotiation. Where that piece of media goes is not up to just one person.
- In transition. Not just the conditions but also our understanding of them are works in progress. Billions of us are using these media and technologies while we’re still getting used to this new set of conditions. We’ve been working out social norms for thousands of years; now we have to apply them to a new environment with unprecedented conditions. That’s both exciting and unsettling – sometimes scary. So we need a lot of patience with each other and a lot of communication, which takes the friction out of learning (unlike fear, which gums things up).
And it helps to remember that digital media are tools and there are risks and downsides to every useful tool. Knives can cut, stoves can burn, cars can crash, but all are great tools, and the positive aspects are what people experience most. With video, some of those might be getting a distant friend in on a party, playing social games across time zones, seeing loved ones’ facial expressions when a big announcement’s being made, and generally sharing everyday lives with people who are important to us in a dynamic, immediate kind of way.