Formspring: What’s going on around it
Last week I wrote about what’s going on at Formspring. This week: what’s going on around it. The problem all of us are dealing with isn’t mainly a Web site whose business plan didn’t anticipate teens using the site for their own purposes, some of them harmful. I think everyone can agree that the basic problem is the nasty behavior on it. What those of us who grew up in the mass-media era struggle over is what today’s (social) media companies and Web sites can do about that behavior buried in millions of accounts or pages.
“Living, breathing” products. What’s difficult for those of us used to a regulated, professionally produced mass-media environment to understand is why it could be hard for social media companies to fix their “product.” Of course there are things they can and should do to help – such as responsiveness to abuse reports – but there’s nothing they can do to fix the problems in their services completely. Why? Because…
- They don’t produce their “products” – their users do. And not only do users produce most of the product, they’re constantly changing it.
- The products of the social Web not only increasingly mirror the full spectrum of human life, they’re an expression of it – I mean, life, sociality, and behavior are being expressed in (or as) these products in realtime.
- The problems in this “living” product are largely rooted in real-life relationships, not just in the Web sites where the behavior surfaces.
- New “product,” good and bad, is unfolding continuously. The measures sites can take – account deletions, for example – are temporary, not solutions. New profiles and groups can replace deleted ones pretty much instantly (and do, when there’s bullying going on or a long-term disagreement).
A new social contract. This new, user-driven media environment we’re all experiencing (even non-users, whose loved ones take and post photos of them on a social network site) is, in effect, a new kind of social contract – a tenuous, evolving, unwritten “contract” to which providers, users, and government are all “signatories”; power, or control, is distributed; and it’s already in place. We all, providers included, signed on the minute we started participating. Under it, no single signatory – users, companies, or government – has total control over the product (users who have self-control still have to negotiate outcomes with fellow users who can comment on their content, tag them in photos, etc.), and all have responsibilities as well as rights. It’s the social Web’s “social contract” that, because its conditions are constantly changing, is under continuous negotiation.
But let’s look at some of the “fixes” being proposed for the Formspring problem:
- Not allowing users under 18. Formspring could make the investment in adding credit card validation, as adult-content sites do, but should this be required of all sites used by teens, barring all teens when not all have abused their services – in fact, probably only a minority of them? It would have to be required by some sort of regulation because few sites would so significantly reduce their traffic, their source of revenue, without being required to do so. But that regulation – even if it passed constitutional muster – would affect only US-based sites, marginally solving only part of the problem.
- Employing moderators and tech protections to stop abusive behavior. Children’s sites do this because children must be protected, as all minors should be, but I wonder if sites for everybody 13+ could afford such protections scaled to tens, in some cases hundreds, of millions of users and stay in business. Probably not, but such an analysis has not been made public. Probably many would be forced out of business, which is fine, theoretically, but people have gotten accustomed to using social network sites. They’re a fact of life now, and there would probably be an even greater clamor to keep them in business than there is to shut down less conscientious sites when they hit the news. The dilemma even great kids’ sites have is that the more restrictions you place on user activity, the more likely users are to migrate elsewhere. So, for survival, kids’ sites have to strike a delicate balance: “protecting their brand” and getting as close as possible to 100% safe, while not restricting kids’ activity so much that they go to the nearest competitor (there are hundreds of kids’ virtual worlds – see this on virtual world moderators and these safety tips).
- Beefing up abuse reporting and customer service. Something we want from all sites, but there are the regulatory and business issues above, and then there’s the Formspring example. Some of the nasty behavior on it is peer group-imposed or self-imposed harassment – like the peer-pressure-loaded show of bravado in a “Truth or Dare” game. Participants would not be reporting abuse that they don’t see (or won’t admit to seeing) as abuse. Adults who go on the site and find abusive behavior could use the abuse-reporting process, but what would it accomplish beyond getting accounts shut down and, again, sending kids into more “underground” settings for this interaction, where there’s even less prospect of detection or supervision?
- Pressuring Formspring out of business to set an example. Teens are always looking for spaces online and offline away from adult monitoring, so the more we might try to ban or shut down Web sites, the fewer legitimate sites there would be for teens and the more “underground” they go. New ones pop up all the time, usually without the safety features that more established sites have. “Underground” is a big space online.
We don’t need to add “migratory cyberbullying” to the youth-online-risk list by focusing too much on “bad sites” rather than bad behavior. The profound media shift we’re experiencing is requiring a huge adjustment for anyone not born into the Web 2.0 era, and it’s extremely frustrating to anyone seeking recourse through litigation or regulation (both becoming slow, inflexible, blunt instruments in this fast-moving, user-driven environment). Social media companies that don’t really own the content and can’t keep it from leaving don’t have as much control over the product as we might think. And, though users have more power than they did in the mass-media era (or ever in history), they have little control over fellow users. Increasingly, it’s all a negotiation – or better, a collaboration.
Challenge me on any of this. This is starkly painted, and of course nothing’s black and white. But I’ve been watching this new social contract evolve for 15 years, and it’s clearer and clearer to me that the only real solution is an immediate and long-term national (or global) commitment to teach and model the rights and responsibilities of good citizens online as well as offline (because it’s all just life to young people) and a new media literacy that addresses what’s said, done, and shared in media as well as what’s consumed. It’s far from a quick fix but – once we get going – it will start delivering right away. Because the research shows that responsible behavior is protective.