No stage of life is easy, but being a teen is hard in a very particular way. “Adolescence is a period of rapid cognitive, social and emotional development,” said researcher Amanda Lenhart, Health & Data lead at the Data & Society Research Institute in New York, at a conference this week. “They’re a protected legal category. They don’t have the same power or ability to make decisions” as adults do or “the same opportunities to consent,” she said, in a talk about an important, just-released report she co-authored with Kellie Owens: “The Unseen Teen: The Challenges of Building Healthy Tech for Young People.”
It was a “lightning talk,” so Lenhart had picked out one insight – one question, really – from the many findings of her multi-year research project to share at a cross-discipline summit held by TTC Labs and Facebook on “designing [apps and other digital spaces] for trust, transparency and control for young people online.” “Cross-discipline” here means speakers and participants from governments, NGOs, the UN, academia and tech companies, bringing the many perspectives and skill sets needed to take on the complex challenge of retrofitting digital environments to serve children’s best interests – something they were not originally designed to do.
Design for growing up
“Teens push boundaries, make mistakes, transgress,” Lenhart said. Her question – what she called a “provocation” – was: how do social media companies design for that? How do you design for the beautiful, challenging developmental work of adolescence? Taking a cue from psychology professor Alison Gopnik at the University of California, Berkeley, you might think of adolescence as the most sophisticated part of the “R&D Division” of human development.
One of the biggest challenges of designing digital spaces for youth is allowing teens to make mistakes, learn from them and try again – demonstrating corporate responsibility and protecting young users as they do the developmentally normative trial-and-error work of being adolescents. Though Lenhart and Owens found tech company employees who care greatly about getting that right, many tech companies aren’t set up to support them (see the sidebar for other problems).
We’ve all heard the axiom “Everyone deserves a second chance” – at least when the mistake is an honest one. It’s hard enough for parents to know whether it is, harder for a school, and harder still for authorities who have never met the child involved in a social situation that goes south. Those who care about a child and get some context for what happened generally try to ensure the child gets a second chance. The further removed authorities are from a child who is learning a tough lesson, the less context they have for what happened. Now think about digital platforms, which have zero “real life” context for a threat or cruel comment posted in their spaces.
A tech worker’s view
Lenhart shared a comment from a corporate participant in her study:
So we’ve been thinking a lot about when a young person violates our rules or does something really wrong, and then they get suspended. Is that something that they should be held accountable for forever? And should we be looking at permanent suspension, especially of young users where they’re still kind of formulating how they want to be in the world? If someone got suspended for saying, ‘I want to kill you,’ that’s suspendable because it’s a violent threat. If you sent that when you were 16 to someone you hate and you actually have no intention of killing them, is there any kind of rehabilitation that we can do?… There’s a lot of things we can do to help educate especially younger users about how they should show up in society, and that’s something that I’ve been talking a lot [about] with the design team.
How many times have the words “I’m going to kill you” been uttered offline jokingly, in frustration or even in anger, with zero intent to harm? Yet putting those words into text in a public digital space raises alarms as a threat of violence. Thank goodness tech workers are seeing the need for caution – and education.
Like parents who want to get to the bottom of things before doling out consequences and positive school cultures that prioritize restorative practices over punitive ones, platforms too need to figure out how to give kids second chances. Many of their content moderators know that the actual context of what happens online, particularly among users under 18, is not the digital space where a comment appears but rather the users’ offline lives and social circles. It’s just that the platforms don’t have that context – and can’t get it in the way that parents and school staff can.
Restorative practices in social media?
Like restorative circles in schools and restorative justice for juveniles in law enforcement, might social media platforms develop spaces or systems for restorative practices – for “rehabilitating” young users who only look like they need rehabilitation because of something they typed jokingly or in a moment of anger, with no intent to harm? There’s that question, and then there’s the question of how to provide rehabilitation for users who really are abusing people and a platform’s systems. Some platforms have a “three strikes, you’re out” policy.
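What might a design that builds in forgetting and second chances even look like? No platform has published its actual enforcement logic, so purely as a thought experiment, here is a minimal sketch in Python of a “strikes that expire” policy with an educational first response for minors. Every name, threshold and category below is hypothetical – an illustration of the shape of the idea, not any company’s real system.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Hypothetical policy knobs -- not any real platform's values.
STRIKE_LIFETIME = timedelta(days=365)   # strikes "age out" after a year
MAX_ACTIVE_STRIKES = 3                  # the classic "three strikes" threshold

@dataclass
class UserRecord:
    user_id: str
    age: int
    strikes: list = field(default_factory=list)  # timestamps of past violations

def active_strikes(user: UserRecord, now: datetime) -> int:
    """Count only strikes that haven't expired -- the 'second chance' part."""
    return sum(1 for s in user.strikes if now - s < STRIKE_LIFETIME)

def handle_violation(user: UserRecord, now: datetime) -> str:
    """Decide a response to a rule violation.

    For a minor's first active strike, the response is an educational
    intervention (restorative) rather than a suspension (punitive).
    """
    user.strikes.append(now)
    count = active_strikes(user, now)
    if count >= MAX_ACTIVE_STRIKES:
        return "suspend"                   # repeated, recent violations
    if user.age < 18 and count == 1:
        return "educational_intervention"  # explain the rule, offer a redo
    return "temporary_restriction"         # limit features, don't remove

# Example: a 16-year-old's first violation gets education, not suspension.
teen = UserRecord(user_id="u123", age=16)
print(handle_violation(teen, datetime.now()))  # -> educational_intervention
```

The two design choices doing the work here are the expiring strikes, which let an old mistake stop counting against a young person, and the age-aware first response, which treats a teen’s first violation as a teachable moment rather than grounds for removal.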
But for offline context, platforms could work together to identify and collaborate with offline helpers, such as child helplines and other trusted NGOs in many countries, which can provide the context and connections with care that young social media users deserve. We don’t have all the answers yet, but we know where to get some in offline societies around the world – and it’s good that researchers and supporters of children’s rights are asking the questions.
SIDEBAR: Top take-aways from the Data & Society report
Just released, “The Unseen Teen” was a multi-year, qualitative research project that interviewed tech company employees in a variety of roles, offering rare insight into “how adolescent well-being is prioritized (or not) in the design and development of popular social media and gaming platforms.” Here are some key take-aways:
- What is “digital wellbeing”? It’s a term we see a lot, but there is actually little agreement, even in the research community, on the definition of “digital wellbeing,” co-authors Amanda Lenhart and Kellie Owens write. As with screen time, measuring it “misses a more holistic view of what benefits younger users.” Young people are not data points. I suggest we think in terms of wellbeing in the digital age, placing digital interaction and activities in the context of everyday offline life.
- The “unseen” part. The platforms young people use were not designed with them in mind, so adolescent users and their needs are an afterthought. If platforms are going to operate in the “best interests of young people,” as called for by globally recognized principles of children’s digital rights, they’re going to need some retrofitting.
- Teens are over-generalized. They’re not an undifferentiated mass of people: companies as well as regulators “must recognize that the impacts of social platform use vary for different subgroups of adolescents,” the authors write. So don’t design for something called “the average teen.”
- The plausible deniability problem. Some companies practice “strategic ignorance by purposely not collecting data on young people … [in order] to avoid tackling thorny issues related to how adolescents use their products,” the authors write – hopefully not in the name of data minimization, which would be an abuse of an important principle in an age of hyper-datification.
- It will “take a village”: The wellbeing, safety and privacy of Internet users under 18 involve many actors and many factors, inside and outside Internet companies. In the report, the authors offer recommendations for both tech workers and regulators, and they say that both internal and external pressure can drive change within tech companies.
Related links
- “The Unseen Teen” report’s recommendations for companies and workers, including “focus on empowering, not just protecting” and “integrate expertise in user wellbeing into product teams”
- Principles of Designing for Children’s Rights (see the 10 principles here, and see my last post about children’s digital rights)
- “Hey Big Tech, Now is the Perfect Time to Support Our Kids” from developmental psychologist Candice Odgers at University of California, Irvine, writing for the Joan Ganz Cooney Center. I love that she writes, “Youth may have the answers to how to best use social media and other digital spaces ‘for good’ [agree!].” She has recommendations too, and “the first step [for platforms] … is acknowledging that [kids] are present in the spaces that Big Tech companies have started to police. Despite many … platforms requiring a user age of 13 or over, when we asked adolescents in our 2015 survey, half of the 11-year-olds reported having a social media account. This number snowballed to 85% by age 14. While some companies have more kid-friendly apps, such as YouTube Kids or Messenger Kids, we know that most young people have remained on the leading platforms. A ‘kiddie’ version of the Internet is not what children under 13 want.”
- “The Overlooked User”: the blog post on this research Lenhart and Owens published last fall
- “The 4 Myths of Healthy Tech”: Speaking of “digital wellbeing,” last fall Lenhart and Owens published this report. Two of the myths are especially relevant to parental concerns of late: “Social media is addictive, and we are powerless to resist it” (there is very little evidence for this, and what evidence exists is controversial, they write) and “our health and wellbeing depend on spending less time with screens and social media platforms.” In response, the authors write, “Not all screen time is the same. It can be connective, supportive, emotionally enriching, horizon-expanding, and educational, as well as sometimes harmful. Social media can be used as a ‘release valve’ for youth, allowing them to manage the pressures and limitations in their lives.”
- The International Institute of Restorative Practices – go-to source on restorative practices in the US (see this page for a definition or this 2-min. video on YouTube)
- “Cyberbullying and…second chances?” – a reflection I wrote on this subject back in 2010