
NetFamilyNews.org

Kid tech intel for everybody


Key piece of the puzzle: Australia’s ‘Safety by Design’ tools

June 22, 2021 By Anne

A vital piece of the global online safety puzzle has just fallen into place: Australia’s eSafety Commissioner’s Office this week unveiled its Safety by Design tools for Internet companies everywhere. They’re the outgrowth of extensive international research and consultation with people in industry, government, academia and advocacy, including youth and parents – a process eSafety started in 2018.

Think of the puzzle as a sort of Rubik’s Cube. It has to be 3D because of the technical “stack” of businesses that make the Internet possible, with all the moving parts on each layer – human, algorithmic, organizational – which help keep users safe. We users know the top layer best, but the layer we use can’t be the only one thinking about safety. On every level, the prevention-intervention spectrum needs to be considered. Prevention typically means legislation and education – of everybody on the receiving and providing end of digital media. Intervention includes everything from law enforcement to algorithmic and human content moderation behind platforms to Internet hotlines and user helplines to regulatory action.

The interesting thing about the eSafety Office is that it not only fills both prevention and intervention roles (regulation, user education and intervening in cases of online abuse), it’s also collaborative – across borders, sectors and fields of expertise – in developing solutions. The people there know that no single actor, whether a national government or even an industry coalition, can solve this puzzle by themselves. They also work it with industry, for industry, which everyone knows is unique for a regulator but I believe essential for our user-driven media environment. [See this about the principles behind the tools and the sidebar below this post to learn more about eSafety directly from Commissioner Julie Inman Grant.]

Though regulators in many countries have long told corporations what they must do, regulatory work has been short on the how-to. The missing piece, until now, was corporate education that helps companies assess, against a clear set of safety standards, the policies, products and systems they either have in place or need to develop. The cross-sector consultation eSafety conducted makes the tools applicable globally, not just for a particular government’s regulator. They come in two sets of modules: one for startups (<50 employees), a primer on all aspects of user safety for development from the ground up, and one for mid-size to large companies (50+ employees) that works as an audit tool. The tool for established companies is truly comprehensive, covering “structure and leadership,” “internal policies and procedures,” “moderation, escalation and enforcement” (including fulfilling legal obligations), “user empowerment” and “transparency and accountability.” The list alone sets a kind of standard. The eSafety Office says the aim in developing both tools was for them to be “realistic, actionable and achievable,” and both come with case studies and examples from internationally known companies.

There are so many other things to love about these tools: that they…

  • Give early stage companies time to think through unintended consequences (not a feature of social media’s early days, right?!)
  • Include a typology of online harms based on human rights principles as well as the lived experience of eSafety user help services
  • Are designed to help both corporate leadership and product designers and managers
  • Come with a commitment to ongoing iteration, based on collaboration with university business schools, computer engineering departments and programs in international and human rights law.

So yes, safety for users all over the world at scale is a bit of a Rubik’s Cube, but it’s actually getting less challenging. Because, thanks to research and tools like these, we know better than ever what we’re dealing with, how best to get things done, who can help and with what skill sets. Plus, we have some fine models in place (see below) and great tools coming on line. Clarity is spreading.

SIDEBAR: Updated prescription

While we’re on the subject of safety assessment tools for industry, let’s assess where we are as a field. What we’re really talking about is an ecosystem of Internet user care that spans the planet. I got prescriptive in an article for Medium a year ago, so building on the steps I laid out there, here’s an update:

If you got this far, you read the latest on the prevention side, part of a holistic approach that includes intervention as well: the eSafety Commissioner’s Office. On the intervention end of the spectrum for the stack’s whole top layer (social apps and services, games, websites, etc.): In addition to the longstanding provisions of law enforcement, platform abuse reporting systems (of varying degrees of helpfulness) and content moderation, we now have user appeals in the form of an Oversight Board. It was a missing piece too – intervention well after the fact in the form of appeals against content moderation decisions. So far, it’s for Facebook users only. It needs to be cross-platform. A couple more prescriptions (I called them predictions) for that Board are here.

That type of intervention is important but, as I mentioned, way after the abuse or Terms of Service violation occurs. Arguably needed even more around the world is reasonably immediate help that goes beyond mere content deletion (the latter being the only help platforms can provide). This kind of help, which can obtain offline context for the issue seen online, is what eSafety, Internet helplines throughout Europe, NetSafe in New Zealand and more and more “traditional” child helplines provide. The world’s vulnerable Internet users need contextualized help, someone who can understand what in the victim’s life gave rise to the harmful online content. Only sometimes is getting the content taken down all they need. If so, perfect. Maybe platforms’ systems can help with that. But the giant platforms get so many non-actionable abuse reports (what they call false positives) that, probably more often than not, they won’t even get to that content deletion. Sometimes they will if a “trusted flagger” like a helpline provides the context (or verification) the platform’s moderation team needs to act on the report. So what I’m saying is, Internet users need a trusted flagger organization such as a helpline – one that the platforms have agreed to work with – in every region of the world, ideally every country (we don’t have one in the US). I call this the “middle layer” of help between the help vulnerable people or groups have on the ground (e.g., hotlines for suicide or violence prevention) and the platforms in the cloud – see this post for a bit more about that and this site with lessons learned from piloting a helpline in the US.

We have the first inklings of support for the people who care for the users: content moderators. Their new professional association, the Trust & Safety Professional Association, provides peer support, education and research.

We also have a brilliant model in the form of a working international coalition of companies addressing child sexual abuse material: the Tech Coalition. That cross-industry model needs to be applied to other forms of online harm, including hate speech, harassment and cyberbullying, faced by users of all ages.

We now have the best possible framework for child and youth online safety: General Comment 25 on their digital rights – of participation and provision, as well as protection – adopted this past February by the UN Committee on the Rights of the Child.

The US needs a new model for Internet regulation. Though we have some quasi-governmental organizations (quangos) such as the National Endowment for the Arts and the National Labor Relations Board, we don’t have a regulatory one like the UK’s Ofcom. And – though it would be fascinating to explore – something like Australia’s centralized and collaborative eSafety Office may not work for this larger federal system. “Our existing regulatory tools…are not up to the task. They are not fast enough, smart enough, or accountable enough to achieve what we want them to achieve,” wrote University of Toronto law Prof. Gillian Hadfield in Quartz. She proposes “super-regulation” – “super” because it elevates governments out of the business of legislating the ground-level details and into the business of making sure that a new competitive market of [licensed] private regulators operates in the public interest.” Other interesting proposals, in particular by author Tarleton Gillespie of Microsoft Research New England, are discussed under ‘Super-Regulation’ here.

Related links

  • About the launch of the Trust & Safety Professional Association, an important development, wherein I quote a great tweet by legal scholar Evelyn Douek: “Americans want platforms to be places of open expression, but also misinfo removed, but also tech companies have too much power & can’t be trusted to make content decisions, but also the gvt would be worse.” Right?
  • The prescriptive post I mentioned that’s as iterative as Safety by Design
  • Some early predictions for the Oversight Board
  • Of recent Internet-related dilemmas and developments: deplatforming heads of state and how meme culture gamifies reality (and longstanding institutions)
  • Lessons learned from piloting SocialMediaHelpline.com for US schools

Filed Under: international online safety, Law & Policy, Research, Risk & Safety, Social Media Tagged With: eSafety, Gillian Hadfield, Julie Inman Grant, Office of the eSafety Commissioner, Safety by Design, Tarleton Gillespie


Copyright © 2022 ANNE COLLIER. ALL RIGHTS RESERVED.