They were historic conversations on many levels, and not just because 11 British Members of Parliament flew across the Atlantic to hold hearings with Google, Twitter and Facebook executives (as well as scholars, journalists and news publishers) at George Washington University last week. It was “the first ever live broadcast and public hearing of a House of Commons select committee outside the UK,” The Guardian reported, and there were some five hours of recorded formal testimony (it can be watched here and here).
The fairly limited news coverage of the hearing focused on the sessions with Facebook, Twitter and Google/YouTube. But the House of Commons’ Digital, Culture, Media and Sport Committee heard a full afternoon of testimony from two other groups as well: scholars and researchers, and journalists and news publishing executives. The subject was “fake news,” but the MPs didn’t seem sure exactly what that is and involves (not unlike the rest of us).
That the hearing encompassed so much – everything from the past and present of the news business to electoral law to the future of democracy, in addition to algorithms, content moderation, and news’s place in social media’s vast spectrum of content – was both good news and bad news. It was bad news because the problem of fake news didn’t get full, in-depth treatment. For example, at one point the conversation pivoted rapidly from how U.S. voter data was processed in the U.K. (by London-based data mining and analysis firm Cambridge Analytica – see this by testifier David Carroll of The New School) to comparing social media platforms to traditional publishing companies (more on this in a moment). It was good news because this cross-disciplinary conversation needed to start, and it shone a bright spotlight on how much would-be regulators and Internet companies have to learn about each other and how much they and all of us have to learn about the societal impacts of big data – if we’re willing to learn.
The definitions problem
At one point in the first afternoon session, MP Simon Hart noted that there are already “established norms for people who have the power to affect the outcome of elections by what they choose to print and what they choose to withhold,” and asked Claire Wardle, a research fellow at Harvard’s Shorenstein Center, “is there a sustainable argument out there which explains why people who run an online platform consider themselves to be in a very different place legally from those who run an offline platform, i.e. a newspaper?”
Dr. Wardle responded, “My frustration is, we get into these battles of definitions, with us saying ‘you’re a publisher,’ and them saying ‘no, we’re a platform.’ The truth is, they’re somewhere in the middle. They’re a hybrid form of communication. What I’d like to see, and to be honest I did hear some of that this morning [in the testimony of the platforms], ‘We would like to be part of the conversation around what new forms of regulation might look like.’ Because I don’t think we can take the broadcast model. We can’t regulate speech on Twitter in the same way we regulate the BBC. That’s not workable.”
Clearly the definitions problem is central. It came up many times in the other afternoon session too, the one with journalists and news publishers, which was more about business, changing content and users/audiences than law or election manipulation. David Chavern, CEO of the News Media Alliance, said, “These are amazing products that give our news brands access to many, many people, and they’ve built amazing technologies. People ask if they’re publishers, media companies or platforms. Yes. All of that. I primarily call them attention companies. They want to access as much of the public’s attention as possible. They want more of it tomorrow than they have today…. Do I want a world where Google and Facebook are editors? No. But I think they could do a much better job of helping the user separate the wheat from the chaff.”
Facebook has started down that road. The platform “unveiled a plan to crowdsource credibility ratings for news outlets and has assembled a team of fact-checkers in Italy, ahead of next month’s parliamentary elections in that country,” the Washington Post reported in its coverage of the hearing. Because this and other coverage focused on the social media sessions, here are…
Other highlights on the “fake news” topic
- The content mashup: “There is no clear bright line between what constitutes news and other information,” said Kinsey Wilson, Chief Content Officer at the New York Times. “There’s a spectrum of information that has become part of the mix on these platforms and become very difficult to separate out.” Earlier he said that “makes it very difficult to determine whether [the platforms] are in the publishing space or merely a purveyor of content other people have produced. Clearly through their algorithms, they’re applying a level of judgment to what people see, and that has a societal impact quite apart from the misinformation that flows through those platforms.”
- The burden on consumers: David Chavern: “One of the digital challenges is it [news, information, parody, clickbait] is all put into a blender, which puts an enormous burden on consumers on how to differentiate,” and what comes out of that “blender” is never simply true or false. “The most powerful kind of fake news is content that’s just somewhat off [the mark], which feeds people’s biases – says what they wish was true.”
- Augmented media literacy needed: Dr. Wardle at Shorenstein said, “Most users don’t understand this space. News and media literacy has to include how algorithms get developed and how they work – kids need to be taught how to evaluate an algorithm….”
- Algorithms and transparency: “We keep saying algorithms are black boxes,” Wardle said. “Can we talk about why the algorithm was designed in the first place … what was the goal?… Yes [the platforms] are commercial entities and have to maintain their competitive edge but because of their influence [on societies], we need that transparency.”
- Problems not new: The newspeople made it clear that their business’s digital challenges didn’t start with Brexit and the 2016 U.S. election, or even social media. CBS News correspondent Major Garrett said it started when classified ads went digital. “Craigslist and eBay killed American newspapers,” he said, “taking 35-40% [of revenue] out of every newspaper in America within about two years.” He pointed to new models now showing promise: The Texas Tribune, supported as a nonprofit community service, and the Las Vegas Review-Journal, an independent news service and Web site “built entirely on donations from the community and from businesses in the community.”
- Upsides for news publishers: Revenue for mainstream news providers is up, they said, as is audience engagement. Brand has become more important than ever, said Tony Maddox, managing director of CNN International. “There are a lot of good things happening right now, and I think the news business is benefitting.” Wilson of the New York Times: “Facebook is the most efficient way for us to acquire subscribers,” and subscriptions represent two-thirds of the Times’s revenue now, he added. CBS’s Garrett said, “It’s a tremendous opportunity, it seems to me. Untrustworthy media went unanswered before. Now there’s great interest in that question of trust. This time feels very heavy, but also the best opportunity we’ve had in our careers because the audience has never been more interested in what we do and how we do it.”
- Divisiveness & bad actors: In a comment on “means and motive,” Major Garrett said that the American political process has long had bad actors, only now the means is different: “We’ve given [the current ones] the means by which to fulfill their motives by being so archly partisan against one another.” He added that there is now “a much wider opportunity for the fake news or propaganda that divides us to have traction. Our own political discourse has created an opening that I don’t believe existed before.”
- Impact on the political process: David Chavern of the News Media Alliance said the impact is greatest on smaller communities, pointing to a “brutally destructive fake news story in Twin Falls, Idaho…. I don’t think that’s something the tech giants understood when they got into this world, and they’re trying to figure out how to fix that and haven’t done that yet…. We need quality news or we’re not going to have civil society,” he added. CBS’s Garrett said, “I can only take the long view on that. That which is not only true today but tomorrow and a year from now will live in starker contrast to that which is not true…. Those responsible for that which is true will gain [influence] substantially over time…. If I didn’t believe that, I couldn’t do what I do” as chief White House correspondent.
- A kind of language barrier: George Washington University media professor Frank Sesno raised the question of “whether we’re going to have an informed or a deformed public discourse and public process” in addressing “fake news” and pointed to a culture and language barrier challenging discussions between Internet companies and people outside them.
- News isn’t regulated: The New York Times’s Kinsey Wilson told the MPs, “Traditional news organizations in this country are not licensed and, apart from libel and defamation laws, there is no particular regulation that applies to us.” He later added, “There’s a risk that regulation will have a lot of unintended consequences as well as desired impacts.” In the earlier session, Claire Wardle expressed concern about the potential for reflexive regulation: “I don’t think we should have state intervention that potentially is knee-jerk and isn’t reacting to the realities and the challenges that come from these platforms at a scale that’s hard to even imagine,” she said. “I want [the platforms] to be part of the conversation so we can have an honest look at it.”
- Regulate a moving target?: The conditions of this media environment are new and changing, and so are users’ behavior and experiences, CBS News’s Garrett said. “I think we have to at least acknowledge the possibility that there will be behavioral adaptations,” suggesting, it seemed, that we need to see what happens there.
- Our children’s social media: “I’ll use my children as an example – 22, 21 and 17,” Garrett continued. “All three have grown up in this digital world, and they’re already demonstrating signs of exhaustion, psychologically and otherwise. I believe there’s a very real possibility that, as those who grew up [in this media environment] adapt differently, they will begin to send signals to Google, Facebook and others about what they do believe is credible and not credible. And that possibility of them changing things, quite separate from any regulatory regime anyone ever might create, gives me some sense of optimism…. I’m optimistic. Perhaps [it’s] unfounded, but I see it in my own children and the behavior they express for themselves and on behalf of their friends.” I have that same optimism when I talk with my children and their peers in many countries, don’t you? Do we listen to them enough?
For me, Shorenstein Center fellow Claire Wardle summed up this unwieldy discussion in five words: “What are we talking about?” She was coming back to the need for definitions in responding to a question about electoral law. She asked if the MP addressing her was referring only to the content political campaigns post in social media. “Do we regulate just campaigns and candidates when much of the manipulation is cultural and outside the bounds of politics? My problem is with creating boundaries that do nothing because there’s so much outside the boundaries.”
But that shouldn’t in any way discourage us from having these conversations. This hearing was confusing and fascinating and necessary. Kudos and thanks to the British MPs for bringing their hearing to us.
Related links
- Here’s coverage of the hearing from the Chicago Tribune, CNN Money and the Washington Post (mentioned above)
- The testimony of New School associate professor David Carroll was cut off in the hearing video, so for details on his contribution to the hearing, see these articles on Medium.com: “Hacking the Voters” last fall and “Confronting a Nightmare for Democracy: Personal Data, Personalized Media and Weaponized Propaganda”
- The report Shorenstein Center fellow Claire Wardle co-wrote with Hossein Derakhshan for the Council of Europe is here. She described “7 kinds of misinformation and disinformation” (broadly referred to, and mis-referred to, as “fake news”) in this article almost exactly one year ago.
- Adding later (2/20): To Major Garrett’s point about bad actors exploiting our divisiveness, New Yorker columnist Masha Gessen shows better than most how its opposite, a “shared reality,” would protect us, or at least make us less vulnerable: “We [Americans] no longer have a sense of shared reality, a common imagination that underlies political life. In a society with a strong sense of shared reality, a bunch of sub-literate tweets and ridiculous ads would be nothing but a curiosity.”
- “The ‘fake news’ culprit no one wants to identify: You” at Wired.com
- “A transcript of the proceedings will be available later via the Official Report/Hansard,” according to the hearing’s page at Parliament.uk