Sharing is caring, unless it’s misinformation
Over the past couple of weeks, more misinformation has been spreading online, specifically about the killing of George Floyd by police, and the nationwide protests that followed.
A video claiming that Floyd was alive reached 1.3 million people online. Thousands of others posted on Twitter and Facebook that billionaire George Soros is funding the protests. That is not true.
This interview has been edited for clarity.
Ryan Calo is a researcher with the University of Washington's Center for an Informed Public. He says there's a reason why false information is everywhere right now.
Whenever events unfold very rapidly, it is an opportunity for purveyors of disinformation to get ahead of the news and get ahead of good information by spreading lies. A lot of it has to do with, first, everyone's paying attention to it, and second, it's unfolding so fast that people who have good information are drowned out and outpaced by those who are trying to foment strife.
Calo says social media companies like Twitter and Facebook have taken some steps to prevent false information from spreading online, but it's not nearly enough.
In other contexts, Facebook and Twitter and other platforms have actually been quite successful at dramatically reducing the amount of bad content. You don't see a lot of advertisements for online gambling anymore, and you don't see anything like what you're used to in terms of jihadi recruitment videos.
In fact, even with the coronavirus pandemic, the steps these companies are taking are greater than before, and some people are responding, understandably, "OK, now do hate speech. Now do these other things."
Specifically within Facebook, just in the last week, reporting has revealed a rift between the leadership, particularly Mark Zuckerberg, and a number of employees. The employees would rather see Facebook take down problematic posts by politicians than leave them up.
Including President Trump, correct?
Especially President Trump, but any prominent politician. The arguments are really interesting. The people that want to leave it up, they argue something like this: Yes, it's a violation of our terms of service to say "When the looting starts, the shooting starts," but by taking it down, we withdraw from public scrutiny a very powerful and problematic statement that people ought to scrutinize.
The arguments on the other side are also quite powerful. They go something like: "Well, look, why should a person's statements, by virtue of how powerful and visible that person is, get to survive and stay within the marketplace of ideas, when less powerful, less visible people who say similar things are silenced?"
I think one reason the current protests, and the current moment of racial tension, are particularly a source of disagreement is that it's not merely any violation of the terms of service. It's a violation of the terms of service that seems to be in support of a kind of law-and-order mentality that has fomented racial tension and reinforced racialized violence.
What do you think the solution is right now?
I wish there were a single solution that I could tell you on the radio right now, and it would solve the problem. In fact, I think we're going to need a range of things: teaching people to detect and respond to misinformation, and addressing really dangerous misinformation really quickly. But the other thing is, we ought to consider whether this environment needs to slow down.
What do you mean by slowing down?
The root of the problem of misinformation, in many ways, is the ability of people to share unverified information so quickly. If there were a way to slow that down, it would do a lot of good.
That means both you as an individual hesitating before you share something, and also the very structure of social media, which is one of absolutely rapid, unedited, unfiltered discussion. There may be ways to slow down viral content if it's perceived to be a potential source of misinformation.
But the truth is, you're going to need a range of different things. You need more discerning consumers of information. Also, a lot of this has to do with just the degradation in public trust of institutions. At the Center, we're trying to address all of those things, but we don't perceive that any one of them will be enough.