Facebook, Google Spread Misinformation About Las Vegas Shooting. What Went Wrong? | KUOW News and Information

Facebook, Google Spread Misinformation About Las Vegas Shooting. What Went Wrong?

Oct 3, 2017
Originally published on October 4, 2017 6:35 am

In the hours just after the massacre in Las Vegas, some fake news started showing up on Google and Facebook. A man was falsely accused of being the shooter. His name bubbled up on Facebook emergency sites and when you searched his name on Google, links of sites connecting him with the shooting topped the first page.

It appears to be another case of automation working so fast that humans can't keep pace. Unfortunately, these powerful tech companies continue to be a main destination for news, and it's not clear how they can solve the problem.

In this particular case, the man's name first appeared on a message board on a site called 4chan. It's known as a gathering spot for underground hackers and the alt-right. Everyone who posts is anonymous. And we're not publishing the man's name because he's been through enough.

Shortly after the shooting, police announced that a woman named Marilou Danley was a person of interest. She had been living with the shooter in his Nevada home.

On a message board called /pol/ - Politically Incorrect, someone said her ex-husband was the shooter. His Facebook page indicated he was a liberal, and the far-right trolls on /pol/ went to work to spread the word.

Even after police identified the shooter, the wrong man's name appeared for hours in tweets. On Facebook, it appeared on an official "safety check" page for the Las Vegas shooting, which displayed a post from a site called Alt-Right News. And on Google, top searches linked to sites that said he was the shooter. When you searched his name, a 4chan thread about him was promoted as a top story.

So, why did parts of these hugely powerful companies continue to point to an innocent man?

Bill Hartzer, an expert on search, says Google is constantly crawling the Web and picking up new information as it appears. The innocent man went from hardly having anything online to having a whole bunch of stuff.

"Google has not had the time to really vet the search results yet," Hartzer says. "So what they'll do is they will show what they know about this particular name or this particular keyword."

In a statement, Google said the results should not have appeared, but the company will "continue to make algorithmic improvements to prevent this from happening in the future."

One improvement that Greg Sterling thinks Google should make is putting less weight on certain websites, like 4chan. "In this particular context, had they weighted sites that were deemed credible more heavily, you might not have seen that," says Sterling, a contributing editor at Search Engine Land. On the other hand, he says, "if news sites ... were given some sort of preference in this context you might not have seen that."

Unfortunately, it seemed like Facebook was giving those same sites credibility. In a statement, Facebook said it was working on a way to fix the issue that caused the fake news to appear. (Disclosure: Facebook pays NPR and other leading news organizations to produce live video streams that run on the site.)

But Sterling says part of the issue with having these companies determine what's news is that they're run by engineers. "For the most part the engineers and the people who are running Google search don't think like journalists," he says. "They think like engineers running a product that's very important."

And then there is the scale of what Google and Facebook do. They are huge. And that's only possible because computers do a lot of the work. Yochai Benkler, a law professor at Harvard, says that at such massive scale, even if there were humans helping out, there would be mistakes.

Benkler says that even if Facebook and Google blocked sites like 4chan, it wouldn't solve the problem. "Tomorrow in another situation like this someone will find some other workaround," Benkler says. "It's not realistic to imagine perfect filtering in real time in moments of such crisis."

But, for the man who spent hours being accused of mass murder, the technical problems at Google and Facebook probably aren't much comfort. And they won't be much comfort to the next person who lands in the crosshairs of fake news.

Copyright 2017 NPR. To see more, visit http://www.npr.org/.

KELLY MCEVERS, HOST:

In the hours after the massacre in Las Vegas, fake news about it started showing up on Google and Facebook. A man was falsely accused of being the shooter. His name bubbled up on a Facebook safety check site and at the top of Google search results. And all of that was automated. NPR's Laura Sydell reports that as these powerful tech companies continue to be a main destination for news, this problem is likely to happen again.

LAURA SYDELL, BYLINE: His name first appeared on a message board on a site called 4chan. 4chan is known as this gathering spot for underground hackers and the alt-right. Everyone who posts is anonymous. And we're not saying the man's name because he's been through enough. Shortly after the shooting, the police announced that a woman named Marilou Danley was a person of interest. She'd been living with the shooter in Nevada. On a message board called /pol/ - Politically Incorrect, someone said it was her ex-husband who was the shooter. His Facebook page indicated he was a liberal, and the far-right trolls on /pol/ went to work to spread the word.

Even after police identified the shooter, the wrong man's name appeared for hours in tweets. On Facebook, it appeared on an official safety check page for the Las Vegas shooting, which displayed a post from a site called Alt-Right News. And on Google, the top searches linked to places that said he was the shooter. When you searched his name, a 4chan thread about him was promoted as a top story. So why did parts of these hugely powerful companies continue to point to a totally innocent man?

Bill Hartzer is an expert on search. He says Google is constantly searching the web and picking up new information as it appears. The innocent man went from hardly having anything online to having a whole bunch of stuff.

BILL HARTZER: Google has not had the time to really vet the search results yet. So what they'll do is they will show what they know about this particular name or this particular keyword.

SYDELL: In a statement, Google said the results should not have appeared. And it will, quote, "continue to make algorithmic improvements to prevent this from happening in the future." One improvement that Greg Sterling thinks Google should make is putting less weight on certain websites, like 4chan. Sterling's a contributing editor at Search Engine Land.

GREG STERLING: In this particular context, had they weighted sites that were deemed credible more heavily, you might not have seen that. So if news sites, for example, were given some sort of preference in this context, you might not have seen that.

SYDELL: Unfortunately, it seemed like Facebook was giving these same sites credibility. In a statement, Facebook said it was working on a way to fix the issue that caused the fake news to appear. But Sterling thinks part of the issue with having these companies determine what's news is that they're run by engineers.

STERLING: For the most part, the engineers and the people who are running Google Search don't think like journalists. They think like engineers running a product that's very important.

SYDELL: And there is this scale of what Google and Facebook do. They're massive. Computers have to do a lot of the work. And with such huge scale, even if there were humans, there would be mistakes, says Yochai Benkler, a law professor at Harvard who studies online news. Benkler thinks if Facebook and Google were to block sites like 4chan, it would not solve the problem.

YOCHAI BENKLER: So tomorrow, in another situation like this, someone will find some other workaround. It's not realistic to imagine perfect filtering in real time in moments of such crisis.

SYDELL: But for the man who spent hours being accused of mass murder, the technical problems at Google and Facebook probably aren't much comfort. And they won't be much comfort for the next person who lands in the crosshairs of fake news. Laura Sydell, NPR News, San Francisco.

(SOUNDBITE OF THE BARR BROTHERS' SONG, "STATIC ORPHANS") Transcript provided by NPR, Copyright NPR.