How Russian Propaganda Spreads On Social Media

Oct 29, 2017
Originally published on October 30, 2017 10:35 am

Facebook, Google and Twitter head to Washington this week for their first public congressional hearings on Russian interference in the 2016 presidential campaign via their social networks. In the run-up, NPR is exploring the growing social media landscape, the spread of false information and the tech companies that build the platforms in our series: Tech Titans, Bots and the Information Complex.

Earlier this year, a Facebook group page called Blacktivist caught the eye of M'tep Blount.

As a supporter of Black Lives Matter, Blount figured Blacktivist would be a similar group. The Facebook page came with a message: "You have a friend who is a part of this group," and it had a huge following — over 400,000 as of late August.

Blount found that Blacktivist's page shared information about police brutality. Videos often showed police beating African-Americans in small towns. "It was like, 'Wow! This is happening in this community too. I really hope they do something about it but they probably aren't going to,' " she says.

As it turns out, the Blacktivist page was not like Black Lives Matter, at all. It appears to have been linked to Russia, and Facebook has since taken it down. The group was carefully crafted to attract people like Blount whose behavior on Facebook showed they mistrusted police and were concerned about civil rights.

It was just one of the many calculated ways in which social media platforms have been used lately to covertly sow divisions within society. Later this week, Facebook, Google and Twitter will face members of Congress to answer questions in three public hearings about their role in enabling Russian interference in the 2016 U.S. presidential election. The hearings are also expected to shed light on how Russian propaganda has spread in the U.S. through these major social media platforms.

Jeff Hancock, a psychologist who heads Stanford University's Social Media Lab, says that propaganda via a page like Blacktivist was not aimed at changing Blount's mind. It was actually meant to trigger strong feelings.

"Propaganda can actually have a real effect," he says. "Even though we might already believe what we're hearing, this can heighten our arousal, our emotions."

Hancock has studied the ways people are affected by seeing information that confirms some of their beliefs. In his study, he asked people how they felt about an issue before showing them stories. For example, those who thought Hillary Clinton was corrupt were shown stories confirming it. If people were worried about police brutality, he showed them posts of police brutalizing civilians.

"When we have more confirmation that a possible risk is there, whether it's real or not, we perceive it as more risky," Hancock says. So, in Blount's case, if she was already worried about police brutality, then the more times she is exposed to those images the stronger she will feel about it, he says.

This kind of propaganda, he says, is designed to enhance divisions among people and increase "the anger within each other. It's really truly just a simple divide-and-conquer approach."

It's an approach that Russia has frequently used around the world, says Michael McFaul, a former U.S. ambassador to Russia. "They think that that leads to polarization, (which) leads to arguments among ourselves and it takes us off the world stage," he says.

Another potent example is the Twitter account @TEN_GOP, which had more than 100,000 followers. It called itself the unofficial account of the Tennessee Republican Party.

But it was purportedly set up by Russians. The account has since been shut down, but for months it sent out a stream of fake news, such as a tweet falsely stating that there was voter fraud in Florida. That sort of news got plenty of amplification. Though there is no evidence that President Trump or any of his supporters knew of the Russia link, the account was often retweeted by his aide Kellyanne Conway and the president's son, Donald Trump Jr., and Donald Trump himself thanked the account for its support.

Clint Watts, a fellow at the Foreign Policy Research Institute who has been investigating Russian use of social media, said it showed the power of just one Twitter account and its ability to "actually influence the discussion and be cited in the debate."

Watts says this kind of propaganda is simply how media manipulation works in the digital age, whether the source is the Russians, the North Koreans or a fake news site.

Facebook has already handed over to Congress details of 3,000 Russian-bought ads worth $100,000. The company has promised more transparency about who is behind advertising campaigns. Twitter says it will no longer take ad money from two Russian media outlets, RT and Sputnik. Despite efforts by Facebook, Twitter and Google to take action on their own, Democratic lawmakers are pushing legislation that would require Internet platforms to disclose more information about political ads.

McFaul, the former ambassador, believes the companies can do more. "They're not obligated to post a story that they know to be false," he says. "They already regulate free speech and advertisement. You can't advertise guns, for instance, on Facebook."

And there is still a lot that isn't known about the use of digital platforms to spread fake news and propaganda. But Americans may have a chance to learn more when Twitter, Facebook and Google sit down to answer questions in front of Congress this week.

Copyright 2017 NPR. To see more, visit http://www.npr.org/.

LULU GARCIA-NAVARRO, HOST:

Silicon Valley goes to Washington this week. Officials from Facebook, Google and Twitter appear before Congress to talk about their social platforms and Russian interference in the 2016 presidential election. NPR will be bringing you stories on this all week. This morning, to kick things off, NPR's Laura Sydell reports on social media, linked to Russia, that was created to cause divisions in the United States.

LAURA SYDELL, BYLINE: The first thing to know about Russian propaganda is that it doesn't say it's Russian propaganda. It might simply be a social media post, a tweet or a Facebook page about a topic that you find interesting. M'tep Blount is a supporter of Black Lives Matter. One day, she saw a group page that might have been affiliated with the movement. It was called Blacktivist.

M'TEP BLOUNT: It was on my news feed. It was, oh, you have a friend who's a part of this group. And I was like, all right. I'll look into it. It definitely had a big following...

SYDELL: ...As in over 400,000 followers in late August. The Blacktivist page was sharing information about police brutality. And videos often appeared on the page of police beating African-Americans in small towns.

BLOUNT: It was like, wow. This is happening in this community, too. I really hope they do something about it. But they probably aren't going to do.

SYDELL: As it turns out, the Blacktivist page was linked to Russia. And Facebook took it down. It doesn't seem as if the Blacktivist group was trying to change Blount's mind about anything. And it was carefully crafted to attract people like Blount whose behavior on Facebook made it clear that they mistrusted police and were concerned about civil rights.

JEFF HANCOCK: Propaganda can actually have a real effect. Even though we might already believe what we're hearing, this can heighten our arousal or our emotions.

SYDELL: Jeff Hancock is a psychologist who heads the Stanford University Social Media Lab. Hancock has studied the ways people are affected by seeing information that confirms their beliefs. In his study, he asked people how they felt about an issue before showing them stories. So he says if someone thought Hillary Clinton was corrupt, he showed posts confirming it. If people were worried about police brutality, he showed them posts of police brutalizing civilians.

HANCOCK: If I'm worried about police brutality then, you know, the more times I'm exposed to that, the stronger it makes me feel about it.

SYDELL: Hancock says this kind of propaganda is designed to enhance divisions...

HANCOCK: ...And by doing that, reduce the will to vote. The anger within each other - it really truly is just a simple divide-and-conquer approach.

SYDELL: It's an approach that Russia has frequently used around the world, says former Russian Ambassador Michael McFaul.

MICHAEL MCFAUL: They think that that leads to polarization, that leads to arguments among ourselves. And it takes us off the world stage.

SYDELL: The Russian campaign spread across all forms of social media. Take a Twitter account like @TEN_GOP, which had more than 100,000 followers. It called itself the unofficial account of the Tennessee Republican Party. But it wasn't. It was reportedly set up from Russia. The account, which has been shut down, sent out a stream of fake news such as a tweet falsely stating that there was voter fraud in Florida. The fake news got plenty of amplification. There's no evidence that President Trump or his supporters knew about the accounts linked to Russia. Still, it was retweeted by Trump spokesperson Kellyanne Conway, Donald Trump Jr. And then Trump himself thanked the account for its support. Clint Watts, a fellow at the Foreign Policy Research Institute, has been investigating Russian use of social media.

CLINT WATTS: It has been retweeted, cited many, many times by people in the Trump campaign or Republican operatives and even in the mainstream media. And so that shows how just one account with just a lot of effort can actually influence the discussion and be cited in the debate.

SYDELL: Watts says this kind of media propaganda campaign is not exclusive to the Russians. This is simply how it works in the digital age. After every major news event, the social-media sphere starts filling up with conspiracy theories and fake news. Under pressure from Congress, Facebook has handed over a $100,000 ad campaign with 3,000 ads by Russians to Congress. It's promised more transparency about who is behind advertising campaigns. Twitter says it will no longer take ad money from two Russian media outlets. And there's still a lot we don't know about the use of digital platforms. But we may have a chance to learn more when Twitter, Facebook and Google sit down to answer questions in front of Congress later this week. Laura Sydell, NPR News.

(SOUNDBITE OF MUSIC)

Transcript provided by NPR, Copyright NPR.