Wednesday, September 27, 2017

Zuckerberg Blew Off Warnings of Russian Trolls in 2015


The 2016 presidential election wasn’t the first time Russian trolls used Facebook to mess with another country’s political system. And it wasn’t the first time Facebook CEO Mark Zuckerberg offered a weak defense of his company’s role in facilitating Russian online aggression.
Years before Russian trolls organized pro-Trump flash mobs, advertised fake news to tens of millions of Americans, and promoted anti-immigrant hate, they waged a relentless campaign to get Ukrainian activists suspended from the social network. And it worked, those activists say.
The anti-Ukrainian trolls lodged endless complaints with Facebook, claiming that their anti-Kremlin posts were really hate speech or porn. The social network would dutifully comply with the trolls’ requests.
“I’ve been blocked [from Facebook] because of a post about a rainbow. I put a picture of my city [with] a picture of [the] rainbow. The picture said, ‘Everything will be okay,’” one Ukrainian activist, Yaroslav Matiushyn, told The Daily Beast. “I was blocked for a month.
“One was a text post. It wasn’t erotic text—no porno, nothing erotic. They complained there is some nudity in it,” he said. Matiushyn was banned again after, he said, trolls bombarded Facebook with nudity reports.
Facebook’s inability to tackle Russia’s troll problem in Ukraine reached a fever pitch in 2014 and 2015, with several Ukrainians writing in to Zuckerberg’s May 2015 call for question submissions at a Facebook town hall. The top 20 questions worldwide concerned Russian trolls abusing the website’s report button to silence Ukrainian accounts; the top question received 45,000 likes. Even Ukrainian President Petro Poroshenko asked the company to open a Ukrainian office to deal with the problem. The request was rebuffed.
“The users whose opinions differ from those of the Kremlin are blocked, though they do not violate any community rules,” one user wrote on Zuckerberg’s call for questions.
Zuckerberg addressed the question at an employee town hall on May 15, 2015.
“The Ukrainian President, Petro Poroshenko, actually sent in a question, as well,” a Facebook employee said, as colleagues laughed.
Zuckerberg argued that a good number of the posts contained “hate speech” and anti-Russian “slurs.” He did not address the Russian trolls and their often spurious allegations, even though the trolls were referenced in most of the questions.
“We did the right thing according to our policies in taking down the posts. I support our policies in taking down hate speech,” he said.
Zuckerberg then apologized to some users whose posts “were accidentally [taken] down because the post contained nudity instead of hate speech.”
“That was a bug in the software we were running,” he said, laughing. “We’ll try not to make that mistake anymore.”
Individuals and groups who maliciously report posts that comply with site guidelines pose a unique problem for social networks, Emerson College communication studies professor Vincent Raynauld told The Daily Beast.
Often, such tactics are an attempt to “slow down the mobilization, and slow down the discourse as well,” said Raynauld, whose published work focuses mostly on microtargeting and social media impacts on elections.
Social networks use algorithms and track patterns to help them crack down on violations of their policies, as well as on faulty reports. But Raynauld said they are in a “perpetual beta” because the strategies used to evade detection constantly evolve.
“There is no way to ensure a free discourse on these platforms,” Raynauld said. “But there’s a way for Facebook to engage more with its user pool.”
In December 2016, over a year after facing public questions about Kremlin-sponsored trolling and abuse on his platform, Zuckerberg referred to fake news on Facebook as a “very small amount of content” and called it “a pretty crazy idea” that Kremlin-sponsored propaganda “influenced the (American) election in any way.”
“I do think there is a certain profound lack of empathy in asserting that the only reason someone could have voted the way they did is they saw some fake news,” he said. “If you believe that, then I don’t think you have internalized the message the Trump supporters are trying to send in this election.”
Nine months, and a torrent of public criticism, later, Facebook announced in September that it had identified several Kremlin operations disguised as American Facebook groups. After some initial resistance, company representatives shared some of their findings with congressional intelligence committees and with special counsel Robert Mueller. (North Carolina Republican Sen. Richard Burr, who heads the Senate intelligence committee, said on Monday that his panel has not received a complete report from Facebook, however.)
As The Daily Beast reported, the Facebook pages, with names like “Secured Borders” and “Being Patriotic,” created real-life pro-Trump rallies on American soil, established fake “voter fraud hotlines” on Election Day, and wrote posts with divisive, racist, and anti-immigrant sentiments.
Facebook’s less-than-urgent responses to these examples of an autocratic foreign power interfering in the domestic affairs of a democratic country shouldn’t be altogether surprising, however. The company has a track record of acquiescence to the demands of authoritarian regimes. Rohingya activists told The Daily Beast that the social network had been taking down their posts about the intensifying ethnic cleansing in Burma. In 2016, Facebook also developed a censorship tool to appease the Chinese government and allow the company to suppress posts on a regional basis, The New York Times reported.
The company likewise bowed to Turkey’s demands to censor images of the prophet Muhammad in order to retain access to that market, just weeks after the Charlie Hebdo attacks.
And in Russia itself, the social network blocked a page in support of Alexei Navalny, Putin’s most vocal challenger. When Facebook debuted its gay pride reaction button, users in Russia and other countries with anti-gay laws were unable to access it.
Navalny later co-signed a petition demanding Facebook change its approach to an “army of shills on state payroll” blocking Ukrainian posts on the site. The petition racked up almost 10,000 signatures.
It’s a sea change for the social network once hailed as a key enabler of the Arab Spring.
On Tuesday, Russian authorities threatened to block access to Facebook in the country unless it begins storing the data of Russian citizens on Russian servers. Russia has already banned LinkedIn for not complying with the same data storage requirements.
The Ukrainians who felt silenced by Facebook’s moderation system at the height of tensions with Russia are still grasping for answers about what happened to their messages—and who was to blame for their disappearance.
“We’d heard around that the people looking into the complaints on most of the Slavic languages was Russian,” Michael Baskin told The Daily Beast.
Baskin was one of the first to file a high-profile public complaint to Facebook about the Russian troll campaign against Ukrainian accounts.
“Such obvious mismanagement encourages further informational war between Ukrainian and Russian people and benefits the state that has been recognized worldwide as the aggressor,” he wrote in a post addressed to Zuckerberg that was liked over 3,000 times.
Baskin, a Ukrainian living in Dublin, said he had asked Facebook employees and their friends, who relayed the theory of incidentally pro-Russian comment moderation. He said he didn’t believe it “was only one lady’s decision,” but “probably someone up in management.” Facebook’s European headquarters are also located in Dublin.
In his 2015 Facebook town hall, Zuckerberg called Baskin’s theory a “meme that was floating around that the content moderation was done out of a Russian office by Russians who were anti-Ukrainian” and said that it’s “not true.”
Even so, Baskin said, he still wants answers as to why Ukrainian activists had their accounts repeatedly shuttered while Kremlin propaganda abounded, both in Ukraine in 2015 and in the U.S. in 2016.
“These trolls were fighting for the brains and thoughts of others on Facebook. It was in very serious, extreme scale,” he said.
But, according to a Facebook representative, it wasn’t enough to prompt a change in policy.
“We removed a lot of content for violating our standards on hate speech, including the use of racial slurs. The content we removed was done so consistent with our Community Standards,” a Facebook spokesperson told The Daily Beast in a statement. “Anyone can report content to us if they think it violates our standards. One report is enough to take down content if it violates our policies but multiple reports will not lead to the removal of content if it meets our standards. That is true today as it was then.”
