Former Facebook Workers: We Routinely Suppressed Conservative News
Facebook workers routinely suppressed news stories of interest to conservative readers from the social network’s influential “trending” news section, according to a former journalist who worked on the project. This individual says that workers prevented stories about the right-wing CPAC gathering, Mitt Romney, Rand Paul, and other conservative topics from appearing in the highly influential section, even though they were organically trending among the site’s users.
Several former Facebook “news curators,” as they were known internally, also told Gizmodo that they were instructed to artificially “inject” selected stories into the trending news module, even if they weren’t popular enough to warrant inclusion—or in some cases weren’t trending at all. The former curators, all of whom worked as contractors, also said they were directed not to include news about Facebook itself in the trending module.
In other words, Facebook’s news section operates like a traditional newsroom, reflecting the biases of its workers and the institutional imperatives of the corporation. Imposing human editorial values onto the lists of topics an algorithm spits out is by no means a bad thing—but it is in stark contrast to the company’s claims that the trending module simply lists “topics that have recently become popular on Facebook.”
These new allegations emerged after Gizmodo last week revealed details about the inner workings of Facebook’s trending news team—a small group of young journalists, primarily educated at Ivy League or private East Coast universities, who curate the “trending” module on the upper-right-hand corner of the site. As we reported last week, curators have access to a ranked list of trending topics surfaced by Facebook’s algorithm, which prioritizes the stories that should be shown to Facebook users in the trending section. The curators write headlines and summaries of each topic, and include links to news sites. The section, which launched in 2014, constitutes some of the most powerful real estate on the internet and helps dictate what news Facebook’s users—167 million in the US alone—are reading at any given moment.
“Depending on who was on shift, things would be blacklisted or trending,” said the former curator. This individual asked to remain anonymous, citing fear of retribution from the company. The former curator is politically conservative, one of a very small handful of curators with such views on the trending team. “I’d come on shift and I’d discover that CPAC or Mitt Romney or Glenn Beck or popular conservative topics wouldn’t be trending because either the curator didn’t recognize the news topic or it was like they had a bias against Ted Cruz.”
The former curator was so troubled by the omissions that they kept a running log of them at the time; this individual provided the notes to Gizmodo. Among the deep-sixed or suppressed topics on the list: former IRS official Lois Lerner, who was accused by Republicans of inappropriately scrutinizing conservative groups; Wisconsin Gov. Scott Walker; popular conservative news aggregator the Drudge Report; Chris Kyle, the former Navy SEAL who was murdered in 2013; and former Fox News contributor Steven Crowder. “I believe it had a chilling effect on conservative news,” the former curator said.
Another former curator agreed that the operation had an aversion to right-wing news sources. “It was absolutely bias. We were doing it subjectively. It just depends on who the curator is and what time of day it is,” said the former curator. “Every once in awhile a Red State or conservative news source would have a story. But we would have to go and find the same story from a more neutral outlet that wasn’t as biased.”
Stories covered by conservative outlets (like Breitbart, Washington Examiner, and Newsmax) that were trending enough to be picked up by Facebook’s algorithm were excluded unless mainstream sites like the New York Times, the BBC, and CNN covered the same stories.
Other former curators interviewed by Gizmodo denied consciously suppressing conservative news, and we were unable to determine if left-wing news topics or sources were similarly suppressed. The conservative curator described the omissions as a function of their colleagues’ judgments; there is no evidence that Facebook management mandated or was even aware of any political bias at work.
Managers on the trending news team did, however, explicitly instruct curators to artificially manipulate the trending module in a different way: When users weren’t reading stories that management viewed as important, several former workers said, curators were told to put them in the trending news feed anyway. Several former curators described using something called an “injection tool” to push topics into the trending module that weren’t organically being shared or discussed enough to warrant inclusion—putting the headlines in front of thousands of readers rather than allowing stories to surface on their own. In some cases, after a topic was injected, it actually became the number one trending news topic on Facebook.
“We were told that if we saw something, a news story that was on the front page of these ten sites, like CNN, the New York Times, and BBC, then we could inject the topic,” said one former curator. “If it looked like it had enough news sites covering the story, we could inject it—even if it wasn’t naturally trending.” Sometimes, breaking news would be injected because it wasn’t attaining critical mass on Facebook quickly enough to be deemed “trending” by the algorithm. Former curators cited the disappearance of Malaysia Airlines flight MH370 and the Charlie Hebdo attacks in Paris as two instances in which non-trending stories were forced into the module. Facebook has struggled to compete with Twitter when it comes to delivering real-time news to users; the injection tool may have been designed to artificially correct for that deficiency in the network. “We would get yelled at if it was all over Twitter and not on Facebook,” one former curator said.
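The rule described above amounts to a simple coverage check: a non-trending topic could be injected once enough major front pages carried the story. As a purely illustrative sketch of that decision rule (the outlet list, threshold, and function name below are hypothetical stand-ins, not Facebook’s actual tooling), it might look something like this in Python:

    # Illustrative sketch only: outlets, threshold, and names are hypothetical,
    # modeled on the rule the former curators describe, not on Facebook's code.
    TRUSTED_OUTLETS = {"cnn.com", "nytimes.com", "bbc.com"}  # curators cite roughly ten such front pages
    MIN_OUTLETS_COVERING = 3  # stand-in for "enough news sites covering the story"

    def may_inject(covering_outlets):
        """Return True if a non-trending topic could be injected under the described rule."""
        return len(set(covering_outlets) & TRUSTED_OUTLETS) >= MIN_OUTLETS_COVERING

    # Example: a breaking story already on the front pages of CNN, the BBC, and the New York Times
    print(may_inject(["cnn.com", "bbc.com", "nytimes.com"]))  # True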
In other instances, curators would inject a story—even if it wasn’t being widely discussed on Facebook—because it was deemed important for making the network look like a place where people talked about hard news. “People stopped caring about Syria,” one former curator said. “[And] if it wasn’t trending on Facebook, it would make Facebook look bad.” That same curator said the Black Lives Matter movement was also injected into Facebook’s trending news module. “Facebook got a lot of pressure about not having a trending topic for Black Lives Matter,” the individual said. “They realized it was a problem, and they boosted it in the ordering. They gave it preference over other topics. When we injected it, everyone started saying, ‘Yeah, now I’m seeing it as number one’.” This particular injection is especially noteworthy because the #BlackLivesMatter movement originated on Facebook, and the ensuing media coverage of the movement often noted its powerful social media presence.
(In February, CEO Mark Zuckerberg expressed his support for the movement in an internal memo chastising Facebook employees for defacing Black Lives Matter slogans on the company’s internal “signature wall.”)
When stories about Facebook itself would trend organically on the network, news curators had even less discretion—they were told not to include these stories at all. “When it was a story about the company, we were told not to touch it,” said one former curator. “It had to be cleared through several channels, even if it was being shared quite a bit. We were told that we should not be putting it on the trending tool.”
(The curators interviewed for this story worked for Facebook across a timespan ranging from mid-2014 to December 2015.)
“We were always cautious about covering Facebook,” said another former curator. “We would always wait to get second level approval before trending something to Facebook. Usually we had the authority to trend anything on our own [but] if it was something involving Facebook, the copy editor would call their manager, and that manager might even call their manager before approving a topic involving Facebook.”
Gizmodo reached out to Facebook for comment about each of these specific claims via email and phone, but did not receive a response.
Several former curators said that as the trending news algorithm improved, there were fewer instances of stories being injected. They also said that the trending news process was constantly being changed, so there’s no way to know exactly how the module is run now. But the revelations undermine any presumption of Facebook as a neutral pipeline for news, or of the trending news module as an algorithmically driven list of what people are actually talking about.
Rather, Facebook’s efforts to play the news game reveal the company to be much like the news outlets it is rapidly driving toward irrelevancy: a select group of professionals with vaguely center-left sensibilities. It just happens to be one that poses as a neutral reflection of the vox populi, has the power to influence what billions of users see, and openly discusses whether it should use that power to influence presidential elections.
“It wasn’t trending news at all,” said the former curator who logged conservative news omissions. “It was an opinion.”
[Disclosure: Facebook has launched a program that pays publishers, including the New York Times and Buzzfeed, to produce videos for its Facebook Live tool. Gawker Media, Gizmodo’s parent company, recently joined that program.]
Update: Several hours after this report was published, Gizmodo editors started seeing it as a topic in Facebook’s trending section. Gizmodo’s video was posted under the topic, but the “Top Posts” were links to RedState.com and the Faith and Freedom Coalition.
Update 4:10 p.m. EST: A Facebook spokesperson has issued the following statement to outlets including BuzzFeed and TechCrunch. Facebook has not responded to Gizmodo’s repeated requests for comment.
“We take allegations of bias very seriously. Facebook is a platform for people and perspectives from across the political spectrum. Trending Topics shows you the popular topics and hashtags that are being talked about on Facebook. There are rigorous guidelines in place for the review team to ensure consistency and neutrality. These guidelines do not permit the suppression of political perspectives. Nor do they permit the prioritization of one viewpoint over another or one news outlet over another. These guidelines do not prohibit any news outlet from appearing in Trending Topics.”
Update May 10, 8:50 a.m. EST: The following statement was posted late last night by Tom Stocky, Facebook’s Vice President of Search. It was liked by both Mark Zuckerberg and Sheryl Sandberg:
My team is responsible for Trending Topics, and I want to address today’s reports alleging that Facebook contractors manipulated Trending Topics to suppress stories of interest to conservatives. We take these reports extremely seriously, and have found no evidence that the anonymous allegations are true.

Facebook is a platform for people and perspectives from across the political spectrum. There are rigorous guidelines in place for the review team to ensure consistency and neutrality. These guidelines do not permit the suppression of political perspectives. Nor do they permit the prioritization of one viewpoint over another or one news outlet over another. These guidelines do not prohibit any news outlet from appearing in Trending Topics.

Trending Topics is designed to showcase the current conversation happening on Facebook. Popular topics are first surfaced by an algorithm, then audited by review team members to confirm that the topics are in fact trending news in the real world and not, for example, similar-sounding topics or misnomers.

We are proud that, in 2015, the US election was the most talked-about subject on Facebook, and we want to encourage that robust political discussion from all sides. We have in place strict guidelines for our trending topic reviewers as they audit topics surfaced algorithmically: reviewers are required to accept topics that reflect real world events, and are instructed to disregard junk or duplicate topics, hoaxes, or subjects with insufficient sources. Facebook does not allow or advise our reviewers to systematically discriminate against sources of any ideological origin and we’ve designed our tools to make that technically not feasible. At the same time, our reviewers’ actions are logged and reviewed, and violating our guidelines is a fireable offense.

There have been other anonymous allegations — for instance that we artificially forced #BlackLivesMatter to trend. We looked into that charge and found that it is untrue. We do not insert stories artificially into trending topics, and do not instruct our reviewers to do so. Our guidelines do permit reviewers to take steps to make topics more coherent, such as combining related topics into a single event (such as #starwars and #maythefourthbewithyou), to deliver a more integrated experience.

Our review guidelines for Trending Topics are under constant review, and we will continue to look for improvements. We will also keep looking into any questions about Trending Topics to ensure that people are matched with the stories that are predicted to be the most interesting to them, and to be sure that our methods are as neutral and effective as possible.