So, is US Senator John Thune justified in calling for a review of how Facebook does business? Well, there’s an interesting report out today, complete with leaked papers from Facebook itself:
The documents, given to the Guardian, come amid growing concerns over how
Facebook decides what is news for its users. This week the company was accused of an editorial bias against conservative news organizations, prompting calls for a congressional inquiry from the US Senate commerce committee chair, John Thune.
The boilerplate about its news operations provided to customers by the company suggests that much of its news gathering is determined by machines: “The topics you see are based on a number of factors including engagement, timeliness, Pages you’ve liked and your location,” says a page devoted to the question “How does Facebook determine what topics are trending?”
But the documents show that the company relies heavily on the intervention of a small editorial team to determine what makes its “trending module” headlines – the list of news topics that shows up on the side of the browser window on Facebook’s desktop version. The company backed away from a pure-algorithm approach in 2014 after criticism that it had not included enough coverage of unrest in Ferguson, Missouri, in users’ feeds.
The guidelines show human intervention – and therefore editorial decisions – at almost every stage of Facebook’s trending news operation, which at one time was run by a team of as few as 12 people:
A team of news editors working in shifts around the clock was instructed on how to “inject” stories into the trending topics module, and how to “blacklist” topics for removal for up to a day over reasons including “doesn’t represent a real-world event”, left to the discretion of the editors.