Facebook IS the Trending Topic

It started out as an exposé on algorithms. Michael Nunez, himself a “reporter, editor and tv personality,” wrote a scathing blog post about Facebook’s “grueling work conditions [and] humiliating treatment” of the group of news curators behind Facebook’s trending news.

The original article was a combination of Silicon-Valley-sweatshop horror and machines-are-coming-for-your-job dystopia. But what really caught people’s attention were allegations that news with a conservative viewpoint was being suppressed.

To be clear, Facebook insists that wasn’t the case, and they have always admitted that humans are part of the trending topics process. Technically, they’re not even calling it news, just acknowledging that the topic is trending.

None of that stopped the GOP from launching an inquiry and demanding that Facebook explain itself. Which Facebook did, along with a promise to revise the process to ensure no bias could creep in.

Why does all of this matter? Two reasons. First, it shows exactly how much influence Facebook has over what people consume online, and second, it shows how little transparency there is around the process.

The Pew Research Center found that 40 percent of American adults get news from Facebook, a number that has been rising steadily over the past two years. Plus, Facebook Live and Instant Articles are designed to bring more and more content into the site.

With all this content, Facebook is using an algorithm to decide what gets shown to its users. It’s generally based on a user’s history and preferences, but companies have long been able to boost their visibility with paid reach. These accusations make it hard not to wonder what other factors could be at play.

It’s not that users are surprised to know that someone is filtering their news—after all, we’ve been relying on reporters, editors, anchors and other journalists to help us do that for centuries. News organizations have been looking after the bottom line for just as long. With human filters, it’s fairly easy to understand the process and their unavoidable biases.

With an algorithm, even one taught by humans, it gets a lot murkier. Obviously, that doesn’t mean we should abandon Facebook or other social networks. It’s just a good reminder that there are powerful forces behind the scenes shaping what we see online.

About the author:
Tara Saylor is a communications manager by day, grad student by night and curious all the time. She is also a web nerd and recovering copywriter. Tara focuses on the channels that enable communication and on using metrics to improve communication effectiveness. She tweets about communication as @AnokheeTara.