Facebook Fires Humans, Hires Robots To Tell You What’s Hot Today


If you’ve ever looked at the Trending Topics in the top right of your Facebook newsfeed, only to see chatter about some video game character right next to news about a massive natural disaster, you’ve probably thought, “Who on earth is deciding what shows up here?” Well, now it’s a what, not a who, and it… might still need some refining.

As Quartz reports, Facebook suddenly ousted its human Trending team on Friday and replaced them with an algorithm. The robots will decide what is trending, and they will tell you.

As recently as last week, the Trending module was curated by a team of about 15 to 20 people. Here’s how it worked, according to Facebook:

An algorithm determined what was popular enough to bubble up to the top, and then real people looked at the topics the system surfaced to determine whether each was actually tied to a real event (as opposed to something that will always trend, like “lunch”). Those people would write a quick, snappy summary, categorize the news (“sports” vs. “entertainment”), mark it as high priority if the ten biggest media outlets were covering it, and then move on.
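For the curious, here’s a rough sketch of how that two-step workflow might look in code. Everything here (the names, the fields, the priority rule) is invented purely for illustration; Facebook hasn’t published how its actual system works.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TrendingTopic:
    name: str                       # e.g. "Hurricane Update" or "lunch"
    mention_count: int              # how often the topic shows up in posts
    summary: Optional[str] = None   # snappy blurb written by a human curator
    category: Optional[str] = None  # "sports", "entertainment", etc.
    high_priority: bool = False     # flagged if the biggest outlets are covering it

def surface_candidates(topics: list[TrendingTopic], threshold: int) -> list[TrendingTopic]:
    """Step 1 (the algorithm): bubble up anything popular enough to consider."""
    return [t for t in topics if t.mention_count >= threshold]

def curate(topic: TrendingTopic, tied_to_real_event: bool, summary: str,
           category: str, covered_by_top_outlets: bool) -> Optional[TrendingTopic]:
    """Step 2 (the old human team): confirm it's tied to a real event,
    write the copy, categorize it, and mark priority based on coverage."""
    if not tied_to_real_event:      # drops perennial chatter like "lunch"
        return None
    topic.summary = summary
    topic.category = category
    topic.high_priority = covered_by_top_outlets
    return topic
```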

A second algorithm would then determine which of those topics you actually saw. Something important enough, like an earthquake, might show up for nearly everyone; news about a particular movie star, video game, or athlete might only show up for people whose networks indicated a high level of interest in that subject.
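In code terms, that second pass might look something like the sketch below. The scores and cutoffs are assumptions made up for illustration; Facebook has never published how this ranking actually decides.

```python
def should_show(global_importance: float, user_interest_score: float) -> bool:
    """Decide whether a curated topic appears in a given user's Trending module.

    global_importance:   how universally newsworthy the story is
                         (an earthquake would score near 1.0)
    user_interest_score: how strongly the user's network engages with the topic
                         (a niche video game scores high only for fans)

    Both scores and the 0.8 / 0.5 cutoffs are illustrative assumptions.
    """
    if global_importance >= 0.8:        # big news shows up for nearly everyone
        return True
    return user_interest_score >= 0.5   # niche news only for interested networks

# An earthquake reaches a user with no particular interest in geology:
print(should_show(global_importance=0.95, user_interest_score=0.1))  # True
# ...while niche gaming news only reaches users whose networks care about it:
print(should_show(global_importance=0.3, user_interest_score=0.2))   # False
```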

But this past May, Facebook found itself at the center of controversy when some politically aligned news outlets accused the social network of deliberately manipulating Trending Topics to suppress certain political stories. Even some members of Congress started asking pointed questions, though Facebook vigorously denied doing any such thing.

Facebook executives, including the guy in charge of Trending as well as Zuck himself, took to the virtual airwaves to defend the way Facebook works, but the damage to the company’s image was already done.

Which brings us to today: instead of a headline and a summary, your Trending module is a lot terser, and it looks something like this:

[Screenshot of the new, terser Trending module]

There are still humans involved, Facebook says, but their role is now very different: they mostly keep an eye on the software and correct any mistakes it makes, rather than writing any copy of their own.

As Quartz points out, though, removing humans from a system does not actually remove human bias from that system. Algorithms are coded by people, and while machines do learn, they learn from the inputs humans give them, and those inputs almost always carry some kind of bias or other.

If an AI learns from a team of people, it will learn to adopt whatever biases that team displays, not to be bias-free. If Facebook wants the machines to run the system for efficiency’s sake, that’s one thing… but the company isn’t going to rise magically above accusations of bias just because the people are gone.
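A toy example makes the point: the “model” below does nothing but memorize the decisions of a hypothetical curation team, and whatever skew those decisions contain comes out the other side intact. The data, sources, and names are entirely made up.

```python
from collections import Counter

# Hypothetical training labels: (source of a story, whether the old human team promoted it).
# If the humans' decisions skew toward one source, anything trained on those
# decisions inherits exactly the same skew.
human_labels = [
    ("outlet_a", True), ("outlet_a", True), ("outlet_a", True),
    ("outlet_b", True), ("outlet_b", False), ("outlet_b", False),
]

def learn_promotion_rates(labels):
    """'Train' by memorizing how often the humans promoted each source."""
    promoted, total = Counter(), Counter()
    for source, was_promoted in labels:
        total[source] += 1
        promoted[source] += was_promoted
    return {source: promoted[source] / total[source] for source in total}

model = learn_promotion_rates(human_labels)
print(model)  # {'outlet_a': 1.0, 'outlet_b': 0.333...}  <- the humans' skew survives training
```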

Facebook is trying to get rid of bias in Trending news by getting rid of humans [Quartz]

Want more consumer news? Visit our parent organization, Consumer Reports, for the latest on scams, recalls, and other consumer issues.