Facebook Learned Nothing

After facing criticism from users, politicians, bureaucrats, and shareholders, Facebook CEO Mark Zuckerberg announced his personal 2018 New Year's resolution: to "fix Facebook." In previous years, Zuckerberg has publicly announced his resolutions, and his follow-through. He's visited every U.S. state, built an artificial intelligence for his home, run 365 miles, and learned Mandarin. So I had some hope that he'd be able to complete this one - after all, as CEO, "fixing Facebook" should be a core part of his job.

About a week and a half into 2018, Zuckerberg announced the first change: reducing the priority of public content in our news feeds. The goal was to cut branded posts from 5% to 4% of the content - which doesn't sound like a big difference, mostly because it isn't. The problem posts on Facebook weren't advertisements for Audible and fabric softener.

Yesterday, January 19th, he announced the second big change: "to make sure the news you see, while less overall, is high quality."


Academics, journalists, and armchair pundits are all still trying to make sense of what happened to "the news" over the past few years, and there are a lot of different schools of thought. Some explanations are simple (more literacy, more written stupidity), and others are incredibly complex (we're living out the result of a 1960s Soviet intelligence operation). But they all rely on the same premise:

People are truly terrible at figuring out the truth.

And we are. We don't know what we don't know, we overestimate how much we know about the things we know a little about, and we're almost unable to follow a logical argument if we have even the slightest emotional connection to the subject.

However, Zuckerberg has faith that people can do it this time… with the help of surveys. From his announcement:

We decided that having the community determine which sources are broadly trusted would be most objective. Here’s how this will work. As part of our ongoing quality surveys, we will now ask people whether they’re familiar with a news source and, if so, whether they trust that source. The idea is that some news organizations are only trusted by their readers or watchers, and others are broadly trusted across society even by those who don’t follow them directly.

Now, there are some problems off the bat with this sort of data collection: self-selection bias, observer effects, and so on. But there's also a more specific problem: Gallup (and others) already did a study on this, and the results weren't good. Here are some of their key findings:

And so on. My point here is not to pick on Republicans, but to highlight that different political demographics not only have vastly different definitions of things like objectivity and truth, but also disagree on whether media bias is even important.

Facebook won't be able to resolve these differences through user surveys any more than it could with algorithms. I actually don't think Facebook can do anything about it - the solutions need to come from news outlets.


There's optimism to be found in that Gallup study. Americans are getting more and more critical of existing news sources while simultaneously consuming more news. Basic economics suggests that with this increased demand for good news sources, we should see a supply rise to match it.

But getting there will take some time - and will take each of us doing our part to be good consumers of media… like maybe not calling Fox News an objective news source, or dismissing anything that disagrees with our politics as fake.