That I’m linking to this, instead of the actual post, has everything to do with Facebook’s relentless assault on web standards. The idea that Facebook doesn’t know what it’s doing here is far-fetched at best. The bubble effect is not some esoteric theory; it’s common knowledge that Facebook is a confirmation bias machine. This addresses the problem by doubling down on the effect. Nobody should be surprised.
From the horse’s mouth:
We could try to make that decision ourselves, but that's not something we're comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask you – the community – and have your feedback determine the ranking. We decided that having the community determine which sources are broadly trusted would be most objective.
You could read this as a cop-out, but that would be too kind. Copping out in this instance is conveniently intentional. Gizmodo asks the obvious misanthropic question this move invites:
If people cannot tell truth from garbage, why are those same people being used to rank publications on a scale of trustworthiness?
Other supposedly sober outlets seem to be missing the point here. From Slate:
At first blush, it looks like Facebook is doing exactly what I and other critics have long been calling for it to do: acknowledge that its algorithm plays a crucial role in determining what news people read, and take some responsibility for its profound effects on the media and the spread of information. It’s about time, right?
Except that, based on its announcement, Facebook’s approach to a notoriously difficult problem—figuring out which media to trust—appears to be painfully simplistic and naïve.
I think it’s naïve to believe they are being naïve. This approach gives the appearance of doing something, and achieves exactly what they want. Welcome to hyperreality.