The problem is this: Facebook has become a feedback loop which can and does, despite its best intentions, become a vicious spiral. At Facebook's scale, behavioral targeting doesn't just reflect our behavior, it actually influences it. Over time, a service which was supposed to connect humanity is actually partitioning us into fractal disconnected bubbles.
The way Facebook's News Feed works is that the more you engage with posts from a particular user, the more often their posts are shown to you. The more you engage with a particular kind of post, the more you will see its ilk. So far so good! It's just showing you what you've demonstrated you're interested in. What's wrong with that?
The answer is twofold. First, this eventually constructs a small in-group cluster of Facebook friends and topics that dominate your feed; and as you grow accustomed to interacting with them, this causes your behavior to change, and you interact with them even more, reinforcing their in-group status and (relatively) isolating you from the rest of your friends, the out-group.
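That rich-get-richer dynamic can be seen in a toy model. The sketch below is not Facebook's actual ranking system — the weighting scheme, friend names, and numbers are all invented for illustration. Each friend starts with equal visibility; each time the feed shows you a friend's post, that counts as engagement and bumps their weight, making them more likely to be shown next time:

```python
import random

def simulate_feed(friends, rounds, seed=0):
    """Toy engagement loop (a Polya urn, not Facebook's real algorithm).

    Each friend starts with one 'engagement point'. Each round the feed
    picks a friend with probability proportional to accumulated points;
    being shown counts as engagement, which raises that friend's weight,
    so early random luck compounds into a dominant in-group.
    """
    rng = random.Random(seed)
    weights = {f: 1 for f in friends}
    for _ in range(rounds):
        total = sum(weights.values())
        r = rng.uniform(0, total)
        for f, w in weights.items():
            r -= w
            if r <= 0:
                weights[f] += 1  # engagement reinforces future visibility
                break
    return weights

# After many rounds the shares are typically far from uniform:
# whichever friends happened to be engaged with early dominate the feed.
shares = simulate_feed(["A", "B", "C", "D", "E"], 1000)
```

This is the classic Pólya-urn dynamic: the shares converge, but to a random and usually uneven split — the "in-group" is an accident of early engagement, then locked in by the loop.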
Second, and substantially worse, because engagement is the metric, Facebook inevitably selects for the shocking and the outrageous. Ev Williams summed up the results brilliantly:
Of course this doesn't just apply to Facebook. The first problem applies to all social networks with smart algorithmic feeds that optimize for engagement. Facebook is just the largest and most influential by far.
The second has been a problem with television for decades. Why have majorities, or crazily large minorities, of people believed for many years that violent crime just keeps getting worse, that their hometown mall might be bombed by terrorists at any moment, that Sharia law will come to their province/state any day now, that the rest of the world is a war-torn shambles only barely propped up by vast quantities of aid we can't afford — despite the easily available, incredibly copious, clear evidence to the contrary? In large part because if it bleeds, it leads.
Fake news is far from new; it's just become explicit rather than implicit. And I certainly don't mean to suggest that Facebook singlehandedly caused the terrible trend of demonizing any and all people with whom one disagrees. Studies show that political polarization is more extreme in older people, who use social media less, than in the young. Whatever's happening is far more complicated than just "Facebook is driving us apart."
Still, we hoped the 21st century of Facebook would be better, more compassionate, more understanding, than the 20th century of TV. But it's not, and the ways in which it's worse are far more personal. We hoped that making the world more open and connected would be good for us. Maybe it would be, if the metric that the connecting entity optimized for was something other than engagement. But it now seems fairly clear that engagement is negatively correlated with happiness for users, and moderately clear that this is, in fact, a causal relationship:
The analogy I like to use is global warming causing extreme weather: the more energy pumped into our atmosphere, the more it behaves in bizarre and erratic ways. Facebook is like a powerful greenhouse gas for our collective social atmosphere. TV was too, of course, but it was CO2 to Facebook's methane.
I don't want to get into Facebook's privacy issues, hate-speech issues, ongoing rejection of all the principles of the open web, etc. I'm not suggesting that this is anyone's fault, or even that anyone has done anything wrong. Nothing like Facebook has ever existed before. It is a company that is also a massive global experiment, one with some excellent outcomes.
But it would be good for us all if Facebook were to at least acknowledge the possibility that some of their experiment's outcomes seem at best worrying, maybe even alarming, and that something should be done to try to mitigate them. As hard as that admission might be.
I'm happy to report that this may well be happening. Mark Zuckerberg recently commented to the effect that Facebook is working on a way to connect you with people you should know, like mentors. I hope this is the harbinger of a new understanding that Facebook's focus on optimizing for engagement is, in and of itself, harmful to its users — and an understanding that it's always best to head off a backlash before it begins, rather than after it gathers steam.