News Feeds Make Us Multi-Polar
This is the promise of social media. Or as it appears now, the edge of the Earth we will never reach. Scroll all you wish, you are never getting to the end of the news feed. But we keep scrolling, because we aren’t satisfied with our haul. Maybe it is our problem, for being too discontent. Maybe it is others’ problem, for being too disagreeable.
Or maybe it is the news feed’s problem.
Design of News Feeds
News feeds were linear to begin with: what you read depended very much on when you decided to read. The more you posted, the more likely your posts were to be read. That rewarded spam. Over time, platforms from Facebook to Twitter changed this and tried to show us less of the “junk” and more of the relevant stuff.
But how do they know what is most relevant to each of us? Well, they rely on their best available guess: What we click on. What we Like. What we comment on. What we share. Whatever we do becomes numbers, feeding algorithms. And so the more we respond to posts from Page X, the more we will see from Page X next time.
I said “we” partly because the top content is not only a function of what you do. It is also a function of what your friends like and react to. Not all friends, though. Algorithms also know which friends you interact with most on the platform. Chances are, these are friends who share beliefs similar to yours. In other words, more of Page X!
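The mechanism described above can be sketched as a toy ranking function. This is purely illustrative: the action weights, the friend discount, and the field names are all my assumptions, not any platform’s actual formula, and real systems use far more signals than this.

```python
from dataclasses import dataclass

@dataclass
class Post:
    page: str
    text: str

# Hypothetical per-action weights: a share "counts" more than a click.
ACTION_WEIGHTS = {"click": 1.0, "like": 2.0, "comment": 3.0, "share": 4.0}

def affinity(history: dict[str, list[str]], page: str) -> float:
    """Weighted sum of someone's past interactions with a page."""
    return sum(ACTION_WEIGHTS[a] for a in history.get(page, []))

def rank_feed(posts, my_history, friends_histories):
    """Score each post by my affinity plus a discounted friends' affinity."""
    def score(post):
        mine = affinity(my_history, post.page)
        friends = sum(affinity(h, post.page) for h in friends_histories)
        return mine + 0.5 * friends  # 0.5: assumed discount on friend signals
    return sorted(posts, key=score, reverse=True)

posts = [Post("Page X", "more gum"), Post("Page Y", "plain water")]
my_history = {"Page X": ["click", "like", "share"]}
friends_histories = [{"Page X": ["comment"]}, {"Page Y": ["click"]}]

for post in rank_feed(posts, my_history, friends_histories):
    print(post.page)  # Page X ranks first
```

Because both my history and my closest friends’ histories point at Page X, Page X wins again, which is exactly the loop the paragraph describes.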
How Much Can We Customize?
One caveat: Facebook now allows us to hide stories and “see less” from particular pages. In theory, this helps make our news feeds a more balanced representation of our interests. The thing is, I have used it a few times, on the same page. For some reason, certain pages keep coming back in droves. This is cause for concern.
You might be following 1000 pages, but with time, you will end up just seeing the same 10-20 pages, any time of the day. Even without any manipulation (I’m assuming, of course), we always end up funneled into holes. Not one big standardized hole, but many tiny customized holes, with our unique lists of yes-people and funny animals.
I’m looking beyond planned manipulation because there is a more systemic flaw beneath it all. The algorithms that shape our news feeds carry a built-in bias: the bias of stability.
Think about Charlie and the Chocolate Factory. If you have watched it, you might recall Violet, the girl who kept chewing her gum until she swelled into a gigantic blue ball. That’s how I see social media. Once the algorithms “see” us chewing gum on several occasions, they decide to feed us gum until we nearly explode. What algorithms don’t know is that we also like to drink plain water. But no one goes around declaring their love for plain water.
News feed algorithms reduce our personalities to nothing but our most obvious, expressive side. Marketing responds by feeding us sensationalistic content. Over time, we unwittingly adapt to this bias of our news feed. We start to define ourselves in terms of gum or chocolate, violet or blue, cats or dogs. We dehydrate our selves.
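The stability bias the last few paragraphs describe is a feedback loop: the feed favors whatever you clicked before, and you click what you are shown, so a mild initial preference snowballs. A deliberately simplified simulation (all numbers made up, and the “always show the majority topic” rule is an assumption, not how any real feed works):

```python
# Toy feedback loop: a 60/40 preference for "gum" over "water"
# snowballs into near-total lock-in after a few dozen rounds.
clicks = {"gum": 6, "water": 4}  # a mild initial preference

for _ in range(50):
    # The ranking favors your historical majority topic...
    shown = max(clicks, key=clicks.get)
    # ...and engagement with what is shown feeds back into the ranking.
    clicks[shown] += 1

share = clicks["gum"] / sum(clicks.values())
print(f"gum share of clicks after 50 rounds: {share:.0%}")  # prints 93%
```

Note that the outcome is decided entirely by the starting 6-to-4 split; “water” never gets shown again, which is the dehydration the metaphor points at.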
Customization is supposed to give us more freedom. But the customization of our news feeds is biased towards a static description of our inclinations. When platforms feed us what they think we want, we actually lose our freedom. The unexpected insight that might lead us to change our minds rarely shows up.
This is the “standard” we are all subjected to with social media. Yet this standard doesn’t bring us together; it drives us apart. We become self-righteous about the world we are seeing. A world we have technically chosen to see.
If the real world was once bipolar – two groups fighting from opposite ends of a spectrum – it is increasingly multi-polar. When we read messages that challenge our world, we either fight or flee. We side with others only when it is convenient. We lose sight of our similarities.
Are We Doomed?
Social media is far from hopeless, of course. While the news feed is the central organizing principle of most existing platforms, there are other key elements which may mediate our collective multi-polarization, such as comment sections and hashtags.
Polarization starts from within, from the pages we follow to the posts we read. Algorithms appeal to our taste for convenience and congruence, and we lose our sensitivity to nuance and our capacity to change. We gotta learn to be smarter than them. We gotta be willing to step back, look at ourselves, and ask what we really want.
It starts with accepting that the Earth is round. We will never realize the promise of technology unless we accept that technology can never substitute for the work of humanity.