Are you seeing everything your friends post on Facebook? Recently, Facebook published a study in Science on whether its algorithmic filtering of the news feed means people will see less political news that disagrees with their views. The answer is, perhaps not surprisingly, yes, but not by much. Over on Medium, Eli Pariser, author of The Filter Bubble, notes that “On average, you’re about 6 percent less likely to see content that the other political side favors. Who you’re friends with matters a good deal more than the algorithm.”
But the algorithm still matters. The danger of a filter bubble isn’t just that it may harm political discourse; it’s what it means for how we communicate with the people we care about most. Many Facebook users are unaware that Facebook is filtering their news feeds at all, even after the debacle of its infamous emotion study, in which some users had either happy or sad updates hidden from their feeds to see how that changed their own posts.
To put this into perspective, imagine posting a request for mental health help to your friends on Facebook, only for it to get lost to the noise of posts the algorithm has determined your friends are more interested in.
Facebook’s filtering algorithm is secret – all we know is that “likes” help posts gain visibility, and that, as sociologist Nathan Jurgenson explains, “the newsfeed is also sorted based on what people are willing to pay. The order of Facebook’s newsfeed is partly for sale.” Of course, neither you nor I can pay Facebook to promote what we care about; that’s only for advertisers and publishers. (Though it does help prevent, say, a noted neo-Nazi from buying promoted spots in your News Feed.)
It’s to Facebook’s benefit either to keep its audience ignorant of the filtering of their news feeds or to convince them it’s a good idea. To quote Jurgenson on sharing political news that your friends might disagree with:
By making the newsfeed an algorithm, Facebook enters users into a competition to be seen. If you don’t get “likes” and attention with what you share, your content will subsequently be seen even less, and thus you and your voice and presence is lessened.
The algorithm doesn’t exist to make your Facebook experience better for you, but to make it better for Facebook. The more you post, share, comment and like, the more data Facebook gets on you to sell to the highest bidder. The filtered news feed is a perfect way to ensure that a Facebook user will keep sharing the stuff that gets them responses, or at least keep trying.
Facebook isn’t the only tech company that algorithmically determines what you see and when, either. Google does it, Amazon does it, and Netflix does it. Technology companies love to play up the neutrality of algorithms, but that’s a charade. The truth about algorithms is that they reflect the values of the people who program them.
In the case of Facebook, those values are to keep your eyes on the screen and your thumbs posting and sharing, whether what you share is seen or not. It certainly isn’t about making the world “more open and connected” when Facebook is controlling what you see.