Does Facebook’s filter create political polarisation among its users in the US?
We all know that Facebook filters the content we see in our news feeds based on our interests and what we have ‘liked’. When first announced, the idea caused quite a stir, forcing Facebook to offer two news feeds: “Top Stories”, with the filtered content, and “Most Recent”, which sorts entries by the time they were posted.
The filtered feed is assembled by Facebook’s algorithms and is shaped by multiple signals: likes, clicks, how long other people spend viewing a link, and so on. Other people’s behaviour, especially that of your friends, influences your news feed: some things reach you, others don’t.
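Facebook’s actual ranking algorithm is proprietary, but the kind of filtering described above can be illustrated with a toy sketch. Every signal name and weight below is invented for illustration; only the general idea of combining engagement signals into a ranking score comes from the article.

```python
# Toy illustration only: Facebook's real algorithm is proprietary, and
# these signal names and weights are invented for this sketch.
def toy_feed_score(likes, clicks, friend_interactions, seconds_viewed):
    """Combine engagement signals into a single ranking score."""
    return 2.0 * likes + 1.0 * clicks + 3.0 * friend_interactions + 0.1 * seconds_viewed

stories = {
    "story_a": toy_feed_score(likes=120, clicks=300, friend_interactions=5, seconds_viewed=40),
    "story_b": toy_feed_score(likes=10, clicks=50, friend_interactions=20, seconds_viewed=15),
}

# Higher-scoring stories surface first in the filtered ("Top Stories") feed;
# low-scoring ones may never reach you at all.
ranked = sorted(stories, key=stories.get, reverse=True)
```

In this toy version, friend activity is weighted most heavily, mirroring the article’s point that your friends’ behaviour has an outsized effect on what you see.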
This algorithm was put to the test by Facebook’s in-house social scientists, to see how much it exposes or ‘protects’ people from different political ideas. They used approximately 10 million test subjects, all of whom had stated their political preferences on their profiles.
Then they determined the political flavour of the content being shared. They created a scale on which –2 was very liberal and +2 very conservative. A story’s alignment was determined by who shared it, so the average New York Times story came in at –0.62 (somewhat liberal), whereas the average Fox News story scored +0.78 (somewhat conservative).
The stories that mattered for this study were the "cross-cutting" ones, news articles with a liberal slant appearing in a conservative person's news feed or vice versa.
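The “cross-cutting” test reduces to a sign check on the alignment scale: a story is cross-cutting when its slant sits on the opposite side of zero from the user’s stated ideology. A minimal sketch, using the article’s example scores (the function name and the example user score are assumptions for illustration):

```python
def is_cross_cutting(user_alignment, story_alignment):
    """On a scale from -2 (very liberal) to +2 (very conservative),
    a story is cross-cutting when its alignment has the opposite
    sign to the user's -- i.e. their product is negative."""
    return user_alignment * story_alignment < 0

# Example scores from the article; the user score is hypothetical.
nyt_story = -0.62          # somewhat liberal
fox_story = 0.78           # somewhat conservative
conservative_user = 1.5    # a hypothetical self-identified conservative

is_cross_cutting(conservative_user, nyt_story)  # liberal story, conservative reader
is_cross_cutting(conservative_user, fox_story)  # same-side story, not cross-cutting
```

The study then compared how often such opposite-sign stories appeared in the filtered feed versus how often friends shared them.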
Once they had identified the relevant users and the cross-cutting stories that the algorithm either surfaced or filtered out, they concluded that the algorithm made users only about 1 per cent less likely to be exposed to politically cross-cutting stories, the team reported in Science magazine.