Image Credit: Wakalani via Wikimedia Commons

You know that Facebook friend who constantly annoys you with their posts from the opposite side of the ideological spectrum?  Give them a chance - you might need them to keep you balanced.

Two questions have long nagged at social researchers.  Do Facebook users exist in so-called "echo chambers," only receiving information from like-minded friends?  Or does Facebook's algorithm effectively create "filter bubbles" that limit your exposure to challenging and opposing ideas?  To answer these questions, researchers at Facebook and the University of Michigan looked closely at the usage data of 10.1 million active Facebook users in the United States over a 6-month period.  (Don't worry - the data had been stripped of identifying information.)

Published this month in Science, the study examined users who had reported a political allegiance in their Facebook profile.  Specifically, the researchers analyzed these users' relationships with other Facebook users, as well as the type of news content they shared.  It appears that the old saying, 'birds of a feather flock together,' holds true.  As expected, the researchers noted that Facebook users tend to assemble according to their ideological affiliations.

However, most users also have a number of Facebook friends from the other side.  On average, about 20% of a Facebook user's friends have the opposite ideological affiliation, and, interestingly, conservatives tend to have more friends who share liberal content than vice versa.  Regardless, this report suggests that liberals and conservatives alike are exposed to more opposing viewpoints than previously thought.

Cross-cutting content is limited at three stages: what your friends share (Potential from network), what the News Feed algorithm surfaces (Exposed), and what you ultimately click (Selected).  Image Credit: Bakshy et al. (2015)

When you log into Facebook, the order of the stories in your News Feed depends on several considerations, such as how frequently you visit Facebook, how often you've clicked on certain links in the past, and the extent to which you interact with your Facebook friends.  In this study, Facebook's sorting algorithm was found to slightly limit opposing viewpoints in participants' News Feeds.  But that wasn't the biggest obstacle to a person receiving attitude-challenging content.  Instead, the researchers reported that individual choices - what the Facebook users chose to click on and presumably read - played a bigger role in limiting users' exposure to disparate news stories.
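To make that comparison concrete, here is a minimal sketch of the three-stage "funnel" described in the figure above - friends' shares, algorithmic ranking, then individual clicks.  The proportions below are invented placeholders, not the paper's actual figures; the sketch only shows how one would compare the cross-cutting content lost at the algorithm stage versus the selection stage.

```python
# Toy illustration of the study's three-stage exposure funnel.
# The proportions below are hypothetical placeholders, NOT the
# figures reported by Bakshy et al. (2015).

stages = {
    "potential": 0.30,  # share of friends' hard content that is cross-cutting
    "exposed":   0.28,  # share of algorithm-ranked News Feed content that is cross-cutting
    "selected":  0.24,  # share of clicked content that is cross-cutting
}

def relative_drop(before: float, after: float) -> float:
    """Fraction of cross-cutting exposure lost between two stages."""
    return (before - after) / before

algorithm_effect = relative_drop(stages["potential"], stages["exposed"])
choice_effect = relative_drop(stages["exposed"], stages["selected"])

print(f"Reduction due to News Feed ranking: {algorithm_effect:.1%}")
print(f"Reduction due to individual clicks: {choice_effect:.1%}")
# With these placeholder numbers, individual choice removes more
# cross-cutting content than the ranking algorithm does - the same
# qualitative pattern the study reports.
```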

For this study, the news stories being shared were classified as "hard" or "soft."  Hard content included national or international news and politics, whereas soft content focused on entertainment, sports, and travel.  Not surprisingly, Facebook users who defined themselves as liberal or conservative were much more likely to share hard content.  And in spite of the limitations on what appears in the News Feed and what users are willing to click on, the study reported that Facebook users still selected 7% of the available hard content.
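For intuition about that hard/soft split, here is a toy keyword-based classifier.  It is purely illustrative - the study relied on trained text classifiers over shared links, and the keyword lists below are invented for this sketch.

```python
# Toy "hard" vs. "soft" news classifier. Purely illustrative:
# the study used trained classifiers, not keyword matching, and
# these keyword lists are invented for this example.

HARD_KEYWORDS = {"election", "congress", "policy", "economy", "war", "senate"}
SOFT_KEYWORDS = {"celebrity", "movie", "recipe", "travel", "sports", "music"}

def classify_headline(headline: str) -> str:
    """Label a headline 'hard', 'soft', or 'unknown' by keyword overlap."""
    words = set(headline.lower().split())
    hard_hits = len(words & HARD_KEYWORDS)
    soft_hits = len(words & SOFT_KEYWORDS)
    if hard_hits > soft_hits:
        return "hard"
    if soft_hits > hard_hits:
        return "soft"
    return "unknown"

print(classify_headline("Senate passes new economy policy"))    # hard
print(classify_headline("Ten travel recipe ideas for summer"))  # soft
```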

So, it sounds like our oppositely-minded friends are exposing us to alternative viewpoints.  "Rather than people browsing only ideologically aligned news sources or opting out of hard news altogether, our work shows that social media expose individuals to at least some ideologically cross-cutting viewpoints," reported the researchers.  But what to read remains up to each Facebook user.  Indeed, the researchers concluded by stating that "the power to expose oneself to perspectives from the other side ... lies first and foremost with individuals."

"We could better deal with the onslaught of information and misinformation if we were better educated in argument and debate." - Simon Rankin

And in the interest of full disclosure, it should be noted that although this study was funded by Facebook, the investigators reported that "Facebook did not place any restrictions on the design and publication of this observational study, beyond the requirement that this work was to be done in compliance with Facebook's Data Policy and research ethics review process (www.facebook.com/policy.php)."

