Why We Live in Hyper-Fragmented Digital Worlds (And What We Can Do About It)

Bronwynne Powell
6 min read · Dec 19, 2018
Image credit: Unsplash via Pablo

Once upon a time, the internet was hailed as the platform to protect democracy and promote debate.

Fast forward a few decades and our social media feeds are more divisive than ever.

We’re shielded from dissenting opinions, living in highly curated and fractured digital worlds.

Mark Zuckerberg is alleged to have told colleagues, “A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa.”

It’s easy to balk at the statement; however, our news feeds are ultra-personalized.

And the filtering is constant, happening without our knowledge or consent: algorithms are designed to respond to our preferences and decide what we’ll get to see.

On top of that, our own biases compel us to connect with people who share, and reinforce, our views in offline and online worlds.

These are acts with polarising consequences.

Is the idea of the internet as a bastion for a connected and open global village shattering like glass in front of our eyes?

Let’s take a look.

The Filter Bubble

Eli Pariser, an internet activist behind organisations like Upworthy and Avaaz, coined the term “filter bubble”.

Pariser, who’s progressive politically, noticed a decrease in the number of Facebook posts from his conservative friends and an increase in posts from his liberal friends. Facebook’s algorithm identified that Pariser spent more time engaging with his liberal friends’ posts and started serving them up in greater doses, at the cost of the conservative posts.
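The feedback loop Pariser describes can be illustrated with a toy model. This is purely a sketch of the general idea, not Facebook’s actual algorithm; all names here (the `Feed` class, the affinity scores, the boost factor) are hypothetical:

```python
from collections import defaultdict

class Feed:
    """Toy engagement-driven feed: sources you interact with get boosted,
    gradually crowding everyone else out of view."""

    def __init__(self):
        # Every source starts with the same neutral affinity score.
        self.affinity = defaultdict(lambda: 1.0)

    def record_engagement(self, source):
        # Each click or like nudges that source's score upward.
        self.affinity[source] *= 1.5

    def rank(self, posts):
        # posts: list of (source, text); highest-affinity sources first.
        return sorted(posts, key=lambda p: self.affinity[p[0]], reverse=True)

feed = Feed()
posts = [("conservative_friend", "post A"), ("liberal_friend", "post B")]

# Engage repeatedly with one side...
for _ in range(3):
    feed.record_engagement("liberal_friend")

# ...and that side now dominates the ranking.
print(feed.rank(posts)[0][0])  # liberal_friend
```

The point of the sketch is the self-reinforcement: what you engage with rises, which makes you engage with it more, which makes it rise further.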

In a TED Talk on the topic, Pariser explains that Facebook wasn’t the only one involved in this “invisible, algorithmic editing of the Web.”

“Google’s doing it too. If I search for something, and you search for something, even right now at the very same time, we may get very different search results. Even if you’re logged out, one engineer told me, there are 57 signals that Google looks at — everything from what kind of computer you’re on to what kind of browser you’re using to where you’re located — that it uses to personally tailor your query results. Think about it for a second: there is no standard Google anymore.”

The result?

A “personal, unique universe of information that you live in online.”

While opinion is split on the impact the filter bubble has on our online interactions, there is research confirming that we’re often seduced by sameness.

Birds of a Feather

Research shows that people who are similar in significant ways usually have deeper and stronger ties, and that these relationships are more likely to survive transitional periods such as leaving school.

It’s known as homophily.

Translated online, that means a feed full of people who are just like you.

What’s the problem with that?

Miller McPherson, Lynn Smith-Lovin and James M. Cook summarised the body of research on this behaviour in their 2001 paper:

“Homophily limits people’s social worlds in a way that has powerful implications for the information they receive, the attitudes they form, and the interactions they experience.”

Doesn’t this degrade the internet’s ability to expand access to the public sphere?

In a post on Medium’s Aeon Magazine, C Thi Nguyen frames the danger like this:

When we take networks built for social reasons and start using them as our information feeds, we tend to miss out on contrary views and run into exaggerated degrees of agreement.

We run the risk of being fooled by our timelines; therefore, we struggle to determine the line between reality and our siloed digital worlds.

In a post on Medium, Sean Blanda warns of the dark side of the lopsided view of the world:

What is emerging is the worst kind of echo chamber, one where those inside are increasingly convinced that everyone shares their world view, that their ranks are growing when they aren’t.

Confirmation Bias

These aren’t new threats.

More than 20 years ago, MIT researchers Marshall Van Alstyne and Erik Brynjolfsson cautioned against the dangers of “virtual cliques”.

“Individuals empowered to screen out material that does not conform to their existing preferences may form virtual cliques, insulate themselves from opposing points of view, and reinforce their biases. Internet users can seek out interactions with like-minded individuals who have similar values, and thus become less likely to trust important decisions to people whose values differ from their own. This voluntary balkanization and the loss of shared experiences and values may be harmful to the structure of democratic societies as well as decentralized organizations.”

What makes these cliques so enduring is our desire to find information that validates our existing opinions, known as confirmation bias.

Shane Parrish has a great primer on the bias over at his Farnam Street blog:

Confirmation bias is our tendency to cherry-pick information that confirms our existing beliefs or ideas. Confirmation bias explains why two people with opposing views on a topic can see the same evidence and come away feeling validated by it. This cognitive bias is most pronounced in the case of ingrained, ideological, or emotionally charged views.

In a post on Medium, Tony Deller describes the struggle to battle against our biases even when we know they’re at work:

“Confirmation bias is like a mental virus that is highly transmissible and impossible to cure. Even if you are aware of it, constantly trying to fight it by stamping out your own ignorance, confirmation bias manages to survive because it can live within any concept.”

So how do we get back on track?

The internet was heralded as the bedrock of a democratic society. In The Wealth of Networks, Yochai Benkler argues that the architecture of Web 2.0 is more favourable to the formation of a networked public sphere than the traditional mass media model.

Despite all its weaknesses, the internet is often referred to as the Fifth Estate, a potential vehicle to challenge institutional authority.

Pariser traces the solution back to an approach taken by traditional journalism at the turn of the 20th century.

“And the thing is, we’ve actually been here before as a society. In 1915, it’s not like newspapers were sweating a lot about their civic responsibilities. Then people noticed that they were doing something really important. That, in fact, you couldn’t have a functioning democracy if citizens didn’t get a good flow of information, that the newspapers were critical because they were acting as the filter, and then journalistic ethics developed. It wasn’t perfect, but it got us through the last century. And so now, we’re kind of back in 1915 on the Web. And we need the new gatekeepers to encode that kind of responsibility into the code that they’re writing.”

César Hidalgo, at MIT, suggests a “flip feed button” for social networks like Facebook.

“Its algorithms could identify your bias, then show you stories selected from the other end of the political spectrum. An alert could pop up when an algorithm detects that your feed is getting too closed off, and it could suggest people to friend or pages to follow with a view to widening your perspective,” reads a New Scientist article discussing Hidalgo’s proposal.

What can we do while they figure it out?

While the gatekeepers rise to Pariser’s challenge, we can engage in subtle, but significant, practices to widen our perspectives.

We could indulge in a more varied media diet: we can follow sites, pages, and people that express opposing viewpoints.

And to counter our own biases, Deller points to a process from philosopher and journalist Frederic Friedel:

  1. Articulate your observation
  2. Come up with a hypothesis to test your observation
  3. Make a prediction based on your hypothesis
  4. Test and repeat

My own experience of the internet has been largely positive.

I’ve been lucky enough to take advantage of the benefits presented by a connected world. As a freelance writer, I’ve had the opportunity to work with colleagues in all corners of the globe. I’ve improved my professional skills through online courses and learned more about causes I care about on social media.

But I’m aware that it might take ongoing effort to avoid getting too closed off online. I must continue to question myself, my motives and my opinions. It’s uncomfortable; sometimes you’ll feel displaced and uncertain.

What’s at stake? Being blindsided by changes in the world I didn’t see coming because, well, I didn’t want to.

The digital age opens up boundless new possibilities. Opportunities for learning, connecting and seeing things in ways we never expected. All that’s left is to open our eyes.
