How to escape your social media bubble before the election

Mashable’s series Algorithms explores the mysterious lines of code that increasingly control our lives — and our futures.


You live in an online bubble.

But, you’re not alone. We all live in an online bubble. Social media algorithms control much of what we see when we log into Facebook, YouTube, TikTok, Instagram, and Twitter.

Why? Why else! These social media companies are chasing after the almighty dollar. For Big Tech companies, it’s all about keeping you on these platforms for as long as they can, engaging with as much content as possible. They make more money from advertisers that way.

So, for example, if you’re a big Donald Trump supporter and follow your favorite Fox News pundits, the social media algorithms are going to recommend to you more right-wing pundits to watch and more pro-Trump content to consume.

The consequences: skewed worldviews for those unknowingly living in an algorithm-devised bubble. 

With the 2020 U.S. presidential election coming up, step out of your bubble. It’s time to understand what’s playing out, so at the very least, you won’t be (that) surprised by whatever the outcome is on Election Day. Here are some steps to take to start popping your social media bubbles.

1. Realize you’re in a bubble

Much of what we see on our social media news feeds and timelines is a product of which accounts we follow, which channels we subscribe to, and what content we share and like.

Based on that, you may think we’re in charge, that we’re curating our own feeds. But there’s a bigger force at play: the algorithms. 

You never see all the posts, videos, or tweets from everyone you follow. The algorithm chooses what you see. The trending topics that set the conversation each day? The algorithm picks them. Those newly discovered accounts that were recommended to you while you were scrolling through your timeline? You know, the ones that just happened to fit your interests to a tee? That’s the social media platform’s algorithm taking in all your data and figuring out exactly what it thinks you would like.

The first step to escaping the bubble is realizing you’re in a bubble. 

“Filter bubbles today are what parental political opinions were 10 years ago,” explained Harleen Kaur, founder of Ground News, an app that compares news stories for political bias. “They are absolutely integral in shaping someone’s worldview and once formed, are very hard to burst.”

Kaur, a former space engineer, founded Ground News in order to make it easier to read a variety of perspectives on individual news stories.

“Filter bubbles intensify polarization and impair our ability to have constructive conversations about the issues that plague our society today,” she explains. “The greatest sin of filter bubbles is that they impede access to content that may challenge someone’s worldview and only serve to reinforce strongly held convictions. When people don’t have access to any information that we disagree with, they struggle to understand the perspective of others.”

Whether your algorithmically curated feed leans left or right, you are absorbing opinions with one very specific ideological bent. Let’s change that.

2. Retrain the algorithms

You’ve actually been training the algorithms all along. 

We may not have total control over what we see, but our follows, shares, and likes give the algorithms the data points they need to make those decisions.

Once these algorithms are making decisions for you, their choices can create a filter bubble made up of completely one-sided news or even straight-up misinformation.
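Mechanically, the feedback loop is simple enough to sketch. The scoring function, topics, and weights below are invented for illustration — this is not any platform’s actual code, just the shape of the problem:

```python
# Toy engagement-based feed ranker. Posts on topics you've engaged
# with before score higher, so your past behavior narrows your feed.

def rank_feed(posts, engagement_history):
    """Order posts by a crude predicted-engagement score."""
    def score(post):
        # How often has this user engaged with this topic before?
        affinity = engagement_history.get(post["topic"], 0)
        return affinity * post["popularity"]
    return sorted(posts, key=score, reverse=True)

posts = [
    {"topic": "right_politics", "popularity": 90},
    {"topic": "left_politics", "popularity": 95},
    {"topic": "cooking", "popularity": 60},
]

# A user who has mostly liked right-leaning content...
history = {"right_politics": 12, "cooking": 1}

feed = rank_feed(posts, history)
# ...sees right-leaning posts first, while left-leaning posts score
# zero and sink to the bottom: the filter bubble in miniature.
print([p["topic"] for p in feed])
```

Note that the more the user clicks the top of this feed, the larger the affinity gap grows on the next pass — which is why retraining requires deliberately engaging outside your history.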

“This is a question that researchers are still trying to understand, particularly with regard to how misinformation-heavy communities form, how people find their way into them,” noted Renée DiResta, a research manager at Stanford Internet Observatory, which studies social media abuse. “Evidence suggests recommendation engines play a role in that process.” 

But, you can play a role here, too. 

DiResta explained to me how coronavirus-related conspiracies, for example, are often spread by a few highly active users sharing this content within the same groups and communities.

“This is a combination of algorithmic amplification but also active participation from members of the communities,” she says.

To pop that bubble, you need to retrain these social media algorithms that are giving you bad information. Follow some accounts from all sides of the political spectrum. It doesn’t mean you’re actually a fan of those personalities or their points of view. You just want the algorithm to surface that content so you know that these other viewpoints and opinions exist.

3. Understand media biases 

Now that you’re aware of the filter bubble and looking to pop it, you should understand the biases various media outlets have.

There’s nothing wrong with these biases as long as the outlets are transparent about them. In fact, it’s much better for an organization like Fox News to openly have a conservative bias than it is for the algorithms to determine what we see. We know what we’re getting into when we put on Fox News. We don’t know the social media algorithms in the same way.

With so many digital media outlets out there, there are a few tools to help you understand the direction each one leans. Media Bias/Fact Check is a decent source to check up on the overall bias of many outlets, especially lesser-known ones. It’s run by independent media analyst Dave Van Zandt and a team of volunteers who have developed a methodology for rating news outlets. 

However, I find that Kaur’s Ground News has a much better overall approach. The platform simply looks at each individual news story, via an algorithm, and then lets you know what types of outlets are covering that specific event. It basically tells you if a particular news story is being widely covered or if it’s just a big topic of conversation between the news outlets within your ideological bubble. 

Ground News also puts out a weekly Blindspot Report, which focuses on news stories that were primarily ignored by one side of the aisle. To determine the news outlets’ biases, it aggregates the media bias designations from various outlets, including Media Bias/Fact Check. 
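As a rough sketch of how aggregating bias designations can work — the rater names, outlets, and scale below are made up for illustration, not Ground News’s actual methodology:

```python
# Toy aggregation of media-bias ratings from multiple raters into a
# single consensus label. All names and numbers are illustrative.

RATINGS = {  # scale: -2 = far left ... 0 = center ... +2 = far right
    "Outlet A": {"rater_1": -2, "rater_2": -1},
    "Outlet B": {"rater_1": 0, "rater_2": 0},
    "Outlet C": {"rater_1": 2, "rater_2": 1},
}

def consensus_bias(outlet):
    """Average the raters' scores and bucket them into a label."""
    scores = RATINGS[outlet].values()
    avg = sum(scores) / len(scores)
    if avg <= -0.5:
        return "leans left"
    if avg >= 0.5:
        return "leans right"
    return "center"

for outlet in RATINGS:
    print(outlet, "->", consensus_bias(outlet))
```

Averaging several independent raters is the key idea: no single rater’s judgment decides an outlet’s label, which makes the designation harder to game.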

4. See things from another perspective

“It’s important that people get out of their own filter bubbles because doing so by nature questions and tests your own personal beliefs,” explained investigative reporter Jared Holt in a phone conversation with me. “I think that’s a very healthy thing to do.”

Holt reports for Right Wing Watch, a conservative watchdog publication run by progressive advocacy group, People for the American Way. While Holt, who describes his personal ideology as politically left, writes critically of far right personalities, he doesn’t seek out this content just to bash it. He provides important context and background information on the issue he’s covering. 

Most importantly, Right Wing Watch is transparent about its media biases. It’s right in the name. And you don’t need to agree with Holt’s politics in order to understand the importance of stepping out of your filter bubble. If you are on the right, you could read Holt’s reporting and have a more rounded understanding of differing points of view.

Holt explains that consuming right wing media all the time, while being politically to the left, has helped him too.

“I think that having that kind of opinion come into my own personal politics in a personal capacity is always a good test of what I, as an individual, believe and why I support the causes that I do support,” Holt tells me.

Maybe you don’t want to mess with the perfectly trained algorithm on your own account. That’s fine! If you want to pop that filter bubble, you can always create as many additional accounts on each social media channel as you’d like.

Create a fresh YouTube account that just subscribes to leftist indie media! Register a TikTok profile and only follow right-wing zoomers! Sign up for Facebook and only like the mainstream news pages your parents follow!

Those accounts will show you exactly what a user with those follows would see.

5. Use online tools to pop that bubble

If you’re looking for an easy way to pop that filter bubble, there are apps that will do it for you.

For example, Vicariously is an app that creates Twitter lists made up of just the accounts a specific user follows. Want to see exactly what President Donald Trump wakes up to every morning in his timeline? Use Vicariously.

The creator of Vicariously, Jake Harding, told me that he believes the filter bubble problem is especially amplified on Twitter.

“[Twitter’s] more of an interest graph than a social graph,” Harding explained. “And it’s text first so opinions are the name of the game.” 

Basically, you’re more likely to follow accounts on Twitter based on what they’re tweeting than if you personally know someone on the platform.

Another tool that’ll help you view social media through the eyes of another user is TheirTube. As you probably guessed, this one’s for YouTube. 

The site has six different profiles of YouTube users, such as the liberal, the conservative, and the climate denier. Clicking on a profile gives you a daily curated feed of the videos YouTube’s algorithm would most likely recommend to that type of user.  

TheirTube’s creator, Tomo Kihara, told me he created the site after seeing the YouTube homepage of someone he knew who had turned into a conspiracy theorist.

“Each of these TheirTube personas is informed by interviews with real YouTube users who experienced similar recommendation bubbles,” Kihara explained.

Just click on any two and compare what’s being recommended. It’ll open your eyes as to how different everyone’s daily news intake looks. One interesting thing you can do is to see if your own personal YouTube algorithm recommends any channels that match up with a TheirTube profile. Once you understand you’re in a filter bubble, seeing some of your favorite channels classified as fitting a “conspiracist” profile may very well result in some introspection.

The tools are now in your hands. Pop that filter bubble. Expand your worldview. You’ll be a better person for it.
