The Danger of Social Media Filter Bubbles

I was listening to an episode of Shawn Ryan's podcast featuring Chase Hughes. During that conversation, Ryan shared how frustrating it is to get stuck in a filter bubble on social media.

A filter bubble forms when algorithms track your movement across the internet and then personalize your feed around the content you are most likely to consume. This happens on almost every website, even when you think you are getting the same generic experience as everyone else. The goal of social media sites, and of most websites, is to serve you content you will keep consuming. You then get locked in a “jail” of sorts where you can’t get any other information.

“The problem is that these algorithms can put you in a filter bubble, a term coined by Internet activist Eli Pariser. Being in a filter bubble means these algorithms have isolated you from information and perspectives you haven’t already expressed an interest in, meaning you may miss out on important information.” (Digital Media Literacy, GCF Global)

Have you ever “hovered” over a particular topic on social media? Then, for hours (or days, or even months) afterward, that topic becomes the primary content of your news feed. For example, I like fishing a particular mountain range. If I open one story, or even hover over the image, I am locked into that content and am in information jail. Almost nothing else is shown to me. That is a filter bubble.
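To make the mechanism concrete, here is a toy sketch in Python. The item names, weights, and scoring rule are all invented for illustration; real platforms use far more sophisticated models. The idea is simply that every hover or click boosts the weight of that item's topic, and the feed is re-ranked by those weights, so a single interaction quickly crowds everything else out.

```python
from collections import defaultdict

# Toy feed ranker, illustrative only; not any platform's real algorithm.
# Each interaction (hover, click) bumps the weight of that item's topic,
# and the feed is re-ranked by topic weight, so the feed narrows fast.

ITEMS = [
    ("mountain fishing spots", "fishing"),
    ("fly rod review", "fishing"),
    ("school board election", "news"),
    ("banana bread recipe", "cooking"),
    ("trail running shoes", "fitness"),
]

topic_weights = defaultdict(lambda: 1.0)  # every topic starts equal

def record_interaction(topic, kind):
    """Boost a topic's weight; a click counts more than a hover."""
    boost = {"hover": 1.5, "click": 3.0}[kind]
    topic_weights[topic] *= boost

def ranked_feed():
    """Sort items by the current weight of their topic, highest first."""
    return sorted(ITEMS, key=lambda item: topic_weights[item[1]], reverse=True)

print("Before:", [title for title, _ in ranked_feed()])
record_interaction("fishing", "hover")  # hover over one fishing story...
record_interaction("fishing", "click")  # ...then open it
print("After: ", [title for title, _ in ranked_feed()])
# Fishing items now dominate the top of the feed after just two signals.
```

Run it, and two signals on a single fishing story are enough to push fishing to the top of the feed. That is exactly the narrowing described above.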

Why are filter bubbles dangerous?

From a mental health standpoint, a filter bubble can lock a person who is depressed and lonely into a feed that presents only information about depression and loneliness. The algorithms won’t let the user come up for air. In fact, those same algorithms can easily place others with similar mental health issues into the same filter bubble. See the problem? Instead of a support system, social media users get no differing perspectives and get locked into a dark prison together.

From a professional standpoint, algorithms cater to the user so that the user will keep using the site. If you like data-driven schools, search engines will provide only information catered to data-driven schools, and some will even tailor it to your location. Alternative pedagogies, such as those built around critical thinking, will never see the light of day.

Chase Hughes is an interesting listen. He specializes in human behavior, influencing behavioral outcomes, and psychological operations. He talked about how easy it is to control human behavior, noting that two things are needed to change someone’s behavior: first, you must alter their focus and perspective; then, you must attach an emotional outcome. He calls it the FATE model: Focus, Authority, Tribe, Emotion.

Social media does all of that. It alters your focus and perspective. It fills your news feed with sources that offer authority on a subject. Remember when social media encouraged double masking? It provides you a tribe to support a certain belief, and then you see stories and reels that reinforce those ideals. Folks, that is 100% manipulation by algorithm.

How do you know if you are the subject of a psychological operation? First, people with alternate views are censored. Second, something shiny and new is offered in its place. LOL. I thought of professional development. But really, just think about how social media was used during COVID. Follow the FATE model, and you will easily see how the public was manipulated, even forced, into agreeing with the shutdown.

(This article was written in November 2025)

