By Alexia Finney | Staff Writer
Every time a college student opens TikTok, Instagram or YouTube, they’re stepping into a digital hall of mirrors where their beliefs are reflected and reconfirmed. The “For You” page that students use for entertainment has become one of the strongest engines of confirmation bias.
Generation Z prides itself on being well-informed; however, I think we often miss how media habits have shaped the way we think and, in turn, the way we interact with one another.
Instead of broadening perspectives, the internet has become a place that encourages partisan thinking and like-mindedness. Ultimately, confirmation bias, amplified by media platforms, influences the beliefs and interactions of today’s college students.
As defined by Britannica, confirmation bias is “people’s tendency to process information by looking for, or interpreting, information that is consistent with their beliefs.” While this is a bias we have all fallen victim to, it becomes especially potent at the scale and speed of today’s media platforms. Students aren’t just consuming content; they are selectively shown content they will like, in order to keep them engaged.
This process is controlled by an algorithm, a filter that dictates the content we see. An algorithm is built to match a user’s interests and niches to available content created by others. Although the algorithm quickly tailors itself to our interests, the result has been a polarizing divide in which groups reconfirm pre-existing beliefs, attack others for holding their own and abandon mature conversation altogether.
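To make the mechanism concrete, here is a toy sketch of an engagement-driven filter. It is not any platform’s actual system; the posts, topics and scoring rule are all invented for illustration. The point is structural: if a feed ranks posts by how closely they match what a user has already liked, the feed drifts toward reinforcing existing interests.

```python
# Toy illustration of an engagement-driven feed filter.
# This is a simplified sketch, not any real platform's algorithm.

def score(post_topics, liked_topics):
    """Score a post by its overlap with topics the user already engages with."""
    return len(set(post_topics) & set(liked_topics))

def build_feed(posts, liked_topics, k=2):
    """Return the k posts most similar to the user's past likes."""
    ranked = sorted(posts,
                    key=lambda p: score(p["topics"], liked_topics),
                    reverse=True)
    return [p["title"] for p in ranked[:k]]

posts = [
    {"title": "Why Policy A works",          "topics": ["policy_a", "politics"]},
    {"title": "More evidence for Policy A",  "topics": ["policy_a"]},
    {"title": "The case against Policy A",   "topics": ["policy_b", "politics"]},
    {"title": "Cooking pasta",               "topics": ["food"]},
]

# A user who has only engaged with pro-Policy-A content
# sees a feed that reconfirms that view.
print(build_feed(posts, liked_topics=["policy_a", "politics"]))
# → ['Why Policy A works', 'More evidence for Policy A']
```

Nothing in this sketch checks whether a post is accurate or whether the user has seen an opposing view; it only optimizes for similarity to past engagement, which is the core of the feedback loop described above.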
Algorithms and confirmation biases create an “intellectual comfort zone,” where it feels good to be right and not explore other ideologies or ways of thinking — especially now when your algorithm tells you what you want to hear.
College students may think they are conducting research and being insightful when they watch informational content online. In reality, their online habits are tracked and used to subtly dictate the content they watch and, ultimately, believe.
Algorithms and confirmation biases not only affect how we cherry-pick information but also influence our interactions with peers and our conduct in discussions. Today, I feel this is most visible in politics. If a student thinks one political ideology is correct, they’ll unconsciously find videos, posts or articles that reinforce that pre-existing belief. The student then enters a political conversation with more resistance to opposing views and, potentially, overconfidence in their stance.
A major driver of this bias is the decline of traditional journalism and the rise of social media as a primary news source. For example, newspaper circulation peaked in 1984 at 61 million, but had dropped to 23 million weekday papers by 2021, according to the Pew Research Center’s 2023 State of the News Media Fact Sheet.
Audiences for local TV news have also declined across morning, evening and late-night programs. As professional news outlets continue to shrink, students will turn to social media, where content is curated to maximize engagement and align with the user’s existing beliefs rather than challenge them.
Our generation’s reliance on social media over traditional news isn’t just a shift in where we get information, but a shift in how we form beliefs. Our understanding isn’t what creates meaning; our feeds do.
Algorithms and confirmation bias swaddle students into an intellectual comfort zone, where we feel confident in our beliefs because the content we consume rarely tells us otherwise. If we want to understand the world better, we need to challenge our ideas with the possibility we could be wrong; otherwise, we are not thinking, we are scrolling.