The video explains that algorithms track your clicks and tailor content to match your interests. While this makes browsing feel convenient, it can trap you in a “filter bubble,” a term coined by Eli Pariser (watch his TED Talk to learn more). In a filter bubble, you mostly see views you already agree with and miss out on other perspectives, which makes it harder to understand events fully or engage in meaningful discussions. The video suggests staying aware and seeking out diverse sources as a way to counteract this.
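To make the mechanism concrete, here is a minimal sketch, with an invented catalog and a deliberately crude scoring rule, of how click-based personalization can narrow a feed. Real recommendation systems are far more sophisticated, but the feedback loop is the same: you click, the system favors more of the same, and other topics quietly drop out of view.

```python
from collections import Counter

# Toy sketch of click-based personalization (illustrative only).
# Every item is tagged with one topic, and the "algorithm" simply
# favors topics the user has clicked before.

CATALOG = [
    {"title": "City budget explained",  "topic": "local news"},
    {"title": "Top 10 cat videos",      "topic": "entertainment"},
    {"title": "Climate policy debate",  "topic": "politics"},
    {"title": "New stadium proposal",   "topic": "local news"},
    {"title": "Celebrity interview",    "topic": "entertainment"},
    {"title": "Election poll roundup",  "topic": "politics"},
]

def recommend(click_history, catalog, k=3):
    """Rank items by how often the user has already clicked their topic."""
    topic_counts = Counter(item["topic"] for item in click_history)
    ranked = sorted(catalog, key=lambda item: topic_counts[item["topic"]], reverse=True)
    return ranked[:k]

# After a few clicks on politics and entertainment, local news never
# reaches the top of the feed, even though nothing was hidden on purpose.
clicks = [CATALOG[2], CATALOG[5], CATALOG[1]]
for item in recommend(clicks, CATALOG):
    print(item["title"], "-", item["topic"])
```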
However, while the explanation is clear, it frames filter bubbles mainly as an individual problem to solve by being mindful. What it leaves out is the larger picture: companies design algorithms to maximize engagement and profit, which often reinforces filter bubbles by pushing sensational or agreeable content. Understanding filter bubbles, then, isn’t only about personal choices—it’s also about recognizing the systems of power and profit that shape what we see online.
The video defines an echo chamber as an environment where someone mainly encounters information and opinions that reinforce what they already believe. This can make it difficult to consider opposing viewpoints or handle complex topics. Echo chambers are fueled by confirmation bias—the tendency to favor information that supports existing beliefs. The internet makes them especially easy to fall into, since social media and online communities connect people with like-minded perspectives. To avoid echo chambers, the video suggests checking multiple news sources, engaging with diverse perspectives, and grounding discussions in facts and respect.
The video clearly introduces echo chambers, but like the filter bubble video, it places the burden on individuals to “check multiple sources” or “interact with different perspectives.” What it leaves underexplored are the structural forces that encourage echo chambers online—platforms that profit from outrage and engagement, or recommendation systems that amplify like-minded content. Echo chambers are not only personal habits but also the result of systemic incentives that shape how conversations unfold. Recognizing this helps students see that while individual actions matter, the design of platforms also plays a powerful role.
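One way to see how much platform design matters is to rank the same set of posts under two different objectives. The sketch below is purely hypothetical: the posts, stance labels, engagement scores, and the “diversity bonus” are all invented, and no real platform works exactly this way. Still, it shows how the same user data can yield a narrower or broader feed depending on what the designers choose to optimize.

```python
# Two hypothetical ranking rules applied to the same candidate posts.
# The engagement scores and stance labels are made up for illustration.

posts = [
    {"title": "Outraged take you already agree with", "stance": "agrees",    "engagement": 0.95},
    {"title": "Calm explainer from the other side",   "stance": "disagrees", "engagement": 0.45},
    {"title": "Another agreeable hot take",           "stance": "agrees",    "engagement": 0.90},
    {"title": "Fact-check of a viral claim",          "stance": "neutral",   "engagement": 0.55},
]

def rank_by_engagement(posts, k=3):
    """Objective A: show whatever is predicted to keep you clicking."""
    return sorted(posts, key=lambda p: p["engagement"], reverse=True)[:k]

def rank_with_diversity(posts, k=3, bonus=0.5):
    """Objective B: same scores, plus a bonus for stances not shown yet."""
    feed, seen, remaining = [], set(), list(posts)
    while remaining and len(feed) < k:
        best = max(remaining,
                   key=lambda p: p["engagement"] + (bonus if p["stance"] not in seen else 0.0))
        feed.append(best)
        seen.add(best["stance"])
        remaining.remove(best)
    return feed

print([p["title"] for p in rank_by_engagement(posts)])
print([p["title"] for p in rank_with_diversity(posts)])
```

The engagement-only feed never surfaces the opposing view; the second rule does, using exactly the same data about the user.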
In her book Algorithms of Oppression, Safiya Umoja Noble explains how search engines—often assumed to be neutral and reliable—actually reflect and reinforce systemic racism and sexism. Drawing from her own background in corporate advertising, Noble shows how search results are shaped by money, branding, and advertising priorities rather than fairness or accuracy. She highlights how harmful stereotypes, such as the sexualized misrepresentation of women of color, rise to the top of search results and gain the appearance of credibility. Noble argues that algorithms are not just math but automated decisions that amplify certain voices while silencing others, often deepening the marginalization of people already oppressed in society. Her work calls for alternatives to commercial search, such as public-interest platforms designed with different ethics, reminding us to question the idea that “just Googling it” provides an objective truth.
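A simplified, hypothetical scoring example can make the commercial-influence argument easier to see. The pages, relevance scores, and “ad value” figures below are invented, and this is not how any real search engine computes its rankings; the point is only that once commercial weight enters the score, the results that surface first are no longer the most relevant or most credible ones.

```python
# Hypothetical illustration of mixing commercial value into a relevance
# ranking. All pages and numbers are invented for this example.

results = [
    {"page": "Community archive on the topic",      "relevance": 0.90, "ad_value": 0.05},
    {"page": "Academic overview article",           "relevance": 0.85, "ad_value": 0.10},
    {"page": "Heavily optimized commercial site",   "relevance": 0.55, "ad_value": 0.90},
    {"page": "Clickbait aggregator covered in ads", "relevance": 0.50, "ad_value": 0.80},
]

def rank(results, commercial_weight=0.0):
    """Score each page as relevance + commercial_weight * ad_value."""
    return sorted(results,
                  key=lambda r: r["relevance"] + commercial_weight * r["ad_value"],
                  reverse=True)

print("Relevance only:       ", [r["page"] for r in rank(results)])
print("Commercially weighted:", [r["page"] for r in rank(results, commercial_weight=0.6)])
```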
Algorithms don’t just shape what you see on search engines—they also influence what shows up in your social media feeds. The same topic can look very different depending on where you search and what data platforms use to personalize your results. This activity will help you experience these differences firsthand and reflect on how algorithms affect the visibility of information across platforms.
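As a preview of what you are likely to notice, the short simulation below ranks the same set of pages for two imaginary platforms that hold different data about the same person. Everything in it is made up, but it mirrors the pattern the activity asks you to look for: the same topic surfaces differently depending on what each platform thinks it knows about you.

```python
# Made-up simulation: one topic, four pages, two imaginary platforms
# holding different guesses about the same user's interests.

pages = [
    "Scientific report",
    "Opinion column",
    "Viral video clip",
    "Local newspaper story",
]

platform_profiles = {
    "Platform A (knows you watch videos)": {"Viral video clip": 0.9, "Opinion column": 0.4},
    "Platform B (knows you read news)":    {"Local newspaper story": 0.8, "Scientific report": 0.7},
}

for platform, profile in platform_profiles.items():
    ranked = sorted(pages, key=lambda page: profile.get(page, 0.0), reverse=True)
    print(platform, "->", ranked)
```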