
GEM: How Algorithms Shape What You See

This online GEM lesson is part of the Mapping the Information Landscape Microcourse.

What are algorithms, and what role do they play in the information you see?

When you type a question into Google or scroll through your social media feed, you aren’t seeing information at random. Behind the scenes, algorithms—step-by-step sets of rules and calculations—are deciding what to show you first, what to hide, and how to rank results. These algorithms consider things like your past searches, your location, what other people clicked on, and even how long you pause on a page.
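To make the idea concrete, here is a small, purely illustrative Python sketch of what a "step-by-step set of rules" can look like. The pages, signals, and ordering logic are invented for this example; no real search engine or feed works exactly this way.

```python
# A toy "algorithm": a fixed sequence of steps that decides what to show first.
# All pages and signals here are invented for illustration only.

pages = [
    {"title": "Local hiking trails", "clicks": 120, "matches_query": True},
    {"title": "Hiking boots on sale", "clicks": 300, "matches_query": True},
    {"title": "History of hiking", "clicks": 45, "matches_query": False},
]

def rank(pages):
    # Step 1: keep only pages that match the search terms at all.
    candidates = [p for p in pages if p["matches_query"]]
    # Step 2: order the remaining pages by how often other people clicked them.
    candidates.sort(key=lambda p: p["clicks"], reverse=True)
    # Step 3: return the ordered list, which becomes the "results" a user sees.
    return candidates

for page in rank(pages):
    print(page["title"])
```

Even in this tiny sketch, someone had to decide that clicks are what "useful" means, which is exactly the kind of hidden choice this lesson asks you to notice.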

In this lesson, you’ll start by learning what an algorithm is and how it works in everyday tools like search engines. The video below will give you a clear, simple explanation before we dive deeper into how algorithms shape the information you encounter.



How does Google determine your results?

According to Google, its search engine works by using ranking systems—or algorithms—to sift through billions of web pages and organize them by relevance to your query. These systems weigh factors like the words you type, how recently a page was published, and whether other users found it useful. On top of the traditional "blue links," Google adds features like autocomplete (to predict your search), featured snippets (to highlight a likely answer), knowledge panels (to provide quick facts), and now AI Overviews. The company frames these tools as ways to make your search results faster, more reliable, and more helpful.
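The description above boils down to combining several weighted signals into a single relevance score. The sketch below is a hypothetical illustration of that general idea; the factors, weights, and numbers are made up and do not reflect Google's actual ranking systems.

```python
# Hypothetical weighted-signal ranking: each factor contributes to one score.
# The signals and weights are invented; real systems use far more of both.

def score(page, weights):
    return (
        weights["query_match"] * page["query_match"]   # how well the words match
        + weights["freshness"]  * page["freshness"]     # how recently it was published
        + weights["usefulness"] * page["usefulness"]    # how useful users seemed to find it
    )

weights = {"query_match": 0.5, "freshness": 0.2, "usefulness": 0.3}

pages = [
    {"url": "a.example", "query_match": 0.9, "freshness": 0.2, "usefulness": 0.7},
    {"url": "b.example", "query_match": 0.6, "freshness": 0.9, "usefulness": 0.8},
]

# Whoever sets the weights decides what "relevant" means; that choice is the
# editorial power the next paragraph asks you to question.
for page in sorted(pages, key=lambda p: score(p, weights), reverse=True):
    print(page["url"], round(score(page, weights), 2))
```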

At the same time, this explanation leaves out critical details. Google defines what counts as "reliable and useful," and those decisions are influenced by business models, advertising, and systemic biases embedded in ranking systems. By stressing that the scale of the web makes human review impossible, Google shifts attention away from the fact that its engineers and policy teams still shape the rules of what rises to the top. So while the video presents search as a neutral, benevolent service, it's important to ask: whose perspectives are amplified, whose are hidden, and who benefits from the way Google orders information?


Other Algorithms and How They Curate Your Feed

Search engines aren’t the only places where algorithms shape what we see. Social media and video platforms—like Instagram, TikTok, and YouTube—use recommendation systems to decide what shows up in your feed, what trends rise, and which voices get amplified. These systems are designed to maximize engagement, but they also influence culture, politics, and personal identity in powerful ways. In this section, we’ll look at how different platforms curate content and consider how those choices affect what information we encounter—and what we might never see.


👉 Explore Instagram's Algorithms

Read through Instagram's own 2023 blog post about its ranking algorithms. Think about changes since that year (like generative AI) that could have affected these algorithms.

Summary: Instagram emphasizes that it doesn’t rely on a single “master algorithm.” Instead, each part of the app—Feed, Stories, Explore, Reels, and Search—has its own ranking system tailored to how people use it. These systems weigh thousands of “signals” to decide what you’ll see first: your past activity (likes, comments, saves), details about the post itself (how popular it is, when it was posted), information about the person who posted, and your history of interaction with them. Based on these signals, Instagram predicts how likely you are to spend time on a post, like it, comment, share it, or tap on a profile—and then ranks content accordingly.
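To make the "signals, then predictions, then ranking" flow more tangible, here is a deliberately simplified Python sketch. The signals, weights, and accounts are invented for illustration and are not Instagram's actual models.

```python
# Illustrative only: rank posts by predicted engagement, roughly in the spirit
# of the "signals -> predictions -> ranking" flow described above.
# The prediction function and all numbers are invented for this sketch.

def predicted_engagement(post, viewer):
    # A made-up stand-in for the many models that estimate how likely you are
    # to like, comment on, share, or linger on a post.
    affinity = viewer["past_interactions"].get(post["author"], 0)  # your history with this account
    return 0.6 * post["popularity"] + 0.4 * affinity

viewer = {"past_interactions": {"@friend": 0.9, "@brand": 0.1}}

posts = [
    {"author": "@friend", "popularity": 0.3},
    {"author": "@brand", "popularity": 0.8},
    {"author": "@stranger", "popularity": 0.6},
]

# Feed order = posts sorted by predicted engagement, highest first.
for post in sorted(posts, key=lambda p: predicted_engagement(p, viewer), reverse=True):
    print(post["author"])
```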

Critique: Instagram frames this personalization as a way to “maximize your experience” and give creators the chance to reach audiences more effectively. But it’s important to read this critically. The blog positions ranking as neutral and user-centered, yet the logic of prioritizing engagement inevitably benefits certain types of content—often highly visual, trendy, or emotionally charged posts that keep people on the app longer. Meanwhile, slower, niche, or dissenting content can get buried. And while Instagram claims not to “shadowban,” it admits that content can quietly become ineligible for recommendation based on opaque guidelines.

Reflection: If Instagram’s ranking systems are designed to keep you engaged, how might that goal shape not just what you see, but also what kinds of voices and perspectives you don’t?


Watch this 2023 post by Adam Mosseri (the head of Instagram) talking about the platform's algorithm.  As you watch, keep in mind: Mosseri presents Instagram’s ranking as user-centered and neutral, but his explanation deserves critical reflection. Consider what isn’t being said—how Instagram’s business model, advertising priorities, or assumptions about “engaging” content may shape what rises in your feed and what gets pushed aside.

A post shared by Adam Mosseri (@mosseri)


Watch this interview with Mosseri from May 2025, which gives more insight into where Instagram is this year. The interview covers a wide range of topics, but Mosseri talks about the algorithm at the 44:16 mark.


👉 Explore TikTok's Algorithms

Read TikTok's resources discussing the platform's recommender systems and algorithmic curation: "How TikTok recommends content," "How TikTok recommends videos #ForYou," and "TikTok Creator Academy: Search."

Summary: TikTok describes its “For You” feed as a personalized stream of videos designed to match each user’s unique interests. When you first join, TikTok may ask you to select categories (like pets or travel) to seed recommendations. After that, the system learns from your interactions—likes, shares, comments, watch time, and even skips—to predict what you’ll want to see next. It also takes into account content information (such as sounds, captions, and hashtags) and user information (like device type, language, and location). While follower counts don’t directly determine recommendations, TikTok acknowledges that creators with larger audiences are more likely to reach more viewers. Over time, your feed is continually reshaped by your choices and signals you send through engagement.
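One way to picture the feedback loop described above is with a tiny, hypothetical sketch: engagement updates an interest profile, and the profile reorders what comes next. The categories, numbers, and update rule are invented and are not TikTok's real system.

```python
# Illustrative feedback loop: engagement signals reshape what gets recommended
# next. Categories, numbers, and the update rule are invented for this sketch.

interests = {"pets": 0.5, "travel": 0.5, "news": 0.5}   # seeded when you first join

def record_watch(category, watch_fraction):
    # Watching most of a video nudges that interest up; skipping nudges it down.
    interests[category] += 0.2 * (watch_fraction - 0.5)

def next_videos(candidates):
    # Recommend the candidates whose category you currently score highest.
    return sorted(candidates, key=lambda v: interests[v["category"]], reverse=True)

record_watch("pets", 0.95)   # watched nearly the whole video
record_watch("news", 0.10)   # skipped quickly

candidates = [
    {"title": "Cat fails", "category": "pets"},
    {"title": "Budget flights", "category": "travel"},
    {"title": "Election recap", "category": "news"},
]
print([v["title"] for v in next_videos(candidates)])
```

Notice that after only two viewing choices, the toy feed already reorders itself; over thousands of interactions, that is how a "For You" page drifts toward whatever holds your attention.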

Critique: TikTok frames its recommendation system as joyful, creative, and user-driven—but there’s more going on. Optimizing for “engagement” means that content that keeps people watching—catchy, emotional, sensational, or polarizing videos—may be favored over slower or more nuanced material. TikTok does note the risk of “filter bubbles,” but this admission underplays how structural biases can amplify certain voices and marginalize others. Moreover, the system’s reliance on behavioral data—what you watch, how long you linger, who you follow—means that your personal attention is continually harvested and monetized. While TikTok highlights its safety measures and diversity efforts, it is ultimately a for-profit company whose business model depends on keeping you scrolling.

Reflection: As you think about your own TikTok use, consider: What kinds of content does TikTok prioritize in your feed? Are there patterns in the videos you see—same sounds, same aesthetics, same trending topics? Which voices and perspectives are harder to find? If TikTok’s goal is to keep you engaged, how might that shape not just your entertainment choices but also your exposure to culture, politics, or social issues?


This video by Modern Millie gives a creator’s perspective on how TikTok’s algorithm works, with practical advice for using that knowledge to grow an audience. It’s important to note, though, that Millie is also selling content strategy courses. Her focus is on how creators can optimize their videos for visibility—framing success in terms of adopting strategies that “work” with the algorithm. This perspective implicitly emphasizes homogenizing content around formulas that maximize reach and engagement, which raises critical questions: if creators all adapt to the same strategies, how does that affect creativity, diversity of voices, and the kinds of content that get visibility on the platform?


👉 Explore Bluesky's "Marketplace of Algorithms"

Read about Bluesky and its different approach to algorithmic curation: "Algorithmic choice" and "About Bluesky."

Summary: Bluesky is a new kind of social media app that aims to work differently from platforms like Instagram or TikTok. Instead of one company controlling what you see, Bluesky lets users choose from many different “feeds” (curated timelines), often built by outside developers or communities. For example, there might be a science feed moderated by scientists, or a news feed curated by journalists. This is possible because Bluesky is built on the AT Protocol—an open framework that allows people to move their accounts, posts, and followers between different apps. In practice, this means if you don’t like the way Bluesky organizes your feed, you can switch to a different feed—or even a different app built on the same system (for example, Skylight, a short-form video app)—without losing your identity or connections. If you log in to Skylight with your Bluesky username, your followers and posts carry over automatically—so you don’t have to start from scratch.
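Here is a small, hypothetical Python sketch of the "marketplace of algorithms" idea: several independently written feed functions, with the user choosing which one orders the timeline. The feeds and posts are invented, and this is not the actual AT Protocol or Bluesky feed-generator interface.

```python
# Illustrative "marketplace of algorithms": several independently built feed
# functions, and the user picks which one orders their timeline.
# Everything below is invented for this sketch.

posts = [
    {"text": "New exoplanet found", "topic": "science", "likes": 40},
    {"text": "Election results tonight", "topic": "news", "likes": 90},
    {"text": "My cat, again", "topic": "pets", "likes": 300},
]

feeds = {
    # A community-run science feed: science posts only.
    "science-feed": lambda ps: [p for p in ps if p["topic"] == "science"],
    # A journalist-curated news feed.
    "news-feed": lambda ps: [p for p in ps if p["topic"] == "news"],
    # A default "what's hot" feed: everything, sorted by likes.
    "whats-hot": lambda ps: sorted(ps, key=lambda p: p["likes"], reverse=True),
}

chosen = "science-feed"   # the user, not the platform, picks the ranking logic
print([p["text"] for p in feeds[chosen](posts)])
```

The design choice to contrast with Instagram or TikTok is that the ranking logic lives outside any one company's control; the critique that follows asks whether most users will ever exercise that choice.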

Critique: Bluesky presents itself as giving power back to users, framing its system as a “choose your own adventure” alternative to the top-down control of Facebook, X/Twitter, or TikTok. But this openness also brings challenges. The idea of a “marketplace of algorithms” assumes that people have the time, knowledge, and interest to choose or even build their own feeds—something many users may not do. In practice, most people will still rely on default feeds, which means Bluesky (the company) still has influence over what gets seen first. Moreover, while the app resists traditional advertising models, it will still need to make money, and it isn’t clear how future business decisions might affect users’ control. Finally, while “freedom of speech, not reach” sounds empowering, it may make moderation uneven, raising questions about how misinformation, hate speech, or harmful content will circulate in this open ecosystem.

Reflection: Bluesky asks us to imagine social media without a single authority deciding what we see. That’s exciting—but it also shifts responsibility onto users to customize their own experience. Consider: Would you want to manage your own feed choices, or do you prefer the convenience of a single algorithm making those decisions for you? What might be gained—or lost—when power shifts away from centralized companies and into a patchwork of communities, developers, and individual users?


In this interview, Jay Graber—CEO of Bluesky—explains how the platform is trying to build a different kind of social media. She discusses Bluesky’s open design, where users can move their accounts and followers between apps (like Bluesky and Skylight), customize their feeds, and even build community-run timelines. Graber also addresses challenges around moderation, free speech, and how Bluesky might sustain itself financially without relying on traditional advertising models. As you watch, think about how her vision of a decentralized, “choose your own adventure” internet compares to the centralized control of platforms like Instagram, TikTok, or YouTube.


This CNBC segment from January 2025 introduces Bluesky as a fast-growing alternative to X (formerly Twitter), with millions of new users joining after shifts in other platforms. The video highlights Bluesky’s decentralized design, its promise to give users more control over their feeds and data, and the challenges of building a sustainable model without relying on ads. Since then, Bluesky has continued to grow and face new controversies, raising questions about how well its “user-first” vision holds up in practice. As you watch, consider what the CNBC piece emphasizes—and what you might want to explore further about Bluesky’s current state of affairs.