Are Social Media Algorithms Too Powerful?


In a world where many will spend hours scrolling through social media feeds every day, few stop to consider the unseen forces guiding them. Our experiences of social media are shaped by complex systems of code that determine exactly what content appears on our screens; these systems are what we mean when we refer to ‘algorithms’. They can influence everything, from the products we buy to the news we consume, and even the opinions we form. As these systems grow increasingly sophisticated, a critical question emerges:
Have social media algorithms become too powerful?

The Evolution of the Feed

Many will remember the days when most social media feeds would simply display posts chronologically. Those days are behind us. Today’s algorithms are essentially sophisticated prediction engines, analysing thousands of data points about your behaviour to determine what will keep you engaged longest.

The algorithm isn’t just deciding what you see; it’s actively shaping your digital experience based on what will generate the most engagement. Every pause on a video, every extra second spent looking at a photo, every reaction and comment becomes data that refines this personalised content delivery system.

The shift from chronological to algorithmic feeds wasn’t accidental. Platforms discovered that personalised content dramatically increases user engagement. What began as a simple recommendation feature has evolved into powerful systems that can predict with uncanny accuracy what will capture and hold your attention.
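The difference between the two kinds of feed can be sketched in a few lines of Python. This is a toy illustration, not any platform’s actual code: the signal names (`predicted_watch_time`, `predicted_reactions`) and the weights are invented to show the idea of ranking by predicted engagement rather than by recency.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int               # seconds since epoch
    predicted_watch_time: float  # a model's estimate, in seconds (hypothetical signal)
    predicted_reactions: float   # a model's estimate of likes/comments (hypothetical signal)

def chronological_feed(posts):
    """The old model: newest first, no prediction involved."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def algorithmic_feed(posts, w_watch=1.0, w_react=5.0):
    """Toy engagement ranking: score each post by predicted engagement
    and surface the highest-scoring posts first."""
    def score(p):
        return w_watch * p.predicted_watch_time + w_react * p.predicted_reactions
    return sorted(posts, key=score, reverse=True)
```

With the same three posts, the two functions can produce entirely different orderings: chronological sorting puts the newest post on top, while the engagement ranking promotes whichever post the (imaginary) model expects you to linger on.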


“…algorithms built into social media platforms filter and prioritize particular types of information. In this way, algorithms can even promote false or harmful information so that such information could go viral and influence people’s perception or way of seeing the world.”

– Lee, T., and C. Jia. (2023). Curse or Cure? The Role of Algorithm in Promoting or Countering Information Disorder. In Information Disorder: Algorithms and Society. Routledge, Taylor & Francis.

The Benefits: A Curated Digital Experience

It’s worth noting that algorithms can address genuine problems. With countless posts being created every day, a purely chronological feed risks overwhelming users with irrelevant content. Algorithmic curation helps us navigate this vast sea of information by highlighting content likely to interest us.

Key benefits include:

  • Content discovery tailored to individual interests and preferences

  • Reduced information overload by filtering out less relevant content

  • Community building around niche interests and shared passions

  • Democratised reach for small creators who can find their audience regardless of initial following

  • Targeted opportunities for businesses to reach potential customers efficiently

  • Personalised learning through content that matches your knowledge needs

  • Time efficiency by prioritising content most likely to be valuable to you

When functioning this way, algorithms can connect us with ideas, products, and people that may genuinely enrich our lives. They can help us discover new music that matches our taste, find communities that share our niche interests, or learn about opportunities we might otherwise miss.

The Downsides: Manipulation and Division

However, the power of these systems comes with significant downsides. Algorithms optimised for engagement tend to promote content that triggers strong emotional reactions. This can lead to increasingly polarised feeds that reinforce existing beliefs rather than challenging them.

Concerning aspects include:

  • Filter bubbles that limit exposure to diverse viewpoints

  • Amplification of extreme content that generates strong emotional reactions

  • Misinformation spread as sensational false content often outperforms nuanced truth

  • Addiction by design through features that exploit psychological vulnerabilities

  • Decreased attention spans from constant exposure to short-form, high-stimulation content

  • Privacy concerns from the extensive data collection required for personalisation

  • Mental health impacts including increased anxiety, depression, and social comparison

  • Manipulation vulnerability as systems can be gamed by bad actors

The ‘filter bubble’ effect means many users rarely encounter viewpoints that contradict their existing beliefs. Over time, this algorithmic reinforcement can deepen societal divisions and make constructive dialogue across these divisions increasingly difficult.
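The reinforcement loop behind the filter bubble can be modelled with a small simulation. This is a deliberately crude sketch under invented assumptions: content items each have a ‘viewpoint’ on a one-dimensional scale from −1 to 1, the engagement-optimised recommender always serves the item closest to the user’s current position, and each item consumed nudges the user slightly toward it.

```python
import random
import statistics

def simulate(policy, steps=500, seed=1):
    """Serve one item per step from 20 random 'viewpoints' in [-1, 1].

    policy='similar' mimics engagement-optimised ranking (serve the item
    closest to the user's current position); policy='random' mimics an
    unpersonalised feed. Returns the spread (standard deviation) of the
    viewpoints the user actually saw.
    """
    rng = random.Random(seed)
    position, seen = 0.0, []
    for _ in range(steps):
        candidates = [rng.uniform(-1, 1) for _ in range(20)]
        if policy == "similar":
            item = min(candidates, key=lambda v: abs(v - position))
        else:
            item = rng.choice(candidates)
        position = 0.9 * position + 0.1 * item  # belief drifts toward consumed content
        seen.append(item)
    return statistics.stdev(seen)
```

Running both policies shows the effect: the similarity-driven feed exposes the simulated user to a far narrower band of viewpoints than the unpersonalised one, even though every item was drawn from the same pool.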


“One of the most critical aspects of the attention economy is its connection to the rise of artificial intelligence. Embedded in everything from social media algorithms to news feeds, AI systems curate content based on user preferences, learning from each interaction to better predict and manipulate what users will engage with next”

– Feher, K. (2025). Generative AI, Media, and Society. Routledge, Taylor & Francis.

The Attention Economy

Perhaps most troubling is how algorithms have transformed attention into a commodity. Social media platforms aren’t truly free: we pay with our attention, which is packaged and sold to advertisers.

Algorithms designed to maximise ‘time spent’ often exploit psychological vulnerabilities. The endless scroll, autoplay features, and perfectly timed dopamine hits from notifications aren’t accidents; they’re carefully engineered features designed to keep us engaged for longer.

Regaining Control

Despite these concerns, we aren’t powerless against algorithmic influence. Digital literacy offers the first line of defence. Understanding how these systems work allows us to engage more intentionally with our feeds.

Practical steps include:

  • Regularly auditing and cleaning up your follow lists

  • Setting time limits for social media usage

  • Intentionally seeking out diverse viewpoints

  • Taking periodic breaks from algorithmic feeds

Some platforms have begun offering more user control in response to public pressure. Options to view feeds chronologically, mute certain topics, or see why posts are recommended represent small but important steps toward transparency.

Regulatory approaches are also emerging globally. The EU’s Digital Services Act requires more transparency around recommendation systems, while other jurisdictions are exploring similar measures to ensure algorithmic accountability.


“When users decide to share personal data, they are often not fully aware of the information analyses that may reveal more about them than is apparent at first sight (cf. the three categories of data on social media introduced in see section “Data practices of social media and risks to individuals”). Providers use algorithms to gain more information about the users, especially to learn which advertisements and which advertising strategies may be most successful.”

– Bieker, F., and M. Hansen. (2023). Consumer Privacy and Data Protection in the EU. In The Routledge Handbook of Privacy and Social Media. Routledge, Taylor & Francis.

Finding Balance

The question isn’t whether algorithms should exist, as they’re necessary to navigate today’s information landscape. Rather, we should consider who controls these algorithms, what values they embody, and how transparent they are.

The most powerful algorithms aren’t necessarily those with the most complex code, but those that strike a balance between personalisation and discovery, between engagement and wellbeing. As users, we deserve systems that serve our genuine interests rather than exploiting our vulnerabilities.

Perhaps the solution lies not in abandoning algorithmic curation, but in demanding systems that are optimised for user satisfaction rather than engagement time, meaningful connection rather than attention capture, and information quality rather than emotional reaction.

As we navigate this algorithmic age, it becomes clear that the power to shape our digital experiences shouldn’t lie solely with platforms whose financial incentives may not align with our wellbeing. By understanding these systems and demanding better, we can work toward a future where algorithms serve as beneficial tools rather than engines of manipulation.

The algorithms themselves aren’t too powerful, but the unchecked influence they currently wield may well be.

Browse our collection

For more on social media and algorithms, look to our expansive array of books on the topic.
