Manipulative algorithms and addictive design: Summing up what’s wrong with social media

Social media’s twofold problem: manipulative algorithms and addictive design

After posting a five-part article on social media reform, as well as publishing a couple of editorials on this topic (one highlighting the problem with social media, the other looking at solutions to this problem), I thought it would be helpful to sum up these thoughts. So, let’s recap what’s wrong with several social media platforms (and by wrong, I’m referring to ethical concerns). Simply put, there’s a twofold problem: manipulative algorithms and addictive design.

Manipulative algorithms and addictive design are what run many of today’s social media platforms (Image Source: Ibrahim.ID / CC BY-SA 4.0 via Wikimedia Commons)

Problem 1: Manipulative algorithms

From online extremism that fueled the 2021 Capitol attack to vaccine misinformation that worsened the coronavirus pandemic, it’s obvious that social media platforms like Facebook and Twitter have amplified a lot of harmful and misleading information. To suggest these platforms opened up a can of worms would be an insult to worms. Indeed, there seems to be no shortage of outrageous, erroneous content going viral on social media nowadays.

To date, social media companies have responded by trying to screen and remove objectionable posts or lies. It’s a practice commonly known as content moderation. To be sure, this approach is often necessary to prevent egregious content that incites violence (a.k.a. dangerous speech). And yet, trying to moderate all other kinds of outrageous, erroneous content (such as hate speech and fake news) has remained an endless game of whack-a-mole.

To understand why, consider what the content moderators are up against: the social media algorithms that distribute content online. On one hand, these algorithms make platforms like Facebook and Twitter incredibly effective for advertising and sharing. On the other hand, those algorithms make social media practically impossible to moderate.

That’s because the algorithms spreading content on social media work much faster than the content moderators trying to screen and remove it, including extreme or false info. To see why content moderators are fighting a losing battle, consider what social media algorithms are designed to do.

What social media algorithms are designed to do

  • First, hijack your attention the second you log on to social media. For example, place click-bait ads or viral content into your social media feed.
  • Next, collect all the data you happen to leave traces of online. In addition to your comments and messages, such data includes your ‘likes’ and the amount of time you spend scrolling and glancing at ads or content.
  • Then, sell access to your data to advertisers and outside parties. In turn, advertisers and outside parties can use that info to target you with even more ads or content, especially if those ads and content keep you ‘liking’ and scrolling endlessly through your social media feed.
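To make the logic of these steps concrete, here is a deliberately simplified sketch of an engagement-driven feed ranker. All names and weights here are hypothetical; real platform ranking systems are vastly more complex and proprietary. The point is only to illustrate the incentive: content is surfaced by how well it holds attention, not by how accurate it is.

```python
from dataclasses import dataclass

# Toy model of an engagement-driven feed (illustrative only; the
# fields, weights, and functions below are hypothetical, not any
# platform's actual code).

@dataclass
class Post:
    text: str
    clicks: int            # how often users clicked or 'liked' it
    dwell_seconds: float   # time users spent looking at it

def engagement_score(post: Post) -> float:
    # Hypothetical scoring rule: reward whatever held attention,
    # regardless of whether the content is accurate.
    return post.clicks * 1.0 + post.dwell_seconds * 0.5

def rank_feed(posts: list[Post]) -> list[Post]:
    # Surface the most 'engaging' posts first -- this is the loop
    # that keeps users scrolling; truthfulness never enters the score.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Calm, accurate report", clicks=10, dwell_seconds=30),
    Post("Outrageous viral claim", clicks=90, dwell_seconds=120),
])
print(feed[0].text)  # the outrageous post wins the top slot
```

Even in this toy version, the outrageous post outranks the accurate one, because the only signal being optimized is attention.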

By monitoring people’s online behavior, harvesting their private data, and manipulating what they see online (a form of manipulation known as “surveillance capitalism”), social media algorithms work 24/7 to keep users addicted to ‘liking’ and scrolling through endless ads and content. Which brings us to the second problem with social media.

Problem 2: Addictive design

As mentioned, social media algorithms manipulate what people see online in order to keep them addicted to ‘liking’ and scrolling through nonstop ads and content. Hence, it’s not just the manipulative algorithms that are a problem but also the addictive design of social media. Two prominent examples of addictive design are the ‘like’ button and the infinite scroll.

What ‘like’ buttons and infinite scrolling are designed to do

The ‘like’ button is a clear example of an addictive design feature. The thrill—and disappointment—of constantly checking to see how many ‘likes’ you receive on social media is an activity that easily becomes addictive. It’s not unlike the high and low feelings you can get from gambling.

If you’re young, feeling vulnerable, or craving social affirmation, this thrill and disappointment can feel particularly distressing. If your posts aren’t ‘liked’ by others, the lack of validation might feel like social denunciation, as if nobody cares to hear what you have to say.

In fact, much research shows that the distress induced by social media has been harmful to teens, especially girls. Moreover, social media companies like Facebook have conducted research that reveals similar findings (even though Facebook downplayed those harmful effects).

Likewise, social media platforms may become addictive because they work like a game that never ends. Specifically, we’re referring to the infinite scroll feature, such as the Facebook News Feed or Twitter timeline. The infinite scroll (as well as similar features like ‘auto refresh’) can easily give rise to “doomscrolling”: endlessly and mindlessly scrolling through online content.

Perhaps because it’s such a blatant form of addictive behavior, it’s no secret that doomscrolling negatively affects our mental health. As digital minimalists like Joshua Fields Millburn have pointed out, “scrolling is the new smoking.” 

Problems with manipulative algorithms and addictive design

Unfortunately, a problem with manipulative algorithms and addictive design is that they can incentivize lots of outrageous, erroneous content. After all, what manipulates users on social media and keeps them addicted to ‘liking’ and scrolling is often what’s outrageous. And what’s outrageous needn’t have any relationship to what’s true.

The result is a society vulnerable to bad actors, including cult personalities who spread extremism and misinformation online. Indeed, as documentaries like The Social Dilemma illustrate, the manipulative algorithms and addictive design of social media helped bring us to this precarious situation.

Ways to mitigate manipulative algorithms and addictive design

Therefore, if we want to mitigate extremism and misinformation on social media, it’s clear that we need to change the manipulative algorithms and addictive design that characterize several social networking sites. The good news is that this problem is solvable. For instance, here are a couple of ways we can mitigate the problem of manipulative algorithms and addictive design.

Put a check on manipulative algorithms by implementing personal privacy and data protection rules

Regarding manipulative algorithms, we can implement laws and policies that enhance personal privacy and data protection. These rules could put a reasonable check on how social media algorithms monitor people’s online behavior, harvest their private data, and manipulate what they see online.

For example, personal privacy and data protection rules could limit how much private data gets collected by social media companies. These rules could also restrict how the data can be accessed or used by advertisers and outside parties. Ultimately, the goal of personal privacy and data protection is to give individuals more control over their own private data.

Eliminate addictive design features on social media

Regarding addictive design, social media companies like Facebook or Twitter can redesign their platforms by getting rid of the addictive features, such as the ‘like’ button and infinite scroll.

Surely, redesigning social media in this way would be a step in the right direction. Instead of platforms that addict users to outrageous, erroneous content, we could have platforms that help people find credible, accurate information.

Why care about redesigning social media by changing manipulative algorithms and addictive design?

Before wrapping up this discussion, let’s preempt a final question. Why care about reforming or redesigning social media by changing the manipulative algorithms and addictive design? Well, just think about this topic from the standpoint of Internet freedom and individual control, two things we all surely value.

Manipulative algorithms vs. internet freedom

As long as we lack personal privacy and data protection online, we’re letting manipulative algorithms monitor our online behavior, harvest our private data, and determine what we can see (and what we can’t see) on the Internet. In other words, there is no Internet freedom—in particular, freedom to control our own data and online information—without personal privacy and data protection.

Addictive design vs. individual control over technology

Likewise, there is no individual control over technology if it’s addictive by design. Of course, if someone is addicted to a product, service, or technology, it’s possible (but probably not reasonable) to put the full burden of overcoming that addiction on the user. For instance, if someone feels addicted to cigarettes, we could say, “Just stop smoking!” Likewise, if someone feels addicted to social media, we could say, “Just stop using social media!”

Then again, the point of acknowledging such behavior as addictive is to realize it’s out of someone’s control. (Or, at the very least, it’s out of someone’s control to a significant degree—otherwise, it wouldn’t be addictive.) In other words, if something is addictive by design, it’s unreasonable to put the onus of overcoming addiction solely on the addict. The professionals responsible for that addictive design also bear responsibility to redesign it in the best interest of the user.

 

2 thoughts on “Manipulative algorithms and addictive design: Summing up what’s wrong with social media”

  1. Hello,

    This is a good sum up of some of the main problems with social media.

    The reason I’m making this comment is to ask you if you think the best stance to hold towards social media (YouTube included), granted how it works today, is to fully abstain. What do you think?

    Furthermore, I’m wondering if you think that social affirmation is an inevitable need human beings have. You write, for instance, that “if your posts aren’t ‘liked’ by others, the lack of validation might feel like social denunciation, as if nobody cares to hear what you have to say.” This, of course, will matter to you and have an effect on you only if you value and care about others caring to hear what you have to say—only if you base your own sense of worth on the opinion of others; of all the others. I agree that this is the case today for a lot of people — to a degree, maybe for all people — but do you think it is inevitable?

    I know that this is kind of an anthropological – philosophical question, but essentially I’m wondering if the whole ‘like’ feature inevitably has negative consequences because of how we are, or if it can be helpful, in the sense of being informative for instance.

    • Thanks for your perceptive thoughts and questions!

      Regarding whether or not the best stance toward social media is to fully abstain, my own belief is that people have to decide for themselves. While my decision was to abstain from most social media (such as Facebook, Twitter, Instagram, etc., although I make an exception for LinkedIn), I also recognize that many people may need to use these platforms for business purposes or network effects. So at this point, I think the idea of just telling people not to use social media is a nonstarter. On an individual level, people have to decide for themselves … and many may be better off without social media! But on a social level, it seems more constructive to figure out how to reform these technologies for the better.

      Your second question gets at an incredibly important (indeed, philosophical and anthropological) point: the need for social affirmation and interpersonal connection. Yes, I do think it’s clear that human beings are social animals with a need for interpersonal connection and affirmation. When it comes to social media, however, the question about connection and affirmation revolves around the issue of quantity versus quality. For instance, the ‘like’ button is all about quantity. Sure, it delivers an expedient dopaminergic high (not unlike the feeling one gets from the thrill of gambling or doing drugs), but it does not result in social fulfillment. For real connection and affirmation, we need quality—i.e., real conversation. The key, then, is likely to redesign these platforms to reward more quality conversation and not mere quantity of ‘likes.’

