There's a new documentary on Netflix called The Social Dilemma that chronicles problems exacerbated by social media platforms like Facebook, Twitter, and YouTube. It has generated some buzz outside of the tech community, where the topics it focuses on have been stewing for years, and it offers damning testimony from industry insiders. I've tried to summarize the most salient points below.

  • Social media companies' customers are advertisers. Their users' attention is their product.
  • It is in social media companies' interest that social media be addictive. More attention paid by users equals more revenue opportunities.
  • On many social media platforms, automated algorithms selectively place content in users' feeds. Ultimately, the objective of these algorithms is to increase user "engagement", or the amount of attention users pay.
  • Social media platforms track users' engagement with content so that algorithmic selection can be individually optimized (a toy sketch of this idea follows the list).
  • Algorithmic selection of content often reflects users' inherent biases because content that triggers those biases is more likely to increase engagement.
  • Because a key element of social media is facilitating connections between people via the content they post, users are more likely to connect with those who share their biases. These connections inevitably strengthen those biases.
  • Many users do not realize that others are not necessarily shown the same content as themselves.
  • Content that elicits emotional reactions has been shown to command more attention and is thus more likely to be selected by algorithms to show to users. Such content more often represents extreme or polarizing opinions than sober facts, and it tends to be short-form, offering little in-depth analysis.
  • The more attention people pay to information on social media platforms, the less they tend to pay to other information sources.
  • Content creators may favor emotionally charged content to leverage algorithmic bias and attract more views/likes/upvotes. This can help them build a following and influence users' ideas and behaviors.
  • Content creators may be profit-motivated if the social media platform offers a share of advertising revenue generated by their content. To users, profit-motivated content may be indistinguishable from other content.
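
The bullets about algorithmic selection are worth making concrete. Below is a minimal, purely illustrative sketch in Python of how an engagement-optimized feed might work; it is not any platform's actual algorithm. The `Post` fields, the `emotional_intensity` signal, and the `predicted_engagement` scoring function are hypothetical stand-ins for the learned models real platforms use. The point is simply that content is scored per user from tracked behavior, so different users see different feeds, and emotionally charged content gets a boost.

```python
# Toy model of engagement-based feed ranking. Illustrative only; not any
# platform's real algorithm, and all signals and weights here are made up.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    topic: str
    emotional_intensity: float  # hypothetical signal: 0.0 (sober) to 1.0 (outrage-inducing)

def predicted_engagement(post: Post, user_history: dict) -> float:
    """Crude stand-in for a learned model: the user's past engagement with the
    post's topic, plus a bonus for emotionally charged content."""
    topic_affinity = user_history.get(post.topic, 0.0)
    return topic_affinity + 0.5 * post.emotional_intensity

def build_feed(posts: list, user_history: dict, k: int = 3) -> list:
    # Rank every candidate by predicted engagement for this specific user,
    # then keep the top k. This is the "individual optimization" from the list above.
    return sorted(posts, key=lambda p: predicted_engagement(p, user_history), reverse=True)[:k]

if __name__ == "__main__":
    posts = [
        Post("a", "local news", 0.1),
        Post("b", "politics", 0.9),
        Post("c", "gardening", 0.2),
    ]
    # Two users with different tracked histories get different feeds.
    print([p.post_id for p in build_feed(posts, {"politics": 0.8, "gardening": 0.1})])  # ['b', 'c', 'a']
    print([p.post_id for p in build_feed(posts, {"gardening": 0.9})])                   # ['c', 'b', 'a']
```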

The combination of the above factors increases the risk that social media users' ideas and behaviors will be manipulated toward strongly biased agendas. Additionally, users' biases are more likely to become entrenched and more difficult to overcome. This can strain relationships between people who don't share biases and inhibit the ability or will to compromise.

These negative impacts on society and individuals are not immediately obvious. Specific instances of manipulation can be hard to measure or perceive, but cumulatively they cause big problems like political and social polarization. The roots of such problems are akin to those of climate change; many will deny them because they are not obvious or because the incentive to benefit from them outweighs the perceived risk. Social media's negative impact is like the proverbial boiling frog: we risk not admitting that we're in trouble until it's too late.

What can be done?

Unfortunately, short of industry regulation, combating this problem comes down to individual choices.

Quitting social media (or never starting to use it) is both the simplest and the hardest thing to do. It may not be practical for some, as their social circle relies on these platforms to stay connected and plan logistics (what social media is supposed to be for). For those who cannot quit, it would be best to use social media only to interact with loved ones; feeds of public content should be avoided.

When trying to become informed about something, information found through social media should be taken with large grains of salt. It's better to seek out credible, well-researched, objective sources. It's also important to be skeptical of information delivered by other sources that depend on advertising revenue, information that provokes an emotional response, and information from sources that tailor content to individual preferences or behaviors. Fox News and CNN, for example, frequently use emotional appeal as a tactic to drive engagement with the news. Google personalizes its search results even if you are not logged in, and it gets most of its revenue from advertising.

Another important habit is remembering to ask questions, even if you're only asking yourself. Taking information at face value is bound to do more harm than good.

Finally, spreading the word is crucial! Tell your friends and family to check out The Social Dilemma, share this blog post, write your own blog post, or take action in other ways. The more people understand these problems, speak up, and change their usage habits, the more likely real change becomes.