Facebook is disputing the claims made in “The Social Dilemma,” an investigative documentary from Netflix that explores the ways social networks, including Facebook, are built to be addictive, drive polarization and promote misinformation.

At the core of Facebook’s rebuttal, “What ‘The Social Dilemma’ Gets Wrong,” is the argument that the documentary offers a sensationalized view of how social platforms work, delivered through insights and commentary from former employees of tech giants who haven’t been on the “inside” for many years.

Against a backdrop of perturbed users who say they’ve considered deleting their Facebook and Instagram accounts after watching the documentary, Facebook is seeking to absolve itself of any wrongdoing by outlining the steps it’s taken in recent years to quell critics’ complaints. 

The amount of time people spend on social media has only increased since the pandemic, with 48 percent of global consumers saying they’re using social media more. 

According to the filmmakers, this addiction is the direct result of social media companies like Facebook building features designed to increase the time users spend on their products.

Facebook’s response: “Instead, we want to make sure we offer value to people, not just drive usage.” To that end, it says it changed its News Feed ranking in 2018 to prioritize meaningful social interactions over things like viral videos. Countering the claim further, Facebook says it gives users control over how they use its products through time-management tools such as an activity dashboard and notification limits.
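Facebook has never published its actual ranking model, but the general idea of scoring posts by predicted “meaningful” interaction rather than raw virality can be sketched in a few lines. Everything below (the signals, the weights, the function names) is hypothetical, invented purely to illustrate the concept:

```python
from dataclasses import dataclass

@dataclass
class Post:
    # All signals here are hypothetical stand-ins, not Facebook's real features
    from_friend: bool       # posted by a friend or family member
    pred_comments: float    # model's predicted number of comments
    pred_shares: float      # model's predicted number of reshares
    pred_watch_secs: float  # predicted passive watch time (e.g., a viral video)

def msi_score(post: Post) -> float:
    """Toy 'meaningful social interaction' score: person-to-person
    interactions weigh far more than passive consumption. Weights invented."""
    score = 3.0 * post.pred_comments + 2.0 * post.pred_shares
    if post.from_friend:
        score *= 1.5  # boost friends-and-family content
    return score + 0.01 * post.pred_watch_secs  # passive signals count little

def rank_feed(posts: list[Post]) -> list[Post]:
    # The highest "meaningful interaction" score appears first in the feed
    return sorted(posts, key=msi_score, reverse=True)
```

The design choice the sketch captures is the one Facebook describes: a viral video with enormous predicted watch time can still rank below a friend’s post that is expected to spark conversation.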

Facebook slams the documentary for calling its algorithm “mad,” on the basis that all consumer-facing apps use algorithms to improve the experience for users, noting: “That also includes Netflix, which uses an algorithm to determine who it thinks should watch ‘The Social Dilemma’ film, and then recommends it to them. This happens with every piece of content that appears on the service.”
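The comparison holds in a narrow sense: every recommender, however simple, follows the same pattern of scoring unseen items against what a user has already engaged with. A deliberately toy version makes the pattern visible (the catalog, tags and scoring are all made up; neither Netflix nor Facebook works this simply):

```python
# Toy tag-overlap recommender. Purely illustrative; real services use
# learned models over vastly richer signals than a handful of tags.
CATALOG: dict[str, set[str]] = {
    "The Social Dilemma": {"documentary", "technology", "society"},
    "Period Drama":       {"drama", "romance"},
    "Tech Explained":     {"documentary", "technology"},
}

def recommend(watched: set[str]) -> list[str]:
    # Collect the tags of everything the user has already watched
    interests: set[str] = set().union(*(CATALOG[t] for t in watched))
    unseen = {title: tags for title, tags in CATALOG.items()
              if title not in watched}
    # Rank unseen titles by how many interest tags they share with the history
    return sorted(unseen, key=lambda t: len(unseen[t] & interests),
                  reverse=True)

print(recommend({"Tech Explained"}))
# -> ['The Social Dilemma', 'Period Drama']  (documentary/technology overlap wins)
```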

Users have long expressed concerns over the role Facebook has played in spreading political misinformation and hate speech and in interfering with elections. In response, Facebook says misinformation that could lead to imminent violence, physical harm or voter suppression is “removed outright,” adding that in the second quarter of 2020 it removed over 22 million pieces of hate speech and over 100,000 pieces of content across Facebook and Instagram that violated its voter-interference policies.

Addressing the documentary’s claim that social media platforms fuel political division, Facebook argues that news from polarizing pages represents a “tiny percentage” of what most people see on Facebook.

Critics might say otherwise: in an internal memo, Facebook’s head of virtual reality (VR) and augmented reality (AR), Andrew Bosworth, warned employees not to “use the tools available to us to change the outcome of the 2020 election.” In the same memo, Bosworth credited Facebook’s advertising tools for Trump’s 2016 election victory, brushing off the roles played by Russian interference and the Cambridge Analytica scandal.

In 2018, Facebook sought to make political ads more transparent when it created an ad library that makes all ads on Facebook visible to users, even if they don’t see the ad in their own feed. Social issue and election ads are then labeled and archived in that ad library for seven years.

While Facebook took a clear stance against the claims made by the documentary, reactions across the internet were mixed. Some users were quick to delete their social media profiles, while others challenged the film’s credibility.

Another user pointed out that the documentary offers no solution to society’s growing dependence on social media.