Facebook threatens the common good

What does a business with more than two billion monthly users, more than a quarter of the human race, owe us? Recent revelations from internal company documents, research and conversations show that Facebook, recently renamed Meta, has been grappling internally with the negative effects of its platforms and products even as it offers public assurances about them.

The documents, leaked by former Facebook employee Frances Haugen, have been the subject of numerous reports, first by the Wall Street Journal and then by a consortium of other news organizations. They were also turned over to the Securities and Exchange Commission and were the subject of congressional hearings.

One takeaway from the flood of headlines about these leaks is that Facebook tracks a wide range of negative impacts and hosts in-depth internal discussion about them, very little of which is known to the public. For example, internal Facebook research examined the ways Instagram harms adolescent mental health, how misinformation circulates in Facebook’s algorithmic feeds, the platforms’ role in spreading hate speech and encouraging political violence in developing countries, and how the company’s own decision to weight “angry” reactions more heavily than the default “like” when measuring engagement has amplified toxic and low-quality posts.

Along with these internal discussions, the leaks also revealed that Facebook runs a special program that protects “VIP users,” including celebrities, politicians and journalists. This special category includes millions of accounts, which are exempt from the company’s normal enforcement and review processes. And complicating the frequent claim that Facebook and other social media companies censor conservative political content, the documents show that many Facebook employees argue that Facebook in fact protects right-wing publishers from the impartial application of its own rules in order to avoid political backlash.

To offer a theological gloss on what might appear to be a story about the misdeeds of an impersonal corporation: these revelations are a vivid demonstration of the reality of original sin, which permeates our social networks just as it does everywhere else in human society.

Seen in this light, the problem is not that Facebook, its employees or its executives are noticeably more or less corrupt than any other large corporation. Rather, the problem is that Facebook’s business model, built on monetizing human attention while outsourcing human judgment to algorithms, is a particularly complete and dangerous abdication of responsibility.

Facebook’s reliance on algorithms to drive its platforms’ feeds allows it to function as a global media company built on the free labor of billions of users, who create and interact with the content that Facebook hosts and from which it profits, but for which it refuses to accept any substantial responsibility. While the willingness of Facebook and other major social media platforms to host all this content for free may seem like a profound democratization of publishing, the platforms have a vested interest in monopolizing as much of their users’ attention as possible.

As a result of this incentive, Facebook’s algorithms use “engagement” as a proxy for “worthy of attention.” What Facebook presents to its users is precisely whatever it predicts will keep them engaged with Facebook, a signal that is easy to measure without human effort. But the fact that human beings, sinners and prone to temptation as we are, are often inclined to engage with the worst in themselves and in one another is not something Facebook has proven able to correct for in its code.

The effect of prioritizing engagement, which, let us remember, serves the interests of Facebook’s shareholders and profit margin rather than its users, can be likened to rubbernecking at a car accident: the more drivers slow down to look, the worse the traffic jam gets. But while a GPS algorithm would try to steer drivers away from the jam and get them to their destinations, Facebook’s feed algorithms instead direct users toward the wreck, because that attention is exactly what Facebook sells.

One of the most disturbing revelations in the leaked documents was an experiment in which a Facebook researcher created a new, fictitious account that began by following mainstream conservative pages. Within days, Facebook’s recommendation algorithm began surfacing QAnon groups and other conspiracy-theory content for this account. This is not proof that Facebook is intentionally biased in favor of conspiracy theories, but rather proof that conspiracy theories are more “engaging” than the truth, in much the same sense that cocaine is more addictive than kale. Engagement is not a reliable indicator of value, except in the narrow sense of the dollar value of Facebook’s ad sales.

The first step toward reining in Facebook must be a far-reaching commitment by the company to transparency, both about its internal research into its negative effects and about its closely guarded feed algorithms. Any business operating at Facebook’s scale has an outsized effect on the common good, and it is essential that others who do not share the profit motives of Facebook’s executives be able to assess those effects. The same argument applies to other tech companies whose platforms shape the digital environment we all share. Facebook’s recent name change to Meta, signaling a focus on building the “metaverse,” envisioned as a virtual reality space for, well, everything, only underscores the point.

Along with greater transparency, Facebook should also be pushed to limit its scale and reach significantly, including through the threat of antitrust enforcement. But the most important constraint to impose is that the spread of information in Facebook’s feeds must involve human judgment in the loop, not just algorithmic amplification. Facebook tells us, and itself, that its platform is just a tool, and therefore value-neutral, committed above all to free speech. Users post content, their engagement feeds the algorithm, and they see more of whatever they pay attention to. This supposed commitment to free expression, however, masks the fact that the engagement the company seeks means, more precisely, “activity that is valuable to Facebook.” No matter how neutral Facebook would like to appear, someone has to take responsibility for how it affects the world.
