400 pages, Hardcover
First published September 6, 2022
When you see a post expressing moral outrage, 250,000 years of evolution kick in. It impels you to join in. It makes you forget your moral senses and defer to the group's. And it makes inflicting harm on the target of the outrage feel necessary - even intensely pleasurable.
The platforms also remove many of the checks that normally restrain us from taking things too far. From behind a screen, far from our victims, there is no pang of guilt from seeing pain on the face of someone we've harmed. Nor is there shame at realising our anger has visibly crossed into cruelty. In the real world, if you scream expletives at someone for wearing a baseball cap in an expensive restaurant, you'll be shunned yourself, punished for violating norms against excessive displays of anger and for disrupting your fellow restaurant-goers. Online, if others take note of your outburst at all, it will likely be to join in.
But as the Valley expanded its reach, this culture of optimisation at all costs took on second-order effects. Uber optimising for the quickest ride-share pickups engineered labour protections out of the global taxi market. Airbnb optimising for short-term rental income made long-term housing scarcer and more expensive. The social networks, by optimising for how many users they could draw in and how long they could keep them there, may have had the greatest impact of all. "It was a great way to build a startup," Chaslot said. "You focus on one metric, and everybody's on board [for] this one metric. And it's really efficient for growth. But it's a disaster for a lot of other things."
Even in its most rudimentary form, the very structure of social media encourages polarisation. [...] Facebook groups amplify this effect even further. By putting users in a homogeneous social space, studies find, groups heighten their sensitivity to social cues and conformity. This overpowers their ability to judge false claims and increases their attraction to identity-affirming falsehoods, making them likelier to share misinformation and conspiracies. "When we encounter opposing views in the age and context of social media, it's not like reading the newspaper when sitting alone," the sociologist Zeynep Tufekci has written. "It's like hearing them from the opposing team while sitting with our fellow fans in a football stadium... We bond with our team by yelling at the fans of the other one."
The social platforms had arrived, however unintentionally, at a recruitment strategy embraced by generations of extremists. The scholar J.M. Berger calls it 'the crisis-solution construct'. When people feel destabilised, they often reach for a strong group identity to regain a sense of control. It can be as broad as nationality or as narrow as a church group. Identities that promise to recontextualise individual hardships into a wider conflict hold special appeal. You're not unhappy because of your struggle to contend with personal circumstances; you're unhappy because of Them and their persecution of Us. It makes those hardships feel comprehensible and, because you're no longer facing them alone, a lot less scary.
The problem, in this experiment [on Facebook misinformation], wasn't ignorance or lack of news literacy. Social media, by bombarding users with fast-moving social stimuli, pushed them to rely on a quick-twitch social intuition over deliberate reason. All people contain the capacity for both, as well as the potential for the former to overwhelm the latter, which is often how misinformation spreads. And platforms compound the effect by framing all news and information within high-stakes contexts.
[In 2018] Zuckerberg [...] riffed on the nature of free speech: "I'm Jewish, and there's a set of people who deny the Holocaust happened. I find that deeply offensive. But at the end of the day, I don't believe that our platform should take that down, because I think there are things different people get wrong. I don't think that they're intentionally getting it wrong."
It was vintage Silicon Valley. If Zuckerberg was willing to sacrifice historical consensus on the attempted extermination of his forebears for the sake of a techno-libertarian free-speech ideal, then so should everybody else. And, like many of the Valley's leaders, he seemed to be living in an alternate universe where platforms are neutral vessels with no role in shaping users' experiences, where the only real-world consequence is that somebody might get offended, and where society would appreciate the wisdom of allowing Holocaust denial to flourish.
When asked what would most effectively reform both the platforms and the companies overseeing them, Haugen had a simple answer: turn off the algorithm. "I think we don't want computers deciding what we focus on," she said. She also suggested that if Congress curtailed liability protections, making the companies legally responsible for the consequences of anything their systems promoted, "they would get rid of engagement-based ranking." Platforms would roll back to the 2000s, when they simply displayed your friends' posts from newest to oldest. No AI to swarm you with attention-maximising content or route you down rabbit holes.
Her response followed a reliable pattern that has emerged in the years I've spent covering social media.