Bad Algorithms Didn’t Break Democracy

As Girard had it, we are defined and constituted as a species by our reliance on imitation. But we are not mere first-order mimics: When we ape what someone else does, or covet what someone else has, we are in fact trying to want what they want. “Man is the creature who does not know what to desire, and he turns to others in order to make up his mind,” Girard wrote. “We desire what others desire because we imitate their desires.” Unable to commit to our own arbitrary wants, we seek to resemble other people—stronger, more decisive people. Once we identify a model we’d like to emulate, we train ourselves to make the objects of their desire our own.

The emotional signature of all this imitation—or mimesis—is not admiration but consuming envy. “In the process of ‘keeping up with the Joneses,’ ” Thiel writes, “mimesis pushes people into escalating rivalry.” We resent the people we emulate, both because we want the same things and because we know we’re reading from someone else’s script. As Girard would have it, the viability of any society depends on its ability to manage this acrimony, lest it regularly erupt into the violence of “all against all.”

Around the time of that 2004 symposium, Thiel was making a $500,000 investment in a small startup called The Facebook. He later attributed his decision to become its first outside investor to the influence of Girard.

“Social media proved to be more important than it looked, because it’s about our natures,” he told The New York Times on the occasion of Girard’s death in 2015. “Facebook first spread by word of mouth, and it’s about word of mouth, so it’s doubly mimetic.” As people like and follow and linger on certain posts and profiles, the Facebook algorithm is trained to recognize the sort of people we aspire to be, and obliges us with suggested refinements. The platforms are not simply meeting demand, as Zuckerberg would have it, but they’re not really creating it either. They are, in a sense, refracting it. We are broken down into sets of discrete desires, then grouped into cohorts along lines of statistical significance. The kinds of communities these platforms enable are ones that have simply been found rather than forged.

As the critic Geoff Shullenberger has pointed out, Facebook’s cultivation of these communities—structured by constant and simple mimetic reinforcement—is only half of a story that gets considerably darker. Girard spent the later decades of his career elaborating how, in myth and ancient history, human societies purchased peace and stability by displacing the bad blood of mimetic rivalry into violence against a scapegoat. “The war of all against all culminates not in a social contract but in a war of all against one,” Thiel writes, “as the same mimetic forces gradually drive the combatants to gang up on one particular person.”

Ancient religions, Girard argued, advanced rituals and myths to contain this bloodthirsty process. And Christianity, a religion centered on the crucifixion of an innocent scapegoat, promised transcendence of the entire dynamic by revealing its cruelty. (Girard was a professed Christian, as is Thiel.)

The problem, as Thiel sees it, is that we now live in a disenchanted age: “The archaic rituals will no longer work for the modern world,” he wrote in 2004. The danger of escalating mimetic violence was, in his view, both obvious and neglected. His concern at the time was with global terrorism in the wake of September 11, but it seems he later came to worry, too, about resentment toward the investor class in an age of growing inequality. In a set of notes published online in 2012 by the coauthor of Thiel’s book Zero to One, Thiel identifies tech founders as natural scapegoats in the Girardian sense: “The 99% vs. the 1% is the modern articulation of this classic scapegoating mechanism.”