“Less Trust, More Truth,” said the black nylon drawstring bag in the cardboard box. I had to have one. No other item of swag at this crypto-conference articulated as boldly what some blockchain advocates imagine the technology can do. It was precisely the claim that blockchain, and in particular Bitcoin, could obviate the need for trust that drew me to this buzzy new area of technodeterminist fantasy coming out of Silicon Valley in 2018. Bitcoin, the founding white paper claimed, would permit anonymous actors to transact with one another without needing to trust each other or some exploitative “third party.” Bitcoin—a blockchain built through an ingenious combination of cryptographic protocols and decentralisation—would replace trust with algorithmic mechanisation. Yet I had not before encountered the desire for less trust coupled with an imperative for more truth. Moreover, what did they actually mean by these terms? I had to have one of these bags.
I elbowed my way across the exhibition space within the Sports Castle, the sprawling venue in Denver’s Capitol Hill neighbourhood that hosted the 2022 edition of ETHDenver, a hackathon and conference sponsored by the Ethereum Foundation. The bag was an item of swag being handed out by employees of Polkadot, a blockchain platform and cryptocurrency designed to permit cross-chain interoperability and thereby expand the utility and usability of the numerous existing blockchains. The slogan came from an utterance by one of Polkadot’s co-founders, 42-year-old Englishman Gavin Wood, in an interview with Wired in November 2021. Wood was also a co-founder of Ethereum, the blockchain system—and reigning alternative to Bitcoin—at the heart of the event in Denver. In the interview, Wood was asked to describe web3, a term he coined for the supposed next iteration of the Internet based on blockchains and decentralisation. When the interviewer asked him to explain the slogan, Wood noted that trust was “bad in itself” because it required individuals to depend on “arbitrary authorities” who might abuse their power. Truth, in Wood’s understanding, is achieved when you can be certain that your expectations will be met. Blockchain-based systems, Wood proposed, could thus permit individuals to avoid having to make themselves vulnerable to others, and could provide certainty that the system would work as expected.
In a related effort to proselytise, Emre Surmeli of the Web3 Foundation, also founded by Gavin Wood, gave a talk titled “Less Trust More Truth” in which he defined truth as “the verifiable state of the network.” While this definition, too, seems alien to common understandings of “truth,” it at least mobilised the adjective “verifiable.” This becomes interesting when we note that there have been other instances outside of blockchain in which trust has been contrasted with something else, some other quality or practice. In 2015, for example, when President Obama announced the nuclear agreement with Iran, he stressed that the agreement was “not built on trust, but on verification.” In this speech, verification was posited as something more robust, more certain than mere trust that another party to the agreement would uphold its commitments. Verification—the process of establishing the truth, accuracy, or validity of something, in this case the state and status of Iranian nuclear technologies and materials—was presented as the rational basis for the agreement.
Obama’s phrase was a reference to an earlier moment in nuclear disarmament negotiations in the late Cold War, when Ronald Reagan was advised by Suzanne Massie to use the rhyming Russian proverb “trust, but verify” (doveryay, no proveryay) in his interactions with Gorbachev. The phrase has been cited and recycled on numerous occasions, most recently by Trump’s Secretary of State Mike Pompeo, who, in a 2020 speech at the Richard Nixon presidential library, noted that in dealing with China US allies needed to “distrust and verify.” I bring up these sayings that contrast trust with other qualities or practices to suggest that they might display a trajectory away from a moment in which trust still had some kind of redeemable role to play in efforts to coordinate actors who ostensibly have no reason to trust each other.
Reagan’s use of the proverb acknowledges trust as a necessary quality in the matter of nuclear arms control negotiations, but pairs this quality with practices of verification that create more certainty, if not “truth.” Obama’s slogan, however—perhaps symptomatic of an increasingly suspicious geopolitical moment and political culture—dispenses with trust altogether, preferring the (presumably) cold hard facts of verification to a situation of vulnerability in which the US might be taken advantage of by villainously treacherous Iranians. Finally, Pompeo’s remixing clearly reflects hawkish hostility towards most foreign nations that in some way threaten US primacy.
Nuclear arms control and nonproliferation negotiations between nation states might appear only distantly related to the interaction order of an Internet with anonymous participants. But both situations address that distinctly modern condition: how to “cope with the freedom of [unknowable] others,” as sociologist Niklas Luhmann (1979) famously described it. As Luhmann and others have noted, the risk posed by this freedom is dealt with through institutions that govern and regulate society in order to render dealings within it more predictable, more knowable. “Systemvertrauen” (system trust) is what Luhmann calls our generalised reliance that things will go on as expected. Trust in established institutions is more difficult to achieve, however, when what is at stake is a contestation of those very institutions (the nuclear order in the case of nuclear verification, the extant techno-economic order in the case of web3). Hence, the desire for verification—for producing a stable, reliable, immutable truth.
The Truth Machine is what journalists Michael J. Casey and Paul Vigna titled their popular book about blockchain’s ostensibly revolutionary potential to transform society. From their constructivist perspective, the blockchain, as a ledger that is “essentially a digitised, objective rendering of the truth” (Vigna and Casey 2018, 30), would allow for the production of “consensus, a common understanding on what we take to be the truth” (Vigna and Casey 2018, 34). As a decentralised, disintermediated, transparent, and mathematically verifiable version of double-entry bookkeeping, blockchains could be “a tool upon which society can create the common stories it needs to sow even greater trust” and “build social capital” lost with the financial crisis of 2008 (Vigna and Casey 2018, 34).
In a context of fake news and science scepticism, the desire for some kind of stable, reliable truth that corresponds to a shared reality is hardly surprising. What is more difficult to fathom for everyone except a hard core of trustless “maximalists” is the abandonment of trust. Anthropologist Matthew Carey, in his ethnography of mistrust in the Moroccan High Atlas, notes drily that there seems to be “no concept that so federates the disparate caucuses of modernity as trust” (Carey 2017, 1). As one of the central normative affects of modernity, trust is hard to dispense with. Indeed, it is more common for scholars such as the legal scholar Kevin Werbach to argue that, far from being “trustless,” blockchain is in fact constructing a “new architecture of trust,” more robust and more reliable than existing economic and legal institutions, so long as it can be productively integrated with existing legal regimes. Scholars more critical of blockchain than Werbach have bemoaned the desire for trustlessness as a symptom of rabid right-wing politics (Golumbia 2016), as a lamentable form of commodification (Bodó 2021), or as a failure to understand the basis of the semiotic (Weichselbraun 2021) or of the social order (Semenzin and Gandini 2021).
Yet I have come around to my own realisation that when blockchain acolytes claim that the system gets rid of trust, they are not quite right and not quite wrong. Yes, the system does not actually rely on “trust” (understood as an affective situation in which one makes oneself vulnerable to a largely unknown other). Rather, blockchain’s algorithmic mechanism produces “confidence”—an expectation of relatively predictable outcomes (see Luhmann 1988 for an attempt to distinguish between confidence and trust). This distinction is analytically useful because it avoids “essentializing” trust as an “ontological aspect of social existence” (Seligman 1997, 8) and allows us to be more specific in describing how social coordination is enabled through different kinds of institutions, practices, and technologies without introducing the moral baggage of the term “trust.”
In understanding blockchain as a technology for producing confidence, I find convincing the argument of legal scholar Primavera De Filippi and her co-authors that blockchain is a “confidence machine” (De Filippi, Mannan, and Reijers 2020). Confidence, they note following Seligman (1997), is the attitude that things will go as expected. It is akin to Luhmann’s Systemvertrauen and is produced by and through institutions working as they were designed to, producing anticipated outcomes. Predictability is thus an important quality on which confidence is based. De Filippi and her co-authors note that blockchain produces confidence because it “creates shared expectations with regard to the manner in which it operates, and the procedural correctness of its operations” (De Filippi, Mannan, and Reijers 2020, 2). In the case of Bitcoin, if I successfully “mine” a new block, I can expect to be rewarded with an amount of bitcoin fixed in advance by the protocol’s issuance schedule.
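To make this predictability concrete, here is a minimal sketch of Bitcoin’s issuance schedule (my own illustration, not drawn from De Filippi et al., and simplified from the consensus rules): the block reward starts at 50 BTC and halves every 210,000 blocks, so anyone can calculate in advance exactly what reward any given block will carry.

```python
# A minimal sketch of Bitcoin's block subsidy schedule: the reward starts at
# 50 BTC and halves every 210,000 blocks. Amounts are counted in satoshis
# (1 BTC = 100,000,000 satoshis), mirroring how the protocol counts value.

COIN = 100_000_000          # satoshis per bitcoin
HALVING_INTERVAL = 210_000  # blocks between halvings

def block_subsidy(height: int) -> int:
    """Return the newly issued reward (in satoshis) for a block at `height`."""
    halvings = height // HALVING_INTERVAL
    if halvings >= 64:                   # after 64 halvings the subsidy is zero
        return 0
    return (50 * COIN) >> halvings       # each right-shift halves the reward

if __name__ == "__main__":
    # Anyone, anywhere, can compute the reward for any block in advance:
    for height in (0, 210_000, 420_000, 630_000, 840_000):
        print(f"block {height:>7}: {block_subsidy(height) / COIN} BTC")
```

It is this kind of mechanical, verifiable predictability, rather than trust in any particular miner, developer, or institution, that De Filippi and her co-authors identify as the source of confidence.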
Web3 is not just Gavin Wood and his sociotechnical imaginary of “fully automated algorithmic governance” (Groos 2020). It is populated by a diverse set of actors, many of whom do realise that even the most technically sophisticated systems are made by humans and are thus subject to human error and vagary, as the website “Web3 is Going Just Great” chronicles. Nevertheless, there seems to be a shared enthusiasm for solving perennial problems of human interaction with the assistance of what Lorraine Daston, in her new book, calls “thin rules”: automatic, mechanised, free of human interference, epitomised by the algorithm (Daston 2022). The algorithm’s appeal lies in its mechanistic predictability, an appeal which has a relatively short history. Daston argues that the desire for “thin rules” emerges in a society that has become suspicious of judgement and discretion (what Wood seems to identify as “arbitrary authority”). She shows that “thick rules,” full of context and caveats and accompanied by the ability to discern, have fallen out of fashion.
Over the course of a few days in Denver, thousands of participants—mostly young people in their twenties and early thirties, with dyed hair and in edgy fashions—were feverish with the excitement of a new dawn. Hundreds of concurrent talks and round-tables enunciated the conviction that, as part of this technological development, they were standing on the cusp of a revolutionary transformation of the existing social, political, and economic order. Groups of hackathon participants sought to solve various technical problems posed by the challenge of anonymous, decentralised, disintermediated online communication. Yet even as they were building this brave new digital world, the in-person conference (after so many months of Covid confinement) was evidence that any kind of world-building project benefits from the collective effervescence most efficiently generated by the messy analog co-presence of bodies in place—a time-tested means of social coordination.
Bibliography
Bodó, Balázs. 2021. “The Commodification of Trust.” Blockchain & Society Policy Research Lab Research Nodes 2021/1; Amsterdam Law School Research Paper No. 2021-22; Institute for Information Law Research Paper No. 2021-01. Available at SSRN.
Carey, Matthew. 2017. Mistrust: An Ethnographic Theory. Chicago: HAU.
Daston, Lorraine. 2022. Rules: A Short History of What We Live By. Princeton University Press.
De Filippi, Primavera, Morshed Mannan, and Wessel Reijers. 2020. “Blockchain as a Confidence Machine: The Problem of Trust & Challenges of Governance.” Technology in Society 62 (August): 101284.
Golumbia, David. 2016. The Politics of Bitcoin: Software as Right-Wing Extremism. Minneapolis: University of Minnesota Press.
Groos, Jan. 2020. “Crypto Politics: Notes on Sociotechnical Imaginaries of Governance in Blockchain Based Technologies.” In Data Loam: Sometimes Hard, Usually Soft. The Future of Knowledge Systems, edited by Johnny Golding, Martin Reinhart, and Mattia Paganelli, 148–70. De Gruyter.
Luhmann, Niklas. 1979. Trust and Power. Ann Arbor, Mich: John Wiley & Sons Inc.
Luhmann, Niklas. 1988. “Familiarity, Confidence, Trust: Problems and Alternatives.” In Trust: Making and Breaking Cooperative Relations, edited by Diego Gambetta, 94–107. Oxford: Blackwell.
Seligman, Adam B. 1997. The Problem of Trust. Princeton University Press.
Semenzin, Silvia, and Alessandro Gandini. 2021. “Automating Trust with the Blockchain? A Critical Investigation of ‘Blockchain 2.0’ Cultures.” Global Perspectives 2 (1): 24912.
Vigna, Paul, and Michael J. Casey. 2018. The Truth Machine: The Blockchain and the Future of Everything. St. Martin’s Publishing Group.
Weichselbraun, Anna. 2021. “In Code We Trust: On the Semiotics of Blockchain.” Kuckuck 36 (1/21): 62–65.
Featured image by the author.