Truth-Seeking in the Digital Age

On the day of Donald Trump’s inauguration, I wandered around Washington, D.C.’s Georgetown neighborhood with my sister, and we counted the red baseball caps that, until that day, had been rare sightings in the city. With palpable discomfort, locals sidestepped clusters of people wearing the soon-to-be President’s merchandise. One fact: 90.9 percent of D.C. residents voted for Hillary Clinton.1 When the ceremony took place, I sat in a café in the shadow of the National Cathedral. Every few moments, I glanced at a TV in the corner to watch the action. Closeups of his hand on the Bible and of Melania’s sculptural gown were interspersed with shots of the crowd who had gathered to watch. Shortly thereafter, the size of that crowd became a matter of public debate, and when justifying the new administration’s exaggerated claims about its size, presidential counselor Kellyanne Conway introduced the corrosive phrase “alternative facts” to the national lexicon.

Deliberately misleading information has been a part of politics at least since the Athens of Plato’s time. His dialogue Phaedrus addresses the prominence of sophists, Athenian speechmakers who used rhetoric to draw listeners toward dubious conclusions. Instead of taking a position against rhetoric, Plato argues (through the figure of Socrates) that rhetoric should guide people toward the truth. “First,” says Socrates, “you must know the truth concerning everything you are speaking or writing about; you must learn how to define each thing in itself; and, having defined it, you must know how to divide it into kinds until you reach something indivisible.”2 The indivisible kernel at the heart of a rational argument is Plato’s truth. Yet Plato gives few suggestions for distinguishing truth from falsehood and acknowledges that even when guiding people toward the truth, some degree of manipulation may be required. One of the greatest challenges of political polarization is that all participants are passionately convinced of their own grasp on the truth. This long-standing problem intensifies in the age of digital media.

As cultural historian Walter Ong notes, the proliferation of distinct points of view is a part of print culture; so that while deception does exist in oral cultures, as with Plato’s sophists, print allows varied opinions to proliferate widely, regardless of their relationship to the truth. It is even more difficult to combat falsehoods spread in print because, as Ong writes, “There is no way directly to refute a text. After absolutely total and devastating refutation, it says exactly the same thing as before.”3 Media theorist Marshall McLuhan is also skeptical of print, and particularly the news, as a means of communicating the truth. He writes, “The press is a group confessional form that provides communal participation. It can ‘color’ events by using them or by not using them at all.”4  Public perception of the truth, in this view, is shaped by various news sources, and the form of the news is more important than its content. 

McLuhan’s distinction between form and content can also help explain why the truth of the information on digital platforms seems irrelevant; digital content serves mostly to attract attention for advertisers. This digital advertising context forms the backdrop of internet studies scholar Safiya Noble’s argument that algorithms carry human biases. Noble finds that search engines, particularly Google, are central to information access in the digital age. She distinguishes information from knowledge, claiming that search engines offer only the former. The truth is often muddied by biases embedded in the design of search algorithms, which promote some results over others, reflecting the interests of advertisers and disregarding the dangers of misinformation.5

Noble’s writing brings to mind another moment in which I encountered a version of the truth that was radically different from my own. Driving on D.C.’s Connecticut Avenue, I passed Comet Ping Pong, the combined table tennis arcade and pizzeria where I’d celebrated my fifteenth birthday. Now, it was wrapped in yellow crime scene tape, and a crowd had gathered outside. The restaurant had recently received a deluge of threatening messages due to a conspiracy theory alleging that a child prostitution ring operated out of its nonexistent basement. That day—December 4, 2016—a man named Edgar Maddison Welch arrived at the restaurant with an AR-15 and fired three shots. Onlookers were scared but physically unharmed. Welch drove there from North Carolina to “self-investigate” the theory known as Pizzagate, and I have often wondered about his solo car ride. To undertake that trip was no small commitment, and a firm belief in his version of the truth must have kept his foot on the pedal.

Welch’s beliefs developed largely on the internet, where Pizzagate found fertile ground to spread. The theory grew from an environment of misinformation that was nurtured by Trump’s candidacy but certainly preceded it as well. As cultural critic Jia Tolentino writes, “The worldview of the . . . Pizzagaters was actualized and to a large extent vindicated in the 2016 election—an event that strongly suggested that the worst things about the internet were now determining, rather than reflecting, the worst things about offline life.”6 The shift Tolentino identifies from reflection to determination also interests Noble in her exploration of algorithmic bias. Noble refers to a 2013 United Nations campaign that responded to sexist autosuggestions offered in Google searches about women. The campaign was correct to suggest that these searches reflect endemic views, but it failed to acknowledge that Google itself has a role in determining those views. It’s not by accident, but by algorithmic design, that Google auto-suggests sexist searches and information. In fact, Noble explains, it is related to advertisers’ profit incentives.      

Noble also discusses the example of Dylann Roof. In 2015, Roof killed nine people at Emanuel African Methodist Episcopal Church, an African American church in South Carolina and an important site of antiracist political organizing. Roof committed a violent hate crime in that church and had written openly online about his racist beliefs. He claims that Googling the phrase “black on white crime” had a formative effect on his politics.7 The story indicates that confirmation bias is embedded in Google’s search platform. After all, Roof’s biases appear in the very framing of his inquiry. Yet Google did nothing to correct or reroute those beliefs, instead reinforcing them by providing the results that Roof probably expected. This anecdote of algorithmic bias demonstrates what is at stake in the push for improved regulation.

How to proceed is unclear, particularly when so much power rests with private tech companies. The approach of many companies, when faced with public pressure, has been to flag misleading content with a sort of fine-print warning. This technique, while popular, is almost comically insufficient. It’s not likely that anyone who believes firmly in a given narrative will be dissuaded by a warning from a social media platform. Michelle Obama’s beloved liberal adage “When they go low, we go high” has been the favored approach to dealing with misinformation, but sometimes it begins to seem futile. What if, instead, we fought radical right-wing conspiracy theorists with their own tools?

Absurdist and activist Peter McIndoe has addressed this question with a parody conspiracy group called Birds Aren’t Real, which he founded in 2017, shortly after Trump’s inauguration. Frustrated by the absurdity of American political life, McIndoe invented a conspiracy in which birds had been killed and replaced by government drones in the 1970s. In the world of this conspiracy theory, birds are tools of government surveillance, and they recharge by perching on power lines. The movement gathered support—mostly from those who were in on the joke—and served as an outlet for frustrated young people who wanted something to laugh about in a bleak political landscape. Birds Aren’t Real is more than just a joke, however; it is also an experimental protest tactic. Members of the movement have marched alongside conservative protestors and chanted nonsensically about birds to delegitimize the protestors’ messages by revealing their absurdity.8

Jenny Odell, a writer and artist from California, uses birds not to satirize misinformation but as an anchor to physical space. She embraces the concept of bioregionalism to articulate why place remains important, even in the digital age. Birds, for example, are native to specific bioregions which don’t obey national borders, and paying attention to bioregions can be a political act in an increasingly decontextualized world. Odell believes the truth is best approached by careful direction of our attention. In her book How to Do Nothing, she advocates for “ongoing training: the ability not just to withdraw attention, but to invest it somewhere else, to enlarge and proliferate it, to improve its acuity.”9 Attention, for Odell, is a resource which we can direct toward knowledge, if only we can preserve it from distraction. 

Web 2.0, or the social internet, abounds with distraction and has complicated our already troubled relationship with truth. The past two years of the coronavirus pandemic and surrounding misinformation have made this problem even more difficult to ignore. Now, there’s excitement about a pivot to Web 3.0, supposedly characterized by decentralization and transparency. Maybe, some argue, this transparency could combat misinformation or lessen the chokehold of major tech companies on the internet. These proponents aim to construct a new world, a digital “metaverse” where we can spend most of our time. (There are no bioregions there; nothing is really “native.”) If Facebook’s co-optation of the term “meta” is an indicator, it doesn’t seem that big tech has any plans to stay out of it.

Increasingly, enthusiasts of this new technology congregate in New York. At one such event in November, I encountered a crowd excited about cryptocurrencies and the potential of NFTs, many of which were on display that night. The attendees milled around the club, exchanging Twitter handles and opinions about the digital artwork. One screen displayed a hideous neon Last Supper scene with a giant ape crowned as Jesus. Animals and spaceships gathered around him, animated on an endless loop. Some of the partygoers spent extravagantly on champagne and watched as women in bikini tops brought it to their private tables. It looked more like a conspicuous display of wealth than a radical movement toward decentralization. The day after attending that party, I started to feel ill and thought back to the excruciating conversations I had with investors who had traveled there by plane, many of them hoping to connect with people they had previously only met online.

When I tested positive for the coronavirus, my Google searches took on a fevered intensity: “do I have a breakthrough case,” “how to get back my taste,” and “does covid cause headaches.” Millions of results answered me, and some of them helped. Others only terrified me as I attempted to regulate my uneasy breathing. Quarantining near a Delaware beach, I left my phone at home and went on halting walks. I watched sunlight hitting waves, almost painfully bright in my addled state. Shorebirds hopped in front of my feet, alien and unmistakably real. There was no need to seek out more information; I knew that I was sick. 

  1. “District of Columbia Presidential Race Results: Hillary Clinton Wins,” The New York Times, last updated August 1, 2017.
  2. Plato, Phaedrus, trans. Alexander Nehamas and Paul Woodruff (Hackett, 1995), 83.
  3. Walter Ong, Orality and Literacy (Routledge, 2002), 78.
  4. Marshall McLuhan, Understanding Media (MIT Press, 1994), 204.
  5. Safiya Noble, Algorithms of Oppression (NYU Press, 2018).
  6. Jia Tolentino, Trick Mirror (Random House, 2019), 11.
  7. Noble, Algorithms of Oppression, 116.
  8. Taylor Lorenz, “Birds Aren’t Real, Or Are They? Inside a Gen Z Conspiracy Theory,” The New York Times, December 9, 2021.
  9. Jenny Odell, How to Do Nothing (Melville House, 2019), 93.