Black Mirrors, Façades, and Confidence Machines
The danger of AI that not enough people are talking about, Part I
Last week I had a disturbing experience with AI. It is not, I admit, one that would ruffle many people’s feathers, but it bothered me enough to stay with me all day.
I will explain next week what happened. Here I want to set the stage for the little drama I will recount.
First, why am I talking about AI on Civic Fields? This is a newsletter about our civic state and its repair, not technology. So what gives?
Civic health depends on relatively transparent and reliable sources of information and modes of communication. But we do not live in a transparent society. Our information ecosystem is full of distortions, unwarranted exaggerations, amplifications, and inversions. This is partly the fault of the particular “affordances” of digital technologies, which are, among other things, decentralized, algorithmic, networked, and personal. But it is much more the fault of misplaced economic power.
We live in a paradoxical media world: information networks themselves are highly decentralized and even personalized, but the information economy is more centralized now than it ever has been (other than in totalitarian regimes).
All this could be checked by effective laws and regulation, but American political power, especially in the current regime, is too beholden to centralized oligarchic interests to undermine the latter’s profits and power. Information could be valued for how well it mirrors reality and facilitates relationships. Instead, in our highly centralized media market digital media are primarily valued for how they distort reality, and relationships, to pay shareholders.
Living “well enough” in this economically distorted digital world—which is the best we can do for now—begins with awareness and culminates in action. We must, of course, first become aware of what digital technologies do to us, individually and collectively; but then we must act in ways that mitigate their destructive effects, again both individually and collectively. Awareness without action is resignation.
In thinking about what digital technologies do, I am helped by the metaphor of the “black mirror.” As far as I know, the notion comes from a song by Arcade Fire (a band that unquestionably belongs on the Civic Fields soundtrack!). You may know the phrase “black mirror” from the TV show that apparently takes its title from the Arcade Fire song.
The literal referent of “black mirror” is a phone or computer screen when turned off, but it’s the metaphor that matters.
The first and fundamental thing the “black mirror” suggests is that digital technologies are not mere neutral tools or instruments. In media studies, philosophy, and social theory, the idea of neutral technologies has been resoundingly and rightly rejected. I could cite here dozens of well-known scholars. But it is enough to think of any mirror, the sort you and I have in our bathrooms. Mirrors reorient us toward ourselves, sometimes radically. As long as we use them, there is no escaping this. They act on us more than we act on them. Well-lighted mirrors magnify our flaws. Concave mirrors distort what’s in the frame. The convex external mirrors on your car make objects seem farther away than they are. To use a mirror is to succumb to its powers.
To see the digital world as a “mirror” means, at a minimum, to become aware of the fact that it acts on us. The best we can do is to constructively counter-act.
But to see the digital world as a “black mirror” means more.
It is to see the digital world as a world that abstracts, equivocates, obscures, and extends. I am going to get just a tad philosophical here.
Abstracts: To abstract means to “draw away.” Older media like print newspapers or magazines brought the world to people. Someone sitting on a porch in Texas could read about lobstermen in Maine. Maine was “brought” to Texas. But digital technologies draw us away from our immediate physical and relational contexts, just as they draw cultural signs, symbols, and images out of their historical and material contexts into a contextless virtual space. Ideas become memes. Isolation and atomization are characteristic features of living in a digitized world in good part because these technologies are constantly drawing us and our symbolic world away from concrete contexts.
Equivocates: The digital world is also a world where things are rarely as they seem. I am not talking here about Photoshop or deepfakes. I am talking about naming. To equivocate means to “equally call.” Over and over again in our digital public sphere, two quite different things get the same name: for example, both Putin and Zelensky get called “dictators” and “corrupt”; Democrats mirror Republicans in a culture war over patriotism; and Nazi salutes become jokes. Naming here becomes not a way of matching a word or symbol to a thing, but a kind of “tagging,” labeling persons and things to win arguments no more sophisticated than grade-school banter. Living in a digital world can be like walking through a fun house, minus the “fun” part: beautiful is “ugly,” aggressors are “victims,” facts are “opinions,” strength is “weakness.” These evaluative inversions are rampant in a public world controlled by Big Tech. We are forced into negative rather than affirmative stances, arguing about what’s not true, what’s not to be taken seriously, or what’s not real—to the frustrating point that it feels impossible to say anything constructive about what is true, serious, or real.
Obscures: Lots of people used to worry about media bias. Some still do. But the singular feature of digital media is not “bias.” Rather, it is obscurity. To obscure means “to cover in darkness.” Digital technologies today do more to cover reality than to reveal it. It is hard to talk to people about public matters not so much because people live in “alternate realities” but because digital media cover shared reality over, as with a dark filter, obscuring it from view. This is largely a result of algorithms that are explicitly designed to amplify emotionally powerful symbols over truth. Most things in the day-to-day world are not all that emotionally resonant, including most true things, but online the true and untrue alike are emotionally amped up to win attention and become economically valuable.
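To make that amplification mechanic concrete, here is a toy sketch in Python, entirely my own illustration and not any platform’s actual code: a feed that ranks items by a predicted “emotional charge” score, with truth contributing nothing to the ranking. The headlines and scores are invented for the example.

```python
# Toy illustration of engagement-weighted ranking. The items, scores,
# and scoring rule are invented; real ranking systems are far more
# complex, but the incentive structure is the point.

feed_items = [
    {"headline": "City council passes budget",      "true": True,  "charge": 0.2},
    {"headline": "OUTRAGE: They are lying to you!", "true": False, "charge": 0.9},
    {"headline": "Study finds modest effect",       "true": True,  "charge": 0.1},
    {"headline": "You won't BELIEVE this betrayal", "true": False, "charge": 0.8},
]

def engagement_score(item):
    # Rank purely on predicted emotional engagement; whether the
    # item is true plays no role in the score.
    return item["charge"]

for item in sorted(feed_items, key=engagement_score, reverse=True):
    print(f"charge={item['charge']:.1f}  true={item['true']}  {item['headline']}")
```

Run it and the two false, high-charge items float to the top of the feed. Nothing here “prefers” falsehood as such; it simply never asks about truth, which amounts to the same thing.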
Extends: Yet, for all of this, in digital media we are not looking at an alien entity, but directly at ourselves. The famous media theorist Marshall McLuhan thought of all technologies as, in a sense, mirrors. All technologies reflect back on us something about ourselves. A pen, for example, reflects back on us the fact that we, as homo sapiens, are both communicative and tool-using animals. Abstraction, equivocation, and obfuscation are recognizable behaviors of humans more generally, quite apart from the digital. Digital technologies are in this sense made in our own image. They are, in McLuhan’s way of thinking, “extensions” of ourselves. But as extensions they are also amplifications, and thus distortions. The oligarchs of Silicon Valley have taken some of the more ignominious parts of ourselves and built for themselves an empire out of them. This is on them, and on those who have failed to stop them; but we are not mere passive victims here.
Now, this all said, I think something new and different is happening with AI. Culturally and economically, it represents a new phase in the digital market. Through “intelligence,” AI promises to be a kind of grand technological compensation for the abstracting, equivocating, and obscuring of “old” digital media. It promises to meet a longing so many of us have: to get things right.
AI represents the very best thing that computing could bring us: comprehensive knowledge, accuracy, accessibility, and complex problem solving. It could be at once the world’s greatest and truest Wikipedia and the greatest scientific “brain” that’s ever existed. It will likely prove to be a great help with problems in everything from education to medicine to transportation.
But this new era in digital development also presents new dangers. One danger I have already mentioned at Civic Fields: massive disruptions in the labor market. Another frequently mentioned concern has to do with control of AI, either as the machines get out of control or as they are controlled by bad actors. But I think we need to be taking seriously yet another danger of AI, having to do with confidence. I will take this up next week.
Here I want to close with some thoughts about counter-action for living “well enough” in a distorted digital world. The most obvious but least realistic counter-action is to detach ourselves from digital media. If you are reading this, you have not detached, and of course neither have I. I have chosen alternative counter-actions. Three have been most helpful. All of them have to do with practices that facilitate mental and emotional re-orientations.
First, I have tried to develop a sensitivity to algorithms, in the same way that some neurodiverse people have a particular sensitivity to certain textures. Not all digital media are algorithmically driven, and not all algorithmically driven media are alike. Social media like TikTok and Instagram are far more algorithmically sophisticated and aggressive than, say, the algorithmically driven music feeds on Spotify or video feeds on Netflix. I have resolved, as best I can, never to be a passive “user” of algorithms. If I use such media, then I spend time playing with the platform’s algorithms, trying to see how, and how aggressively, they work. If I click on Object X, or a set of things that are like Object X, how does my feed change and how quickly? Where do the algorithms want to take me in the world of feeling? For me, this is not about hacking the algorithm, but about developing a sensitivity to it so that I am conscious of how it is acting on me. So, if you spend time on algorithmically driven media, spend time trying to learn how the algorithms are working by playing with them, experimenting. Pay attention to emotions. This will help you develop a life-opening sensitivity to their texture and a healthy skepticism about their power.
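If you want to see this kind of probing in the abstract, here is a minimal, hypothetical simulation in Python. Nothing in it is modeled on any real platform; the topics, weights, and update rule are all invented. It shows the basic feedback loop: keep clicking on Object X and the feed drifts toward Object X within a handful of interactions.

```python
import random

random.seed(0)  # reproducible toy run

topics = ["news", "pets", "outrage", "recipes"]
weights = {t: 1.0 for t in topics}  # start from a neutral feed

def next_feed(size=10):
    """Sample a feed of `size` items in proportion to topic weights."""
    return random.choices(topics, weights=[weights[t] for t in topics], k=size)

def click(topic, boost=1.5):
    """Simulate clicking an item: the clicked topic gets upweighted."""
    weights[topic] *= boost

for step in range(6):
    feed = next_feed()
    print(f"step {step}: 'outrage' items in feed = {feed.count('outrage')}/10")
    click("outrage")  # keep clicking Object X and watch the drift
```

The numbers are toys, but the shape of the experiment is the one I describe above: perturb the input, watch the output, and note where the system wants to take you.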
Second, I have had to accept that the highly centralized digital empire, especially as it is expressed on social media, is an equivocal, unstable, and unhealthy place. It is that fun house that can quickly become not fun. Acceptance like this can be a form of action if it is not a matter of resignation but of recognition. There may be such a thing as healthy social media use, but there is no such thing today as healthy social media. It's a matter of degrees of unhealth. I treat it like alcohol—notorious for excess, good only in moderation.
Finally, and most importantly, I remind myself repeatedly that the world as I see it online is at best partial and distorted, and that quality people, ideas, news, and information are largely to be found “beneath” and outside the algorithms, not in them. That is, when I go online, most of what I am seeing is the equivalent of the ocean’s surface. It is vast, always in motion, and sometimes turbulent. But beneath the surface there is a whole world, beautiful, glorious, and so often so good. We need to be like dolphins, who know the surface but live in the depths.