Technopoly by Neil Postman

I don’t know what took me so long to get around to reading Neil Postman’s work, but here we are. Postman’s 1992 book, Technopoly, is almost unnervingly prescient.

I have a complicated history with technology (as we currently think of it). I didn’t get on the tech train at the first stop. I wasn’t one of those futzing around with Geocities in the late 90s. But I do recall four specific experiences that got me on board.

The first was discovering Linux, wiping Windows from my eMachines computer and installing Ubuntu in 2005. The freedom it gave me to tinker, to bend the computing experience to my will, was intoxicating. Around the same time, I started learning enough basic HTML and CSS to customize my Myspace page and felt the same thrill at manipulating the code and seeing the changes reflected on the screen.

On the consumer hardware side, two launches got me fully on board. It wasn’t the iPhone but the iPod that first captivated me. Instead of lugging around a 350-page CD binder, I carried all my music in a device the size of a deck of cards. Carrying 1,000 songs in my pocket made me feel godlike.

And what the iPod was to music, the Kindle was to books. Suddenly I could carry an entire library around with me—and a bookstore, too. I still remember the first time a friend recommended a book, and within 60 seconds I had pulled out my Kindle, purchased the book, and started reading.

The future was wild.

I’ve since become disillusioned, not by technology per se, but by the particular version of the technology industry that’s allowed to flourish under neoliberal capitalism. It’s monopolies and gamification and enshittification all the way down.

I lacked any kind of historical scaffolding for my disillusionment until I came across a piece by digital technology critic LM Sacasas. Writing in his newsletter, Sacasas detailed how the prevailing narrative of Christianity as the backdrop for the rise of Western culture missed something huge: a parallel religion, no less influential but much sneakier. The religion of technology.

There was a point at which the two—the technological project and Christianity—were inseparable in their goals. I’ll skip the details here, but the point is that in early America, technology was created in service of something, namely religion. This was the age of Providence.

But at a certain point, technology shed its role as religion’s co-conspirator and became a religion itself. This was the age of Progress.

Crucially, Progress still had a goal in mind. Technologies were created for a purpose, usually to maximize human flourishing (ostensibly, anyway).

Then came the final transition, to the age of Innovation. Technology was no longer a means to an end. Innovation itself was the goal, which led us to where we find ourselves now:

Perhaps most importantly, however, I would argue that the religion of technology was always fundamentally unstable. Technology is a means to an end. The moment it became an end in itself, that is to say, the moment technology became the dominant partner in the religion of technology and took up the role of civil religion, at that moment our present moment became inevitable.

That inevitability is the thread running through Postman’s Technopoly.

Early in the book, Postman argues that technology has always been limited by the ideology of the culture it’s created in and by, even when by “technology” we mean something as simple as early humans making tools:

In any case, theological assumptions served as a controlling ideology, and whatever tools were invented had, ultimately, to fit within that ideology. We may say, further, that all tool-using cultures from the technologically most primitive to the most sophisticated are theocratic or, if not that, unified by some metaphysical theory. Such a theology or metaphysics provides order and meaning to existence, making it almost impossible for technics to subordinate people to its own needs.

Technology operates within the ideological framework of its creators, and can’t escape those constraints. Until relatively recently, technology was deployed as a means to further entrench that ideological framework, or to support it in some way.

So what happens when technology becomes the means and the end? Postman again (writing in the early 1990s, mind you):

Attend any conference on telecommunications or computer technology, and you will be attending a celebration of innovative machinery that generates, stores, and distributes more information, more conveniently, at greater speeds than ever before. To the question “What problem does the information solve?” the answer is usually “How to generate, store, and distribute more information, more conveniently, at greater speeds than ever before.” This is the elevation of information to a metaphysical status: information as both the means and end of human creativity. In Technopoly, we are driven to fill our lives with the quest to “access” information. For what purpose or with what limitations, it is not for us to ask.

Sound familiar? The quest for efficiency for efficiency’s sake is so deeply ingrained in our culture that to question it is heresy. (The implications for the current AI discourse are obvious, but that’s for another time.)

So we march on, chasing the dream of perfect efficiency. The messiness of human life is an obstacle to that goal, so we ignore aspects of it that can’t be quantified. There is simply too much information, so we must reduce it to something intelligible.

Schools were the first technology to accomplish that goal, through curriculums. Bureaucracy was the next innovation in information reduction.

In principle a bureaucracy is simply a coordinated series of techniques for reducing the amount of information that requires processing. Beniger notes, for example, that the invention of the standardized form, a staple of bureaucracy, allows for the “destruction” of every nuance and detail of a situation. By requiring us to check boxes and fill in blanks, the standardized form admits only a limited range of formal, objective, and impersonal information, which in some cases is precisely what is needed to solve a particular problem.

The process of eliminating extraneous information to produce something useful and/or intelligible isn’t necessarily pernicious. Our brains do it every day; without that capacity, we’d be too busy processing every individual blade of grass we walk by to do anything else.

But someone has to decide what information to include and what to omit, and it’s here where we start to run into trouble. Using medicine as an example, Postman explains how medical technology shapes how doctors approach their work:

Technology is not a neutral element in the practice of medicine: doctors do not merely use technologies but are used by them. Second, technology creates its own imperatives and, at the same time, creates a wide-ranging social system to reinforce its imperatives. And third, technology changes the practice of medicine by redefining what doctors are, redirecting where they focus their attention, and reconceptualizing how they view their patients and illness.

And who decides what information to omit and what to measure? The creators of the technology being used. Spotify, for example, could choose to use albums as the app’s organizing principle. That decision might lead to the inclusion of critical reviews of albums or artist commentary as central, rather than peripheral, components of the app.

Of course, that doesn’t serve Spotify’s interests very well. Their goal is to maximize the number of listener hours, and the most efficient way to do that is to make the listening decisions for you. Open the app and press play. We know what you like.

Music streaming is a relatively benign example, but a recent ProPublica story shines a light on what can happen when we extend this line of reasoning to more consequential areas like prisoner parole.

The state of Louisiana now uses an algorithm to determine who among the prison population is eligible for parole, and who is not. According to ProPublica, the algorithm does not take into account efforts prisoners make to rehabilitate themselves. Instead, it focuses on factors that “cannot be changed.”

This is the future we’re building: a world where the purpose of a technology is secondary to its ability to efficiently organize information, if it has a purpose at all.

Instead, we fetishize gathering gargantuan amounts of information to feed to algorithms that sort through it for us, and cede our own power to them in the process. Doing so presupposes that the most meaningful and intransigent problems we face can be solved by simply acquiring more information for the machine to sort through. Postman again:

The computer argues, to put it baldly, that the most serious problems confronting us at both personal and public levels require technical solutions through fast access to information otherwise unavailable. I would argue that this is, on the face of it, nonsense. Our most serious problems are not technical, nor do they arise from inadequate information. If a nuclear catastrophe occurs, it shall not be because of inadequate information. Where people are dying of starvation, it does not occur because of inadequate information. If families break up, children are mistreated, crime terrorizes a city, education is impotent, it does not happen because of inadequate information. Mathematical equations, instantaneous communication, and vast quantities of information have nothing whatever to do with any of these problems. And the computer is useless in addressing them.

Efficiency may be the answer to some questions. But increasingly, we don’t even get to ask questions for which efficiency or technology isn’t the answer. And when we stop asking those questions, we limit the possible worlds we can build to those that technology (and its makers) build for us.

What might it mean to instead pick up the hammer and build for ourselves again?

May 26, 2025
