Technopoly by Neil Postman

I don’t know what took me so long to get around to reading Neil Postman’s work, but here we are. Postman’s 1992 book Technopoly is almost unnervingly prescient.

I have a complicated history with technology (as we currently think of it). I didn’t get on the tech train at the first stop. I wasn’t one of those futzing around with Geocities in the late 90s. But I do recall four specific experiences that got me on board.

The first was discovering Linux, wiping Windows from my eMachines computer and installing Ubuntu in 2005. The freedom it gave me to tinker, to bend the computing experience to my will, was intoxicating. Around the same time, I started learning enough basic HTML and CSS to customize my Myspace page and felt the same thrill at manipulating the code and seeing the changes reflected on the screen.

On the consumer hardware side, two launches got me fully on board. It wasn’t the iPhone but the iPod that first captivated me. Instead of lugging around a 350-page CD binder, I carried all my music in a device smaller than a deck of cards. Carrying 1,000 songs in my pocket made me feel godlike.

And what the iPod was to music, the Kindle was to books. Suddenly I could carry an entire library around with me—and a bookstore, too. I still remember the first time a friend recommended a book, and within 60 seconds I had pulled out my Kindle, purchased the book, and started reading.

The future was wild.

I’ve since become disillusioned, not by technology per se, but by the particular version of the technology industry that’s allowed to flourish under neoliberal capitalism. It’s monopolies and gamification and enshittification all the way down.

I lacked any kind of historical scaffolding for my disillusionment until I came across a piece by digital technology critic LM Sacasas. Writing in his newsletter, Sacasas detailed how the prevailing narrative of Christianity as the backdrop for the rise of Western culture missed something huge: a parallel religion, no less influential but much sneakier. The religion of technology.

There was a point at which the two—the technological project and Christianity—were inseparable in their goals. I’ll skip the details here, but the point is that in early America, technology was created in service of something, namely religion. This was the age of Providence.

But at a certain point, technology shed its role as religion’s co-conspirator and became a religion itself. This was the age of Progress.

Crucially, Progress still had a goal in mind. Technologies were created for a purpose, usually to maximize human flourishing (ostensibly, anyway).

Then came the final transition, to the age of Innovation. Technology was no longer a means to an end. Innovation itself was the goal, which led us to where we find ourselves now:

Perhaps most importantly, however, I would argue that the religion of technology was always fundamentally unstable. Technology is a means to an end. The moment it became an end in itself, that is to say, the moment technology became the dominant partner in the religion of technology and took up the role of civil religion, at that moment our present moment became inevitable.

That inevitability is the thread running through Postman’s Technopoly.

Early in the book, Postman argues that technology has always been limited by the ideology of the culture it’s created in and by, even when by “technology” we mean something as simple as early humans making tools:

In any case, theological assumptions served as a controlling ideology, and whatever tools were invented had, ultimately, to fit within that ideology. We may say, further, that all tool-using cultures from the technologically most primitive to the most sophisticated are theocratic or, if not that, unified by some metaphysical theory. Such a theology or metaphysics provides order and meaning to existence, making it almost impossible for technics to subordinate people to its own needs.

Technology operates within the ideological framework of its creators, and can’t escape those constraints. Until relatively recently, technology was deployed as a means to further entrench that ideological framework, or to support it in some way.

So what happens when technology becomes the means and the end? Postman again (writing in the early 1990s, mind you):

Attend any conference on telecommunications or computer technology, and you will be attending a celebration of innovative machinery that generates, stores, and distributes more information, more conveniently, at greater speeds than ever before. To the question “What problem does the information solve?” the answer is usually “How to generate, store, and distribute more information, more conveniently, at greater speeds than ever before.” This is the elevation of information to a metaphysical status: information as both the means and end of human creativity. In Technopoly, we are driven to fill our lives with the quest to “access” information. For what purpose or with what limitations, it is not for us to ask.

Sound familiar? The quest for efficiency for efficiency’s sake is so deeply ingrained in our culture that to question it is heresy. (The implications for the current AI discourse are obvious, but that’s for another time.)

So we march on, chasing the dream of perfect efficiency. The messiness of human life is an obstacle to that goal, so we ignore aspects of it that can’t be quantified. There is simply too much information, so we must reduce it to something intelligible.

Schools were the first technology to accomplish that goal, through curriculums. Bureaucracy was the next innovation in information reduction.

In principle a bureaucracy is simply a coordinated series of techniques for reducing the amount of information that requires processing. Beniger notes, for example, that the invention of the standardized form, a staple of bureaucracy, allows for the “destruction” of every nuance and detail of a situation. By requiring us to check boxes and fill in blanks, the standardized form admits only a limited range of formal, objective, and impersonal information, which in some cases is precisely what is needed to solve a particular problem.

The process of eliminating extraneous information to produce something useful and/or intelligible isn’t necessarily pernicious. Our brains do it every day; without that capacity, we’d be too busy processing every individual blade of grass we walk by to do anything else.

But someone has to decide what information to include and what to omit, and it’s here where we start to run into trouble. Using medicine as an example, Postman explains how medical technology shapes how doctors approach their work:

Technology is not a neutral element in the practice of medicine: doctors do not merely use technologies but are used by them. Second, technology creates its own imperatives and, at the same time, creates a wide-ranging social system to reinforce its imperatives. And third, technology changes the practice of medicine by redefining what doctors are, redirecting where they focus their attention, and reconceptualizing how they view their patients and illness.

And who decides what information to omit and what to measure? The creators of the technology being used. Spotify, for example, could choose to use albums as the app’s organizing principle. That decision might lead to the inclusion of critical reviews of albums or artist commentary as central, rather than peripheral, components of the app.

Of course, that doesn’t serve Spotify’s interests very well. Their goal is to maximize the number of listener hours, and the more efficient way to do that is to make the decision as to what to listen to for you. Open the app and press play. We know what you like.

Music streaming is a relatively benign example, but a recent ProPublica story shines a light on what can happen when we extend this line of reasoning to more consequential areas like prisoner parole.

The state of Louisiana now uses an algorithm to determine who among the prison population is eligible for parole, and who is not. According to ProPublica, the algorithm does not take into account efforts prisoners make to rehabilitate themselves. Instead, it focuses on “factors that cannot be changed.”

This is the future we’re building: a world where the purpose of a technology is secondary to its ability to efficiently organize information, if it has a purpose at all.

Instead, we fetishize gathering gargantuan amounts of information to feed to algorithms to sort through it for us, and cede our own power to it in the process. Doing so presupposes that the most meaningful and intransigent problems we face can be solved by simply acquiring more information for the machine to sort through. Postman again:

The computer argues, to put it baldly, that the most serious problems confronting us at both personal and public levels require technical solutions through fast access to information otherwise unavailable. I would argue that this is, on the face of it, nonsense. Our most serious problems are not technical, nor do they arise from inadequate information. If a nuclear catastrophe occurs, it shall not be because of inadequate information. Where people are dying of starvation, it does not occur because of inadequate information. If families break up, children are mistreated, crime terrorizes a city, education is impotent, it does not happen because of inadequate information. Mathematical equations, instantaneous communication, and vast quantities of information have nothing whatever to do with any of these problems. And the computer is useless in addressing them.

Efficiency may be the answer to some questions. But increasingly, we don’t even get to ask questions for which efficiency or technology isn’t the answer. And when we stop asking those questions, we limit the possible worlds we can build to those that technology (and its makers) build for us.

What might it mean to instead pick up the hammer and build for ourselves again?

May 26, 2025

But does it yearn?

Writing that stops me in my tracks tends to come in two flavors. The first is an intensely aesthetic experience, the prose itself being almost heartbreakingly beautiful. The second feels like it unlocks some hidden recess of my psyche, as if that particular construction of language is a key to a very specific corner of my soul.

And then there is writing that does both:

Eroticism, then, is about gravity, not about gratification, the pull toward something that reorients the self. And this gravity manifests not only in romantic or sexual relationships, but in every domain where desire dares to disrupt duty. A woman leaves a secure career to pursue sculpture because it makes her pulse quicken, not because it makes sense. A man moves across the world for a language he barely speaks. A reader becomes a writer because a single sentence broke them open. These are erotic decisions: driven not by logic, but by the magnetic logic of the soul. They often look like madness to the outside world — irresponsible, inexplicable, dramatic — but they are the precise moments where life, real life, begins to throb.

I went to bed last night with these words dancing in my mind, and when I woke up, the shadow of the rhythm still echoed. It’s a cliché to say that by the time we hit middle age, we get bogged down in the minutiae of life and forget what all this is for. But there’s truth in the cliché.

This conception of eroticism—as a yearning that is inevitable and immortal—strikes me as as good a description of what animates humanity as any.

It’s also the very thing missing from AI.

In his latest post, Oliver Burkeman gets at a concept adjacent to eroticism, which he calls aliveness:

The concept that sits right at the heart of a sane and meaningful life, I’m increasingly convinced, is something like aliveness. It goes by other names, too, none of which quite nail it — but it’s the one thing that, so long as you navigate by it, you’ll never go too far wrong. Sometimes it feels like a subtle electrical charge behind what’s happening, or a mildly heightened sense of clarity, or sometimes like nothing I can put into words at all.

Burkeman points out that none of the current futures being laid out by tech bros feature this aliveness:

Most obviously, aliveness is what generally feels absent from the written and visual outputs of ChatGPT and its ilk, even when they’re otherwise of high quality. I’m not claiming I couldn’t be fooled into thinking AI writing or art was made by a human (I’m sure I already have been); but that when I realise something’s AI, either because it’s blindingly obvious or when I find out, it no longer feels so alive to me. And that this change in my feelings about it isn’t irrelevant: that it means something.

Much ink has been spilled in trying to articulate exactly what’s missing from the so-called intelligence that’s being foisted upon us (with or without our consent). That lack is what we feel sitting heavy in our stomachs when we think of machines taking on human roles like caregivers, resumé readers, insurance claim adjusters, or even partners.

I can think of no better way to frame that lack than the capacity for eroticism. Humans yearn. It’s what we do, when we strip away all other pretenses. (It’s also precisely this yearning that an author can gift to a machine in a work of fiction to make us think of that machine as more human, and thus worthy of our empathy. It has to want.)

AI can help us solve certain problems, especially deterministic ones. But as Burkeman points out, we’re also being sold a future where AI doesn’t stay in its deterministic lane, where it “solves” problems like loneliness.

But to solve a problem so fundamental to the human experience requires other humans, because longing is what defines us. We yearn for acceptance. We yearn to be seen. We yearn, most of all, for each other.

AI will almost certainly help us develop new cures for disease. It will help us alleviate traffic, minimize food waste, increase supply chain efficiency.

But there are some problems AI is not equipped to solve, and we forget that at our own peril. For some problems, we need humans, which means we need to build infrastructure to bring us closer. We need to yearn, together.

May 17, 2025

Jia Tolentino captures the vibe

It feels like an impossible task to describe the vibe of 2025. Gesturing broadly became a thing because how can you possibly sum up what’s happening, let alone how you feel about it, in a few words? Everything is happening, all at once, and it’s mostly bad (and indescribably stupid).

But Jia Tolentino put it all into words:

Fake images of real people, real images of fake people; fake stories about real things, real stories about fake things. Fake words creeping like kudzu into scientific papers and dating profiles and e-mails and text messages and news outlets and social feeds and job listings and job applications. Fake entities standing guard over chat boxes when we try to dispute a medical bill, waiting sphinxlike for us to crack the code that allows us to talk to a human. The words blur and the images blur and a permission structure is erected for us to detach from reality—first for a moment, then a day, a week, an election season, maybe a lifetime.

May 15, 2025

What did we trade for a frictionless life?

“Why I quit ____” is not a new genre, but its staying power is telling. In the early 2010s, when techno-optimism ran high and we were only just beginning to realize the scope of Silicon Valley’s destructive powers, these confessions felt vaguely counter-cultural.

Technology was a drug, and while we were all riding that high, some were beginning to notice the side effects. We see the costs much more clearly today, now that the euphoria has worn off.

Still, these takes can be useful—arguably more so since software has eaten the world. Back then, the warnings came mostly from millennials who knew—or at least felt, like a fresh itch—what might be lost. Like me, they grew up in an analog world. They spent their childhoods twirling phone cords around their fingers and lugging around a binder filled with 300 CDs everywhere they went.

As digital natives, Gen Z doesn’t have the luxury of that comparison. The algorithmic world is all they’ve ever known, so they’re mostly unaware that the water they swim in is tainted with the residue of dark patterns and surveillance capitalism and enshittification, like the microplastics that line our guts.

It’s with all this in mind that I read Kyle Chayka’s piece on why Spotify’s latest redesign pushed him over the edge:

A tweak to an app’s landing page may seem minor; what’s the big deal if it takes an extra click or two to get to your library of albums? But such inconveniences have rippling effects; if albums are harder to get to, then over time they become less important as units of online listening.

Music is often held up as the industry most thoroughly mutilated by software. Its ubiquity, its cultural impact, and its role in shaping our identities make it easy to see how changes to our listening habits are changes to us.

But I fear we risk losing sight of those changes, which is why I’m still grateful for takes like these. I still miss my CD binder, and I’m not the only one. Writing about the loss of the music store snob, Lisa Kholostenko describes how the physical nature of the CD binder impacted the way we moved through the world:

The physicality of that media felt crucial, like a talisman or a rib: a document of sonic runes to someone’s teenage psyche, where every small drama felt like a Greek tragedy. It wasn’t just music. It was a film score for the cinematic masterpiece of Being Sixteen and Misunderstood. You could flip through the binder like a priest reading entrails, divine the shape of someone’s desperate yearning to be both hidden and seen. Today, the equivalent would be asking to browse someone’s phone (a request so egregious it borders on social contract violation) or cyber-peeking at their Spotify, which I famously endorse.

The binder had pageantry. The binder bled.

The CD binder, the music store snob, the six-disc CD changer: The physical nature of music and how we interacted with it left an imprint in a way that no algorithm, no frictionless software experience can, the same way reading about how to swing a baseball bat and actually swinging a baseball bat are two entirely different experiences.

I suspect this is why concerts can feel like quasi-religious experiences today. That was always true to a degree, but it feels even more pronounced now, because we are desperate to feel our music in our bodies, for it to leave the digital realm and meet us in meatspace.

We need to be reminded of what we lost if we’re ever going to reclaim it. There’s no going back, of course, but the resurgence of vinyl record stores and physical book sales tell us something important: frictionless software is all well and good, but what we didn’t realize back then is that we need friction. We need experiences that remind us that we’re physical bodies moving through a physical world.

The same goes for all software. In another piece, Tara McMullin reminds us that the same flattening effect that erodes our relationship to music changes how we work, too:

Do you see your software? Do you see how it influences how you run meetings, brainstorm ideas, fulfill your responsibilities, and communicate with others? Do you see how its text boxes, radio buttons, tabs, search results, and menus train you to think?

There’s a world in which we learn this lesson, where we take the best digital has to offer and harmonize it with our physical needs. But first we have to be sufficiently aware—and sufficiently tired—of the tradeoffs that convenience and frictionlessness have demanded of us.

April 25, 2025

The Bureau of Human Nutrition and Home Economics

This has me all kinds of nostalgic. For NOEMA, John Last describes a government agency whose sole function was to find new ways to live a better life.

They shared what they found with private companies, which incorporated those findings to build better products:

The ideal kitchen sink, for a working kitchen of any reasonable size, has two basins. On the right, it’s a shallow five inches, a comfortable depth for washing dishes. On the left, it’s deeper — eight inches, perfect for rinsing down fresh fruits and vegetables. A wire drying rack is sized to fit the deeper basin; a cupboard behind the faucets secrets away soaps and sponges.

You’re unlikely to find this sink design in most modern houses. But it is the fruit of decades of diligent government research conducted primarily by a little-known agency known as the Bureau of Human Nutrition and Home Economics. From 1923 to 1962, the bureau deployed mass public surveys, built experimental houses and conducted research into hundreds of consumer products from textiles to meats to kitchen sinks, all to deduce scientifically the best possible way to live a middle-class life in midcentury America.

This gets at a very deep-rooted problem in contemporary America: distrust of government. This type of government entity isn’t possible today because the government is seen as inherently wasteful, and who is the government to tell private companies how to build better products?

It wasn’t perfect, but this level of cooperation between the public and private sectors feels like a dream. Sometimes we forget what we’ve been deprived of.

April 23, 2025

I’m blogging again

I’ve decided it’s time to start (link) blogging again, though I reserve the right to go on the occasional rant.

This is mostly (if not entirely) for my own benefit. Making sense of the world is an increasingly Sisyphean task. We’re drowning in information, so when a writer or thinker helps me make sense of it, it feels like coming up for air.

Those moments are invaluable to me, and I’d like a place to record them, to give them some weight and permanence. So this place is for me, but I hope some of these moments of reinvigoration help others feel restored, too. At least on occasion.

April 23, 2025

View the archives