The Gap
What we lose when we abstract away the hard parts of our work
Just this week, I saw an article from the New York Times Magazine shared in a Telegram group for developer educators. The piece, by Clive Thompson, is called "Coding After Coders: The End of Computer Programming as We Know It." I read the whole thing in one sitting, and then I sat with it for a while, because something about the framing kept bothering me.
The article is excellent reporting. Thompson interviewed more than 70 software developers at Google, Amazon, Microsoft, and small startups. The picture he paints is consistent: coders no longer code. They talk. They describe what they want in plain English, and AI agents write the software. One developer said he's "the most prolific coder on the team" despite never writing a single line by hand. Another said the productivity boost is "20X." A veteran described it as feeling "like we've been walking our whole lives" and suddenly being given a ride.
The article frames this transformation through the lens of "abstraction," a concept with deep roots in computing history. Each generation of programmers has automated away the tedious parts of the previous generation's work. Assembly gave way to Python. Python gave way to open-source libraries. And now libraries are giving way to plain-language conversation with an AI. It's progress. It's inevitable. It's the next logical step.
So why does something about it feel off?
The Thing Behind the Thing
The philosopher Albert Borgmann spent decades thinking about what happens when technology makes things easier. In his 1984 book, Technology and the Character of Contemporary Life, he introduced what he called the "device paradigm." The idea is deceptively simple: modern technology tends to split things into two parts. There's the commodity (the thing you want: the warmth, the music, the meal) and there's the machinery (the furnace, the speaker, the microwave). As technology advances, the machinery becomes invisible. You just get the commodity.
A fireplace, Borgmann argued, is not a device. It's what he called a "focal thing." Building a fire requires skill, attention, and physical engagement. It structures your evening. It pulls people toward it. The warmth it provides is inseparable from the act of making it. A central heating system, by contrast, is a device. You set a thermostat. Warmth appears. The machinery is hidden in a basement. You don't think about it. You don't gather around it.
Borgmann wasn't anti-technology. He wasn't suggesting we all go back to chopping wood. His point was subtler: when we hide the machinery, we also hide the engagement. And engagement, the focused attention that a difficult task demands from us, is where a surprising amount of meaning lives.
Reading Thompson's article through Borgmann's lens, something clicks. The developers who are "weirdly jazzed" about AI writing their code are celebrating the arrival of a very powerful device. The commodity (working software) now appears with minimal friction. The machinery (the act of writing, debugging, struggling with code) has been abstracted away. But that machinery wasn't just machinery. For many of these people, it was the focal thing itself.
What Work Feels Like
In 1974, Studs Terkel published Working, a collection of oral histories from people across the American economy: steelworkers, waitresses, dentists, gravediggers, executives. The book's subtitle captures its real subject: "People Talk About What They Do All Day and How They Feel About What They Do." Not what they produce. How they feel.
What Terkel found, again and again, was that people's relationship to their work was rarely about the output. A stonemason didn't talk about the wall. He talked about the way the chisel felt in his hand, the satisfaction of a joint that fit perfectly, the knowledge that the work would outlast him. A piano tuner described listening for overtones the way a painter watches for light. The meaning wasn't in the product. It was in the process, in the quality of attention the work demanded.
Terkel also found something darker. When that engagement was taken away, when the work became rote, mechanical, or disconnected from the worker's judgment, people didn't just feel less productive. They felt less real. "Most of us have jobs that are too small for our spirit," one of his subjects said.
Thompson's article captures this tension without quite naming it. He quotes one Apple engineer who says: "I didn't do it to make a lot of money and to excel in the career ladder. I did it because it's my passion. I don't want to outsource that passion." The article notes this developer is in the minority. But I wonder if the majority's enthusiasm is covering something up. When Kent Beck, a developer since 1972, describes AI's unpredictable output as "addictive, in a slot-machine way," that comparison should give us pause. Slot machines are not typically associated with deep fulfillment.
The Conversation That Isn't
One of the article's most striking observations is that coding has become "a conversation." Developers spend their days talking to AI agents, describing what they want, reviewing what comes back, scolding the agents when they misbehave. One developer writes in his prompt file that pushing code that fails a test is "unacceptable and embarrassing." Another types in frustration: "who told you there is gonna be this table? i havent created this table."
But let's be honest about what kind of conversation this is. It's not a dialogue between equals. It's not even a dialogue between a mentor and a student. It's closer to managing a very fast, very capable, occasionally delusional employee who has no idea why anything matters. The developer provides judgment, context, and intent. The AI provides speed and volume. That's a useful division of labor. But calling it a conversation is generous.
A conversation, at least as most of us understand it, involves two parties who can be changed by what the other says. You talk to a friend about a problem, and sometimes their response reframes the problem entirely. You didn't know what you thought until you heard yourself say it out loud, and then your friend's reaction showed you something you'd missed. That's not what's happening when a developer tells Claude to run its tests.
What's happening is closer to what Borgmann would recognize as the device paradigm applied to thought itself. The developer provides the intent (the commodity they want) and the AI handles the execution (the hidden machinery). The gap between wanting and having shrinks to almost nothing. And in that gap is where a particular kind of thinking used to live: the slow, frustrating, sometimes excruciating process of translating an idea into something that actually works.
What Gets Lost in the Abstraction
Ted Chiang, the science fiction writer whose stories are among the most thoughtful explorations of technology and consciousness being written today, once made a distinction that feels relevant here. He argued that the struggle of composition isn't a bug to be optimized away. It's the thing itself. The output matters, but the process of getting there is where the understanding forms.
Software developers have always known this intuitively. The reason debugging is so painful is the same reason it's so valuable: it forces you to understand your own system deeply enough to find where your assumptions broke down. That understanding doesn't just fix the bug. It makes you a better engineer. It gives you the intuition that lets you design better systems next time.
Thompson's article acknowledges this concern, mostly through the voice of junior developers. Pia Torain, only two years into her career when her company told her to start using AI, noticed her coding skills weakening after just four months. "If you don't use it, you're going to lose it," she said. The article treats this as a generational worry, something the old guard doesn't share because they've already banked decades of expertise. But I think the concern runs deeper than skills.
What's at stake isn't just whether the next generation can write a for-loop by hand. It's whether they'll develop the kind of judgment that only comes from having struggled with the material directly. You can't assess code quality if you've never felt the difference between elegant and clumsy in your own fingers. You can't spot a bad architecture if you've never lived inside the consequences of one. The senior developers in Thompson's article aren't just using knowledge they accumulated years ago. They're using taste. And taste comes from direct engagement with the medium, not from managing agents who engage with it for you.
A Question Worth Sitting With
I don't think AI-assisted coding is going away. I don't think it should. The article makes a compelling case that it's enabling things that genuinely matter: small businesses that could never afford custom software can now have it. Developers can clear through backlogs that have been growing for years. Boring, repetitive code gets written in minutes instead of days.
But I keep coming back to Borgmann's fireplace. The question isn't whether central heating is better than a fire. By most practical measures, it obviously is. The question is what kind of life you build around each one. A thermostat doesn't structure your evening. It doesn't pull your family into the same room. It doesn't give you something to tend.
Thompson's article ends with a line that's meant to sound expansive: "Abstraction may be coming for us all." He means it as a description of the future of white-collar work, and he's probably right. But the word "abstraction" has a double meaning that the article doesn't quite reckon with. To abstract something is to remove it, to pull it away. When we abstract away the hard parts of our work, we might also be abstracting away the parts of ourselves that were formed by doing hard things.
The developers who are thrilled about their new productivity are measuring something real. But Studs Terkel would have asked them a different question. Not "how much more can you build?" but "how does it feel to build it?" The answer to that question matters more than most of us are comfortable admitting. And right now, in the rush to celebrate what AI gives us, it's a question that's barely being asked.


