Grief and the AI Split
But where I think Amodei’s remarks, quoted above, are facile is that it hasn’t played out as simply that lines of code that would have been written by human programmers are now generated by AI models. That’s part of it, for sure. But what’s revolutionary — a topic I’ve posted about twice already today — is that AI code generation tools are being used to create services and apps and libraries that simply would not have been written at all before. It may well be that the total number of lines of code written by people today isn’t much different from the number written by people a year ago. But there might be 10× more code generated by AI than written by people today.
Before AI, both camps were doing the same thing every day. Writing code by hand. Using the same editors, the same languages, the same pull request workflows. The craft-lovers and the make-it-go people sat next to each other, shipped the same products, looked indistinguishable. The motivation behind the work was invisible because the process was identical.
Now there's a fork in the road. You can let the machine write the code and focus on directing what gets built, or you can insist on hand-crafting it. And suddenly the reason you got into this in the first place becomes visible, because the two camps are making different choices at that fork.
I wrote about this split before in terms of my college math and CS classes: some of us loved proofs and theorems for their own sake, some of us only clicked with the material when we could apply it to something.
[…]
Here's what I notice about my grief: none of it is about missing the act of writing code. It's about the world around the code changing. The ecosystem, the economy, the culture. I think that's a different kind of loss than what Randall and Lawson are describing. Theirs is about the craft itself. Mine is about the context and the reasons why we're doing any of this.
Orchard’s fine essay examines a philosophical divide within the ranks of talented, considerate craftsperson developers. The divide that I’m talking about has been present ever since the demand for programmers exploded, but AI code generation tooling is turning it into an expansive gulf. The best programmers are more clearly the best than ever before. The worst programmers have gone from laying a few turds a day to spewing veritable mountains of hot steaming stinky shit, while beaming with pride at their increased productivity.
There are additional groups. Some good programmers don’t use AI for coding. They’re against it for philosophical reasons, or it doesn’t apply well to their current project, or they haven’t taken the time or built the skills to really make it work for them. There are also hobbyists and people who know some programming, but are not really professional programmers, who are successfully using AI to help build tools for themselves.
Pssst… most software development was slop before AI.
Sure, your code was artisanal and custom crafted to be beautiful, performant, and not at all created to be shipped in a fortnight between countless meetings with unclear goals and zero understanding of the bigger picture or user needs.
But those other folks…
I have so much gratitude for the people who wrote extremely complex software character by character. It’s already difficult to remember how much effort that really took.
This is such a completely different reality from where I live, at this point it’s just difficult to say anything meaningful about it at all.
3 Comments
People are upset about advances in AI leading to staffing reductions, layoffs, and AI-washing of the same.
The thing I keep coming back to, and I wish more people would discuss openly, is why wealthy companies can over-hire, fire, hire younger and cheaper, replace with machines, etc. with impunity, at least in the US.
The business of digital software and design has for decades put the responsibility of continuous education on the employee, often without any stipend or other compensation, much less clear expectations or career advancement as a reward. This is notably different from many other professions.
Some people think all regulations are bad. Not all regulations are bad. I like clean air and water. Companies will happily accept dirty air and water if the existence of poison in the environment increases profit.
We have laws that require companies to pay into unemployment benefit funds at the state level, so that their staffing decisions don’t overburden ex-employees in the short term to the point that those burdens fall on the state. This makes sense.
These benefits (which are paid for by the companies and are not welfare, regardless of opinions on that topic) are paid out at a fraction of wages, often not enough to cover frugal living expenses, and typically run out after months, not years. Recently unemployed tech workers are experiencing periods of unemployment of one to two years, or even longer. Current hiring and firing practices have rendered the unemployment benefits system largely insufficient.
Imagine a world where wealthy tech companies are required to provide compensation (time, money, and access) for continuing education. Imagine a world where companies are required to pay into retraining funds, similar to unemployment benefit funds — more if they choose not to retrain workers before laying them off, mirroring the unemployment surcharges for companies with excessive layoffs.
Tech has always favored the young, the healthy (read: return to office), and those willing to work 80 hours per week for a 40-hour salary. This trend is accelerating with AI, and we know even these ideal workers are working more for less, leading to more stress, burnout, and health issues. Many young tech workers do not plan to stay in the industry longer than about a decade: get in, get as much as you can, and get out. At what point is that unsustainable? Perhaps never, because robots?
What is happening to laid off tech workers nearing retirement is particularly disheartening. For many, being five or even ten years from retirement age can mean retraining is a higher cost to incur than simply taking the hit and having less as they exit the workforce. Combined with where social security is heading, this is particularly concerning. Even those who saved and invested well may end up with a much different standard of living late in life, and no doubt some will die earlier as a result.
Simply put, the current benefits and risks are asymmetrical to such a high degree that it's practically criminal. But those are "just evolutionary changes in the business of tech" and there is "nothing we can do about it".
We are not adapting fast enough in this rapidly changing landscape. Segue into discussion on the future of work (or not), UBI, the value of a human and what it means to be one, etc.
What an exciting and terrifying time to be alive.
A perhaps more narrowly scoped comment regarding craftsmanship:
Individual craftsmanship in any industry eventually gives way to industrial automation, by and large.
I lament this for our profession in general, but mostly because of how quickly the transformation is taking place. That said, it seems that a lot of people fighting the change are in denial that it has already happened and there is no going back. Most of those willing to die on the hill of craftsmanship without AI will be left behind in droves, at least in terms of professional employment.
As alluded to in my previous comment, I strongly believe companies should be compelled to assist in providing a path forward for these people. I also recognize that these people’s resistance to AI-driven change seems to work against this. Though I do wonder how many would resist less if the rug weren’t being pulled out from under them economically and otherwise.
I've said before here that I'm a sys admin rather than a software developer, so I come at this from a different perspective.
I've loved computers since I was young, but math has always held me back. I understand the concepts as they apply, but my brain just can't handle the math. Which was fine for me, because I came up at exactly the right time that computers could do the math for me, as long as I understood the concepts well enough to properly apply them.
I realize it's a complex topic for many reasons, but application-wise, this is how I see AI. Increasingly, the computer is not there to do what I want it to; it's there to push me toward what the few companies who control most technology want me to do.
The point about software being slop before AI is spot on. At some point it's not about how the software got there, it's about what the objective of the software is. Corporate software is not about what the users want anymore; it's not even about what's technically best. It's only about the money. Software companies are no longer founded by people who care about technology; they are founded and run by people who care about money. They could be selling anything. They're salespeople and executives and accountants. The software is more like the latest model car than a carefully crafted information tool.
My point is this: giving the people who actually have to use the software the power to create more software can be a good thing. Of course the same corporations who have been offshoring and cutting every possible corner to increase their personal wealth are going to take advantage of the latest way to essentially legally defraud the company. That's just what they do.
Of course the people who were paying others to write their papers or otherwise cheating will continue cheating/paying their way through college, getting cushy jobs through connections, and generally draining society. AI didn't do that. Like the internet, it's just accelerating it.
We need more true direct democracy on the internet, in software, and in general. This could at least be a way to democratize software development. And those few developers who already knew how to make good software will stand out because the end product will be better. And as we know, "better" has absolutely nothing to do with commercial success.