Thing is: industrialization is about repeating manufacturing steps. You don't need to repeat anything for software; it can be copied arbitrarily at practically no cost.
The idea of automation creating a massive amount of software sounds ridiculous. Why would we need that? More games? They can only be consumed at the pace of the player. Agents? They can be reused once they fulfill a task sufficiently.
We're probably going to see a huge amount of customization, where existing software is adapted to a specific use case or user via LLMs, but why would anyone waste energy re-creating the same algorithms over and over again?
This thought-provoking essay does not consider one crucial aspect of software: the cost to a user of developing facility with a given software product. Historically, monopolistic software producers could force these costs to be borne because the user had no alternative to upgrading to the latest version of, for example, Windows, or Gmail, or the latest version of the GitHub GUI. A significant portion of the open source / free software movement is about software providing stable interfaces (including for the user), so that resources otherwise spent on compulsory retraining for the latest version of something proprietary can instead be invested in configuring existing resources to better suit the user's problem domain. For example, programs like mutt or vim, or my latest discovery, talon.
I don't think the dividing line runs along the open-source front here. Windows has historically offered some of the most stable APIs, while there are plenty of examples of popular open-source software with lots of breaking changes.
I've never found a term I liked for this particular concept at the intersection of education & business so I made one up a while back:
A Knowledge Pool is the reservoir of shared knowledge that a group of people have about a particular subject, tool, method, etc. In product strategy, knowledge pools represent another kind of moat, and a form of leverage that can be used to grow or maintain market share.
Usage: Resources are better spent on other things besides draining the knowledge pool with yet another new interface to learn and spending time and money filling it up again with retraining.
As a developer of almost 30 years now, if I think about where most of my code went, I would say, quantitatively: to the bin.
I processed a lot of data, dumps, and logs over the years. I collected statistical information, mapped flows, and created models of the things I needed to understand.
And this was long before any "big data" thing.
Nothing changed with AI. I keep doing the same things, but maybe the output has colours.
That holds true for a tailor too: even expensive clothing items eventually wear out and get thrown away. They are cared for better and repaired a few times, but in the end, disposed of. I'd say that analogy holds up for 'traditionally' created software vs. AI-created software: handmade clothes vs. fast fashion.
I've been thinking about this for a while, and largely agree that industrialization of software development is what we are seeing. But the emphasis on low quality is misplaced. Take this, for example:

> Industrial systems reliably create economic pressure toward excess, low quality goods.
Industrial systems allow for low quality goods, but also they deliver quality way beyond what can be achieved in artisanal production. A mass produced mid-tier car is going to be much better than your artisanal car.
Scale allows you not only to produce more cheaply, but also to take quality control to the extreme.
These damn articles. Software moved into an industrial revolution when you could write in a high-level language instead of assembly. This has already happened.
The article makes this very point. From the article: “software has been industrialising for a long time: through reusable components (open source code), portability (containerisation, the cloud), democratisation (low-code / no-code tools), interoperability (API standards, package managers) and many other ways”
You either see what Codex and Opus are capable of and extrapolate the trend line, or you don't; the author clearly saw and extrapolated.
Not that I disagree: I'm on record agreeing with the article months ago. Folks in the labs have probably seen it coming for years.
Yes we’ve seen major improvements in software development velocity - libraries, OSes, containers, portable bytecodes - but I’m afraid we’ve seen nothing yet. Claude Code and Codex are just glimpses into the future.
A question the article doesn't address, and which contrasts software with industrialized products of the past: who are the consumers of software produced at industrial scale? Stitching of clothes by machines accelerated garment production only because there was demand and consumption tied to population. But software is not tied to population the way food and clothing are. It doesn't depreciate, and it is not exclusively consumed by people.
Another common misconception: it is now easier to compete with big products, since the cost of building those products will go down. Maybe you think you can build your own office suite and compete with MS Office, or build an SAP with better features and quality. But what went into this software is not just code, but decades of feedback, tuning, and fixing. The industrialization of software cannot provide that.
Hmm, I'm not sure I see the value in "disposable software". In any commercial service people are looking for software solutions that are durable, dependable, extensible, maintainable. This is the exact opposite of disposable software.
The whole premise of AI bringing democratization to software development and letting any layperson produce software signals a gross misunderstanding of how software development works and the requirements it should fulfill.
Yes, software needs to be secure. If we accept the premise that software is going to be churned out in bulk, then the mechanisms for securing software must evolve rapidly... I don't see a world working where there is custom software for everything, each piece insecure in its own way.
But I think the important part of this is the reach the Industrial Revolution had: the end users who were able to "benefit" from it had individual needs for all of those mass-produced goods. Consumer-facing software is different.
The important thing is that goods =/= software. As an end user of software, I rarely need specialized software. I don't need an entire app generated on the spot to split the bill and remember the difference if I have a calculator.
So, yes, we are industrializing software, but this reach that people talk about (I believe) will be severely limited.
The industrial revolution was constrained by access to the means of production, leaving only those with capital able to actually produce, which led to new economic situations.
What are the constraints with LLMs? Will an Anthropic, Google, OpenAI, etc, constrain how much we can consume? What is the value of any piece of software if anyone can produce everything? The same applies to everything we're suddenly able to produce. What is the value of a book if anyone can generate one? What is the value of a piece of art, if it requires zero skill to generate it?
I would say comparing the making of software to a working factory is an analogy mistake. Completed software is analogous to a running factory; making software is like building the factory: the specialised tooling, layouts, supply chain, etc. When you have all of this, your factory runs at industrial scale and produces things, just as your software produces value once it's completed and used by the end user.
Not convinced. There is an obvious value in having more food or more products for almost anybody on Earth. I am not sure this is the case for software. Most people's needs are completely fulfilled with the amount and quality of software they already have.
> There is an obvious value in having more food or more products for almost anybody on Earth
Quite the opposite is true. A large proportion of people would increase both the number of years they live and their quality of life by eating less.
I think the days where more product is always better are coming to an end; we just need to figure out how the economy should work.
But how about some silly software just for a giggle? Like 'write a website that plays a fart sound when you push a button'? That could be a thing for the kids at school.
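For the record, the whole toy fits in a few lines. A minimal sketch in TypeScript for the browser, assuming a fart.mp3 sitting next to the page (the filename is invented for the example):

```typescript
// Minimal sketch of the joke site: one button, one sound.
// Assumes fart.mp3 is served next to the compiled script;
// load the script from a bare HTML page with an empty <body>.
const button = document.createElement("button");
button.textContent = "Do not press";
button.style.fontSize = "3rem";
button.addEventListener("click", () => {
  // Fresh Audio element per click, so rapid presses overlap.
  void new Audio("fart.mp3").play();
});
document.body.appendChild(button);
```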
Personally I think AI is going to turn software into a cottage industry: it will make custom software something the individual can afford. AI is a very long way off from letting the average person create the software they want unless they are willing to put a great deal of time into it, but it is almost good enough that a programmer can take the average person's idea and execute it at an affordable price. We're probably only a year or two from when a capable programmer will be able to offer any small business a completely customized POS setup for the cost of a canned industrial offering today: I will design your website and build you a POS system tailored to your needs and completely integrated with the website, and for a little more I can throw in the accounting and tax software. A bright dishwasher realizing they can make things work better for their employer might be the next billionaire revolutionizing commerce and small business.
I have some programming ability and a lot of ideas, but would happily hire someone to realize those ideas for me. The idea I have put the most time into took me the better part of a year to sort out in all its details, even with the help of AI; most programmers could probably have done it in a night, and with AI could write the software in a few nights. I would have my software for an affordable price, and they could stick it in their personal store so others could buy it. If I am productive with it and show its utility, they will sell more copies, so they have an incentive to work with people like me and help me realize my ideas.
Programming is going to become a service instead of an industry; the craft of programming will be for sale instead of software.
> and for a little more I can throw in the accounting and tax software
As someone who has worked at two companies that raised millions of dollars and had a hundred people tackling just half of this (the tax software), you are in for a treat.
Sure, that is still a ways off, but being able to hire a programmer to meet my modest personal software needs is almost there. Also, the needs of any company that required a hundred people and millions of dollars are very different from the needs of a small restaurant or the like; anyone with enough ambition to run a small restaurant can manage the accounting and taxes for that restaurant, and the same cannot be said for the sort of business you are describing. You are comparing an apple to an orange orchard.
Edit: Just noticed I said "any business"; that was supposed to be "any small business." Edited the original post as well.
If that is true, we will live in a funny world where you lose all your money because you were running some outdated, riddled-with-holes software written by an LLM on some old router or cheap camera. Or some software will stop working after an update because a fix was written by an LLM and nobody checked or tested it. Or there will be 3 outages of big internet services in 2 months.
Another AI entrepreneur who writes a long article about inevitability, lists some downsides in order to remain credible, but all in all just uses neurolinguistic programming on the reader so that the reader, too, will think the "AI" revolution is inevitable.
tl;dr: initially I thought we might be onto something, but now I don't see much of a revolution.
I won't read intention into the text, because I haven't checked any other posts from the same guy.
That said, I think this revolution is not revolutionary yet.
Not sure if it will be, but maybe?
What is happening is that companies are going back to a "normal" number of people in software development. Before, it was because of the adoption of custom software; later, because of a labour shortage; then we had a boom because people caught on to it as a viable career; and then it started scaling down again because one developer can (technically) do more with AI.
There are huge red flags with "fully automated" software development that are not being fixed, but to those outside the area of expertise they don't seem relevant. With newer restrictions related to cost and hardware, AI will be an even worse option unless there is some sort of magic that fixes everything about how it writes code.
The economy (all around the world) is bonkers right now.
Honestly, I saw some junior devs earning six-figure salaries (in USD) and doing less than what my friends and I did when we were juniors. There is inflation and all, but the numbers don't seem to add up.
Part of it all is a re-normalisation, but part of it is certainly a lack of understanding of software and/or engineering.
Current tools, and I include even Kiro, Antigravity, and whatever else, do not solve my problems; they just make my work faster.
It's easier to look for code, find data, and read through blocks of code I haven't seen in a while.
Writing code, not so much. If it is simple and easy they certainly can do it, but for anything more complex it seems faster and more reliable to do it myself (and probably cheaper).
I think the idea is interesting, but immensely flawed.
The following is just disingenuous:
>industrialisation of printing processes led to paperback genre fiction
>industrialisation of agriculture led to ultraprocessed junk food
>industrialisation of digital image sensors led to user-generated video
Industrialization of printing was the necessary precondition for mass literacy and mass education. The industrialization of agriculture also ended hunger in all parts of the world which are able to practice it and even allows for export of food into countries which aren't (Without it most of humanity would still be plowing fields in order not to starve). The digital image sensor allows for accurate representations of the world around us.
The framing here is that industrialization degrades quality and makes products into disposable waste. While there is some truth to that, I think it is pretty undeniable that massive benefits came with it. Mass-produced products are often of superior quality and longevity, and are often the only way certain products can be made available to large parts of the population.
>This is not because producers are careless, but because once production is cheap enough, junk is what maximises volume, margin, and reach.
This just is not true and goes against all available evidence, as well as basic economics.
>For example, prior to industrialisation, clothing was largely produced by specialised artisans, often coordinated through guilds and manual labour, with resources gathered locally, and the expertise for creating durable fabrics accumulated over years, and frequently passed down in family lines. Industrialisation changed that completely, with raw materials being shipped intercontinentally, fabrics mass produced in factories, clothes assembled by machinery, all leading to today’s world of fast, disposable, exploitative fashion.
This is just pure fiction. The author is comparing the highest-quality goods of one point in time, which people took immense care of, with the lowest-quality stuff people buy today, which is not even close to the mean clothing people buy. The truth is that fabrics have become far better, far more durable, and more versatile. The products have become better; what has changed is people's attitude towards their clothing.
Lastly, the author is ignoring the basic economics which separate software from physical goods. Physical goods need to be produced, which is almost always the most expensive part. This is not the case for software: distributing software millions of times is not expensive, and only a minuscule part of the total cost. For fabrics, industrialization meant that development costs increased immensely but per-unit production costs fell sharply. What we are seeing with software is a slashing of development costs.
I agree with you on all of this, and found myself wondering if the author had actually studied the Industrial Revolution at all.
The Industrial Revolution created a flywheel: you built machines that could build lots of things better and for less cost than before, including the parts to make better machines that could build things even better and for less cost than before, including the parts to make better machines... and on and on.
The key part of industrialisation, in the 19th-century framing, is that you have in-built iterative improvement: by driving down cost, you increase demand (the author covers this), which increases investment in driving down costs, which increases demand, and so on.
Critically, this flywheel has exponential outputs, not linear. The author shows the Jevons paradox, and the curve is right there; note the lack of a straight line.
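To make the shape concrete, here's a toy sketch of that loop (the elasticity and cost-decline figures are invented for illustration, not taken from the article): with constant-elasticity demand q = k·c^(−e) and e > 1, total spending q·c keeps rising even as unit cost c falls.

```typescript
// Toy Jevons-style flywheel: unit cost falls 20% per cycle, and demand
// follows a constant-elasticity curve q = k * c^(-e). With e > 1, total
// spending q * c = k * c^(1 - e) rises as cost falls.
const elasticity = 1.5; // assumed for illustration; > 1 is the Jevons regime
const k = 100;          // arbitrary demand scale

let cost = 1.0;
for (let cycle = 0; cycle < 6; cycle++) {
  const quantity = k * Math.pow(cost, -elasticity);
  const spend = quantity * cost;
  console.log(`cycle ${cycle}: cost=${cost.toFixed(3)} quantity=${quantity.toFixed(0)} spend=${spend.toFixed(0)}`);
  cost *= 0.8; // investment drives cost down, restarting the loop
}
```

Under these made-up numbers, spending multiplies by 0.8^(−0.5) ≈ 1.12 per cycle: an exponential output from a linear-feeling rule.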
I'm not sure we're seeing this in AI software generation yet.
Costs are shifting in people's minds, from developer salaries to spending on tokens, so there's a feeling of cost reduction, but that's because a great deal of that seems to be heavily subsidised today.
It's also not clear that these AI tools are being used to produce exponentially better AI tools: despite the jump we saw around GPT-3.5, quantitative improvement in output seems to remain linear as a function of cost, not exponential. Yet investment input seems to be exponential (this makes it feel more like a bubble).
I'm not saying that industrialisation of the type the author refers to isn't possible (and I'd even say most industrialisation of software happened back in the 1960s/70s), or that the flywheel can't pick up with AI, just that we're not quite where they think it is.
I'd also argue it's not a given that we're going to see the output of "industrialisation" drive us towards "junk" as a natural order of things - if anything we'll know it's not a junk bubble when we do in fact see the opposite, which is what optimists are betting on being just around the corner.
> why would anyone waste energy re-creating the same algorithms over and over again?
I'm personally doing just that, because I want an algorithm from a C++ LGPL library working in another language.
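Neither the algorithm nor the library is named above, so purely as an illustration of what such a port tends to look like, here is a stand-in: the classic Levenshtein edit distance, the sort of routine one might re-express in TypeScript from a C++ library rather than link against it.

```typescript
// Illustrative stand-in for a ported routine (the actual library is unnamed):
// iterative Levenshtein edit distance using two reusable rows.
function levenshtein(a: string, b: string): number {
  let prev = Array.from({ length: b.length + 1 }, (_, j) => j);
  let curr = new Array<number>(b.length + 1).fill(0);
  for (let i = 1; i <= a.length; i++) {
    curr[0] = i;
    for (let j = 1; j <= b.length; j++) {
      const subst = a[i - 1] === b[j - 1] ? 0 : 1;
      curr[j] = Math.min(
        prev[j] + 1,        // deletion
        curr[j - 1] + 1,    // insertion
        prev[j - 1] + subst // substitution
      );
    }
    [prev, curr] = [curr, prev]; // swap rows instead of reallocating
  }
  return prev[b.length]; // after the final swap, prev holds the last row
}

console.log(levenshtein("kitten", "sitting")); // 3
```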
> But what went into this software is not just code, but decades of feedback, tuning, and fixing.
On the contrary, this is likely the reason why we can disrupt these large players.
Experience from 2005 just doesn't hold that much value in 2025 in tech.
> Programming is going to become a service instead of an industry; the craft of programming will be for sale instead of software.
Oh wait. It is already a thing.
> industrialisation of agriculture led to ultraprocessed junk food

The mass production of unprocessed food is not what led to the production of hyper-processed food. That would be a strange market dynamic.
Shareholder pressure, aggressive marketing, and engineering for super-palatable foods are what led to hyper-processed foods.