Here's Why Software Development Cannot Be Industrialized
What does it mean to industrialize?
Industrialization of anything is an attempt to convert a development process into a manufacturing process. In development - whether that be land development, housing development, or software development - there is a product to be delivered, but the methods to get there are not always predictable. Say we are developing a 30-acre parcel of land for eventual construction of apartments. The path to completion is fraught with unknowns. You could hit bedrock or an ancient burial ground. The soil may be denser than expected. A protected species of bird could nest in a tree on the land. None of these issues may stop a project in its tracks, but each would expand cost and timeline.
Now in manufacturing, the goal is to produce the same product the same way over and over again, with an eye towards consistency in process as well as output. Henry Ford's assembly line is the classic example of how a process of development can be turned into a process of manufacturing. While in development we may adjust course, and the end product may differ from what was originally planned - in manufacturing neither of these variances is acceptable. A variance in process introduces unpredictability to everything downstream of it. Think about the quality issues in American cars before the 1990s. They were largely the result of small variances in design and construction, which led to consolidation of designs and robotization of the manufacturing process. Today, cars are higher quality, but that has come at the cost of variety (all of the crossover SUVs on the market today look the same for a reason - some more than others, since certain models share the exact same frame across automobile manufacturers).
The move to industrialize software development follows the same pattern as cars - the idea that we can create a formula for each widget that needs to be produced and just replicate that formula over and over again. We have managed to turn a great many production processes - even farming - into something done at massive scale with steadily diminishing labor costs through automation. So what have we done to make it so that software can be manufactured like a car? It turns out - not very much.
Why is industrialization hard in software development?
Before Henry Ford deployed the assembly line to build cars, building a car was a largely manual and custom effort. Manufacturers focused on local markets, drawing on a growing number of local suppliers. The application of the assembly line required that the consumer make significant tradeoffs, the legacy of which is the famous Ford quote: "A customer can have a car painted any color he wants as long as it's black." Stripping away variables and options is the most effective way of making a manufacturing process possible: cost effective, repeatable, and requiring minimal human skill to accomplish.
In software development, there is a grand vision that some day we will be able to say "I need a system to manage my invoicing!" and have it assembled for us. Indeed, this is quite true for a lot of commodity products today - including financial management systems, business management systems, and the like. But many enterprises, large and small, are either resistant or unable to conform their needs to "off the shelf" solutions, and so they opt for a team of software developers to come and build a solution greenfield, or to "integrate" an existing solution so that it meets the specific needs of the business. That's now a custom job - and there is no way to industrialize something that hasn't been done before and will not be done the same way again.
Yet companies have been trying for years to find some silver bullet to make custom software development predictable. I'd be hard pressed to think of another product-development industry that has tried so hard to make custom development a manufacturing process. Builders of custom homes may use some pre-fabricated materials, and they may subcontract to specialists whose experience presents the lowest risk of cost and schedule overruns - the same way that in software development we have frameworks and libraries that attempt to reduce the need for reinvention. But predicting a result with a fixed time and budget is still elusive in any development effort - homes or software.
So in came the consultants.
So why all of these methodologies?
The alphabet soup of modern software development methodologies comes from a well-intended source. A group of software professionals (who probably look at the world of development today and wonder whether it was all worth it) got together and wrote what they called The Agile Manifesto. Its name evokes Marx. Its substance is important - even if most Agile practitioners don't seem to have read it. The Manifesto is basically a statement of fact construed as opinion: that even with all of the best efforts of the best people, software development cannot be industrialized, cannot be made 100% repeatable, and cannot be done in a mill. It thus requires a level of collaboration, communication, and compromise between consumer and producer, where to produce the most value the producer ensures there is as much visibility into the developed product as possible. In exchange, the consumer remains engaged, making decisions on a regular basis, so that the cost and the timeline of the project are ultimately their responsibility to manage.
Out of that grew exactly the opposite of what these original practitioners of "agile" had put to paper. Many of the signatories of the manifesto were themselves enlisted by companies who had tried everything else to get their software teams to deliver on time and on budget, and figured this was yet another way. While there certainly were successes in agile adoption that did not hinge on an endgame of on-time, on-budget projects, the main outcome was the continued industrialization of software development project management. From the same people who brought us the general contractor, consultancies repackaged project management as agile transformation or agile adoption or some other trademarked name for their so-called methodology, and went to work selling clients on how that methodology had delivered for their other clients.
It's a flawed bill of goods.
What about artificial intelligence? Won't that change things?
Software writing software feels a lot like saying "robots will build robots" and my guess is that we will eventually see some form of automation on this front - but it won't come in the form of intelligence. It will come in the form of every other attempt to streamline software development: standards.
Anyone over 30 remembers the browser wars - the time when the young internet was a place where people built weird things that only worked in one browser, or, if you're a pessimistic front end developer in their early 40s, the time when nothing worked the same way in any browser.
The competition to become the sole "set top box" for internet access - between Netscape, Microsoft, and eventually Google and Apple - was upended by the fatigue of technology executives sick of spending money to subsidize the battle between the browser developers just to make the experience of their users better. Some took the approach of "we don't support anything but Internet Explorer," but if anything put this argument to rest, it was the dawn of mobile and the need to reduce duplicative effort and quality management issues. We have a more homogenous web today, and I would say it is only going to become more so.
Frameworks are the currency of software automation, taking best practices and standards and making it possible for the lowest-skilled labor to implement based on requirements. Frameworks touch all parts of the development spectrum today. Tools like Terraform have gone a long way to making it so that cloud infrastructure requires less understanding of the vendor implementation. Google's Material Design toolkit has attempted to establish a framework for how user interfaces are designed - and it gets strong adoption because it is perceived to be solving problems that are widespread. NativeScript and Ionic offer ways to build mobile applications with almost no need for separate Android/iOS knowledge. Google's Flutter attempts to go even a step further - homogenizing the design and development of mobile apps.
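The pattern these cross-platform frameworks share can be sketched in a few lines. This is a minimal illustration of the abstraction idea, not the real API of NativeScript, Ionic, or Flutter: application code is written once against a platform-neutral interface, and a per-platform adapter (the names here are hypothetical) supplies the native details.

```typescript
// Hypothetical sketch of the adapter pattern cross-platform
// frameworks rely on. None of these names come from a real framework.

interface PlatformAdapter {
  name: string;
  // Returns a description of the native widget that would be shown.
  showAlert(message: string): string;
}

const iosAdapter: PlatformAdapter = {
  name: "iOS",
  showAlert: (msg: string) => `UIAlertController: ${msg}`,
};

const androidAdapter: PlatformAdapter = {
  name: "Android",
  showAlert: (msg: string) => `AlertDialog: ${msg}`,
};

// Application code is written once, against the interface only.
function warnUser(platform: PlatformAdapter): string {
  return platform.showAlert("Session expired");
}

console.log(warnUser(iosAdapter));     // UIAlertController: Session expired
console.log(warnUser(androidAdapter)); // AlertDialog: Session expired
```

The application logic in `warnUser` never mentions a platform; only the adapters do. That is the sense in which such frameworks reduce the need for separate Android/iOS knowledge - the platform-specific expertise is encoded once, inside the framework, rather than in every team.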
Will robots eventually do all of this? I suppose we are headed in that direction. If we all agree that there is value in standardizing design, and UX becomes the new practitioner of business analysis, then a great deal of software development work done by developers today may be done by non-developers. Platforms like Mendix already promise this.
With all of this said, I still believe that in 2019 software development is an exercise in maximizing human capital. Agile's great epiphany was this very thing - but its realization was corrupted by the false hope that creating methodologies for implementing agile would lead to productivity gains. The desire to make the most costly part of information technology in business disappear - to eliminate the labor - is understandable. But I doubt that the shift to agile in large software organizations has made them smaller. It may simply have created new ways of trying to measure productivity. What it means to measure productivity without simultaneously measuring the value created - that is another exercise in executive skepticism.
Ryan is the former Chief Product Officer at Medullan, CTO at Be the Partner and Vitals, and now is a CTO consultant at Osmosis Knowledge Diffusion and has projects in alternative education, digital therapeutics, and patient engagement.