(read part 1 here!)
In the physical world, a product goes through two major steps in its life cycle: design and manufacturing. In most industries, this is a very 'over-the-wall' process: production only starts after the design phase is completely finished, the necessary machinery has been set up, and workers have been trained to manufacture the new item. It is often argued that software does not have this structure, since there is no manufacturing step. While this is true for most in-house developed software, it does seem possible to identify some post-design steps in software sold either as a product or as a service. These are steps that are not part of the design phase, but need to happen again and again for every sale, and are therefore the closest we have to 'manufacturing'. They include deployment to production servers, configuring the software, adapting other systems to properly integrate with the new piece of code, etc.
Why are these steps necessary? In what way is software so different from hardware that there is significant post-sale work to be done before it is operational? My watch did not need to be installed by a professional. I did not need to specify what colour I wanted the strap to be. It's perfectly normal to buy a physical item without giving any input at all. The most important factor here is simply choice. I did not need to configure my watch, because there are thousands of models available, from the very cheap to the extremely expensive. Some are very basic, others are intricately designed timepieces that work in the most extreme circumstances. All of them would fulfill my basic requirement. I never needed to specify exactly what I wanted, because I could shop around until I found the perfect watch. In the process, I saw options I never knew existed.
The current state of software is very different. Software requirements are, on average, no more complicated than the requirements we have for physical items. The difference is in choice. If I'm lucky, and what I'm looking for is a popular requirement, I might have a few dozen software systems to choose from. All of them will meet my need, but most will be heavy and expensive, and will do far more than I ever wanted. And most of them will need some form of configuration to match my IT environment and my corporate structure. Why are there not thousands of applications that just do contact management? Why can I not simply choose the one that matches what I need, and have that be the end of it?
The 1800s vs now
To figure out how we can change that market and what the future might hold, let's take a look at the physical industrial revolution. The world changed a lot between the middle of the 18th century and the middle of the 19th. We developed machine manufacturing, made better use of steam power and coal, and increased worker productivity enormously. We were able to refine basic elements, like iron, better and on a larger scale, and transportation systems became faster, more efficient and, most of all, capable of handling a far greater load. All these developments together led to a slow revolution, a time when factories started working together and becoming each other's clients on a massive scale. After those roughly 100 years, craftsmen had been replaced by factories that assembled products from components they bought from other factories, which in turn made them from basic materials bought from still others, and so on.
The great developments of the day did not happen in isolation, of course. They played off each other, and it was only because they were all present and all making great strides that the industrial revolution went so fast and was so successful. Each of them also seems to have an equivalent in the software world.
Let's start off with the basics. The first commercially successful steam engine was created around 1712. By 1783, James Watt had managed to improve its power output and reliability, and to turn that power into rotary motion. Power to drive machinery became cheap and easily available. Coal became ubiquitous, allowing greater energy output for less work. In later years, electricity came on the scene, which was a lot more flexible and, crucially, pay-per-use. For years, we've seen the same developments in software. Computers have become more powerful and cheaper. They've also become more easily available throughout the world, to the point where a large percentage of the population now carries a computer in their pocket. We see the pay-per-use factor return in every cloud offering, and computing power of any magnitude has become available to everybody through cloud providers.
The increased use of coal caused major breakthroughs in metallurgy. Higher temperatures and cleaner burning techniques improved the quality of available copper, iron and steel. To a software engineer, code is iron. A line of code is the basic element of our entire industry. Stronger computers and more advanced ways of using their power have allowed that basic element to improve in quality as well. We run garbage collectors, abstraction layers and emulation systems, all with the purpose of making our lines of code as clean and as high-quality as possible. Programming languages become more advanced by the day. New programming paradigms and models tend to be less efficient power-wise, but better for cleanliness and extensibility. We've gone from writing assembly code to Python, from COBOL to .NET.
Transportation of data has steadily become faster. 10 gigabit Ethernet is no longer an exception, 4G networks are being rolled out across the globe, and even structures as simple as a message queue are now able to handle enormous quantities of data. Meanwhile, basic connectivity has become far more reliable, too. Our faster networks enable new business models, even ones that would have been completely alien to us before. Software-as-a-Service lives by this. Nobody on dial-up would ever have considered uninstalling their local word processor for something like Google Docs. Nowadays, it's not just common; for a lot of software it's the preferred model. Even local e-mail clients are in rapid decline.
So far, these are all developments we've got down pat. The basic groundwork for an industrial revolution has been laid, but we're not there yet. How do we increase worker efficiency a thousandfold? How do we automate installation and configuration procedures to the same degree as an automated factory? Are we far from those goals, or are we perhaps closer than we think? I hope I'll be able to shed some light on that in part 3. Stick around.