SCALE THE PLANET

Thoughts and musings on scalability, messaging, Windows Azure, and programming in general

The Industrial Revolution, part 2

(read part 1 here!)

"Manufacturing software"

In the physical world, a product goes through two major steps in its life cycle: design and manufacturing.  In most industries this is a very 'over-the-wall' process: production only starts after the design phase is completely finished, the necessary machinery has been set up, and workers have been trained to manufacture the new item.  It is often argued that software does not have this structure, since there is no manufacturing step.  While this is true for most in-house developed software, it does seem possible to identify some post-design steps in software sold either as a product or as a service.  These are steps that are not part of the design phase but need to happen again and again for every sale, and are therefore the closest we have to 'manufacturing'.  They include deployment to production servers, configuring the software, adapting other systems to properly integrate with the new piece of code, and so on.

Why are these steps necessary?  In what way is software so different from hardware that significant post-sale work needs to be done before it is operational?  My watch did not need to be installed by a professional.  I did not need to specify what colour I wanted the strap to be.  It is perfectly normal to buy a physical item without giving any input.  The most important factor here is simply choice.  I did not need to configure my watch, because there are thousands of models available, from the very cheap to the extremely expensive.  Some are very basic, others are intricately designed timepieces that work in the most extreme circumstances.  All of them would fulfill my basic requirement.  I never needed to specify exactly what I wanted, because I could shop around until I found the perfect watch.  In the process, I saw options I never knew existed.

The current state of software is very different.  Software requirements are, on average, no more complicated than the requirements we have for physical items.  The difference is in choice.  If I'm lucky, and what I'm looking for is a popular requirement, I may have a few dozen software systems to choose from.  All of them will fill my need, but most will be heavy and expensive, and will do far more than I ever wanted.  And most of them will need some form of configuration to match my IT environment and my corporate structure.  Why are there not thousands of applications that just do contact management?  Why can I not simply choose the one that matches what I need, and have that be the end of it?

The 1800's vs now

To figure out how we can change that market and what the future might hold, let's take a look at the physical industrial revolution.  The world changed a lot in the years between the middle of the 18th century and the middle of the 19th.  We developed machine manufacturing, made better use of steam power and coal, and increased worker productivity enormously.  We were able to refine basic materials, like iron, better and on a larger scale, and transportation systems became faster, more efficient, and above all capable of handling a far greater load.  All these developments together led to a slow revolution, a time when factories started working together and becoming each other's clients on a massive scale.  After those roughly 100 years, craftsmen had been replaced by factories that assembled products from components bought from other factories, which made them from basic materials bought from still others, and so on.

The great developments of the day did not happen in isolation, of course.  They played off each other, but it was only because they were all present and making great strides that the industrial revolution went so fast and was so successful.  Each of them also seems to have an equivalent in the software world.

Let's start off with the basics.  The first practical steam engine was built around 1712.  By 1783, James Watt had improved its power output and reliability, and had managed to turn that power into rotary motion.  Power to drive machinery became cheap and easily available.  Coal became ubiquitous, allowing greater energy output for less work.  In later years, electricity came on the scene, which was far more flexible and, crucially, pay-per-use.  For years, we've seen the same developments in software.  Computers have become more powerful and cheaper.  They've also become more easily available throughout the world, to the point where a large percentage of the population now carries a computer in their pocket.  We see the pay-per-use factor return in every cloud offering, and computing power of any magnitude has become available to everybody through cloud providers.

The increased use of coal caused major breakthroughs in metallurgy.  Higher temperatures and cleaner burning techniques improved the quality of available copper, iron and steel.  To a software engineer, code is iron.  A line of code is the basic element of our entire industry.  Stronger computers and more advanced ways of using their power have allowed that basic element to improve in quality as well.  We run garbage collectors, abstraction layers and emulation systems, all with the purpose of making our lines of code as clean and as high-quality as possible.  Programming languages become more advanced by the day.  New programming paradigms and models tend to be less efficient power-wise, but better for cleanliness and extensibility.  We've gone from writing assembly code to Python, from COBOL to .NET.

Transportation of data has steadily become faster.  10-gigabit Ethernet is no longer an exception, 4G networks are being rolled out across the globe, and even structures as simple as a message queue are now able to handle enormous quantities of data.  Meanwhile, basic connectivity has become far more reliable, too.  Our faster networks enable new business models, even ones that would have been completely alien to us before.  Software-as-a-Service lives by this.  Nobody on dial-up would ever have considered uninstalling their local word processor for something like Google Docs.  Nowadays, it's not just common; for a lot of software it's the preferred model.  Even local e-mail clients are in rapid decline.

Now what?

So far, these are all developments we've got down pat.  The basic groundwork for an industrial revolution has been laid, but we're not there yet.  How do we increase worker efficiency a thousand-fold?  How do we automate installation and configuration procedures to the same degree as an automated factory?  Are we far from those goals, or are we perhaps closer than we think?  I hope I'll be able to shed some light in part 3, so stick around.

The Industrial Revolution, part 1

What is the right verb for what a developer does?  Does he 'create' software?  'Build'?  'Engineer'?  In recent years, the word 'craft' has become more fashionable.  We are all 'software craftsmen', and the craftsmanship movement has gained enormous momentum.  It is definitely an apt description: we focus on quality, creativity, and mastery of the skill of programming.  Like the craftsmen of the past, every line is bespoke, every function something toiled over.

The desk my computer sits on was not 'crafted'.  Neither was the chair I sit in, or the computer itself.  Not even complex machinery, like my car, was crafted; it was manufactured and assembled.  Somewhere along the line from idea to my driveway came engineers and designers, but they would be hard-pressed to describe themselves as craftsmen.  None of them worried about how to make the screws that hold it together, and yet it will work for years under changing conditions and constant use, with only minor maintenance.  Why is that so hard for software?

"We are in the 'pre-industrial' era of software development"

History, seen through a vague enough lens, is fractal in nature.  Parts of it repeat the whole, on shorter time scales.  Until about 250 years ago, assembly lines and manufacturing processes hadn't been invented yet.  The world was filled with craftsmen.  Artisans, experts at their jobs, knew everything there was to know about their subject.  They crafted products of amazing quality and beauty, some of which last to this day.  What they couldn't do was handle complexity.  They were unable to produce complex machinery reliably.  Does that ring a bell?

All of that changed with the Industrial Revolution.  Suddenly, all kinds of products flooded the market.  They were cheap and reliable, and as time went on they became ever more intricate and well-designed.  Economies of scale came into play, and it suddenly became possible to afford conveniences nobody could before.  The complexity of available products shot up, but economies of scale and advancing knowledge meant they could be cheap and reliable on a level no craftsman could even imagine.  Workers became able to build those complicated products, even without much schooling or in-depth knowledge of the underlying physics.  The craftsmen of before morphed into engineers and designers.  They did not build anything beyond prototypes themselves; they laid out plans and defined techniques.  While the current generation of products was being produced and sold, they invented the next.  Workshops with a master and his apprentices ceased to exist, replaced by research-and-development and production divisions.

Software is on the verge of its own industrial revolution.  Several of the important steps have already been taken; others are being worked on.  In part 2, I will try to examine whether a few of the most important factors in the Industrial Revolution are at play in the software industry, and what the missing factors might look like.

(read part 2 here!)