SCALE THE PLANET

Thoughts and musings on scalability, messaging, Windows Azure, and programming in general


The Industrial Revolution, part 3

(read part 2 here!)

"Don't do it yourself"

The primary way worker efficiency can be improved is an old principle: "Don't do it yourself".  If you want a particular worker to produce tables twice as fast, stop making him saw his own wood and supply wood pieces in the correct sizes instead.  The worker can now specialize in composing tables from those precut parts, while another specializes in turning raw materials into correctly sized pieces.  The next step after that is buying the wood from someone else.

In the 19th century, interchangeable parts became the name of the game.  It was no longer common for a worker or even a factory to produce everything they needed themselves from raw materials.  They could go out and purchase the parts from any other manufacturer.  The godfather of this evolution was Henry Maudslay.  Maudslay is credited with inventing the first bench micrometer, a device capable of measuring anything his factories produced to an accuracy of 3 micrometers, or 0.003 millimetres.  It gave us compatible threads and diameters on screws, nuts and bolts, and quickly became one of the biggest drivers of standardization.  The end result was that factories no longer needed to produce custom parts for their products.  They could buy parts with standard measurements, or have things produced for them by supplying the exact measurements, and trust that whatever they bought would work in their own products as long as it conformed to those measurements.

Why a 19th century inventor matters to software

Software hasn't reached this point yet, but we are on the right path.  Open standards have always been important, but in recent years more and more software companies have jumped ship from their proprietary protocols and formats to open industry-wide standards.  This goes for all levels, from how you send data through a copper wire to queueing protocols.  As a direct consequence, loose coupling, that holy grail of robust software, is made easier.  Service Oriented Architecture, Component-Based Software Engineering, Microservices, the Actor model, Communicating Sequential Processes: it doesn't matter what you call it.  The principle is always the same.  Build an application out of independent parts that communicate, and it will be more robust, easier to scale, and easier to maintain and improve.  Working with open standards now makes this easier, and allows us to build components in entirely different languages and paradigms, yet still compose them into functional software.
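
To make that composition principle a little more tangible, here is a minimal sketch in Python (a language chosen purely for illustration; every name in it is made up): two independent components that never call each other directly, only a shared message channel, in the spirit of Communicating Sequential Processes.

    import queue
    import threading

    # A plain in-process queue stands in for whatever transport an open standard
    # would define (AMQP, HTTP, a service bus...).  The two components below only
    # agree on the message format, never on each other's internals.
    orders = queue.Queue()

    def order_producer(channel):
        # Publishes intent; knows nothing about who consumes it.
        for product in ("table", "chair", "desk"):
            channel.put({"type": "order.created", "product": product})
        channel.put(None)  # sentinel: no more work

    def order_consumer(channel):
        # Handles messages; knows nothing about who produced them.
        while (message := channel.get()) is not None:
            print(f"handling {message['type']} for {message['product']}")

    producer = threading.Thread(target=order_producer, args=(orders,))
    consumer = threading.Thread(target=order_consumer, args=(orders,))
    producer.start(); consumer.start()
    producer.join(); consumer.join()

Either side can be rewritten in another language or moved to another machine, as long as both keep speaking the agreed message format.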

The next step is to make these components so standardized that they can be switched out at will.  If I can switch out components of my application at will, I can also build software that way.  In theory, I won't ever need to know how the code is written, or how the protocols function.  I could take a bunch of components, build my flow from those, and be done.  Software would go from a craft to a commodity.  This goes for distributing it to clients as well.  Everybody in corporate IT dreams of an install procedure that consists of exactly one step: clicking 'done'.  If protocols are standardized enough, that's all that will ever be required.
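
As a sketch of what "switching out at will" looks like in code (Python again, with all the names invented for this example), the only thing the application depends on is a contract; which concrete component fulfils it is irrelevant.

    from typing import Protocol

    class Storage(Protocol):
        # The "standard measurement" every supplier of storage must conform to.
        def save(self, key: str, data: bytes) -> None: ...
        def load(self, key: str) -> bytes: ...

    class InMemoryStorage:
        def __init__(self) -> None:
            self._items: dict = {}
        def save(self, key: str, data: bytes) -> None:
            self._items[key] = data
        def load(self, key: str) -> bytes:
            return self._items[key]

    class DiskStorage:
        def save(self, key: str, data: bytes) -> None:
            with open(key, "wb") as f:
                f.write(data)
        def load(self, key: str) -> bytes:
            with open(key, "rb") as f:
                return f.read()

    def archive_document(store: Storage, name: str, contents: bytes) -> None:
        # Application code depends only on the contract, not on a concrete supplier.
        store.save(name, contents)

    archive_document(InMemoryStorage(), "report", b"...")  # swap in DiskStorage() at will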

Standards, standards, standards

All this means having a default way of communicating, so commands and queries can be issued without worrying about how they will get to their destination.  That's the easy part.  The hard part is how you communicate your intent to a component.  There are no standard commands to send to a service that stores something, or that sends an invoice.  We have one for e-mail in SMTP, so why not for accounting software?  What is the standard protocol to address a social network?
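
Purely as a thought experiment, such a standard could define a well-known envelope for intent, the way SMTP defines one for mail.  Everything below, from the field names to the "invoice.send" verb, is made up to illustrate the idea; no such protocol exists today.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from typing import Optional

    @dataclass
    class Command:
        # A hypothetical standard envelope: any conforming service would accept
        # this shape, no matter who wrote it or what language it runs in.
        verb: str                    # the intent, e.g. "invoice.send" or "document.store"
        payload: dict                # verb-specific data
        reply_to: Optional[str] = None
        issued_at: str = field(
            default_factory=lambda: datetime.now(timezone.utc).isoformat()
        )

    send_invoice = Command(
        verb="invoice.send",
        payload={"customer": "ACME", "amount": "120.00", "currency": "EUR"},
        reply_to="queue://billing/replies",
    )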

What we are missing is Mr. Maudslay.  There is no software equivalent to the micrometer.  There is no way to define a standard and then run a battery of checks to see whether a component will behave correctly in all situations.  Unit testing and TDD help us on this path, but they aren't far-reaching enough.  The goal to shoot for should be the ability to write any component and validate it comprehensively against the intended standard protocol, without writing any tests yourself.  It also requires a standard protocol for providing standard protocols (Yo dawg...), so someone in need of a custom component can provide the exact specifications it should meet, in the same way a factory provides a set of measurements to its suppliers.
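
No such tool exists as far as I know, but to show the direction, here is a deliberately tiny, hypothetical sketch of such a "micrometer": the standard is expressed as a battery of machine-checkable properties, and any candidate component is measured against all of them without its author writing a single test.  Every name here is invented.

    import traceback

    # The "spec" is a battery of properties any conforming storage component must
    # satisfy.  A real standard would need far richer checks (errors, concurrency,
    # wire formats...), but the shape would be the same.
    def check_round_trip(store):
        store.save("key", b"value")
        assert store.load("key") == b"value", "load must return what save stored"

    def check_overwrite(store):
        store.save("key", b"old")
        store.save("key", b"new")
        assert store.load("key") == b"new", "a later save must overwrite an earlier one"

    STORAGE_SPEC = [check_round_trip, check_overwrite]

    def measure(component_factory, spec):
        # Run a candidate component against every property in the spec and
        # report what failed; an empty list means it conforms.
        failures = []
        for check in spec:
            try:
                check(component_factory())  # fresh instance per property
            except Exception:
                failures.append((check.__name__, traceback.format_exc(limit=1)))
        return failures

    # e.g. measure(InMemoryStorage, STORAGE_SPEC) against the earlier sketch returns [].
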
I don't know if it is even possible to build such a micrometer at the scale of the whole industry.  Theoretically it must be, but it requires wide adoption of a common system.  That kind of widespread adoption has happened in the past, but it is never easy.  My magic eight-ball is clear: "Ask again later."

The Industrial Revolution, part 1

What is the general verb to use when a developer does his job?  Does he 'create' software? 'build' it? 'engineer' it?  In recent years, the word 'craft' has become more fashionable.  We are all 'software craftsmen', and the craftsmanship movement has gained enormous momentum.  It is definitely an apt description: we focus on quality, creativity and mastery of the skill of programming.  Like the craftsmen of the past, we treat every line as bespoke, every function as something to toil over.

The desk my computer is on was not 'crafted'.  Neither was the chair I sit in, or the computer itself.  Not even complex machinery, like my car, was crafted.  It was manufactured and assembled.  Somewhere along the line from idea to my driveway came engineers and designers, but they would be hard-pressed to describe themselves as craftsmen.  None of them worried about how to build the screws that hold it together, and yet it will work for years under changing conditions and constant use, with only minor maintenance.  Why is that so hard for software?

"We are in the "pre-industrial" era of software development"

History, when seen through a vague enough lens, is fractal in nature.  Parts of it repeat the whole, on shorter time scales.  Until about 250 years ago, assembly lines and manufacturing processes hadn't been invented yet.  The world was filled with craftsmen.  Artisans, experts at their job, knew everything there was to know about their subject.  They crafted products of amazing quality and beauty, some of which last to this day.  What they couldn't do was handle complexity.  They were unable to produce complex machinery reliably.  Does that ring a bell?

All of that changed with the Industrial Revolution.  Suddenly, all kinds of products flooded the market.  They were cheap and reliable, and the longer time went on, the more intricate and well-designed they became.  Economies of scale came into play, and it was suddenly possible to afford conveniences nobody could before.  The complexity of available products shot up, but economies of scale and advancing knowledge meant they could be cheap and reliable on a level no craftsman could even imagine.  Workers became able to build those complicated products, even without much schooling or in-depth knowledge of the underlying physics.  The craftsmen of before morphed into engineers and designers.  They did not build anything beyond prototypes themselves; they laid out plans and defined techniques.  While the current generation of products is being produced and put to work, they invent the next.  Workshops with a master and his apprentices ceased to exist, replaced by research and development, and production divisions.

Software is on the verge of its industrial revolution.  Several of the important steps have already been taken; others are being worked on.  In part 2, I will try to examine whether a few of the most important factors in the Industrial Revolution are at play in the software industry, and what the missing factors might look like.

(read part 2 here!)