History of the automation industry
Automation is a major factor in the modern world and shows no signs of becoming less important in the near future. When a task becomes cheap enough to automate relative to employing people to do it, it increasingly gets automated. The potential upside of automation for the public is that it can free people from mundane or menial tasks requiring minimal creativity and training, or, as some PR campaigns put it, free humans from dangerous work better suited to robots. For many organizations, however, automation is primarily a matter of economics. Of course, when the public does benefit from automation, that benefit will feature prominently in PR materials even if it is not the driving factor. With the right political structures in place, automation could become a major force for improving people's livelihoods. Either way, a better understanding of the factors involved in automation projects would help both the public and the organization at hand get the most out of automation.
With the 2020 pandemic making physical proximity between humans hazardous, automation has once again become a mainstream topic of discussion. One recent shift in public opinion has been around automating jobs such as checkout clerks. The technology for this has existed for quite some time, yet economic and political concerns have kept people directly employed in these roles. Replacing a human with a machine involves a brutal calculation that many in industry make: weighing the costs to themselves and their political allies against the financial benefits of having the task automated. From the perspective of the general public there are different political repercussions to replacing humans with machines, especially in a society that assumes a job is required for survival. Because automation directly shifts the balance of power between people, it is an inherently political topic, more so than most other topics in technology.
Many of these economic and political concerns about reducing customer-facing staff seem to have evaporated as fears of virus transmission have taken hold. This may result in a wave of jobs being automated away, because the cost of having people do those jobs has temporarily increased while the political opposition to reducing employment via automation takes a pandemic-induced holiday.
All this indicates that automation is a multifaceted topic that crosses the boundaries between economics, politics, technology and management. Any automation project that actually delivers net positive value will need to take each of these aspects into account. Despite this, automation is a topic most people only engage with reactively: far fewer people are intrinsically interested in automation itself.
In 2001 Repenning and Sterman wrote the fantastically named article "Nobody Ever Gets Credit for Fixing Problems that Never Happened: Creating and Sustaining Process Improvement". Here's a great quote from this paper:
> Thus, today’s managers face a paradox. On the one hand, the number of tools and techniques available to improve performance is growing rapidly. Further, with advances in information technology and the ever-growing legions of management consultants, it is easier than ever to learn about these techniques and to learn who else is using them. On the other hand, there has been little improvement in the ability of organizations to incorporate these innovations in their everyday activities. The ability to identify and learn about new improvement methods no longer presents a significant barrier to most managers. Instead, successfully implementing these innovations presents the biggest challenge. Put more simply, you can’t buy a turnkey six-sigma quality program. It must be developed from within.
Close to 20 years after that publication, automation (just like process improvement) remains difficult because of the fundamental inability to buy turnkey solutions once requirements get complex. Perhaps this is no surprise: a complex topic that requires careful consideration of technology, politics and economics to be successful will be hard to buy off the shelf. Despite this, a number of offerings are marketed as silver bullets.
Why did automation take so long to go mainstream?
A large part of what got me involved in the tech world was the prospect of realizing real productivity growth. The idea of automating mundane, dangerous or boring work was highly appealing and remains one of my main motivations for staying in the tech industry. Other people seem to be seeing some of those benefits too, as one of the questions I hear with increasing frequency is "why did it take so long for automation/RPA to go mainstream?". Since computational machinery has been about enabling automation from the industry's earliest days, looking at the history of the field is instructive.
Any discussion of computing machinery and automation would be incomplete without the concept of cost gravity. Cost gravity roughly says that, barring artificial interference, the cost of producing output for which humans have a consistently strong demand will drop over time. This happens because people apply their intelligence to improving the things they want badly enough. Innovation and progress are thus in some sense reflected in falling production costs for any desired item whose supply is not artificially constrained by political measures. In recent times, process improvement and automation have been major drivers of cost reduction. The natural state of affairs for popular commodities is for the price to drop; just as one must counter gravity to hold an object up, something artificial is needed to hold the price of such an item up in the face of innovation, competition and free markets 1. Provided automation of a task is possible (often it is not), automation decisions tend to hinge on the interplay between two costs:
- How expensive is it to automate the task?
- How expensive is it for people to do the task?
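The trade-off above can be sketched as a simple break-even calculation. This is a minimal illustration with entirely hypothetical figures and a made-up function name, not a real cost model; actual automation decisions involve far richer inputs (risk, quality, politics).

```python
import math

def break_even_months(build_cost, monthly_maintenance, monthly_labor_cost):
    """Months until cumulative automation cost drops below cumulative labor cost.

    Returns None if automation never pays off (upkeep >= labor cost).
    """
    if monthly_maintenance >= monthly_labor_cost:
        return None
    monthly_saving = monthly_labor_cost - monthly_maintenance
    # Smallest whole number of months in which savings cover the build cost
    return math.ceil(build_cost / monthly_saving)

# Hypothetical numbers: a 60k build with 1k/month upkeep replacing 5k/month of labor
print(break_even_months(60_000, 1_000, 5_000))   # 15 months
print(break_even_months(60_000, 6_000, 5_000))   # None: people stay cheaper
```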
If getting people to do the task is significantly cheaper than automating it, automation projects tend not to get off the ground at all. These factors have driven automation, which has been the mainstay of computational and IT systems for the entire history of the computing industry. In the earliest days of computing, when Charles Babbage was building the Difference Engine and the Analytical Engine, his complex mechanical primitive computers, he was doing so to automate a particularly tedious task. Rather than looking at how these machines were made, I think it is instructive, even close to 200 years later, to look at why they were being made. The Admiralty wanted a way to automate various tedious calculations needed for naval operations 2. As such, they were willing to invest a large amount of capital in computational machinery 3 to reduce the cost of doing such computations by hand. The other part of this, of course, is that the increased computational power would have opened up directions that previously weren't possible at all. The Admiralty saw a cheaper way to get the results they wanted and the possibility of enabling new naval developments.
The first project was the Difference Engine, which had much in common with the once-ubiquitous desk calculator (which, incidentally, is getting rarer with the prevalence of a cheaper general purpose computer: the smartphone running a calculator app). But before such calculators existed, a "computer" meant a person who did computations. People sat and did calculations by hand to create giant tables of results that you could look up whenever your work required them:
This is a photo of a page from a book of logarithm values compiled by Henry Briggs in 1617. As you can see, creating all these entries by hand was a manual, laborious and painstaking process. We now have ready access to powerful computational devices that can do these calculations, but this was not always so, even if we take it completely for granted 400 years later.
Back in 1842 the major "customer", the British government, cared mostly about the cost reductions enabled by the machines' output, far more than about the development of the machines themselves. If you wanted to develop new designs for the Admiralty, waiting around for a lot of calculations to be done by hand would be expensive, not just in labor costs but in the delays imposed on your schedule. The development of powerful computational machinery and the enormous civilization-level impacts that computers would enable weren't valued, because those impacts were far less tangible back then. People in the 1800s simply could not have known how much general purpose computers would change the course of everything. Even if they had, it would have been hard to make a line entry in the project accounts for "civilization-level changes", though the value might have been recognized at a broader level given the military and political power such a development would have brought the British Empire. Babbage did seem to have a vision of the impact of a more general purpose computational machine and was entranced by the Analytical Engine project. Having glimpsed the power of general purpose computing, he became fixated on what would turn out to be one of the biggest paradigm shifts in human civilization. Along with Ada Lovelace's observation that computers could store symbolic data and not just numbers, they really were on to something big. Unfortunately, like many brilliant people, they invested their time in the work itself rather than in the politics and salesmanship needed to actually sell an automation project to the government.
(Picture from https://en.wikipedia.org/wiki/File:Babbage_Difference_Engine.jpg)
So why didn't the Analytical Engine get completed in the 1800s?
It seems this project was one of the first, if not the first, computing project to fail because the benefits were not tangible enough to those providing the funding. Being cutting edge also meant costs were impossible to estimate accurately ahead of time, a lesson that many people in the industry painfully relearn to this day.
The failure of the Analytical Engine project basically comes down to the difficulty of selling it given the very high cost of implementation at the time. The design itself was capable of automating a variety of important computations, but despite this, the cost of the build was higher than people were willing to bear. The cost of precision machining was vastly higher then than it is today, but the capacity did exist 4.
So why am I mentioning all this 1800s history? Because the main challenges of contemporary automation projects are mostly the same as they were back then, and the engineering portion remains only part of the picture. Many successful developments in automation from the distant past have been so successful that they are now forgotten and taken for granted. The small time window in which automation products are actually acknowledged is the topic of the next post in this series. Since the 1800s, however, people have learned a lot about how to manage and sell projects, and those developments have greatly shaped the automation industry, topics that will be covered in later posts.
Some goods exist where supply and demand don't affect price in intuitive ways. For example, Veblen goods and positional goods can see demand rise with certain price rises. The production costs of these goods may still fall with advances in production knowledge, but prices might not fall in response; greater efficiency tends simply to add margin. Automation in general is not a Veblen good, because lower prices tend to lead to greater demand. However, there is an important exception that any automation practitioner must be aware of: when departmental prestige is positively impacted by high capital expenditure, you can see automation projects where the more expensive option is chosen. ↩
Let's say you are the Admiralty; there are many situations where these values are useful, from creating tables that let you determine the distance of a ship from the shore by calculating the angle between two observers on the shore, through to a number of engineering properties useful in shipbuilding, to name just a couple of applications out of a vast number. The actual act of creating these tables is helped by having easy-to-calculate representations of these functions. Specifically, many trigonometric functions can be represented via polynomials, which then lets you calculate numerical values for them. An infinitely differentiable function can be approximated using a Taylor series, and when you take the first n terms of this series you get an approximation of the function known as a Taylor polynomial. If you then have a machine that can calculate the values of these polynomials quickly, you can populate whatever table of numbers you need much faster than if someone had to do the calculations by hand. ↩
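As a rough illustration of this footnote, here is a small Python sketch that tabulates sine values with a Taylor polynomial, the kind of table-building work the engines were meant to mechanize. The function name and the choice of 8 terms are arbitrary, picked for illustration only.

```python
import math

def taylor_sin(x, n_terms=8):
    # Taylor polynomial for sin(x) about 0: sum of (-1)^k * x^(2k+1) / (2k+1)!
    return sum((-1) ** k * x ** (2 * k + 1) / math.factorial(2 * k + 1)
               for k in range(n_terms))

# Tabulate values, much like the hand-computed tables of the era
for degrees in range(0, 91, 15):
    x = math.radians(degrees)
    print(f"{degrees:3d} deg  {taylor_sin(x):.6f}")
```

For angles up to 90 degrees, 8 terms already agree with the true sine to well beyond six decimal places, which is why polynomial approximations were so attractive for table-making.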
Recent efforts to build a working version of the Difference Engine have demonstrated that such a machine could have been made with the machining tolerances achievable in the 1800s. What is extremely different is the cost: cost gravity in manufacturing at large has resulted in far cheaper processes for making such a machine. Breakthroughs such as computer-aided machining and 3D printing have allowed people to make much of the engine's precision parts at a tiny fraction of the cost (and they have indeed done so, building working replicas from the original designs). Progress is fascinating: an old mechanical computer is more feasible to make today because of computer-aided machining, which itself exists only due to the advances in computational machinery started by work on those very mechanical computers. ↩
This post is part 1 of the "AutomationAndRPA" series.