I’ve always defended simplicity. Occam’s razor, formulated in the 14th century, gives us the lex parsimoniae, the law of parsimony: simplicity, brevity, succinctness. When different explanations are available, ceteris paribus (“other things being equal”, in plain English), the simplest wins. Thus, the simpler the better.
You’ll probably also remember Einstein’s quote: “make it as simple as possible, but no simpler”. Why should we want complicated explanations when simpler ones are available? We shouldn’t, unless we were missing something.
What does that have to do with management? Well, there’s a connection. A fairly strong one if you use systemic thinking. Remember the holistic approach I was proposing? Local maxima don’t add up to a global maximum; optimising the parts doesn’t mean optimising the system. That means not only that adding simple strategies in an organisation’s parts doesn’t add up to a simple global strategy, but also that the simplest strategy may not be the most suitable one.
So, should we aim for the simplest solution? Well, that depends… it may just be too simple. There’s a point of balance between simplicity and effectiveness.
W. Ross Ashby, psychiatrist and one of the founding fathers of cybernetics
That takes us to Ashby’s law:
«The larger the variety of actions available to a control system, the larger the variety of perturbations it is able to compensate.»
Ashby’s law is also known as the Law of Requisite Variety. It’s a funny concept, because we intuitively know that sometimes we need very complicated solutions, but the idea of simplicity sells a lot better. So it’s not too fashionable to search for complex solutions; it looks as if you are obliged to come up with something simple.
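The law can be made concrete with a toy simulation (my own hypothetical illustration, not from Ashby): an environment produces several kinds of perturbation, and a controller can only compensate the kinds it has a matching response for. A controller with fewer responses than the environment has perturbation types is guaranteed to miss some of them.

```python
import random

def compensated(disturbances, responses):
    """Count how many disturbances the controller can neutralise.

    In this toy model, a disturbance is compensated only if the
    controller holds a matching response in its repertoire:
    'only variety can absorb variety'.
    """
    return sum(1 for d in disturbances if d in responses)

random.seed(1)

# The environment throws 8 distinct kinds of perturbation at us...
disturbance_types = range(8)
disturbances = [random.choice(disturbance_types) for _ in range(1000)]

# ...but the 'simple' controller only knows 3 responses,
# while a controller with requisite variety knows all 8.
simple_controller = set(range(3))
requisite_controller = set(range(8))

# The simple controller misses every disturbance it has no answer for;
# the requisite one compensates all of them.
print(compensated(disturbances, simple_controller) / len(disturbances))
print(compensated(disturbances, requisite_controller) / len(disturbances))
```

No amount of tuning the three responses helps: the shortfall is structural, which is exactly the point of the railway example that follows.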
Guess what? Reality is stubborn. And while you try to implement simplicity, complexity will creep back in again and again. This basic law of life will stab your project.
One example: a railway level crossing. Engineers design the road signals efficiently. Other engineers design the traffic lights, running several tests and taking the different timings into account. Both teams work with my-box thinking. Efficient. Then problems appear: cars get very close to the trains. The system is reviewed and the timings are measured again. Again, nothing seems wrong. Until 1995:
October 25th 1995, Fox River Grove, Illinois. One of the worst level-crossing accidents in the history of the US. A train collides with a bus carrying high-school students. Seven teenagers are killed. No human error. All the systems working fine. The warning lights activated on time. There was simply an unexpected, very improbable combination of events that led to the disaster. $27.3 million was paid to the victims’ relatives, but that brought no one back.
The control system was too simple. It couldn’t match all the variety the environment could produce. It simply had some established timings and a large safety margin above them. But statistics and margins are not enough: you need to understand a system in order to control it. That’s why there are still humans controlling machines.
There’s more to systems than meets the eye. The typical examples come from engineering: from the first Tacoma Narrows bridge in Washington State, also known as Galloping Gertie, which collapsed on November 7th 1940, to the £18.2m Millennium Bridge in London, which opened in 2000, was closed days later because of swaying and extreme wobble, and remained closed until 2002 after an additional £5m in modifications. Bridges are good examples of systems doing things they were not meant to do.
But don’t read this as an engineering story. All of it also applies to organisations. The manager must be able to counteract disturbances, and she is always outnumbered. Only organisations with enough variety and diversity will adapt to changing realities, and thrive in them. Only then are they prepared for foreseen, or unforeseen, contingencies. Those that are too homogeneous will crash when the wrong wind blows, unable to adapt.