
Wrong brothers, a wind tunnel and the cockpit

Saqiful Alam | Tuesday, 24 December 2013


We have all heard of Orville and Wilbur Wright, the brothers who built and flew the first airplane. But no one has ever heard of the Wrong brothers. As it happens, they too had the idea of building a flying machine. Their plan was to fly people commercially over long distances and earn revenue in return. Since they were planning for long journeys, letting passengers empty their bladders was one of the top priorities in the design. So the final design looked very much like a modern airliner's body, complete with seating arrangements and lavatories. But, as you might have guessed by now, a flying machine designed in those times with such facilities, and without the power of jet engines, was doomed: the Wrong brothers never managed to get their plane to fly.
Now, to tell you the truth, the Wrong brothers never existed. They are a couple of fictional characters I borrowed from the book The Flaw of Averages by Sam L. Savage to prove a point. Unlike the Wrong brothers, the Wright brothers spent a large portion of their time experimenting, much of it in a homemade wind tunnel, with models that do not even remotely resemble a modern aircraft, and yet theirs was the first machine that actually flew. A good example is the moment when Wilbur was first struck by the realisation of how an airplane's wings could be warped to control it, while idly twisting a long, thin bicycle inner-tube box. And this highlights the most important insight into building any model to manage risk - "the most important models are like embryos that may not resemble the final product but that nonetheless contain the developmental necessities of the application" (which, incidentally, abbreviates to DNA)! The same insight is captured in George Box's famous remark - "All models are wrong, but some are useful".
In this article, I will discuss a few insights that help develop the right mindset for building models to handle risk and uncertainty. One such insight - that simple models are often the most effective - has already been illustrated. Let us look into another one.
One of the purest and most unaided forms of flight is gliding. A small aircraft without any engine is towed up by a tow plane and then released in the air. The pilot rides air currents to stay aloft. A glider's cockpit is simple, showing only altitude, pressure and speed. The tricky thing with gliders is that if you raise the nose too high, you slow down and stall. On the other hand, if you dive too sharply, the speed builds up alarmingly and the glider's wings may tear off! Thrilling, yes, but hardly pure fun! So an experienced glider pilot never really flies by the instruments. The airspeed indicator registers a loss of speed only after you have already pitched up; to regain speed you point the nose down, which then sends the speed shooting up again. Chasing the instruments through these climbs and dives creates what flyers call Pilot Induced Oscillation, which is bad for both the pilot and the aircraft. The best strategy is to pick an imaginary line on the windshield that keeps the aircraft as level as possible, and then use the controls to stick to that line. But then, what happens if you fly into a cloud? How would you know whether the aircraft is level? That is when an experienced pilot turns to the instruments.
But why am I playing the role of an amateur flight instructor? Because I believe managing risk is very similar to flying a glider. The instruments in the cockpit are the many complex tools that we have at our disposal to handle risk. They have been designed to process complex data, but managing risk by relying on them alone leads to an equally bad experience. An experienced risk manager, just like an experienced pilot, reads the signals of everyday activity 'out there' and uses them to calibrate the risk management models, turning to those models in times of uncertainty. And this is another important insight - calibrating the models first against real-world outcomes and intuition, and only then using them to make predictions, is more effective than relying on what the model alone says.
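To make the calibration idea concrete, here is a minimal Python sketch with purely invented numbers (the loss figures and the lognormal assumption are illustrative, not drawn from any real portfolio): a simple loss model is first fitted to observed outcomes, only then asked for a prediction, and finally checked back against raw experience.

import numpy as np

rng = np.random.default_rng(42)

# Pretend these are five years of observed monthly losses - the 'out there'
# outcomes an experienced risk manager watches. The numbers are made up.
observed_losses = rng.lognormal(mean=2.0, sigma=0.8, size=60)

# Step 1: calibrate a simple lognormal loss model to the observed data.
mu_hat = np.mean(np.log(observed_losses))
sigma_hat = np.std(np.log(observed_losses))

# Step 2: only now use the calibrated model to make a prediction - here,
# the loss level exceeded in roughly the worst 5 per cent of months.
simulated = rng.lognormal(mean=mu_hat, sigma=sigma_hat, size=100_000)
predicted_tail = np.percentile(simulated, 95)

# Step 3: sanity-check the prediction against raw experience; if the two
# disagree badly, recalibrate before trusting the model in the clouds.
empirical_tail = np.percentile(observed_losses, 95)
print(f"Model's 95th-percentile loss : {predicted_tail:.1f}")
print(f"Observed 95th-percentile loss: {empirical_tail:.1f}")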
This brings me to the last set of insights that I would like to share in this article. One of my favourite quotes, from the business consultant J.P. Brashear, is: "A successful model tells you things you didn't tell it to tell you." We spend a great deal of time and money developing models that tell us what we want them to tell us. But that does not help illuminate risk. It is better to have a model that tells us something we did not expect in the first place, so that we can prepare for that outcome. A similar insight into modelling risk and uncertainty was offered by a computer scientist at Stanford, who said that the five stages of model development are -
1. Decide what to model
2. Decide how to build the model
3. Build the model
4. Debug the model
5. Trash the model and start again, now that you know what you wanted in the first place.
Once we realise that the last stage is inevitable, we are more likely to discard a faulty model and start over than to spend time and money patching the faulty one up.
This article, then, sums up the mindset we need when modelling risk and uncertainty. In present-day business, handling risk and uncertainty efficiently is a pressing matter. Tools and models are available, but their effectiveness ultimately depends on the decision maker, and this is where the insights discussed here come in handy. And sometimes a simple understanding of an issue helps us handle risk and uncertainty better than a complex approach, as stressed by my favourite quote from the author who convinced me to make the study of risk management my passion - "It's dumb to be too smart".
The writer is a faculty member of North South University (NSU)
Email: [email protected]