Tuesday, March 15, 2016

BOOK SUMMARY 62 To Engineer is Human

Summary written by: Ingrid Urgolites
"What the engineers of the nineteenth century developed and passed down to those of the twentieth was the trial and error of the mind over matter. They learned how to calculate to obviate the failure of structural materials, but they did not learn how to calculate to obviate the failure of the mind."
- To Engineer is Human, page 62
In To Engineer is Human, an erudite exposition of engineering failures, Henry Petroski explores the reasons designs and structures fail. He incorporates examples of everyday objects and basic designs as well as large, complex design failures. He cites several literary passages that illuminate or obscure human failure and uses them to illustrate his ideas. Each case study reinforces his underlying theme: engineering is more than precise science; it is also human judgment, and humans have a proclivity for error. If we identify and analyze those mistakes, we can learn from them and improve our results.
As Petroski examines the nuances of engineering, he illustrates several ways that we make decisions, often for aesthetics, simplicity, or comfort, that ultimately compromise our enjoyment and quality of life or even cause death. Always, failure is a learning opportunity and the means for making progress. He describes engineering as an art, a working hypothesis that evolves as we continuously reevaluate mistakes. I used to study architecture, so I am a little partial to the paradigm, but I cannot think of a better way to analyze decisions in business and life.
Some formulas for success work under some conditions and not under others. Identifying the weaknesses that distinguish one theory from another, in order to find the best one, can be difficult. Using some of Petroski’s engineering examples from the book, I have outlined how to define your problem before taking action and two ways to identify a solution that may fail.

The Golden Egg
Define the “Puzzle” and the “Solution”
"After the fact there is a well-defined ‘puzzle’ to solve to show how clever one is. Before the fact one must not only define the design ‘puzzle’ but also verify one’s ‘solution’ by checking all possible ways in which it can fail."- To Engineer is Human, page 90
Armchair quarterbacking is much easier than playing the game. Political analysts are quick to catch the missteps of politicians. Letters to the editor often criticize “obvious” flaws. Catching and correcting mistakes after the fact is like solving a crossword puzzle: all you have to do is make the answers fit. We feel smart for seeing the right answer someone else missed. Yet often the person who made the mistake is a leader in the profession, not an incompetent fool, and the consequences of the error may still be devastating.
Analyzing mistakes is an excellent way to prevent future errors. The problem is that many blunders are possible but have not happened yet. Usually, strategies work when the future resembles the past, because plans often focus on preventing previous errors. When the future deviates from the past, plans may fail. Often this happens when we have innovative ideas. When we put our ideas into action, they also shape the future, which makes it even harder to define all the weaknesses. Read on to the Gems for some ways we can spot flaws.

Gem #1
Find Counterexamples
"While the initial structural requirement of bridging a space may be seen as a positive goal to be reached through induction, the success of a designer’s paper plan, which amounts to a hypothesis, can never be proven by deduction. The goal of the designer is rather to recognize counterexamples to a structurally inadequate hypothesis that he makes."- To Engineer is Human, page 104
A counterexample is an example that disproves a theory. Logical reasoning alone creates a plan that works on paper but may fall short when put into action. Look for examples where others in similar circumstances have tried the same approach and failed. Like a new design created by an engineer, a situation in life may not have an exact precedent. The key is to break the puzzle into pieces and look for familiar parts that have failed under similar circumstances.
We can trick ourselves into believing a plan will succeed because we have identified the differences between our scheme and the ones that have failed. The differences are often untested theories that look good on paper. They may even be a distraction from real problems. The pieces of the plan that resemble failures of the past may be the same things that will cause this plan to fail.
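As a programmer’s analogy (mine, not Petroski’s), here is a minimal Python sketch of the counterexample hunt: rather than trusting a paper-plan formula, we sample many plausible conditions and search for a single case where the plan approves something that reality rejects. The load rule, the cold-weather effect, and all the numbers are hypothetical, chosen only to illustrate the mindset.

import random

# Hypothetical "paper plan": a rule of thumb claiming a beam is safe
# whenever the applied load stays under 80% of its rated capacity.
def plan_says_safe(load, rated_capacity):
    return load <= 0.8 * rated_capacity

# Hypothetical "reality": capacity drops in severe cold, a condition
# the paper plan never considered.
def actually_safe(load, rated_capacity, temperature_c):
    effective_capacity = rated_capacity * (0.7 if temperature_c < -10 else 1.0)
    return load <= effective_capacity

# Counterexample hunt: sample many plausible conditions and look for one
# case where the plan approves a design that reality rejects.
def find_counterexample(trials=10_000, seed=1):
    rng = random.Random(seed)
    for _ in range(trials):
        capacity = rng.uniform(10, 100)   # rated capacity, arbitrary units
        load = rng.uniform(1, 100)        # applied load
        temp = rng.uniform(-30, 40)       # ambient temperature, deg C
        if plan_says_safe(load, capacity) and not actually_safe(load, capacity, temp):
            return {"load": load, "capacity": capacity, "temperature_c": temp}
    return None  # surviving the search is evidence, not proof, that the plan holds

print(find_counterexample())

Finding no counterexample after ten thousand trials does not prove the plan sound; it only means this particular search failed to break it, which is exactly Petroski’s point about design hypotheses that can be refuted but never proven.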

Gem #2
Computers Dilute Our Experience
"And as more complex structures are designed because it is believed that the computer can do what man cannot, then there is indeed an increased likelihood that structures will fail, for the further we stray from experience the less likely we are to think of all the right questions."- To Engineer is Human, page 200-201
Computers can store and process large amounts of information quickly. They are good at plans that are not innovative and serve a tried-and-true purpose, but there are limits to their efficacy. They are prone to error partly because their speed and efficiency foster overconfidence in users: instead of the data being double-checked, the computer simply accepts whatever input it is given. Computers also filter so thoroughly that a user may never see what is important. A computer may return the wrong answer because it cannot make judgment calls based on experience; it cannot identify the small piece of the puzzle that will fail. Only an experienced human mind recognizes nuances and weaknesses in the facts.
There are many everyday examples of how information from computers can mislead. Relying too much on spell-check can cause more errors than it corrects. An online search for a new restaurant delivers results based on how the user phrases the query and how closely the sites’ descriptions happen to match what the user wants. An exercise app might compare the calories burned doing push-ups and running, but it takes a knowledgeable user to distinguish the benefits of each exercise. Computers are only tools, not a replacement for human contemplation.
As I read this book, I thought of Shakespeare’s Macbeth. At the beginning, a pretentious Macbeth believes he is invincible and cannot lose his ambitious political conquest, but at the end there is a pivotal moment when Macbeth gives voice to his epic failure: “And all our yesterdays have lighted fools the way to dusty death. Out, out, brief candle! Life’s but a walking shadow, a poor player, that struts and frets his hour upon the stage, and then is heard no more.” In this poetic paradigm for failure, light illuminates the truth; blocking the light creates a shadow that hides the truth; and the shadow walks on while others follow it and meet their demise. The misconception masquerades as the truth until the show is over and reality is clear. Too often, we dismiss the tragedy and move on without understanding the fatal error.
Macbeth’s demise has a lot in common with engineering disasters. The plans seem sound, and the magnificent structure is designed to be invincible. Then some unforeseen flaw causes the trusted structure to fail, often with injury or death. Because we are human, our plans and our errors share common characteristics, whether the failure is a political miscalculation or a design problem. Flaws may present themselves as systematic failures, errors, accidents, misunderstandings, or miscommunications. It is not possible to obviate every flaw, but learning from the rich history of others’ trial and error is like standing on the shoulders of great thinkers. We can see farther and make better decisions in the brief time we have to make a significant contribution.

