The problem with solutions
METASKILLS Chapter 24
A small country wants affordable energy for its growing population. Since it has an abiding concern for the quality of the environment and the health of its people, it comes to believe that the answer lies not in dirty coal power but in clean, renewable nuclear energy. It receives a great deal of encouragement, expertise, and financial help from a larger nation to make this initiative possible. Soon, fresh new power plants spring up around the countryside, and the environment improves. Yet the energy generated by the plants turns out to be anything but clean. High-level radioactive waste must be isolated for up to ten thousand years before it’s safe, and spent fuel must be stored in cooling pools where it’s vulnerable to accidents. Eventually, a freak wave destroys a number of power plants, contaminating the environment and damaging the very health the country was trying to protect.
A young woman dreams of becoming a lawyer so she can devote herself to social issues. Wanting the best possible education, she enrolls in a well-respected college. She’s not eligible for scholarships and takes out $150,000 in loans to cover the cost of tuition, books, housing, food, and transportation for six years of schooling. She graduates at the top of her class. However, due to a difficult job market, she’s unable to find a position that pays enough to cover her loan payments. She goes to work for a firm that defends socially irresponsible companies from class-action suits, thereby betraying her own dreams.
A new CEO is hired to turn around a public company with eroding profit margins. He goes to work trimming any costs that are not likely to lead to immediate revenues. He offers “early out” retirement packages to highly paid managers and lays off a large percentage of employees who aren’t involved in sales. He then divides the company into separate businesses, giving each manager the autonomy to run his or her business in a manner that increases revenues. Profits improve. He soon forges a strong bond with analysts, who begin to trust his quarterly earnings guidance. After a few years of steady financial gains, however, the company finds that its brand is no longer coherent, and the products in its pipeline are less than exciting. Earnings decline. Shareholders become nervous, so the directors find a way to remove him. The new CEO inherits a company that’s worse off than before, and he’s unable to fix the deepening systemic problems.
These are all true stories of how solutions can turn into problems. With complex systems such as companies, governments, and markets, the answers aren’t always obvious. The difficulty is that they can seem obvious. Even when decision makers find the right levers, they often pull them in the wrong direction. A driver whose car skids on an icy road is more likely to turn the wheel the wrong way than the right way, simply because the right way is counterintuitive. A CEO whose company suffers from sagging profits is likely to focus on cost-cutting instead of innovation, simply because the rewards are more direct and immediate.
With complex systems the answers aren’t obvious—the difficulty is that they can seem obvious.
There’s a Sufi parable that goes like this: You think that because you understand one you must also understand two, since one and one make two; but you forget that you must also understand and. When we encounter a system that’s complex enough to create multiple interactions, we need to beware of traps. Truly complex systems are not only riddled with traps, but can also be reactive, meaning that they fight back when we try to fix them. Thus was born the concept of wicked problems. The Israeli-Palestinian conflict is a wicked problem, as is the global economic crisis. Every solution seems to make the problem worse.
If we see that pulling a lever a short distance gives us a desirable response, we might think that pulling it twice as far will produce double the response. It certainly might, but if the system is complex enough, it might produce a much smaller response, or the response times ten, or an opposite response. For example, redesigning a supermarket package with a little more yellow might boost sales, but doubling the amount of yellow could kill sales. Nonlinear problems like packaging graphics can utterly confound the relationship between cause and effect.
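The lever-and-response idea can be sketched numerically. In this toy model (the function and its coefficients are invented purely for illustration, not taken from any real packaging study), the system’s response peaks at a moderate input, so doubling a successful tweak turns a gain into a loss:

```python
# Toy model of a nonlinear response to a design change.
# NOTE: sales_lift and its coefficients are hypothetical, chosen only to
# illustrate a response curve that rises, peaks, and then turns negative.

def sales_lift(extra_yellow: float) -> float:
    """Hypothetical sales lift (%) for a given amount of added yellow (0.0-1.0)."""
    # Inverted parabola: helps at first, hurts past the peak at 0.08.
    return 40 * extra_yellow - 250 * extra_yellow ** 2

print(f"+0.1 yellow: {sales_lift(0.1):+.1f}% sales")  # a small boost
print(f"+0.2 yellow: {sales_lift(0.2):+.1f}% sales")  # double the input: sales fall
```

Pulling the lever twice as far here doesn’t double the response; it reverses it, which is exactly why linear extrapolation fails on nonlinear systems.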
When complex systems meet simple measurement, the results can be perverse. In the 1930s we began measuring our welfare by the goods and services we produce each year. It wasn’t long before productivity became the Holy Grail for the entire society, replacing the previous goal of happiness with one that’s more easily measured. This is the cultural equivalent of the drunk who forgets his car keys in the bar, but searches for them under the street lamp because the light is better.
Economist Thorstein Veblen coined the term “conspicuous consumption” back in 1899; by the 1950s, retailing analyst Victor Lebow was complaining that we’d already begun to ritualize the purchase of goods in search of spiritual satisfaction: “We need things consumed, burned up, replaced, and discarded at an ever-accelerating rate.” The result has not been happiness, nor spirituality, nor economic health, but a national shopping jones that’s turning our birthright into a landfill. Too harsh? We’ll see.
Narrow measurement has also dogged educational reform, often producing exactly the opposite of the effect we wanted. When we measured our progress in dollars spent per student, we got an increase in dollars spent per student. When we measured performance on standardized tests, we got improved performance on standardized tests. Meanwhile, the quality of education continued to sink. Football coach Vince Lombardi famously said, “If winning isn’t everything, why do they keep score?” It’s axiomatic in sports, education, business, and government that whatever gets measured gets better. So the lesson is this: Be careful what you measure; the scoreboard can easily become the game.
Is there a way to ensure that our proposed solutions are actually solutions? Probably not. But when you understand that complex systems don’t behave in linear ways, you can often rule out linear solutions and measurements that produce nasty surprises. There’s no such thing as a foolproof system. Anyone who says otherwise is underestimating the ingenuity of fools. All you can do is adopt a humble attitude and look at the challenge from a number of perspectives. Here are some questions to ask before tinkering with a system:
What will happen if I do nothing?
What might be improved?
What might be diminished?
What will be replaced?
Will it expand future options?
What are the ethical considerations?
Will it simplify or complicate the system?
Are my basic assumptions correct?
What has to be true to make this possible?
Are events likely to unfold this way?
If so, will the system really react this way?
What are the factors behind the events?
What are the long-term costs and benefits?
We shouldn’t become too discouraged if at first we don’t succeed. It took nature 13 billion years to create the systems around us, and they still don’t always work perfectly.