System Surprise: Bounded Rationality and Electric Meters in Dutch Houses

Near Amsterdam, there is a suburb of single-family houses all built at the same time, all alike. Well, nearly alike. For unknown reasons it happened that some of the houses were built with the electric meter down in the basement. In other houses, the electric meter was installed in the front hall.

These were the sort of electric meters that have a glass bubble with a small horizontal metal wheel inside. As the household uses more electricity, the wheel turns faster and a dial adds up the accumulated kilowatt-hours.

During the oil embargo and energy crisis of the early 1970s, the Dutch began to pay close attention to their energy use. It was discovered that some of the houses in this subdivision used one-third less electricity than others. No one could explain this: all the houses were charged the same price for electricity, and all had similar features.

The difference, it turns out, was in the position of the electric meter. The families with high electricity use were the ones with the meter in the basement, where people rarely saw it. The ones with low use had the meter in the front hall where people passed, the little wheel turning around, adding up the monthly electricity bill many times a day.

Prompt: When have you seen changing a measurement, or merely surfacing one, alter the course of a project or product? What did you learn? Tell a story.

System Hierarchy: Two Watchmakers

There once were two watchmakers, named Hora and Tempus. Both of them made fine watches, and they both had many customers. People dropped into their stores, and their phones rang constantly with new orders. Over the years, however, Hora prospered, while Tempus became poorer and poorer. That’s because Hora discovered the principle of hierarchy…

The watches made by both Hora and Tempus consisted of about one thousand parts each. Tempus put his together in such a way that if he had one partly assembled and had to put it down--to answer the phone, say--it fell to pieces. When he came back to it, Tempus would have to start all over again. The more his customers phoned him, the harder it became for him to find enough uninterrupted time to finish a watch.

Hora’s watches were no less complex than those of Tempus, but he put together stable subassemblies of about ten elements each. Then he put ten of these subassemblies together into a larger assembly; and ten of those assemblies constituted the whole watch. Whenever Hora had to put down a partly completed watch to answer the phone, he lost only a small part of his work. So he made his watches much faster and more efficiently than did Tempus.

Complex systems can evolve from simple systems only if there are stable intermediate forms. The resulting complex forms will naturally be hierarchic. That may explain why hierarchies are so common in the systems nature presents to us. Among all possible complex forms, hierarchies are the only ones that have had the time to evolve.
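
Herbert Simon, whose parable this is, backed it with arithmetic. Below is a minimal sketch in Python; the 1% chance of a phone call after each part is placed is an illustrative assumption, not a figure from the story.

```python
# Probability of completing an assembly run without interruption,
# assuming a 1% chance of a phone call after each part is placed.
# The 1% figure is an illustrative assumption, not from the story.
p_interrupt = 0.01

# Tempus must place all 1,000 parts in a single uninterrupted run.
p_tempus = (1 - p_interrupt) ** 1000   # ~0.000043

# Hora only ever risks a 10-part subassembly at a time.
p_hora = (1 - p_interrupt) ** 10       # ~0.90

print(f"Tempus finishes a watch in one run: {p_tempus:.6f}")
print(f"Hora finishes a subassembly:        {p_hora:.2f}")
```

Tempus almost never gets a quiet stretch long enough to finish, while Hora loses at most ten parts’ worth of work to any one phone call.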

Prompt: When have you worked on a complex system with stable components? How about a complex system with unstable components? Tell a story.

System Surprise: Splitting an Orange

Two chefs, Alice and Bob, are in a kitchen. Their supplier messed up and shipped them only a single orange that day. Two customers were in the store, each ordering something different.

The two chefs reached for the single orange at the same time, stacking hands.

“Oh, I need an orange for my recipe,” said Alice.

“Well, I need an orange for my recipe,” said Bob.

In haste, they cut the orange in half and went to work. They served their dishes and their customers were disappointed.

Only when they looked out of the kitchen did they see their mistake. Alice’s customer had ordered orange juice, while Bob’s customer had ordered a dish where the orange peel was a garnish. Alice’s customer was unhappy to get a smaller-than-usual juice, and Bob’s customer found the peel garnish underwhelming.

Prompt: When have coarse requests resulted in an unhappy outcome for both parties in a negotiation? How have you found ways to suss out that Alice needs the fruit while Bob only needs the rind? Tell a story.

System Surprise: Cobras and Incentives

While under British rule, the Indian city of Delhi had a cobra problem: the colonial government became concerned about the number of venomous cobras in the city.

To incentivize capturing and killing cobras, the British government created a bounty system. They would pay out for every dead cobra brought in.

At first, this was a successful strategy. Large numbers of snakes were killed for the reward. After initial success, some entrepreneurial folks had a great idea: cobra farms. They would breed cobras, kill them, hand over the carcasses, and collect the bounty. The program was scrapped when British authorities became aware of these startups.

Because the snakes were worthless without the bounty and cost money to keep, the breeders set them free. The end result of the British incentive was that Delhi had a larger cobra population than when the program started.

This story is an instance of “Rule Beating,” where the rules are followed but the desired outcome fails to be achieved.

Prompt: When have you seen incentives used in creative ways? If you were observing this from afar, how did that make you feel? What was the end result? Tell a story.

System Design: The Goal of Sailboat Design

Once upon a time, people raced sailboats not for millions of dollars or for national glory, but just for the fun of it.

They raced the boats they already had for normal purposes, boats that were designed for fishing, or transporting goods, or sailing around on weekends.

It was quickly observed that races are more interesting if the competitors are roughly equal in speed and maneuverability. So rules evolved that defined various classes of boat by length, sail area, and other parameters, and that restricted races to competitors of the same class.

Soon boats were being designed not for normal sailing, but for winning races within the categories defined by the rules. They squeezed the last possible burst of speed out of a square inch of sail, or the lightest possible load out of a standard-sized rudder. These boats were strange-looking and strange-handling, not at all the sort of boat you would want to take out fishing or for a Sunday sail. As the races became more serious, the rules became stricter and the boat designs more bizarre.

Now racing sailboats are extremely fast, highly responsive, and nearly unseaworthy. They need athletic and expert crews to manage them. No one would think of using an America’s Cup yacht for any purpose other than racing within the rules. The boats are so optimized around the present rules that they have lost all resilience. Any change in the rules would render them useless.

Prompt: Have you ever seen hyper-efficiency decrease the resilience of a system, where one small gust of wind would capsize the whole thing? Tell a story.

System Trap: Success to the Successful

One of the strongest predictors of future earnings in the 21st-century US is education. The more educated people are, the more they are likely to earn over their lifetimes. Millions of young people delay (somewhat) the gratifications of adult life to spend another four-plus years learning in the American university system.

Competition for admission to elite schools is fierce, with some parents going as far as breaking the law: fabricating athletic histories and colluding with coaches to secure sports-related slots and scholarships. For those not willing to break the law, strong standardized test scores and multiple extracurricular activities are required.

And what’s the strongest predictor of a child’s standardized test scores? Their parents’ income. The same holds for university admission: the higher the parents’ income, the more likely their children are to attend a four-year university.

Prompt: When have you seen a “success to the successful” mechanic at work? Was the cycle interrupted? Tell a story.

System Trap: Tragedy of the Commons

Picture a pasture open to all. It is to be expected that each herdsman will try to keep as many cattle as possible on the commons…

Explicitly or implicitly, more or less consciously, he asks, “What is the utility to me of adding one more animal to my herd?”...

Since the herdsman receives all the proceeds from the sale of the additional animal, the positive utility is nearly +1… Since, however, the effects of overgrazing are shared by all, … the negative utility for any particular decision-making herdsman is only a fraction of -1.

The rational herdsman concludes that the only sensible course for him to pursue is to add another animal to his herd. And another; and another… But this is the conclusion reached by each and every rational herdsman sharing a commons. Therein is the tragedy. Each is locked into a system that compels him to increase his herd without limit—in a world that is limited. Ruin is the destination toward which all rush, each pursuing his own best interest.

— Garrett Hardin, “The Tragedy of the Commons,” 1968
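
To make Hardin’s arithmetic concrete, here is a minimal sketch in Python. The specific numbers (ten herdsmen, an overgrazing cost equal to one animal’s value) are illustrative assumptions, not figures from the essay.

```python
# Hardin's utility arithmetic for one herdsman deciding whether to add
# an animal. The numbers (10 herdsmen, overgrazing cost equal to one
# animal's value) are illustrative assumptions.
n_herdsmen = 10
gain_to_owner = 1.0   # owner keeps all proceeds of the extra animal: +1
shared_cost = 1.0     # cost of overgrazing, spread across all herdsmen

net_to_owner = gain_to_owner - shared_cost / n_herdsmen   # +0.9
net_to_commons = gain_to_owner - shared_cost              #  0.0, or worse

print(f"Net utility to the deciding herdsman: {net_to_owner:+.1f}")
print(f"Net utility to the group as a whole:  {net_to_commons:+.1f}")
```

Every individual decision pencils out positive even though the collective return is zero or negative, which is exactly the lock-in Hardin describes.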

This might feel like a familiar one if you’ve worked in a monolith. Thinking in Systems says there are three ways out:

  1. Educate and exhort. Help people to see the consequences of unrestrained use of the commons. Appeal to their morality. Persuade them to be temperate. Threaten transgressors with social disapproval or eternal hellfire.
  2. Privatize the commons. Divide it up, so that each person reaps the consequences of his or her own actions. If some people lack the self-control to stay below the carrying capacity of their own private resource, those people will harm only themselves and not others.
  3. Regulate the commons. Garrett Hardin calls this option, bluntly, “mutual coercion, mutually agreed upon.” Regulation can take many forms, from outright bans on certain behaviors to quotas, permits, taxes, incentives. To be effective, regulation must be enforced by policing and penalties.

Prompt: When have you seen a Tragedy of the Commons in your work? Was the commons ever repaired, or did it stay a mess forever? If it was repaired, how? Tell a story.

System Trap: Drift to Low Performance

The years before 2008 were different.

For many, it was much easier to buy a home. Income requirements for individuals and families had slowly loosened over time.

The banks holding these loans would repackage them, mixing the bad with the good. Yet almost all of these loan packages were rated as solid investments by reputable parties, like Moody’s.

Why was that? The banks were customers of Moody’s, and they would often shop for the agency that would give them the best rating. A higher rating meant being able to reach a wider market. So although these packages were given AAA ratings, the mortgages within them were junk.

Well, we know how that ended. (For those too young: A global collapse of the housing market and a deep recession that took a decade to dig out of.)

Prompt: When in your career have you seen safety norms erode over time? Were they corrected? What was the end result? Tell a story.

Gall’s Law

A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over with a working simple system.

— John Gall, Systemantics, 1975

Prompt: Have you ever seen a complicated system built from scratch succeed on delivery? Is there a relationship between complex systems and big rewrites? Why does Gall’s Law exist? Tell a story.

Break: A Documentary

If you’re running out of stories or need a break from talking, here’s a short documentary on safety in complex systems. It’s recommended that you not watch this first. If you’ve reached this point and are feeling spent, please also consider the option of ending early.