

Future Babble

Dan Gardner

Why Expert Predictions Are Next to Worthless, and You Can Do Better



Economists have a virtually unblemished record of failing to predict recessions. It is a near-universal truth that economists' predictions are least accurate when they are most needed.

So: 1) why do expert predictions fail? And 2) why do we believe them anyway?

1) The world is complex and our brains make systematic mistakes - we are trying to predict an unpredictable world with an error-prone brain.

2) we have a hard-wired aversion to uncertainty:
- we don't like not knowing what is coming, so we try to eliminate uncertainty as much as possible
- we see patterns where there are none
- we treat random results as if they are meaningful
- we treasure stories that replace the complexity and uncertainty of real life with simple narratives
- sometimes we create our own stories, but we know these are made up and we know we're not 'experts'
- so stories made up by supposed authority figures have much more credibility
- and the media prefer experts who are confident and conclusive
- the vague and waffly ones don't appear
- and that's how we like it, because certainty is how we want to feel

Philip Tetlock: Foxes and Hedgehogs
- found 284 prognosticators and collected from them 27,500 predictions about the future
- analysis showed they'd have been just as successful if they'd got a chimp to throw darts at a board

The worst (ie the ones who scored worse than if they'd just flipped a coin) were the ones who didn't like doubt or uncertainty. They had one Big Idea, one way of interpreting the world, and they used it as a template, simplifying everything to fit.

The ones who did better than average tried to synthesise information from a range of sources, rather than discarding data that didn't fit a Big Idea. They were self-critical and happy to acknowledge mistakes.

The paradox: the most accurate experts were the ones least confident of being right.

The bigger the media profile of the expert, the less accurate his predictions.

Fox/hedgehog comes from the ancient Greek poet Archilochus, who said 'the fox knows many things but the hedgehog knows one big thing'. Paul Ehrlich is a hedgehog, as are most media pundits.

More expertise means more knowledge, which produces more details and more complexity. With more details there's more potential conflict - you never find all the facts lining up and pointing one way. Foxes are ok with that sort of ambiguity - 'maybe' is an acceptable answer. But hedgehogs find that unacceptable. They want simple and certain answers, and they're certain they can get them via their One Big Idea that drives their thinking.

With this mindset, the hedgehog's extra knowledge doesn't challenge his existing bias, it supercharges it. Expertise boosts the hedgehog's ability to see patterns that aren't there, and to deal with contradictory evidence by rationalising it away or twisting it until it actually supports his conclusion.

Hedgehogs are far more confident than foxes - they never accept that they might be wrong, and they use words like 'certain' and 'impossible'. In the 1980s just about all the experts were predicting that Japan was going to take over the world. What actually happened was that Japan crashed and burned, while the USA entered a tech-driven golden age. Why such a clash? Because we take existing trends and extrapolate. But that's like driving with no hands - fine when the road is straight, catastrophic when you reach a sharp bend, which you inevitably will.

Status Quo bias - it's a truism in literature that novels set in the future say more about the time they were written in, and little or nothing about the future.

Just about everything we are interested in predicting is non-linear, thanks to:
- 1) chaos - the cumulative effect of a butterfly fluttering its wings in Brazil (see the sketch below)
- 2) feedback - one part of the system slowing/increasing/changing another part - and often multiple feedback loops
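
A toy illustration of point 1), not from the book: the logistic map is a standard textbook example of chaos, and a minimal Python sketch shows two starting values that differ only in the ninth decimal place bearing no resemblance to each other within a few dozen steps.

r = 4.0  # logistic map x -> r*x*(1-x) is chaotic at r = 4
a, b = 0.400000000, 0.400000001  # nearly identical starting points

for step in range(1, 51):
    a = r * a * (1 - a)
    b = r * b * (1 - b)
    if step % 10 == 0:
        # the gap between the two trajectories explodes from ~1e-9 to order 1
        print(f"step {step:2d}: a={a:.6f}  b={b:.6f}  gap={abs(a - b):.6f}")

The forecasting moral: even with a perfect model, a tiny error in measuring the starting conditions swamps the prediction.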

And this is just for simple physical objects like billiard balls - now imagine the billiard ball has eyes, self-awareness and complicated psychological motivations.

Yet social scientists persist in making predictions as if people obeyed the old clockwork idea of predictable science.

Random events are unpredictable too - such as the fall of the Berlin Wall on Nov 9 1989. An East German spokesman announced on TV that East Germans would be allowed to visit West Germany. The new policy was meant to come into effect the next day, with strict conditions, but that part wasn't in his script. So when the spokesman was asked when the policy came into effect, he looked through his papers, shrugged and said "immediately." Thousands took him at his word and descended on the border gates. The guards had also seen the announcement, so they opened the gates. The strict conditions never had a chance to be implemented. History turned on the misuse of a single word.

Demographic predictions are often flawed. Trends change all the time. Even basic facts change. You would think that in 1952 they could tell you what the population was in 1951. But the UN Demographic Yearbook changed its official estimate of the 1951 population 17 times between 1952 and 1996.

Price of oil: "Maybe the era of Mad Max really is coming, finally. Or maybe cheap oil will rise from the dead again. Or maybe new technologies will surprise us all and create a future quite unlike anything we imagine. The simple truth is that no one really knows, and no one will know until the future becomes the present. The only thing we can say with confidence is that when the time comes, there will be experts who are sure they know what the future holds."

We are just not designed for the life we lead. We have spines that let us walk upright but are so flawed that we are routinely disabled by back pain. Our vision is marred by a blind spot because our retina is wired backwards. We have wisdom teeth that emerge to inflict pain for no good reason. We live in the Information Age but our brains are still in the Stone Age.

We think we have one mind, but every time we make a decision we find we have two. We have a conscious mind, and because it's the only thing we're aware of, we think that's 'me'. But most of what our brain does happens below the conscious level. Our unconscious mind delivers feelings, hunches and intuitions. Judgement usually comes from the unconscious mind, which operates quickly on the basis of scant evidence, and then passes its verdicts to the conscious system, which slowly and deliberately adjusts them.

Richard Wiseman's wallet experiment: 240 wallets dropped around Edinburgh - no money, just a variety of personal items, plus a photo. Return rates varied with the photo:
- no photo: 15% returned
- elderly couple: 25%
- family: 48%
- puppy: 53%
- baby: 88%

Our unconscious mental system equates a picture with reality, whether it's a baby or a naked woman - which sounds silly until you realize our Stone Age brain has very little experience with unreal images.

In the same way, jilted girlfriends tear up photos of their exes, and parents can't bring themselves to rip up pictures of their kids (even duplicates).

We have a huge problem with randomness. We understand it intellectually but we don't get it intuitively. So people will believe a slot machine is 'due' to pay out.

Apple had to change the iPod's random shuffle algorithm because, precisely because it was random, it sometimes threw up what looked like patterns. So they had to fiddle with the formula, making it less random so that it would seem more random.
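
Apple never published the replacement algorithm, but a hypothetical Python sketch of the general idea - constraining a true shuffle so the same artist never plays twice in a row - might look like this:

import random

def smooth_shuffle(tracks):
    # Hypothetical sketch, not Apple's actual algorithm: shuffle the
    # playlist, then skip over any track that would repeat the artist
    # just played. Less random, but it *feels* more random.
    pool = tracks[:]
    random.shuffle(pool)
    out = []
    while pool:
        # first track whose artist differs from the last one played
        # (fall back to index 0 if every remaining track matches)
        i = next((k for k, t in enumerate(pool)
                  if not out or t["artist"] != out[-1]["artist"]), 0)
        out.append(pool.pop(i))
    return out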

In the Stone Age there was no payoff for understanding randomness intuitively, but the ability to spot patterns and causal relationships was a matter of life and death. So pattern recognition got hard-wired into our genes.

Experiment where people were put in front of a red light and a green light and asked to guess which would light up next. It was rigged so that the red light came on 80% of the time. Every other creature tested - pigeons, rats, dogs - beat humans, because they just picked red every time. Humans guessed red only 80% of the time, because they knew red wasn't coming up every time - a strategy ('probability matching') that actually scores worse.
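
The arithmetic: always guessing red is right 80% of the time, while guessing red 80% of the time is right only 0.8×0.8 + 0.2×0.2 = 68% of the time. A quick Python simulation (a sketch of the logic, not the experiment's actual protocol):

import random

TRIALS = 100_000
reds = [random.random() < 0.8 for _ in range(TRIALS)]  # True = red light

# The animals' strategy: always guess red.
always_red = sum(reds) / TRIALS

# The humans' strategy: guess red 80% of the time (probability matching).
matching = sum((random.random() < 0.8) == light for light in reds) / TRIALS

print(f"always red: {always_red:.1%}   matching: {matching:.1%}")
# typical output: always red ~80.0%, matching ~68.0%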

Overconfidence is a universal human trait. Ask a smoker about the risk of lung cancer, and they know it's high. But their own risk? Not so high. Starting a new business? Most fail, but mine won't. Getting married? Everyone should have a prenup, but I don't need one.

Confirmation bias - once you come to a conclusion, you only notice data that supports it. We don't just fail to admit mistakes - we will do anything to avoid facing them.

Anchoring - it even affects psychology students who know they're being tested, and even when the anchor numbers are absurd. Ask one group 'Was Gandhi older or younger than 9 when he died?' and another group 'Was Gandhi older or younger than 140 when he died?', then ask both how old they think he actually was when he died. The first group's average guess: 50; the second group's: 67.

We are suddenly living in a world awash with visual information. Google 'Rome' and within seconds you'll have access to a selection of real-time live feeds that will show you Rome, no matter where you are in the world. Of course, no one is particularly impressed by this, because we've had it for, well, about 10 years.

But for every generation before us, this was simply inconceivable. And our brains are not equipped for it - they evolved in a time of extreme information scarcity.

We expect the future to be 'more of the same'. Immediately after Sept 11 2001, many Americans expected more terrorist attacks, even ones which would hit them personally. Yet the previous such attack had been the bombing of the World Trade Center in 1993, eight years earlier. But our Stone Age brains don't think that way. Shocking terrorist attack? Didn't see it coming? Let's imagine another shocking terrorist attack.

Groupthink is demonstrated in basic Psych 101 experiments where we abandon our own judgements in the face of group consensus. (Classic experiment where subjects are asked to guess the length of a piece of wood - all but one of the subjects are actually stooges who deliberately overestimate, and the real subject's estimate is reliably 'pulled' toward the rest of the group, however silly the consensus.) Irving Janis, in 'Victims of Groupthink', analysed the decisions on the defence of Pearl Harbor, the Bay of Pigs, and going to war in Korea and Vietnam, and showed that even though the experts were highly educated, experienced and skilled, they were still overwhelmed by conformity.

Tell clients what they already believe to be true and they will be happy. We all want our beliefs confirmed. But dispute those beliefs and the same psychology works against you: you risk losing the client and your reputation. Following the herd is usually the safest bet.

Status quo bias means predictions are most likely to be correct when current trends continue, and most likely to be wrong when there is drastic change. When the road is straight, anyone can see where it is going. It's the curves and corners that cause the crashes. So predictions are most likely to be right when they are least needed, and wrong when they are essential.

Superstition rises and falls in tandem with stress and uncertainty. People living in areas of risk (eg Israelis near Gaza) are more likely to engage in 'magical thinking'. Some churches strictly enforce a clearly defined moral and spiritual dogma; they insist that there is only one truth and that they know what it is. Others are more liberal, tolerating a wide range of beliefs and behaviours. Research showed that in uncertain times (such as the 1930s, 1970s and 2000s) membership of the stricter, more dogmatic churches went up at the expense of the more liberal ones.

Conspiracy theories: to the orderly mind of the conspiracy theorist, there is no such thing as randomness. Every accident, every coincidence, is meaningful. "History is a war between good and evil where everything unfolds according to a plan."

It's usually when things are going wrong that we worry most and look to 'experts' who can take away our uncertainty. But at such times, optimistic forecasts feel wrong; the gloom merchant's message is supported by our intuitive pessimism. Foxes can't deliver certainty - they see a range of possible futures and are cautious about probabilities. But hedgehogs give people the certainty they crave. Unfortunately this doesn't make them better at predicting the future; it makes them worse. Philip Tetlock actually measured it - he used Google hit counts to gauge the fame of each of his 284 'experts'. A very simple relationship emerged: the more famous the expert, the worse his predictions.

We trust people who call themselves experts. A strong, enthusiastic, confident speaking style beats mere rationality. Persuasion has little to do with intelligence, ability or facts - confidence convinces. In one experiment, a researcher phoned 22 different nursing stations on hospital wards, identified himself as a doctor, and told the nurse to give a patient a large dose of a certain drug.
The nurse should have refused because a) she didn't know the doctor, b) it was against hospital policy for doctors to give directions over the phone, c) the drug hadn't been cleared for use on this type of patient, and d) the label clearly said the maximum dose was half what the doctor had ordered. Despite all this, 95% of the nurses got the drug and were on their way to the patient when the researchers intercepted them. In effect, the nurses stopped thinking independently the moment they heard "doctor".

People don't want to hear stats that prove your performance; they want to hear stories. Stats are rational; people are not. You have to convince the client that you're caring and trustworthy. You do it with a good story - a confident and clear explanation of how you make decisions, for example, that gets the client nodding 'Yes, that makes sense'.

Paul Ehrlich is the gold standard. He is articulate, enthusiastic, authoritative, likeable (and wrong). His stories are simple, clear and compelling. He doesn't acknowledge mistakes, and never, ever says 'I don't know'. The fact that his predictions have been mostly wrong is irrelevant.

Julian Simon, a business professor at the University of Maryland, argued that humans are endlessly resourceful, and that if we were running out of something we'd find alternatives or new ways of getting it. He challenged Ehrlich to a bet that the price of any resource would decline over time (proving that it wasn't getting scarcer). Ehrlich accepted and chose copper, tin, chrome, nickel and tungsten. In 1980 they bought $1000 worth of each, to be sold in 1990. If the prices were higher, Ehrlich would win and collect the profits; if they were lower, Ehrlich would lose and have to pay for the losses. In 1990 all five metals were cheaper and Ehrlich had to write a cheque. But he refused to accept that he was wrong - Simon was just lucky. In fact the prediction of declining prices held true for 1990-2000 as well, but in the next decade, 2000-10, commodity prices soared. That shouldn't have happened, according to Simon, but it did.

Ehrlich has fans, some of them very committed, and they rationalize as intensively as Ehrlich does. "Oh, the reason the Population Bomb didn't come true was that it spurred people to take the necessary preventive measures, such as the Green Revolution and population control." A few minutes with Google would show that both these factors predated The Population Bomb. But the fan doesn't spend those few minutes, because he already has what he needed - a rationalization that soothes the cognitive dissonance the way aspirin soothes a headache.

The 'Jeane Dixon Effect' - she was a psychic renowned for making several supposedly accurate predictions, such as the assassination of JFK. But when you examine what she actually said, the predictions were a lot more general and ambiguous. Mainly, they were part of a long list of other predictions (the USSR would put the first man on the moon, China would start a world war with the US) that failed.










