Thinking, Fast and Slow by Daniel Kahneman
(Winner of the Nobel Prize in Economics)
[One of the best books I have read on decision making - a very insightful book indeed. It talks about how people make decisions and divides thinking into System 1 and System 2.]
Psychologists have been intensely interested for several decades in these two modes of thinking, and refer to two systems in the mind, System 1 and System 2. These short names take up less space in your working memory than longer labels (say, "automatic thinking process" and "effortful thinking process"), which is why they are used. Anything that occupies your working memory reduces your ability to think.
System 1 operates automatically and quickly with little or no effort and no sense of voluntary control
System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice and concentration.
The highly diverse operations of System 2 have one feature in common: they require attention and are disrupted when attention is drawn away. Intense focusing on a task can make people effectively blind, even to stimuli that normally attract attention. See the video: http://www.youtube.com/watch?v=IGQmdoK_ZfY
System 1 continuously generates suggestions for System 2: impressions, intuitions, intentions, and feelings. If endorsed by System 2, impressions and intuitions turn into beliefs and impulses turn into voluntary actions. When all goes smoothly, which is most of the time, System 2 adopts the suggestions of System 1 with little or no modification. You generally believe your impressions and act on your desires, and that is fine - usually.
When System 1 runs into difficulty, it calls on System 2 to support more detailed and specific processing that may solve the problem of the moment. System 2 takes over when things get difficult and normally has the last word. One limitation of System 1 is that it cannot be turned off. One such example is the Muller-Lyer illusion. To resist the illusion, there is only one thing you can do: you must learn to mistrust your impressions of the length of lines when fins are attached to them (or similar things in other cases). To implement the rule, you must be able to recognize the illusory pattern and recall what you know about it.
Phrases that are in this context:
“He had an impression, but some of his impressions are illusions”
“This was a pure System 1 response. She reacted to the threat before she recognized it”
“This is your System 1 talking. Slow down and let your System 2 take control”
Psychologist Hess says our pupils are sensitive indicators of mental effort - they dilate substantially when people use their brains (say, when multiplying two numbers), and they dilate more if the problems are hard than if they are easy. His observations indicated that the response to mental effort is distinct from emotional arousal. As you become skilled in a task, its demand for energy diminishes. Studies of the brain have shown that the pattern of activity associated with an action changes as skill increases. The law of least effort applies to cognitive as well as physical activity.
Phrases that are in this context:
“I won’t try to solve this while driving. This is a pupil-dilating task. It requires mental effort”
“The law of least effort is operating here. He will think as little as possible”
“She did not forget about the meeting. She was completely focused on something else when the meeting was set and she just did not hear you”
“What came quickly to my mind was an intuition from System 1. I’ll have to start over and search my memory deliberately”
System 1 has more influence on behavior when System 2 is busy. People who are cognitively busy are also more likely to make selfish choices, use sexist language and make superficial judgments in social situations. Too much concern about how well one is doing in a task sometimes disrupts performance by loading short-term memory with pointless anxious thoughts - self-control requires attention and effort. The nervous system consumes more glucose than most other parts of the body, and effortful mental activity appears to be especially expensive in the currency of glucose.
Lazy System 2: here is a simple puzzle.
A bat and ball cost $1.10. The bat costs one dollar more than the ball.
How much does the ball cost? More than 50% of students at Harvard, MIT and Princeton gave the intuitive - incorrect - answer (they said 10 cents for the ball. If you really let System 2 do the math, you will find the cost of the ball is 5 cents).
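A quick way to see why 5 cents is right is to write out the arithmetic System 2 would do. The sketch below is my own illustration (the variable names are invented), not something from the book:

```python
# Minimal sketch of the bat-and-ball arithmetic (illustrative only).
# Let ball = x and bat = x + 1.00. The pair costs 1.10, so 2x + 1.00 = 1.10.
ball = (1.10 - 1.00) / 2          # x = 0.05
bat = ball + 1.00                 # 1.05

assert abs(ball + bat - 1.10) < 1e-9   # total really is $1.10
assert abs(bat - ball - 1.00) < 1e-9   # bat costs exactly $1 more

print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")  # ball = $0.05, bat = $1.05
# The intuitive answer (10 cents) fails the check: 0.10 + 1.10 = 1.20, not 1.10.
```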
Phrases that are in this context:
“She did not have to struggle to stay on task for hours. She was in a state of flow”
“His ego was depleted after a long day of meetings. So he just turned to standard operating procedures instead of thinking through the problem”
“He did not bother to check whether what he said made sense. Does he usually have a lazy System 2 or was he unusually tired?”
“Unfortunately she tends to say the first thing that comes into her mind. She probably also has trouble delaying gratification. Weak system 2”
You think with your body, not only with your brain. The mechanism that causes these mental events has been known for a long time: it is the association of ideas. The Scottish philosopher David Hume reduced the principles of association to three: resemblance, contiguity in time and place, and causality. An idea that has been activated does not merely evoke one other idea. It activates many ideas, which in turn activate others. Only a few of the activated ideas will register in consciousness; most of the work of associative thinking is silent, hidden from our conscious selves.
Phrases that are in this context:
“The sight of all these people in uniforms does not prime creativity”
“The world makes much less sense than you think. The coherence comes mostly from the way your mind works”
“They were primed to find flaws and this is exactly what they found”
“His system 1 constructed a story and his System 2 believed it. It happens to all of us”
“I made myself smile and I’m actually feeling better”
One of the dials measures cognitive ease, and its range is between ‘easy’ and ‘strained’. ‘Easy’ is a sign that things are going well, and ‘strained’ indicates that a problem exists which will require increased mobilization of System 2. For example, a sentence that is printed in a clear font, or has been repeated, or has been primed will be processed fluently, with cognitive ease. Conversely, you experience cognitive strain when you read instructions in a poor font or in faint colors or worded in complicated language, or when you are in a bad mood, and even when you frown.
Phrases that are in this context:
“Let’s not dismiss their business plan just because the font makes it hard to read”
“We must be inclined to believe it because it has been repeated so often, but let’s think it through again”
“Familiarity breeds liking. This is a mere exposure effect.”
“I’m in a very good mood today and my System 2 is weaker than usual. I should be extra careful”
Psychologist Paul Bloom presented the provocative claim that our inborn readiness to separate physical and intentional causality explains the near universality of religious beliefs. He observes that “we perceive the world of objects as essentially separate from the world of minds, making it possible for us to envision soulless bodies and bodiless souls”. In Bloom’s view, the two concepts of causality were shaped separately by evolutionary forces, building the origins of religion into the structure of System 1.
“Her favorite position is beside herself and her favorite sport is jumping to conclusions”
Jumping to conclusions on the basis of limited evidence is so important to an understanding of intuitive thinking, and comes up so often in this book, that I will use a cumbersome abbreviation for it: WYSIATI - what you see is all there is. System 1 is radically insensitive to both the quality and the quantity of the information that gives rise to impressions and intuitions.
WYSIATI explains a long and diverse list of biases of judgment and choice, including the following, among many others:
Overconfidence: The confidence that individuals have in their beliefs depends mostly on the quality of the story they can tell about what they see, even if they see little.
Framing effects: ‘90% fat free’ is more attractive than ‘10% fat’. The equivalence of the alternative formulations is transparent, but an individual normally sees only one formulation, and what she sees is all there is.
Base-rate neglect: The relevant statistical fact (the base rate) rarely comes to mind when you first encounter questions of that sort.
Phrases that are in this context:
“She knows nothing about the person’s management skills. All she is going by is the halo effect from a good presentation”
“Let’s de-correlate errors by obtaining separate judgments on the issue before any discussion. We will get more information from independent assessments”
“They made that big decision on the basis of a good report from one consultant. WYSIATI- what you see is all there is. They did not seem to realize how little information they had”
“They didn’t want more info that might spoil their story. WYSIATI”
System 1 represents categories by a prototype or a set of typical exemplars; it deals well with averages but poorly with sums. The size of the category, the number of instances it contains, tends to be ignored in judgments of what I will call sum-like variables. System 1 carries out many computations at any one time; some of these are routine assessments that go on continuously. Whenever your eyes are open, your brain computes a three-dimensional representation of what is in your field of vision, complete with the shape of objects, their position in space and their identity. No intention is needed to trigger this operation or the continuous monitoring for violated expectations.
Phrases that are in this context:
“Evaluating people as attractive or not is a basic assessment. You do that automatically whether or not you want to, and it influences you”
“There are circuits in the brain that evaluate dominance from the shape of the face. He looks the part for a leadership role”
“The punishment won’t feel just unless its intensity matches the crime... Just like you can match the loudness of a sound to the brightness of a light”
“This was a clear instance of a mental shotgun. He was asked whether he thought the company was financially sound, but he could not forget that he likes their product”
For most of us, this impression of 3-D size is overwhelming. Only visual artists and experienced photographers have developed the skill of seeing the drawing as an object on the page. For the rest of us, substitution occurs: the dominant impression of 3-D size dictates the judgment of 2-D size. The illusion is due to a 3-D heuristic. What happens here is a true illusion, not a misunderstanding of the question. The essential step in the heuristic - the substitution of three-dimensional for two-dimensional size - occurs automatically. The picture contains cues that suggest a 3-D interpretation. These cues are irrelevant to the task at hand - the judgment of the size of the figure on the page - and you should have ignored them, but you could not. The bias associated with the heuristic is that objects that appear to be more distant also appear to be larger on the page.
Phrases that are in this context:
“Do we still remember the question we are trying to answer? Or have we substituted an easier one?”
“The question we face is whether this candidate can succeed. The question we seem to answer is whether she interviews well. Let’s not substitute.”
“He likes the project, so he thinks its costs are low and its benefits are high. Nice example of the affect heuristic.”
“We are using last year’s performance as a heuristic to predict the value of the firm several years from now. Is this heuristic good enough? What other information do we need?”
Amos and I called this “belief in the law of small numbers”, and we explained that intuitions about random sampling appear to satisfy the law of small numbers, which asserts that the law of large numbers applies to small samples as well. The remedy: treat your statistical intuitions with proper suspicion and replace impression formation by computation whenever possible.
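As a rough illustration of why small samples mislead (my own sketch, not from the book): with a fair coin, small samples produce lopsided results far more often than large samples, yet intuition treats both as equally representative of the underlying process.

```python
import random

def extreme_rate(sample_size, trials=100_000, threshold=0.8):
    """Fraction of samples from a fair coin in which the share of heads
    is at least `threshold` or at most 1 - `threshold`."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        share = heads / sample_size
        if share >= threshold or share <= 1 - threshold:
            extreme += 1
    return extreme / trials

# Small samples look lopsided surprisingly often; large samples almost never do.
print(extreme_rate(5))    # roughly 0.37
print(extreme_rate(100))  # close to 0
```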
Such mistakes are mentioned in these phrases
“Yes, the studio has had three successful films since the new CEO took over. But it is too early to declare he has a hot hand”
“I won’t believe that the new trader is a genius before consulting a statistician who could estimate the likelihood of his streak being a chance event”
“The sample of observations is too small to make any inferences. Let’s not follow the law of small numbers”
“I plan to keep the results of the experiment secret until we have a sufficiently large sample. Otherwise we will face pressure to reach a conclusion prematurely”
Anchoring effect: It occurs when people consider a particular value for an unknown quantity before estimating that quantity.
Is the height of the tallest redwood more or less than 1,200 feet?
What is your guess about the height of the tallest redwood?
The high anchor in this experiment was 1,200 feet, and people tend to give a higher answer to the second question because of the anchor mentioned in the first question. The same thing happens in any negotiation and in the bazaar; the initial anchor has a powerful effect. If the other side makes an outrageous proposal, there is no point in making an equally outrageous counteroffer, creating a gap that will be difficult to bridge in further negotiations. Instead you should make a scene, storm out or threaten to do so, and make it clear - to yourself as well as to the other side - that you will not continue the negotiation with that number on the table.
Phrases that are in this context:
“The firm we want to acquire sent us their business plan, with the revenue they expect. We should not let that number influence our thinking. Set it aside”
“Plans are best-case scenarios. Let’s avoid anchoring on plans when we forecast actual outcomes. Thinking about ways the plan could go wrong is one way to do it”
“Our aim in the negotiation is to get them anchored on this number”
“Let’s make it clear that if that is their proposal, the negotiations are over. We do not want to start there”
“The defendant’s lawyers put in a frivolous reference in which they mentioned a ridiculously low amount of damages and they got the judge anchored on it”
The availability heuristic, like other heuristics of judgment, substitutes one question for another: you wish to estimate the size of a category or the frequency of an event, but you report an impression of the ease with which instances come to mind. Substitution of questions inevitably produces systematic errors. You can discover how the heuristic leads to biases by following a simple procedure: list factors other than frequency that make it easy to come up with instances.
Phrases that are in this context:
“Because of the coincidence of two planes crashing last month, she now prefers to take the train. That’s silly. The risk hasn’t really changed; it is an availability bias”
“He underestimates the risks of indoor pollution because there are few media stories on them. That’s an availability effect. He should look at the statistics”
“She has been watching too many spy movies recently, so she is seeing conspiracies everywhere”
“The CEO has had several successes in a row, so failure does not come easily to her mind. The availability bias is making her overconfident”
Judging probability by representativeness has important virtues: the intuitive impressions that it produces are often more accurate than chance guesses would be. One sin of representativeness is an excessive willingness to predict the occurrence of unlikely (low base-rate) events. The second sin of representativeness is insensitivity to the quality of evidence. The correct answer is that you should stay very close to your prior beliefs, slightly reducing the initially high probabilities of well-populated fields and slightly raising the low probabilities of rare specialties. You should not let yourself believe whatever comes to your mind. To be useful, your beliefs should be constrained by the logic of probability. Rev. Thomas Bayes is credited with the first major contribution to a large problem: the logic of how people should change their minds in the light of evidence. Bayes’ rule specifies how prior beliefs (base rates) should be combined with the diagnosticity of the evidence, the degree to which it favors the hypothesis over the alternative. In short, the essential keys to disciplined Bayesian reasoning can be summarized in the two points below (a small numeric sketch follows them):
Anchor your judgment of the probability of an outcome on a plausible base rate
Question the diagnosticity of your evidence.
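Here is a minimal numeric sketch of those two points (the numbers are invented for illustration): start from the base rate, multiply the prior odds by the likelihood ratio of the evidence (its diagnosticity), and convert back to a probability.

```python
def bayes_update(prior, likelihood_ratio):
    """Combine a prior probability (base rate) with the diagnosticity of the
    evidence, expressed as a likelihood ratio:
    P(evidence | hypothesis) / P(evidence | alternative)."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Invented example: a 3% base rate and evidence that is 4x more likely
# under the hypothesis than under the alternative.
posterior = bayes_update(prior=0.03, likelihood_ratio=4.0)
print(f"{posterior:.3f}")  # ~0.110 - the evidence shifts the estimate, but the base rate still anchors it
```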
Phrases that are in this context:
“The lawn is well trimmed, the receptionist looks competent... but this does not mean it is a well-managed company. I hope the board does not go by representativeness”
“The start-up looks as if it could not fail, but the base rate of success in the industry is extremely low. How do we know this case is different?”
“They keep making the same mistake: predicting rare events from weak evidence. When the evidence is weak, one should stick with the base rate.”
“I know this report is absolutely damning, and it may be based on solid evidence, but how sure are we? We must allow for that uncertainty in our thinking.”
The word fallacy is used when people fail to apply a logical rule that is obviously relevant. If you visit a courtroom you will observe that lawyers apply two styles of criticism: to demolish a case they raise doubts about the strongest arguments that favor it; to discredit a witness, they focus on the weakest part of the testimony. The focus on weakness is also normal in political debate.
“They constructed a very complicated scenario and insisted on calling it highly probable - it is not: it is only a plausible story”
“They added a cheap gift to the expensive gift and made the whole deal less attractive. Less is more in this case.”
“In most situations, a direct comparison makes people more careful and more logical. But not always. Sometimes intuition beats logic even when the correct answer stares you in the face.”
There are two types of base rates. Statistical base rates are facts about a population to which a case belongs, but they are not relevant to the individual case. Causal base rates change your view of how the individual case came to be. The two are treated differently.
Statistical base rates are generally underweighted and sometimes neglected altogether when specific info about the case at hand is available.
Causal base rates are treated as info about the individual case and are usually combined with other case-specific info.
There is a deep gap between our thinking about statistics and our thinking about individual cases. Statistical results with a causal interpretation have a stronger effect on our thinking than noncausal info. But even compelling causal statistics will not change long-held beliefs or beliefs rooted in personal experience. On the other hand, surprising individual cases have a powerful impact and are a more effective tool for teaching psychology, because the incongruity must be resolved and embedded in a causal story.
“We can’t assume that they will really learn anything from mere statistics. Let’s show them one or two representative individual cases to influence their System 1”
“No need to worry about this statistical info being ignored. On the contrary it will immediately be used to feed a stereotype”.
Subjective confidence in a judgment is not a reasoned evaluation of the probability that the judgment is correct. Confidence is a feeling, which reflects the coherence of the info and the cognitive ease of processing it. It is wise to take admissions of uncertainty seriously, but declarations of high confidence mainly tell you that an individual has constructed a coherent story in his mind, not necessarily that the story is true.
Errors of prediction are inevitable because the world is unpredictable, and high subjective confidence is not to be trusted as an indicator of accuracy.
Competition Neglect:
We focus on our goal, anchor on our plan, and neglect relevant base rates, exposing ourselves to the planning fallacy
We focus on what we want to do and can do, neglecting the plans and skills of others
Both in explaining the past and in predicting the future, we focus on the causal role of skill and neglect the role of luck. We are therefore prone to an illusion of control
We focus on what we know and neglect what we do not know, which makes us overly confident in our work
[President Truman, fed up with two-sided advice (e.g., “this is recommended, but on the other hand it is not practical”), asked for a ‘one-armed economist’ who would take a clear stand; he was sick and tired of economists who kept saying, “On the other hand...”]
Bad events: The concept of loss aversion is certainly the most significant contribution of psychology to behavioral economics. This is odd, because the idea that people evaluate many outcomes as gains and losses, and that losses loom larger than gains, surprises no one. Bad emotions, bad parents and bad feedback have more impact than good ones, and bad info is processed more thoroughly than good. Bad impressions and bad stereotypes are quicker to form and more resistant to disconfirmation than good ones.
Loss aversion refers to the relative strength of two motives. We are driven more strongly to avoid losses than to achieve gains. People often adopt short-term goals that they strive to achieve but not necessarily to exceed. They are likely to reduce their efforts when they have reached an immediate goal with results that sometimes violate economic logic. Animals, including people, fight harder to prevent losses than to achieve gains. When a territory holder is challenged by a rival, the owner almost always wins the contest
The psychology of high-prize lotteries is similar to the psychology of terrorism. The thrilling possibility of winning the big prize is rewarded by pleasant fantasies. Highly unlikely events are either ignored or overweighted. The great Paul Samuelson famously asked a friend whether he would accept a gamble on the toss of a coin in which he could lose $100 or win $200. His friend responded, “I won’t bet because I would feel the $100 loss more than the $200 gain”. If you have the emotional discipline this rule requires, you will never consider a small gamble in isolation. The advice is not impossible to follow. Experienced traders in financial markets live by it every day, shielding themselves from the pain of losses by broad framing.
“He has separate mental accounts for cash and credit purchases. I constantly remind him that money is money”
“We discovered an excellent dish at that restaurant and we never try anything else, to avoid regret”
“The salesperson showed me the most expensive car seat and said it was the safest, and I could not bring myself to buy the cheaper model. It felt like a taboo tradeoff”
A bad outcome is much more acceptable if it is framed as the cost of a lottery ticket that did not win than if it is simply described as losing a gamble. Losses evoke stronger negative feelings than costs. There was a debate about whether gas stations should be allowed to charge different prices for purchases paid with cash or by credit card. The credit card lobby pushed hard against differential pricing, but it had a fallback position: the difference, if allowed, would be labeled a cash discount, not a credit surcharge. People will more readily forgo a discount than pay a surcharge. The two are economically equivalent, but they are not emotionally equivalent.
Jeremy Bentham famously said, “Nature has placed mankind under the governance of two sovereign masters, pain and pleasure. It is for them alone to point out what we ought to do as well as to determine what we shall do”. An inconsistency is built into the design of our minds. We have strong preferences about the duration of our experiences of pain and pleasure. We want pain to be brief and pleasure to last. But our memory, a function of System 1, has evolved to represent the most intense moment of an episode of pain or pleasure (the peak) and the feeling when the episode was at its end. A memory that neglects duration will not serve our preference for long pleasure and short pain.
Duration neglect and the peak-end rule originate in System 1 and do not necessarily correspond to the values of System 2.
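A toy illustration of duration neglect and the peak-end rule (my own sketch; the pain ratings are invented): remembered pain roughly tracks the average of the worst moment and the final moment, while experienced pain is the total over the whole episode, so a longer episode with a gentler ending can be remembered as better even though it contained more pain.

```python
def remembered_pain(episode):
    """Peak-end rule: memory is approximated by the mean of the worst
    moment and the last moment; duration is largely ignored."""
    return (max(episode) + episode[-1]) / 2

def experienced_pain(episode):
    """Total pain actually experienced, summed over every minute."""
    return sum(episode)

# Invented per-minute pain ratings (0-10):
short_episode = [4, 7, 8]            # shorter, but ends at its worst
long_episode  = [4, 7, 8, 5, 3, 1]   # longer, with a gentler ending

print(experienced_pain(short_episode), experienced_pain(long_episode))  # 19 vs 28
print(remembered_pain(short_episode), remembered_pain(long_episode))    # 8.0 vs 4.5
# System 2 would prefer the episode with less total pain; System 1's memory
# prefers the longer one, because its peak-end average is lower.
```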
Books referred to in this book:
George Polya - How to Solve It
Max Bazerman - Judgment in Managerial Decision Making
Gary Klein - Sources of Power
Philip Rosenzweig - The Halo Effect
Burton Malkiel - A Random Walk Down Wall Street
Philip Tetlock - Expert Political Judgment: How Good Is It? How Can We Know?
Paul Meehl - Clinical vs. Statistical Prediction: A Theoretical Analysis and a Review of the Evidence
Robyn Dawes - The Robust Beauty of Improper Linear Models in Decision Making
Atul Gawande - The Checklist Manifesto
Watch Verdi’s opera La Traviata. http://www.youtube.com/watch?v=-cRTlrCYDSQ