Thinking, Fast and Slow

Thinking, Fast and Slow - Daniel Kahneman

In the highly anticipated Thinking, Fast and Slow, Kahneman takes us on a groundbreaking tour of the mind and explains the two systems that drive the way we think. System 1 is fast, intuitive, and emotional; System 2 is slower, more deliberative, and more logical. Kahneman exposes the extraordinary capabilities, and also the faults and biases, of fast thinking, and reveals the pervasive influence of intuitive impressions on our thoughts and behavior. The impact of loss aversion and overconfidence on corporate strategies, the difficulties of predicting what will make us happy in the future, the challenges of properly framing risks at work and at home, the profound effect of cognitive biases on everything from playing the stock market to planning the next vacation: each of these can be understood only by knowing how the two systems work together to shape our judgments and decisions.

Engaging the reader in a lively conversation about how we think, Kahneman reveals where we can and cannot trust our intuitions and how we can tap into the benefits of slow thinking. He offers practical and enlightening insights into how choices are made in both our business and our personal lives, and how we can use different techniques to guard against the mental glitches that often get us into trouble. Thinking, Fast and Slow will transform the way you think about thinking.

Published: 2011-10-25 (Farrar, Straus and Giroux)

ISBN: 9780374275631

Language: English

Format: Hardcover, 418 pages

Goodreads' rating: -

Reviews

Brooke rated it

Reading "Thinking, Fast and Slow" (the book choice for this month's local book club) was not exactly bedtime reading for me. I had already pre-judged it before I started reading... certain I would discover I'm a FAST, INTUITIVE-type thinker (quick, often influenced by emotion). Once in a while I use basic common-sense logic... but even then, it is usually with 'righteous emotions'. Just being honest! I understand this is an intellectual giant of a book about "how we think"... thinking 'deeply' about how we think... but this book hasn't changed me, transformed me, or enlightened me. Not so far. It's too technical. I understand the author is brilliant, but I found myself skimming pages. However, what I understood, I enjoyed. Kahneman has a great talent for being a slow, rational, logical, and reflective thinker. However, fast, intuitive thinking is more influential than we realize, he says, contrary to the belief that we are very rational decision-making people. A few things in the book... interesting information... Yet I still believe it's incomplete... that there are other ways of speaking about the way our minds work that are not found in this big book.
1) Two basic systems of thinking: System 1 is the intuitive, quick thinking; System 2 is the slow, rational, logical, and reflective thinking.
...... About 20% of our energy goes to our brain.
...... We tend to be lazy thinkers (the 'lazy controller', he calls it), and do not involve our slow-thinking brain unless it is needed.
...... A running theme in the book is that although the brain does contain a statistical algorithm, it is not accurate. The brain does not understand basic normal distribution.
...... Our brain often jumps to conclusions.
...... Our brain knows how to answer easy questions, like "What did you have for breakfast?"... but it is more challenging to answer the question, "How do you feel about yourself today?"
...... We have biases. Often stereotypes will override statistics (again, given that we have suggestible, lazy, judging brains).
...... He talks about predictions. For example, if a child gets great grades in the lower grades of school, we predict continued success... but we tend to overestimate our ability to predict the future.
...... When it comes to intuition versus formulas... often the formula does win.
...... We also are inclined to expect much more regularity in our lives than really exists.
You won't find any data in this book about "The Power of Now" thinking, or discussion about "You are not your Mind", chakras, or myths about healing... but it's a book about THE WAY WE THINK (technical... some of it I resonated with, but when it got too scientifically technical, he lost me). I look forward to my book club discussion; 25 people will be attending this month (many bright people)... I'm sure to gain value and more insights.

Zoe rated it

What a monstrous chore to read! I've been working on this book since September or August (4-6 months) and just could not take reading it for more than a few minutes at a time. Many times it put me to sleep. The book covered a lot of great material and really fascinating research, but oftentimes in such plodding, pedantic, meticulous detail as to nearly obfuscate the point. I had heard of the majority of the research (or at least its conclusions) as well, so while I thought it offered excellent insight and useful material for a lot of people to learn, I didn't think this collection of it (more of a history of the field than an introduction) added anything novel or unique for one already well versed in the material. I guess I didn't care for the details of how the studies were conducted for every minor point in the author's theories, though I largely agreed with the theories and interpretations. A line near the end of the book struck a dissonant chord with me, and I wonder if that offers an additional cause for my dislike: "That was my reason for writing a book that is oriented to critics and gossipers rather than to decision makers." I wouldn't count myself among 'decision makers' in any important sense (it's surprising how little responsibility a person can have sometimes!), but I often felt like the book wasn't speaking to me. Many times the author wrote "we think..." or "we act..." in a way that I don't think I would. This isn't to say I'm a purely 'rational agent' or 'Econ' or anything like that; the majority of the author's theories (thinking can be either instinctual or effortful, rational agents act differently than emotional humans, and the experiencing self and the remembering self are different things) are eminently true. But I do think he was generalizing for a WEIRD (Western, Educated, Industrialized, Rich, and Democratic) audience, and despite my background, I don't think I think that way.
Recommendation: read the introduction and the conclusion (and perhaps the major section intros), cherry-pick anything else of interest.

Brit rated it

Freeman Dyson wrote the New York Review of Books review, which has me swooning right there. Dyson was a particularly apt pick because Kahneman helped design the Israeli military's screening and training systems back when the country was young, and Dyson at 20 years old cranked statistics for British Bomber Command in its youth. Dyson was part of a small group that figured out the bombers were wrong about what mattered to surviving nighttime raids over Germany, something only about a quarter of the crews managed over a full tour. Dyson figured out the Royal Air Force's theories about who lived and died were wrong. But no data-driven changes were made, because the illusion of validity does not disappear just because facts prove it to be false. Everyone at Bomber Command, from the commander in chief to the flying crews, continued to believe in the illusion. The crews continued to die, experienced and inexperienced alike, until Germany was overrun and the war finally ended. http://www.nybooks.com/articles/archi... Why did the British military resist the changes? Because it was deeply inconsistent with the heroic story of the RAF they believed in. I suppose there are stories I'd die for too. But not the myth that Kahneman dethroned. Kahneman got the Nobel Prize in Economics for showing that the Rational Man of Economics model of human decision making was based on a fundamental misunderstanding of human decision making. We are not evolved to be rational wealth maximizers, and we systematically value and fear some things that should not be valued so highly or feared so much if we really were the Homo Economicus the Austrian School seems to think we should be. Which is personally deeply satisfying, because I never bought it, and deeply unsettling, because of how many decisions are made based on that vision. If that was all this book was, it'd just be another in a mass of books that have as their thesis "You're wrong about that!"
Which I appreciate knowing, but there's a point where it's a little eye-rolling, because they don't offer any helpful suggestions on how not to be wrong, or why these patterns of wrongness exist and endure. But Kahneman has a theory. He theorizes that humans have two largely separate decision-making systems: System One (the fast) and System Two (the slow). System One let us survive monster attacks and have meaningful relationships with each other. System Two let us get to the moon. Both systems have values built into them, and any system of decision-making that edits them out is doomed to undercut itself. Some specifics that struck me:

Ideomotor Effect (53). Concepts live in our heads in associative networks. Once triggered, they cascade into other concepts. Make someone walk slowly, and they think about old age. Make someone smile, and they'll be happier. Seeing a picture of cash makes us more independent, more selfish, and less likely to pick up something someone else has dropped. Seeing a locker makes us more likely to vote for school bonds. Reminding people of their mortality makes them more receptive to authoritarian ideas. (56) Studies of priming effects have yielded discoveries that threaten our self-image as conscious and autonomous authors of our judgments and our choices. (55)

Halo Effect (82). If you like the president's politics, you probably like his voice and appearance as well. We find someone attractive, and we conclude they're competent. We find emotional coherence pleasing and lack of coherence frustrating. However, far fewer things are correlated than we believe.

What You See Is All There Is (WYSIATI) (85). Our System 1 is pattern-seeking. Our System 2 is lazy, happy to endorse System 1 beliefs without doing the hard math. "Jumping to conclusions on the basis of limited evidence is so important to an understanding of intuitive thinking, and comes up so often in this book, that I will use a cumbersome abbreviation for it: WYSIATI. . . . System 1 is radically insensitive to both the quality and quantity of information that gives rise to impressions and intuitions." (86) Absolutely essential for not getting eaten by lurking monsters; it explains why we can think fast, and how we are able to make sense of partial information in a complex world. Much of the time, the coherent story we put together is close enough to reality to support reasonable action. Except when it doesn't. Like in our comparative risk assessments: we panic about shark attacks and fail to fear riptides; we freak out about novel and unusual risks and opportunities and undervalue the pervasive ones.

Answering an Easier Question (97). If one question is hard, we'll substitute an easier one. It can be a good way to make decisions, unless the easier question is not a good substitute. I have an uneasy awareness that I do this, especially since it often REALLY ANNOYS me when people do it to me.

The Law of Small Numbers (109). The counties with the lowest levels of kidney cancer are rural, sparsely populated, and located in traditionally Republican states. Why? Good clean living? The counties with the highest levels of kidney cancer are rural, sparsely populated, and located in traditionally Republican states. Why? Lack of access to health care? Wait, what? The System 1 mind immediately comes up with a story to explain the difference. But once the numbers are cranked, apparently, it's just an artifact of the fact that a few cases in a small county skew the rate. But if you base your decision on either story, the outcomes will be bad.

Anchors (119). We seize on the first value offered, no matter how obviously absurd it is. If you want to push someone in a direction, get them to accept your anchor.

Regression to the Mean (175). There will be random fluctuations in the quality of performance.
A teacher who praises a randomly good performance may shape behavior, but will more likely simply be disappointed when statistics asserts itself and a bad performance follows. A teacher who criticizes a bad performance may incentivize, but will more likely simply have a false sense of causation when statistics asserts itself and a good performance happens. Kahneman describes it as a significant fact of the human condition: the feedback to which life exposes us is perverse. Because we tend to be nice to other people when they please us and nasty when they do not, we are statistically punished for being nice and rewarded for being nasty. (176)

The Illusion of Understanding (204). The sense-making machinery of System 1 makes us see the world as more tidy, simple, predictable, and coherent than it really is. The illusion that one has understood the past feeds the further illusion that one can control the future. These illusions are comforting. They reduce the anxiety that we would experience if we allowed ourselves to fully acknowledge the uncertainties of existence. We all have a need for the reassuring message that actions have appropriate consequences, and that success will reward wisdom and courage. But it doesn't. (212) For example, we're totally wrong about whether you can beat the stock market. Formulas are often much more predictive than learned intuition. I'm going to have to wrestle with this one, but he alludes to a claim by Robyn Dawes that marital stability is well predicted by a formula: frequency of lovemaking minus frequency of quarrels. (226) Snicker.

Premortems Can Help (264). Before making a decision, assign someone to imagine it's a year into the future and the plan was a disaster. Have them write a history of the disaster.

We value losses more than gains. (349) Which is fine, except when it means we expose others to more risk because we did the math wrong.
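The regression-to-the-mean point about the teacher is easy to see in a quick simulation. Here is a minimal sketch, assuming a toy model in which each observed performance is just a fixed skill level plus independent random noise; the model and all numbers are illustrative assumptions of mine, not anything from the book:

```python
import random

random.seed(0)

# Toy model: every performance = constant skill + random noise.
skill = 100.0
performances = [skill + random.gauss(0, 10) for _ in range(100_000)]

# Pair each performance with the one that follows it.
pairs = list(zip(performances, performances[1:]))

# Performances that would earn praise (top decile)...
cutoff = sorted(performances)[int(0.9 * len(performances))]
after_good = [nxt for cur, nxt in pairs if cur >= cutoff]

# ...and performances that would earn criticism (bottom decile).
floor = sorted(performances)[int(0.1 * len(performances))]
after_bad = [nxt for cur, nxt in pairs if cur <= floor]

mean = lambda xs: sum(xs) / len(xs)
print(f"after a praised performance:    {mean(after_good):.1f}")  # near 100, well below the cutoff
print(f"after a criticized performance: {mean(after_bad):.1f}")   # near 100, well above the floor
```

With no feedback in the model at all, the performance following a top-decile outing is, on average, worse, and the one following a bottom-decile outing is, on average, better: both drift back toward the underlying skill level. That is exactly why the praising teacher looks punished and the criticizing teacher looks rewarded.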
The Focusing Illusion (402). "Nothing in life is as important as you think it is when you are thinking about it." We overvalue what's in our mind at the moment, which is subject to priming.

He closes by stressing that he does not mean to say that people are irrational. But, he says, "rational" in economic terms has a particular meaning that does not describe people. "For economists and decision theorists, [rationality] has an altogether different meaning. The only test of rationality is not whether a person's beliefs and preferences are reasonable, but whether they are internally consistent. A rational person can believe in ghosts, so long as all her other beliefs are consistent with the existence of ghosts. . . . Rationality is logical coherence, reasonable or not. Econs are rational by this definition, but there is overwhelming evidence that Humans cannot be. . . . The definition of rationality as coherence is impossibly restrictive; it demands adherence to rules of logic that a finite mind is not able to implement. Reasonable people cannot be rational by that definition, but they should not be branded as irrational for that reason. Irrational is a strong word, which connotes impulsivity, emotionality, and a stubborn resistance to reasoned argument. I often cringe when my work with Amos is credited with demonstrating that human choices are irrational, when in fact our research only showed that Humans are not well described by the rational-agent model." (411)

A good read.