
Think Again

The Power of Knowing What You Don’t Know

by Adam Grant, 320 pages

Finished on 27th of March

A million-selling book about questioning your own beliefs and changing your mind. Does it live up to the hype? I’m very interested in what makes us think the way we do and where we might have it wrong and could improve. Here’s what the book taught me.

🎨 Impressions

The book starts out really promising. I usually enjoy pop science about how our animal brains trick us into misreading our surroundings and how to overcome those built-in limitations. We tell ourselves lots of things that sound reasonable but are actually wrong, and we're especially quick to jump to conclusions just to avoid having to think any further. The first vaguely plausible argument our brain comes up with often wins.

The point is, this behavior is far from optimal and we can do better. We just need to think again more often. Doubt our own conclusions and try to find out what pieces we’re missing in order to make better decisions. Get rid of all the blind spots by realizing they exist.

The book drives this single point home well. I highlighted a great number of brilliant sentences. There are lots of examples and citations, plus some anecdotes and a bunch of personal stories by the author, Adam Grant. So many, in fact, that I found it tedious to read. I don't know anything about the author beyond what he presents about himself in the book, and I'm sorry to say I personally did not find him likable. I might not be doing him justice; it could be the editing that makes him come across that way to me. In the end it doesn't matter, though: reading a book written by someone who seems like a person you wouldn't want to spend time with isn't the best of experiences.

I was quickly annoyed by his anecdotes and especially by the little "funny" comics he included, most of them very well-known ones that add nothing to the book's message. It's just lots and lots of filler content, and a lot less entertaining than similar books in this genre. This isn't great. It made the book feel far too long.

The book also lacks a good structure, even though the chapter sections suggest otherwise. It just flows from chapter to chapter in a sort of stream-of-consciousness approach, it seemed to me. There's no foundation laid in the early chapters and no big new ideas built on top of it in the later ones. It's just one bit after the other.

All these factors made it tough for me to get through. It took me longer than usual to finish. There was never a moment where I just couldn't wait to find out what the next chapter held. No build-up, no revelations.

A lot of the content wasn't news to me either, but I can't hold that against the author; I've simply read many books of this type already.

Towards the end he makes some strong points about crafting a lifestyle and what chasing happiness means, but the book ends rather abruptly, albeit on a high point.

But then Grant decided to use the epilogue to discuss epilogues. How self-centered, and clearly not in service of the reader. He unwittingly made it clear that his focus here was on selling himself, not enriching the lives of those of us who bought the book.

The so-called Actions for Impact section at the end of the book summarizes all its important ideas in just a few pages. Reading just those gets you 95% of the way there.

Taking all this into account, it's not a book I can recommend, though only barely so. The main topic is fascinating, and knowing more about how our brains work is definitely helpful. The book is ugly packaging around a few little gems of insight.

📔 Highlights

Prologue

The smarter you are, the more complex the problems you can solve—and the faster you can solve them. Intelligence is traditionally viewed as the ability to think and learn. Yet in a turbulent world, there’s another set of cognitive skills that might matter more: the ability to rethink and unlearn.

Part of the problem is cognitive laziness. Some psychologists point out that we’re mental misers: we often prefer the ease of hanging on to old views over the difficulty of grappling with new ones.

Tossed into the scalding pot, the frog will get burned badly and may or may not escape. The frog is actually better off in the slow-boiling pot: it will leap out as soon as the water starts to get uncomfortably warm. It’s not the frogs who fail to reevaluate. It’s us.

Under acute stress, people typically revert to their automatic, well-learned responses. That’s evolutionarily adaptive—as long as you find yourself in the same kind of environment in which those reactions were necessary.

Part I. Individual Rethinking; Updating Our Own Views

The accelerating pace of change means that we need to question our beliefs more readily than ever before. This is not an easy task. As we sit with our beliefs, they tend to become more extreme and more entrenched.

Research reveals that the higher you score on an IQ test, the more likely you are to fall for stereotypes, because you’re faster at recognizing patterns. And recent experiments suggest that the smarter you are, the more you might struggle to update your beliefs.

If they were liberals, math geniuses did worse than their peers at evaluating evidence that gun bans failed. If they were conservatives, they did worse at assessing evidence that gun bans worked.

My favorite bias is the “I’m not biased” bias, in which people believe they’re more objective than others. It turns out that smart people are more likely to fall into this trap. The brighter you are, the harder it can be to see your own limitations.

The curse of knowledge is that it closes our minds to what we don’t know. Good judgment depends on having the skill—and the will—to open our minds.

We all have blind spots in our knowledge and opinions. The bad news is that they can leave us blind to our blindness, which gives us false confidence in our judgment and prevents us from rethinking. The good news is that with the right kind of confidence, we can learn to see ourselves more clearly and update our views.

The less intelligent we are in a particular domain, the more we seem to overestimate our actual intelligence in that domain.

The antidote to getting stuck on Mount Stupid is taking a regular dose of it. “Arrogance is ignorance plus conviction,” blogger Tim Urban explains. “While humility is a permeable filter that absorbs life experience and converts it into knowledge and wisdom, arrogance is a rubber shield that life experience simply bounces off of.”

You can be confident in your ability to achieve a goal in the future while maintaining the humility to question whether you have the right tools in the present. That’s the sweet spot of confidence.

The most effective leaders score high in both confidence and humility. Although they have faith in their strengths, they’re also keenly aware of their weaknesses. They know they need to recognize and transcend their limits if they want to push the limits of greatness.

Those who self-identified as impostors didn’t do any worse in their diagnoses, and they did significantly better when it came to bedside manner—they were rated as more empathetic, respectful, and professional, as well as more effective in asking questions and sharing information.

If we never worry about letting other people down, we’re more likely to actually do so. When we feel like impostors, we think we have something to prove.

A mark of lifelong learners is recognizing that they can learn something from everyone they meet. Arrogance leaves us blind to our weaknesses.

In a classic paper, sociologist Murray Davis argued that when ideas survive, it’s not because they’re true—it’s because they’re interesting. What makes an idea interesting is that it challenges our weakly held opinions.

Research suggests that the more frequently we make fun of ourselves, the happier we tend to be. Instead of beating ourselves up about our mistakes, we can turn some of our past misconceptions into sources of present amusement.

Agreeable people make for a great support network: they’re excited to encourage us and cheerlead for us. Rethinking depends on a different kind of network: a challenge network, a group of people we trust to point out our blind spots and help us overcome our weaknesses. Their role is to activate rethinking cycles by pushing us to be humble about our expertise, doubt our knowledge, and be curious about new perspectives.

I’ve watched too many leaders shield themselves from task conflict. As they gain power, they tune out boat-rockers and listen to bootlickers. They become politicians, surrounding themselves with agreeable yesmen and becoming more susceptible to seduction by sycophants. Research reveals that when their firms perform poorly, CEOs who indulge flattery and conformity become overconfident.

I’m looking for disagreeable people who are givers, not takers. Disagreeable givers often make the best critics: their intent is to elevate the work, not feed their own egos. They don’t criticize because they’re insecure; they challenge because they care. They dish out tough love.

Part II. Interpersonal Rethinking; Opening Other People’s Minds

Starting a disagreement by asking, “Can we debate?” sends a message that you want to think like a scientist, not a preacher or a prosecutor—and encourages the other person to think that way, too.

Recent experiments show that having even one negotiator who brings a scientist’s level of humility and curiosity improves outcomes for both parties, because she will search for more information and discover ways to make both sides better off.

Convincing other people to think again isn’t just about making a good argument—it’s about establishing that we have the right motives in doing so. When we concede that someone else has made a good point, we signal that we’re not preachers, prosecutors, or politicians trying to advance an agenda. We’re scientists trying to get to the truth.

Being reasonable literally means that we can be reasoned with, that we’re open to evolving our views in light of logic and data.

Psychologists have long found that the person most likely to persuade you to change your mind is you. You get to pick the reasons you find most compelling, and you come away with a real sense of ownership over them.

In a heated argument, you can always stop and ask, “What evidence would change your mind?” If the answer is “nothing,” then there’s no point in continuing the debate. You can lead a horse to water, but you can’t make it think.

I think it’s a ritual. A fun but arbitrary ritual—a ceremony that we perform out of habit. We imprinted on it when we were young and impressionable, or were new to a city and looking for esprit de corps.

To activate counterfactual thinking, you might ask people questions like: How would your stereotypes be different if you’d been born Black, Hispanic, Asian, or Native American? What opinions would you hold if you’d been raised on a farm versus in a city, or in a culture on the other side of the world?

If you get people to pause and reflect, they might decide that the very notion of applying group stereotypes to individuals is absurd. Research suggests that there are more similarities between groups than we recognize. And there’s typically more variety within groups than between them.

For over half a century, social scientists have tested the effects of intergroup contact. In a meta-analysis of over five hundred studies with over 250,000 participants, interacting with members of another group reduced prejudice in 94 percent of the cases.

“We are living in space-age times, yet there are still so many of us thinking with stone-age minds,” he reflects. “Our ideology needs to catch up to our technology.”

Before Marie-Hélène left the hospital, she had Tobie vaccinated. A key turning point, she recalls, was when Arnaud “told me that whether I chose to vaccinate or not, he respected my decision as someone who wanted the best for my kids. Just that sentence—to me, it was worth all the gold in the world.”

Overall, motivational interviewing has a statistically and clinically meaningful effect on behavior change in roughly three out of four studies, and psychologists and physicians using it have a success rate of four in five.

Listening well is more than a matter of talking less. It’s a set of skills in asking and responding. It starts with showing more interest in other people’s interests rather than trying to judge their status or prove our own. We can all get better at asking “truly curious questions that don’t have the hidden agenda of fixing, saving, advising, convincing or correcting,” journalist Kate Murphy writes, and helping to “facilitate the clear expression of another person’s thoughts.”

A skilled motivational interviewer resists the righting reflex—although people want a doctor to fix their broken bones, when it comes to the problems in their heads, they often want sympathy rather than solutions.

Among managers rated as the worst listeners by their employees, 94 percent of them evaluated themselves as good or very good listeners. Dunning and Kruger might have something to say about that.

The power of listening doesn’t lie just in giving people the space to reflect on their views. It’s a display of respect and an expression of care.

Part III. Collective Rethinking; Creating Communities of Lifelong Learners

Hearing an opposing opinion doesn’t necessarily motivate you to rethink your own stance; it makes it easier for you to stick to your guns (or your gun bans). Presenting two extremes isn’t the solution; it’s part of the polarization problem. Psychologists have a name for this: binary bias. It’s a basic human tendency to seek clarity and closure by simplifying a complex continuum into two categories.

When the middle of the spectrum is invisible, the majority’s will to act vanishes with it. If other people aren’t going to do anything about it, why should I bother?

And multiple experiments have shown that when experts express doubt, they become more persuasive. When someone knowledgeable admits uncertainty, it surprises people, and they end up paying more attention to the substance of the argument.

Rethinking needs to become a regular habit. Unfortunately, traditional methods of education don’t always allow students to form that habit.

It turns out that although perfectionists are more likely than their peers to ace school, they don’t perform any better than their colleagues at work. This tracks with evidence that, across a wide range of industries, grades are not a strong predictor of job performance.

Ultimately, education is more than the information we accumulate in our heads. It’s the habits we develop as we keep revising our drafts and the skills we build to keep learning.

Rethinking is more likely to happen in a learning culture, where growth is the core value and rethinking cycles are routine. In learning cultures, the norm is for people to know what they don’t know, doubt their existing practices, and stay curious about new routines to try out.

Edmondson is quick to point out that psychological safety is not a matter of relaxing standards, making people comfortable, being nice and agreeable, or giving unconditional praise. It’s fostering a climate of respect, trust, and openness in which people can raise concerns and suggestions without fear of reprisal. It’s the foundation of a learning culture.

How do you know? It’s a question we need to ask more often, both of ourselves and of others. The power lies in its frankness. It’s nonjudgmental—a straightforward expression of doubt and curiosity that doesn’t put people on the defensive.

It takes confident humility to admit that we’re a work in progress. It shows that we care more about improving ourselves than proving ourselves.

Part IV. Conclusion

Escalation of commitment happens because we’re rationalizing creatures, constantly searching for self-justifications for our prior beliefs as a way to soothe our egos, shield our images, and validate our past decisions.

There’s a fine line between heroic persistence and foolish stubbornness. Sometimes the best kind of grit is gritting our teeth and turning around.

In some ways, identity foreclosure is the opposite of an identity crisis: instead of accepting uncertainty about who we want to become, we develop compensatory conviction and plunge head over heels into a career path.

As we identify past images of our lives that are no longer relevant to our future, we can start to rethink our plans. That can set us up for happiness—as long as we’re not too fixated on finding it.

when it comes to careers, instead of searching for the job where we’ll be happiest, we might be better off pursuing the job where we expect to learn and contribute the most.

Interest doesn’t always lead to effort and skill; sometimes it follows them.

My favorite test of meaningful work is to ask: if this job didn’t exist, how much worse off would people be?

“Those only are happy,” philosopher John Stuart Mill wrote, “who have their minds fixed on some object other than their own happiness; on the happiness of others, on the improvement of mankind, even on some art or pursuit, followed not as a means, but as itself an ideal end. Aiming thus at something else, they find happiness by the way.”

At work and in life, the best we can do is plan for what we want to learn and contribute over the next year or two, and stay open to what might come next.

Our identities are open systems, and so are our lives. We don’t have to stay tethered to old images of where we want to go or who we want to be.

Epilogue

But in times of crisis as well as times of prosperity, what we need more is a leader who accepts uncertainty, acknowledges mistakes, learns from others, and rethinks plans.

We can all improve at thinking again. Whatever conclusion we reach, I think the world would be a better place if everyone put on scientist goggles a little more often. I’m curious: do you agree? If not, what evidence would change your mind?

Actions for Impact

The Dunning-Kruger effect is a good reminder that the better you think you are, the greater the risk that you’re overestimating yourself—and the greater the odds that you’ll stop improving.

When you find out you’ve made a mistake, take it as a sign that you’ve just discovered something new.

If you pile on too many different reasons to support your case, it can make your audiences defensive—and cause them to reject your entire argument based on its least compelling points.
