Alex Edmans is Professor of Finance at London Business School. Alex received his PhD from MIT as a Fulbright Scholar and was previously a tenured professor at the Wharton School and an investment banker at Morgan Stanley. He is a non-executive director of the Investor Forum and serves on the World Economic Forum's Global Future Council on Responsible Investment and Royal London Asset Management's Responsible Investment Advisory Board.
Below, Alex shares five key insights from his new book, May Contain Lies: How Stories, Statistics, and Research Exploit Our Biases, and What We Can Do About It. Listen to the audio version, read by Alex himself, on the Next Big Idea app.
1. Take responsibility.
We tend to think that fighting misinformation is someone else's job: governments should regulate and prosecute misinformation, publishers should scrutinize more thoroughly, and scientific bodies should expel offending members. However, this is both unrealistic and ineffective. Unrealistic, because misinformation can be created far faster than authorities can correct it. Ineffective, because some of the most pervasive misinformation is sophisticated and cannot be prosecuted. Even if the facts someone presents are 100% accurate, they may draw misleading inferences from them, such as generalizing from a single example.
Instead, the solution is to take responsibility ourselves and be vigilant. The most important thing to watch out for is confirmation bias: the temptation to uncritically accept a claim because you want it to be true, and to dismiss one out of hand because you don't want to believe it. How often do we repost or “like” an article without actually reading it, just because we liked the headline? Or perhaps we do read it, but believe every claim without examining the evidence behind it. Meanwhile, when we read something we don't like the sound of, we read it with the intention of picking it apart. We need to apply the same skepticism to the things we like as to the things we don't.
2. Check the evidence.
What does it mean to apply skepticism? When we see a claim we don't like, we ask for the evidence behind it. We should do exactly the same for any claim we like. Note that we say “check the evidence,” not “check the facts.” We tend to think of fighting misinformation as simple fact-checking. If people spread the false claim that Barack Obama is not a natural-born U.S. citizen, you can check his birth certificate. Hans Rosling's great book Factfulness showed that looking at simple facts, such as how many countries have an average daily income of less than $2, can make us more optimistic about the world. However, many claims are not simple statements that can be proven or disproved by fact-checking.
Consider the famous phrase, “Culture eats strategy for breakfast.” It is accepted as gospel because it is attributed to Peter Drucker, a highly respected management guru. But even if Drucker really said it, that alone is not enough: “Peter Drucker said this” is not evidence. Did Drucker actually conduct a study comparing one set of companies with a strong culture but weak strategy against another set with a strong strategy but weak culture, and show that the first set beat the second? The answer is no. So no matter who says something, and no matter how much you want it to be true, you should question it.
“Many claims are not simple statements that can be proven or disproved by fact-checking.”
Or consider the famous “2-minute rule” introduced in David Allen's time management book, Getting Things Done. It says that any task that takes less than two minutes should be done immediately. However, there is no evidence behind this rule; Allen simply made it up. As one business magazine put it, “Fortunately for Mr. Allen, he didn't need empirical evidence. People felt better after taking his seminars,” and so they started practicing the rule. The rule makes you feel good, because completing a task releases dopamine, so confirmation bias means you want it to be true. But the focus on quick wins and low-hanging fruit comes at the expense of thoroughness and reduces productivity.
Regularly ask yourself, “What is the evidence behind that claim?”
3. Know the whole truth.
In court, witnesses swear to tell not just the truth, but the whole truth. Outside the courtroom, however, people choose what they reveal, so many of the beliefs we hold are based on half-truths.
Consider Simon Sinek's argument that starting with why leads to success. He points out that Apple, Wikipedia, and the Wright brothers all started with why and became successful. Their success is a true fact, but again, facts alone are not enough. Apple, Wikipedia, and the Wright brothers are all hand-picked examples; Sinek is not telling the whole truth. There may be hundreds of other companies that started with why and failed, and hundreds that succeeded without starting with why at all. But these examples never appear in Sinek's books or lectures, because they don't fit his story.
“There could be hundreds of other companies that started with why and failed.”
The best evidence is a medical trial. You have a set of patients who are given a drug, and you see how many of them get better and how many don't. Crucially, you also have a control group given a placebo, and you see how many of them get better and how many don't. The trial will include people who were given the drug but whose symptoms did not improve, and people who were given the placebo but whose symptoms did improve; both are counterexamples. If, even with these counterexamples, the drug group still fares better than the control group, the evidence that the drug works is strong. If there are no counterexamples at all, that is a clear sign the data are cherry-picked, and the conclusion is meaningless.
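To make this concrete, here is a minimal sketch in Python; the group sizes and recovery probabilities are made-up assumptions for illustration, not figures from the book. It shows that counterexamples appear on both sides, yet comparing recovery rates against the control group still reveals a real effect.

```python
import random

random.seed(42)

def simulate_group(n, recovery_prob):
    """Simulate n patients; each recovers independently with the given probability."""
    return [random.random() < recovery_prob for _ in range(n)]

# Hypothetical recovery probabilities, chosen purely for illustration.
drug_group = simulate_group(1000, 0.60)     # patients given the drug
placebo_group = simulate_group(1000, 0.40)  # patients given a placebo

drug_recoveries = sum(drug_group)
placebo_recoveries = sum(placebo_group)

# Counterexamples exist on both sides...
print(f"Took the drug, did not improve: {1000 - drug_recoveries}")
print(f"Took the placebo, improved:     {placebo_recoveries}")

# ...yet the evidence is still strong, because we compare recovery RATES
# against the control group rather than pointing to individual anecdotes.
print(f"Recovery rate, drug:    {drug_recoveries / 1000:.0%}")
print(f"Recovery rate, placebo: {placebo_recoveries / 1000:.0%}")
```

Running this prints hundreds of counterexamples in each group, alongside a roughly 60% versus 40% recovery rate; the rate gap, not the absence of counterexamples, is what carries the evidence.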
Ask, “Did the person making the claim consider counterexamples?”
4. Investigate other suspects.
Sticking with the courtroom analogy, evidence is only evidence if it points to one culprit and not another. It is meaningless if it is consistent with multiple suspects having committed the crime. The same goes outside the courtroom.
Consider research showing that companies that care about wider society also make more money. That correlation is in the data; it's a true fact. But what should we make of it? We infer that having a social conscience increases a company's profits. Our confirmation bias wants this to be true: we want to live in a world with good karma, where good people win.
“When companies do well, they have the luxury of caring for broader society.”
But there are other suspects. Perhaps the causation runs in the other direction: when a company does well, it has the luxury of caring for wider society. Or perhaps a third factor causes both: great CEOs make their companies successful, and great CEOs also care about wider society. Thinking calmly, most people know that correlation is not causation. But confirmation bias stops us from thinking clearly: we interpret the evidence the way we like, jump to the conclusions we like, and ignore the alternative suspects. This is a problem not only in criminal investigations, but with every other type of evidence.
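The “third factor” suspect can also be made concrete with a small simulation; the numbers below are invented for illustration. Here a great CEO independently raises both a company's social score and its profits, so the two come out strongly correlated even though neither causes the other.

```python
import random

random.seed(0)

n = 10_000
# The third factor: some companies happen to have a great CEO.
great_ceo = [random.random() < 0.5 for _ in range(n)]

# Social score and profit each depend on the CEO plus independent noise.
# Note that profit does NOT depend on the social score at all.
social = [0.8 * ceo + random.gauss(0, 0.3) for ceo in great_ceo]
profit = [0.8 * ceo + random.gauss(0, 0.3) for ceo in great_ceo]

def correlation(xs, ys):
    """Pearson correlation, written out to keep the sketch dependency-free."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Strong positive correlation, despite zero causal link in either direction.
print(f"corr(social, profit) = {correlation(social, profit):.2f}")
```

The printed correlation is large and positive even though, by construction, social responsibility has no effect on profit; the data alone cannot distinguish this world from one where conscience causes profits.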
Ask yourself, “What are the alternative explanations for the same data?”
5. Encourage opposing views.
Data and evidence are not our only sources of information, so making better decisions involves more than interpreting data and evidence correctly. One great source of information is colleagues' opinions about why our strategy might backfire or why we shouldn't hire a certain person.
We've discussed how to respond when we receive an opposing view: suppress confirmation bias and take it as seriously as a view we agree with. But the bigger problem is that sometimes we never hear opposing views in the first place. Many organizations have a serious problem with groupthink: people say only what they think the boss wants to hear, afraid to rock the boat. A good leader is not one who forces others to follow, but one who gets others to speak up and warn the leader when they are about to go off course.
When Alfred Sloan ran General Motors, he would end a meeting by asking, “I take it we're all in complete agreement on the decision here?” Everyone nodded. Sloan then continued, “Then I propose we postpone further discussion of this matter until our next meeting, to give ourselves time to develop disagreement and perhaps gain some understanding of what the decision is all about.” He believed that no idea, including his own, was 100% perfect. So if no one expressed a concern, it was not because they had none, but because they hadn't yet been given time to think it through. If concerns were raised at the next meeting but the decision still went ahead, a good leader would then turn to the dissenters and ask them to watch out for exactly those concerns when implementing the strategy. By taking proactive steps to encourage dissent, leaders can make the most of their organization's collective wisdom.
To hear the audio version read by author Alex Edmans, download the Next Big Idea app now.