THE CURSE OF KNOWLEDGE: UNDERSTANDING LESS-INFORMED PERSPECTIVES


The term “curse of knowledge” was coined in a 1989 paper by researchers Colin Camerer, George Loewenstein, and Martin Weber. The phenomenon is sometimes also conceptualized as epistemic egocentrism, though some theoretical distinctions may be drawn between these concepts.

The curse of knowledge is a cognitive bias that causes people to fail to properly understand the perspective of those who have less information than they do. For example, the curse of knowledge can mean that an expert in some field might struggle to teach beginners, because the expert intuitively assumes that things that are obvious to them are also obvious to the beginners, even though that’s not the case. Because the curse of knowledge can cause issues in various areas of life, such as when it comes to communicating with others, it’s important to understand it.

The Curse Of Knowledge: Common Occurrences & Influences

The curse of knowledge can make it harder for experts to teach beginners (an effect also known as the curse of expertise). For example, a math professor might find it difficult to teach first-year math students, because it’s hard for the professor to account for the fact that they know much more about the topic than the students.

This can make it harder for people to communicate. For example, it can be difficult for a scientist to discuss their work with laypeople, because the scientist might struggle to remember that those people aren’t familiar with the terminology in the scientist’s field.

This can make it harder for people to predict the behavior of others. For example, an experienced driver may be surprised by something dangerous that a new driver does, because the experienced driver struggles to understand that the new driver doesn’t understand the danger of what they’re doing. This aspect of the curse of knowledge is associated with people’s expectation that those who are less-informed than them will use information that the less-informed individuals don’t actually have.

This can make it harder for people to understand their own past behavior. For example, it can cause someone to think that they were foolish for making a certain decision in the past, even though the information that they had at the time actually strongly supported that decision. This aspect of the curse of knowledge can manifest in various ways and be referred to using various terms, such as the hindsight bias, the knew-it-all-along effect, and creeping determinism.

When it comes to the curse of knowledge, the perspective of the less-informed individual, whether it’s a different person or one’s past self, is often referred to as a naive perspective.

The Tapping Study

One well-known example of the curse of knowledge is the tapping study. In this study, participants were randomly assigned to be either a tapper or a listener. Each tapper finger-tapped three tunes (selected from a list of 25 well-known songs) on a desk, and was then asked to estimate the probability that the listener would be able to successfully identify the song that they tapped, based only on the finger tapping.

On average, tappers estimated that listeners would be able to correctly identify the tunes that they tapped in about 50% of cases, with estimates ranging anywhere from 10% to 95%. However, in reality, listeners were able to successfully identify the tune based on the finger tapping in only 2.5% of cases, which is far below even the most pessimistic estimate provided by a tapper, and which therefore represents evidence of the curse of knowledge.

Overall, the tapping study demonstrates how the curse of knowledge can affect people’s judgment. Specifically, it shows that people who know which tune is being tapped find it easy to “hear” it in the taps, and therefore struggle to accurately predict the perspective of listeners, who don’t have the same knowledge that they do.

The Psychology & Causes Of The Curse Of Knowledge

The curse of knowledge is attributed to two main cognitive mechanisms:

People’s curse of knowledge can be caused by either of these mechanisms, and both mechanisms may play a role at the same time. Other cognitive mechanisms may also lead to the curse of knowledge. For example, one such mechanism is anchoring and adjustment, which in this case means that when people try to reason about a less-informed perspective, their starting point is often their own perspective, which they struggle to adjust from properly.

All these mechanisms, in turn, can be attributed to various causes, such as the brain’s focus on acquiring and using information, rather than on inhibiting it, which is beneficial in most cases but problematic in others. In addition, various factors, such as age and cultural background, can influence people’s tendency to display the curse of knowledge, as well as the way and degree to which they display it.

Finally, other psychological concepts are associated with the curse of knowledge. The most notable of these is theory of mind, which is the ability to understand that other people have perceptions, thoughts, emotions, beliefs, desires, and intentions that are different from our own, and that these things can influence people’s behavior. Insufficient theory of mind can therefore lead to an increase in the curse of knowledge, and conversely, proper theory of mind can reduce the curse of knowledge.

Dealing With The Curse Of Knowledge

There are several things that you can do to reduce the curse of knowledge:

Other Debiasing Techniques

We can use various general debiasing techniques, such as slowing down our reasoning process and improving our decision-making environment. In addition, we can use debiasing techniques that are meant to reduce egocentric biases, such as visualizing the perspective of others and then adjusting our judgment based on this, or using self-distancing language (e.g., by asking “are you teaching in a way that the students can understand?” instead of “am I teaching in a way that the students can understand?”).

It is important to keep in mind that these techniques may not work perfectly in every situation. This means, for example, that some techniques might not work for some individuals in some circumstances, or that even if a certain technique does work, it will only reduce someone’s curse of knowledge to some degree, but won’t eliminate it entirely.

Related Biases

The curse of knowledge is considered to be a type of egocentric bias, since it causes people to rely too heavily on their own point of view when they try to see things from other people’s perspective. However, an important feature of the curse of knowledge, which differentiates it from some other egocentric biases, is that it is asymmetric, in the sense that it influences those who attempt to understand a less-informed perspective, but not those who attempt to understand a more-informed perspective. The curse of knowledge is also associated with various other cognitive biases, such as:

***Source Credits:-

http://www.effectiviology.com/

http://www.doi.org/

Content Curated By: Dr Shoury Kuttappa

DECISION MAKING: COGNITIVE BEHAVIOURS INVOLVED – (CHAPTER 01)

Decision making is a cognitive process leading to the selection of a course of action among alternatives. It is a method of reasoning which can be rational or irrational, and can be based on explicit assumptions or tacit assumptions. Common examples include shopping, deciding what to eat, when to sleep, and deciding whom or what to vote for in an election.

Decision making is said to be a psychological construct. This means that although we can never “see” a decision, we can infer from observable behaviour that a decision has been made. It is a construction that imputes commitment to action.

Structured rational decision making is an important part of all science-based professions. For example, medical decision making often involves making a diagnosis and selecting an appropriate treatment. Some research using naturalistic methods shows, however, that in situations with higher time pressure, higher stakes, or increased ambiguity, experts use intuitive rather than structured decision making. They follow a recognition-primed decision approach, fitting a set of indicators to their experience and immediately arriving at a satisfactory course of action without weighing alternatives.

Head, Heart and Gut – Powerful Decision Makers

We are living in unprecedented times of stress, confusion, and overwhelm. We all need resources to help navigate these challenging times and make the right decisions for the highest and best long-term good for ourselves, our families and our businesses. Those resources can be found within each of us if we pause to consider three reliable indicators: the head (intellect), the heart (feelings), and the gut (intuition).

Head:  Makes use of intellect and past knowledge. This involves using the conscious mind to discern questions that need to be answered. For example, is this person telling the truth? What has worked in the past? Have we done our due diligence and homework before making a decision?

Heart: The internal part of us, the voice inside, tells us when things feel right or wrong. For example, are we relaxed around the person we are asking the question about, or do we feel uptight and uncomfortable? Keep in mind that our bodies do talk to us.

Gut:  We need to trust our intuition. If it doesn’t feel right, chances are it’s not right for us. What may be right for one person can be wrong for another. Our gut instinct, our inner voice, is always there for us when we take the time to pay attention and listen.

Decision making styles

A person’s decision making process depends to a significant degree on their cognitive style. There are more than a few models to explain these styles. For example, the common personality test, the Myers-Briggs Type Indicator (MBTI), examines a set of four bi-polar dimensions: extraversion versus introversion, sensing versus intuition, thinking versus feeling, and judging versus perceiving.

The model holds that a person’s decision making style is based largely on how they score on these four dimensions. For example, someone who scores near the thinking, extroversion, sensing, and judgement ends of the dimensions would tend to have a logical, analytical, objective, critical and empirical decision making style.

Every leader prefers a different way to contemplate a decision. The four styles of decision making are directive, analytical, conceptual and behavioral. Each style is a different method of weighing alternatives and examining solutions.

These two spectrums work together to create the decision-making style framework. The first spectrum is structure vs. ambiguity. It measures people’s propensity to prefer either structure (i.e., defined processes and expectations) or ambiguity (i.e., open-ended and flexible approaches). The second spectrum is task/technical vs. people/social. It measures whether the motivation to make a specific choice is guided more by a desire to be right or to get results (task/technical), or by a desire to create harmony or social impact (people/social).
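As a rough illustration, the two spectrums can be read as a 2×2 grid that yields the four styles named above. The short sketch below is a hypothetical mapping for illustration only; the pairing of styles to quadrants follows one common reading of the framework, not a definition taken from the original source.

```python
# A minimal sketch (assumed mapping, for illustration only): reading the two
# spectrums as a 2x2 grid that yields the four decision-making styles.

def decision_style(prefers_structure: bool, task_oriented: bool) -> str:
    """Map a position on the two spectrums onto one of the four styles.

    prefers_structure: True = defined processes and expectations,
                       False = open-ended and flexible (ambiguity)
    task_oriented:     True = task/technical motivation,
                       False = people/social motivation
    """
    if prefers_structure and task_oriented:
        return "directive"    # structure + task/technical
    if task_oriented:
        return "analytical"   # ambiguity + task/technical
    if prefers_structure:
        return "behavioral"   # structure + people/social
    return "conceptual"       # ambiguity + people/social


# Example: a leader who likes defined processes and is driven by results
print(decision_style(prefers_structure=True, task_oriented=True))    # directive
print(decision_style(prefers_structure=False, task_oriented=False))  # conceptual
```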

Areas Of The Brain (Neuroscience) In Decision Making

Decision Making In Uncertainty

Decision making often occurs in the face of uncertainty about whether one’s choices will lead to benefit or harm. Emotion appears to aid the decision-making process heavily in the face of uncertainty. The somatic-marker hypothesis is a neurobiological theory of how decisions are made in the face of uncertain outcomes. In brief, this theory holds that such decisions are aided by emotions, in the form of bodily states, that are elicited during the deliberation of future consequences and that mark different options for behavior as being advantageous or disadvantageous. This process involves an interplay between neural systems that elicit emotional/bodily states and neural systems that map these emotional/bodily states.

Cognitive and personal biases in decision making

It is generally agreed that biases can creep into our decision making processes, calling into question the correctness of a decision. Some of the more commonly debated cognitive biases may be:

***To be continued in Chapter 02 (Various Techniques in use in individual and group Decision Making) Link to Chapter -02:

Content Curated By: Dr Shoury Kuttappa

ILLUSORY CORRELATION: MISGUIDED THINKING

Human beings have been blaming strange behaviour on the full moon for centuries. In the Middle Ages, for example, people claimed that a full moon could turn humans into werewolves. In the 1700s, it was common to believe that a full moon could cause epilepsy or feverish temperatures. We even changed our language to match our beliefs. The word lunatic comes from the Latin root word ‘luna’, which means moon.

Today, we have (mostly) come to our senses. While we no longer blame sickness and disease on the phases of the moon, we still hear people use it as a casual explanation for outlandish behaviour. For example, a common story in medical circles is that during a chaotic evening at the hospital one of the nurses will often say, “Must be a full moon tonight.”

There is little evidence that a full moon actually impacts our behaviours. A complete analysis of more than 30 peer-reviewed studies found no correlation between a full moon and hospital admissions, lottery ticket pay-outs, suicides, traffic accidents, crime rates, and many other common events. But here’s the interesting thing: even though the research says otherwise, a 2005 study revealed that 7 out of 10 nurses still believed that “a full moon led to more chaos and patients that night.”

How is that possible? The nurses who swear that a full moon causes strange behavior aren’t stupid. They are simply falling victim to a common mental error that plagues all of us. Psychologists refer to this little brain mistake as an “illusory correlation.”

How We Fool Ourselves Without Realizing It

An illusory correlation happens when we mistakenly over-emphasize one outcome and ignore the others. For example, let’s say we visit Mumbai City and someone cuts us off as we’re boarding the subway train. Then, we go to a restaurant and the waiter is rude to us. Finally, we ask someone on the street for directions and they blow us off. When we think back on our trip to Mumbai, it is easy to remember these experiences and conclude that “people from Mumbai are rude” or “people in big cities are rude.”

However, we are forgetting about all of the meals we ate when the waiter acted perfectly normal, or the hundreds of people we passed on the subway platform who didn’t cut us off. These were literally non-events because nothing notable happened. As a result, it is easier to remember the times someone acted rudely toward us than the times when we dined happily or took the subway in peace.

Here’s where the brain science comes into play: hundreds of psychology studies have shown that we tend to overestimate the importance of events we can easily recall and underestimate the importance of events we have trouble recalling. The easier it is to remember, the more likely we are to create a strong relationship between two things that are weakly related or not related at all.

The Genesis

Our ability to think about causes and associations is fundamentally important, and always has been for our evolutionary ancestors – we needed to know if a particular berry makes us sick, or if a particular cloud pattern predicts bad weather. So it is not surprising that we automatically make judgements of this kind. We don’t have to mentally count events, tally correlations and systematically discount alternative explanations. We have strong intuitions about what things go together, intuitions that just spring to mind, often after very little experience. This is good for making decisions in a world where you often don’t have enough time to think before you act, but with the side-effect that these intuitions contain some predictable errors. One such error is illusory correlation. Two things that are individually salient seem to be associated when they are not.

One explanation is that things that are relatively uncommon are more vivid (because of their rarity). This, and an effect of existing stereotypes, creates a mistaken impression that the two things are associated when they are not. This is a side effect of an intuitive mental machinery for reasoning about the world. Most of the time it is quick and delivers reliable answers – but it seems to be susceptible to error when dealing with rare but vivid events, particularly where preconceived biases operate. Associating bad traffic behaviour with ethnic minority drivers, or cyclists, is another case where people report correlations that just are not there. Both the minority (either an ethnic minority, or the cyclists) and bad behaviour stand out. Our quick-but-dirty inferential machinery leaps to the conclusion that the events are commonly associated, when they are not.

Self Perspective

Sometimes we feel like the whole world is against us. The other lanes of traffic always move faster than ours. Traffic signals are always red when we are in a hurry. The same goes for the supermarket queues. Why does it always rain on those occasions we do not carry an umbrella, and why do flies always want to eat our sandwiches at a picnic and not other people’s? It feels like there is only one reasonable explanation: the universe itself has a vendetta against us. Call it the universe-victim theory.

So here we have a mechanism which might explain our woes. The other lanes or queues moving faster is one salient event, and our intuition wrongly associates it with the most salient thing in our environment – us (Self). What, after all, is more important to us than ourselves? Which brings us back to the universe-victim theory. When our lane is moving along we are focusing on where we are going, ignoring the traffic we overtake. When our lane is stuck we think about us and our hard luck, looking at the other lane. No wonder the association between self and being overtaken sticks in memory more.

This distorting influence of memory on our judgement lies behind a good chunk of our feelings of victimization. In some situations there is a real bias. We really do spend more time being overtaken in traffic than we do overtaking. And the smoke really does tend to follow us around the campfire, because wherever we sit creates a warm up-draught that the smoke fills. But on top of all of these is a mind that over-exaggerates our own importance, giving each of us the false impression that we are more important in how events work out than we really are.


How to Spot an Illusory Correlation

There is a simple strategy we can use to spot our hidden assumptions and prevent ourselves from making an illusory correlation. It’s called a contingency table, and it forces us to recognize the non-events that are easy to ignore in daily life.

Let’s break down the possibilities for having a full moon and a crazy night of hospital admissions.

This contingency table helps reveal what is happening inside the minds of nurses during a full moon. The nurses quickly remember the one time when there was a full moon and the hospital was overflowing, but simply forget the many times there was a full moon and the patient load was normal. Because they can easily retrieve a memory of a full moon and a chaotic night, they incorrectly assume that the two events are related. Ideally, we would plug a number into each cell so that we can compare the actual frequency of each event, which will often be much different from the frequency we easily remember.
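For illustration, here is a minimal sketch of the contingency-table test in code. The counts are entirely made up (they are not data from the nursing study); the point is that the comparison worth making is the rate of chaotic nights with a full moon versus without one, which is exactly the comparison our memory skips.

```python
# A minimal sketch of the contingency-table test, using hypothetical counts.
# Keys: (moon condition, whether the night was chaotic or normal).
counts = {
    ("full moon", "chaotic night"): 3,     # the nights we remember vividly
    ("full moon", "normal night"): 9,      # easy-to-forget non-events
    ("no full moon", "chaotic night"): 22,
    ("no full moon", "normal night"): 66,
}

def rate_of_chaos(moon: str) -> float:
    """Share of nights that were chaotic, given the moon condition."""
    chaotic = counts[(moon, "chaotic night")]
    normal = counts[(moon, "normal night")]
    return chaotic / (chaotic + normal)

print(f"P(chaos | full moon)    = {rate_of_chaos('full moon'):.2f}")     # 0.25
print(f"P(chaos | no full moon) = {rate_of_chaos('no full moon'):.2f}")  # 0.25
# With these made-up numbers both rates are identical, i.e. no association,
# even though the "full moon + chaotic night" cell is the one memory latches onto.
```

Only when all four cells are filled in can we tell whether the memorable cell is genuinely over-represented or just more vivid.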

How to Fix Your Misguided Thinking

We make illusory correlations in many areas of life. We hear about Dhirubhai Ambani or Bill Gates dropping out of college to start a billion-dollar business and we over-value that story in our heads. Meanwhile, we never hear about all of the college dropouts who fail to start a successful company. We only hear about the hits and never hear about the misses, even though the misses far outnumber the hits.

We see someone of a particular ethnic or racial background getting arrested, and so we assume all people with that background are more likely to be involved in crime. We never hear about the 99 percent of people who don’t get arrested, because that is a non-event.
We hear about a shark attack on the news and refuse to go into the ocean during our next beach vacation. The odds of a shark attack have not increased since we went in the ocean last time, but we never hear about the millions of people swimming safely each day. The news is never going to run a story titled, “Millions of Tourists Float in the Ocean Each Day.” We over-emphasize the story we hear on the news and make an illusory correlation.

Most of us are unaware of how our selective memory of events influences the beliefs we carry around with us on a daily basis.
We are incredibly poor at remembering things that do not happen. If we don’t see it, we assume it has no impact or rarely happens. If we understand how an illusory correlation error occurs and use strategies like the Contingency Table Test mentioned above, we can reveal the hidden assumptions we didn’t even know we had and correct the misguided thinking that plagues our everyday lives.

Even Shakespeare blamed our occasional craziness on the moon. In his play Othello he wrote, “It is the very error of the moon. She comes more near the earth than she was wont. And makes men mad.”

For lovers of psychology, this phenomenon is often referred to as the Availability Heuristic.

The more easily we can retrieve a certain memory or thought – that is, the more available it is in our brains – the more likely we are to overestimate its frequency and importance. The Illusory Correlation is sort of a combination of the Availability Heuristic and Confirmation Bias.

We can easily recall the one instance when something happened (Availability Heuristic), which makes us think it happens often. Then, when it happens again – like the next full moon, for example – Confirmation Bias kicks in and confirms our previous belief.

Content Curated By: Dr Shoury Kuttappa.