Limited Time
We’re constrained by time and information, and yet we can’t let that paralyze us. Without the ability to act fast in the face of uncertainty, we surely would have perished as a species long ago. With every piece of new information, we need to do our best to assess our ability to affect the situation, apply it to decisions, simulate the future to predict what might happen next, and otherwise act on our new insight.
In order to act, we need to be confident in our ability to make an impact and to feel like what we do is important.
In reality, most of this confidence can be classified as overconfidence, but without it we might not act at all.
Overconfidence effect: The overconfidence effect is a well-established bias in which a person's subjective confidence in his or her judgements is reliably greater than the objective accuracy of those judgements, especially when confidence is relatively high. Overconfidence is one example of a miscalibration of subjective probabilities. Throughout the research literature, overconfidence has been defined in three distinct ways: (1) overestimation of one's actual performance; (2) overplacement of one's performance relative to others; and (3) overprecision in expressing unwarranted certainty in the accuracy of one's beliefs.
Egocentric bias: Egocentric bias is the tendency to rely too heavily on one's own perspective and/or to have a higher opinion of oneself than reality warrants. It appears to be the result of the psychological need to satisfy one's ego and to be advantageous for memory consolidation. Research has shown that experiences, ideas, and beliefs are more easily recalled when they match one's own, causing an egocentric outlook. Most psychologists treat egocentric bias as a general umbrella term under which other related phenomena fall.
Optimism bias: Optimism bias (also known as unrealistic or comparative optimism) is a cognitive bias that causes a person to believe that they are at a lesser risk of experiencing a negative event compared to others. Optimism bias is quite common and transcends gender, race, nationality and age. Optimistic biases are even reported in non-human animals such as rats and birds.
Social desirability bias: In social science research, social desirability bias is a type of response bias: the tendency of survey respondents to answer questions in a manner that will be viewed favorably by others. It can take the form of over-reporting "good" behavior or under-reporting "bad" or undesirable behavior. The tendency poses a serious problem for research based on self-reports, especially questionnaires. This bias interferes with the interpretation of average tendencies as well as individual differences.
Third-person effect: The third-person effect hypothesis predicts that people tend to perceive that mass media messages have a greater effect on others than on themselves, based on personal biases. It manifests as an individual’s overestimation of the effect of a mass communicated message on the generalized other, or an underestimation of its effect on themselves.
Forer effect, Barnum effect: The Barnum effect, also called the Forer effect, is a common psychological phenomenon whereby individuals give high accuracy ratings to descriptions of their personality that supposedly are tailored specifically to them but are in fact vague and general enough to apply to a wide range of people. This effect can provide a partial explanation for the widespread acceptance of some paranormal beliefs and practices, such as astrology, fortune telling, aura reading, and some types of personality tests.
Illusion of control: The illusion of control is the tendency for people to overestimate their ability to control events; for example, it occurs when someone feels a sense of control over outcomes that they demonstrably do not influence. The effect was named by psychologist Ellen Langer and has been replicated in many different contexts. It is thought to influence gambling behavior and belief in the paranormal. Along with illusory superiority and optimism bias, the illusion of control is one of the positive illusions.
False consensus effect: In psychology, the false-consensus effect or false-consensus bias is an attributional type of cognitive bias whereby people tend to overestimate the extent to which their opinions, beliefs, preferences, values, and habits are normal and typical of those of others (i.e., that others also think the same way that they do). This cognitive bias tends to lead to the perception of a consensus that does not exist, a "false consensus".
Dunning-Kruger effect: In the field of psychology, the Dunning–Kruger effect is a cognitive bias in which people of low ability have illusory superiority and mistakenly assess their cognitive ability as greater than it is. The cognitive bias of illusory superiority comes from the inability of low-ability people to recognize their lack of ability; without the self-awareness of metacognition, low-ability people cannot objectively evaluate their actual competence or incompetence. On the other hand, people of high ability incorrectly assume that tasks that are easy for them are also easy for other people.
Hard-easy effect: The hard–easy effect is a cognitive bias that manifests itself as a tendency to overestimate the probability of one's success at a task perceived as hard, and to underestimate the likelihood of one's success at a task perceived as easy. The hard-easy effect takes place, for example, when individuals exhibit a degree of underconfidence in answering relatively easy questions and a degree of overconfidence in answering relatively difficult questions. "Hard tasks tend to produce overconfidence but worse-than-average perceptions," reported Katherine A. Burson, Richard P. Larrick, and Jack B. Soll in a 2005 study, "whereas easy tasks tend to produce underconfidence and better-than-average effects."
Illusory superiority: In the field of social psychology, illusory superiority is a condition of cognitive bias whereby a person overestimates their own qualities and abilities, in relation to the same qualities and abilities of other persons. Illusory superiority is one of many positive illusions, relating to the self, that are evident in the study of intelligence, the effective performance of tasks and tests, and the possession of desirable personal characteristics and personality traits.
Lake Wobegon effect: The term Lake Wobegon effect, describing a natural human tendency to overestimate one's capabilities, was coined by David Myers in honour of the fictional town. The characterization of the fictional location, where "all the women are strong, all the men are good looking, and all the children are above average," has been used to describe a real and pervasive human tendency to overestimate one's achievements and capabilities in relation to others.
Self-serving bias: A self-serving bias is any cognitive or perceptual process that is distorted by the need to maintain and enhance self-esteem, or the tendency to perceive oneself in an overly favorable manner. It is the tendency to ascribe success to one's own abilities and efforts, but failure to external factors. When individuals reject the validity of negative feedback, focus on their strengths and achievements but overlook their faults and failures, or take more responsibility for their group's work than they give to other members, they are protecting their ego from threat and injury. These cognitive and perceptual tendencies perpetuate illusions and error, but they also serve the self's need for esteem. For example, a student who attributes earning a good grade on an exam to their own intelligence and preparation, but a poor grade to the teacher's poor teaching ability or unfair test questions, might be exhibiting the self-serving bias. Studies have shown that similar attributions are made in various situations, such as the workplace, interpersonal relationships, sports, and consumer decisions.
Actor-observer bias, Fundamental attribution error: In social psychology, fundamental attribution error (FAE), also known as correspondence bias or attribution effect, is the concept that, in contrast to interpretations of their own behavior, people tend to (unduly) emphasize the agent's internal characteristics (character or intention), rather than external factors, in explaining other people's behavior. This effect has been described as "the tendency to believe that what people do reflects who they are".
Trait ascription bias: Trait ascription bias is the tendency for people to view themselves as relatively variable in terms of personality, behavior and mood while viewing others as much more predictable in their personal traits across different situations. More specifically, it is a tendency to describe one's own behaviour in terms of situational factors while preferring to describe another's behaviour by ascribing fixed dispositions to their personality. This may occur because people's own internal states are more readily observable and available to them than those of others.
Risk compensation, Peltzman effect: Risk compensation is a theory which suggests that people typically adjust their behavior in response to the perceived level of risk, becoming more careful where they sense greater risk and less careful if they feel more protected. Although usually small in comparison to the fundamental benefits of safety interventions, it may result in a lower net benefit than expected.
In order to stay focused, we favor the immediate, relatable thing in front of us over the delayed and distant.
We value stuff more in the present than in the future, and relate more to stories of specific individuals than anonymous individuals or groups. I’m surprised there aren’t more biases found under this one, considering how much it impacts how we think about the world.
Appeal to novelty: The appeal to novelty (also called argumentum ad novitatem) is a fallacy in which one prematurely claims that an idea or proposal is correct or superior, exclusively because it is new and modern. In a controversy between the status quo and new inventions, an appeal to novelty argument is not in itself valid. The fallacy may take two forms: overestimating the new and modern, prematurely and without investigation assuming it to be best-case, or underestimating the status quo, prematurely and without investigation assuming it to be worst-case.
Identifiable victim effect: The "identifiable victim effect" refers to the tendency of individuals to offer greater aid when a specific, identifiable person ("victim") is observed under hardship, as compared to a large, vaguely defined group with the same need. The effect is also observed when subjects administer punishment rather than reward. Research has shown that individuals can be more likely to mete out punishment, even at their own expense, when they are punishing specific, identifiable individuals ("perpetrators").
In order to get anything done, we’re motivated to complete things that we’ve already invested time and energy in.
The behavioral economist’s version of Newton’s first law of motion: an object in motion stays in motion. This helps us finish things, even if we come across more and more reasons to give up.
Irrational escalation, Escalation of commitment: Escalation of commitment is a human behavior pattern in which an individual or group facing increasingly negative outcomes from some decision, action, or investment nevertheless continues the same behavior rather than alter course. The actor maintains behaviors that are irrational, but align with previous decisions and actions.
Loss aversion: In cognitive psychology and decision theory, loss aversion refers to people's tendency to prefer avoiding losses to acquiring equivalent gains: it is better to not lose $5 than to find $5. The principle is very prominent in the domain of economics. What distinguishes loss aversion from risk aversion is that the utility of a monetary payoff depends on what was previously experienced or was expected to happen. Some studies have suggested that losses are twice as powerful, psychologically, as gains. Loss aversion was first identified by Amos Tversky and Daniel Kahneman.
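As a rough sketch of how this asymmetry is usually modeled (the functional form and coefficients below are the commonly cited Tversky–Kahneman estimates, offered as an illustration rather than as part of the definition above), prospect theory uses a value function that is steeper for losses than for gains:

$$
v(x) = \begin{cases} x^{\alpha} & \text{if } x \ge 0 \\ -\lambda(-x)^{\beta} & \text{if } x < 0 \end{cases}
\qquad \alpha \approx \beta \approx 0.88, \quad \lambda \approx 2.25
$$

The loss-aversion coefficient $\lambda$ of roughly 2 is what the "losses are twice as powerful" estimate refers to: the displeasure of losing $5 is about twice the magnitude of the pleasure of finding $5.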
Generation effect: The generation effect is a phenomenon where information is better remembered if it is generated from one's own mind rather than simply read. Researchers have struggled to account for why generated information is better recalled than read information, but no single explanation has been sufficient.
Zero-risk bias: Zero-risk bias is a tendency to prefer the complete elimination of a risk even when alternative options produce a greater overall reduction in risk. This effect on decision making has been observed in surveys presenting hypothetical scenarios, and certain real-world policies (e.g. the war against terrorism, as opposed to reducing the risk of traffic accidents or gun violence) have been interpreted as being influenced by it.
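A hypothetical illustration (the numbers are invented for this sketch): suppose option A eliminates a 5% risk entirely, while option B cuts a separate 50% risk in half.

$$
\Delta_A = 5\% - 0\% = 5 \text{ points}, \qquad \Delta_B = 50\% - 25\% = 25 \text{ points}
$$

Option B removes five times as much risk, yet many people choose A because it takes a risk all the way to zero.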
Unit bias: Unit bias is the tendency for individuals to want to complete a unit of a given item or task. People want to finish whatever portion they have, no matter the size; the perception of completion is itself satisfying.
Pseudocertainty effect: In prospect theory, the pseudocertainty effect is the tendency for people to perceive an outcome as certain while it is actually uncertain. It can be observed in multi-stage decision making, where the uncertainty of earlier stages is disregarded when options are evaluated in subsequent stages. Not to be confused with the certainty effect, the pseudocertainty effect was identified in an attempt to provide a normative account of the certainty effect within decision theory by relaxing the cancellation rule.
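Tversky and Kahneman's two-stage game is the standard demonstration. Stage one offers a 75% chance that the game ends with nothing and a 25% chance of reaching stage two, where you choose between (A) a sure $30 and (B) an 80% chance of $45. Most people pick A, treating the second stage as if it were certain, but computed over both stages neither option is:

$$
P(\text{win } \$30 \mid A) = 0.25 \times 1.00 = 0.25, \qquad
P(\text{win } \$45 \mid B) = 0.25 \times 0.80 = 0.20
$$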
Backfire effect: The backfire effect is a name for the finding that, given evidence against their beliefs, people can reject the evidence and believe even more strongly.
In order to avoid mistakes, we’re motivated to preserve our autonomy and status in a group, and to avoid irreversible decisions.
If we must choose, we tend to choose the option that is perceived as the least risky or that preserves the status quo. Better the devil you know than the devil you do not.
System justification, Reactance: System justification theory (SJT) is a theory within social psychology proposing that system-justifying beliefs serve a psychologically palliative function. It holds that people have several underlying needs, which vary from individual to individual, that can be satisfied by the defense and justification of the status quo, even when the system may be disadvantageous to certain people. People have epistemic, existential, and relational needs that are met by, and manifest as, ideological support for the prevailing structure of social, economic, and political norms. The need for order and stability, and thus resistance to change or alternatives, for example, can motivate individuals to see the status quo as good, legitimate, and even desirable.
Reverse psychology: Reverse psychology is a technique involving the advocacy of a belief or behavior that is opposite to the one desired, with the expectation that this approach will encourage the subject of the persuasion to do what actually is desired: the opposite of what is suggested. This technique relies on the psychological phenomenon of reactance, in which a person has a negative emotional reaction to being persuaded, and thus chooses the option which is being advocated against. The one being manipulated is usually unaware of what is really going on.
Decoy effect: In marketing, the decoy effect (or attraction effect or asymmetric dominance effect) is the phenomenon whereby consumers will tend to have a specific change in preference between two options when also presented with a third option that is asymmetrically dominated. An option is asymmetrically dominated when it is inferior in all respects to one option; but, in comparison to the other option, it is inferior in some respects and superior in others. In other words, in terms of specific attributes determining preferences, it is completely dominated by (i.e., inferior to) one option and only partially dominated by the other. When the asymmetrically dominated option is present, a higher percentage of consumers will prefer the dominating option than when the asymmetrically dominated option is absent. The asymmetrically dominated option is therefore a decoy serving to increase preference for the dominating option. The decoy effect is also an example of the violation of the independence of irrelevant alternatives axiom of decision theory.
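A minimal sketch of the dominance check in Python (the subscription plans, prices, and sizes here are invented for illustration):

```python
# Each option is scored on two attributes where higher is better:
# storage in GB, and cheapness (price negated so that higher = cheaper).
options = {
    "A": (5, -10),     # hypothetical plan: 5 GB for $10
    "B": (10, -20),    # hypothetical plan: 10 GB for $20
    "decoy": (9, -22), # 9 GB for $22: worse than B on both attributes,
                       # but mixed relative to A (more storage, higher price)
}

def dominates(x, y):
    """True if x is at least as good as y on every attribute
    and strictly better on at least one."""
    return all(a >= b for a, b in zip(x, y)) and any(a > b for a, b in zip(x, y))

# The decoy is asymmetrically dominated: B dominates it, A does not.
assert dominates(options["B"], options["decoy"])
assert not dominates(options["A"], options["decoy"])
# Adding the decoy to the choice set tends to shift consumer preference toward B.
```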
Social comparison bias: Social comparison bias is the tendency to feel dislike and competitiveness toward someone who is seen as physically or mentally better than oneself.
Status quo bias: Status quo bias is an emotional bias; a preference for the current state of affairs. The current baseline (or status quo) is taken as a reference point, and any change from that baseline is perceived as a loss. Status quo bias should be distinguished from a rational preference for the status quo ante, as when the current state of affairs is objectively superior to the available alternatives, or when imperfect information is a significant problem. A large body of evidence, however, shows that status quo bias frequently affects human decision-making.
We favor options that appear simple or that have more complete information over more complex, ambiguous options.
We’d rather do the quick, simple thing than the important complicated thing, even if the important complicated thing is ultimately a better use of time and energy.
Ambiguity bias: The ambiguity effect is a cognitive bias where decision making is affected by a lack of information, or "ambiguity". The effect implies that people tend to select options for which the probability of a favorable outcome is known over options for which that probability is unknown. The effect was first described by Daniel Ellsberg in 1961.
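Ellsberg's two-urn experiment is the classic demonstration. Urn I contains 50 red and 50 black balls; Urn II contains 100 balls in an unknown mix of red and black. Most people prefer betting on red from Urn I over red from Urn II, and also prefer black from Urn I over black from Urn II. No single belief about Urn II can justify both preferences, since they jointly imply

$$
P(\text{red}_{II}) < 0.5 \quad \text{and} \quad P(\text{black}_{II}) < 0.5
\;\Rightarrow\; P(\text{red}_{II}) + P(\text{black}_{II}) < 1,
$$

a contradiction, because a drawn ball must be one color or the other.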
Information bias: Information bias is a cognitive bias to seek information even when it cannot affect action. People can often make better predictions or choices with less information: more information is not always better. An example of information bias is believing that the more information that can be acquired to make a decision, the better, even if that extra information is irrelevant to the decision.
Belief bias: Belief bias is the tendency to judge the strength of arguments based on the plausibility of their conclusion rather than on how strongly they support that conclusion. A person is more likely to accept arguments that support a conclusion aligning with their values, beliefs and prior knowledge, while rejecting counterarguments to that conclusion. Belief bias is an extremely common and therefore significant form of error; we can easily be blinded by our beliefs and reach the wrong conclusion. Belief bias has been found to influence various reasoning tasks, including conditional reasoning, relational reasoning and transitive reasoning.
Bike-shedding effect, Law of Triviality: Parkinson's law of triviality is C. Northcote Parkinson's 1957 argument that members of an organization give disproportionate weight to trivial issues. Parkinson provides the example of a fictional committee whose job was to approve the plans for a nuclear power plant spending the majority of its time on discussions about relatively minor but easy-to-grasp issues, such as what materials to use for the staff bike shed, while neglecting the proposed design of the plant itself, which is far more important and a far more difficult and complex task.
Delmore effect: The Delmore effect is our tendency to provide more articulate and explicit goals for lower priority areas of our lives. It appears that the daunting nature of truly important goals may motivate the self to deflect this anxiety by attending to less important, but also less threatening goals.
Conjunction fallacy: The conjunction fallacy (also known as the Linda problem) is a formal fallacy that occurs when it is assumed that specific conditions are more probable than a single general one.
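In the original Linda problem, respondents are told that Linda is 31, single, outspoken, and deeply concerned with social justice; most then rate "Linda is a bank teller and is active in the feminist movement" as more probable than "Linda is a bank teller," even though a conjunction can never be more probable than either of its parts:

$$
P(A \wedge B) \le P(A) \quad \text{and} \quad P(A \wedge B) \le P(B)
$$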
Occam’s razor: Occam's razor (also Ockham's razor or Ocham's razor; Latin: lex parsimoniae "law of parsimony") is the problem-solving principle that the simplest solution tends to be the right one. When presented with competing hypotheses to solve a problem, one should select the solution with the fewest assumptions. The idea is attributed to William of Ockham (c. 1287–1347), who was an English Franciscan friar, scholastic philosopher, and theologian.
Less-is-better effect: The less-is-better effect is a type of preference reversal that occurs when the lesser or smaller alternative of a proposition is preferred when evaluated separately, but not when evaluated together. The term was first proposed by Christopher Hsee. In Hsee's studies, for example, a dinnerware set with fewer pieces, all intact, was valued more highly than a larger set that included some broken pieces when the sets were judged separately, but not when they were judged side by side.