Cognitive Bias Cheat Sheet


Every cognitive bias is there for a reason — primarily to save our brains time or energy. If you look at them by the problem they’re trying to solve, it becomes a lot easier to understand why they exist, how they’re useful, and the trade-offs (and resulting mental errors) that they introduce.


Cognitive biases are just tools, useful in the right contexts, harmful in others. They’re the only tools we’ve got, and they’re even pretty good at what they’re meant to do. We might as well get familiar with them and even appreciate that we at least have some ability to process the universe with our mysterious brains.


https://betterhumans.coach.me/cognitive-bias-cheat-sheet-55a472476b18


https://en.wikipedia.org/wiki/List_of_cognitive_biases

Noisy Data

There is just too much information in the world; we have no choice but to filter almost all of it out. Our brain uses a few simple tricks to pick out the bits of information that are most likely to be useful in some way.

Bizarre/funny/visually-striking/anthropomorphic things stick out more than non-bizarre/unfunny things.

Our brains tend to boost the importance of things that are unusual or surprising. Alternatively, we tend to skip over information that we think is ordinary or expected.

Bizarreness effect, Humor effect: The bizarreness effect is the tendency of bizarre material to be better remembered than common material. The scientific evidence for its existence is contested: some research suggests it exists, some suggests it doesn't, and some suggests it leads to worse remembering.

Von Restorff effect: The von Restorff effect, also known as the "isolation effect", predicts that when multiple homogeneous stimuli are presented, the stimulus that differs from the rest is more likely to be remembered. The effect was identified by German psychiatrist and pediatrician Hedwig von Restorff (1906–1962), who, in her 1933 study, found that when participants were presented with a list of categorically similar items containing one distinctive, isolated item, memory for that item was improved.

Picture superiority effect: The picture superiority effect refers to the phenomenon in which pictures and images are more likely to be remembered than words. This effect has been demonstrated in numerous experiments using different methods. It is based on the notion that "human memory is extremely sensitive to the symbolic modality of presentation of event information". Explanations for the picture superiority effect are not concrete and are still being debated.

Self-relevance effect: The self-reference effect is a tendency for people to encode information differently depending on the level on which they are implicated in the information. When people are asked to remember information when it is related in some way to themselves, the recall rate can be improved.

Negativity bias: The negativity bias, also known as the negativity effect, refers to the notion that, even when of equal intensity, things of a more negative nature (e.g. unpleasant thoughts, emotions, or social interactions; harmful/traumatic events) have a greater effect on one's psychological state and processes than neutral or positive things. In other words, something very positive will generally have less of an impact on a person's behavior and cognition than something equally emotional but negative. The negativity bias has been investigated within many different domains, including the formation of impressions and general evaluations; attention, learning, and memory; and decision-making and risk considerations.

We notice things that are already primed in memory or repeated often.

This is the simple rule that our brains are more likely to notice things that are related to stuff that’s recently been loaded in memory.

Availability heuristic: The availability heuristic is a mental shortcut that relies on immediate examples that come to a given person's mind when evaluating a specific topic, concept, method or decision. The availability heuristic operates on the notion that if something can be recalled, it must be important, or at least more important than alternative solutions which are not as readily recalled. Subsequently, under the availability heuristic, people tend to heavily weigh their judgments toward more recent information, making new opinions biased toward that latest news.

Attentional bias: Attentional bias is the tendency for people's perception to be affected by their recurring thoughts at the time. Attentional biases may explain an individual's failure to consider alternative possibilities, as specific thoughts guide the train of thought in a certain manner. For example, smokers tend to possess a bias for cigarettes and other smoking-related cues around them, due to the positive thoughts they've already attributed between smoking and the cues they were exposed to while smoking. Attentional bias has also been associated with clinically relevant symptoms such as anxiety and depression.

Illusory truth effect: The illusory truth effect (also known as the validity effect, truth effect or the reiteration effect) is the tendency to believe information to be correct after repeated exposure. This phenomenon was first identified in a 1977 study at Villanova University and Temple University. When truth is assessed, people rely on whether the information is in line with their understanding or if it feels familiar. The first condition is logical, as people compare new information with what they already know to be true. Repetition makes statements easier to process relative to new, unrepeated statements, leading people to believe that the repeated conclusion is more truthful. The illusory truth effect has also been linked to hindsight bias, in which the recollection of confidence is skewed after the truth has been received.

Mere exposure effect: The mere-exposure effect is a psychological phenomenon by which people tend to develop a preference for things merely because they are familiar with them. In social psychology, this effect is sometimes called the familiarity principle. The effect has been demonstrated with many kinds of things, including words, Chinese characters, paintings, pictures of faces, geometric figures, and sounds. In studies of interpersonal attraction, the more often a person is seen by someone, the more pleasing and likeable that person appears to be.

Context effect, Cue-dependent forgetting, Mood-congruent memory bias: Cue-dependent forgetting, or retrieval failure, is the failure to recall information without memory cues. The term either pertains to semantic cues, state-dependent cues or context-dependent cues.

Frequency illusion, Baader-Meinhof Phenomenon: The frequency illusion (also known as the Baader-Meinhof phenomenon) is the phenomenon in which people who just learn or notice something start seeing it everywhere. For instance, a person who just saw a movie about sharks might start seeing the word "shark" everywhere. This is not necessarily because the person really has come across more instances of the word "shark"; rather, before seeing the movie, they usually simply passed the word over and quickly forgot it, while later, after having seen the movie, the word started sticking in their memory.

Empathy gap: A hot-cold empathy gap is a cognitive bias in which people underestimate the influences of visceral drives on their own attitudes, preferences, and behaviors.

Omission bias: The omission bias is an alleged type of cognitive bias. It is the tendency to judge harmful actions as worse, or less moral, than equally harmful omissions (inactions), because actions are more obvious than inactions. It is contentious whether this represents a systematic error in thinking or is supported by a substantive moral theory. For a consequentialist, judging harmful actions as worse than inaction would indeed be inconsistent, but deontological ethics may, and normally does, draw a moral distinction between doing and allowing. The bias is usually showcased through the trolley problem.

Base rate fallacy: The base rate fallacy, also called base rate neglect or base rate bias, is a formal fallacy. If presented with related base rate information (i.e. generic, general information) and specific information (information pertaining only to a certain case), the mind tends to ignore the former and focus on the latter.
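
A worked example makes base rate neglect concrete. This is a minimal sketch with hypothetical numbers (1% prevalence, 99% sensitivity, 5% false-positive rate), not data from any particular test:

```python
# Hypothetical numbers: a rare disease and a fairly good test.
prevalence = 0.01        # P(disease): the base rate people tend to ignore
sensitivity = 0.99       # P(positive | disease)
false_positive = 0.05    # P(positive | no disease)

# Bayes' theorem: P(disease | positive)
p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)
p_disease_given_positive = sensitivity * prevalence / p_positive

print(f"P(disease | positive) = {p_disease_given_positive:.2f}")  # ~0.17, not 0.99
```

Focusing on the test's accuracy (the specific information) while ignoring the 1% base rate leads people to guess something near 99%; the correct answer is about 17%.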

We notice when something has changed.

And we'll generally weigh the significance of the new value by the direction of the change (positive or negative) rather than re-evaluating the new value as if it had been presented alone. This also applies when we compare two similar things.

Anchoring, Contrast effect, Focusing effect: Anchoring or focalism is a cognitive bias for an individual to rely too heavily on an initial piece of information offered (known as the "anchor") when making decisions.

Money illusion: In economics, money illusion, or price illusion, is the tendency of people to think of currency in nominal, rather than real, terms. In other words, the numerical/face value (nominal value) of money is mistaken for its purchasing power (real value) at a previous point in the general price level (in the past). This is false, as modern fiat currencies have no intrinsic value and their real value is derived from all the underlying value systems in an economy, e.g., sound government, sound economics, sound education, sound legal system, sound defence, etc. The change in this real value over time is indicated by the change in the Consumer Price Index over time.
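
A quick sketch with made-up numbers shows the trap: a 2% nominal raise during 5% inflation is a real pay cut, even though the face value goes up.

```python
# Made-up numbers: a 2% nominal raise while the price level rises 5%.
old_salary, new_salary = 50_000, 51_000  # nominal (face-value) dollars
inflation = 0.05

real_new_salary = new_salary / (1 + inflation)  # deflated to last year's prices

print(f"Nominal change: {new_salary - old_salary:+,}")          # +1,000
print(f"Real change:    {real_new_salary - old_salary:+,.0f}")  # about -1,429
```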

Framing effect: The framing effect is an example of cognitive bias, in which people react to a particular choice in different ways depending on how it is presented; e.g. as a loss or as a gain. People tend to avoid risk when a positive frame is presented but seek risks when a negative frame is presented. Gain and loss are defined in the scenario as descriptions of outcomes (e.g., lives lost or saved, disease patients treated and not treated, lives saved and lost during accidents, etc.).

Weber–Fechner law: The Weber–Fechner law refers to two related laws in the field of psychophysics, known as Weber's law and Fechner's law. Both laws relate to human perception, more specifically the relation between the actual change in a physical stimulus and the perceived change. This includes stimuli to all senses: vision, hearing, taste, touch, and smell.
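
The two laws have a compact standard form, where I is stimulus intensity, ΔI the just-noticeable difference, I₀ the detection threshold, and k a sense-specific constant:

```latex
% Weber's law: the just-noticeable difference \Delta I is a constant
% fraction of the current intensity I.
\frac{\Delta I}{I} = k

% Fechner's law (obtained by integrating Weber's law): perceived
% sensation S grows only logarithmically with intensity above the
% detection threshold I_0.
S = k \ln \frac{I}{I_0}
```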

Conservatism: In cognitive psychology and decision science, conservatism or conservatism bias is a bias in human information processing, which refers to the tendency to revise one's belief insufficiently when presented with new evidence. This bias describes human belief revision in which persons over-weigh the prior distribution (base rate) and under-weigh new sample evidence when compared to Bayesian belief-revision.
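
A sketch in the spirit of the classic urn experiments (the numbers here are illustrative, not from any particular study): urn A is 70% red, urn B is 30% red, and each is equally likely a priori. After 8 red and 4 blue draws, the Bayesian posterior is far more extreme than the moderate revision people typically report.

```python
# Illustrative setup: two urns with known red-chip proportions, equal priors.
p_red_a, p_red_b = 0.7, 0.3
reds, blues = 8, 4  # observed draws (with replacement)

likelihood_a = p_red_a**reds * (1 - p_red_a)**blues
likelihood_b = p_red_b**reds * (1 - p_red_b)**blues
posterior_a = likelihood_a / (likelihood_a + likelihood_b)  # equal priors cancel

print(f"P(urn A | data) = {posterior_a:.2f}")  # ~0.97; typical answers hover near 0.7
```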

Distinction bias: Distinction bias, a concept of decision theory, is the tendency to view two options as more distinctive when evaluating them simultaneously than when evaluating them separately.

We are drawn to details that confirm our own existing beliefs.

This is a big one. As is the corollary: we tend to ignore details that contradict our own beliefs.

Confirmation bias: Confirmation bias, also called confirmatory bias or myside bias, is the tendency to search for, interpret, favor, and recall information in a way that confirms one's preexisting beliefs or hypotheses. It is a type of cognitive bias and a systematic error of inductive reasoning. People display this bias when they gather or remember information selectively, or when they interpret it in a biased way. The effect is stronger for emotionally charged issues and for deeply entrenched beliefs. Confirmation bias is a variation of the more general tendency of apophenia.

Congruence bias: Congruence bias is a type of cognitive bias similar to confirmation bias. Congruence bias occurs due to people's overreliance on directly testing a given hypothesis as well as neglecting indirect testing.

Post-purchase rationalization, Choice-supportive bias: In cognitive science, choice-supportive bias or post-purchase rationalization is the tendency to retroactively ascribe positive attributes to an option one has selected. It is a cognitive bias. For example, if a person chooses option A instead of option B, they are likely to ignore or downplay the faults of option A while amplifying those of option B. Conversely, they are also likely to notice and amplify the advantages of option A and not notice or de-emphasize those of option B.

Selective perception: Selective perception is the tendency not to notice and more quickly forget stimuli that cause emotional discomfort and contradict our prior beliefs. For example, a teacher may have a favorite student because they are biased by in-group favoritism. The teacher ignores the student's poor attainment. Conversely, they might not notice the progress of their least favorite student.

Observer-expectancy effect, Experimenter’s bias, Observer effect, Expectation bias: The observer-expectancy effect (also called the experimenter-expectancy effect, expectancy bias, observer effect, or experimenter effect) is a form of reactivity in which a researcher's cognitive bias causes them to subconsciously influence the participants of an experiment. Confirmation bias can lead to the experimenter interpreting results incorrectly because of the tendency to look for information that conforms to their hypothesis, and overlook information that argues against it. It is a significant threat to a study's internal validity, and is therefore typically controlled using a double-blind experimental design.

Ostrich effect: In behavioral finance, the ostrich effect is the attempt made by investors to avoid negative financial information. The name comes from the common (but false) legend that ostriches bury their heads in the sand to avoid danger.

Subjective validation: Subjective validation, sometimes called personal validation effect, is a cognitive bias by which a person will consider a statement or another piece of information to be correct if it has any personal meaning or significance to them. In other words, a person whose opinion is affected by subjective validation will perceive two unrelated events (i.e., a coincidence) to be related because their personal belief demands that they be related. Closely related to the Forer effect, subjective validation is an important element in cold reading. It is considered to be the main reason behind most reports of paranormal phenomena. According to Bob Carroll, psychologist Ray Hyman is considered to be the foremost expert on subjective validation and cold reading.

Continued influence effect: The continued influence effect is the tendency to believe previously learned misinformation even after it has been corrected. Misinformation can still influence inferences one generates after a correction has occurred.

Semmelweis reflex: The Semmelweis reflex or "Semmelweis effect" is a metaphor for the reflex-like tendency to reject new evidence or new knowledge because it contradicts established norms, beliefs or paradigms.

We notice flaws in others more easily than flaws in ourselves.

Yes, before you see this entire article as a list of quirks that compromise how other people think, realize that you are also subject to these biases.

Bias blind spot: The bias blind spot is the cognitive bias of recognizing the impact of biases on the judgment of others, while failing to see the impact of biases on one's own judgment. The term was created by Emily Pronin, a social psychologist from Princeton University's Department of Psychology, with colleagues Daniel Lin and Lee Ross. The bias blind spot is named after the visual blind spot. Most people appear to exhibit the bias blind spot. In a sample of more than 600 residents of the United States, more than 85% believed they were less biased than the average American. Only one participant believed that he or she was more biased than the average American. People do vary with regard to the extent to which they exhibit the bias blind spot. It appears to be a stable individual difference that is measurable (for a scale, see Scopelliti et al. 2015).

Naïve cynicism: Naïve cynicism is a philosophy of mind, cognitive bias and form of psychological egoism that occurs when people naïvely expect more egocentric bias in others than actually is the case.

Naïve realism: In social psychology, naïve realism is the human tendency to believe that we see the world around us objectively, and that people who disagree with us must be uninformed, irrational, or biased.

Incomplete Data

The world is very confusing, and we end up only seeing a tiny sliver of it, but we need to make some sense of it in order to survive. Once the reduced stream of information comes in, we connect the dots, fill in the gaps with stuff we already think we know, and update our mental models of the world.

We find stories and patterns even in sparse data.

Since we only get a tiny sliver of the world’s information, and also filter out almost everything else, we never have the luxury of having the full story. This is how our brain reconstructs the world to feel complete inside our heads.

Confabulation: In psychiatry, confabulation (verb: confabulate) is a disturbance of memory, defined as the production of fabricated, distorted, or misinterpreted memories about oneself or the world, without the conscious intention to deceive. People who confabulate present incorrect memories ranging from "subtle alterations to bizarre fabrications", and are generally very confident about their recollections, despite contradictory evidence.

Clustering illusion: The clustering illusion is the tendency to erroneously consider the inevitable "streaks" or "clusters" arising in small samples from random distributions to be non-random. The illusion is caused by a human tendency to underpredict the amount of variability likely to appear in a small sample of random or semi-random data.
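
A minimal simulation makes the point: a fair coin routinely produces streaks long enough to look "non-random".

```python
import random

def longest_run(flips):
    """Length of the longest run of identical consecutive outcomes."""
    best = run = 1
    for prev, cur in zip(flips, flips[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

random.seed(0)
trials = [longest_run(random.choices("HT", k=100)) for _ in range(10_000)]
print(f"Average longest streak in 100 fair flips: {sum(trials) / len(trials):.1f}")  # ~7
```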

Insensitivity to sample size: Insensitivity to sample size is a cognitive bias that occurs when people judge the probability of obtaining a sample statistic without respect to the sample size. For example, in one study subjects assigned the same probability to the likelihood of obtaining a mean height of above six feet [183 cm] in samples of 10, 100, and 1,000 men. In other words, variation is more likely in smaller samples, but people may not expect this.
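
A sketch of the six-foot example under assumed numbers (heights drawn from a normal distribution with mean 70 in and standard deviation 3 in): because the standard error of the mean shrinks as 3/sqrt(n), a sample mean above 72 in is merely unlikely for 10 men and essentially impossible for 1,000.

```python
import random, statistics

random.seed(1)
trials = 2_000
for n in (10, 100, 1000):
    # Count simulated samples whose mean height exceeds six feet (72 in).
    hits = sum(
        statistics.fmean(random.gauss(70, 3) for _ in range(n)) > 72
        for _ in range(trials)
    )
    print(f"n={n:4d}: P(sample mean > 6 ft) ~ {hits / trials:.3f}")
```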

Neglect of probability: The neglect of probability, a type of cognitive bias, is the tendency to disregard probability when making a decision under uncertainty and is one simple way in which people regularly violate the normative rules for decision making. Small risks are typically either neglected entirely or hugely overrated. The continuum between the extremes is ignored. The term probability neglect was coined by Cass Sunstein.

Anecdotal fallacy: Quantitative scientific measures are almost always more accurate than personal perceptions and experiences, but our inclination is to believe that which is tangible to us, and/or the word of someone we trust, over a more 'abstract' statistical reality.

Illusion of validity: Illusion of validity is a cognitive bias in which a person overestimates his or her ability to interpret and predict accurately the outcome when analyzing a set of data, in particular when the data analyzed show a very consistent pattern—that is, when the data "tell" a coherent story.

Masked man fallacy: In philosophical logic, the masked-man fallacy (also known as the intensional fallacy and the epistemic fallacy) is committed when one makes an illicit use of Leibniz's law in an argument. Leibniz's law states that, if one object has a certain property, while another object does not have the same property, the two objects cannot be identical. The fallacy is "epistemic" because it posits an immediate identity between a subject's knowledge of an object with the object itself.

Recency illusion: The recency illusion is the belief or impression that a word or language usage is of recent origin when it is long-established.

Gambler’s fallacy: The gambler's fallacy, also known as the Monte Carlo fallacy or the fallacy of the maturity of chances, is the mistaken belief that, if something happens more frequently than normal during a given period, it will happen less frequently in the future. It may also be stated as the belief that, if something happens less frequently than normal during a given period, it will happen more frequently in the future. In situations where the outcome being observed is truly random and consists of independent trials of a random process, this belief is false. The fallacy can arise in many situations, but is most strongly associated with gambling, where it is common among players.
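
A minimal simulation: in a long run of fair flips, the flip following five consecutive heads still comes up heads about half the time.

```python
import random

random.seed(2)
flips = [random.random() < 0.5 for _ in range(1_000_000)]  # True = heads

# Collect the outcome immediately following every run of five heads.
after = [flips[i + 5] for i in range(len(flips) - 5) if all(flips[i:i + 5])]
print(f"P(heads | five heads just occurred) ~ {sum(after) / len(after):.3f}")  # ~0.500
```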

Hot-hand fallacy: The "hot hand" (also known as the "hot hand phenomenon" or "hot hand fallacy") is the purported phenomenon that a person who experiences a successful outcome with a random event has a greater probability of success in further attempts. The concept is often applied to sports and skill-based tasks in general and originates from basketball, where a shooter is allegedly more likely to score if their previous attempts were successful, i.e. while having "hot hands". While previous success at a task can indeed change the psychological attitude and subsequent success rate of a player, researchers for many years did not find evidence for a "hot hand" in practice, dismissing it as fallacious. However, later research questioned whether the belief is indeed a fallacy. Recent studies using modern statistical analysis show there is evidence for the "hot hand" in some sporting activities.

Illusory correlation: In psychology, illusory correlation is the phenomenon of perceiving a relationship between variables (typically people, events, or behaviors) even when no such relationship exists. A false association may be formed because rare or novel occurrences are more salient and therefore tend to capture one's attention. This is one way stereotypes form and endure. Hamilton & Rose (1980) found that stereotypes can lead people to expect certain groups and traits to fit together, and then to overestimate the frequency with which these correlations actually occur.

Pareidolia: Pareidolia is a psychological phenomenon in which the mind responds to a stimulus, usually an image or a sound, by perceiving a familiar pattern where none exists.

Anthropomorphism: Anthropomorphism is a cognitive process by which people use their schemas about other humans as a basis for inferring the properties of non-human entities in order to make efficient judgements about the environment, even if those inferences are not always accurate. Schemas about humans are used as the basis because this knowledge is acquired early in life, is more detailed than knowledge about non-human entities, and is more readily accessible in memory.

We fill in characteristics from stereotypes, generalities, and prior histories whenever there are new specific instances or gaps in information.

When we have partial information about a specific thing that belongs to a group of things we are pretty familiar with, our brain has no problem filling in the gaps with best guesses or what other trusted sources provide. Conveniently, we then forget which parts were real and which were filled in.

Group attribution error: The group attribution error refers to people's tendency to believe either (1) that the characteristics of an individual group member are reflective of the group as a whole, or (2) that a group's decision outcome must reflect the preferences of individual group members, even when external information is available suggesting otherwise.

Ultimate attribution error: The ultimate attribution error is a group-level attribution error that offers an explanation for how one person views different causes of negative and positive behavior in ingroup and outgroup members.

Stereotyping: In social psychology, a stereotype is an over-generalized belief about a particular category of people. Stereotypes are generalized because one assumes that the stereotype is true for each individual person in the category. While such generalizations may be useful when making quick decisions, they may be erroneous when applied to particular individuals. Stereotypes encourage prejudice and may arise for a number of reasons.

Essentialism: Essentialism is the view that every entity has a set of attributes that are necessary to its identity and function. In early Western thought, Plato's idealism held that all things have such an "essence"—an "idea" or "form". In Categories, Aristotle similarly proposed that all objects have a substance that, as George Lakoff put it, "make the thing what it is, and without which it would be not that kind of thing". The contrary view—non-essentialism—denies the need to posit such an "essence".

Functional fixedness: Functional fixedness is a cognitive bias that limits a person to use an object only in the way it is traditionally used. The concept of functional fixedness originated in Gestalt psychology, a movement in psychology that emphasizes holistic processing. Karl Duncker defined functional fixedness as being a "mental block against using an object in a new way that is required to solve a problem". This "block" limits the ability of an individual to use components given to them to complete a task, as they cannot move past the original purpose of those components. For example, if someone needs a paperweight, but they only have a hammer, they may not see how the hammer can be used as a paperweight. Functional fixedness is this inability to see a hammer's use as anything other than for pounding nails; the person couldn't think to use the hammer in a way other than in its conventional function.

Moral credential effect: Self-licensing (aka moral self-licensing, moral licensing, licensing effect, moral credential effect) is a term used in social psychology and marketing to describe the subconscious phenomenon whereby increased confidence and security in one’s self-image or self-concept tends to make that individual worry less about the consequences of subsequent immoral behavior and, therefore, more likely to make immoral choices and act immorally. In simple terms, self-licensing occurs when people allow themselves to indulge after doing something positive first; for example, drinking a diet soda with a greasy hamburger and fries can lead one to subconsciously discount the negative attributes of the meal’s high caloric and cholesterol content.

Just-world hypothesis: The just-world hypothesis or just-world fallacy is the cognitive bias (or assumption) that a person's actions are inherently inclined to bring morally fair and fitting consequences to that person, to the end of all noble actions being eventually rewarded and all evil actions eventually punished. In other words, the just-world hypothesis is the tendency to attribute consequences to—or expect consequences as the result of—a universal force that restores moral balance. This belief generally implies the existence of cosmic justice, destiny, divine providence, desert, stability, or order, and has high potential to result in fallacy, especially when used to rationalize people's misfortune on the grounds that they "deserve" it.

Placebo effect: A placebo is a substance or treatment of no intended therapeutic value. Common placebos include inert tablets (like sugar pills), inert injections (like saline), sham surgery, and other procedures. The placebo effect is the measurable improvement that can follow such a treatment, attributable to the patient's expectations rather than to the treatment itself.

Bandwagon effect: The bandwagon effect is a phenomenon whereby the rate of uptake of beliefs, ideas, fads and trends increases the more that they have already been adopted by others. In other words, the bandwagon effect is characterized by the probability of individual adoption increasing with respect to the proportion who have already done so. As more people come to believe in something, others also "hop on the bandwagon" regardless of the underlying evidence.

Automation bias: Automation bias is the propensity for humans to favor suggestions from automated decision-making systems and to ignore contradictory information made without automation, even if it is correct.

Argument from fallacy: Argument from fallacy is the formal fallacy of analyzing an argument and inferring that, since it contains a fallacy, its conclusion must be false. It is also called argument to logic (argumentum ad logicam), the fallacy fallacy, the fallacist's fallacy, and the bad reasons fallacy.

Authority bias: Authority bias is the tendency to attribute greater accuracy to the opinion of an authority figure (unrelated to its content) and be more influenced by that opinion. The Milgram experiment in 1961 was the classic experiment that established its existence.

We imagine things and people we’re familiar with or fond of as better than things and people we aren’t familiar with or fond of.

Similar to the above, but the filled-in bits generally also include built-in assumptions about the quality and value of the thing we're looking at.

Halo effect: The halo effect is a type of immediate judgement discrepancy, or cognitive bias, where a person making an initial assessment of another person, place, or thing will assume ambiguous information based upon concrete information. A simplified example of the halo effect is when an individual, noticing that the person in a photograph is attractive, well groomed, and properly attired, assumes, using a mental heuristic, that the person in the photograph is a good person based upon the rules of that individual's social concept. This constant error in judgment is reflective of the individual's preferences, prejudices, ideology, aspirations, and social perception. The halo effect is an evaluation by an individual and can affect the perception of a decision, action, idea, business, person, group, entity, or other whenever concrete data is generalized or influences ambiguous information.

Cross-race effect: The cross-race effect (sometimes called cross-race bias, other-race bias or own-race bias) is the tendency to more easily recognize faces of the race that one is most familiar with (which is most often one's own race). One study examined 271 real court cases: in photographic line-ups, 231 witnesses participated in cross-race versus same-race identification. In cross-race line-ups, only 45% were correctly identified, versus 60% for same-race identifications.

Cheerleader effect: The cheerleader effect, also known as the group attractiveness effect, is the cognitive bias which causes people to think individuals are more attractive when they are in a group. The term was backed up by research by Drew Walker & Edward Vul (2013) and van Osch et al. (2015).

Well-traveled road effect: The well travelled road effect is a cognitive bias in which travellers will estimate the time taken to traverse routes differently depending on their familiarity with the route. Frequently travelled routes are assessed as taking a shorter time than unfamiliar routes. This effect creates errors when estimating the most efficient route to an unfamiliar destination, when one candidate route includes a familiar route, whilst the other candidate route includes no familiar routes. The effect is most salient when subjects are driving, but is still detectable for pedestrians and users of public transport. The effect has been observed for centuries but was first studied scientifically in the 1980s and 1990s following from earlier "heuristics and biases" work undertaken by Daniel Kahneman and Amos Tversky.

In-group bias: In-group favoritism, sometimes known as in-group–out-group bias, in-group bias, or intergroup bias, is a pattern of favoring members of one's in-group over out-group members. This can be expressed in evaluation of others, in allocation of resources, and in many other ways.

Out-group homogeneity bias: The out-group homogeneity effect is one's perception of out-group members as more similar to one another than are in-group members, e.g. "they are alike; we are diverse". The term "outgroup homogeneity effect", "outgroup homogeneity bias" or "relative outgroup homogeneity" has been explicitly contrasted with "outgroup homogeneity" in general, the latter referring to perceived outgroup variability unrelated to perceptions of the ingroup.

Not invented here: Not invented here (NIH) is a stance adopted by social, corporate, or institutional cultures that avoid using or buying already existing products, research, standards, or knowledge because of their external origins and costs, such as royalties. Research illustrates a strong bias against ideas from the outside.

Reactive devaluation: Reactive devaluation is a cognitive bias that occurs when a proposal is devalued if it appears to originate from an antagonist. The bias was proposed by Lee Ross and Constance Stillinger (1988).

Positivity effect: In psychology and cognitive science, the positivity effect is the ability to constructively analyze a situation where the desired results are not achieved, but still obtain positive feedback that assists our future progression. When a person is considering people they like (including themselves), the person tends to make situational attributions about their negative behaviors and dispositional attributions about their positive behaviors. The reverse may be true for people the person dislikes. This may well be because of the dissonance between liking a person and seeing them behave negatively. Example: if a friend hits someone, one would tell them the other guy deserved it or that he had to defend himself.

We simplify probabilities and numbers to make them easier to think about.

Our subconscious mind is terrible at math and generally gets all kinds of things wrong about the likelihood of something happening if any data is missing.

Mental accounting: A concept first named by Richard Thaler, mental accounting (or psychological accounting) attempts to describe the process whereby people code, categorize and evaluate economic outcomes. Mental accounting deals with the recollection and perception of our various expenditures; its purpose is to keep track of our money-related decisions so as to give us a model with which to evaluate future financial decisions. It is a way of making sense of the world. Like many other cognitive processes, it can prompt biases and systematic departures from rational, value-maximizing behavior, and its implications are quite robust. Understanding the flaws and inefficiencies of mental accounting is essential to making good decisions and reducing human error.

Normalcy bias: The normalcy bias, or normality bias, is a belief people hold when facing a disaster. It causes people to underestimate both the likelihood of a disaster and its possible effects, because people believe that things will always function the way things normally have functioned. This may result in situations where people fail to adequately prepare themselves for disasters, and on a larger scale, the failure of governments to include the populace in its disaster preparations. About 70% of people reportedly display normalcy bias in disasters.

Appeal to probability fallacy: An appeal to probability (or appeal to possibility) is the logical fallacy of taking something for granted because it would probably be the case (or might possibly be the case). Inductive arguments lack deductive validity and must therefore be asserted or denied in the premises.

Murphy’s Law: Murphy's law is an adage or epigram that is typically stated as: "Anything that can go wrong will go wrong".

Subadditivity effect: The subadditivity effect is the tendency to judge probability of the whole to be less than the probabilities of the parts.
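
In symbols, with entirely hypothetical judged probabilities P_j over an event split into disjoint, exhaustive parts:

```latex
% Hypothetical judged probabilities: the whole is rated below the sum of
% its parts, violating P(A \cup B \cup C) = P(A) + P(B) + P(C) for
% disjoint A, B, C.
P_j(\text{whole}) = 0.60 \;<\; P_j(A) + P_j(B) + P_j(C) = 0.30 + 0.25 + 0.25 = 0.80
```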

Survivorship bias: Survivorship bias or survival bias is the logical error of concentrating on the people or things that made it past some selection process and overlooking those that did not, typically because of their lack of visibility. This can lead to false conclusions in several different ways. It is a form of selection bias.
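
A sketch with made-up numbers: if funds that lose more than 10% quietly close, averaging only the survivors overstates how the whole cohort performed.

```python
import random, statistics

random.seed(3)
returns = [random.gauss(0.0, 0.15) for _ in range(1_000)]  # true mean return ~0%
survivors = [r for r in returns if r > -0.10]               # losing funds vanish

print(f"All funds:      {statistics.fmean(returns):+.1%}")    # ~+0.0%
print(f"Survivors only: {statistics.fmean(survivors):+.1%}")  # several points higher
```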

Zero sum bias: Zero-sum bias is the tendency to assume that resources gained by one party are matched by corresponding losses to another party. This tendency may be a legacy of intra-group competition when finite resources (e.g., mates, high-status positions, food) were not guaranteed to be allocated evenly across group members in hunter-gatherer societies.

Denomination effect: The denomination effect is a form of cognitive bias relating to currency, suggesting people may be less likely to spend larger currency denominations than their equivalent value in smaller denominations. It was proposed by Priya Raghubir, professor at the New York University Stern School of Business, and Joydeep Srivastava, professor at University of Maryland, in their 2009 paper "Denomination Effect".

Magic number 7±2: "The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information" is one of the most highly cited papers in psychology. It was published in 1956 in Psychological Review by the cognitive psychologist George A. Miller of Princeton University's Department of Psychology. It is often interpreted to argue that the number of objects an average human can hold in working memory is 7 ± 2. This is frequently referred to as Miller's law.

We think we know what others are thinking.

In some cases this means we assume they know what we know; in other cases we assume they're thinking about us as much as we're thinking about ourselves. It's basically a case of us modeling their mind after our own (or, in some cases, after a much less complicated mind than our own).

Curse of knowledge: The curse of knowledge is a cognitive bias that occurs when an individual, communicating with other individuals, unknowingly assumes that the others have the background to understand. For example, in a classroom setting, teachers have difficulty teaching novices because they cannot put themselves in the position of the student. A brilliant professor might no longer remember the difficulties that a young student encounters when learning a new subject. This curse of knowledge also explains the danger behind thinking about student learning based on what appears best to faculty members, as opposed to what has been verified with students.

Illusion of transparency: The illusion of transparency is a tendency for people to overestimate the degree to which their personal mental state is known by others. Another manifestation of the illusion of transparency (sometimes called the observer's illusion of transparency) is a tendency for people to overestimate how well they understand others' personal mental states. This cognitive bias is similar to the illusion of asymmetric insight.

Spotlight effect: The spotlight effect is the phenomenon in which people tend to believe they are being noticed more than they really are. Because one is constantly in the center of one's own world, an accurate evaluation of how much one is noticed by others is uncommon. The reason behind the spotlight effect comes from the innate tendency to forget that although one is the center of one's own world, one is not the center of everyone else's. This tendency is especially prominent when one does something atypical.

Illusion of external agency: The illusion of external agency is a set of attributional biases consisting of illusions of influence, insight and benevolence, proposed by Daniel Gilbert, Timothy D. Wilson, Ryan Brown and Elizabeth Pinel.

Illusion of asymmetric insight: The illusion of asymmetric insight is a cognitive bias whereby people perceive their knowledge of others to surpass other people's knowledge of them. This bias "has been traced to people’s tendency to view their own spontaneous or off-the-cuff responses to others' questions as relatively unrevealing even though they view others' similar responses as meaningful".

Extrinsic incentive error: The extrinsic incentives bias is an attributional bias according to which people attribute relatively more to "extrinsic incentives" (such as monetary reward) than to "intrinsic incentives" (such as learning a new skill) when weighing the motives of others rather than themselves.

We project our current mindset and assumptions onto the past and future.

This is magnified by the fact that we're not very good at imagining how quickly or slowly things will happen or change over time.

Hindsight bias: Hindsight bias, also known as the knew-it-all-along effect or creeping determinism, is the inclination, after an event has occurred, to see the event as having been predictable, despite there having been little or no objective basis for predicting it. It is a multifaceted phenomenon that can affect different stages of designs, processes, contexts, and situations. Hindsight bias may cause memory distortion, where the recollection and reconstruction of content can lead to false theoretical outcomes. It has been suggested that the effect can cause extreme methodological problems while trying to analyze, understand, and interpret results in experimental studies. A basic example of the hindsight bias is when, after viewing the outcome of a potentially unforeseeable event, a person believes he or she "knew it all along". Such examples are present in the writings of historians describing outcomes of battles, physicians recalling clinical trials, and in judicial systems trying to attribute responsibility and predictability of accidents.

Declinism: Declinism is the belief that a society or institution is tending towards decline. Particularly, it is the predisposition, possibly due to cognitive bias such as rosy retrospection, to view the past more favourably and the future more negatively. “The great summit of declinism,” according to Adam Gopnik, “was established in 1918, in the book that gave decline its good name in publishing: the German historian Oswald Spengler’s best-selling, thousand-page work 'The Decline of the West.'”

Telescoping effect: In cognitive psychology, the telescoping effect (or telescoping bias) refers to the temporal displacement of an event whereby people perceive recent events as being more remote than they are and distant events as being more recent than they are. The former is known as backward telescoping or time expansion, and the latter is known as forward telescoping. Three years is approximately the time frame in which events switch from being displaced backward in time to forward in time, with events occurring three years in the past being equally likely to be reported with forward telescoping bias as with backward telescoping bias. Although telescoping occurs in both the forward and backward directions, in general the effect is to increase the number of events reported too recently. This net effect in the forward direction occurs because forces that impair memory, such as lack of salience, also impair time perception. Telescoping leads to an over-reporting of the frequency of events. This over-reporting occurs because participants include events beyond the period, either events that are too recent for the target time period (backward telescoping) or events that are too old for the target time period (forward telescoping).

Moral luck: Moral luck describes circumstances whereby a moral agent is assigned moral blame or praise for an action or its consequences even if it is clear that said agent did not have full control over either the action or its consequences. This term, introduced by Bernard Williams, has been developed, along with its significance to a coherent moral theory, by Williams and Thomas Nagel in their respective essays on the subject.

Outcome bias: The outcome bias is an error made in evaluating the quality of a decision when the outcome of that decision is already known. Specifically, the outcome effect occurs when the same "behavior produce[s] more ethical condemnation when it happen[s] to produce bad rather than good outcome, even if the outcome is determined by chance."

Rosy retrospection: Rosy retrospection refers to the psychological phenomenon of people sometimes judging the past disproportionately more positively than they judge the present. Rosy retrospection is very closely related to the concept of nostalgia. The difference between the terms is that rosy retrospection is a cognitive bias, whereas the broader phenomenon of nostalgia is not necessarily based on a biased perspective.

Impact bias: In the psychology of affective forecasting, the impact bias, a form of which is the durability bias, is the tendency for people to overestimate the length or the intensity of future feeling states.

Self-consistency bias: Self-consistency bias is the commonly held idea that we are more consistent in our attitudes, opinions, and beliefs than we actually are, i.e. being unable to see the changes in your thoughts/opinions because you’re sure you’ve always thought the same way.

Planning fallacy: The planning fallacy, first proposed by Daniel Kahneman and Amos Tversky in 1979, is a phenomenon in which predictions about how much time will be needed to complete a future task display an optimism bias and underestimate the time needed.

Time-saving bias: The time-saving bias describes people's tendency to misestimate the time that could be saved (or lost) when increasing (or decreasing) speed. In general, people underestimate the time that could be saved when increasing from a relatively low speed (e.g., 25mph or 40km/h) and overestimate the time that could be saved when increasing from a relatively high speed (e.g., 55mph or 90km/h). People also underestimate the time that could be lost when decreasing from a low speed and overestimate the time that could be lost when decreasing from a high speed.
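
The arithmetic behind the bias is simple: for a fixed distance, the time saved is d/v1 - d/v2, which shrinks rapidly as speeds rise. A quick check with the km/h figures above:

```python
distance = 100  # km, fixed trip length

for v1, v2 in [(40, 50), (90, 100)]:
    # time saved = d/v1 - d/v2, converted from hours to minutes
    saved_minutes = (distance / v1 - distance / v2) * 60
    print(f"{v1} -> {v2} km/h saves {saved_minutes:.0f} minutes")

# 40 -> 50 km/h saves 30 minutes
# 90 -> 100 km/h saves 7 minutes
```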

Pro-innovation bias: In diffusion of innovation theory, a pro-innovation bias is the belief that an innovation should be adopted by whole society without the need of its alteration. The innovation's "champion" has such strong bias in favor of the innovation, that he/she may not see its limitations or weaknesses and continues to promote it nonetheless.

Projection bias: Projection bias is the tendency to falsely project current preferences onto a future event. When people try to estimate their emotional state in the future, they attempt to give an unbiased estimate. However, people's assessments are contaminated by their current emotional state, and thus it may be difficult for them to predict their emotional state in the future, an occurrence known as mental contamination. For example, if a college student is currently in a negative mood because he just found out he failed a test, and he forecasts how much he would enjoy a party two weeks later, his current negative mood may influence his forecast.

Restraint bias: Restraint bias is the tendency for people to overestimate their ability to control impulsive behavior. An inflated self-control belief may lead to greater exposure to temptation, and increased impulsiveness. Therefore, the restraint bias has bearing on addiction. For example, someone might experiment with drugs, simply because they believe they can resist any potential addiction. An individual's inability to control, or their temptation, can come from several different visceral impulses. Visceral impulses can include hunger, sexual arousal, and fatigue. These impulses provide information about the current state and behavior needed to keep the body satisfied.

Pessimism bias: Pessimism bias is an effect in which people exaggerate the likelihood that negative things will happen to them. It contrasts with optimism bias.

Limited Time

We’re constrained by time and information, and yet we can’t let that paralyze us. Without the ability to act fast in the face of uncertainty, we surely would have perished as a species long ago. With every piece of new information, we need to do our best to assess our ability to affect the situation, apply it to decisions, simulate the future to predict what might happen next, and otherwise act on our new insight.

In order to act, we need to be confident in our ability to make an impact and to feel like what we do is important.

In reality, most of this confidence can be classified as overconfidence, but without it we might not act at all.

Overconfidence effect: The overconfidence effect is a well-established bias in which a person's subjective confidence in his or her judgements is reliably greater than the objective accuracy of those judgements, especially when confidence is relatively high. Overconfidence is one example of a miscalibration of subjective probabilities. Throughout the research literature, overconfidence has been defined in three distinct ways: (1) overestimation of one's actual performance; (2) overplacement of one's performance relative to others; and (3) overprecision in expressing unwarranted certainty in the accuracy of one's beliefs.

Egocentric bias: Egocentric bias is the tendency to rely too heavily on one's own perspective and/or have a higher opinion of oneself than reality. It appears to be the result of the psychological need to satisfy one's ego and to be advantageous for memory consolidation. Research has shown that experiences, ideas, and beliefs are more easily recalled when they match one's own, causing an egocentric outlook. Egocentric bias is referred to by most psychologists as a general umbrella term under which other related phenomena fall.

Optimism bias: Optimism bias (also known as unrealistic or comparative optimism) is a cognitive bias that causes a person to believe that they are at a lesser risk of experiencing a negative event compared to others. Optimism bias is quite common and transcends gender, race, nationality and age. Optimistic biases are even reported in non-human animals such as rats and birds.

Social desirability bias: In social science research, social desirability bias is a type of response bias that is the tendency of survey respondents to answer questions in a manner that will be viewed favorably by others. It can take the form of over-reporting "good" behavior or under-reporting "bad" or undesirable behavior. The tendency poses a serious problem with conducting research with self-reports, especially questionnaires. This bias interferes with the interpretation of average tendencies as well as individual differences.

Third-person effect: The third-person effect hypothesis predicts that people tend to perceive that mass media messages have a greater effect on others than on themselves, based on personal biases. The third-person effect manifests itself through an individual’s overestimation of the effect of a mass communicated message on the generalized other, or an underestimation of the effect of a mass communicated message on themselves.

Forer effect, Barnum effect: The Barnum effect, also called the Forer effect, is a common psychological phenomenon whereby individuals give high accuracy ratings to descriptions of their personality that supposedly are tailored specifically to them but are in fact vague and general enough to apply to a wide range of people. This effect can provide a partial explanation for the widespread acceptance of some paranormal beliefs and practices, such as astrology, fortune telling, aura reading, and some types of personality tests.

Illusion of control: The illusion of control is the tendency for people to overestimate their ability to control events; for example, it occurs when someone feels a sense of control over outcomes that they demonstrably do not influence. The effect was named by psychologist Ellen Langer and has been replicated in many different contexts. It is thought to influence gambling behavior and belief in the paranormal. Along with illusory superiority and optimism bias, the illusion of control is one of the positive illusions.

False consensus effect: In psychology, the false-consensus effect or false-consensus bias is an attributional type of cognitive bias whereby people tend to overestimate the extent to which their opinions, beliefs, preferences, values, and habits are normal and typical of those of others (i.e., that others also think the same way that they do). This cognitive bias tends to lead to the perception of a consensus that does not exist, a "false consensus".

Dunning-Kruger effect: In the field of psychology, the Dunning–Kruger effect is a cognitive bias in which people of low ability have illusory superiority and mistakenly assess their cognitive ability as greater than it is. The cognitive bias of illusory superiority comes from the inability of low-ability people to recognize their lack of ability; without the self-awareness of metacognition, low-ability people cannot objectively evaluate their actual competence or incompetence. On the other hand, people of high ability incorrectly assume that tasks that are easy for them are also easy for other people.

Hard-easy effect: The hard–easy effect is a cognitive bias that manifests itself as a tendency to overestimate the probability of one's success at a task perceived as hard, and to underestimate the likelihood of one's success at a task perceived as easy. The hard-easy effect takes place, for example, when individuals exhibit a degree of underconfidence in answering relatively easy questions and a degree of overconfidence in answering relatively difficult questions. "Hard tasks tend to produce overconfidence but worse-than-average perceptions," reported Katherine A. Burson, Richard P. Larrick, and Jack B. Soll in a 2005 study, "whereas easy tasks tend to produce underconfidence and better-than-average effects."

Illusory superiority: In the field of social psychology, illusory superiority is a condition of cognitive bias whereby a person overestimates their own qualities and abilities, in relation to the same qualities and abilities of other persons. Illusory superiority is one of many positive illusions, relating to the self, that are evident in the study of intelligence, the effective performance of tasks and tests, and the possession of desirable personal characteristics and personality traits.

Lake Wobegon effect: The Lake Wobegon effect, a natural human tendency to overestimate one's capabilities, was named by David Myers in honour of the fictional town. The characterization of the fictional location, where "all the women are strong, all the men are good looking, and all the children are above average," has been used to describe a real and pervasive human tendency to overestimate one's achievements and capabilities in relation to others.

Self-serving bias: A self-serving bias is any cognitive or perceptual process that is distorted by the need to maintain and enhance self-esteem, or the tendency to perceive oneself in an overly favorable manner. It is the belief that individuals tend to ascribe success to their own abilities and efforts, but ascribe failure to external factors. When individuals reject the validity of negative feedback, focus on their strengths and achievements but overlook their faults and failures, or take more responsibility for their group's work than they give to other members, they are protecting their ego from threat and injury. These cognitive and perceptual tendencies perpetuate illusions and error, but they also serve the self's need for esteem. For example, a student who attributes earning a good grade on an exam to their own intelligence and preparation but attributes earning a poor grade to the teacher's poor teaching ability or unfair test questions might be exhibiting the self-serving bias. Studies have shown that similar attributions are made in various situations, such as the workplace, interpersonal relationships, sports, and consumer decisions.

Actor-observer bias, Fundamental attribution error: In social psychology, fundamental attribution error (FAE), also known as correspondence bias or attribution effect, is the concept that, in contrast to interpretations of their own behavior, people tend to (unduly) emphasize the agent's internal characteristics (character or intention), rather than external factors, in explaining other people's behavior. This effect has been described as "the tendency to believe that what people do reflects who they are".

Defensive attribution hypothesis: The defensive attribution hypothesis (or bias, theory, or simply defensive attribution) is a social psychological term from the attributional approach, referring to a set of beliefs used as a shield against the fear that one will be the victim or cause of a serious mishap.

Trait ascription bias: Trait ascription bias is the tendency for people to view themselves as relatively variable in terms of personality, behavior and mood while viewing others as much more predictable in their personal traits across different situations. More specifically, it is a tendency to describe one's own behaviour in terms of situational factors while preferring to describe another's behaviour by ascribing fixed dispositions to their personality. This may occur because people's own internal states are more readily observable and available to them than those of others.

Effort justification: Effort justification is an idea and paradigm in social psychology stemming from Leon Festinger's theory of cognitive dissonance. Effort justification is a person's tendency to attribute greater value to an outcome they had to put effort into achieving than the outcome's objective value would warrant.

Risk compensation, Peltzman effect: Risk compensation is a theory which suggests that people typically adjust their behavior in response to the perceived level of risk, becoming more careful where they sense greater risk and less careful if they feel more protected. Although usually small in comparison to the fundamental benefits of safety interventions, it may result in a lower net benefit than expected.

In order to stay focused, we favor the immediate, relatable thing in front of us over the delayed and distant.

We value stuff more in the present than in the future, and relate more to stories of specific individuals than anonymous individuals or groups. I’m surprised there aren’t more biases found under this one, considering how much it impacts how we think about the world.

Hyperbolic discounting: In economics, hyperbolic discounting is a time-inconsistent model of delay discounting: people discount rewards more steeply over short delays than long ones, so given a choice between a smaller-sooner and a larger-later reward, they often prefer the larger one while both are far off, then switch to the smaller one as it becomes imminent. It is one of the cornerstones of behavioral economics.
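
To make the time inconsistency concrete, here is a minimal sketch of the standard one-parameter hyperbolic form V = A / (1 + kD); the reward amounts and the rate k = 0.2 are invented for illustration, not taken from the text:

# Hyperbolic discounting: perceived value V = A / (1 + k * D),
# where A is the reward, D the delay in days, and k an illustrative rate.
def hyperbolic_value(amount, delay_days, k=0.2):
    return amount / (1 + k * delay_days)

# Viewed from today, $110 in 31 days beats $100 in 30 days...
print(hyperbolic_value(110, 31) > hyperbolic_value(100, 30))  # True
# ...but when day 30 arrives, the immediate $100 wins: a preference reversal.
print(hyperbolic_value(100, 0) > hyperbolic_value(110, 1))    # True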

Appeal to novelty: The appeal to novelty (also called argumentum ad novitatem) is a fallacy in which one prematurely claims that an idea or proposal is correct or superior, exclusively because it is new and modern. In a controversy between the status quo and new inventions, an appeal to novelty argument is not in itself a valid argument. The fallacy may take two forms: overestimating the new and modern, prematurely and without investigation assuming it to be best-case, or underestimating the status quo, prematurely and without investigation assuming it to be worst-case.

Identifiable victim effect: The "identifiable victim effect" refers to the tendency of individuals to offer greater aid when a specific, identifiable person ("victim") is observed under hardship, as compared to a large, vaguely defined group with the same need. The effect is also observed when subjects administer punishment rather than reward. Research has shown that individuals can be more likely to mete out punishment, even at their own expense, when they are punishing specific, identifiable individuals ("perpetrators").

In order to get anything done, we’re motivated to complete things that we’ve already invested time and energy in.

The behavioral economist’s version of Newton’s first law of motion: an object in motion stays in motion. This helps us finish things, even if we come across more and more reasons to give up.

Sunk cost fallacy: In economics and business decision-making, a sunk cost is a cost that has already been incurred and cannot be recovered (also known as retrospective cost). The fallacy is the tendency to let such unrecoverable costs influence present decisions, continuing an endeavor because of what has already been invested rather than because continuing is the best option going forward.
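
A minimal sketch of the rational rule; the concert scenario and dollar figures are invented for illustration:

def should_continue(future_benefit, future_cost):
    # A rational decision weighs only future benefits against future costs;
    # money already spent is the same under either choice, so it drops out.
    return future_benefit > future_cost

# You paid $100 (non-refundable) for a concert you now dread.
# Only the evening itself should count, not the $100 already gone:
print(should_continue(future_benefit=10, future_cost=40))  # False: stay home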

Irrational escalation, Escalation of commitment: Escalation of commitment is a human behavior pattern in which an individual or group facing increasingly negative outcomes from some decision, action, or investment nevertheless continues the same behavior rather than alter course. The actor maintains behaviors that are irrational, but align with previous decisions and actions.

Loss aversion: In cognitive psychology and decision theory, loss aversion refers to people's tendency to prefer avoiding losses to acquiring equivalent gains: it is better to not lose $5 than to find $5. The principle is very prominent in the domain of economics. What distinguishes loss aversion from risk aversion is that the utility of a monetary payoff depends on what was previously experienced or was expected to happen. Some studies have suggested that losses are twice as powerful, psychologically, as gains. Loss aversion was first identified by Amos Tversky and Daniel Kahneman.
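
A sketch of the asymmetry using a simple piecewise value function; the linear shape is a simplification of Kahneman and Tversky's curve, and the factor of 2 echoes the "twice as powerful" finding cited above:

def subjective_value(outcome, loss_factor=2.0):
    # Gains count at face value; losses are amplified by the loss-aversion factor.
    return outcome if outcome >= 0 else loss_factor * outcome

# A fair coin flip that wins $5 or loses $5 has monetary expectation 0,
# but subjectively it feels like a losing bet:
print(0.5 * subjective_value(5) + 0.5 * subjective_value(-5))  # -2.5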

IKEA effect, Processing difficulty effect: The IKEA effect is a cognitive bias in which consumers place a disproportionately high value on products they partially created. The name derives from the name of Swedish manufacturer and furniture retailer IKEA, which sells many furniture products that require assembly.

Generation effect: The generation effect is a phenomenon where information is better remembered if it is generated from one's own mind rather than simply read. Researchers have struggled to account for why generated information is better recalled than read information, but no single explanation has been sufficient.

Zero-risk bias: Zero-risk bias is a tendency to prefer the complete elimination of a risk even when alternative options produce a greater overall reduction in risk. This effect on decision making has been observed in surveys presenting hypothetical scenarios, and certain real-world policies (e.g. a war against terrorism, as opposed to reducing the risk of traffic accidents or gun violence) have been interpreted as being influenced by it.
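
A worked comparison; the percentages are invented for illustration:

# Option A: eliminate a small risk entirely (1% -> 0%).
# Option B: halve a larger risk (10% -> 5%).
reduction_a = 0.01 - 0.00
reduction_b = 0.10 - 0.05
print(reduction_b > reduction_a)  # True: B removes five times as much risk,
# yet zero-risk bias predicts that many people will still choose A.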

Disposition effect: The disposition effect is an anomaly discovered in behavioral finance. It relates to the tendency of investors to sell shares whose price has increased, while keeping assets that have dropped in value.

Unit bias: Unit bias is the tendency for individuals to want to complete a unit of a given item or task. People want to finish whatever portion they have, no matter the size; the perception of completion is itself satisfying.

Pseudocertainty effect: In prospect theory, the pseudocertainty effect is the tendency for people to perceive an outcome as certain while it is actually uncertain. It can be observed in multi-stage decision making, in which evaluation of the certainty of the outcome in a previous stage of decisions is disregarded when selecting an option in subsequent stages. Not to be confused with certainty effect, the pseudocertainty effect was discovered from an attempt at providing a normative use of decision theory for the certainty effect by relaxing the cancellation rule.

Endowment effect: In psychology and behavioral economics, the endowment effect (also known as divestiture aversion and related to the mere ownership effect in social psychology) is the hypothesis that people ascribe more value to things merely because they own them.

Backfire effect: The backfire effect is a name for the finding that, given evidence against their beliefs, people can reject the evidence and believe even more strongly. Subsequent research has often failed to replicate the effect, suggesting it may be less common than originally reported.

In order to avoid mistakes, we’re motivated to preserve our autonomy and status in a group, and to avoid irreversible decisions.

If we must choose, we tend to choose the option that is perceived as the least risky or that preserves the status quo. Better the devil you know than the devil you do not.

System justification, Reactance: System justification theory (SJT) is a theory within social psychology that system-justifying beliefs serve a psychologically palliative function. It proposes that people have several underlying needs, which vary from individual to individual, that can be satisfied by the defense and justification of the status quo, even when the system may be disadvantageous to certain people. People have epistemic, existential, and relational needs that are met by and manifest as ideological support for the prevailing structure of social, economic, and political norms. Need for order and stability, and thus resistance to change or alternatives, for example, can be a motivator for individuals to see the status quo as good, legitimate, and even desirable.

Reverse psychology: Reverse psychology is a technique involving the advocacy of a belief or behavior that is opposite to the one desired, with the expectation that this approach will encourage the subject of the persuasion to do what actually is desired: the opposite of what is suggested. This technique relies on the psychological phenomenon of reactance, in which a person has a negative emotional reaction to being persuaded, and thus chooses the option which is being advocated against. The one being manipulated is usually unaware of what is really going on.

Decoy effect: In marketing, the decoy effect (or attraction effect or asymmetric dominance effect) is the phenomenon whereby consumers will tend to have a specific change in preference between two options when also presented with a third option that is asymmetrically dominated. An option is asymmetrically dominated when it is inferior in all respects to one option; but, in comparison to the other option, it is inferior in some respects and superior in others. In other words, in terms of specific attributes determining preferences, it is completely dominated by (i.e., inferior to) one option and only partially dominated by the other. When the asymmetrically dominated option is present, a higher percentage of consumers will prefer the dominating option than when the asymmetrically dominated option is absent. The asymmetrically dominated option is therefore a decoy serving to increase preference for the dominating option. The decoy effect is also an example of the violation of the independence of irrelevant alternatives axiom of decision theory.
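
A sketch of asymmetric dominance; the subscription tiers, prices, and storage sizes are invented for illustration:

# Each option is (price in dollars, storage in GB): lower price and more storage are better.
options = {
    "A": (10, 50),       # cheap, small
    "B": (25, 200),      # expensive, large
    "decoy": (27, 150),  # worse than B on both attributes, but not worse than A on both
}

def dominates(x, y):
    # x dominates y if x is at least as good on both attributes and strictly better on one.
    px, gx = x
    py, gy = y
    return px <= py and gx >= gy and (px < py or gx > gy)

print(dominates(options["B"], options["decoy"]))  # True: the decoy is dominated by B
print(dominates(options["A"], options["decoy"]))  # False: but not by A
# Adding the decoy tends to shift real choices toward B, the dominating option.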

Social comparison bias: Social comparison bias is the tendency to feel dislike and competitiveness toward someone who is seen as physically or mentally better than oneself.

Status quo bias: Status quo bias is an emotional bias; a preference for the current state of affairs. The current baseline (or status quo) is taken as a reference point, and any change from that baseline is perceived as a loss. Status quo bias should be distinguished from a rational preference for the status quo ante, as when the current state of affairs is objectively superior to the available alternatives, or when imperfect information is a significant problem. A large body of evidence, however, shows that status quo bias frequently affects human decision-making.

We favor options that appear simple or that have more complete information over more complex, ambiguous options.

We’d rather do the quick, simple thing than the important complicated thing, even if the important complicated thing is ultimately a better use of time and energy.

Ambiguity bias: The ambiguity effect is a cognitive bias where decision making is affected by a lack of information, or "ambiguity". The effect implies that people tend to select options for which the probability of a favorable outcome is known, over an option for which the probability of a favorable outcome is unknown. The effect was first described by Daniel Ellsberg in 1961.
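
Ellsberg's two-urn setup can be simulated; the urn sizes and the uniform prior over the unknown mix are modeling assumptions of this sketch, not claims from the text:

import random

def draw_known():
    # Urn A: exactly 50 red and 50 black balls.
    return random.choice(["red"] * 50 + ["black"] * 50)

def draw_ambiguous():
    # Urn B: 100 balls in an unknown red/black mix, modeled here
    # as a uniform prior over all possible compositions.
    n_red = random.randint(0, 100)
    return random.choice(["red"] * n_red + ["black"] * (100 - n_red))

trials = 100_000
p_known = sum(draw_known() == "red" for _ in range(trials)) / trials
p_ambiguous = sum(draw_ambiguous() == "red" for _ in range(trials)) / trials
print(round(p_known, 2), round(p_ambiguous, 2))  # both ~0.5, yet most people bet on urn A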

Information bias: Information bias is a cognitive bias to seek information when it does not affect action. People can often make better predictions or choices with less information: more information is not always better. An example of information bias is believing that the more information that can be acquired to make a decision, the better, even if that extra information is irrelevant for the decision.

Belief bias: Belief bias is the tendency to judge the strength of arguments based on the plausibility of their conclusion rather than how strongly they support that conclusion. People are more likely to accept arguments that support a conclusion aligned with their values, beliefs and prior knowledge, while rejecting counterarguments to it. Belief bias is an extremely common and therefore significant form of error; we can easily be blinded by our beliefs and reach the wrong conclusion. Belief bias has been found to influence various reasoning tasks, including conditional reasoning, relation reasoning and transitive reasoning.

Rhyme as reason effect: The rhyme-as-reason effect (or Eaton-Rosen phenomenon) is a cognitive bias whereupon a saying or aphorism is judged as more accurate or truthful when it is rewritten to rhyme.

Bike-shedding effect, Law of Triviality: Parkinson's law of triviality is C. Northcote Parkinson's 1957 argument that members of an organization give disproportionate weight to trivial issues. Parkinson provides the example of a fictional committee whose job was to approve the plans for a nuclear power plant spending the majority of its time on discussions about relatively minor but easy-to-grasp issues, such as what materials to use for the staff bike shed, while neglecting the proposed design of the plant itself, which is far more important and a far more difficult and complex task.

Delmore effect: The Delmore effect is our tendency to provide more articulate and explicit goals for lower priority areas of our lives. It appears that the daunting nature of truly important goals may motivate the self to deflect this anxiety by attending to less important, but also less threatening goals.

Conjunction fallacy: The conjunction fallacy (also known as the Linda problem) is a formal fallacy that occurs when it is assumed that specific conditions are more probable than a single general one.
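
The rule being violated is that a conjunction can never be more probable than either of its conjuncts: P(A and B) = P(A) * P(B given A) <= P(A). A tiny numeric check against the Linda problem, with invented probabilities:

p_teller = 0.05          # P(A): Linda is a bank teller
p_feminist_given = 0.90  # P(B given A): feminist, given she is a teller
p_both = p_teller * p_feminist_given
print(round(p_both, 3), p_both <= p_teller)  # 0.045 True: the inequality holds for any probabilities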

Occam’s razor: Occam's razor (also Ockham's razor or Ocham's razor; Latin: lex parsimoniae, "law of parsimony") is the problem-solving principle that, when presented with competing hypotheses to solve a problem, one should select the solution with the fewest assumptions; the simpler explanation is to be preferred until evidence demands otherwise. The idea is attributed to William of Ockham (c. 1287–1347), an English Franciscan friar, scholastic philosopher, and theologian.

Less-is-better effect: The less-is-better effect is a type of preference reversal that occurs when the lesser or smaller alternative of a proposition is preferred when the options are evaluated separately, but not when they are evaluated together. The term was first proposed by Christopher Hsee.

Limited Memory

There’s too much information in the universe. We can only afford to keep around the bits that are most likely to prove useful in the future. We need to make constant bets and trade-offs around what we try to remember and what we forget. For example, we prefer generalizations over specifics because they take up less space. When there are lots of irreducible details, we pick out a few standout items to save and discard the rest. What we save here is what is most likely to inform our filters related to problem 1’s information overload, as well as inform what comes to mind during the processes mentioned in problem 2 around filling in incomplete information. It’s all self-reinforcing.

We edit and reinforce some memories after the fact.

During that process, memories can become stronger; however, various details can also get accidentally swapped. We sometimes accidentally inject a detail into the memory that wasn’t there before.

Misattribution of memory, Source confusion: Memory plays an important role in a number of aspects of our everyday lives and allows us to recall past experiences, navigate our environments, and learn new tasks. Information about the source of a memory reflects the conditions under which the memory was acquired, and the accuracy of its recall varies with the circumstances under which it is retrieved. Misattribution of memory refers to remembering information correctly while being wrong about the source of that information.

Cryptomnesia: Cryptomnesia occurs when a forgotten memory returns without it being recognized as such by the subject, who believes it is something new and original. It is a memory bias whereby a person may falsely recall generating a thought, an idea, a tune, or a joke, not deliberately engaging in plagiarism but rather experiencing a memory as if it were a new inspiration.

False memory: In psychiatry, confabulation (verb: confabulate) is a disturbance of memory, defined as the production of fabricated, distorted, or misinterpreted memories about oneself or the world, without the conscious intention to deceive. People who confabulate present incorrect memories ranging from "subtle alterations to bizarre fabrications", and are generally very confident about their recollections, despite contradictory evidence.

Suggestibility: Suggestibility is the quality of being inclined to accept and act on the suggestions of others, such that when false but plausible information is given, one fills in the gaps of certain memories with that information when recalling a scenario or moment. Suggestibility uses cues to distort recollection: when the subject has been persistently told something about a past event, his or her memory of the event conforms to the repeated message.

Spacing effect: The spacing effect is the phenomenon whereby learning is greater when studying is spread out over time, as opposed to studying the same amount of content in a single session. That is, it is better to use spaced presentation rather than massed presentation. Practically, this effect suggests that "cramming" (intense, last-minute studying) the night before an exam is not likely to be as effective as studying at intervals over a longer time frame. It is important to note, however, that the benefit of spaced presentations does not appear at short retention intervals, in which massed presentations tend to lead to better memory performance. This effect is a desirable difficulty; it challenges the learner but leads to better learning in the long run.
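
A toy schedule in the spirit of spaced repetition; the one-day starting gap and the doubling multiplier are arbitrary illustrations, not values from the text:

from datetime import date, timedelta

def review_schedule(start, sessions=5, first_gap_days=1, multiplier=2):
    # Each review is scheduled after an expanding gap: 1, 2, 4, 8, ... days.
    day, gap = start, first_gap_days
    for _ in range(sessions):
        day += timedelta(days=gap)
        yield day
        gap *= multiplier

for d in review_schedule(date(2024, 1, 1)):
    print(d)  # 2024-01-02, 2024-01-04, 2024-01-08, 2024-01-16, 2024-02-01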

We discard specifics to form generalities.

We do this out of necessity, but the impact of implicit associations, stereotypes, and prejudice results in some of the most glaringly bad consequences from our full set of cognitive biases.

Implicit associations, Implicit stereotypes, Stereotypical bias: An implicit bias, or implicit stereotype, is the unconscious attribution of particular qualities to a member of a certain social group.

Prejudice: Prejudice, or bigotry, is an affective feeling towards a person or group member based solely on that person's group membership. The word is often used to refer to preconceived, usually unfavorable, feelings towards people or a person because of their sex, gender, beliefs, values, social class, age, disability, religion, sexuality, race/ethnicity, language, nationality, beauty, occupation, education, criminality, sport team affiliation or other personal characteristics. In this case, it refers to a positive or negative evaluation of another person based on that person's perceived group membership.

Negativity bias: Covered above under Noisy Data: even when of equal intensity, things of a more negative nature have a greater effect on one's psychological state and processes than neutral or positive things.

Fading affect bias: The fading affect bias, more commonly known as FAB, is a psychological phenomenon in which information regarding negative emotions tends to be forgotten more quickly than that associated with pleasant emotions. Although there have been some contradictory findings regarding the presence of FAB, it has been largely found to be real.

We reduce events and lists to their key elements.

It’s difficult to reduce events and lists to generalities, so instead we pick out a few items to represent the whole.

Peak–end rule: The peak–end rule is a psychological heuristic in which people judge an experience largely based on how they felt at its peak (i.e., its most intense point) and at its end, rather than based on the total sum or average of every moment of the experience. The effect occurs regardless of whether the experience is pleasant or unpleasant. According to the heuristic, other information aside from that of the peak and end of the experience is not lost, but it is not used. This includes net pleasantness or unpleasantness and how long the experience lasted. The peak–end rule is thereby a specific form of the more general extension neglect and duration neglect.
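
A sketch contrasting the peak–end summary with totals; the minute-by-minute discomfort ratings are invented for illustration:

# Discomfort ratings per minute for two procedures (higher = worse).
short_procedure = [2, 4, 8]          # ends at its peak
longer_procedure = [2, 4, 8, 5, 3]   # same peak, but tapers off at the end

def peak_end(ratings):
    # The heuristic: average of the most intense moment and the final moment.
    return (max(ratings) + ratings[-1]) / 2

print(peak_end(short_procedure), peak_end(longer_procedure))  # 8.0 5.5
print(sum(longer_procedure) > sum(short_procedure))           # True: more total discomfort,
# yet the rule predicts the longer procedure is remembered as less unpleasant.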

Leveling and sharpening: Leveling and sharpening are two automatic functions of memory. Sharpening is the tendency to remember and emphasize vivid or distinctive details when retelling stories one has experienced. Leveling is the tendency to leave out or tone down other parts of those stories, which makes the resulting gaps in memory easier to fill in.

Misinformation effect: The misinformation effect happens when a person's recall) of episodic memories becomes less accurate because of post-event information. For example, in a study published in 1994, subjects were initially shown one of two different series of slides that depicted a college student at the university bookstore, with different objects of the same type changed in some slides. One version of the slides would, for example, show a screwdriver while the other would show a wrench, and the audio narrative accompanying the slides would only refer to the object as a "tool". In the second phase, subjects would read a narrative description of the events in the slides, except this time a specific tool was named, which would be the incorrect tool half the time. Finally, in the third phase, subjects had to list five examples of specific types of objects, such as tools, but were told to only list examples which they had not seen in the slides. Subjects who had read an incorrect narrative were far less likely to list the written object (which they hadn't actually seen) than the control subjects (28% vs. 43%), and were far more likely to incorrectly list the item which they had actually seen (33% vs. 26%).

Serial position effect: Serial-position effect is the tendency of a person to recall the first and last items in a series best, and the middle items worst. The term was coined by Hermann Ebbinghaus through studies he performed on himself, and refers to the finding that recall accuracy varies as a function of an item's position within a study list. When asked to recall a list of items in any order (free recall), people tend to begin recall with the end of the list, recalling those items best (the recency effect). Among earlier list items, the first few items are recalled more frequently than the middle items (the primacy effect).

Recency effect: When asked to recall a list of items in any order (free recall), people tend to begin recall with the end of the list, recalling those items best (the recency effect).

Suffix effect: People often have to recall a series of items in order, such as a phone number. When the list of items is heard (as opposed to read silently), people usually are very good at remembering the final list item. However, if the list is followed by an irrelevant item (the suffix), recall of the final item is substantially impaired. This impairment is called the suffix effect.

Memory inhibition, Part-list cueing effect: In psychology, memory inhibition is the ability not to remember irrelevant information. The scientific concept of memory inhibition should not be confused with everyday uses of the word "inhibition". Scientifically speaking, memory inhibition is a type of cognitive inhibition, which is the stopping or overriding of a mental process, in whole or in part, with or without intention.

Primacy effect: The primacy effect, in psychology and sociology, is a cognitive bias that results in a subject recalling primary information presented better than information presented later on. For example, a subject who reads a sufficiently long list of words is more likely to remember words toward the beginning than words in the middle.

Modality effect: The modality effect is a term used in experimental psychology, most often in the fields dealing with memory and learning, to refer to how learner performance depends on the presentation mode of studied items. For example, the final items of a list tend to be recalled better when the list is heard than when it is read silently.

Serial recall effect, List-length effect: The list-length effect refers to the way the ability to recall items in the correct order decreases as the length of the list or sequence increases.

Duration neglect: Duration neglect is the psychological observation that people's judgments of the unpleasantness of painful experiences depend very little on the duration of those experiences. Multiple experiments have found that these judgments tend to be affected by two factors: the peak (when the experience was the most painful) and how quickly the pain diminishes. If it diminishes more slowly, the experience is judged to be more painful. Hence, the term "peak–end rule" describes this process of evaluation.

We store memories differently based on how they were experienced.

Our brains encode only the information they deem important at the time, but this decision can be affected by other circumstances (what else is happening, how the information presents itself, whether we can easily find it again if we need to, etc.) that have little to do with the information’s value.

Levels of processing effect, Testing effect: The levels-of-processing effect, identified by Fergus I. M. Craik and Robert S. Lockhart in 1972, describes memory recall of stimuli as a function of the depth of mental processing. Deeper levels of analysis produce more elaborate, longer-lasting, and stronger memory traces than shallow levels of analysis. Depth of processing falls on a shallow to deep continuum. Shallow processing (e.g., processing based on phonemic and orthographic components) leads to a fragile memory trace that is susceptible to rapid decay. Conversely, deep processing (e.g., semantic processing) results in a more durable memory trace.

Tip of the tongue phenomenon: Tip of the tongue (or TOT) is the phenomenon of failing to retrieve a word from memory, combined with partial recall and the feeling that retrieval is imminent. The phenomenon's name comes from the saying, "It's on the tip of my tongue." The tip of the tongue phenomenon reveals that lexical access occurs in stages.

Google effect: The Google effect, also called digital amnesia, is the tendency to forget information that can be found readily online by using Internet search engines such as Google. According to the first study about the Google effect, people are less likely to remember certain details they believe will be accessible online. However, the study also claims that people's ability to learn information offline remains the same. This effect may also be seen as a change to what information and what level of detail is considered important to remember.

Absent-mindedness: Absent-mindedness is where a person shows inattentive or forgetful behavior. It can have three different causes: a low level of attention ("blanking" or "zoning out"); intense attention to a single object of focus (hyperfocus) that makes a person oblivious to events around them; or unwarranted distraction of attention from the object of focus by irrelevant thoughts or environmental events.

Next-in-line effect: The next-in-line effect is impaired recall for an event immediately preceding an anticipated public performance; for example, people taking turns speaking in a group tend to forget what the person just before them said.