@sundarj
Last active November 12, 2018 14:28

Lee Ross and fellow psychologist Andrew Ward have outlined three interrelated assumptions, or "tenets," that make up naïve realism. They argue that these assumptions are supported by a long line of thinking in social psychology, along with several empirical studies. According to their model, people:

  • Believe that they see the world objectively and without bias.
  • Expect that others will come to the same conclusions, so long as they are exposed to the same information and interpret it in a rational manner.
  • Assume that others who do not share the same views must be ignorant, irrational, or biased.

Functional fixedness is a cognitive bias that limits a person to using an object only in the way it is traditionally used. The concept originated in Gestalt psychology, a movement in psychology that emphasizes holistic processing. Karl Duncker defined functional fixedness as a "mental block against using an object in a new way that is required to solve a problem". This "block" limits an individual's ability to use the components given to them to complete a task, because they cannot move past the original purpose of those components. For example, someone who needs a paperweight but has only a hammer may not see how the hammer could serve as one. Functional fixedness is this inability to see the hammer as anything other than a tool for pounding nails; the person cannot think to use it in anything but its conventional function.


The mere-exposure effect is a psychological phenomenon by which people tend to develop a preference for things merely because they are familiar with them. In social psychology, this effect is sometimes called the familiarity principle. The effect has been demonstrated with many kinds of things, including words, Chinese characters, paintings, pictures of faces, geometric figures, and sounds.


The normalcy bias, or normality bias, is a belief people hold when facing a disaster. It causes people to underestimate both the likelihood of a disaster and its possible effects, because they believe that things will always function the way they normally have. This may result in people failing to prepare adequately for disasters and, on a larger scale, in governments failing to include the populace in their disaster preparations. About 70% of people reportedly display normalcy bias during disasters. Journalist Amanda Ripley identified common response patterns of people in disasters and found three phases of response: Denial, Deliberation, and the Decisive Moment. The faster people can get through the Denial and Deliberation phases, the quicker they will reach the Decisive Moment and begin to take action.


Selective perception is the process by which individuals perceive what they want to in media messages while ignoring opposing viewpoints. More broadly, it names the tendency all people exhibit to "see things" through their particular frame of reference. It also describes how we categorize and interpret sensory information in a way that favors one category or interpretation over another. In other words, selective perception is a form of bias because we interpret information in a way that is congruent with our existing values and beliefs. Psychologists believe this process occurs automatically. Selective perception may refer to any number of cognitive biases related to the way expectations affect perception. Human judgment and decision making are distorted by an array of cognitive, perceptual, and motivational biases, and people tend not to recognise their own biases, though they readily recognise (and even overestimate) the operation of bias in the judgments of others. One reason this might occur is that people are simply bombarded with too many stimuli every day to pay equal attention to everything, so they pick and choose according to their own needs.


In the social sciences, framing comprises a set of concepts and theoretical perspectives on how individuals, groups, and societies organize, perceive, and communicate about reality. Framing involves the social construction of a social phenomenon – by mass media sources, political or social movements, political leaders, or other actors and organizations. Participation in a language community necessarily influences an individual's perception of the meanings attributed to words or phrases. Politically, the language communities of advertising, religion, and mass media are highly contested, whereas framing in less sharply defended language communities might evolve imperceptibly and organically over cultural time frames, with fewer overt modes of disputation. Framing can manifest in thought or in interpersonal communication. Frames in thought consist of the mental representations, interpretations, and simplifications of reality. Frames in communication consist of the communication of frames between different actors.


Confirmation bias, also called confirmatory bias or myside bias, is the tendency to search for, interpret, favor, and recall information in a way that confirms one's preexisting beliefs or hypotheses. It is a type of cognitive bias and a systematic error of inductive reasoning. People display this bias when they gather or remember information selectively, or when they interpret it in a biased way. The effect is stronger for emotionally charged issues and for deeply entrenched beliefs. Confirmation bias is a variation of the more general tendency toward apophenia.


Status quo bias is an emotional bias; a preference for the current state of affairs. The current baseline (or status quo) is taken as a reference point, and any change from that baseline is perceived as a loss. Status quo bias should be distinguished from a rational preference for the status quo ante, as when the current state of affairs is objectively superior to the available alternatives, or when imperfect information is a significant problem. A large body of evidence, however, shows that status quo bias frequently affects human decision-making.


Survivorship bias or survival bias is the logical error of concentrating on the people or things that made it past some selection process and overlooking those that did not, typically because of their lack of visibility. This can lead to false conclusions in several different ways. It is a form of selection bias.


Survivorship bias can lead to overly optimistic beliefs because failures are ignored, such as when companies that no longer exist are excluded from analyses of financial performance. It can also lead to the false belief that the successes in a group have some special property, rather than being mere coincidence (mistaking correlation for causation). For example, if three of the five students with the best college grades went to the same high school, that can lead one to believe that the high school must offer an excellent education. This could be true, but the question cannot be answered without looking at the grades of all the other students from that high school, not just the ones who "survived" the top-five selection process.
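The high-school example can be made concrete with a small sketch. The grade data below is entirely hypothetical: it is constructed so that school A supplies three of the top five students, even though school B's students do better overall — exactly the conclusion the "survivors" alone would hide.

```python
# Hypothetical students as (school, grade) pairs: two schools, seven students each.
students = [
    ("A", 95), ("A", 94), ("A", 93), ("A", 60),
    ("A", 58), ("A", 55), ("A", 50),
    ("B", 96), ("B", 92), ("B", 85), ("B", 84),
    ("B", 83), ("B", 82), ("B", 80),
]

# The "survivors": only the five highest-graded students are visible.
top5 = sorted(students, key=lambda s: s[1], reverse=True)[:5]
top5_from_a = sum(1 for school, _ in top5 if school == "A")
print(top5_from_a)  # → 3: school A dominates the top five

# The full picture: mean grade per school, including the students
# who did not survive the top-five selection.
averages = {
    school: sum(g for s, g in students if s == school)
    / sum(1 for s, _ in students if s == school)
    for school in ("A", "B")
}
print(round(averages["A"], 1), round(averages["B"], 1))  # → 72.1 86.0
```

Judged only by the survivors, school A looks superior; judged by every student, school B's average (86.0) is well above school A's (72.1). The bias lies in drawing the first conclusion without ever seeing the second dataset.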


The well travelled road effect is a cognitive bias in which travellers estimate the time taken to traverse routes differently depending on their familiarity with the route. Frequently travelled routes are assessed as taking a shorter time than unfamiliar routes. This effect creates errors when estimating the most efficient route to an unfamiliar destination, when one candidate route includes a familiar segment while the other does not. The effect is most salient when subjects are driving, but is still detectable for pedestrians and users of public transport. The effect has been observed for centuries but was first studied scientifically in the 1980s and 1990s, following from the earlier "heuristics and biases" work undertaken by Daniel Kahneman and Amos Tversky. Much as in the Stroop task, it is hypothesised that drivers use less cognitive effort when traversing familiar routes and therefore underestimate the time taken to traverse them.


The availability heuristic is a mental shortcut that relies on the immediate examples that come to a person's mind when evaluating a specific topic, concept, method, or decision. It operates on the notion that if something can be recalled, it must be important, or at least more important than alternatives that are not as readily recalled. Under the availability heuristic, people therefore weight their judgments heavily toward more recent information, biasing new opinions toward the latest news. The availability of consequences associated with an action is positively related to perceptions of the magnitude of those consequences. In other words, the easier it is to recall the consequences of something, the greater those consequences are often perceived to be. Most notably, people tend to rely on the content of what they recall unless the difficulty they experience in bringing the relevant material to mind calls its implications into question.
