Wikipedia Ideas Flashcards
Social proof
Social proof is a psychological and social phenomenon wherein people copy the actions of others in choosing how to behave in a given situation. The term was coined by Robert Cialdini in his 1984 book Influence, and the concept is also known as informational social influence.
Social proof is considered prominent in ambiguous social situations where people are unable to determine the appropriate mode of behavior, and is driven by the assumption that the surrounding people possess more knowledge about the current situation.
The effects of social influence can be seen in the tendency of large groups to conform. This is referred to in some publications as herd behavior. Although social proof reflects a rational motive to take into account the information possessed by others, formal analysis shows that it can cause people to converge too quickly upon a single distinct choice, so that decisions of even large groups of individuals may be grounded in very little information (see information cascades).
Social proof is one type of conformity. When a person is in a situation where they are unsure of the correct way to behave, they will often look to others for clues concerning the correct behavior. When “we conform because we believe that others’ interpretation of an ambiguous situation is more accurate than ours and will help us choose an appropriate course of action”, it is informational social influence. This is contrasted with normative social influence wherein a person conforms to be liked or accepted by others.
Social proof often leads not only to public compliance (conforming to the behavior of others publicly without necessarily believing it is correct) but also to private acceptance (conforming out of a genuine belief that others are correct). Social proof is more powerful when being accurate is more important and when others are perceived as especially knowledgeable.
Normative social influence
Normative social influence is a type of social influence that leads to conformity. It is defined in social psychology as “…the influence of other people that leads us to conform in order to be liked and accepted by them.” The power of normative social influence stems from the human identity as a social being, with a need for companionship and association.
Normative social influence involves a change in behaviour that is deemed necessary in order to fit in with a particular group. The need for a positive relationship with the people around us leads to conformity. This often leads people to exhibit public compliance, but not necessarily private acceptance, of the group’s social norms in order to be accepted by the group. Social norms are the unwritten rules that govern social behavior: customary standards for behavior that are widely shared by members of a culture.
In many cases, normative social influence serves to promote social cohesion. When a majority of group members conform to social norms, the group generally becomes more stable. This stability translates into social cohesion, which allows group members to work together toward a common understanding, or “good,” but also has the unintended impact of making the group members less individualistic.
Information cascade
An information cascade or informational cascade is a phenomenon described in behavioral economics and network theory in which a number of people make the same decision in a sequential fashion. It is similar to, but distinct from, herd behavior.
An information cascade is generally understood as a two-step process. First, an individual must encounter a scenario with a decision, typically a binary one. Second, outside factors can influence this decision, typically through observing the actions of other individuals in similar scenarios and the outcomes of those actions.
The two-step process of an informational cascade can be broken down into five basic components (a minimal simulation follows the list):
There is a decision to be made – for example, whether to adopt a new technology, wear a new style of clothing, eat in a new restaurant, or support a particular political position
A limited action space exists (e.g. an adopt/reject decision)
People make the decision sequentially, and each person can observe the choices made by those who acted earlier
Each person has some private information of their own that helps guide their decision
A person can’t directly observe the private information that other people hold, but can make inferences about this information from what those people do
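To make these components concrete, here is a minimal Python simulation in the spirit of the classic sequential-choice model of Bikhchandani, Hirshleifer, and Welch (1992). It uses a simplified tally rule rather than full Bayesian updating, and all names and parameters are illustrative assumptions:

```python
import random

def simulate_cascade(n_agents=20, p_correct=0.7, seed=1):
    """Each agent receives a private signal that points to the true
    best option (+1) with probability p_correct, observes every
    earlier choice, and follows the majority of (earlier choices +
    own signal), breaking ties with the private signal. The tally
    rule is a simplification of Bayesian updating but reproduces
    the cascade effect."""
    rng = random.Random(seed)
    choices = []
    for _ in range(n_agents):
        signal = 1 if rng.random() < p_correct else -1
        tally = sum(choices) + signal
        choice = signal if tally == 0 else (1 if tally > 0 else -1)
        choices.append(choice)
    return choices

print(simulate_cascade())
```

Once the running tally of earlier choices reaches two in either direction, no single private signal can outweigh it, so every later agent imitates; the whole group’s decision then rests on the first few signals, which occasionally point to the wrong option.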
Social perspectives on cascades, which suggest that agents may act irrationally (e.g., against what they privately think is optimal) when social pressures are great, exist as complements to the concept of information cascades. More often, the problem is that the concept of an information cascade is confused with ideas that do not match its two key conditions, such as social proof, information diffusion, and social influence. Indeed, the term information cascade has even been used to refer to such processes.
Cognitive dissonance
In the field of psychology, cognitive dissonance is the perception of contradictory information. Relevant items of information include a person’s actions, feelings, ideas, beliefs, and values, and things in the environment. Cognitive dissonance is typically experienced as psychological stress when a person participates in an action that goes against one or more of these. According to this theory, when two actions or ideas are not psychologically consistent with each other, people do all in their power to change them until they become consistent. The discomfort is triggered by the person’s beliefs clashing with newly perceived information, wherein they try to find a way to resolve the contradiction to reduce the discomfort.
In A Theory of Cognitive Dissonance (1957), Leon Festinger proposed that human beings strive for internal psychological consistency to function mentally in the real world. A person who experiences internal inconsistency tends to become psychologically uncomfortable and is motivated to reduce the cognitive dissonance. They tend to make changes to justify the stressful behavior, either by adding new parts to the cognition causing the psychological dissonance (rationalization) or by avoiding circumstances and contradictory information likely to increase the magnitude of the cognitive dissonance (confirmation bias).
Rationalisation
In psychology, rationalization (or rationalisation) is a defense mechanism in which controversial behaviors or feelings are justified and explained in a seemingly rational or logical manner, in the absence of a true explanation, and are made consciously tolerable, or even admirable and superior, by plausible means. It is also an informal fallacy of reasoning.
Confirmation bias
Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one’s prior beliefs or values. People display this bias when they select information that supports their views, ignoring contrary information, or when they interpret ambiguous evidence as supporting their existing attitudes. The effect is strongest for desired outcomes, for emotionally charged issues, and for deeply entrenched beliefs. Confirmation bias cannot be eliminated entirely, but it can be managed, for example, by education and training in critical thinking skills.
Confirmation bias is a broad construct covering a number of explanations. Biased search for information, biased interpretation of this information, and biased memory recall have been invoked to explain four specific effects: 1) attitude polarization (when a disagreement becomes more extreme even though the different parties are exposed to the same evidence); 2) belief perseverance (when beliefs persist after the evidence for them is shown to be false); 3) the irrational primacy effect (a greater reliance on information encountered early in a series); and 4) illusory correlation (when people falsely perceive an association between two events or situations).
A series of psychological experiments in the 1960s suggested that people are biased toward confirming their existing beliefs. Later work re-interpreted these results as a tendency to test ideas in a one-sided way, focusing on one possibility and ignoring alternatives (myside bias, an alternative name for confirmation bias). In general, current explanations for the observed biases reveal the limited human capacity to process the complete set of information available, leading to a failure to investigate in a neutral, scientific way.
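The best known of these 1960s experiments is Peter Wason’s 2-4-6 task, the standard illustration of one-sided testing. Below is a toy Python sketch; the hidden rule and the tester’s hypothesis follow the classic task, while the function names are invented for this example:

```python
def hidden_rule(triple):
    """The experimenter's secret rule: any strictly ascending triple."""
    a, b, c = triple
    return a < b < c

def one_sided_tester(step=2, trials=4):
    """A tester who believes the rule is 'numbers increase by step'
    and only ever tries triples that FIT that hypothesis. Every test
    comes back positive, so the narrower, wrong hypothesis survives."""
    for i in range(1, trials + 1):
        guess = (i, i + step, i + 2 * step)
        print(guess, "->", hidden_rule(guess))  # always True
    # A disconfirming probe would expose the mistake: (1, 2, 3) also
    # satisfies the hidden rule, showing it is broader than believed.
    print((1, 2, 3), "->", hidden_rule((1, 2, 3)))

one_sided_tester()
```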
Flawed decisions due to confirmation bias have been found in political, organizational, financial and scientific contexts. These biases contribute to overconfidence in personal beliefs and can maintain or strengthen beliefs in the face of contrary evidence. For example, confirmation bias produces systematic errors in scientific research based on inductive reasoning (the gradual accumulation of supportive evidence). Similarly, a police detective may identify a suspect early in an investigation, but then may only seek confirming rather than disconfirming evidence. A medical practitioner may prematurely focus on a particular disorder early in a diagnostic session, and then seek only confirming evidence. In social media, confirmation bias is amplified by the use of filter bubbles, or “algorithmic editing”, which display to individuals only information they are likely to agree with, while excluding opposing views.
Attitude polarisation
Attitude polarization, also known as belief polarization and polarization effect, is a phenomenon in which a disagreement becomes more extreme as the different parties consider evidence on the issue. It is one of the effects of confirmation bias: the tendency of people to search for and interpret evidence selectively, to reinforce their current beliefs or attitudes. When people encounter ambiguous evidence, this bias can potentially result in each of them interpreting it as in support of their existing attitudes, widening rather than narrowing the disagreement between them.
The effect is observed with issues that activate emotions, such as political ‘hot-button’ issues. For most issues, new evidence does not produce a polarization effect. For those issues where polarization is found, mere thinking about the issue, without contemplating new evidence, produces the effect. Social comparison processes have also been invoked as an explanation for the effect, which is increased by settings in which people repeat and validate each other’s statements. This apparent tendency is of interest not only to psychologists but also to sociologists and philosophers.
Illusory correlation
In psychology, illusory correlation is the phenomenon of perceiving a relationship between variables (typically people, events, or behaviors) even when no such relationship exists. A false association may be formed because rare or novel occurrences are more salient and therefore tend to capture one’s attention. This phenomenon is one way stereotypes form and endure. Hamilton & Rose (1980) found that stereotypes can lead people to expect certain groups and traits to fit together, and then to overestimate the frequency with which these correlations actually occur. These stereotypes can be learned and perpetuated without any actual contact occurring between the holder of the stereotype and the group it is about.
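As a concrete sketch of the mechanism, the toy Python model below follows the spirit of the distinctiveness paradigm of Hamilton & Gifford (1976): both groups show the same 2:1 ratio of desirable to undesirable behaviors, so there is no real association, but an observer who weights rare, distinctive events more heavily perceives one. The salience multiplier is a modeling assumption invented for this example:

```python
import random

def illusory_correlation_demo(seed=0, salience=2.0):
    # group A: 18 behaviors, group B: 9 behaviors, both 2/3 desirable,
    # so group membership and behavior are statistically independent.
    events = ([("A", "desirable")] * 12 + [("A", "undesirable")] * 6 +
              [("B", "desirable")] * 6 + [("B", "undesirable")] * 3)
    random.Random(seed).shuffle(events)  # presentation order

    undesirable = {"A": 0.0, "B": 0.0}
    total = {"A": 0.0, "B": 0.0}
    for group, behavior in events:
        # rare-group + rare-behavior events are doubly distinctive,
        # so the observer encodes them with extra weight
        w = salience if (group == "B" and behavior == "undesirable") else 1.0
        total[group] += w
        if behavior == "undesirable":
            undesirable[group] += w

    for g in ("A", "B"):
        print(f"group {g}: perceived undesirable share = "
              f"{undesirable[g] / total[g]:.2f}")

illusory_correlation_demo()
# True share is 0.33 for both groups; the salience-weighted recall
# pushes group B's perceived share to 0.50, a correlation that does
# not exist in the data.
```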
Belief perseverance
Belief perseverance is the persistence of beliefs even after the evidence for them has been shown to be false.
Irrational primacy effect
The irrational primacy effect is a greater reliance on information encountered early in a series.
Inductive reasoning
Inductive reasoning is a method of reasoning in which the premises are viewed as supplying some evidence, but not full assurance, of the truth of the conclusion. It is also described as a method where one’s experiences and observations, including what is learned from others, are synthesized to come up with a general truth. Many dictionaries define inductive reasoning as the derivation of general principles from specific observations (arguing from specific to general), although there are many inductive arguments that do not have that form.
Inductive reasoning is distinct from deductive reasoning. If the premises are correct, the conclusion of a deductive argument is certain; in contrast, the truth of the conclusion of an inductive argument is probable, based upon the evidence given.
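One classical way to formalize “probable, based upon the evidence given” is Laplace’s rule of succession, sketched below in Python. This is one illustrative model of inductive support, not a complete account of inductive logic:

```python
from fractions import Fraction

def rule_of_succession(successes, trials):
    """Laplace's rule of succession: after `successes` out of
    `trials` uniform observations, the probability that the next
    observation matches is (successes + 1) / (trials + 2)."""
    return Fraction(successes + 1, trials + 2)

# "Every swan observed so far is white" lends probable, never
# certain, support to "the next swan will be white":
for n in (1, 10, 100):
    p = rule_of_succession(n, n)
    print(f"{n} white swans seen -> P(next is white) = {p} ~ {float(p):.3f}")
```

Evidence raises the probability toward, but never to, 1; a valid deductive argument, by contrast, transmits the full certainty of its premises to its conclusion.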
Filter bubble
A filter bubble is a term coined by the Internet activist Eli Pariser to refer to a state of intellectual isolation that can result from personalized searches, when a website algorithm selectively guesses what information a user would like to see based on information about the user, such as location, past click-behavior, and search history. As a result, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles. The choices made by these algorithms are not transparent. Prime examples include Google Personalized Search results and Facebook’s personalized news-stream. According to Pariser, the bubble effect may have negative implications for civic discourse, but contrasting views regard the effect as minimal and addressable. The results of the 2016 U.S. presidential election were associated with the influence of social media platforms such as Twitter and Facebook, which called into question the effects of the “filter bubble” phenomenon on user exposure to fake news and echo chambers. This spurred new interest in the term, with many concerned that the phenomenon may harm democracy and well-being by amplifying the effects of misinformation.
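A toy sketch of the mechanism Pariser describes, using an invented ranking function (this is not any real platform’s algorithm or API):

```python
from collections import Counter

def personalized_feed(candidates, click_history, k=3):
    """Hypothetical ranker: boost items whose topic matches the
    user's past clicks, so rarely clicked topics sink out of the
    top-k feed entirely."""
    topic_weight = Counter(item["topic"] for item in click_history)
    ranked = sorted(candidates,
                    key=lambda item: topic_weight[item["topic"]],
                    reverse=True)
    return [item["title"] for item in ranked[:k]]

history = [{"topic": "politics_left"}] * 5 + [{"topic": "sports"}]
candidates = [
    {"title": "Op-ed A", "topic": "politics_left"},
    {"title": "Op-ed B", "topic": "politics_right"},
    {"title": "Match recap", "topic": "sports"},
    {"title": "Op-ed C", "topic": "politics_left"},
]
print(personalized_feed(candidates, history))
# ['Op-ed A', 'Op-ed C', 'Match recap'] -- the opposing op-ed never
# surfaces, and each click on the survivors reinforces the weights.
```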
Echo chamber
In discussions of news media, an echo chamber refers to situations in which beliefs are amplified or reinforced by communication and repetition inside a closed system and insulated from rebuttal. By participating in an echo chamber, people are able to seek out information that reinforces their existing views without encountering opposing views, potentially resulting in an unintended exercise in confirmation bias. Echo chambers may increase social and political polarization and extremism.
The term is a metaphor based on an acoustic echo chamber, in which sounds reverberate in a hollow enclosure. Another emerging term for this echoing and homogenizing effect within social media communities on the Internet is cultural tribalism.
Many scholars note the effects that echo chambers can have on citizens’ stances and viewpoints, and specifically the implications they have for politics. However, some studies have suggested that the effects of echo chambers are weaker than often assumed.
Concept
The Internet has expanded the variety and amount of accessible political information. On the positive side, this may create a more pluralistic form of public debate; on the negative side, greater access to information may lead to selective exposure to ideologically supportive channels. In an extreme “echo chamber”, one purveyor of information will make a claim, which many like-minded people then repeat, overhear, and repeat again (often in an exaggerated or otherwise distorted form) until most people assume that some extreme variation of the story is true.
The echo chamber effect occurs online when a harmonious group of people amalgamate and develop tunnel vision. Participants in online discussions may find their opinions constantly echoed back to them, which reinforces their individual belief systems due to declining exposure to others’ opinions. These reinforced belief systems culminate in a confirmation bias regarding a variety of subjects. When individuals want something to be true, they will often gather only the information that supports their existing beliefs and disregard any statements that contradict them. Individuals who participate in echo chambers often do so because they feel more confident that their opinions will be more readily accepted by others in the echo chamber. This happens because the Internet has provided access to a wide range of readily available information. People are receiving their news online more rapidly, through less traditional sources such as Facebook, Google, and Twitter. These and many other social platforms and online media outlets have established personalized algorithms intended to cater specific information to individuals’ online feeds. This method of curating content has replaced the function of the traditional news editor. The mediated spread of information through online networks carries the risk of an algorithmic filter bubble, leading to concern about how the effects of echo chambers on the Internet promote the division of online interaction.
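A minimal sketch of how insulation plus repetition can harden opinions, using an invented update rule (the averaging step and the “push” term are illustrative assumptions, not an empirical model):

```python
def echo_chamber(opinions, push=0.05, rounds=30):
    """Members of a closed chamber only ever hear each other: each
    round they move toward the in-group mean and then nudge slightly
    further in the mean's direction (repetition and validation).
    Opinions are clamped to [-1, 1]."""
    for _ in range(rounds):
        mean = sum(opinions) / len(opinions)
        direction = 1.0 if mean > 0 else -1.0
        opinions = [max(-1.0, min(1.0, (o + mean) / 2 + push * direction))
                    for o in opinions]
    return [round(o, 2) for o in opinions]

print(echo_chamber([-0.2, -0.3, -0.1]))  # drifts to [-1.0, -1.0, -1.0]
print(echo_chamber([0.2, 0.3, 0.1]))     # drifts to [1.0, 1.0, 1.0]
# Two chambers whose means began 0.4 apart end at opposite extremes.
```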
It is important to note that members of an echo chamber are not fully responsible for their convictions. Once part of an echo chamber, an individual might adhere to seemingly acceptable epistemic practices and still be further misled. Many individuals may be stuck in echo chambers due to factors existing outside of their control, such as being raised in one.
Furthermore, the function of an echo chamber does not entail eroding a member’s interest in truth; rather, it manipulates their levels of trust so that fundamentally different establishments and institutions will be considered the proper sources of authority.
Self-fulfilling prophecy
A self-fulfilling prophecy is the sociopsychological phenomenon of someone “predicting” or expecting something, and this “prediction” or expectation coming true simply because the person believes it will and the person’s resulting behaviors align to fulfill the belief. This suggests that people’s beliefs influence their actions. The principle behind this phenomenon is that people create consequences regarding people or events, based on previous knowledge of the subject.
There are three factors within an environment that can come together to influence the likelihood of a self-fulfilling prophecy becoming a reality: appearance, perception, and belief. When a phenomenon cannot be seen, appearance is what we rely upon when a self-fulfilling prophecy is in place. When it comes to a self-fulfilling prophecy, there must also be a distinction “between ‘brute and institutional’ facts”. The philosopher John Searle states the difference as “facts [that] exist independently of any human institutions; institutional facts can only exist within institutions.” Institutional facts cannot be self-fulfilling. For example, the old belief that the Earth is flat (institutional) when it is known to be spherical (brute) is not self-fulfilling, because Earth’s shape is established through substantial evidence. For an idea to be seen as self-fulfilling, there must be a consensus among “large numbers of people within a given population”, beyond the idea being institutional, social, or bound by the laws of nature.
A self-fulfilling prophecy can have either negative or positive outcomes. Attaching a label to someone or something significantly impacts how they are perceived and can set a self-fulfilling prophecy in motion. Interpersonal communication plays a significant role in establishing these phenomena, as well as in the labeling process. Intrapersonal communication can have both positive and negative effects, depending on the nature of the self-fulfilling prophecy. The expectations placed on a relationship, or the inferiority complex felt by young minority children, are examples of the negative effects of initially false beliefs becoming self-fulfilling.
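A minimal sketch of the belief-drives-behavior-drives-outcome loop, using a hypothetical teacher-expectation scenario; the threshold and update step are invented for illustration:

```python
def expectation_cycle(expectation, aptitude=0.6, rounds=5):
    """The support a student receives tracks the teacher's
    expectation (0..1); the student succeeds when aptitude plus
    support clears a fixed bar; the result then feeds back into
    the expectation, 'confirming' it."""
    results = []
    for _ in range(rounds):
        support = expectation                  # belief drives behavior
        succeeded = aptitude + support > 1.0   # behavior shapes outcome
        results.append(succeeded)
        expectation = (min(1.0, expectation + 0.1) if succeeded
                       else max(0.0, expectation - 0.1))
    return results

print(expectation_cycle(0.9))  # [True, True, True, True, True]
print(expectation_cycle(0.2))  # [False, False, False, False, False]
# Identical aptitude, opposite outcomes: the starting belief alone
# determines which way the loop locks in.
```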
American sociologists W. I. Thomas and Dorothy Swaine Thomas were the first to discover this phenomenon. In 1928 they developed the Thomas theorem (also known as the Thomas dictum), stating that, “If men define situations as real, they are real in their consequences.” Because of the way the couple defined a self-fulfilling prophecy, their definition was regarded as flexible in its meaning. On a societal level, a false assumption can come to be accepted as true if it matters enough to the culture, and the resulting behavior of the society then brings about the expected outcome. A person’s perception can be “self-creating” if the belief they hold is acted upon through behavior that aligns with the outcome. Building on the Thomases’ idea, another American sociologist, Robert K. Merton, used the term “self-fulfilling prophecy” for it, popularizing the idea that “a belief or expectation, correct or incorrect, could bring about a desired or expected outcome.” While Merton is typically credited with this theory since he coined the name, the Thomases developed it earlier, and the philosophers Karl Popper and Alan Gewirth also independently contributed to the idea behind the theory in works that preceded Merton’s. Self-fulfilling prophecies are an example of the more general phenomenon of positive feedback loops.
Positive feedback loop
Positive feedback (also called exacerbating feedback or self-reinforcing feedback) is a process that occurs in a feedback loop and exacerbates the effects of a small disturbance: the effects of a perturbation on a system include an increase in the magnitude of the perturbation itself, so that A produces more of B, which in turn produces more of A. In contrast, a system in which the results of a change act to reduce or counteract it has negative feedback. Both concepts play an important role in science and engineering, including biology, chemistry, and cybernetics.
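A minimal numeric sketch of the two regimes, with an invented linear gain parameter (purely illustrative):

```python
def run_feedback(gain, steps=6, start=1.0):
    """Feed a multiple of the current deviation back into the system
    each step: gain > 0 amplifies the perturbation (positive
    feedback, A -> more B -> more A); gain < 0 counteracts it
    (negative feedback)."""
    x, trace = start, [start]
    for _ in range(steps):
        x = x + gain * x
        trace.append(round(x, 3))
    return trace

print("positive feedback:", run_feedback(gain=0.5))
# [1.0, 1.5, 2.25, 3.375, 5.062, 7.594, 11.391] -- the disturbance grows
print("negative feedback:", run_feedback(gain=-0.5))
# [1.0, 0.5, 0.25, 0.125, 0.062, 0.031, 0.016] -- the disturbance decays
```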