Which of the following phenomena is defined as the tendency to overestimate the commonality of one's opinions and one's undesirable or unsuccessful behaviors?

Primer: The optimism bias

Summary

The ability to anticipate is a hallmark of cognition. Inferences about what will occur in the future are critical to decision making, enabling us to prepare our actions so as to avoid harm and gain reward. Given the importance of these future projections, one might expect the brain to possess accurate, unbiased foresight. Humans, however, exhibit a pervasive and surprising bias: when it comes to predicting what will happen to us tomorrow, next week, or fifty years from now, we overestimate the likelihood of positive events, and underestimate the likelihood of negative events. For example, we underrate our chances of getting divorced, being in a car accident, or suffering from cancer. We also expect to live longer than objective measures would warrant, overestimate our success in the job market, and believe that our children will be especially talented. This phenomenon is known as the optimism bias, and it is one of the most consistent, prevalent, and robust biases documented in psychology and behavioral economics.

The neurophysiological basis of optimism bias

Mihai Dricu, ... Tatjana Aue, in Cognitive Biases in Health and Psychiatric Disorders, 2020

Summary

Optimism bias describes people’s tendency to overestimate their likelihood of experiencing positive events and to underestimate their likelihood of experiencing negative events in the future. Such an optimistic outlook on the future can enhance their motivation to engage in self-relevant and difficult situations and make it more likely that they will obtain rewards. Theoretical considerations vary in the extent to which they claim the existence of an optimism bias in the general population. Even though some researchers suggest that optimism bias exists in up to 80% of the population, others argue that it may merely represent other, closely related cognitive phenomena (e.g., illusion of control). Whereas neuroimaging studies on optimism bias have revealed some involvement of cingulate and prefrontal brain areas in the emergence of the bias, there are so far no studies on the somatovisceral processes related to optimism bias. Interestingly, first attempts to compare the extent of optimism bias displayed by healthy people and patients with mental disorders reveal an absence of optimism bias in various patient groups (e.g., patients with depression, borderline personality disorder, and OCD). However, the few findings on optimism bias in patients with mental disorders must be further supported by future research. Such research would greatly benefit from the use of more consistent terminology and more reliable and rigorous methods to target some of the important limitations that present and previous research on optimism bias suffers from. In addition, future research on optimism bias should take into account potential interactions with other positive cognitive biases (e.g., in attention or memory) to yield a more elaborate view of healthy information processing.

URL: https://www.sciencedirect.com/science/article/pii/B9780128166604000039

Positive interpretation bias across the psychiatric disorders

Ellen Jopling, ... Joelle LeMoult, in Cognitive Biases in Health and Psychiatric Disorders, 2020

Evidence for the involvement of occipital regions (including the inferior occipital and fusiform gyri) comes from work examining the neural correlates of the optimism bias. Aue, Nusbaum, and Cacioppo (2012), for example, documented that the optimism bias is associated with increased functional connectivity between an occipital cluster (including the left inferior occipital gyrus and fusiform gyrus) and key structures of the human reward system (including limbic and dorsal striatal regions). Conceptually, these results indicate that the optimism bias is associated with increased functional connectivity of brain regions responsible for visual processing and attention with reward-related behaviors (Mangun, Buonocore, Girelli, & Jha, 1998; Robbins, Cador, Taylor, & Everitt, 1989; Rossion, Schiltz, & Crommelinck, 2003). Given that optimism is related to a positive interpretation bias (Kress & Aue, 2017; Tran, Hertel, & Joormann, 2011), the role of various occipital areas might be examined in future work on positive interpretation biases.

There is also evidence for the involvement of both the rostral anterior cingulate cortex (rACC) and the amygdala in the optimism bias. For instance, Sharot et al. (2007) demonstrated that individuals with more optimism about the future showed enhanced activation of the rACC and the amygdala when imagining positive future events, relative to negative future events. Similarly, Sharot (2011) documented increased functional connectivity between the rACC and the amygdala in individuals who display an optimism bias. These findings suggest that an optimism bias is related to increased connectivity between regions responsible for affective error responding (the rACC; Blair et al., 2013) and emotional responding (the amygdala; Phelps & LeDoux, 2005).

URL: https://www.sciencedirect.com/science/article/pii/B9780128166604000052

Project Management of Innovative Teams

Susannah B.F. Paletz, in Handbook of Organizational Creativity, 2012

Scheduling Estimates and Temporal Perspectives

These suggestions assume that managers are making rational decisions about schedules. In fact, behavioral decision-making research suggests that there is an optimism bias for estimating task completion, such that people anticipate tasks taking less time than they do (e.g., Buehler, Messervey, & Griffin, 2005; Sanna, Parks, Chang, & Carter, 2005). Buehler and colleagues (2005) found, across three tasks, that not only does this bias exist in individuals, but it was enhanced in ad hoc groups, suggesting that bringing in others’ opinions does not lessen the bias. In fact, these biased estimates were driven by groups focusing on factors that promote successful future task completion rather than by considering negative factors. Furthermore, Sanna and colleagues (2005) found that it was the ease (i.e., availability) of thinking about successful task completion, not simply thinking about successful task completion per se, that caused this planning fallacy. The obvious recommendation from this line of research is for managers to consider possible threats to schedule, not just success, regardless of their cognitive availability, and to adjust their estimates to make up for what they can’t anticipate. Otherwise, projects may encounter time pressure simply due to incorrectly estimated schedules.
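
The planning-fallacy pattern described above can be made concrete with a toy simulation (my own illustration, not taken from the chapter): if planners anchor on the easily imagined smooth-path scenario while actual durations are right-skewed, estimates fall short, and a crude correction is to scale new estimates by the actual-to-estimate ratio observed on comparable past tasks. The lognormal assumption and all numbers below are invented for the sketch.

```python
# Toy illustration (not from the chapter): why anchoring on smooth-path
# scenarios understates schedules, and a simple historical correction.
import numpy as np

rng = np.random.default_rng(42)

# Assume actual task durations are right-skewed (many tasks finish near the
# mode, a few blow up); a lognormal is a common stand-in for such data.
actual_days = rng.lognormal(mean=np.log(10), sigma=0.5, size=10_000)

# "Planning" estimate: the scenario that is easiest to imagine, assumed here
# to correspond to the 30th percentile (everything goes roughly to plan).
planned = np.percentile(actual_days, 30)

print(f"planned estimate  : {planned:5.1f} days")
print(f"median actual     : {np.median(actual_days):5.1f} days")
print(f"mean actual       : {actual_days.mean():5.1f} days")

# Crude debiasing step: multiply new estimates by the historical
# actual-to-estimate ratio observed on comparable past tasks.
historical_ratio = actual_days.mean() / planned
print(f"corrected estimate: {planned * historical_ratio:5.1f} days "
      f"(uplift x{historical_ratio:.2f})")
```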

The temporal perspective of the group is also related to time pressure. Antes and Mumford (2009) found a significant three-way interaction between time pressure, temporal perspective, and positive/negative frames, such that the generally negative effect they found for time pressure on problem definition was reversed when the temporal perspective focused on the past and the framing was negative. This finding suggested that a past orientation focused on negative events benefited from time pressure when defining the problem. In addition, Antes and Mumford (2009) found that information gathering was more effective when there was time pressure and a present or future orientation, whereas when a past temporal orientation was employed, no time pressure was best; idea evaluation using a future orientation did not reveal differences between time pressure and no time pressure. These findings suggest a complex pattern of orientation, framing, and time pressure that may be task-specific (Antes & Mumford, 2009). The study reminds us that time itself is socially constructed (Arrow, Henry, Poole, Wheelan, & Moreland, 2005).

URL: https://www.sciencedirect.com/science/article/pii/B9780123747143000173

“Strangers” in Neuroscientific Research

B. Bringedal, ... A. Rábano, in The Human Sciences after the Decade of the Brain, 2017

Principles

Primum Non Nocere—First Do No Harm

The golden rule of medicine can guide more than medical treatment. The positive intentions of research—producing new knowledge for the benefit of humans—cannot come without the downside of potential negative or harmful effects. The assessment of the balance between positive and negative implications tends to be in favor of the positive, due to optimism bias. To counteract this tendency, explicit attention to “what can go wrong?” is helpful. Such a precautionary attitude is important in order to pursue a careful and cautious approach and to guard against the hubris caused by optimism bias.

A precautionary attitude involves a necessary epistemological correction, since it challenges scientists—and others involved—to be concerned with observations that count against what one is eager to prove. Karl Popper’s principle of falsification goes well with a precautionary attitude. Originally, Popper distinguished between a scientific and nonscientific statement according to whether the statement is, in principle, possible to falsify through empirical testing (Popper, 1959/1999). It is not the task of scientific inquiry to prove that a particular empirical statement is correct, but rather to search for evidence to its rejection. We are not concerned with demarcation between science and nonscience in this context, but the epistemological attitude it expresses. The difference between searching for verification versus falsification reflects a fundamental difference in attitudes to knowledge. Further, the falsification attitude can be a safeguard against optimism bias.

It is almost impossible to predict the total effects of research and innovations, especially the longer-term effects. For this reason, the precautionary principle is advocated. On a global level, the principle is included in virtually every policy document on environmental protection, sustainable development, and public health (Andorno & Biller-Andorno, 2015).

In European law, the principle is operationalized as follows: “The precautionary principle in public decision making concerns situations where following an assessment of the available scientific information, there are reasonable grounds for concern for the possibility of adverse effects on the environment or human health, but scientific uncertainty persists. In such cases provisional risk management measures may be adopted, without having to wait until the reality and seriousness of those adverse effects become fully apparent” (Von Schomberg, 2012, p. 147).

In public health, the principle is formulated as follows: “(W)here there are significant risks of damage to public health, we should be prepared to take action to diminish those risks, even when the scientific knowledge is not conclusive, if the balance of likely costs and benefits justifies it!” (Horton, 1998, p. 252).

To be willing to take action despite insufficient evidence is less straightforward than it might seem at first sight, as the debate on precautionary principles readily demonstrates. Precautionary principles are accused of being antiscientific, conservative, and outright irrational—as potential benefits of scientific and technological advances are sacrificed on its altar (Harris & Holm, 2002).

Since the potential negative implications of research and innovation can be seen as arguments against innovation altogether, a strict principle of precaution seems too absolute. A precautionary attitude, however, is appropriate. The challenge is to strike a balance between benefit and harm; the duty to avoid harm is not the same as a duty to abstain from carrying out research (or any action) altogether, not least because taking no action can involve more harm than the action itself. This introduces a second general principle we build on, namely, the need to weigh the benefits and the drawbacks.

Weighing Benefits and Harm

Jonathan Wolff argues in favor of the precautionary attitude not least because it involves a more pragmatic attitude to risk, compared to the precautionary principle (Wolff, 2006). Any action involves risk; thus, the task is not to avoid risk altogether but to weigh the potential positive implications against the negative ones. This requires an explicit assessment of the potential beneficial as well as potential harmful implications.

Risk is a product of hazard and probability. There is a substantial difference between severe, perhaps fatal, outcomes of low probability and minor problems of high probability. As a guiding principle, identifying high-risk areas—those that should be marked with red lights—is useful.
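
As a minimal illustration of the hazard-times-probability idea and the "red light" screening suggested above, the sketch below scores a few hypothetical items; the severity scale, probabilities, and thresholds are assumptions made for the example, not values from the chapter.

```python
# Minimal sketch of "risk is a product of hazard and probability" plus a
# red-light screen.  Severity scale (1-10), probabilities, and thresholds
# are illustrative assumptions only.
EXPECTED_HARM_THRESHOLD = 2.0   # red light when expected harm is high
CATASTROPHIC_SEVERITY = 8.0     # severe/fatal outcomes flagged even at low probability

hazards = [
    # (description, severity 1-10, probability of occurrence)
    ("severe, perhaps fatal outcome (low probability)", 9.0, 0.02),
    ("minor problem (high probability)",                1.0, 0.80),
    ("moderate harm (quite likely)",                    5.0, 0.50),
]

for name, severity, probability in hazards:
    expected_harm = severity * probability            # risk = hazard x probability
    red_light = (expected_harm >= EXPECTED_HARM_THRESHOLD
                 or severity >= CATASTROPHIC_SEVERITY)
    print(f"{name:48s} expected harm={expected_harm:4.2f}  "
          f"{'RED LIGHT' if red_light else 'ok'}")
```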

When weighing benefits against harm, it is important to address the question, “Whose benefit, whose harm?” Whether the decision maker and those affected by the decision are one and the same or not is important since one person’s benefit can be another person’s harm (Luhmann, 2005). The same holds true for harm and risk. In a risk situation, the one who runs a risk, for example, in order to obtain a preferred state in the future, need not be the same one who bears the costs. One HBP-related example would be the differences regarding risk-taking in controversial areas such as neuroenhancement to improve individual performances of healthy subjects, or neuroeconomics research that could inform neuromarketing (Voarino, 2014). Those who exploit commercial opportunities and laypeople tend to have different views, especially when it comes to potential long-term threats emerging from the uncontrolled application of neuroscientific findings. This makes it difficult to reach a consensus because someone’s risk is often another’s danger, and perspectives differ according to the respective position. Thus the evaluation of risk associated with the outcomes of scientific research in the HBP, and the willingness to accept risk decisions, are most of all social problems.

As a starting point, scientific knowledge never comes free from social interests or implications. Disregard of public concern and laypeople’s perspective in the scientific enterprise entails a normative problem, as such disregard undermines the possibility of establishing a democratic knowledge society (European Commission, 2007). It is also important to acknowledge laypeople’s specific knowledge, which is based on their everyday lives and is needed for assessing long-term ethical/social implications in terms of benefits and harms (Myskja, 2007). The inclusion of lay expertise is thus crucial for establishing empirically well-informed ethical governance of science and technology.

The “first do no harm” principle and the weighing of harms and benefits can be combined into a principle of “first do no net harm.” Since all action potentially involves negative and positive outcomes, the task is to choose the actions whose harm is as small as possible while the benefit is maximized.

Transparency

Transparency not only has the advantage of enhancing quality of research but also increases our attention to what can go wrong. When the research content is shared within a wider community, epistemological and ethical implications are laid open to broader scrutiny. Transparency has the potential benefit of involving diverse individuals and approaches, which can be particularly helpful in large collaborations such as HBP, since it is impossible for the few to maintain oversight over, or possess insight into, all dimensions of all subprojects.

Clearly, transparency is not always justified; in some situations, it can involve a breach in fundamental ethical principles, such as when information includes confidential medical records. The principle of transparency means that nondisclosure is explicitly qualified (e.g., by reference to the rules of confidentiality) in contrast to a requirement to qualify openness explicitly.

URL: https://www.sciencedirect.com/science/article/pii/B978012804205200015X

Psychological and Social Factors of Health Inequalities

Mohamed Lamine Bendaoud, Stéphane Callens, in New Health Systems, 2017

2.4.3 Preventative measures vis-à-vis risk

When studying the preventive behaviors adopted vis-à-vis the risk of CO poisoning, it was seen that individuals are characterized by a certain number of important variables. When we asked the people interviewed whether they could implement measures to prevent CO poisoning, 42% responded in the affirmative. Among the 41 respondents who said “No”, 18 indicated that this was because they did not know about such measures, 6 said it was because they were very expensive and 17 people had other reasons. However, the fact of having followed a specific training program on risks and health, being a landowner as well as being a smoker who was trying to quit – all had a positive influence on the probability of their possessing a controlled mechanical ventilation (CMV) system. It must be pointed out that knowing someone who had been the victim of CO poisoning had a positive impact on the adoption of behaviors to prevent this, while people with children were also better informed on the means of prevention against CO poisoning.

As concerns the risk of using agrochemical products, prevention is dependent on the knowledge of the product, the application protocol and the wearing of individual protective equipment. In our study, we divided these means of prevention into regular medical exams and means of protection that can be used every day by farmers in the course of their daily activities. Twenty out of 56 farmers have a health check-up at least once every two years, compared to 36 who go less often. Thirty-one have medical tests done at least once every two years. Age and education levels are the variables that have a favorable impact on the frequency of the medical tests. As regards the means of protection made available to farmers for use during their professional activities, 46 out of the 56 farmers interviewed used gloves as a means of protection. We found that farmers working on large-scale production, who had a good level of education and who used agrochemical products intensively, used a protective suit. Only 25 of the 56 wore protective goggles, and 39 confirmed that they used a mask. We noted that age also had a positive impact on this behavior.

When we study the behavior of people vis-à-vis the risk of a Legionnaires’ epidemic, we see that on the one hand, the more people consider themselves to be in good health, the less ready they are to adopt preventive measures against an epidemic at the community level. On the other hand, 20 out of 56 respondents said that they would do nothing because they did not know what measures to adopt. Older people, smokers, tenants, those with a monthly income lower than 700 euros and those who saw illness as a passing phase were in this category.

This study allowed us to observe that there was an “optimism bias” that covered economic, sociological as well as psychological factors. The farmers interviewed practiced intensive agriculture, with daily use of agrochemical products sprayed by tractor. However, while 93% of these farmers confirmed that they had information on the use of their agrochemical products, many of them mixed products without the potential harmful effects of these mixtures having been scientifically studied. About one-third of them had already encountered problems with the storage of their products. It also seems here that the less educated among them were more often the victims than the others.

Even though the people had the necessary information available, we saw a gap between the knowledge of recommended behavior and the practices actually carried out. Consequently, they exposed themselves to the risk of CO poisoning, agrochemical product poisoning or contamination by legionella.

This discrepancy can be explained, on the one hand, by economic determinants (characterized by the low financial resources of these individuals) and, on the other hand, by psychological factors, which mean that a person’s behavior is limited by their individual perception of reality. This leads people to combine preferences and beliefs and to bring an optimistic perspective to their decisions, which can produce errors in judgment and push a person to adopt risky behavior.

Finally, we have sociocultural determinants that are defined by the pressure exerted by society and one’s living environment. This is confirmed by the accounts the farmers give of their working conditions: as this profession is highly dependent on climatic conditions, they must make the best use of the favorable conditions, at harvest time, for example. The specific demands of this profession, coupled with the current problems faced by farmers vis-à-vis intensive cultivation, as well as the downward pressure on their operating profits, lead to the farmers sometimes being obliged to give up precautionary measures when using agricultural products. For example, they may forgo gloves or protective suits because wearing them reduces their productivity, or ignore the recommendation not to use the products on windy days.

When it comes to the perceived state of health, it seems that not living on agricultural land, possessing a list of the substances that carry a risk of poisoning, and not feeling exposed to the risk of absorbing toxic products through respiration or the skin all have a favorable impact on how people perceive their health. We observed that there is an optimism bias regarding risk perception. In fact, the degree of risk perception is determined by a mixture of professional knowledge and beliefs and conceptions shared in one’s networks. A person systematically estimates that they are less exposed to risk compared to other people. In fact, the underestimation of certain risks shows that optimism is a defense mechanism against anxiety in an environment where negative events are more frequent than positive events. Thus, an individual has a tendency to think that they are better informed than others, and consequently, that they are the only ones who adopt precautionary measures (egocentrism).

When it comes to the risk of legionella contamination, we can clearly see that on the one hand, individuals who know about the Noroxo incident are more likely to fear a similar episode recurring, and on the other hand, the risk of catching Legionnaires’ disease at an automobile washing station or in a pool is relatively high. It is important to note that when a person knows of a case of poisoning, they become more careful about taking measures to protect against CO poisoning. Indeed, depending on the information available to them about the behavior that others adopt, the individual adjusts their perception of their own engagement in risky behaviors.

The gap seen between the perception of information on a risk and the adoption of risky behaviors does not always seem linked to a lack of knowledge and seems to be explained by other psychological and sociocultural factors. The fact that farmers are able to recognize the problem shows that not conforming to recommendations is not a problem related to the perception of real risks. The gap between theory and practice can be explained by failure in communication and especially by poorly identifying the target population. In fact, we saw that it was mainly poorer and less educated people who were most exposed to different risks. We can say that information on risks does not take into account individual and social representations, nor the psychological and sociological determinants of behavior. This leads to differences in risk perception and evaluation between the agents responsible for communication and the public receiving the information. Public authorities must inform, educate and sensitize people to risk levels and give them information for knowledge and judgment, both to increase their awareness and diminish their concerns, as well as to reinforce their abilities to make choices and take decisions relative to their health.

URL: https://www.sciencedirect.com/science/article/pii/B9781785481659500022

What Does It Mean to be Biased

Ulrike Hahn, Adam J.L. Harris, in Psychology of Learning and Motivation, 2014

2.1.1 The Pitfalls of Moderators

Moderators can clearly be very influential in theory development, but they must be theoretically derived. Post hoc moderation claims ensure the unfalsifiability of science, or at least can make findings pitifully trivial. Consider the result—reported in the Dutch Daily News (August 30th, 2011)—that thinking about meat results in more selfish behavior. As this study has since been retracted—its author Stapel admitting that the data were fabricated—it is likely that this result would not have replicated. After (say) 50 replication attempts, what is the most parsimonious conclusion? One can either conclude that the effect does not truly exist or posit moderators. After enough replication attempts across multiple situations, the latter strategy will come down to specifying moderators such as “the date, time and experimenter,” none of which could be predicted on the basis of an “interesting” underlying theory.

This example is clearly an extreme one. The moderators proposed for the optimism bias and better-than-average effects are clearly more sensible and more general. It is still, however, the case that these moderators must be theoretically justified. If not, “moderators” may prop up a bias that does not exist, thus obscuring the true underlying explanation (much as in the toy example above). In a recent review of the literature, Shepperd, Klein, Waters, and Weinstein (2013) argue for the general ubiquitousness of unrealistic optimism defined as “a favorable difference between the risk estimate a person makes for him- or herself and the risk estimate suggested by a relevant, objective standard…Unrealistic optimism also includes comparing oneself to others in an unduly favorable manner,” but state that this definition makes “no assumption about why the difference exists. The difference may originate from motivational forces…or from cognitive sources, such as…egocentric thinking” (Shepperd et al., 2013, p. 396).

However, the question of why the difference exists is critical for understanding what is meant by the term unrealistic optimism especially in the presence of findings that clearly appear inconsistent with certain accounts. The finding that rare negative events invoke comparative optimism, while common negative events invoke comparative pessimism seems entirely inconsistent with a motivational account. If people are motivated to see their futures as “rosy,” why should this not be the case for common negative events (or rare positive events) (Chambers, Windschitl, & Suls, 2003; Kruger & Burrus, 2004)? One can say that comparative optimism is moderated by the interaction of event rarity and valence, such that for half the space of possible events pessimism is in fact observed, but would one really want to call this “unrealistic optimism” or an “optimistic bias”? Rather, it seems that a more appropriate explanation is that people focus overly on the self when making comparative judgments (e.g., Chambers et al., 2003; Kruger & Burrus, 2004; see Harris & Hahn, 2009 for an alternative account which can likewise predict this complete pattern of data)—a process that simply has the by-product of optimism under certain situations. It might be that such overfocus on the self gives rise to bias, but through a correct understanding of it one can better predict its implications. Likewise, one is in a better position to judge the potential costs of it.
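
A toy simulation (my own sketch, not the authors' formal model) of the self-focus account just described: if judgments about others' risk are regressed toward the scale midpoint while one's own risk is judged at its actual base rate, comparative ratings simply track event rarity, yielding comparative optimism for rare negative events and comparative pessimism for common ones. The midpoint and the regression weight are arbitrary assumptions.

```python
# Toy model of egocentric comparative judgment: estimates of OTHERS' risk
# are regressed toward the scale midpoint, while one's OWN risk is judged
# at the true base rate.  Parameters below are assumed, not fitted.
MIDPOINT = 0.5      # ignorance prior for "a random other person"
ALPHA = 0.6         # how strongly the judgment of others tracks their true risk

def comparative_rating(p_self: float, p_other: float) -> float:
    """Negative = 'less likely than others', positive = 'more likely'."""
    p_other_judged = ALPHA * p_other + (1 - ALPHA) * MIDPOINT
    return p_self - p_other_judged

for event, base_rate in [("rare negative event", 0.05),
                         ("common negative event", 0.80)]:
    # Self and others share the same true base rate, yet ratings diverge.
    rating = comparative_rating(base_rate, base_rate)
    verdict = "comparative optimism" if rating < 0 else "comparative pessimism"
    print(f"{event:22s} rating={rating:+.2f} -> {verdict}")
```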

In summary, when bias is understood in a statistical sense as a property of an expectation, demonstration of deviation across a range of values is essential to establishing the existence of a bias in the first place, let alone understanding its nature. Conflicting findings across a range of values (e.g., rare vs. common events in the case of optimism) suggest an initial misconception of the bias, and any search for moderators must take care to avoid perpetuating that misconception by unjustifiably splitting up into distinct circumstances one common underlying phenomenon (i.e., one bias) that has different effects in different circumstances (for other examples on the better-than-average/worse-than-average effect, see, e.g., Benoit & Dubra, 2011; Galesic, Olsson, & Rieskamp, 2012; Kruger, 1999; Kruger, Windschitl, Burrus, Fessel, & Chambers, 2008; Moore & Healy, 2008; Moore & Small, 2007; Roy, Liersch, & Broomell, 2013; on the false uniqueness/false consensus effect, see Galesic, Olsson, & Rieskamp, 2013; more generally, see also Hilbert, 2012).

URL: https://www.sciencedirect.com/science/article/pii/B9780128002834000022

Laura Kress, Tatjana Aue, in Neuroscience & Biobehavioral Reviews, 2017

Abstract

Both optimism bias and reward-related attention bias have crucial implications for well-being and mental health. Yet, the extent to which the two biases interact remains unclear because, to date, they have mostly been discussed in isolation. Examining interactions between the two biases can lead to new directions in neurocognitive research by revealing their underlying cognitive and neurophysiological mechanisms. In the present article, we suggest that optimism bias and reward-related attention bias mutually enforce each other and recruit a common underlying neural network. Key components of this network include specific activations in the anterior and posterior cingulate cortex with connections to the amygdala. We further postulate that biased memory processes influence the interplay of optimism and reward-related attention bias. Studying such causal relations between cognitive biases reveals important information not only about normal functioning and adaptive neural pathways in maintaining mental health, but also about the development and maintenance of psychological diseases, thereby contributing to the effectiveness of treatment.

URL: https://www.sciencedirect.com/science/article/pii/S0149763416306406

Default Rules Are Better Than Active Choosing (Often)

Cass R. Sunstein, in Trends in Cognitive Sciences, 2017

Steering Choices

Over the past decade, public officials have become keenly interested in behavioral science [1–3]. Research findings about present bias (see Glossary), the selective nature of attention, loss aversion, status quo bias, optimism bias, habit formation, the use of heuristics in risk perception (including the availability heuristic), and sources of fear are playing an active role in helping officials to understand how best to address serious social problems [1,2]. For example, present bias might lead people to neglect long-term consequences, potentially endangering their health (consider the problems of smoking and obesity) and their financial situation (consider the failure to save for retirement, or to save at all) [3,4]. Selective attention might lead people to ignore important (and expensive) features of products and services, including credit card agreements, household appliances, and mortgages [3]. Status quo bias might lead people to stick with arrangements, such as a relationship with an expensive energy provider, that are not in their interest [3]. Armed with findings of this kind, more than 150 governments are enlisting psychology, neuroscience, and behavioral economics in efforts to design and improve programs that involve savings, poverty, consumer protection, the environment, smoking, traffic safety, education, obesity, national security, and more [1,2,4].

In these efforts, an organizing idea is ‘choice architecture’, which refers to the background environment against which people choose products, services, and activities [3] (Table 1). That background might be some physical organization (of, say, an office or a cafeteria); it might consist of a user interface (on, say, an application form, menu, cell phone, or website); or it might be the social environment (as, for example, when the presence and the actions of peers are made visible or salient). Whenever consumers, employees, or others make decisions, some such background is inevitably in place, even if it seems invisible. Subtle features of its design – such as the ordering of items, the presence of a default rule, how information is framed, colors, font size, sounds – may have decisive effects on what people choose to do [2,3]. Features or changes of the design can operate as ‘nudges’ [3], in the form of freedom-preserving interventions that steer people in ways that they find helpful. In many domains, nudges and behaviorally informed choice architecture have produced significant results, for example by increasing savings, reducing pollution, combatting obesity, and improving the access of poor children to free school meals; the range of demonstrations is large, and constantly growing [3,5].

Table 1. Choice Architecture Is Pervasive in the Public and Private Sectors

Example of choice architecture: illustration of policy options or issue area

  • Design of government websites: Order of items, number of items, font size, colors; number of ‘clicks’ to receive necessary information
  • Design of cafeterias: Healthy foods first, unhealthy foods first
  • Default rules: Automatic enrollment in green energy, automatic enrollment in savings programs
  • Increased simplification: Asking fewer questions on forms that must be filled out to obtain benefits or permits; reducing the length of a communication to citizens or customers
  • Required active choosing: Asking people whether they want to be organ donors as a condition for obtaining a driver’s license
  • Prompted choice: Asking people whether they want to be organ donors, with no requirement that they answer the question
  • Disclosure of information: Calorie content of foods; ‘traffic lights’ (red, yellow, green) for foods; fuel economy of vehicles; costs of late payments
  • Warnings, graphic or others: Risks associated with smoking and distracted driving
  • Reminders: Bills coming due, time to take medicines
  • Use of social norms: Informing people that their behavior deviates from that of most people
  • Framing: Describing consequences, such as economic or health benefits, as either losses or gains
  • Personalized communication: Tailoring a letter or email to the specific situation of the recipient

Notwithstanding recent progress, there have been increasingly heated debates about the choice between two potentially appealing forms of choice architecture [6]. The first relies on default rules; the second involves active choosing. Both approaches have the virtue of retaining freedom of choice, in the sense that they allow people to go their own way. But in multiple contexts, policymakers must choose between the two approaches. For governments, the stakes can be high, in the sense that in areas that range from environmental protection to savings plans to relief of poverty, the outcomes can be radically different depending on which approach is chosen [6]. For example, automatic enrollment in a savings plan or in green energy is likely to produce significantly higher participation rates than is active choosing. This point has serious implications for the private sector, which is often in a position to enlist default rules or instead to require customers to make explicit choices. For example, a rental car company can simply default customers into certain terms (such as insurance), or it can insist instead on an active choice, and when a bank provides a mortgage, it can simply default customers into a variety of terms that are the usual or generally preferred course (subject to opt-out), or it can instead require them explicitly to indicate their preferences on each one.

URL: https://www.sciencedirect.com/science/article/pii/S1364661317301043

Anticipatory feelings: Neural correlates and linguistic markers

Elka Stefanova, ... Leroy Lowe, in Neuroscience & Biobehavioral Reviews, 2020

3.1.1 Optimism

The neurobiology of positive anticipatory emotions can be investigated by examining the relationship between brain function and structure with a trait or personality characteristic such as optimism. Optimism, defined as the expectation of positive outcomes, has been linked to psychological and physical well-being as well as higher life satisfaction (Hinz et al., 2018) and is believed to be very similar to the personality trait of hope (Bryan and Cvengros, 2004).

While optimism and hope both involve positive future-oriented expectations and are often used synonymously in the literature, some researchers argue that they are dissociable constructs. Optimism is proposed to reflect a state of general positive expectancy (e.g., today will be a good day), whereas hope is associated with personal agency and self-initiated actions that are expected to result in specific positive outcomes (e.g., getting a good grade on a test) and is more closely related to “wanting” states (Alarcon et al., 2013; Bruininks and Malle, 2005; Gallagher and Lopez, 2009).

Investigations of the neural basis of these constructs have indicated that the OFC, particularly its medial region (referred to as medial OFC or vmPFC), plays a key role in trait optimism and hope. In one study using the fractional amplitude of low-frequency fluctuations in resting-state fMRI, which measures the strength or power of low-frequency (<0.8 Hz) oscillations, trait hope was negatively correlated with power in the bilateral medial OFC during rest (Wang et al., 2017a). Trait optimism has been shown to be negatively correlated with resting-state connectivity between the vmPFC/medial OFC and right inferior frontal gyrus (IFG) (Ran et al., 2017), and positively related to gray matter volume in the lateral and medial OFC (Dolcos et al., 2016) as well as the thalamus/pulvinar (Yang et al., 2013). In a task-based investigation where participants imagined future events, greater trait optimism was associated with increased activity in a region appearing to include the vmPFC, referred to in the article as the rostral anterior cingulate (Sharot et al., 2007), when comparing positive with negative future event imagination. A critical component of trait optimism is believed to lie in a pervasive cognitive bias known as optimism bias. This is broadly defined as the tendency to overestimate the likelihood of positive events occurring (e.g., living longer than average) and underestimate the likelihood of negative events occurring (e.g., getting a divorce) (Kress and Aue, 2017; Sharot et al., 2007). The optimism bias can be found in the majority of people (about 80%), and even in animals such as rats and birds, but it disappears in psychiatric diseases such as depression (Garrett et al., 2014; Sharot, 2011) or anxiety disorders (Blair et al., 2017). Remarkably, optimistic illusions may be the only adaptive misbeliefs (McKay and Dennett, 2009). It is possible that optimism bias could be related to inaccuracies in affective forecasting, or the ability to predict how one will feel about a future event. Individuals have a tendency to overestimate the intensity and duration of both positive and negative future emotional reactions (Miloyan and Suddendorf, 2015; Wilson and Gilbert, 2005), and it is possible that overestimations regarding the anticipated happiness that would result from a positive future event could further contribute to optimism bias.

Optimism bias is driven by the motivation to adopt the most favorable or rewarding expectations and dismiss evidence that would lead to unfavorable expectations, thus creating asymmetrical belief patterns in an overly positive direction (Sharot et al., 2011). In line with this supposition, studies have confirmed that healthy individuals consider themselves significantly more likely to experience positive events than negative events (Blair et al., 2013; Sharot et al., 2007). In recent neuroimaging research, several paradigms have been utilized to elicit the optimism bias. In one of the earlier fMRI studies focusing on optimism, Sharot et al. (Sharot et al., 2007) directed participants to think of either past or future autobiographical life events (e.g., winning an award, the end of a romantic relationship), which were then classified into positive, negative, and neutral events. Although this study did not compute the optimism bias behaviorally, the neuroimaging contrast (comparing brain activation when imagining future positive events to that when imagining future negative events) may provide a neural index of the asymmetrical relationship between positive and negative event processing that is at the core of the optimism bias. Results showed greater activity in vmPFC, the amygdala, and increased functional connectivity between the two regions when imagining future positive events relative to future negative events. In an adapted version of this paradigm, Blair and colleagues (Blair et al., 2013) presented participants with both positive future events (e.g., finding a cure for AIDS) and negative future events (e.g., being sentenced to jail). The optimism bias was behaviorally quantified through ratings where participants indicated the probability of each event occurring across their lifetime compared to individuals of the same age and gender. There was a highly significant optimism bias such that participants rated themselves as more likely than others to have positive events and less likely than others to have negative events. Similar to the results from Sharot et al. (Sharot et al., 2007), increased activity in the vmPFC was associated with thinking about positive versus negative future events; however, activity in this region was not modulated by individual participants’ optimism bias. Instead, greater optimism bias for positive events was associated with increased rACC activity in a region that did not overlap with the vmPFC (the area found for the contrast of positive vs. negative event processing), whereas greater optimism bias for future negative events was associated with reduced anterior insula and dorsomedial prefrontal cortex (dmPFC) activity (Blair et al., 2013). The authors argue that these findings distinguish between a more general evaluation signal in the vmPFC (distinguishing “good” from “bad”), a sensitivity that could no doubt contribute to trait optimism (Sharot et al., 2007), and other regions including the rACC, dmPFC, and insula, which are involved in generating an individual’s specific optimism bias. Research from Sharot’s lab provides further evidence that patients with major depression tend to update their beliefs in a more unbiased way (Garrett et al., 2014). In their study, depressed patients updated their beliefs in both directions, positive and negative, while healthy controls showed an optimism bias, tending to integrate positive news more readily. Importantly, mildly depressed patients suffer from a lack of a positive bias rather than the presence of a negative bias, leading to a more realistic view. The fMRI activations showed that the left IFG and bilateral superior frontal gyrus mediated positive information, while the right inferior parietal lobule and right IFG processed negative information about the future.
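
A minimal scoring sketch for the comparative-rating paradigm just described (Blair et al., 2013): participants rate how likely each event is for them relative to others, and the bias is the tendency to rate positive events as more likely and negative events as less likely for oneself. The rating scale and the numbers below are hypothetical; the event labels are taken from the examples mentioned above.

```python
# Hypothetical scoring sketch for comparative likelihood ratings.
# Assumed scale: -3..+3, where 0 = "as likely as others" and
# +3 = "much more likely than others".  Ratings are invented.
ratings = {
    # event: (valence, comparative rating)
    "winning an award":        ("positive", +2),
    "finding a cure for AIDS": ("positive", +1),
    "being sentenced to jail": ("negative", -2),
    "getting a divorce":       ("negative", -1),
}

pos = [r for valence, r in ratings.values() if valence == "positive"]
neg = [r for valence, r in ratings.values() if valence == "negative"]

# Optimism bias: rating oneself as MORE likely than others to experience
# positive events and LESS likely to experience negative events.
bias_positive = sum(pos) / len(pos)     # > 0 indicates optimism
bias_negative = -sum(neg) / len(neg)    # > 0 indicates optimism
print(f"optimism bias for positive events: {bias_positive:+.1f}")
print(f"optimism bias for negative events: {bias_negative:+.1f}")
print(f"overall optimism bias index      : {(bias_positive + bias_negative) / 2:+.1f}")
```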

Finally, in order to examine the maintenance of an optimistic outlook even in the face of disconfirming evidence, a belief updating paradigm was implemented in two studies (Kuzmanovic et al., 2016; Sharot et al., 2011). In both studies, healthy individuals updated their beliefs less after receiving unfavorable information (i.e., the base rate of the negative event is higher than the participant estimated) versus favorable information (i.e., the base rate of the event is lower than the participant estimated). That is, they showed a strong optimism bias since they were more likely to update their estimates in an optimistic direction after receiving a base rate that was better (i.e., lower) than they had originally estimated, but less likely to update their beliefs in a pessimistic direction after receiving a base rate that was worse (i.e., higher) than they originally estimated (Kuzmanovic et al., 2016; Sharot et al., 2011). Greater estimation updating following the presentation of favorable information was associated with augmented activity in the vmPFC (Kuzmanovic et al., 2016), as well as dmPFC and right cerebellum (Sharot et al., 2011). Interestingly, greater vmPFC activity was also linked to smaller estimation updates following the presentation of unfavorable information, an effect that was also found in occipital and temporal cortex, dmPFC, ventral striatum, and thalamus (Kuzmanovic et al., 2016). Moreover, the results of Sharot and colleagues (Sharot et al., 2011) suggest a specific role for right inferior frontal gyrus (IFG) in the ability to integrate undesirable information and update likelihood estimates for negative events. Additionally, individuals with higher trait optimism show reduced IFG responses to unfavorable information.
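
The asymmetric updating reported in these studies can be summarized with a simple index: the mean belief update after favorable information minus the mean update after unfavorable information. The sketch below uses invented trials; it illustrates the scoring logic only and is not the authors' analysis pipeline.

```python
# Hypothetical data and a minimal scoring sketch for the belief-updating
# paradigm described above (Sharot et al., 2011; Kuzmanovic et al., 2016).
# Numbers are invented for illustration.
trials = [
    # (first estimate %, presented base rate %, second estimate %)
    (40, 25, 30),   # favorable: base rate lower than feared   -> update 10
    (20, 35, 23),   # unfavorable: base rate higher than hoped -> update 3
    (50, 30, 38),   # favorable                                -> update 12
    (15, 40, 17),   # unfavorable                              -> update 2
]

good_news_updates, bad_news_updates = [], []
for first, base_rate, second in trials:
    update = abs(second - first)           # size of the revision
    if base_rate < first:                  # desirable information
        good_news_updates.append(update)
    else:                                  # undesirable information
        bad_news_updates.append(update)

mean_good = sum(good_news_updates) / len(good_news_updates)
mean_bad = sum(bad_news_updates) / len(bad_news_updates)
print(f"mean update after good news: {mean_good:.1f}")
print(f"mean update after bad news : {mean_bad:.1f}")
print(f"update asymmetry (optimism bias index): {mean_good - mean_bad:+.1f}")
```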

URL: https://www.sciencedirect.com/science/article/pii/S0149763419300570

Teaching the use of framing and decontextualization to address context-based bias in psychiatry

Rami Bou Khalil, ... Elie Nemr, in Asian Journal of Psychiatry, 2020

5 Conclusion

The importance of teaching cognitive debiasing strategies to medical students has been emphasized in previous research (Chew et al., 2016; Oliver et al., 2017). In addition, it has been clear to the scientific community since the second half of the last century that logical reasoning can be taught and trained (Nisbett et al., 1987). More recently, reference class forecasting was made mandatory in several countries such as the United Kingdom for large government infrastructure projects with the explicit purpose of eliminating optimism bias induced by the framing effect (Flyvberg, 2008). Reference class forecasting is a debiasing strategy elaborated by Kahneman and Tversky, which later helped Kahneman win the Nobel Prize in economics in 2002. This strategy is becoming increasingly popular in business circles because it improves the accuracy of projections by basing them on actual performance in a reference class of comparable actions, thereby bypassing both optimism bias and strategic misrepresentation (Flyvberg, 2008). Accordingly, the importance of debiasing decision-making in high-risk industries such as healthcare is a recognized priority for managers and policy-makers. These factors lead us to recommend the systematic implementation of cognitive debiasing techniques in general, and decontextualization exercises in particular, in medical education. This is of particular importance in psychiatry, a specialty that is heavily influenced by cultural, emotional and social frames, while almost exclusively relying on individual judgment in the absence of clinically validated biological markers for diagnosis and prognosis. Future research should evaluate the effectiveness of such interventions in reducing avoidable human error that can have significant consequences on patient care.
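
As a rough illustration of how reference class forecasting counteracts optimism bias (a simplified sketch, not the official uplift procedure used in government guidance), one can take the overrun ratios observed in comparable past projects and uplift the raw "inside view" estimate to a chosen quantile of that distribution. The overrun ratios and the acceptable risk level below are invented.

```python
# Simplified sketch of reference class forecasting: uplift a raw estimate
# using overrun ratios from comparable past projects.  All numbers invented.

# actual cost / originally estimated cost for comparable past projects
reference_class_overruns = [1.10, 1.45, 0.95, 1.30, 2.10, 1.25, 1.60, 1.05]

raw_estimate = 100.0           # planner's "inside view" estimate (arbitrary units)
acceptable_overrun_risk = 0.2  # accept a 20% chance of still exceeding the forecast

# Uplift = overrun ratio at the (1 - risk) quantile of the reference class.
sorted_overruns = sorted(reference_class_overruns)
index = min(int((1 - acceptable_overrun_risk) * len(sorted_overruns)),
            len(sorted_overruns) - 1)
uplift = sorted_overruns[index]

print(f"raw estimate      : {raw_estimate:.0f}")
print(f"applied uplift    : x{uplift:.2f}")
print(f"debiased forecast : {raw_estimate * uplift:.0f}")
```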

URL: https://www.sciencedirect.com/science/article/pii/S1876201820303889

Which of the following phenomena is defined as the tendency to overestimate the commonality of one's opinions and one's undesirable or unsuccessful behaviors?

False Consensus Effect—The tendency to overestimate the commonality of one's opinions and one's undesirable or unsuccessful behaviors. False Uniqueness Effect—The tendency to underestimate the commonality of one's abilities and one's desirable or successful behaviors.

Which of the following terms is defined as a person's overall self-evaluation or sense of self-worth?

Abstract. Self-esteem refers to a person's evaluation of his/her worth. The best-known form is global self-esteem: general, dispositional, and consciously accessible self-evaluation. Psychologists have argued that self-esteem is important because it signals how well accepted or culturally valued one is.

Which of the following is a characteristic of narcissists?

  • Have an exaggerated sense of self-importance.
  • Have a sense of entitlement and require constant, excessive admiration.
  • Expect to be recognized as superior even without achievements that warrant it.
  • Exaggerate achievements and talents.

When you overestimate the uniqueness of your own abilities you demonstrate?

The Dunning-Kruger effect is one of many cognitive biases that can affect your behaviors and decisions, from the mundane to the life-changing. While it may be easier to recognize the phenomenon in others, it is important to remember that it is something that impacts everyone.