
THE “GREAT REPLACEMENT”. 60% of Britons believe in conspiracy theories – Esther Addley * Critical Thinking Skills. Why more highly educated people are less into conspiracy theories – Christian Jarrett.

Leavers are more likely to doubt immigration figures and think there is a plot to make Muslims the majority in the UK.

Sixty per cent of British people believe at least one conspiracy theory about how the country is run or the veracity of information they have been given, a major new study has found, part of a pattern of deep distrust of authority that has become widespread across Europe and the US.

In the UK, people who supported Brexit were considerably more likely to give credence to conspiracy theories than those who opposed it, with 71% of leave voters believing at least one theory compared with 49% of remain voters.

Almost half (47%) of leave voters believed the government had deliberately concealed the truth about how many immigrants live in the UK, versus 14% of remain voters. A striking 31% of leave voters believed that Muslim immigration was part of a wider plot to make Muslims the majority in Britain, a conspiracy theory known as the “great replacement” that originated in French far-right circles. The comparable figure for remain voters was 6%.

The disparity between those who voted for Donald Trump and those who voted for Hillary Clinton in the US was even starker: 47% of Trump voters believed that man-made global warming was a hoax, compared with 2.3% of Clinton voters.

The figures were the result of a large-scale international project conducted over six years and in nine countries by researchers at the University of Cambridge and YouGov, funded by the Leverhulme Trust. The study was the most comprehensive examination of conspiracy theories ever conducted, and marks the first time academics have explored questions of conspiracy beliefs, social trust and news consumption habits across different countries.

. . . The Guardian

Conspiracy and Democracy: History, Political Theory and Internet Research

The Leverhulme-funded project based at CRASSH, University of Cambridge

Directors: Professor Sir Richard J Evans (History), Professor John Naughton (CRASSH), Professor David Runciman (Politics and International Studies)

Theories and beliefs about conspiracies are an enduring feature of modern societies. This is partly a reflection of the fact that real conspiracies do exist, and have existed in the past. But the pervasiveness of conspiracy theories in the twenty-first century suggests that many other factors are also at work, and studying them provides opportunities for understanding how people make sense of the world and how societies function. What does the prevalence of conspiracy theories tell us about trust in democratic societies, and about the differences between cultures and societies? How have conspiracies and conspiracy theorising changed over the centuries and what, if any, is the relationship between them? Have conspiracy theories appeared at particular moments in history, and why?

This ambitious, five-year, interdisciplinary research project aims to explore these and related questions. It sets out not to debunk particular theories but to provide a “natural history” of conspiracy theorising. To do that, the project combines the perspectives, investigative methods and insights of historians, political theorists, network engineers and other disciplines to produce what we hope will be a deeper and richer understanding of a fascinating and puzzling phenomenon.

Conspiracy and Democracy Study

Also on TPPA = CRISIS

Critical Thinking Skills. Why more highly educated people are less into conspiracy theories – Christian Jarrett

Debunking the Conspiratists. The Federal Reserve System, Illuminati and The New World Order. Real Facts.

I KNOW WHAT I KNOW! Biased Assimilation and Attitude Polarization: The Effects of Prior Theories on Subsequently Considered Evidence – Charles G. Lord, Lee Ross, and Mark R. Lepper, Stanford University, 1979.

People who hold strong opinions on complex social issues are likely to examine relevant empirical evidence in a biased manner.

They are apt to accept “confirming” evidence at face value while subjecting “disconfirming” evidence to critical evaluation, and as a result to draw undue support for their initial positions from mixed or random empirical findings. Thus, the result of exposing contending factions in a social dispute to an identical body of relevant empirical evidence may be not a narrowing of disagreement but rather an increase in polarization.

Our beliefs can survive the complete subtraction of the critical formative evidence on which they were initially based. In a complementary fashion, the present study shows that strongly entrenched beliefs can also survive the addition of nonsupportive evidence.

It is unsurprising, therefore, that important social issues and policies generally prompt sharp disagreements, even among highly concerned and intelligent citizens, and that such disagreements often survive strenuous attempts at resolution through discussion and persuasion.

To test these assumptions and predictions, subjects supporting and opposing capital punishment were exposed to two purported studies, one seemingly confirming and one seemingly disconfirming their existing beliefs about the deterrent efficacy of the death penalty. As predicted, both proponents and opponents of capital punishment rated those results and procedures that confirmed their own beliefs to be the more convincing and probative ones, and they reported corresponding shifts in their beliefs as the various results and procedures were presented. The net effect of such evaluations and opinion shifts was the postulated increase in attitude polarization.

The human understanding when it has once adopted an opinion draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises, or else by some distinction sets aside and rejects, in order that by this great and pernicious predetermination the authority of its former conclusion may remain inviolate. (Bacon, 1620/1960)

Often, more often than we care to admit, our attitudes on important social issues reflect only our preconceptions, vague impressions, and untested assumptions. We respond to social policies concerning compensatory education, water fluoridation, or energy conservation in terms of the symbols or metaphors they evoke (Abelson, 1976; Kinder & Kiewiet, Note 1) or in conformity with views expressed by opinion leaders we like or respect (Katz, 1957). When “evidence” is brought to bear it is apt to be incomplete, biased, and of marginal probative value: typically, no more than a couple of vivid, concrete, but dubiously representative instances or cases (cf. Abelson, 1972; Nisbett & Ross, in press). It is unsurprising, therefore, that important social issues and policies generally prompt sharp disagreements, even among highly concerned and intelligent citizens, and that such disagreements often survive strenuous attempts at resolution through discussion and persuasion.

An interesting question, and one that prompts the present research, involves the consequences of introducing the opposing factions to relevant and objective data. This question seems particularly pertinent for contemporary social scientists, who have frequently called for “more empirically based” social decision making (e.g., Campbell, 1969).

Very likely, data providing consistent and unequivocal support for one or another position on a given issue can influence decision makers and, with sufficiently energetic dissemination, public opinion at large. But what effects can be expected for more mixed or inconclusive evidence of the sort that is bound to arise for most complex social issues, especially where full-fledged experiments yielding decisive and easy-to-generalize results are a rarity? Logically, one might expect mixed evidence to produce some moderation in the views expressed by opposing factions. At worst, one might expect such inconclusive evidence to be ignored.

The present study examines a rather different thesis, one born in an analysis of the layperson’s general shortcomings as an intuitive scientist (cf. Nisbett & Ross, in press; Ross, 1977) and his more specific shortcomings in adjusting unwarranted beliefs in the light of empirical challenges (cf. Ross, Lepper, & Hubbard, 1975).

Our thesis is that belief polarization will increase, rather than decrease or remain unchanged, when mixed or inconclusive findings are assimilated by proponents of opposite viewpoints.

This “polarization hypothesis” can be derived from the simple assumption that data relevant to a belief are not processed impartially. Instead, judgments about the validity, reliability, relevance, and sometimes even the meaning of proffered evidence are biased by the apparent consistency of that evidence with the perceiver’s theories and expectations. Thus individuals will dismiss and discount empirical evidence that contradicts their initial views but will derive support from evidence, of no greater probativeness, that seems consistent with their views. Through such biased assimilation even a random set of outcomes or events can appear to lend support for an entrenched position, and both sides in a given debate can have their positions bolstered by the same set of data.
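To make the mechanism concrete, here is a minimal toy simulation, not from Lord, Ross and Lepper’s paper: the update rule, step size and discount factor are illustrative assumptions only. It shows how taking confirming evidence at face value while discounting disconfirming evidence can widen the gap between two readers who see exactly the same perfectly mixed set of findings.

```python
# Toy sketch of biased assimilation (illustrative assumptions, not the
# authors' model): confirming evidence is taken at face value,
# disconfirming evidence is heavily discounted.

def biased_update(belief, evidence, step=0.1, discount=0.25):
    """belief in [-1, 1] (negative = anti-deterrence, positive = pro);
    evidence is +1 (pro-deterrence finding) or -1 (anti-deterrence finding)."""
    confirming = (evidence > 0) == (belief > 0)
    weight = 1.0 if confirming else discount   # hypothetical discount factor
    return max(-1.0, min(1.0, belief + step * weight * evidence))

proponent, opponent = 0.5, -0.5        # strong initial views on both sides
mixed_evidence = [+1, -1] * 5          # identical, perfectly balanced findings

for finding in mixed_evidence:
    proponent = biased_update(proponent, finding)
    opponent = biased_update(opponent, finding)

print(f"proponent: {proponent:+.2f}, opponent: {opponent:+.2f}")
# Both drift further from zero (roughly +0.88 and -0.88 here), so the gap
# widens even though each side saw exactly the same mixed body of evidence.
```

If the discount is removed so that both kinds of evidence carry equal weight, the same loop leaves each reader where they started, which is the moderation one might naively expect mixed data to produce.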

As the introductory quotation suggests, the notions of biased assimilation and resulting belief perseverance have a long history. Beyond philosophical speculations and a wealth of anecdotal evidence, considerable research attests to the capacity of preconceptions and initial theories to bias the consideration of subsequent evidence, including work on classic Einstellung effects (Luchins, 1942, 1957), social influence processes (Asch, 1946), impression formation (e.g., Jones & Goethals, 1971), recognition of degraded stimuli (Bruner & Potter, 1964), resistance to change of social attitudes and stereotypes (Abelson, 1959; Allport, 1954), self-fulfilling prophecies (Merton, 1948; Rosenhan, 1973; Snyder, Tanke, & Berscheid, 1977), and the persistence of “illusory correlations” (Chapman & Chapman, 1967, 1969).

In a particularly relevant recent demonstration, Mahoney (1977) has shown that trained social scientists are not immune to theory-based evaluations. In this study, professional reviewers’ judgments about experimental procedures and resultant publication recommendations varied dramatically with the degree to which the findings of a study under review agreed or disagreed with the reviewers’ own theoretical predilections.

Thus, there is considerable evidence that people tend to interpret subsequent evidence so as to maintain their initial beliefs. The biased assimilation processes underlying this effect may include a propensity to remember the strengths of confirming evidence but the weaknesses of disconfirming evidence, to judge confirming evidence as relevant and reliable but disconfirming evidence as irrelevant and unreliable, and to accept confirming evidence at face value while scrutinizing disconfirming evidence hypercritically.

With confirming evidence, we suspect that both lay and professional scientists rapidly reduce the complexity of the information and remember only a few well-chosen supportive impressions. With disconfirming evidence, they continue to reflect upon any information that suggests less damaging “alternative interpretations.” Indeed, they may even come to regard the ambiguities and conceptual flaws in the data opposing their hypotheses as somehow suggestive of the fundamental correctness of those hypotheses. Thus, completely inconsistent or even random data, when “processed” in a suitably biased fashion, can maintain or even reinforce one’s preconceptions.

The present study was designed to examine both the biased assimilation processes that may occur when subjects with strong initial attitudes are confronted with empirical data concerning a controversial social issue and the consequent polarization of attitudes hypothesized to result when subjects with differing initial attitudes are exposed to a common set of “mixed” experimental results. The social controversy chosen for our investigation was the issue of capital punishment and its effectiveness as a deterrent to murder. This choice was made primarily because the issue is the subject of strongly held views that frequently do become the target of public education and media persuasion attempts, and has been the focus of considerable social science research in the last twenty years. Indeed, as our basic hypothesis suggests, contending factions in this debate often cite and derive encouragement from the same body of inconclusive correlational research (Furman v. Georgia, 1972; Sarat & Vidmar, 1976; Sellin, 1967).

In the present experiment, we presented both proponents and opponents of capital punishment first with the results and then with procedural details, critiques, and rebuttals for two studies dealing with the deterrent efficacy of the death penalty, one study confirming their initial beliefs and one study disconfirming their initial beliefs.

We anticipated biased assimilation at every stage of this procedure. First, we expected subjects to rate the quality and probative value of studies confirming their beliefs on deterrent efficacy more highly than studies challenging their beliefs. Second, we anticipated corresponding effects on subjects’ attitudes and beliefs such that studies confirming subjects’ views would exert a greater impact than studies disconfirming those views. Finally, as a function of these assimilative biases, we hypothesized that the net result of exposure to the conflicting results of these two studies would be an increased polarization of subjects’ beliefs on deterrent efficacy and attitudes towards capital punishment.

Figure 1. Top panel: Attitude changes on capital punishment relative to start of experiment as reported across time by subjects who received prodeterrence study first. Bottom panel: Attitude changes on capital punishment relative to start of experiment as reported across time by subjects who received antideterrence study first.

Discussion

The results of the present experiment provide strong and consistent support for the attitude polarization hypothesis and for the biased assimilation mechanisms postulated to underlie such polarization. The net effect of exposing proponents and opponents of capital punishment to identical evidence (studies ostensibly offering equivalent levels of support and disconfirmation) was to increase further the gap between their views. The mechanisms responsible for this polarization of subjects’ attitudes and beliefs were clearly suggested by correlational analyses. Subjects’ decisions about whether to accept a study’s findings at face value or to search for flaws and entertain alternative interpretations seemed to depend far less on the particular procedure employed than on whether the study’s results coincided with their existing beliefs.

The Normative Issue

It is worth commenting explicitly about the normative status of our subjects’ apparent biases. First, there can be no real quarrel with a willingness to infer that studies supporting one’s theory-based expectations are more probative than, or methodologically superior to, studies that contradict one’s expectations. When an “objective truth” is known or strongly assumed, then studies whose outcomes reflect that truth may reasonably be given greater credence than studies whose outcomes fail to reflect that truth. Hence the physicist would be “biased,” but appropriately so, if a new procedure for evaluating the speed of light were accepted if it gave the “right answer” but rejected if it gave the “wrong answer.”

The same bias leads most of us to be skeptical about reports of miraculous virgin births or herbal cures for cancer, and despite the risk that such theory-based and experience-based skepticism may render us unable to recognize a miraculous event when it occurs, overall we are surely well served by our bias. Our subjects’ willingness to impugn or defend findings as a function of their conformity to expectations can, in part, be similarly defended. Only the strength of their initial convictions in the face of the existing inconclusive social data and arguments can be regarded as “suspect.”

Our subjects’ main inferential shortcoming, in other words, did not lie in their inclination to process evidence in a biased manner. Willingness to interpret new evidence in the light of past knowledge and experience is essential for any organism to make sense of, and respond adaptively to, its environment. Rather, their sin lay in their readiness to use evidence already processed in a biased manner to bolster the very theory or belief that initially “justified” the processing bias. In so doing, subjects exposed themselves to the familiar risk of making their hypotheses unfalsifiable, a serious risk in a domain where it is clear that at least one party in a dispute holds a false hypothesis, and allowing themselves to be encouraged by patterns of data that they ought to have found troubling. Through such processes laypeople and professional scientists alike find it all too easy to cling to impressions, beliefs, and theories that have ceased to be compatible with the latest and best evidence available (Mahoney, 1976, 1977).

Figure 2. Top panel: Belief changes on capital punishment’s deterrent efficacy relative to start of experiment as reported across time by subjects who received prodeterrence study first. Bottom panel: Belief changes on capital punishment’s deterrent efficacy relative to start of experiment as reported across time by subjects who received antideterrence study first.

Polarization: Real or Merely Reported?

Before further pursuing the broader implications of the present demonstration, it is necessary to consider an important question raised by our procedure: Did our subjects really show change (i.e., polarization) in their private beliefs about the desirability and deterrent efficacy of capital punishment? Certainly they told us, explicitly, that their attitudes and beliefs did change after each new piece of evidence was presented, and from the beginning to the end of the experiment. Moreover, they did show a willingness to report a shift in their attitudes in the direction of findings that were contrary to their beliefs, at least until those findings were exposed to methodological scrutiny and possible alternative interpretations. Nevertheless, it could be argued that subjects were not reporting real shifts in attitudes but instead were merely reporting what they believed to be a rational or appropriate response to each increment in the available evidence.

Although we believe that it remains an impressive demonstration of assimilation biases to show that contending factions both believe the same data to justify their position “objectively,” the potential limitations of the present measures should be kept in mind in evaluating the relationship of this study to prior polarization research. As noted earlier our intended strategy of assessing direct changes from our initial selection measures of attitudes and beliefs, rather than asking subjects to report such changes within the experiment, was neither feasible nor appropriate, given the necessity of selecting subjects with strong and consistent initial views on this issue. Potentially such methodological problems could be overcome in subsequent research through the use of less extreme samples or, perhaps more convincingly, by seeing whether biased assimilation of mixed evidence will make subjects more willing to act on their already extreme beliefs.

Belief Perseverance and Attribution Processes

The present results importantly extend the growing body of research on the perseverance of impressions and beliefs. Two of the present authors and their colleagues have now amassed a number of studies showing that, once formed, impressions about the self (Ross et al., 1975; Jennings, Lepper, & Ross, Note 2; Lepper, Ross, & Lau, Note 3), beliefs about other people (Ross et al., 1975), or theories about functional relationships between variables (Anderson, Lepper, & Ross, Note 4) can survive the total discrediting of the evidence that first gave rise to such beliefs. In essence, these prior studies demonstrate that beliefs can survive the complete subtraction of the critical formative evidence on which they were initially based. In a complementary fashion, the present study shows that strongly entrenched beliefs can also survive the addition of nonsupportive evidence.

These findings pose some fundamental questions for traditional attribution models. To the extent that beliefs and impressions can be shown to persevere in the face of subsequent challenging data, we need a “top down” rather than, or perhaps in conjunction with, a “bottom up” approach (cf. Bobrow & Norman, 1975) to the question of how individuals extract meaning from their social environment. Instead of viewing people as impartial, data-driven processors, the present research suggests our models must take into account the ways in which intuitive scientists assess the relevance, reliability, representativeness, and implications of any given sample of data or behavior within the framework of the hypotheses or implicit theories they bring to the situation (Lepper, 1977). In everyday life, as well as in the course of scientific controversies (cf. Kuhn, 1970), the mere availability of contradictory evidence rarely seems sufficient to cause us to abandon our prior beliefs or theories.

Social Science Research and Social Policy

We conclude this article, as we began it, by considering the important links between social policy, public attitudes and beliefs about such policy, and the role of the social scientist. If our study demonstrates anything, it surely demonstrates that social scientists cannot expect rationality, enlightenment, and consensus about policy to emerge from their attempts to furnish “objective” data about burning social issues. If people of opposing views can each find support for those views in the same body of evidence, it is small wonder that social science research, dealing with complex and emotional social issues and forced to rely upon inconclusive designs, measures, and modes of analysis, will frequently fuel rather than calm the fires of debate.

See also:

Debunking the Conspiratists. The Federal Reserve System, Illuminati and The New World Order. Real Facts.

The enduring appeal of conspiracy theories – Melissa Hogenboom.

Critical Thinking Skills. Why more highly educated people are less into conspiracy theories – Christian Jarrett * Suspicious Minds. Why We Believe Conspiracy Theories – Rob Brotherton.