Luigi Corvaglia*
Psychologist
Keywords: manipulation, cults, Milgram experiment
ABSTRACT
This paper examines the controversy over mind manipulation and shows that the divergence stems from the lack of an unambiguous conceptualization. The literature on mind manipulation is often tainted by a specious use of the metaphor of “brainwashing,” which serves to emphasize the supposed irrationality of manipulation and to place it outside the scientific realm. This paper shows instead that the findings of experimental psychology and behavioral economics identify a process of change in thinking and behavior that can be described in scientific terms. The classic Milgram experiment is an appropriate tool for explaining this. The framing effect identified by Tversky and Kahneman, so important in marketing, is quite capable of bringing about counterproductive decisions in a context that makes them seem reasonable to the person making them. The paper shows that thought manipulation is not a magical or supernatural phenomenon, but a persuasion process based on a set of well-studied basic principles of social psychology. It therefore proposes a psychosocial reading according to which this persuasion proceeds through the selection of the most willing subjects in sequential and orderly steps. Indoctrination acts mainly on the final recruits and takes place in a psychological and relational context that has by then been profoundly changed.
Introduction
The topic of mind manipulation is one of the most debated because of its ethical implications and possible legal consequences. The term “brainwashing,” first used by Edward Hunter in 1951 to describe the seemingly extraordinary changes the Chinese were able to induce in the thinking and behavior of prisoners of war (Hunter, 1951), has entered popular culture with vigour, but it remains a controversial concept in academia. Although mind control has frequently been associated with joining the destructive “cults” that have made headlines since the 1970s, some sociologists consider the idea “implausible,” because no methodologically sound research supports the thesis of brainwashing as an infallible technique of behaviour change that operates without the consent of the persuaded person. These authors therefore interpret affiliation with what they call “New Religious Movements” (NRMs) as the result of the individual’s free choice, even when persuasion is involved. The risk is that of setting up two opposing rhetorics, that of “thought control” and that of “free choice.” These two positions have in the past crystallized into two opposing camps, each given far-from-neutral labels by its opponents: scholars concerned with the phenomenon of manipulation have been referred to as the “anti-cult movement,” while critics of this approach have been called “cult apologists.” The present work aims to overcome this reductionist schematism by introducing variables that modulate and influence the decisions of individuals exposed to persuasion.
Preliminary notes on the concept of persuasion
Discussion, and consequently research, on the subject of manipulation in groups cannot free itself from the constraints created by the adoption of intellectual positions that deeply contaminate the serenity and objectivity that should be inherent in scientific debate (Zablocki, 1997). It is therefore preliminary to any discourse to define certain terms clearly, in order to avoid misunderstandings.
- Construct validity
First of all, it should be made clear that this is not a dispute between “believers” and “non-believers” in a metaphysical phenomenon. Denying the existence of psychological manipulation altogether would be untenable for any serious scholar who respects his readers (the fact that some authors depart from these standards is not the concern of this scientific discussion). It is by now undeniable that there are communicative modes capable of bypassing central, rational processing, and even specific techniques designed to do so (Kahneman and Tversky, 1979; Petty and Cacioppo, 1984; Bohner, Moskowitz, and Chaiken, 1995; Kahneman, 2011; Cialdini, 2017; Sharot, 2018). The main scientific evidence supports the existence of two different pathways or processes through which persuasive communication operates.
The Elaboration Likelihood Model (ELM) of Petty and Cacioppo (1984) describes two paths of information processing by the receiver:
- the central route: logical, characterized by the strength of arguments and content and a high level of information. The emphasis is on content.
- the peripheral route: characterized by a low level of elaboration and a greater focus on the superficial and formal aspects of the message, without critical analysis and cognitive engagement, and with a low level of information. The emphasis is on form.
People rarely use the central route because it requires significant cognitive effort (Budzynska and Weger, 2011). It is easier to use the peripheral route, which focuses on marginal elements such as the speaker’s attractiveness, confidence, tone of voice, the environment, the background, and so on. What is called charisma is precisely the ability to exploit the peripheral route powerfully.
In line with the ELM, the Heuristic-Systematic Model (HSM; Bohner, Moskowitz, and Chaiken, 1995) envisages two different mechanisms in the processing of persuasive messages: a heuristic and a systematic process. The first, which corresponds broadly to the peripheral route of the ELM, assumes a low level of cognitive effort and relies on heuristics. These are intuitive and rapid mental processes, mental shortcuts that make it possible to form a general idea of a topic without too much cognitive effort.
Systematic processing, on the other hand, requires a high commitment of cognitive resources, as does the central route of the Elaboration Likelihood Model. The difference from the ELM, however, is that the two types of processing, heuristic and systematic, are not mutually exclusive and can even interact with each other.
As early as 1979, Kahneman and Tversky had breathed life into behavioral economics with the elaboration of Prospect Theory, showing with a substantial body of data that people do not rationally choose the strategies that maximize their benefits (the economists’ expected utility), but are misled by a set of cognitive constraints. The mind is susceptible to a number of cognitive biases, systematic errors induced by heuristics that can be exploited by marketing and propaganda.
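The core of Prospect Theory can be stated compactly. The following is a simplified sketch, with parameter values drawn from the later cumulative version of the theory and quoted here only for illustration: outcomes x are assessed through a value function v that is concave for gains, convex for losses, and steeper for losses (loss aversion), while probabilities are distorted by a weighting function π.

\[ V=\sum_i \pi(p_i)\,v(x_i), \qquad v(x)=\begin{cases} x^{\alpha}, & x\ge 0\\ -\lambda\,(-x)^{\beta}, & x<0 \end{cases} \]

With the commonly cited estimates α ≈ β ≈ 0.88 and λ ≈ 2.25, a loss “weighs” roughly twice as much as a gain of the same size, which is what makes the way an option is framed so consequential.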
A clear example of how cognitive biases work is the framing effect, which causes our brains to evaluate, judge, or interpret information according to how its positive or negative consequences are presented. A therapy presented as having a 70 percent success rate is more popular than one presented as having a 30 percent failure rate, even though the two descriptions refer to exactly the same outcome. Kahneman and Tversky’s studies have had a tremendous impact on marketing.
Kahneman himself, who won the 2002 Nobel Prize in Economics for his studies on the fallacies of rational choice, has more recently proposed a dual model of thinking, distinguishing slow thinking from fast thinking. Slow thinking is rational thinking: it works slowly, sequentially, laboriously, and in a controlled manner. Fast thinking is intuitive: it is automatic, effortless, associative, and difficult to control (Kahneman, 2011).
Ultimately, one or more individuals with charisma and the ability to activate rapid (Kahneman), heuristic (HSM), and peripheral (ELM) cognitive processes are able to make any kind of propaganda and indoctrination effective, especially when acting on individuals who are searching for meaning or sure reference points.
In scientific circles, therefore, the issue of persuasion is not controversial: marketing works with these tools, as does political propaganda. However, the seminal work of Eileen Barker (1984) launched a strand of criticism of the concept of “mind manipulation” applied to groups, in a field other than experimental psychology, namely sociology. Since the 1970s, when the proliferation of alternative cults emerging from the counterculture showed its dark side, models of brainwashing and thought reform had also been applied to the process of joining “religious cults.” The concept of brainwashing promoted by Hunter during the Korean War had indeed influenced mass culture and even inspired a 1962 film, The Manchurian Candidate, starring Frank Sinatra. In the film, a “brainwashed” Korean War hero is programmed, “reconstructed” in the film’s language, to assassinate a U.S. presidential candidate: in practice, a sleeper agent turned by certain stimuli into an automaton controlled by others. This is an unrealistic notion of mental manipulation, a product of the communist panic of the time.
Between the late 1960s and the early 1970s, the concern of many relatives of adherents of groups that alienated them from their families led to the idea that membership in “cults” was likewise not without manipulative elements. In the meantime, new models of “mind control” had emerged that differed markedly from “brainwashing” and the Manchurian Candidate prototype (Singer, Lalich, 1996). Many scholars had studied mind control in the preceding years, especially as exercised by totalitarian political regimes (e.g., Joost A. M. Meerloo, Edgar Schein, Richard J. Ofshe, and Robert J. Lifton), but the most influential model was Robert Lifton’s: although it too grew out of the study of prisoners and civilians subjected to Chinese “re-education,” it described a more subtle technique that could operate in other contexts, such as cults. Lifton used the term thought reform precisely to avoid the magical connotations of “brainwashing.” In the 1980s, the entry of some sociologists into the study of what they called New Religious Movements, a term free of the negative connotation of the word “cult” then common among the psychologists and psychiatrists concerned with mind control, marked the beginning of a controversy.
- Birth of the controversy
Eileen Barker (1984) has been called “the mother of cult apologists” (Hausherr, 2002), because a classic study of hers catalyzed the coalescence of a loose network of sociologists critical of the concept of “brainwashing.” The study was conducted on the Reverend Moon’s Unification Church, whose members are commonly known as Moonies, and was published under the title The Making of a Moonie. Barker found that, of more than 1,000 people stopped on the street who then attended their first church meeting (usually a luncheon), about a third went on to the next workshop, about 10 percent said they would join, and about 5 percent were still full members two years later. Barker finds this figure quite disappointing, and takes it to show that no extraordinary mechanism of persuasion such as “brainwashing” is at work; it would follow that adepts are not being manipulated.
The work was sharply attacked by other scholars, especially the psychologist Margaret T. Singer, who had already published several studies on manipulation (Singer, 1958). The criticism did not concern only the substance of the study. In her classic book Cults in Our Midst (Singer, Lalich, 1996), Singer wrote:
In 1989, the Religious News Service reported that Dr. Barker’s book was funded by the Unification Church. Barker “freely admits that the Unification Church paid all of her expenses to attend 18 conferences in Europe, New York, the Caribbean, Korea, and South America.” A member of Parliament said, “Any academic who allows himself to be manipulated into believing a cult is harming families around the world.”
This shows the poisoned climate in which what should be a scientific debate takes place. Of the complex diatribe that is still going on, we would just like to point out the account proposed by these authors: the discrediting of cults is the result of a “moral panic” fomented by “anti-cult associations,” which spread myths such as mind manipulation. On this view there is no such thing as manipulation, only persuasion, a natural human behavior that cannot be censured.
- Reflections on objections to manipulation in cults
It seems to us, however, that the real difficulty in drawing a line between legitimate belief and “mental manipulation” is that considerations enter the discourse that do not pass the test of value-neutrality and fall outside the rigor that should characterize scientific research. The narrative according to which the groups labeled “cults” are simply those rejected by prejudiced scholars takes the form of a conspiracy theory. This is possible, of course, but it would have to be supported by factual data, which at present is not the case. Following the work of disambiguation carried out in the preliminary notes, it must then be stressed that when we speak of manipulation of consciousness in the context of cults, we are not talking about persuasion per se, but about persuasion that, because it aims at exploitation, can be considered undue. It is the adjective “undue” that should be emphasized, not the noun “persuasion.” Undue means “unethical.” As Langone (1988) writes, mind control “refers to a process in which a group or individual systematically uses unethical manipulative methods to get others to conform to the manipulator’s wishes.” The biggest mistake in the discussion of this topic is to treat persuasion as a construct consisting of only one dimension; it is necessary to introduce another dimension that has not been taken into account, that of the persuader’s interest (Corvaglia, 2022). It is persuasion with exploitative intent that is at the center of the interest of cultic studies. Once this conceptual fuzziness is removed from the “operating table,” the nominalist quibble falls away: an abusive cult is considered such because it is abusive, not because it deviates from a canon established by critics. Abuse and exploitation are objective data, whereas deviation is always relative to norms that may be neither objective nor stable.
Having said all this, one may object to Barker’s study on questions of method and data interpretation, but her assessment of the Moonies is certainly valid in rejecting the Manchurian Candidate hypothesis, that is, the idea of a specific and infallible technique for “reconstructing” individuals. It is possible, however, to consider another concept of manipulation, one that views mind control as a process of persuasion carried out precisely by selecting the most willing subjects in successive and orderly steps, acting in practice mainly on the final recruits, with indoctrination taking place in a psychological and relational context that has by then profoundly changed (Corvaglia, 2020, 2022). We now present this psychosocial model.
A model of the persuasion process in cults
a) The relationship between salience and procrastination
The general tendency of people to postpone actions they find unpleasant is due to the fact that “here and now” costs, being immediately and unambiguously unpleasant, are more concrete and vivid than vague future costs. The different value that an item assumes in relation to a context is called salience, and this salience changes with the temporal perspective from which it is viewed. This shifting of meaning often causes us to act in ways that are not strictly “rational”: postponing an unpleasant action, for example, usually ends up carrying a very high cost. A good example is someone who keeps putting off the start of an exercise program meant to prevent heart problems.
It is again subjective importance that causes us to adjust our preferences so that we desire more intensely those goods that are less available to us at a given moment. Jon Elster (1984) used as an example of what he calls “endogenous preference change” the Hans Christian Andersen story in which a farmer goes to market to sell or trade his horse and returns home with a basket of rotten apples. On the way, the farmer traded the horse for a cow because he liked it better, then the cow for a sheep for the same reason, then the sheep for a goose, the goose for a chicken, and finally the chicken for a basket of rotten apples. In other words, people behave in inconsistent ways that they themselves cannot foresee. Here, for example, the principle of transitivity fails: if one prefers A to B and B to C, one should also prefer A to C, but salience changes this (so that the farmer comes home with a basket of rotten apples).
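In formal terms (a minimal sketch of the argument just made, writing ≻ for strict preference):

\[ \text{transitivity: } A \succ B \;\wedge\; B \succ C \;\Rightarrow\; A \succ C \]

In Andersen’s tale the farmer’s pairwise choices run cow ≻ horse, sheep ≻ cow, goose ≻ sheep, chicken ≻ goose, apples ≻ chicken; transitivity would then require apples ≻ horse, a trade the farmer would never have accepted had it been offered to him directly at the start. It is the context of each single exchange that makes every local step seem reasonable while the overall chain is not.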
The area where procrastination can produce the most damaging results is obedience to inappropriate or abusive authority, and nothing explains this better than Milgram’s experiment. In his classic study, the social psychologist Stanley Milgram (1975) recruited adult males to take part in what he presented as an experiment on the effects of punishment on memory. In reality, the study concerned obedience to authority. An accomplice of the experimenter played the role of a “student”: whenever the student made a mistake in recalling material he was supposed to learn, the subjects were to administer electric shocks through a special device apparently connected to him by wires. The student, who could be heard (and, in some variants, seen) by the subjects, was not in fact connected to the device and had the task of simulating appropriate reactions to the shocks. Subjects were instructed to begin with a low-voltage shock (15 volts) and to increase the voltage in 15-volt steps up to a maximum of 450 volts. There are several versions of this experiment, but in all of them the “student” showed a marked response to the “shocks.” In one version, according to Milgram’s description:
At 75 volts, the student began to grunt and groan. At 150 volts, he asked to be released from the experiment. At 180 volts, he screamed that he could no longer stand the pain. At 300 volts he begged to be dismissed (1965, p. 246, quoted in E. Stotland and L. K. Canon, 1972, p. 6).
The result shows that subjects prefer to obey rather than break the pact with the experimenter: because of its salience, disobedience is in the moment more “costly” than continuing to inflict suffering on the tortured subject. This supports our central argument: under certain circumstances, perceived salience causes people to behave in ways that they themselves could not have predicted.
Lee Ross (1988) has argued that this special meaning, this specific salience attached to disobedience, derives from the implicit contract between the “teacher” and the experimenter, a contract that provides no recognized grounds for interrupting the shocks. The subject, perceiving the cost of disobedience as very high now, may therefore plan to disobey later, if the cost continues to rise; in the meantime, he hesitates. Perhaps more importantly, the salience itself probably depends not so much on the absolute voltage the subject is administering as on the difference between the current voltage and the voltage already administered. Indeed, no one would agree to administer 450 volts outright, but many have done so by increasing the shocks gradually. Subjects are willing to disobey if the demands become too high, but not now, not yet; and even while assuming they will disobey in the future, they keep raising the threshold of shock that would trigger their disobedience.
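A minimal arithmetic sketch of this gradient (our own illustration of the point, not a calculation found in Milgram or Ross): at each step the subject evaluates the increment, never the total.

\[ v_{n+1}-v_n = 15\ \text{V}, \qquad \left.\frac{v_{n+1}-v_n}{v_n}\right|_{v_n = 300\ \text{V}} = \frac{15}{300} = 5\% \]

Each single decision thus appears almost negligible, while the decision that is never explicitly posed, whether to administer the full 450 volts reached after thirty such steps, is one that virtually no subject would accept outright.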
The Milgram experiment shows that people isolated in a laboratory can exhibit unexpected and disruptive behaviors out of pure obedience, provided the escalation of commands is slow. Outside the laboratory, however, under non-isolated conditions, there is evidence that such behavior occurs only when there is near unanimity around the subject. The most important evidence for this comes from a variant of Asch’s (1951) famous experiment. Solomon Asch showed that subjects asked to judge which of several lines matched the length of a reference line gave a wrong answer about 40% of the time when they were preceded by accomplices of the experimenter who had deliberately given wrong answers: the subjects followed the others’ answers even when these were clearly wrong.
In one variation of the experiment, however, Asch found that the presence of even one accomplice who gave the correct answer among all the others who gave the wrong answer reduced the number of errors in the experimental subjects by two thirds (Asch, 1952) .
This suggests that the presence of people who confirm our perceptions reduces our tendency to conform; by extension, the presence of other disobedient people under conditions like those of Milgram’s experiment significantly increases the probability of disobedience. One might infer that obedience of the kind Milgram observed can occur only in the laboratory, where people are shielded from outside information and influence. There are, however, other situations that reproduce those conditions, and cults are foremost among them.
b) Self-selection and group framing effect
Following Barker’s (1984) account, joining the Moonies involves a sequence of individual decisions. Potential recruits are first contacted individually and invited to attend a two-day weekend workshop. This is followed by a 7-day workshop, then a 21-day workshop, and finally actual membership. The potential recruit must therefore make four distinct decisions: to attend the 2-day workshop, to continue with the 7-day workshop, to attend the 21-day workshop, and finally to join the church. As in Milgram’s experiment, the decision is made in slow steps, each a slight deviation from the previous choice.
Marc Galanter’s (1979) classic study of the same group, the Moonies, illustrates well this gradual restructuring of thinking toward commitment. In his study, of the 104 guests at the initial two-day workshop, 74 did not continue. Of the 30 participants in the 7-day workshop, 12 did not continue. Of the 18 remaining participants in the 21-day workshop, 9 did not continue.
Of the remaining 9, 6 were active church members 6 months later. This sequence describes a conversion process. Converts make a series of small decisions regarding acceptance of authority. Some delay disobedience for a short time, others for longer. As a result of this sequence of decisions to obey rather than rebel, converts eventually develop beliefs and values that are very different from those they held at the beginning of the process.
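Putting Galanter’s figures together (our own arithmetic on the numbers just quoted) gives the cumulative retention rate of the whole sequence:

\[ \frac{30}{104}\times\frac{18}{30}\times\frac{9}{18}\times\frac{6}{9} \;=\; \frac{6}{104}\;\approx\;5.8\% \]

This is very close to the roughly 5 percent that Barker reports, and it is produced not by a single extraordinary act of “reconstruction” but by a chain of small, self-selected steps.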
We believe that this acquiescence to authority is facilitated by self-selection. As the majority of those who are perplexed leave the organization, there is no opportunity to develop the dissent necessary to resist the escalation of commitment. This creates the state of isolation from outside influences that exists in Milgram’s lab, and at the same time a conformity to the ideas of others, as in Asch’s experiment. This is also an example of a shift in which the change in preferences depends on previous preferences, as Elster has pointed out.
The fact that the new context gives a different “meaning” to what happens inside the cult is clearly reminiscent of the framing effect highlighted by Daniel Kahneman and Amos Tversky (1979). As Elster (1984) points out, this changes the rationality of decisions, since the principle of transitivity is lost (the example of the farmer in Andersen’s tale). One of the shortcuts known as “heuristics” that contribute to the framing effect is the affect heuristic, the phenomenon by which we rely heavily on our current emotional state during decision-making rather than taking the time to consider the long-term consequences of a decision. This is related to the use of the rapid, heuristic, peripheral channels described above. Marketing and political propaganda make extensive use of these distortions.
That charismatic cult leaders can lead their followers into a framework in which their irrational choices make perfect sense has also been expressed by Janja Lalich (2000) in her model of bounded choice. Under the conditions of a self-sealing system, members of a destructive cult are ultimately led into a state of “bounded choice” in which they make seemingly irrational choices within a framework that nonetheless makes sense to them (and is, in fact, consistent with their highest goals).
Therefore, the path outlined here leads us to believe that one does not join an abusive cult because one is persuaded and thereby accepted into the group; on the contrary, one is persuaded because one has been accepted into the group. Although persuasion and seduction can occur simultaneously, seduction carries more weight at the beginning, for example through the well-known practice of “love bombing,” while ideological indoctrination gains weight over time. The more gradual the process of persuasion, the more willing people will be to adopt the new beliefs and act accordingly. At each step, the difference from what has already been accepted or done is minimal (and the impulse to defect is postponed). Although each of these steps is “freely” chosen, or at least the decision to delay disobedience is free, it is taken within a framework that can change the meaning of things. The same people who go through the whole process by continuous and gradual approximation would never have freely chosen the final outcome, the dogmas and the associated behaviors, had it been presented to them all at once at the beginning. This is undue persuasion: the manipulation of the gradient of “free choice” for the purpose of exploiting self-selected subjects.
——————————————————————————————————————–
BIBLIOGRAPHY
Asch, S. (1951). Effects of group pressure upon the modification and distortion of judgments. In H. Guetzkow (Ed.), Groups, Leadership and Men: Research in Human Relations (pp. 177-190). Pittsburgh, Carnegie Press.
Asch, S. (1952). Social Psychology. Englewood Cliffs, Prentice-Hall.
Barker, E. (1984). The Making of a Moonie: Choice or Brainwashing? Oxford, Blackwell Publications
Bohner, G., Moskowitz, G. B., & Chaiken, S. (1995). The interplay of heuristic and systematic processing of social information. European Review of Social Psychology. 6, 33-68. doi: 10.1080/14792779443000003.
Bromley, D., Shupe, A. (1981). Strange Gods. The great American cult scare. Boston, Beacon Press
Budzynska, K., & Weger, H. (2011). Structure of persuasive communication and elaboration likelihood model. OSSA Conference Archive, 67.
Cacioppo, J. T., & Petty, R. E. (1984). The Elaboration Likelihood Model of Persuasion. In Advances in Consumer Research, Vol. 11. Association for Consumer Research.
Cialdini, R (2017). Influence: The Psychology of Persuasion. Pymble, Australia, HarperCollins e-books.
Cialdini, R. (1994), Le armi della persuasione, Florence, Giunti, (Italian edition)
Corvaglia, L. (2020). A model of persuasion in totalitarian cults. Psychofenia, year XXIII, 41-42.
Corvaglia, L. (2021). Submission as Preference Shift. International Journal of Coercion, Abuse and Manipulation, Vol. II, Issue 1.
Corvaglia, L. (2022). Indoctrination, radicalization and control in coercive cults. Italian Intelligence Society, https://doi.org/10.36182/2022.01
Elster J. (1984). Ulysses and the Sirens: Studies in Rationality and Irrationality, Cambridge, Cambridge University Press.
Galanter, M., et al. (1979). The ‘Moonies’: A Psychological Study of Conversion and Membership in a Contemporary Religious Sect. American Journal of Psychiatry, 136(2), 165-170.
Hausherr, T. (2002). Cult apologists FAQ. Private document still available on the net: http://home.snafu.de/tilman/faq-you/cult.apologists.txt
Hunter, E. (1951). Brain-washing in Red China: The Calculated Destruction of Men’s Minds. New York, The Vanguard Press.
Introvigne, M. (2002), Lavaggio del cervello. Mito o realtà? Turin, Elledici, Leumann
Kahneman, D. (2011). Thinking, Fast and Slow. New York, Farrar, Straus and Giroux (Italian edition: Pensieri lenti e veloci, Milan, Mondadori, 2012).
Kahneman, D., Tversky, A. (1979). Prospect Theory: An Analysis of Decision under Risk. Econometrica, 47(2), 263-291.
Lalich, J. (2000), Bounded Choice: The Fusion of Personal Freedom and Self-Renunciation in Two Transcendent Groups. Ph.D. dissertation, Santa Barbara, California, Fielding Institute.
Langone, M. (1988), Questions and answers, American Family Foundation: https://www.bibliotecapleyades.net/sociopolitica/esp_sociopol_mindcon27.htm
Lifton, R.J. (1961), Thought Reform and the Psychology of Totalism: A Study of “Brainwashing” in China, New York, Norton.
Meerloo, J. A. M. (1956). The Rape of the Mind: The Psychology of Thought Control, Menticide and Brainwashing. Cleveland, The World Publishing Company.
Milgram, S. (1975). Obedience to Authority: An Experimental View, New York, Harper and Row.
Ofshe, R. J. (1990). Coercive Persuasion and Attitude Change. In Encyclopedia of Sociology, Volume 1. New York, Macmillan Publishing Company.
Ross, L. (1988). Review of Arthur G. Miller, The Obedience Experiments: A Case Study of Controversy in Social Science (New York, Praeger, 1986). Contemporary Psychology, 33, 101-104.
Schein, E. H. (1956). The Chinese Indoctrination Program for Prisoners of War. Psychiatry, 19.
Sharot, T. (2018), The science of persuasion. Our power to change others, Milan, Feltrinelli (Italian edition).
Singer, M.T., Lalich, J. (1996), Cults in Our Midst. The Hidden Menace in our Everyday Lives, Hoboken, John Wiley & Sons.
Singer, M. T., Schein, E. H. (1958). Projective test responses of prisoners of war following repatriation. Psychiatry, 21, 375-385.
Singer, M. T. (1979). Coming out of the cults. Psychology Today, 12(8), 72-82.
Tizzani, E., Giannini, A. M. (2011). La manipolazione mentale nei gruppi distruttivi. Giornale di Criminologia, Vittimologia e Sicurezza, Vol. V , 2
Wenegrat, B. (1989). Religious cult membership: A sociobiologic model. In Galanter, M. (Ed.), Cults and New Religious Movements: A Report of the American Psychiatric Association (pp. 193-208). Washington, D.C., American Psychiatric Association.
Zablocki, B. (1997). The blacklisting of a concept: The strange history of the brainwashing conjecture in the sociology of religion. Nova Religio, 1(1).
_________________________________________________________________
The Mediterranean Journal of Surgery, Medicine and Forensic Sciences
Publication No. 01 of 20/06/2024
Received: 18/06/2024
Accepted: 20/06/2024
Published online: 20 June 2024