Brainwashing is a term that was adopted by the press to describe the indoctrination of U.S. prisoners of war (POWs) during the Korean War. Social scientists now recognize brainwashing as a form of severe indoctrination marked by physical and psychological stress, intense social pressure, and a variety of persuasion techniques. Such indoctrination usually promotes a particular political or religious doctrine and often entails costly sacrifices by adherents.
History of Brainwashing
Modern social scientists became concerned with brainwashing when American POWs during the Korean War were subjected to systematic persuasive techniques by their captors. Following this indoctrination, some of these POWs did, in fact, cooperate with the enemy, at least superficially. Such prisoners praised their captors or made hard-to-believe confessions about participating in various war atrocities. The brainwashing procedures directed against American POWs in Korea were modeled on indoctrination procedures used by Chinese revolutionary forces when “educating” their own political cadres. At the end of hostilities in Korea, however, only a handful of these POWs actually elected to refuse repatriation to the United States. Given that several thousand American soldiers were exposed to these techniques, this low rate of refusal indicates that the long-term persuasive results of these early procedures were meager. Beginning in the 1970s, however, shocking events, including group suicides among members of groups such as the Heaven’s Gate cult and the Peoples Temple (where over 900 people perished), established that group indoctrination could induce extremely costly behavior from group members. In light of these events, social scientists took renewed interest in extreme forms of systematic indoctrination.
Brainwashing Procedures and Analysis
According to most experts, the intense indoctrination associated with the term brainwashing unfolds in a series of stages. The earliest stage entails strong forms of psychological and physical stress. The indoctrinee, or recruit, is almost always sequestered in a retreat or training center away from normal friends, coworkers, and family, surrounded instead by members of the indoctrinating group and other indoctrinees. Prolonged sleep deprivation is extremely common at this stage, as are changes in diet and pattern of dress. Public self-criticism is generally encouraged, often under the guise of self-analysis. The recruit’s time is carefully regimented and filled with a multitude of activities, most often related to, and advocating, an unfamiliar, complex doctrine. This advocacy can take the form of lectures, readings, and other group activities. This initial stage can be as short as a few days but can also extend for weeks. It is designed to evoke such emotions as fear, guilt, exhaustion, and confusion on the part of the recruit.
This introductory stage segues subtly into the second stage of indoctrination, in which the recruit is encouraged to “try out” various group activities. These activities may include self-analysis, lectures, prayer, and group-related chores. This tentative collaboration may be spurred by social pressure, politeness, legitimate curiosity, or a desire to curry favor with authority figures. Eventually, however, this collaboration leads the recruit to consider seriously the wisdom of the doctrine in question, ushering in the third stage of indoctrination, in which actual belief change begins. In this third stage, the recruit is typically surrounded by believers and kept isolated from anyone who might disagree with the doctrine, producing particularly potent peer pressure. In addition, the information and reading material provided to recruits are carefully screened to justify the group’s teachings. The recruit also generally remains physically and mentally exhausted and is given little time for unbiased analysis of the doctrine. This makes it difficult for the recruit to generate private cognitive objections to the group doctrine. As a result, sincere belief change commonly begins at this point in the process.
In the final stage of indoctrination, initial belief change regarding the group and its doctrine is consolidated and intensified to the point that the new recruit comes to accept group teachings and decisions uncritically, viewing any contrary information as either enemy propaganda or necessary “means/ends tradeoffs.” By this point, the recruit has been cajoled into taking a series of public and/or irrevocable actions in service to the group, acts that entail increased effort, cost, and sacrifice over time. For example, when Patricia Hearst was being indoctrinated by the Symbionese Liberation Army, she was initially asked simply to train with the group. Then she was asked to tape-record a prewritten radio speech. Next she was asked to both write and record such a talk. Soon after that, she was required to accompany the group on a bank robbery, carrying an unloaded weapon. Thus, the level of sacrifice required of her escalated over her time with the group. In this final stage, as before, recruits remain surrounded by those who endorse the doctrine. These co-believers corroborate the recruit’s expressions of that doctrine and admire, reward, and endorse the recruit’s acts of loyalty and sacrifice. Interestingly, according to recent news reports, these procedures correspond quite closely to those followed in the training of suicide bombers once they express an initial willingness to make such a sacrifice. Such individuals are kept secluded in safe houses, cut off from family, and often make videos to be used in later propaganda efforts.
Experts note that the stages described in the previous paragraphs coordinate a variety of potent persuasive techniques. Peer pressure is known to be particularly effective when an individual faces a united consensus, especially if the individual is confused, frightened, or facing an ambiguous issue. People’s ability to resist a flawed persuasive message is particularly impaired when fear, sleep deprivation, or overactivity deprives them of the opportunity to think clearly about the message’s inadequacies. Moreover, when like-minded individuals (such as those found in extremist groups) discuss a topic they basically agree upon, the result is a polarization of opinion, with group members taking a more extreme view after discussion. Similarly, extreme attitudes also result when people find that others share and admire their opinions. In addition, when individuals agree to costly (and public) sacrifices, they have a strong tendency to justify such actions by intensifying any attitudes that support those acts, a process referred to as the reduction of cognitive dissonance. Finally, the grandiose goals of many extremist groups appeal to the human need to feel important, significant, and part of some timeless, meaningful social movement, be it religious, political, scientific, or historic. In this emotional context, the techniques of intense indoctrination associated with the term brainwashing combine to create a persuasive milieu that, at least for some targets, has the power to evoke surprising changes in both belief and behavior.