Stanley Milgram’s Experiment




Stanley Milgram was one of the most influential social psychologists of the twentieth century. Born in 1933 in New York, he obtained a BA from Queens College and went on to receive a PhD in psychology from Harvard. Subsequently, Milgram held faculty positions in psychology at Yale University and the City University of New York until his untimely death in 1984. Although Milgram never held a formal appointment in sociology, his work was centrally focused on the social psychological aspects of social structure.

Stanley Milgram is best known for his experiments on obedience to authority. Like many social scientists of his time, and as a Jew himself, Milgram was deeply influenced by the experience of the Holocaust. Building on earlier work by his mentor Solomon Asch (1907–96), Milgram suspected that notions of an aggressive personality or authoritarian cultural traits were not sufficient to explain the mass murder of the Holocaust. Rather, he suspected that the hierarchical structure of bureaucratic organizations and the willingness of people to submit to legitimate authority provided a more plausible explanation of why so many educated and civilized people contributed to barbaric torture and mass killings.

In a historic coincidence, in 1961, just as Milgram was about to begin work on his famous obedience experiments, the world witnessed the trial of Adolf Eichmann, a high-ranking Nazi official who had been in charge of organizing the transport of millions of Jews to the death camps. To many, Eichmann appeared not at all to be the fervent anti-Semite that many had suspected him to be; rather, his main defense was that he was merely ‘‘following orders’’ as an administrator. To the political theorist Hannah Arendt, Eichmann’s case illustrated the ‘‘banality of evil,’’ in which personal malice appeared to matter less than the desire of individuals to fulfill their roles in the larger context of a bureaucracy. Milgram’s research is arguably the most striking illustration of this dynamic.

Milgram planned and conducted his obedience experiments between 1960 and 1963 at Yale University. In order to study obedience to authority, he put unsuspecting research participants in a novel situation that he staged in the laboratory. With the help of actors and props, Milgram set up an experimental ruse so realistic that hardly any of his research participants suspected that, in reality, nothing was what it appeared to be.

For this initial study, using newspaper ads promising $4.50 for participation in a psychological study, Milgram recruited men aged 20 to 50, ranging from elementary school dropouts to PhDs. Each research participant arrived at the lab along with another man, white and roughly 30 years of age, whom he believed to be a fellow research participant. In reality, this person was a confederate, that is, an actor working with the experimenter. The experimenter explained that both men were about to take part in a study of the effect of punishment on memory. One man would assume the role of a ‘‘teacher’’ who would read a series of word pairs (e.g., nice day, blue box), which the other (the ‘‘learner’’) was supposed to memorize. Subsequently, the teacher would read the first word of each pair, and the learner would have to select the correct second word from a list. Every mistake by the learner would be punished with an electric shock. It was further made clear that, although the shocks would be painful, they would not do any permanent harm.

Following this explanation, the experimenter assigned the two men to their roles. Because the procedure was rigged, the unsuspecting research participant was always assigned the role of teacher. As a first order of business, the learner was seated in an armchair in an adjoining room, separated from the teacher by a wall but still within earshot of the main room. Electrodes were affixed to the learner’s arms, and he was then strapped to the chair, apparently to make sure that improper movements would not endanger the success of the experiment.

In the main room, the teacher was told that he would have to apply an electric shock every time the learner made a mistake. For this purpose, the teacher was seated in front of a shock generator. The experimenter instructed the teacher to steadily increase the voltage of the shock each time the learner made a new mistake. The generator showed a row of levers ranging from 15 volts on the left to 450 volts on the right, with each lever in between delivering a shock 15 volts higher than its neighbor on the left. The voltage levels were labeled, from left to right, from ‘‘Slight Shock’’ to ‘‘Danger: Severe Shock,’’ with the last two switches marked ‘‘XXX.’’ The teacher was told that he should simply work his way from left to right without using any lever twice. To give the teacher an idea of the electric current he would deliver to the learner, he received a sample shock of 45 volts, which most research participants found surprisingly painful. However, despite its appearance, the generator never actually emitted any electric shocks. It was merely a device that allowed Milgram to examine how far the teacher would go in harming another person on the experimenter’s say-so.

Once the learning trials started, the teacher began applying electric shocks to the learner. The learner’s responses were scripted such that he made many mistakes, requiring the teacher to raise the shock level by 15 volts with every new mistake. As the strength of the electric shocks increased, occasional grunts and moans of pain were heard from the learner. At 120 volts the learner started complaining about the pain. At 150 volts he demanded to be released on account of a heart condition, and the protests continued until the shocks reached 300 volts, at which point the learner started pounding on the wall. At 315 volts the learner stopped responding altogether.

As the learner’s complaints began, the teacher would often turn to the experimenter, who was seated at a nearby desk, to ask whether and how to proceed. Instead of terminating the experiment, the experimenter replied with a scripted succession of prods:

  • Prod 1: ‘‘Please continue.’’
  • Prod 2: ‘‘The experiment requires that you continue.’’
  • Prod 3: ‘‘It is absolutely necessary to continue.’’
  • Prod 4: ‘‘You have no other choice: you must go on.’’

These prods were successful in coaxing many teachers into continuing to apply electric shocks even when the learner no longer responded to the memory questions. Indeed, in the first of Milgram’s experiments, a stunning 65 percent of all participants continued all the way to 450 volts, and not a single participant refused to continue the shocks before they reached the 300-volt level. The high levels of compliance illustrate the powerful effect of the social structure that participants had entered. By accepting the role of teacher in the experiment in exchange for a nominal fee, participants had agreed to accept the authority of the experimenter and carry out his instructions. In other words, just as Milgram suspected, the social forces of hierarchy and obedience could push normal and well-adjusted individuals into harming others.

The overall level of obedience, however, does not convey the tremendous stress that the teachers experienced. Because the situation was extremely realistic, teachers agonized over whether or not to continue the electric shocks. Should they attend to the well-being of the obviously imperiled learner, whose life might even be in danger? Or should they defer to a legitimate authority figure, who presented his instructions crisply and confidently? Participants typically sought to resolve this conflict by seeking assurances that the experimenter, and not they themselves, would accept full responsibility for their actions. Once they felt assured, they typically continued to apply shocks that, had they been real, might well have electrocuted the learner.

Milgram expanded his initial research into a series of 19 experiments in which he carefully examined the conditions under which obedience would occur. For instance, the teacher’s proximity to the learner was an important factor in lowering obedience, that is, the proportion of people willing to deliver the full 450 volts. When the teacher was in the same room with the learner, obedience dropped to 40 percent, and when the teacher was required to touch the learner and apply physical force to deliver the electric shock, obedience dropped to 30 percent.

Milgram further suspected that the social status of the experimenter, presumably a serious Yale University researcher in a white lab coat, would have important implications for obedience. Indeed, when the obvious connection with Yale was removed and the experiment was repeated in a run-down office building in Bridgeport, Connecticut, obedience dropped to 48 percent. Moreover, when it was not the white-coated experimenter but another confederate who encouraged the teacher to continue the shocks, all participants terminated the experiment as soon as the learner complained. Milgram concluded that ‘‘a substantial proportion of people do what they are told to do, irrespective of the content of the act and without limitations of conscience, so long as they perceive that the command comes from a legitimate authority’’ (1965). However, additional studies highlighted that obedience is in part contingent on surveillance. When the experimenter transmitted his orders not in person but by telephone, obedience levels dropped to 20 percent, with many participants only pretending to apply higher and higher shocks.

Since its initial publication in 1963, Milgram’s research has drawn a great deal of criticism, mainly on ethical grounds. First, it was alleged that it was unethical to deceive participants to the extent that occurred in these studies. It is important to note that all participants were fully debriefed about the deception, and most did not seem to mind and were relieved to find out that they had not shocked the learner. The second ethical criticism is, however, much more serious. As alluded to earlier, Milgram exposed his participants to tremendous levels of stress. Milgram, anticipating this criticism, interviewed participants after the experiment and followed up several weeks later. The overwhelming majority of his participants commented that they enjoyed being in the experiment, and only a small minority experienced regret. Even though Milgram personally rejected allegations of having mistreated his participants, his own work suggests that he may have gone too far: ‘‘Subjects were observed to sweat, tremble, bite their lips, groan, and dig their fingernails into their flesh . . . A mature and initially poised businessman entered the laboratory smiling and confident. Within 20 minutes, he was reduced to a twitching, stuttering wreck who was rapidly approaching a point of nervous collapse’’ (1963: 375). Today, Milgram’s obedience studies are generally considered unethical and would not pass muster under contemporary regulations protecting the well-being of research participants. Ironically, partly because Milgram’s studies illustrated the power of hierarchical social relationships, contemporary researchers are at great pains to avoid coercion and to allow participants to terminate their participation in any research study at any time without penalty.

Another type of criticism of the obedience studies has questioned their generality and charged that their usefulness in explaining real-world events is limited. Indeed, Milgram conducted his research when trust in authorities was higher than it is today. However, Milgram’s studies have withstood this criticism. Reviews of research conducted using Milgram’s paradigm have generally found obedience levels of roughly 60 percent (see, e.g., Blass 2000). In one of his studies Milgram further documented that there was no apparent difference in the responses of women and men. More recent research using more ethically acceptable methods further testifies to the power of obedience in shaping human action (Blass 2000).

Milgram offers an important approach to explaining the Holocaust by emphasizing the bureaucratic nature of evil, which reduced individuals to executors of orders issued by a legitimate authority. Sociologists have extended this analysis and provided compelling accounts of obedience as a root cause of many horrific crimes, ranging from the My Lai massacre to Watergate (Hamilton & Kelman 1989). However, it is arguably somewhat unclear to what extent Milgram’s findings can help explain the occurrence of the Holocaust itself. Whereas obedience kept the machinery of death running with frightening efficiency, historians often caution against ignoring the malice and sadism that many of Hitler’s executioners brought to the task (see Blass 2004).

Milgram’s dramatic experiments have left a lasting impression beyond the social sciences. They are the topic of various films, including the 1975 TV movie The Tenth Level starring William Shatner, and they inspired the rock musician Peter Gabriel’s 1986 song ‘‘We Do What We’re Told (Milgram’s 37).’’

References:

  1. Blass, T. (Ed.) (2000) Obedience to Authority: Current Perspectives on the Milgram Paradigm. Erlbaum, Mahwah, NJ.
  2. Blass, T. (2004) The Man Who Shocked the World: The Life and Legacy of Stanley Milgram. Basic Books, New York.
  3. Hamilton, V. L. & Kelman, H. (1989) Crimes of Obedience: Toward a Social Psychology of Authority and Responsibility. Yale University Press, New Haven.
  5. Milgram, S. (1963) Behavioral Study of Obedience. Journal of Abnormal and Social Psychology 67: 371-8.
  5. Milgram, S. (1965) Some Conditions of Obedience and Disobedience to Authority. Human Relations 18: 57-76.
  6. Milgram, S. (1974) Obedience to Authority: An Experimental View. Harper & Row, New York.