Why Did 65% of People Give the Maximum Electric Shock?
In 1963, Stanley Milgram published one of the most disturbing studies in psychology: ordinary people, instructed by an authority figure, administered what they believed were dangerous electric shocks to a stranger. 65% continued to the maximum voltage. The study has been misread ever since.
Milgram, then a psychologist at Yale University, initially designed the experiment to understand how ordinary Germans could have participated in the Holocaust.
The design was straightforward. A volunteer subject was told he was participating in a study on learning and memory. He would be the “teacher” administering word tests to a “learner” (actually a confederate) in an adjacent room. Each time the learner got an answer wrong, the teacher would administer an electric shock, increasing by 15 volts with each error.
The shock generator's switches carried labels ranging from "Slight Shock" up through "Danger: Severe Shock," with the final switches marked only "XXX." The learner screamed in pain, banged on the wall, and eventually went silent. At various points, he mentioned a heart condition.
None of the shocks were real. The learner was an actor.
65% of subjects administered the maximum shock of 450 volts, under instruction from an experimenter who, whenever they wanted to stop, said only "Please continue" and "The experiment requires that you continue."
What the Study Actually Showed
The Milgram study is commonly cited as evidence that “people will blindly follow authority.” This is a significant oversimplification.
More recent reanalysis by Alexander Haslam and Stephen Reicher has reframed the findings: subjects did not obey blindly or without thought. Many visibly struggled, protested, asked questions, and showed significant distress. They weren’t blank automata — they were people who had signed up to help science, who had identified with the experimenter’s stated goal (advancing knowledge of learning), and who were reluctant to abandon that purpose.
The key variable, Haslam and Reicher argue, is not obedience to authority per se but identification with the leader’s cause. When subjects understood the experimenter as pursuing a legitimate and important goal, they were willing to override their discomfort in service of that goal. When the cause lost legitimacy — when the experimenter’s behavior seemed arbitrary or cruel rather than scientific — compliance dropped.
This matters because it changes the lesson. The finding is not “people do whatever authority tells them to.” It’s closer to: “people will tolerate significant personal discomfort and cause significant harm to others when they believe they are serving a legitimate collective goal.”
The Variations
Milgram ran over 20 variations of his experiment. These are often overlooked but are crucial for interpretation.
Compliance dropped significantly when:
- The experimenter gave instructions by phone rather than in the room (20.5%)
- Two experimenters gave contradictory instructions (0% complied fully)
- The subject was told the experimenter designed the experiment for personal reasons
- A peer (fellow subject, also a confederate) refused to continue (10%)
Compliance stayed high when:
- A second experimenter (rather than the original) insisted on continuing — subjects followed the new experimenter almost as readily
- The experiment was run not in a university setting but in a rundown office building (47.5% — still shockingly high)
The most striking variation: when the subject was paired with a peer who stopped, only 10% of subjects continued to the maximum. Social modeling of refusal was the single most effective intervention.
The Gina Perry Reexamination
Journalist Gina Perry, in her 2012 book Behind the Shock Machine, investigated the original Milgram files and interviewed participants decades later.
Her findings complicated the picture:
- Some subjects appeared to have suspected the shocks weren’t real
- The post-experiment debriefing was not as thorough as Milgram claimed
- Several participants reported lasting distress from the experience
- A significant number of participants did not comply when the experimenter stepped out of the room or when there were other signals that the authority was not fully in control
The study is not fraudulent — the results are real and have been replicated in various forms. But the cleaner narrative (two-thirds of ordinary people will torture strangers if told to) doesn’t fully capture the texture of what actually happened.
The Bystander Mechanism
One thing the variations reveal: the presence of resisters matters enormously.
When a confederate peer refused to continue, compliance dropped from 65% to 10%. This is a dramatic effect from a single social model. The implication: in real situations, visible refusal by one person dramatically lowers the threshold for others to refuse.
This may be the most practically significant lesson from Milgram: the mechanism of compliance is social, not individual. If you want people to resist harmful directives, visible examples of resistance are the most powerful intervention — not appeals to individual conscience.
Why It Still Matters
The Milgram studies were not primarily about electricity or laboratory conditions. They were about the conditions under which ordinary people cause harm to others without believing themselves to be bad people.
The answers they produced are uncomfortable: legitimate authority, group identification with a goal, the absence of visible resisters, and incremental escalation (starting with small shocks and moving up) all increase compliance with harmful acts.
These conditions appear frequently outside the laboratory: in institutions, in organizations, in political movements. The experiment’s relevance hasn’t faded with its methodological controversies.
What has held up: the horror was produced not by sadists or sociopaths, but by people who thought they were doing something legitimate. That was Milgram’s original insight, and it remains true.
The experimenter said: “The experiment requires that you continue.”
65% continued.
Not because they wanted to hurt anyone.
Because they thought they were helping something that mattered.