Milgram Experiment: Why We Obey Authority
The Milgram Experiment, conducted by social psychologist Stanley Milgram in the early 1960s, remains one of the most controversial and influential studies in the history of psychology. This groundbreaking experiment aimed to investigate the extent to which individuals would comply with authority figures, even when their actions conflicted with their personal conscience.
Milgram’s basic premise was that “the person who, with inner conviction, loathes stealing, killing, and assault may find himself performing these acts with relative ease when commanded by authority. Behavior that is unthinkable in an individual who is acting on his own may be executed without hesitation when carried out under orders” (1).
Key Definition:
Stanley Milgram’s experiments were a series of controversial psychological studies conducted in the 1960s at Yale University. The most well-known of these experiments involved participants being instructed to administer what they believed to be increasingly painful electric shocks to another person, who was actually an actor and not receiving any shocks. The study aimed to investigate the willingness of participants to obey authority figures, even when their actions caused harm to others.
What was the Milgram Experiment? The Study of Obedience
In the early 1960s, a young psychologist at Yale University named Stanley Milgram set out to explore a disturbing question: how far would ordinary people go in inflicting pain on another human being simply because an authority figure told them to? (2; 3).
To find out, he devised a brilliant, highly staged laboratory setup. Milgram recruited everyday people—postal workers, high school teachers, salesmen, and laborers—through local newspaper ads for what was billed as a harmless study on the effects of punishment on memory and learning (4; 5; 6).
When a volunteer arrived at the lab, they were introduced to another “participant.” The two men drew slips of paper from a hat to decide who would play the role of the “teacher” and who would be the “learner.” But the draw was secretly rigged (7; 8). The other participant was actually a mild-mannered actor working for Milgram, guaranteeing that the naive volunteer always became the teacher (9; 10).
The learner was taken to an adjacent room, strapped into a chair, and hooked up to an electrode. The teacher was then seated in front of a massive, intimidating piece of equipment called a “shock generator” (11; 12). The machine featured a row of 30 switches that escalated in 15-volt increments, starting at 15 volts (labeled “Slight Shock”) and ending at a terrifying 450 volts (labeled “Danger: Severe Shock” and simply “XXX”) (13; 14; 15).
The experimenter, a stern-looking man wearing a gray technician’s coat, instructed the teacher to read a list of word pairs to the learner (16). Every time the learner made a mistake, the teacher was ordered to push a switch and deliver an electric shock, moving one level higher on the machine for every single wrong answer (17; 18).
The Illusion and the Conflict
Unbeknownst to the teacher, the shock machine was a clever prop, and the learner was not receiving any actual shocks (19; 20; 21). Instead, the actor’s responses were carefully standardized using a pre-recorded tape (22; 23).
As the voltage steadily increased, the learner’s reactions escalated. At 75 volts, he would grunt; at 120 volts, he shouted that it was painful; at 150 volts, he demanded to be released; and by 285 volts, he let out an agonized scream (24; 25).
In some versions, the learner even complained of a heart condition, and eventually, he would fall completely, ominously silent (26; 27; 28). As the cries of pain grew worse, the teachers naturally became highly distressed, often sweating, trembling, and begging the experimenter to let them stop (29; 30). But whenever the teacher hesitated, the impassive experimenter would issue a series of rigid, standardized prods to force them to continue (31; 32), a script sketched in full after this list:
- Prod 1: “Please continue.”
- Prod 2: “The experiment requires that you continue.”
- Prod 3: “It is absolutely essential that you continue.”
- Prod 4: “You have no other choice, you must go on.”
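For readers who want the script at a glance, here is a minimal sketch of the procedure in Python. It is purely illustrative and simplified (in the real sessions a subject could resume after any single prod); the function and variable names are our own, and only the voltage steps, the learner’s scripted reactions, and the four prods come from the published design.

```python
# A purely illustrative encoding of Milgram's scripted procedure.
# Only the voltages, reactions, and prods come from the published design;
# the structure and names are our own simplification.

VOLTAGE_STEPS = list(range(15, 451, 15))  # 30 switches, 15-volt increments

# The learner's pre-recorded reactions at key voltage thresholds.
SCRIPTED_REACTIONS = {
    75: "grunts",
    120: "shouts that the shocks are painful",
    150: "demands to be released",
    285: "lets out an agonized scream",
}

# The experimenter's standardized prods, issued in strict order
# whenever the teacher hesitates.
PRODS = [
    "Please continue.",
    "The experiment requires that you continue.",
    "It is absolutely essential that you continue.",
    "You have no other choice, you must go on.",
]

def run_trial(willing_up_to: int) -> int:
    """Simulate a teacher who obeys up to `willing_up_to` volts, then
    defies the experimenter despite all four prods. Returns the highest
    shock actually delivered."""
    delivered = 0
    for voltage in VOLTAGE_STEPS:
        if voltage in SCRIPTED_REACTIONS:
            print(f"{voltage} V: learner {SCRIPTED_REACTIONS[voltage]}")
        if voltage > willing_up_to:
            for prod in PRODS:   # the prod sequence, exhausted in order
                print(f"Experimenter: {prod}")
            return delivered     # the teacher still refuses; the trial ends
        delivered = voltage
    return delivered             # fully obedient: the maximum 450 volts
```

A fully obedient subject corresponds to `run_trial(willing_up_to=450)`; a subject who refuses after the learner demands release corresponds to `run_trial(willing_up_to=150)`.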
The Shocking Results
Before the study began, Milgram asked Yale seniors and a group of psychiatrists to predict how people would behave. They confidently guessed that almost everyone would stop early, and that only an insignificant fraction of people—about 1 to 3%—would go all the way to 450 volts (33; 34).
The actual results stunned the scientific community and the public alike: in the standard setup of the experiment, a staggering 65% of the subjects (26 out of 40) obeyed the experimenter to the bitter end, delivering the maximum 450-volt shock to a silent, unresponsive victim (35; 36; 37).
Milgram concluded from his research that the strong likelihood of compliance with authority is a “fatal flaw nature has designed into us, and which in the long run gives our species only a modest chance of survival” (38).
What were the results of the Milgram Experiment?
In Stanley Milgram’s 1961 study, 65% of participants (26 out of 40) administered the maximum 450-volt shock, despite the “learner” screaming in pain and eventually going silent. Every participant went up to at least 300 volts, demonstrating the profound power of situational authority over individual conscience.
Society, Rules, and Obedience
Milgram explains:
“Obedience is the psychological mechanism that links individual action to political purpose. It is the dispositional cement that binds men to systems of authority” (39).
Humans need each other, and human societies provide tremendous benefits to their members. From an evolutionary standpoint, forming hierarchically organized groups gave our ancestors an enormous advantage in surviving physical dangers and competing threats (40).
Rules, and obedience to those rules, are necessary ingredients for the smooth functioning of the group. By clearly defining the status of each member, a social hierarchy reduces internal friction and violence, ensuring harmony. Governments, families, and employers all create rule-based expectations, along with punishments for violations, because the very life of society is predicated on obedience (41; 42).
While obedience is often a virtue—enabling acts of charity, kindness, and coordinated productivity—it can also be heavily manipulated. In order to function within a complex hierarchy, individuals are required to suppress their own independent control and cede direction to a higher authority (43).
Unscrupulous groups and leaders use this deeply ingrained, blind obedience to serve unethical and diabolical purposes. Milgram warned that when a person merges their unique personality into a larger institutional structure, they risk becoming a “new creature” who is unhindered by individual morality or humane inhibition, mindful only of the authority’s sanctions (44). We may refer to following the dictates of these leaders as destructive obedience, which history shows has led to the systematic slaughter of millions of innocent people simply because they were ordered to do so (45).
Real-World Implications and Concepts of Milgram’s Experiments
Based on his research, Milgram stated that “if a system of death camps were set up in the US of the sort we had seen in Nazi Germany, one would be able to find sufficient personnel for those camps in any medium-sized American town” (46).
We proudly insist that this could never happen here. Yet, with a little provocation from an unscrupulous leader, a group of violent protesters marched on our nation’s Capitol ready to kill.
Robyn Dawes wrote:
“The implication of Milgram’s experiments and others that followed is that ‘authoritarianism’ is not such an unusual phenomenon. Perfectly normal people have a tendency to accept authority—even to the point of inflicting pain and possible harm on others, or at least believing they are. In fact, so strong is this tendency that we do not make compelling demands on authorities to prove that they are indeed authorities” (47).
“It is ironic that the virtues of loyalty, discipline, and self-sacrifice that we value so highly in the individual are the very properties that create destructive organizational engines of war and bind men to malevolent systems of authority.”
~Stanley Milgram (1974, p. 188)
The Agentic State: Why Do We Follow Harmful Orders?
To make sense of his shocking findings, Milgram proposed a psychological concept he called the “agentic state” (48). Normally, we operate in an “autonomous” mode, meaning we view ourselves as independent individuals whose actions are guided by our own internal moral compass (49).
However, when we enter a hierarchical situation and perceive a legitimate authority figure, a profound mental shift occurs (50). We transition into the agentic state, where we no longer see ourselves as the authors of our own actions, but rather as instruments meant to execute the wishes of someone else (51; 52).
In this state, our deeply ingrained cultural training to obey authority essentially short-circuits the personal conscience that would usually stop us from causing harm (53). The presence of an authority figure provides an escape from the pains of personal responsibility for actions that hurt others. In effect, we can act without guilt in ways that our personal ethical code would otherwise prohibit.
Once this psychological shift happens, the most significant change is an almost complete loss of personal responsibility (54). Instead of feeling guilty about hurting the “learner,” people in the agentic state redirect their moral concern toward how well they are fulfilling their assigned duties to the experimenter (55). Participants became absorbed in the narrow, technical details of the procedure—like reading the word pairs with clear articulation and pressing the correct switches—while tuning out the broader, more tragic consequences of their actions (56).
Furthermore, they allow the authority figure to completely define reality for them, trusting that the scientist’s ultimate goals must be justified (57). By shedding the burden of personal accountability, ordinary people can participate in destructive acts while resting on the psychological defense that they are simply doing their job (58).
The “Sadist” Myth vs. The “Slippery Slope”
A common reaction to hearing that 65% of people administered the maximum 450-volt shock is to assume those participants must have been cruel, aggressive, or sadistic (59). It is a comforting thought, because it separates “ordinary people” from the perpetrators of harm.
However, Milgram effectively debunked this theory with his eleventh experiment. In this variation, he changed one crucial rule: he allowed the “teachers” to choose any shock level they wanted whenever the learner made a mistake (60). If they truly harbored dark, aggressive desires, this was their chance to unleash them. Instead, the teachers almost universally chose the lowest, most painless levels available, averaging a mere 3.6 on the 30-point scale (61). They actively avoided causing pain when the choice was left entirely up to them.
The Slippery Slope
So why did they go to 450 volts in the main experiment? Milgram pointed to two highly relatable psychological phenomena that kept the participants locked into the experiment: the agentic state and the “slippery slope.” We have already examined how authority relieves the individual of guilt through the agentic state. Let’s take a closer look at the slippery slope phenomenon.
The “Slippery Slope”: The physical design of the shock generator itself was a psychological trap. The machine’s switches escalated the punishment in tiny, gradual 15-volt increments. Because the process was so incremental, participants were eased into the destructive behavior step by step (62). If a person just gave a 195-volt shock, why refuse the 210-volt switch? The trap of this “slippery slope” is that in order to stop and refuse to go further, the participant would have to admit that the previous shock they just administered was also wrong (63). It was much easier to just keep going, because continuing to press the switches served to justify the painful shocks they had already delivered (64).
Inner Tension
To understand the true horror of the Milgram experiment, it is crucial to realize that the “teachers” who followed orders were not cold, robotic, or malicious individuals (65). Instead, they were often in absolute agony. Observers recorded extreme signs of tension in the laboratory: participants sweated profusely, trembled, stuttered, groaned, and dug their fingernails into their own flesh while desperately pleading with the experimenter to let them stop (66).
One of the most bizarre and unexpected reactions to this severe stress was out-of-place “nervous laughter,” which in a few cases actually escalated into uncontrollable, full-blown physical seizures (67). What kept them pressing the switches wasn’t underlying sadism or cruelty, but the paralyzing grip of what Milgram called “situational etiquette” (68; 69).
In order to stop, the participant had to breach the implicit rules of the social occasion and directly challenge the scientist in charge; for many, the deep-seated fear of appearing rude, arrogant, or awkwardly disrupting a formal social interaction proved to be an insurmountable emotional barrier (70; 71).
Cognitive Dissonance and Obedience to Authorities
Erich Fromm wrote that social scientists drew two unexpected findings from Milgram’s experiments.
“The first finding concerns the sheer strength of obedient tendencies manifested in this situation.” He explains that “subjects have learned from childhood that it is a fundamental breach of moral conduct to hurt another person against his will. Yet, 26 subjects abandon this tenet in following the instructions of an authority who has no special powers to enforce his commands. The second unanticipated effect was the extraordinary tension generated by the procedures. One might suppose that a subject would simply break off or continue as his conscience dictated. Yet, this is very far from what happened” (72).
Obedience to authority when that obedience conflicts with inner tenets of behavior creates tension that we must resolve. We refer to this inner conflict in psychology as cognitive dissonance.
These findings raised profound ethical concerns and sparked important discussions about the power of authority. The experiments also highlight the influence of situational factors on behavior: ordinary individuals can commit acts that violate their moral principles when ordered to do so by an authority figure.
Moral Justification
As Milgram proclaimed, “the person who, with inner conviction, loathes stealing, killing, and assault may find himself performing these acts with relative ease when commanded by authority.” And thus, ordinary citizens act in extraordinary and violent ways, self-righteously violating self-proclaimed values and ordinary human ethics to gladly march in obedience to an authority figure.
According to Albert Bandura, we internalize moral standards that regulate our behavior through self-sanctions. However, when our behavior violates these self-sanctions, shunning the internalized ethical standards, we look for ways to justify it.
According to moral disengagement theory, moral justification refers to the process of framing harmful actions or behaviors in a way that makes them seem morally acceptable or justified. In many instances, it appears that obedience to authority supersedes internalized ethics. However, we then must address the cognitive dissonance arising from conflicting morals and behaviors. We do this through a number of defensive strategies.
See Moral Justification for more on this topic
Who Obeys Harmful Orders? The Role of the Authoritarian Personality
To understand why some individuals complied while others rebelled in Stanley Milgram’s experiments, researchers investigated the concept of the “authoritarian personality.” Dr. Alan Elms administered the F-scale—a psychological test designed by Theodor Adorno and colleagues to measure fascistic and authoritarian tendencies—to a group of obedient and defiant subjects from Milgram’s study (73). The results revealed a strong relationship between actual submission to authority and authoritarian personality traits: subjects who obeyed the experimenter to the bitter end showed a significantly higher degree of authoritarianism than those who refused to continue.
This personality type is heavily characterized by “authoritarian submission,” an uncritical, submissive attitude toward idealized moral authorities, and “authoritarian aggression,” a tendency to condemn and punish those who violate conventional values (74). In the stressful context of the laboratory, this psychological profile translated seamlessly into the subject’s behavior: unquestioningly submitting to the scientist in command while readily punishing the weaker, subordinate “learner” (75).
“A substantial proportion of people do what they are told to do, irrespective of the content of the act and without limitations of conscience, so long as they perceive that the command comes from a legitimate authority.”
~Stanley Milgram (1974, p. 189)
A Dynamic Interaction of Situational Pressures and Predispositions
The implications of these findings are profound, as they suggest that destructive compliance is driven by a dynamic interaction between situational pressures and latent individual predispositions (76).
Individuals with high authoritarianism often struggle with the complexities and uncertainties of individual freedom, preferring the rigid clarity of hierarchical social structures where they can find psychological comfort in obedience and subordination (77; 78). Because they possess an underdeveloped internal conscience and instead rely on external authorities to dictate right and wrong, they are highly reactive to “normative threats” like societal disorder, diversity, or shifting cultural values (79).
Consequently, when these individuals are placed in an environment that justifies aggressive behavior through the guise of a worthy cause or a legitimate institution, their internal moral controls are easily bypassed, allowing them to become willing instruments of harsh and inhumane acts without feeling personal responsibility (80).
Ethical Controversy: Why Milgram Could Never Happen Today
Peter K. Lunt wrote:
“Milgram is perhaps the best-known social psychologist outside the discipline and he and his experiments are a fantastic ambassador, bringing the issues and concerns of the relation between the individual and the social world to a wide audience” (81).
Despite its immense impact on the field of psychology and popular understanding, the Milgram Experiment has also faced criticism regarding its ethical implications and the stress imposed on participants. Milgram downplayed that stress, arguing that the experience gave subjects valuable personal insight into their own willingness to obey authority. Diana Baumrind described this process of gaining self-knowledge as ‘inflicted insight’ (82).
Nevertheless, the study has significantly contributed to our understanding of obedience, conformity, and the complex interactions between individuals and authority figures. Readers of Milgram’s experiments enjoy the benefit without the harm. But how do we weigh harm to some against benefit to others? Can we sacrifice the good of the few for the benefit of the many when the few had no choice in the matter beyond their willingness to participate in a social experiment at a Yale laboratory?
Variations of the Study: Proximity and Authority Influence
Proximity to Learner
To understand the limits of this shocking compliance, Milgram conducted a series of variations testing how physical distance between the “teacher” and the “learner” affected obedience (83). In his “four-part proximity series,” he gradually brought the victim closer to the subject (84). In the baseline “Remote” setup, where the learner was in another room and could only be heard pounding on the wall, 65% of participants obeyed to the very end. When the learner was brought into the exact same room, obedience dropped to 40% (85; 86).
In the most extreme variation, the “Touch-Proximity” condition, the teacher was required to physically force the learner’s hand onto a shock plate to deliver the punishment; under these intensely personal circumstances, obedience plummeted to 30% (87; 88). Milgram realized that distance acted as a psychological buffer: when the victim was hidden away, it was easy to view the shocks as an abstract, remote requirement of the experiment, but when forced to look the victim in the eye or physically touch them, the participants could no longer deny the immediate human cost of their actions (89).
Physical Presence and Status of Authority Figure
Just as the victim’s closeness played a critical role, so did the physical presence and perceived status of the authority figure. In one variation, the stern experimenter gave his initial instructions and then left the room, issuing all further orders over a telephone. Without the scientist directly observing them, participants found it much easier to resist, and the number of fully obedient subjects dropped sharply (90; 91).
Fascinatingly, some subjects in this telephone variation resorted to clever subterfuge: they repeatedly administered the lowest, most painless shock on the board but assured the absent experimenter over the phone that they were raising the voltage as instructed. Furthermore, when Milgram stripped the authority figure of his scientific legitimacy—having an “ordinary man” give the orders instead of a researcher in a lab coat—compliance dropped to just 20%. These variations clearly demonstrated that obedience wasn’t driven by an inherent desire to harm, but by the powerful, immediate pressure of a physically present, legitimate authority figure (92; 93).
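For quick reference, the variation results discussed above (and revisited later in this article) can be collected into one summary. This tabulation is our own; the figures are the rounded percentages reported in this article from Milgram’s data, and the telephone condition is left unnumbered because the article reports only a sharp drop.

```python
# Share of subjects delivering the full 450 volts in each variation,
# as reported in this article (rounded figures from Milgram's data).
OBEDIENCE_BY_CONDITION = {
    "Baseline: learner heard but not seen": 0.65,
    "Learner in the same room": 0.40,
    "Touch-proximity: hand forced onto the shock plate": 0.30,
    "Experimenter gives orders by telephone": None,  # reported only as a sharp drop
    "Ordinary man (no lab coat) gives the orders": 0.20,
    "Two peer 'teachers' rebel first (Experiment 17)": 0.10,  # 90% of subjects defied
}
```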
Current Events Through the Lens of Milgram’s Experiments
Senator Mark Kelly recently stated that a Pentagon investigation into his remarks urging troops to refuse unlawful orders is an attempt to chill and silence military dissent (94). This tension between military discipline and individual conscience strikes at the very heart of Stanley Milgram’s obedience experiments. Milgram demonstrated that individuals placed within a strict hierarchical structure frequently enter an “agentic state,” a psychological shift where they shed personal accountability and view themselves merely as instruments executing the wishes of a legitimate authority (95).
In the military, where survival and operational success depend heavily on chain-of-command obedience, the risk of this agentic shift is especially high (96). Senator Kelly’s appeal to troops serves as a public countermeasure to this phenomenon, reminding service members that they must retain their autonomous moral judgment and legally resist destructive obedience, even when pressured by higher-ups.
Captain Paul Grueninger
History provides profound examples of individuals who successfully resisted the pull of the agentic state, most notably Captain Paul Grueninger, a Swiss police chief during the late 1930s. When Swiss federal authorities ordered border police to firmly close the border and turn back Jewish refugees fleeing Nazi persecution, Grueninger refused to do his duty. Instead, he actively subverted his superiors by forging documents and backdating arrivals, ultimately saving approximately 3,000 lives (97).
Much like the defiant minority in Milgram’s laboratory who broke the paralyzing grip of situational etiquette to rescue the “learner” from further shocks, Grueninger placed human compassion above the legitimate, yet cruel, administrative orders of his government. However, his righteous defiance came at a steep personal cost: he was fired, stripped of his pension, and lived in poverty for the rest of his life (98).
Society’s Demand for Obedience
The Pentagon’s investigation into Senator Kelly’s remarks illustrates the immense institutional pressures that make Grueninger’s brand of disobedience so rare and difficult. As Milgram noted, society and organizations naturally demand obedience to function smoothly, but this structural requirement often fails to instill internal controls against unlawful or malevolent directives (99; 100).
The military’s pushback against Kelly’s remarks underscores the inherent friction between maintaining essential institutional discipline and fostering independent moral courage. Ultimately, Kelly’s call for troops to refuse unlawful orders echoes the central warning of Milgram’s research: while the situational pull to obey authority is incredibly strong, human beings must cultivate the capacity—and the courage—to recognize when authority goes awry and break the spell of blind compliance (101).
What Actually Broke the Spell? (The Hopeful Angle)
While the shocking 65% compliance rate dominates history books, Milgram didn’t just study obedience; he was equally focused on understanding disobedience. By methodically altering the variables in his laboratory setup, he discovered exactly what it takes to break the grip of a malevolent authority and empower ordinary people to rebel (102; 103).
Distance from the Authority
As noted earlier, when Milgram changed the rules so that the experimenter gave initial instructions and then left the room to issue all further orders by telephone, the obedience rate plummeted (104; 105). Participants found it much easier to resist when they didn’t have to confront the stern authority figure face-to-face, and some even engaged in the clever subterfuge described above, repeatedly administering the lowest, painless shock level while assuring the absent experimenter over the phone that they were raising the voltage as instructed (106; 107).
Proximity to the Victim
Just as distancing the authority reduced compliance, bringing the victim closer also broke the spell. In a variation known as “Touch-Proximity,” the learner refused to put his hand on the shock plate after 150 volts. The experimenter then ordered the teacher to physically force the learner’s hand onto the plate to deliver the punishment. Forced to make direct, physical contact with the victim’s suffering, obedience dropped sharply to just 30% (108; 109; 110).
Proximity to the victim is not an isolated finding by Milgram but a common theme throughout social psychology. Muzafer Sherif and his colleagues found in their Robbers Cave experiment that the natural in-group/out-group conflict between juvenile campers could be mitigated through increased cooperative contact between the warring campers (111).
In psychology, we refer to this as the contact hypothesis.
The Power of Peers
Perhaps the most powerful and hopeful variation was Experiment 17, which tested the liberating effects of group pressure. In this setup, the naive subject was paired with a team of two other “teachers” who were actually actors working for Milgram. When the two actors eventually rebelled and refused to continue shocking the learner, a staggering 90% of the real subjects followed their lead and defied the experimenter. As Milgram himself concluded, this provides a profound takeaway for all of us: mutual support provided by peers is our strongest bulwark against the excesses of authority (112; 113).
Associated Concepts
- Social Anxiety: Individuals with social phobia often experience overwhelming anxiety and self-consciousness in everyday social encounters, leading to avoidance of social events and impairment in various areas of their lives.
- Asch Conformity Experiment: These studies conducted by psychologist Solomon Asch in the 1950s aimed to investigate the extent to which social pressure from a majority group could influence a person to conform.
- Risky Shift Phenomenon: This refers to the observation that groups often make riskier decisions than individuals, which can be a component of groupthink.
- Groupthink Theory: Developed by Irving Janis, groupthink occurs when a cohesive group prioritizes consensus over critical thinking. Members suppress dissenting opinions to maintain harmony. Groupthink can lead to flawed decisions and lack of creativity.
- Stanford Prison Experiment: This study, conducted by Philip Zimbardo in 1971, explored the psychological impact of power dynamics within a simulated prison. The study was terminated early due to abusive behavior and raised ethical concerns.
- Spiral of Silence Theory: This theory developed by German political scientist Elisabeth Noelle-Neumann explains how people remain silent when they perceive their views as the minority. Fear of isolation and the influence of media shape this behavior.
A Few Words by Psychology Fanatic
In conclusion, the Milgram Experiment serves as a powerful reminder of the intricacies of human behavior. Stanley Milgram’s experiments remind us of the dangerous tendencies deeply ingrained into our biological makeup. When we consider our natural in-group/out-group bias along with the easy escape of acting in unethical and cruel ways when it is modeled and encouraged by authority, the fight for compassion and unity seems hopeless.
The internet invites swarms of “teachers” willing to administer authority-sanctioned shocks to all those who believe, look, or behave differently than themselves. A young athlete is subject to threats and hate because prominent authority figures model the behavior, essentially directing followers: “It is absolutely essential that you continue.”
There are few courageous enough to stand up, like Captain Paul Grueninger, and proudly proclaim, “No! This isn’t right.”
Milgram’s experiments continue to stimulate ongoing debates on the nature of obedience, moral decision-making, and the ethical boundaries of research.
Milgram’s experiments, like most experiments, have been subject to harsh criticism. Critics contend he manipulated numbers and tainted the experiments with his own bias. Perhaps some of their criticism is justified. Personally, I’m not certain that any experiment in a laboratory can be completely pristine, untainted by the environment. Accordingly, we must view the results with some skepticism. However, this does not dismiss all of his findings. We can still learn much from these social experiments, consider our own tendencies to destructively obey, and engage in mindful exploration of our actions.
Last Update: February 22, 2026
References:
Adorno, T. W., Frenkel-Brunswik, E., Levinson, D. J., & Sanford, R. N. (1964). The authoritarian personality (Part Two). Science Editions. ASIN: B000BOWNZA
Badhwar, Neera (2009). The Milgram Experiments, Learned Helplessness, and Character Traits. The Journal of Ethics, 13(3), 257-289. DOI: 10.1007/s10892-009-9052-4
Blass, Thomas (2004). The man who shocked the world: The life and legacy of Stanley Milgram. Basic Books. ISBN: 9780465008070
Blass, Thomas (2000). The Milgram Paradigm After 35 Years: Some Things We Now Know About Obedience to Authority. In: Thomas Blass (ed.), Obedience to Authority: Current Perspectives on The Milgram Paradigm, (pp. 35-60). Lawrence Erlbaum Associates. ISBN: 9780805839340; APA Record: 1999-04354-000
Dawes, Robyn (1996). House of Cards: Psychology and Psychotherapy Built on Myth. Free Press; 1st edition. ISBN: 0029072050; APA Record: 1994-97431-000
Fromm, Erich (2013). The Anatomy of Human Destructiveness. Open Road Media; 1st edition. ISBN: 080501604X
Herrera, C. (2001). Ethics, Deception, and ‘Those Milgram Experiments’. Journal of Applied Philosophy, 18(3). DOI: 10.1111/1468-5930.00192
Lunt, Peter Kenneth (2009). Stanley Milgram: Understanding Obedience and its Implications. Bloomsbury Academic; 1st edition. ISBN: 9780230573154
Milgram, Stanley (1963). Behavioral Study of obedience. Journal of Abnormal Psychology, 67(4), 371-378. DOI: 10.1037/h0040525
Milgram, Stanley (2009/1974). Obedience to Authority: An Experimental View. Harper Perennial Modern Classics; Reprint edition. ISBN: 9780061765216
Mook, Douglas (2004). Classic Experiments in Psychology. Greenwood Press. ISBN: 9780313318214
Rochat, François; Maggioni, Olivier; Modigliani, Andre (2000). The Dynamics of Obeying and Opposing Authority: A Mathematical Model. In: Thomas Blass (ed.), Obedience to Authority: Current Perspectives on The Milgram Paradigm, (pp. 161-192). Lawrence Erlbaum Associates. ISBN: 9780805839340; APA Record: 1999-04354-000
Rochat, François; Modigliani, Andre (2000). Captain Paul Grueninger: The Chief of Police Who Saved Jewish Refugees by Refusing To Do His Duty. In: Thomas Blass (ed.), Obedience to Authority: Current Perspectives on The Milgram Paradigm, (pp. 161-192). Lawrence Erlbaum Associates. ISBN: 9780805839340; APA Record: 1999-04354-000
Rosenbaum, Max (1983) Compliance. In: Max Rosenbaum (ed.) Compliant Behavior: Beyond Obedience to Authority (pp. 25-49). Human Science Press. ISBN: 9780898851151
Sapolsky, Robert (2018). Behave: The Biology of Humans at Our Best and Worst. Penguin Books; Illustrated edition. ISBN: 1594205078
Sherif, Muzafer; Harvey, O. J.; Hood, William R.; Sherif, Carolyn W.; White, Jack (1988). The Robbers Cave Experiment: Intergroup Conflict and Cooperation. Wesleyan University Press; Illustrated edition. ISBN: 9780819561947
Stenner, Karen (2005). The Authoritarian Dynamic. Cambridge University Press. ISBN: 9780521534789; APA Record: 2005-09667-000
Timming, Andrew, & Johnstone, Stewart (2015). Employee silence and the authoritarian personality. International Journal of Organizational Analysis, 23(1), 154-171. DOI: 10.1108/ijoa-06-2013-0685
Federal News Network (2026, January). Sen. Mark Kelly calls Pentagon investigation into his remarks a move to ‘chill military dissent’. https://federalnewsnetwork.com/litigation/2026/01/sen-mark-kelly-calls-pentagon-investigation-into-his-remarks-a-move-to-chill-military-dissent/

