Disrespect Authority

Dave Thomer

"Who do you trust? And who do you serve?"

-- Crusade opening credits, written by J. Michael Straczynski

One of the things I like about Straczynski is that he asks good questions. In a perceived time of crisis, when our leaders make demands and requests of us, these questions are particularly apt. To whom do we give the moral authority to guide or dictate our actions? How much authority are we prepared to give them? To answer these questions accurately, we need to understand ourselves and the nature of our relationship to those in authority. Unfortunately, as a significant psychological experiment shows, that understanding is often lacking.

Yale professor Stanley Milgram conducted his initial experiment in the 1960s, using newspaper ads and mail solicitations to recruit volunteers for what he claimed was an experiment to test the effect of pain and negative reinforcement on memory. The researcher administering the test told each volunteer that there would be two subjects for each test, one 'teacher' and one 'learner.' The learner would try to memorize a set of word pairs, then attempt to match each word with its partner. Each time the learner got an answer wrong, the teacher would administer an electric shock, with each shock 15 volts stronger than the previous one. The two roles would be randomly assigned to the two subjects by a drawing, after which the learner was strapped into a chair to receive the shocks, and the teacher was brought into a control room with the researcher.

In reality, the drawing was rigged so that the volunteer would always get the role of teacher. The learner was in on the actual nature of the experiment, and never received any shocks of any kind. The learner gave out a pre-determined set of answers, a majority of which were wrong. As the supposed shocks grew stronger in intensity, the learner would gradually act more and more agitated, and by the tenth or fifteenth 'shock,' he would scream in protest and demand to be let out of the experiment. If the teacher hesitated to administer the next shock, the researcher would politely but firmly instruct the teacher to do so. The experiment concluded when the teacher refused to follow the instruction, or when the teacher had administered 30 'shocks,' the last of which was labeled as 450 volts in intensity. When the experiment was over, the teacher was reintroduced to the learner and reassured that no harm had been done. The researchers also explained that the actual point of the experiment was to test the subject's willingness to follow the instructions of an authority figure, even when those instructions ran counter to the presumably basic human impulse not to cause suffering in a fellow human being.
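As a quick check of the arithmetic in the description above, a minimal sketch (purely illustrative, not anything from Milgram's actual apparatus) confirms that 30 levels in 15-volt increments do end at the 450-volt label:

```python
# Illustrative only: the shock schedule as described -- 30 switches,
# each 15 volts stronger than the last, starting at 15 volts.
levels = [15 * n for n in range(1, 31)]

assert len(levels) == 30      # thirty 'shocks' in all
assert levels[-1] == 450      # the final switch is labeled 450 volts
```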

What surprised the researchers was the number of subjects who never disobeyed the authority figure and administered all 30 'shocks': 65 percent. Some did so calmly; others appeared agitated, looking to the researcher to give them the OK to stop. Milgram had expected, and hoped, that far more people would disobey, an expectation many others would no doubt share. Indeed, in surveys and other research, when the experiment was described without any information on the result, those surveyed predicted a far higher percentage of refusals.

Milgram's book Obedience to Authority provides greater detail on the experiment, the variations Milgram ran over the years, and the conclusions he drew. The experiment raises any number of ethical questions, which I think it would be worth our while to discuss in the forums. Some of those questions are:

Is Milgram's experiment itself a violation of some kind of ethical norm?

Placing people in a situation where they believe they are causing someone extreme physical pain could be considered a kind of psychological torture. The experiment relies on dishonesty and manipulation to be effective. Treating the volunteers as experimental subjects in this way can also be seen as dehumanizing, reducing them to the position of lab rats scurrying through a maze.

I myself don't find these objections convincing, although I can understand why someone would. Dishonesty is part and parcel of the experimental method when it comes to human beings -- witness the concept of placebos. More importantly, the experimenters took great care to reassure the volunteers, not only that the 'learner' was in fact all right, but that their responses were normal and nothing to be ashamed of; in post-experiment interviews the volunteers had the opportunity to discuss their feelings on the matter as well. And most importantly, they always had the opportunity to leave. The most severe restraint placed on them was a researcher saying, "You have no choice, you must go on." No threats were made, no physical force was brought to bear. Many of us may find ourselves in positions where our job, our livelihood or our perceived safety might rest on doing something that could be harmful to others. If we expect ourselves to be able to stand up to that kind of pressure, then we should be able to handle the strain of Milgram's experiment.

As for the possibility that the experiment is dehumanizing -- I can see that. No one wants to reduce human beings to numbers, statistics or test subjects. On the other hand, one of the fundamental tenets of Deweyan pragmatism is the use of an experimental method to test our beliefs empirically, and in doing so get better information about how the world works. That's exactly what Milgram's experiment does. It helps dismiss the myth that atrocities in the name of authority 'could never happen here.' The loss of that myth might make us uncomfortable, but such is often the price we must pay for understanding and improvement.

Is 'following orders' a viable ethical defense after all?

Milgram's experiment was inspired in part by a desire to understand the behavior of citizens and soldiers of Nazi Germany, probably the most famous example of the defense that if one is ordered to do something with immoral consequences, the moral responsibility belongs solely to the person giving the order, and not the one following it. In a sense, one might argue that Milgram's findings support that defense -- if following orders is a natural habit, then it might seem harsh to hold someone responsible for doing so. On the other hand, overcoming our natural inclinations is probably the core of ethics in the first place -- no one said being ethical would be easy.

Now that we've identified the problem, what do we do?

This, in the end, is the big one. Clearly, some level of respect for authority is essential. But Milgram's findings indicate that, despite the protests of parents of teenagers everywhere, we may in the end have too much respect. Personally, I think Milgram's book should be required reading at the high school level -- making people aware of the phenomenon might help them when faced with a similar circumstance. Beyond that, I'm not sure. It is my hope that the general critical thinking skills the educational system should provide would, combined with this psychological understanding, do a great deal of good. Since we're still a long way from achieving either on a wide scale, there's plenty of work to be done.

What does the rest of the staff think of this article? Head to the forums to find out, then add your two cents.