Saturday, December 01, 2007

The Long View: the Prisoner Experiment and what it teaches us.

Crossposted to Daily Kos



Yesterday I wrote about Milgram's work and how diffusion of responsibility supports torture.

Today I'm continuing that theme, discussing how Zimbardo's Prisoner Experiment at Stanford shows us similar trends.

First, a summary of the prisoner experiment, for those of you unfamiliar with it.

[Embedded video: a YouTube summary of the Stanford Prison Experiment.]

If you're not interested in the YouTube version, Wikipedia has a great summary as well.

Here's the simple version:

When we set up bad structures, we end up with people who do bad things.

That's it. It's that simple. What Zimbardo mentions (more in this recent interview) is that, in his experiment, the guards boiled down to two kinds: "good" guards and "bad" guards. The "bad" guards are the ones who engaged in brutal behavior toward the "prisoners." The "good" guards are the ones who didn't.

But none of those "good" guards tried to stop it.

In the Stanford experiment, we weren't dealing with people who had a moral right to be guards. We weren't dealing with prisoners who had done anything wrong. Everyone was randomly assigned to a role. And yet, still, we had prisoners breaking down. We had guards deliberately demeaning and abusing prisoners.

Does this ring a bell?

Without proper leadership, people in authority tend towards chaos. Without proper controls and accountability, people in authority do damage.

Without a proper idea as to who the enemy is, soldiers don't know what to do.

So they behave badly.

And, like I mentioned yesterday, we don't want to acknowledge this:

I'm going to mention another concept that I've talked about before: cognitive dissonance -- the condition that exists when our behavior contradicts our beliefs. When dealing with cognitive dissonance we sometimes change our behavior, but we sometimes also change our beliefs.

We do not want to think of ourselves as a country which supports or promotes torture. It contradicts our beliefs. So when we see that we have, in fact, engaged in torture, we have some choices:

  1. we can change our beliefs to convince ourselves that we think torture is ok;


  2. we can say "this has to stop" and change our behavior;


  3. we can say "this has to stop" and then convince ourselves that we've changed our behavior without actually doing it;


  4. we can say "we oppose torture" and then reclassify everything we do as something that's not torture.


We're so focused on this idea of supporting our troops that we refuse to acknowledge the reality: by failing to hold them accountable and by refusing to hold them to a higher standard, we are doing them damage. We're so focused on choosing option #3 above-- pretending we're solving things without actually doing so-- that we're risking serious long-term damage.

A few weeks ago, in another post, I wrote about the problems facing our soldiers:

In the meantime, as IAVA reports, the professional component of this is far from adequate:

90% of military psychiatrists, psychologists and social workers reported no formal training or supervision in the recommended PTSD therapies, and there is a general shortage of trained mental health professionals in the military. The Pentagon screens returning troops for mental health problems via an ineffective system of paperwork. Studies have shown that many troops are not filling out their mental health forms, that there are serious disincentives for troops to fill the form out accurately, and that those whose forms indicate they need care do not consistently get referrals.


Both guards and prisoners in the Stanford experiment suffered mental damage as a result of it. And this was fake.

Imagine yourself placed in a situation where the rules are unclear and you don't know what you're supposed to do, but your basic role is "guard." You don't know who the enemy is. Or you don't know what your prisoners have done. Or you don't know why you're there or what your mission is.

And you're there, in this prison, guarding people whom you don't understand, who don't understand you, and in this scene you're the "good" guard. You're not the one strapping someone to a table. You're not the one holding the suffocation hood. You're not the one doing the waterboarding.

But you're there. And you're supposed to be keeping everything in order. You're one of the 92%-- Milgram's thirty-seven out of forty-- who won't intervene when someone in the room with you is killing someone. Because you're just following orders.

Imagine this insanity happening around you, and you being part of it, and yet also being just a casual observer who has the power to intervene and prevent atrocities and fails to do so.

Now imagine that you think of yourself as a good person, but are connected with this.

Remember the concept of "cognitive dissonance" that I referenced earlier?

What do you think this does to a person?

I'm horrified by what I see, but I get that pretending it's not there is worse.

I want to support my country, but I can't do so in a way that ignores the truth.

I want to support the troops, but I can't do so without knowing who they are and what their limitations are.

I want what we're doing overseas to stop. It doesn't just do damage to other countries and other people. It does damage to us. It destroys the hearts of everyone involved: prisoner, guard, soldier, civilian.

It destroys the minds and it destroys the souls.

Friday, November 30, 2007

The Long View: How diffusion of responsibility supports torture


Crossposted to Daily Kos



Stanley Milgram began his research into obedience in the early 1960s. His original intent had been to demonstrate that "just following orders" wasn't a legitimate excuse for Nazis who committed atrocities during the Holocaust.

It was his belief that only a select few people would engage in acts which could cause serious harm to others when ordered to do so. His belief was shared by the students he polled.

They were wrong.

Milgram's experiment was a simple one that involved three people:

  1. the Authority Figure/Experimenter (E);
  2. the Technician/Teacher (T);
  3. the Learner (L).


The experiment was set up as follows:

"E" would show up in a white coat and explain to two individuals that one of them would be playing the part of the teacher and one would be the learner and explain the rules. Then he would hand a slip of paper to each one. One would say "Teacher" and the other would say "Learner." The learner (L) would move to another another room and the teacher (T) would stay with the Experimenter.

Then they would get to work.

The Teacher would, through a microphone, read a question to the Learner. If the Learner got the question wrong, T would administer a shock. Each time the shock was administered, T would increase the voltage a little for the next time and L would scream in pain.

The dial went up to "450 volts." The highest settings were marked "DANGER" or "LETHAL."

The thing is, this experiment was a ruse. The "Learner" was part of the experiment, an actor along with the Authority figure. No one was shocked. No one was in pain. L wasn't being tested.

T was.

The idea of the experiment was to discover how far we're willing to go in harming another person, and how authority can influence those limits. I'll get to the results soon, but first I have to explain something:

In social psychology, we talk about diffusion of responsibility: when responsibility is spread across other people or handed up to an authority, no one individual feels adequately responsible for the circumstances around them. Having an authority figure available to tell us what to do provides an immense amount of diffusion of responsibility.

In Milgram's experiment, E didn't threaten or cajole. If T didn't want to continue with the experiment, the experimenter would first say "please continue." If that failed, the next statement would be that "the experiment requires that you continue." If that didn't do the trick, E would say that "it is absolutely essential that you continue," and finally, "you have no other choice, you must go on."

If T still refused after those four statements, the experiment would end.

If the experiment didn't end through refusal, it would end after three "shocks" at the maximum level of 450 volts.

There were no threats from E. There was no danger, no loss for refusing. It was merely those four statements on the part of the experimenter.

It's easy for us to look at this and think, "I wouldn't ever go that far." It's easy for us to say "I'd never do that."

But the fact of the matter is, Milgram's work and the studies that have replicated it show a remarkable consistency: more than 60% of subjects stick with the study until the very end, even though they believe at the time that they might be doing serious harm to another human being.

So yes, I'd love to be able to say "I'd never do a thing like that." But I know enough about psychology and self-deception to understand full well that I can't be certain how I'd behave if faced with such a dilemma. On the surface, it seems like a no-brainer and I honestly can't conceive of doing anything but walking out. But I don't know that I'm that much different from the many people who go along with the experimenter. I don't know that I'm better than they are and I don't know that I'm that strong a person.

I hope I am.

But I'm also fine with not knowing whether I'm one of that 60+% who would buckle under the dread of the words "it is absolutely essential that you continue."

So.

Now you know about Milgram's work. Some of you knew all this already. Some of you didn't.

But that's not the point of this piece.

The point is to talk about where we go from here.

In 1974, Milgram wrote an article for Harper's, "The Perils of Obedience":

The problem of obedience is not wholly psychological. The form and shape of society... have much to do with it. There was a time, perhaps, when people were able to give a fully human response to any situation because they were fully absorbed in it as human beings. But as soon as there was a division of labor things changed... The breaking up of society into people carrying out narrow and very special jobs takes away from the human quality of work and life. A person does not get to see the whole situation but only a small part of it, and is thus unable to act without some kind of overall direction. He yields to authority but in doing so is alienated from his own actions.

Even Eichmann was sickened when he toured the concentration camps, but he had only to sit at a desk and shuffle papers. At the same time the man in the camp who actually dropped Cyclon-b into the gas chambers was able to justify his behavior on the ground that he was only following orders from above. Thus there is a fragmentation of the total human act; no one is confronted with the consequences of his decision to carry out the evil act. The person who assumes responsibility has evaporated. Perhaps this is the most common characteristic of socially organized evil in modern society.


Let me tell you a story. A woman I know has a son who, in September of 2001, was in his early teens. He was at home with his father when the first tower fell. They were watching TV at the time, glued to the set.

When the tower fell, his first comment was "cool!"

There was an awkward pause and at first he didn't understand what he'd just said.

Then there was a moment of realization on his part. He looked at his father, confused, and said "wait-- that was real, wasn't it?"

This kid-- a perfectly ordinary kid in so many ways-- no delusions, no dissociative disorders, no disconnect from reality-- said "cool" when one of the towers fell. He said this not because he was mean, or cruel or inhuman.

He said it because it happened on television. And when big, dramatic things happen on television, they happen because of effects, because of writers, because of cameras and tricks and angles and stunt performers.

I'm going to break from this for a moment, because something big is going on:

As I write this diary, there's a hostage situation over at one of the Clinton campaign offices in New Hampshire. I don't know much more than that. No one seems to know much more at the moment. I wonder how many people watching it are feeling separated from it, and how many are taking it like it's something real and profound. Judging from a quick scan of freerepublic.com (I will not link there), there are definitely people who seem to take it as though it's a game, and something worthy of jokes. I don't mean the sort of jokes that people make when nervous or disturbed. I mean the sort of jokes that people make when they are, in fact, completely separated from humanity.

I don't know what to say about this. I sometimes forget how bad the comments over there can be, and I shouldn't be bothered by them, but I just find it disturbing. I think we need to find a way to bring these people to light without allowing ourselves to be sucked into their twisted world. I have yet to figure out a way of doing that.

Obviously, I'm not going to be posting this diary at the time I expected to. There's no point at all in posting something like this until the current crisis is resolved, so by the time you're reading all this, we'll all know a lot more about what's going on here.


So, anyway: more from Milgram's article:

I will cite one final variation of the experiment that depicts a dilemma that is more common in everyday life. The subject was not ordered to pull the lever that shocked the victim, but merely to perform a subsidiary task... while another person administered the shock. In this situation, thirty-seven of forty adults continued to the highest level of the shock generator. Predictably, they excused their behavior by saying that the responsibility belonged to the man who actually pulled the switch. This may illustrate a dangerously typical arrangement in a complex society: it is easy to ignore responsibility when one is only an intermediate link in a chain of actions.


I'm going to mention another concept that I've talked about before: cognitive dissonance -- the condition that exists when our behavior contradicts our beliefs. When dealing with cognitive dissonance we sometimes change our behavior, but we sometimes also change our beliefs.

We do not want to think of ourselves as a country which supports or promotes torture. It contradicts our beliefs. So when we see that we have, in fact, engaged in torture, we have some choices:

  1. we can change our beliefs to convince ourselves that we think torture is ok;


  2. we can say "this has to stop" and change our behavior;


  3. we can say "this has to stop" and then convince ourselves that we've changed our behavior without actually doing it;


  4. we can say "we oppose torture" and then reclassify everything we do as something that's not torture.


It's not a difficult argument to make that we, as a nation, have adopted a combination of #3 and #4. We've moved our debate to treat torture as though it's worth discussing whether it's an acceptable approach, and we've done so not through an open discussion but through a redefining of torture into something that ignores the reality behind it.

This denial of the reality behind it is so severe that someone who's experienced torture actually got lectured by, of all people, Mitt Romney on how torture should be defined.

Here's the reality as I see it:

  1. we, as a matter of policy, torture people;


  2. we, to preserve our sense of self-integrity, don't want to acknowledge that we torture people;


  3. despite all this, some of us openly acknowledge that we torture.


We need a wave of action, pushing our media to reflect a truthful and accurate narrative about torture. Therefore, every time we see a "news" article which:

  1. uses the word "waterboarding" but not the word "torture;"

  2. describes the act of "waterboarding" as "simulated" drowning;

  3. references without critique the claim that "we do not torture;"

  4. references torture on the part of lower-level military personnel without mentioning any higher ups;

  5. makes any reference to "torture" without acknowledging any history of torture on the part of the US...


we need to write letters. We need to bombard these papers with letters reminding them of the truth. We need to not let them get away with rewriting the narrative to dismiss torture. We need to eliminate diffusion of responsibility by putting ourselves front and center into the reality of what's gone on.

Research on obedience has shown that we comply easily when we feel removed from the situation. We ignore the reality of things we cannot easily control, assuming that someone else will take responsibility. We find it easier to push a button that will kill someone five miles away than to pull a trigger that will kill someone who will look into our eyes. We find it easier to ignore an act of atrocity and pretend it is not our problem than to take responsibility for it.

Torture can only be supported through obfuscation and lies. We will not stop it until every one of us chooses to actively challenge those lies, until we push ourselves not just to bemoan the use of torture but to fight it, every step of the way. Fight it when someone claims we need it to get information. Fight it when someone pretends it isn't real. Fight it when someone refuses to acknowledge it. Fight it when someone obscures its meaning.

Never.

Stop.

Fighting.

Torture.

Friday, February 23, 2007

Bird color, physics and interpretation

There's a great Birdwatcher's Digest piece about why we consider some birds to be blue, despite the fact that no feather anywhere in the world contains blue pigment: the blue we see is structural, produced by the way a feather's microscopic structure scatters light.

Monday, January 22, 2007

Trusting the Random

A common source of confusion for new students of research methodology is a lack of faith in the random. If you're trying to determine, for example, whether or not classical music improves memory retention, students will want to divide the control and experimental groups based on everyone's -existing- memory skills, to make sure everything breaks down evenly.

You can do this, but it has its drawbacks: you run a serious risk of thinking your research is more significant than it is, because you've divided everyone along arbitrary lines that have no factual basis. You also invite unconscious experimenter bias the moment you make intentional efforts to break down the groups in an equitable fashion.

If, however, you break them down along totally random lines, you place yourself in a better position by removing yourself from the decision of who goes into which group. You still run the risk of random factors influencing your experimental results, but this gets taken care of through replication and reassessment: the random elements become statistically insignificant once multiple experiments suggest the same results.
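To make this concrete, here's a minimal sketch in Python (my own illustration, not from any actual study) showing both halves of the argument: a single random split of participants can be slightly uneven on a preexisting trait, but across many replications those chance differences average out to essentially nothing. The "baseline memory scores" are hypothetical numbers invented for the example.

    import random
    import statistics

    def random_split(participants):
        """Shuffle a copy of the participant list and split it in half."""
        pool = participants[:]   # copy, so the original list is untouched
        random.shuffle(pool)
        half = len(pool) // 2
        return pool[:half], pool[half:]

    # 40 hypothetical participants with preexisting "memory scores."
    participants = [random.gauss(100, 15) for _ in range(40)]

    # One random split: the two groups may differ a bit by pure chance.
    control, experimental = random_split(participants)
    print("one split:", statistics.mean(control) - statistics.mean(experimental))

    # Many replications: the chance differences cancel out on average.
    diffs = [
        statistics.mean(c) - statistics.mean(e)
        for c, e in (random_split(participants) for _ in range(10_000))
    ]
    print("mean over 10,000 splits:", statistics.mean(diffs))  # close to 0.0

Run it a few times: the single-split difference bounces around from run to run, while the average over 10,000 splits sits near zero. That's the replication point in action.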

Monday, January 15, 2007

Bilingual Benefits

Yahoo News reports today on a Canadian study which suggests that bilingualism delays the onset of dementia.

For years, I've been telling my students about the benefits of a bilingual education: understanding multiple languages not only provides real cognitive and health benefits, but also helps us understand language as something more fluid and less rigid.

Saturday, January 13, 2007

Great resource on Phineas Gage

As you may know, Phineas Gage was a railroad foreman who suffered a major personality change after an injury damaged his frontal lobe. I was doing some research for a project and found this resource on Gage.

Tuesday, January 09, 2007

Like I've been trying to tell everyone forever...

...Reuters reports that funding affects study results:

Studies funded entirely by industry were four times to eight times more likely to be favorable to the financial interests of the sponsors than those paid for by other groups, the researchers found.

Of the 22 studies clearly identified as funded by companies or industry groups, just three, or 13.6 percent, had findings that were unfavorable to the beverage studied.

More than 38 percent of the independently funded studies were negative, the researchers found.


Like I always say, know your sources.