Tuesday, November 22, 2011

Doing the Ethical Thing May Be Right, but It Isn’t Automatic
By ALINA TUGEND
Published: November 18, 2011, New York Times

FOR the last few weeks, the sex abuse scandal at Penn State and the harassment claims against the Republican presidential candidate Herman Cain have been fodder for discussion at my house. The same is true, I assume, around the country.

Putting aside the specifics of each case, one question that has come up is, “What would I do?” That is, if I saw what seemed to be a crime or unethical act committed by a respected colleague, coach, teacher or friend, would I storm in and stop it? Would I call the authorities immediately? Would I disregard the potentially devastating impact on my job or workplace or beloved institution?
Absolutely, most of us would probably reply. I think so, others might respond. And the most honest answer? I don’t know.
As much as we would like to think that, put on the spot, we would do the right — and perhaps even heroic — thing, research has shown that that usually isn’t true.
“People are routinely more willing to be critical of others’ ethics than of their own,” wrote Francesca Gino, an associate professor at Harvard Business School, and two co-authors in “See No Evil: When We Overlook Other People’s Unethical Behavior,” which appeared as a chapter in the book “Social Decision Making” (Psychology Press, 2009). “People believe they are more honest and trustworthy than others and they try harder to do good.”
But our faith in ourselves isn’t borne out by history or research, something the Times columnist David Brooks pointed out in his column this week.
The most well-known example of this in academia is the experiment conducted by the Yale University psychologist Stanley Milgram in the early 1960s. In the experiment, participants were “teachers” and, unbeknown to the participants, the “learner” was really an actor. The teacher was to instruct the learner in word pairs. For every wrong answer, the teacher could shock the learner, increasing the intensity of the shock for each wrong answer.
In reality, there were no shocks (the teacher couldn’t see the learner), but the person administering the shocks didn’t know that. In fact, the learner would bang on the wall, supposedly in pain, as the shocks “increased.”
In the end, a majority of the “teachers” administered the strongest shock of 450 volts, and the experiment was replicated elsewhere with similar results. The findings are depressing — that ordinary people can be easily persuaded to do something they believe is wrong.
“People would sit there crying and sweating, but they didn’t want to be rude,” said Carol Tavris, a social psychologist and author of numerous books, including “Mistakes Were Made (But Not by Me)” (Harcourt, 2007). But most people say they believe they would act differently from the participants, despite evidence to the contrary, she said.
For example, Professor Gino said, she and her colleagues asked female job candidates what they would do if inappropriate comments were made in a job interview.
“Most said they would walk away or raise a red flag,” she said. “But in reality, when it happened, they didn’t do that. Across the board, research points to the fact that people want to behave well but give in to temptations.”
Research also shows that it is much easier to step over the boundary from ethical to unethical when there is a gradual erosion of moral values and principles rather than one big leap.
A 2009 article in The Journal of Experimental Social Psychology, also co-written by Professor Gino, used as an example an accounting firm that has an excellent relationship with a client company. The accounting firm, which receives tens of millions of dollars in fees from the client, approves the company’s high-quality and ethical financial statements.
For three years, everything is fine. But suddenly, in the fourth year, the company stretches and even breaks the limits of the law.
Another case? Same accounting firm, same client. This time, after the first good year, the client bit by bit pushes the ethical envelope over the next three years.
The accounting firm would be more likely to approve the financial statements in the second case than in the first, the article says.
One of the reasons, Professor Gino and her colleague write, is that “unethical acts can become an integral part of the day-to-day activities to such an extent that individuals may be unable to see the inappropriateness of their behaviors.”
Here’s another way we deceive ourselves. Most of us say we admire people who stand up for what’s right (or what is eventually shown to be right), especially when they are strong enough to stick to their guns in the face of strenuous opposition.
But again, research shows that’s not necessarily true. In “When Groups are Wrong and Deviants are Right,” published last year in The European Journal of Social Psychology, Australian academics argue that group members are often hostile to people who buck conformity, even if the members later agree with the dissenter.
Even when, say, a whistle-blower may prove to be correct, she is not always admired or accepted back into the fold, the academics found. Rather, the group may still feel angry that the whistle-blower damaged its cohesion.
Philip G. Zimbardo, professor emeritus of psychology at Stanford University and author of numerous books, including “The Lucifer Effect: Understanding How Good People Turn Evil” (Random House, 2007), has spent a lifetime studying moral degradation. In 1971, Professor Zimbardo set up the infamous Stanford Prison Experiment, in which the college student “guards” turned sadistic in a very short time, denying food, water and sleep to the student “prisoners,” shooting them with spray from fire extinguishers and stripping them naked.
Professor Zimbardo has classified evil activity in three categories: individual (a few bad apples), situational (a bad barrel of apples) or systemic (bad barrel makers).
“The majority of people can get seduced across the line of good and evil in a very short period of time by a variety of circumstances that they’re usually not aware of — coercion, anonymity, dehumanization,” he said. “We don’t want to accept the notion because it attacks our concept of the dignity of human nature.”
While it may be easy to give up in the face of such discouraging findings, the point, Professor Zimbardo and others say, is to make people conscious of what is known about how and why people are so willing to behave badly — and then use that information to create an environment for good.
Professor Zimbardo, for example, has established the Heroic Imagination Project. Already in some California schools, the project has students watch the Stanford Prison Experiment and similar ones about obedience to authority to teach how individuals can recognize the power of such situations and still act heroically.
He says he hopes to bring his project into the wider world of business and the military.
Although no one thinks it’s an easy task, Professor Zimbardo is not alone in his faith that people can be taught, and even induced, to do the right thing.
“I am a true believer that we can create environments to act ethically,” Professor Gino said. “It just might take a heavier hand.”
A version of this article appeared in print on November 19, 2011, on page B5 of the New York edition with the headline: Doing the Ethical Thing May Be Right, but It Isn’t Automatic.
