The Office of the President Emeritus: University of Oregon

Speeches and Writings

"Situational Ethics, Social Deception, and Lessons of Machiavelli"

Judge Learned Hand Award Luncheon
Oregon Chapter of the American Jewish Committee
Tuesday, November 30, 2004
12:00 Noon
Governor Hotel - Portland OR

Dave Frohnmayer, President
University of Oregon

Thank you for the honor of this award and for the occasion that this distinguished organization has given to us for a time of reflection. One of my own great mentors, the late Elliot Richardson, was a law clerk for the great judge, Learned Hand, and embodied his wisdom and moderation in his own life and actions.

Our topic today is daunting because it forces us to ask what underlies our judgments about ethical behaviors. Do we really have choices? Are we the children of nurture or of nature? Are there circumstances that might cause any of us to cross an ethical line? And what is ethics? To the last question first: there is not even total agreement on how to conceptualize the inquiry.

Some people think of ethics as staying out of jail. A second group of people think about ethics as codes of conduct and rule books. A third group may understand ethics more broadly in spiritual commands or principles of a religious belief. A fourth group might think about ethics as helpful aphorisms such as "first do no harm" or "do unto others as you would have them do unto you," or the utilitarian principle, "the greatest good for the greatest number." Yet another group believes that ethics has to do with doing the right things for the right reasons. And a final group might even think of ethics as challenging because so often we are torn not between choosing right or wrong but rather choosing between two rights or two wrongs: we become confused, perplexed, even paralyzed, by the realization that our values actually seem in conflict.

Without ignoring these classifications of ethics, some narrow and some increasingly grand and important (I return to some of them later), I thought today I could escape from sanctimonious platitudes by exploring what the academy, and more specifically what social science, modern psychology, and even genetic discoveries, may tell us about ethics, ethical choices, and doing the right thing.

I will spend the next few minutes recounting some recent learning, which mostly is ignored by our social discourse about ethics. My purpose is to explore six experiments or discoveries and examine them one by one, trying to view some of the enduring questions of ethical choice through the lens of empirical evidence about human existence and behavior.

Let's go back thirty years or so to the findings that are most well known, the most infamous - and the ones that retain a tantalizing level of mystery because we can't redo the experiments. They are, first, Phil Zimbardo's Stanford prison experiment and second, Stanley Milgram's Yale University study of "obedience to authority."

ZIMBARDO - THE STANFORD PRISON EXPERIMENT

In 1971, Stanford psychologist Philip Zimbardo transformed the basement of an academic building on campus into a mock prison as the setting for a two-week experiment in prison psychology. Zimbardo recruited undergraduate volunteers to participate in what became known as the Stanford prison experiment. The volunteers were randomly assigned to be either guards or prisoners. After only six days, Zimbardo ended the experiment because of what the situation did to the volunteers who participated. The students' lives were transformed by the setting in which they found themselves; sadism, brutality, lying, depression, and extreme stress became the norm.

In reflecting on the experiment decades later, Zimbardo said that people who believe they know how they would act in unfamiliar and pressure-filled situations are deluded; no one - no one - knows how he or she would respond. Zimbardo's work has never been replicated; the fact that it cannot be repeated, because of the ethical restrictions of human subjects committees, is to some extent a validation of the power of the experiment. Social psychologists do, however, continue to produce evidence that situational factors can have powerful influences over the behavior of individuals.

SHOCKING BEHAVIOR

In the early 1960s, Stanley Milgram (Yale) provided a sobering insight into human behavior with his experiments on obedience to authority. Volunteer subjects were asked by the lab-coated, clipboard-carrying researcher to help in a study that explored the role of punishment in learning. The volunteers were asked to be "teachers" and to give the "learners" in the next room a series of questions. They could not see the "learners," only hear their responses through a microphone.

When the learner gave an incorrect answer, the teacher was told to deliver a shock to the learner and to increase the voltage of the shock with each subsequent incorrect answer, using a device with markings at various levels from "slight shock" to "danger: severe shock."

The unseen learners in fact were trained actors. They were not actually receiving shocks but their cries of pain, nevertheless, convinced the teachers of the severity of the shocks. Even after teachers expressed concern about the pain they were apparently inflicting, the authority figure instructed them to continue and, indeed, to inflict even more severe shocks. Sixty-five percent of the "teachers" punished the learners to the maximum 450 volts, even as the alleged "learners" screamed or appeared to have been shocked into unconsciousness. No "teacher" stopped before reaching 300 volts. Replications of this study and others by Milgram have produced essentially similar results.

These studies - and now I am quoting Robert Mauro, associate professor of psychology at the University of Oregon - have both a surface message and a deeper message. The surface message is that "normal" people can be made to do really awful things under the "right" conditions. The people - the volunteers - in these two studies are not pathological. The deeper issues are about how these "evil situations" are constructed.

In the Milgram experiment, the "teacher" is not initially asked to shock the "learner." Instead, the experimental situation is constructed so that the teacher is allowed to sink into the role. At first, nothing happens; the learner gets everything right. The situation starts to become routine. Then, the learner makes an error and the teacher must deliver a small shock. Afterwards, the learner continues to respond correctly.

This cycle continues for quite some time, slowly sucking the teacher into the role. For the teacher to break out of this situation, he/she must be willing to say "I have done evil; but it stops here. I will do no more harm." For the teacher to go on, all he/she needs to do is continue. The teacher can keep believing any of the stories that the lab-coated experimenter provides, e.g., "the experiment requires that you go on," "it's my responsibility," etc., and can keep believing that he/she has done nothing wrong. You need only confront your guilt if you DON'T continue.

So, the question is how can you influence people to be resistant to the effects of "evil situations?" According to Zimbardo, the best way to avoid the influence of an evil situation is to avoid the evil situation. But can we do that all the time? Does life allow that?

If you can't avoid the physical situation, you may still be able to change the psychological situation that leads the research subject to inflict senseless and brutal pain, or commit some other evil act.

There is some reason to believe that one can make people somewhat less susceptible by making them aware of the possible effects of situational factors and activating conflicting norms. But there are also studies that demonstrate rather dramatic failures of these sorts of interventions. For example, in one study seminary students (good people?) about to give a sermon on the Good Samaritan were still induced by situational factors not to render aid to a "victim" while on their way to deliver the sermon!

This story is sobering. It is one reason, I am told, that few social psychologists were surprised by the Abu Ghraib prison scandal.

AND ONE MORE EXAMPLE OF THE "MILGRAM" RESULTS

In an experiment, a researcher called a hospital, posing as a physician but with a name none of the nurses had ever heard (certainly not the name of a doctor who actually worked at the hospital), and instructed the nurse on duty to give a patient a dose of a medication that was not in common use at that hospital and at twice the normal dosage. Of the 22 nurses called, 21 obeyed the doctor (or tried to; the hospital's pharmacist had been warned of the request and told not to fill the prescription).

This particular conceptual replication is a very strong rebuttal to folks who say that the Milgram results occurred simply because the setting and request were unfamiliar to the subjects. Milgram's classic article noted, interestingly, that physical proximity to the instructor made a difference in the intensity of the subject's response: the closer the command, the more difficult it was for the "teacher" to resist even a suggestion or explanation as to why he should continue.

Zimbardo and Milgram have shown us what we are capable of. Let us now consider three studies that look not at what but at why: specifically, the conundrum of nature vs. nurture.

NEURAL NET FOR MORALITY

Joshua Greene, a Princeton neuroscientist, looks at moral paradoxes. An example: We condemn someone who sees a drowning baby in a pond and won't jump in to save it because it would ruin his $400 shoes. But millions of children around the world are dying and a little money for medicine or food would save their lives. Yet we don't consider ourselves monsters for eating lunch here rather than giving the money to Oxfam. Why?

Greene started his academic career as a philosopher and decided that in the end Hume was right - people act not because they rationally determine something to be right but because it makes them feel good.

He then moved to the sciences and, to test his hypothesis, he now hooks up volunteers to an MRI, presents them with moral conundrums, and looks at what parts of their brains light up. His research indicates that in facing such paradoxes, we do not use our "powers of reason alone," and that our emotions play a powerful role, triggering instinctive responses that are products of evolution: our inherited nature.

Subjects who are presented with impersonal (detached, distant - millions of anonymous children far, far away) moral decisions and "non-moral questions" (Is three greater than four?) showed electrochemical activity in a part of the prefrontal cortex, the center for logical thinking.

Subjects presented with personal moral questions (regarding, for instance, a specific baby that one sees drowning) showed activity in the three regions of the brain that are related to emotional and empathetic responses. Greene thinks these three regions may be a "neural net" for morality.

Greene believes many of our great conflicts may be rooted in such neuronal differences. And he also argues that more than genetic and evolutionary factors play a role in this patterning in the brain. Says Greene: "Genes, culture, and personal experience have wired people's moral circuitry in different patterns."

NATURE VS. NURTURE

Steve Pinker (a Harvard professor of psychology) believes that debates over which is more important, culture or genes, nature or nurture, evoke more rancor than just about any other issue in the world of ideas.

The nineteenth century viewed biology as destiny; during much of the twentieth century, the opposite, that we are born as "blank slates," was the common descriptor. But from the end of that century through the beginning of this one, cognitive science, evolutionary psychology, developmental psychology, behavioral genetics, and neuroscience have shown that the innate organization of the brain cannot be ignored. These disciplines have helped reframe our very conception of nature and nurture.

Now we have moved to what Pinker calls "holistic interactionism," based on the notion that "the answer is all of the above": nature and nurture are not mutually exclusive, genes cannot cause behavior directly, and the direction of causation can go both ways. It has a veneer of moderation.

The problem is that it is neither as reasonable nor as obvious as it appears. For one thing, lots of people are still blank slaters, notably the late Stephen Jay Gould, and so are postmodernists who consider behavior to be a "social construction." More important, this wishy-washy approach gets us off the hook of trying to untangle a very complicated interaction; Pinker argues forcefully that the "all of the above" explanation is essentially too lazy.

A LITTLE INTUITION

Jonathan Haidt (Virginia) and Craig Joseph (Chicago) come at the same issues in a slightly different way. They posit two main schools in the social sciences: (1) the empiricist, who argues that moral knowledge and moral action are learned in childhood, and that nothing moral is built into the human mind, though there may be some innate learning mechanisms that enable the acquisition of later knowledge; and (2) the nativist, who argues that morality is built into the human mind by evolution.

Haidt presents a modified nativist view: human beings have intuitive ethics; we feel flashes of approval or disapproval toward certain patterns of events involving other human beings, especially with regard to four universal, or nearly universal, areas: suffering, hierarchy, reciprocity, and purity.

He believes these four areas and their intuitions undergird cultural moral systems. He argues that intuitions are the factory-installed equipment that evolution built into us to respond to recurrent social problems. Intuitions are the judgments, solutions, and ideas that pop into consciousness without our being aware of the mental processes that led to them. "Moral" intuition is the subclass that involves feelings of approval or disapproval as we see or hear about something someone else did, or as we consider choices for ourselves.

We have a well-developed ability to process information slowly, deliberately, and fully within conscious awareness. But recent research in social psychology suggests that responses to moral dilemmas mostly emerge from the intuitive system - quick gut feelings that happen within a second or two when a particular situation is presented. People then search for supporting arguments and justifications using their reasoning system, which is the rational tail that got wagged by the emotional dog. Someone recently remarked to me "What I think before I act is my rationale; what I think after I act is my rationalization."

"To be pulled in many opposite ways at once results negatively, but it is not the same thing as to feel no impulse at all. An ass between two bales of hay is said to have died of starvation, but not from indifference."

Learned Hand, P. 10, Class-Day Oration (1893).

ETHICS AND BUSINESS

And for the sixth finding, how about ethics within the business environment? More broadly, how are our ethical expectations and actions influenced by the organizations and institutions of which we are a part? How much should we fear the all-pervasive organizational corruption of which John Grisham wrote a fictional account in The Firm?

As an aside, in a 1995 New York Times/CBS News poll (that would be before the scandals of Enron and Martha Stewart), 55% of the American public believed that the vast majority of corporate executives were dishonest. And in 1996, 41% of all television murders were committed by businessmen.

So, a quick take from research by Myrna Wulfson (St. John's University, Review of Business, Fall 1998): more than 90% of Fortune 500 companies have ethical codes of conduct detailing what the companies expect from their employees in terms of responsibilities and behavior. Do these codes result in more ethical behavior by employees?

No. Why not? Because as the codes and the rules proliferate, the good folks leave and the bad ones like the challenge of "gaming the system." What results in more ethical behavior by employees? The behavior of top management. People model the behavior of their leaders.

How about goal setting in a corporation? Do clear directions to employees (not about ethical behavior, but about what they are expected to accomplish for the benefit of the company and themselves) have any effect on ethical behavior? Goal setting is part of our mantra of what constitutes good executive practice.

According to Maurice Schweitzer of the Wharton School at the University of Pennsylvania, yes, they do. Goal setting is a motivator for unethical behavior, particularly when people fall just short of reaching their goals.

So, given all this, some of you may be asking yourselves: what does all this have to do with Machiavelli? (This is a speech about Machiavelli.) Quite a bit, as it turns out, in the arena of leadership and ethics. And, by the way, all of you are leaders, whether you are senior partners, associates, clients, sinners, or concerned bystanders.

We have just talked about what unethical choices we might make and whether they really are choices. Machiavelli's teachings take us to a third step: whether we can ever be justified in consciously choosing a course which we and others have been taught to believe is wrong on some level, but right on another.

Sir Isaiah Berlin, in his pathbreaking thesis on Machiavelli, argues that Machiavelli developed and embodied a radical break from the pre-existing western moral tradition. Contrary to other religious and ethical schools of thought, his concerns demonstrate that all worthy values are not cohesive and compatible. The good, the true, and the beautiful are not of one piece.

In this world, worthy values clash, just as they do in the brains being studied today by neuroscientists. If one believes in political leadership as a means to civic greatness, then doing what others may consider evil is not evil at all. In fact, one might even read Machiavelli to be saying that in his moral universe, it is evil not to do evil, if the price is loss of power. In Machiavelli's world there are two co-existing and incompatible codes of ethics: those which leaders must follow and those which may apply more generally to other citizens.

In a similar vein of fatalism, resignation, or blunt realism, Loyal Rue, a senior fellow at Harvard Divinity School and author of By the Grace of Guile: The Role of Deception in Natural History and Human Affairs, says this about political deceit:

"Politicians have interests. All human beings have interests. Whenever people have interests, it gives them something to protect. So people, including politicians, lie to protect their interests and those of the people they care about."

But now consider a second conclusion: Machiavelli's theories simply won't work, at least not with the completeness that he once advocated, given the size and technological advances of our modern nation-state. A prince might rule a small city-state with a trusted cadre of henchmen; but acquiring and exercising power in the age of modern media makes the possibility of extended deception very unlikely, if not impossible.

If you follow Machiavelli literally, you have, at a minimum, two other problems. You must always sleep with one eye open: if people know you will hurt them, they will try to hurt you first! (Remember the insights from research today in corporate settings: the followers will model the leader's behavior.) And, if you put your faith, as did Machiavelli, in the preemptive strike, you must always observe, calculate, and act with complete precision. There is no latitude for error.

There are deeper problems, of course, with Machiavelli's guide for leaders. Those problems slide us down the famous "slippery slope" very quickly. How do you know when an end is grand enough, or attainable enough to justify deceit and brutality? How does this theory play into every human's infinitely easy capacity to rationalize pleasurable or productive actions? Means and ends are not separate: means (as a wise professor once told me) usually enter into and condition the ends. And if you are concerned about your salvation in the hereafter, which Machiavelli was not, there is something far more eternal about conventional ethics.

On two points, Machiavelli makes enormous sense, however. The first is that the leader's ability to calculate the consequences of his or her actions must itself be assessed as part of the morality of those actions. Max Weber, the early 20th Century German sociologist of the Weimar era, explored this issue brilliantly in his profound essay, "Politics as a Vocation." In this century, millions of lives have been lost because leaders claiming the virtue of "pure intentions" have failed to estimate the human consequences of what they set in motion. Machiavelli's contribution to this moral calculus is something we should honor. Hell is paved with good intentions (and incompetent execution).

And second, never forget the "followership" part of the equation. Machiavelli embraced cruel tactics because he saw his own fellow citizens as ethically fickle, imperfect, and often lacking in virtue. Leadership is a two-way street. We should be particularly wary of the proposition that we, the led, are morally superior to "they," our leaders. The "problem of Machiavelli" is not just what he was; it is also what we are.

There is a special point to the "we" of this. The founders of the U.S. Constitution saw corruption of the civic virtue as likely; feared tyranny profoundly; and worried how long their construction of government would last.

They did not choose to put their faith in princes, and accepted human vice as a given. But they did something very different to counter it. They divided power among three competing institutions of government, separated power, held out for no heroes, and counted on checks and balances to deter tyrants. So far, 215 years later, it seems to be working.

And now to my final difficult question: Can you be a leader without being unethical and immoral in your tactics?

Must we pay Machiavelli's high price for his brand of leadership? We may choose to do something less than ethical, but this is not Machiavelli's world, and it will catch up with us! Do we live in a world of political shades of gray? Of course: they permeate our personal lives and the issues of national and international interest. That is the real issue. The challenge is not only the extreme ethics of murder and deceit; it is more in the dangerous immorality of civic disengagement and indifference. It can be the smug satisfaction that our values and virtues are superior to those of others. Reds and Blues find common discourse on "values" to be elusive. It also lies in not trying to have ethical standards, and in giving up. Not caring is worse than being wrong. The struggle must be engaged. That, truly, is the Machiavellian challenge we face.

Notwithstanding, or in view of, all these findings from 20th and 21st century social science, as well as Renaissance Italy, I would hope that the thoughts of most of you who are still listening return to the most enduring fundamental questions of ethics, free will versus determinism (unseen hands, fate, genetics, or inevitability): How much do human behavior and our judgments about it rest on choice? If it all rests on determinism, then moral judgments may seem superfluous. If it rests preponderantly on choices, then traditional notions of personal responsibility and theological conceptions such as confession, repentance, and forgiveness continue to retain their vitality.

Those of us who are intellectually facile have to be especially rigorous in detecting our own ability to deceive ourselves - to rationalize - to remember selectively. Remember the words of an Oregon Supreme Court decision which invalidated a rental payback bonding plan: "It is a scheme that would only fool a lawyer." Martin v. Oregon Building Authority (1977).

It may be, as some would doubtless argue, that the social science learning I have cited today needs more testing and replication, and that it is dangerous to generalize as broadly as I suggest or imply. But let us take our lesson from the namesake of today's kind award, Judge Learned Hand. He once remarked: "...life is made up of a series of judgments on insufficient data, and if we waited to run down all our doubts, it would flow past us." P. 137, On Receiving an Honorary Degree (1939).

What "take home" lessons do flow from this modern learning applied to more enduring questions?

1. In talking about ethics, we are not just talking about losing our bar tickets, or even our professional reputations or self-esteem - we are talking about losing our souls. Staying out of jail is not enough; and rigid rule-based behaviors can still create serious injury to others.

2. We will not presently resolve the nature/nurture argument, but neither one frees us from responsibility for individual ethical choice. Even the most deterministic neuroscientist won't let us off the hook for making ethical decisions based on principle and at least some range of free action.

3. If we wonder about the strength of temptation, we should. Our social environment can make us much more susceptible to temptation - and it can be so powerful that we no longer recognize it as temptation. It may be an enduring ethical caution to avoid those environments of temptation, even if we believe ourselves incapable of succumbing to them.

4. Rationalization is an extraordinarily powerful denial mechanism to which we succumb - and we use it again and again to deny the problem or the conduct. We should never assume that we know how we would react under pressure.

5. We are so often blind to our dark side; most evil sneaks up on us because we consider ourselves good...too infallibly good. There is obvious wisdom in the ancient Greek concept of hubris, reflected as well in both the rabbinical tradition and the gospels: There are many reasons why pride "goeth before destruction" -- among the most important of these reasons is that pride is the enemy of insight.

6. A major component of ethical judgment is to recognize the flashing yellow lights that say "don't enter the valley of the shadow." The admonition to avoid the "occasions of sin" may be more important than we have realized. We can easily go too far: authority is seductive; we can reach a personal tipping point after which our hands are inescapably dirty. Some environments blind us to the human consequences of our actions, so we MUST be attuned to the consequences of our behavior and our own weaknesses, our own sins, whatever they may be. This ethical life is hard work: "knowing right from wrong" requires diligence, self-scrutiny, and looking into a very well-lit and refractive mirror.

Judge Learned Hand would have understood this well. As Judge Hand said in his work, A Fanfare for Prometheus (1955, p. 279):

"We may win when we lose, if we have done what we can; for by so doing we have made real at least some part of that finished product in whose fabrication we are most concerned: ourselves."

Learned Hand was interested in process and moderation, one of the ways perhaps to avoid self-deception and rationalization.

Let me add some quotes from Learned Hand to the one mentioned above:

Our dangers, as it seems to me, are not from the outrageous but from the conforming; not from those who rarely and under the lurid glare of obloquy upset our moral complaisance, or shock us with unaccustomed conduct, but from those, the mass of us, who take their virtues and tastes, like their shirts and their furniture, from the limited patterns which the market offers. P. 34, The Preservation of Personality (1927).

... nothing is more commendable, and more fair, than that a man should lay aside all else, and seek truth; not to preach what he might find; and surely not to try to make his views prevail; but, like Lessing, to find his satisfaction in the search itself. P. 138, On Receiving an Honorary Degree (1939).

The condition of our survival in any but the meagerest existence is our willingness to accommodate ourselves to the conflicting interests of others, to learn to live in a social world. P. 87, To Yale Law Graduates (1939).
