Monday, June 06, 2016

What Morality Needs

There are three things that we must have for a moral system to emerge.

(1) Intentional agents - agents that act on beliefs to fulfill desires.

(2) Some desires are malleable.

(3) One set of tools for molding desires includes rewards and punishments.

It is another feature of our moral system that praise serves as a type of reward, and condemnation serves as a kind of punishment.

In saying this, I am denying that morality requires any type of fellow feeling, altruism, or community spirit. These are helpful, but they are not necessary.

Peter Railton provides an example of the type of belief I am rejecting.

Both social and biological evolution involve selection mechanisms that favor behaviors satisfying criteria of relative optimality that are collective (as in prisoner's dilemma cases) or genotypic (which may also be collective, as in kin selection) as well as individual or phenotypic. Were this not so, it is hardly possible that moral norms could ever have emerged or come to have the hold upon us they do. ("Moral Realism", The Philosophical Review, Vol. 95, No. 2 (Apr., 1986), pp. 163-207.)
However, it is easy to describe a foundation of morality that does not involve any type of inherent kindness or altruism.

In my posts, I have illustrated some of my points by imagining a world containing a single person by the name of Alph who has one desire - a desire to gather stones. This is the only desire he has. He has no sense of altruism or concern for anybody or anything else. He does not even care about himself except insofar as it is necessary in order to make true the proposition, "I am gathering stones".

In this example, imagine that there is a limited number of stones to be gathered. After gathering all of the stones, Alph then has to scatter them again so that he can once again make true the proposition, "I am gathering stones."

Into this world, we introduce another person by the name of Betty.

Next, we give Alph the power to select Betty's desire - an ability to influence what Betty wants to do. Specifically, we give Alph the power to give Betty one of two pills. The blue pill will give Betty a desire to gather stones - just like Alph's. The red pill will give Betty a desire to scatter stones.

Using only Alph's desire to gather stones, what should he do?

Alph has no reason to give Betty the desire to gather stones. If he did that, then Betty would be in competition with him for the stones that he is gathering.

However, if Alph gives Betty the desire to scatter stones, then, while Alph is gathering stones, Betty will be scattering them again. Assuming that Betty works as fast as Alph or faster, Alph could potentially spend all of his time gathering stones. That is to say, he could keep the proposition, "I am gathering stones" true indefinitely.
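This arithmetic can be sketched in a few lines of Python. This is a toy model of my own devising, not anything from the example itself; the rates, the starting stone count, and all of the names are hypothetical.

```python
# Toy model: "loose" stones sit in the field for Alph to gather;
# stones Betty scatters return to the field. All numbers are hypothetical.

def simulate(gather_rate, scatter_rate, loose, steps):
    """Return how many stones Alph manages to gather at each step."""
    pile = 0
    gathered_per_step = []
    for _ in range(steps):
        g = min(gather_rate, loose)    # Alph gathers what remains loose
        loose -= g
        pile += g
        s = min(scatter_rate, pile)    # Betty scatters from the pile
        pile -= s
        loose += s
        gathered_per_step.append(g)
    return gathered_per_step

# If Betty scatters at least as fast as Alph gathers, Alph never runs out
# of loose stones and works at full rate indefinitely:
print(all(g == 3 for g in simulate(gather_rate=3, scatter_rate=3, loose=10, steps=50)))  # True
```

If Betty scatters more slowly than Alph gathers, the same function shows Alph eventually idling for want of loose stones, which is just the frustration the example describes.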

Now, instead of giving Alph the power to select Betty's desire with a pill, let us say that he can do so through a system of praise and condemnation. He takes Betty out into the field, has her take a stone from the pile that Alph has gathered, and scatter it. With each successful action, he praises Betty. If Betty picks up a stone and adds it to the pile, he condemns her. In this way, let us assume, Betty acquires a preference for scattering stones.
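The molding process just described can be modeled as a simple reward-learning loop. Again, this is only a sketch under assumptions of my own - the learning rate, the number of episodes, and the idea of a numeric "desire strength" are all hypothetical illustrations, not claims about how human desires are actually encoded.

```python
import random

# Toy model: Betty keeps a malleable "desire strength" for each act,
# nudged toward the feedback (praise = +1, condemnation = -1) she receives.

def train_betty(episodes=200, lr=0.2, seed=0):
    rng = random.Random(seed)
    desire = {"scatter": 0.0, "gather": 0.0}        # Betty's malleable desires
    for _ in range(episodes):
        act = rng.choice(["scatter", "gather"])     # Betty tries an action
        reward = 1.0 if act == "scatter" else -1.0  # Alph praises scattering, condemns gathering
        desire[act] += lr * (reward - desire[act])  # desire drifts toward the feedback
    return desire

betty = train_betty()
print(betty["scatter"] > betty["gather"])  # True: Betty comes to prefer scattering
```

The point of the sketch is only that repeated praise and condemnation, applied by an agent pursuing its own ends, are sufficient to reshape what another agent wants.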

Here, Betty acquires a desire that is compatible with Alph's, allowing the two of them to live in harmony. Each, while acting on their own desires, performs actions that tend to fulfill the desires of others. However, neither of our agents are altruistic in any way. Neither agent cares a bit about the other person except insofar as the other is useful in helping the agent fulfill his or her desire.

We can continue to add complexity to this system. Let's make it a large community filled with thousands of people. Further make it the case that different people are more or less efficient at gathering or scattering stones.

We can imagine a society so complex that it would be nearly impossible to calculate whether stones are being gathered faster than they are being scattered, or more slowly. However, the people in our world need not make that calculation. They only need to observe whether those who gather stones can work continually while those who scatter stones run out of stones to scatter, or whether it is those who scatter stones who can work continuously.

Now, suppose a new person enters this world. Whichever the fact of the matter is, the group that is able to work continuously has no reason to do anything other than continue working as the new person arrives. However, those who are being frustrated have a reason to teach the new person to acquire the interest that would bring the two groups back into balance. The community brings about this coordination, even though its members still lack any kind of community spirit, kindness, altruism, or fellow feeling.
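The balancing mechanism can also be sketched. In this hypothetical setup (my own illustration; the capacities and function name are invented), each newcomer is taught the interest of whichever activity has too little capacity, because that is the group whose work is being frustrated.

```python
# Toy model: newcomers are molded toward whichever activity is under-staffed,
# since the frustrated group has a reason to teach that interest.

def teach_newcomers(gather_capacity, scatter_capacity, newcomers):
    """Assign each newcomer's molded desire to the under-staffed activity."""
    for _ in range(newcomers):
        if gather_capacity < scatter_capacity:
            gather_capacity += 1    # frustrated scatterers mold a new gatherer
        else:
            scatter_capacity += 1   # frustrated gatherers mold a new scatterer
    return gather_capacity, scatter_capacity

print(teach_newcomers(10, 4, 8))  # (11, 11): capacities end balanced
```

Notice that nothing in the loop requires anyone to compute overall rates; each newcomer's training responds only to the locally observed frustration, yet the capacities converge.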

Each person in our community has a reason to cause each other person to have an aversion to causing physical harm - where "harm" is anything that prevents a person from gathering or scattering stones. Similarly, each has a reason to cause others to help them if they should fall into dire straits. None of this needs to be innate; we merely need a population capable of learning or acquiring new interests in ways that others can mold.

If those new desires are learned by praising and rewarding those who have them, and condemning and punishing those who lack them, then you have a community with something so much like a moral system that it would take a philosopher to find the difference.

One could argue that this qualifies as "selection mechanisms that favor behaviors satisfying criteria of relative optimality." However, it actually favors nothing. Agents can learn harmful and anti-social behaviors and attitudes as easily as they learn to be helpful and to refrain from causing harm. The difference is that (barring conflicting desires) nobody has a reason to cause others to acquire interests that would thwart their own desires. They only have reason to cause others to acquire dispositions that would tend to fulfill their desires.

Here, we have the start of a moral system. And all we needed were intentional agents with malleable desires that could be molded using reward and punishment. All three of these things exist. Consequently, moral systems have emerged.
