Comments on Atheist Ethicist: "Obligations towards Children: Happiness and Desire Fulfillment" by Alonzo Fyfe (http://www.blogger.com/profile/05687777216426347054)<br /><br /><b>Anonymous (2007-01-10 14:17):</b><br /><br />Aerik, it's not equivocation, because I used the terms "meaning" and "value" interchangeably, and in fact this interchangeability has no real bearing on the argument. Replace one with the other, and perhaps ask what is meant before you accuse.<br /><br />You miss the point of my blank-slate argument. It is true that the newborn's brain has the potential built in - but then so does a newly fertilized egg. If you call that a human, then we won't be able to agree until we have that argument, and I don't want to have it. At what point does one become human? When we feel bodily pain? When we form abstractions? Who cares? The premise here, right or wrong, was that this baby is a blank slate in terms of desires and coherent thoughts. We could say that the doctors modify the desires in the test tube, if you'd like. I even considered the alternative in which the mind was not treated as a blank slate!<br /><br />I was expecting a response along the lines of our responsibility to children, and how we should help them fulfill their inevitable desire of not wanting change, but I suppose I've made that a non-point by making the hypothetical scenario explicit.<br /><br />Alonzo,<br /><i>Indeed, if I were to discover that I was in such a machine, I would then treat my fellow humans the same way that I would treat the characters in a computer game.</i><br /><br />The experience machine of your example was sophisticated enough to be convincing. Here we can consider solipsism and Turing's test.
If agents conjured by this machine can convince one of their humanity, and of their desires, then they are no different from the people around me. Knowing that they are simulated doesn't change this.<br /><br />My main point here was that you might be conflating two things: mistakenly believing that a desire is fulfilled, and fulfilling a 'simulated' desire. A person might desire to help a beggar, and thus give the beggar change. But this does not help the beggar; it makes things worse. We mistakenly believe that we did good, and knowing better we would not have done so. This is how you should approach religion. The other case is that of the simulated beggar whom you help by feeding, warming, and so on. You argue that because the substratum of that beggar - who is nearly indistinguishable from a human over a lifetime of interaction - is silicon and logic gates, doing good to them has no value. It does. There is no difference between him and us, because humans exist at the level of patterns and concepts and relationships.<br /><br /><b>Alonzo Fyfe (2007-01-10 05:35):</b>
<b>M</b><br /><br />The problem being presented here is one of putting a person in a state where his or her desires cannot be fulfilled - regardless of the origin of those desires.<br /><br />Even if the person is provided with the happiness of (falsely) believing that those desires are being fulfilled.<br /><br />I have explained how some people would, in fact, choose the experience machine - if they desired only happiness. So, the fact that you would find value in the experience machine raises no objection. The explanation also handles the cases of those (like me) who would consider such a life to be a waste.<br /><br />Indeed, if I were to discover that I was in such a machine, I would then treat my fellow humans the same way that I would treat the characters in a computer game. In fact, life would be nothing but a computer game. I may <i>pretend</i> that it is important that certain characters live or die - but, in the end, it does not matter. I get bored of computer games pretty quickly. I tend to think that I should be spending my time with real people rather than with fictional people.<br /><br />However, different people have different desires, and make different choices based on those desires.<br /><br />Whether changing desires is a moral or an immoral act depends on what they are changed to. Changing a person's desires to make them crave the torture and suffering of others is an immoral act. Changing the desires of others to make them want to help others is a moral act.<br /><br />Indeed, the whole point of the moral education of children is to promote desires that tend to fulfill other desires, and to inhibit desires that tend to thwart the desires of others.<br /><br />Even though a child's mind is not a blank slate, it is not completely write-protected either.
It is malleable within limits.<br /><br />Note: a couple of hours after writing this article, I wrote an answer to your "unrepentant guiltless righteous murderer" issue from the same post in which you raised it - in <a href="http://atheistethicist.blogspot.com/2007/01/answering-m-on-subjectivism.html">"Answering M On Subjectivism"</a>.<br /><br /><b>Aerik (2007-01-10 01:49):</b><br /><br /><i>The central implication here is that a 'false' world is devoid of value. This I don't accept. If we discovered that this world of ours was a simulation, would our lives lose meaning?</i><br /><br />Here you commit a fallacy of equivocation: meaning and value are not the same thing. They cannot be treated as equivalent, especially here, first because ascribing value to a thing entails placing it on a finite, often discontinuous scale - most of the time it is perfectly reasonable to compare values analogically in quantitative terms: "Well, if I had to put it on a scale of 1 to 10..." This is not so with meaning.<br /><br /><i>For the little girl, changing her desires when she has none is not an immoral act.</i><br /><br />This is not true. One's desires can be self-chosen based on whatever information is available to them. Even so, your particular premise - that said girl is a blank slate - is itself false, so your entire argument is invalid. The elasticity of the human brain (the sole cause and manifestation of the human mind) is complicated and far-reaching, but by no means is the human brain a blank slate. The way we learn language as infants is entirely based on figuring out whether expressions are head-first or head-last, and everything else falls into an arbitrary (even if systematic) place if and only if this one determination has occurred.
Even so, Japanese is the only language known to be head-last, which shows a profound un-blank-slate-edness, wouldn't you think? And there are many more examples of brain plasticity having limits and 'pre-programmed' settings that make any argument concerning "blank slate" people completely irrelevant.<br /><br />And here we have a conundrum. If a person's brain were a complete blank, a 'blank slate,' they would in fact have no way to grow. At all. How do you give freedom to nothingness, and how does it make choices or even absorb information? Hooey. What defines personhood, sapience, is a certain level of awareness of one's surroundings and one's self, in a cogent manner, at some level. A blank slate is in fact not a person. Hell, you can't even say a blank slate has a brain, really.<br /><br />So you must consider, M, that when you refer to a child, or to somebody with a child's mind, as a "blank slate," you are in fact dehumanizing them.<br /><br /><b>Anonymous (2007-01-09 23:19):</b><br /><br />I think we disagree. The moral problem being presented here is one of changing a person's desires. If not that, then of fulfilling those desires in a false world.<br /><br />We should not change the desires of a person who does not want them changed. However, where a person is a blank slate, there is no problem. If there were, then it would be immoral to create a person (say, an AI) with certain unnatural desires. And it isn't. I reject arguments from nature, fate, and the like as nonsense. Here the desire of the child is made to be happiness, and it appears that other relevant desires are removed. I see no moral problem.<br /><br />As for the other desires - helping people, and so on. The central implication here is that a 'false' world is devoid of value. This I don't accept.
If we discovered that this world of ours was a simulation, would our lives lose meaning? Mine wouldn't. Did anything change? It didn't. We exist at the level of relationships, concepts, and patterns, and these would be the same no matter what the substratum - simulated, physical, who cares?<br /><br />You might assign greater value to the world that is responsible for the simulation of the one you exist in, and you may have very valid desires that would be thwarted on that basis. I think that this would be a rational error on your part.<br /><br />For the little girl, changing her desires when she has none is not an immoral act. But let's say she desires to do good, to have a happy family, or to learn the behaviour of her universe, as through physics. If the people there are indistinguishable from the people here, it is incorrect to say that doing good to them, or having a family with them, has less value there than here. As for gaining knowledge, my initial judgment would be that it is immoral to keep a being in the dark about a 'greater' world once they have solved the intellectual problems of theirs. But the 'desire for understanding' aspect is not as relevant to this discussion, and unfortunately few people place great value upon it to begin with.<br /><br />I'd still like you to address the problem of the unrepentant guiltless righteous murderer, and how one could call such a person "evil," where "evil" is something greater than 'thwarted a desire to live,' or 'will be punished by society,' or 'I highly disapprove of such acts.' I don't think that it can be greater, and so the murderer's morality is no more or less valid than ours, just in a minority. My previous arguments might have better detail.