Page Summary
pinkdormouse.livejournal.com - (no subject)
apel.livejournal.com - (no subject)
yonmei.livejournal.com - (no subject)
communicator.livejournal.com - (no subject)
archbishopm.livejournal.com - (no subject)
kittydragon.livejournal.com - (no subject)
cdybedahl.livejournal.com - (no subject)
cdybedahl.livejournal.com - (no subject)
cdybedahl.livejournal.com - (no subject)
cdybedahl.livejournal.com - (no subject)
cdybedahl.livejournal.com - (no subject)
cdybedahl.livejournal.com - (no subject)
cdybedahl.livejournal.com - (no subject)
communicator.livejournal.com - (no subject)
yonmei.livejournal.com - (no subject)
yonmei.livejournal.com - part 2
cdybedahl.livejournal.com - Re: part 2
cdybedahl.livejournal.com - (no subject)
communicator.livejournal.com - (no subject)
cdybedahl.livejournal.com - (no subject)
(no subject)
Date: 2003-07-13 11:35 pm (UTC)

Gina
(no subject)
Date: 2003-07-14 02:23 am (UTC)

For me it's enough that the intent of the originators is to limit somebody's choice. On the other hand, just because something is morally suspect doesn't mean that it's imperative that it be changed. The results can still be good, whether that was the intention or not.
(no subject)
Date: 2003-07-14 05:07 am (UTC)

Suppose that, rather than changing Tara's memory, she had changed the Buffyverse reality so that the quarrel had never happened at all. The effects would be the same, in that Tara would have no memory of fighting with Willow. But there would be no external evidence to make her aware of the tampering. Nobody would remember the quarrel (possibly not even Willow, if the spell was powerful enough), and there would be no physical evidence. She would never find out. She would never feel violated. Given those changed circumstances, is the tampering still wrong? If so, why?
It would be wrong on a different level. It wouldn't violate Tara's memory - the quarrel never happened. It would mean that Willow didn't have to take responsibility for her own actions. If she does something and doesn't like the results, she just casts the "make it didn't happen" spell, and suddenly, it didn't. This is Star Trek style time travel, effectively - "we're just going back to change this one thing, it won't hurt a bit" - and the simplicity of this is, well, just wrong.
I haven't read The Domination of the Draka. But from your description, I would say yes, the Final Society is wrong: the biotechnological manipulation of Homo Servus is designed to take choice away from them - to make it so that they can never choose to rebel.
And if one were to remove the slave mentality from a Servus, would not that be exactly the same sort of crime that Willow committed, the external manipulation of the apparatus of choice?
No. If one were to remove the slave mentality from a Servus, that would be the exact reversal of what Willow did. She removed the possibility of choice from Tara by manipulating her mind, just as those who created Homo Servus did. To return the freedom of choice is the exact reverse: it is no more a crime than Dawn telling Tara that Tara and Willow had had a quarrel the previous evening was a "crime", though that too was "external manipulation of the apparatus of choice", and Tara would have been much happier if Dawn hadn't told her.
Furthermore, given that the Final Society actually is a Utopia -- everyone in it is happy, they are designed to be! -- wouldn't changing it be a crime in itself?
No. See above. If Dawn hadn't told Tara that the quarrel had happened, Tara would have been happy, Willow would have been happy, and Dawn would have been happy. Everything would have worked the way Willow designed it to be. Do you really feel that Dawn committed a crime when she broke up all that happiness? I don't think so - because happiness is not an end in itself. It's nice, but it's icing on the cake.
(no subject)
Date: 2003-07-14 08:32 am (UTC)

(no subject)
Date: 2003-07-14 11:09 am (UTC)

(no subject)
Date: 2003-07-14 05:58 pm (UTC)

(no subject)
Date: 2003-07-15 05:26 am (UTC)

(no subject)
Date: 2003-07-15 05:27 am (UTC)

(no subject)
Date: 2003-07-15 05:34 am (UTC)

Actually, that touches on the transhumanist answer to the Pauli Paradox. Hm. Must think more about that.
(no subject)
Date: 2003-07-15 05:35 am (UTC)

(no subject)
Date: 2003-07-15 05:41 am (UTC)

Well, yes. The results are what we can experience, so they're pretty much what matters. The intent of others never gets any better than hearsay, really.
The results can still be good, whether that was the intention or not.
If I understand you correctly, you mean that the intent is a contributing but not solely determining factor in deciding if an action is morally right or wrong?
If so, there could be borderline cases where the exact same action could be either right or wrong depending on the intent of the actor?
(no subject)
Date: 2003-07-15 06:10 am (UTC)

This is Star Trek style time travel, effectively - "we're just going back to change this one thing, it won't hurt a bit" - and the simplicity of this is, well, just wrong.

Sorry, "just wrong" isn't good enough. It's a circular argument: "we think this is wrong because we think this is wrong". You have a lot of company there, though. A large majority of the moral philosophy I read at University boiled down to "contradicts our moral intuition", which is academese for "it's just wrong".
To return the freedom of choice is the exact reverse: it is no more a crime than Dawn telling Tara that Tara and Willow had had a quarrel the previous evening was a "crime", though that too was "external manipulation of the apparatus of choice", and Tara would have been much happier if Dawn hadn't told her.
Surely you can't mean that any change in the direction of more choice is good? Because that would mean, for example, that forcibly changing a hetero- or homosexual person to bisexual would be good. Which at least I wouldn't agree with.
Giving Tara her lost memory back after she's discovered that it's missing is one thing, because she knows that she has lost something. She experiences loss. But for a Homo Servus, where the loss is several centuries and many generations in the past, the situation is different. Changing the personality there isn't liberation, it's murder followed by birth of a new and very different individual. And to pull that argument a little further, changing it so that all future children to Homo Servus will be baseline Homo Sapiens, with their larger set of available choices, would be genocide of the Homo Servus subspecies. Can that really be right?
(no subject)
Date: 2003-07-15 06:22 am (UTC)

(no subject)
Date: 2003-07-15 07:46 am (UTC)

Good q. I suppose for me it is interacting with the otherness of the universe. I guess that includes sticks and stones (and stars etc.), which are completely other and cannot be enslaved to my own projects, and also interacting with other human beings who have projects of their own, which they pursue just as vigorously as I do mine.
That's the paradox. You want things to go your own way (well, I do anyway), but a system which was set up so that things always went your own way would render the triumph meaningless. It would also render love and co-operation and transcending the ego meaningless.
I suppose I'm Hegelian in as much as I think that the tension of irreconcilable opposites is what creates the universe.
(no subject)
Date: 2003-07-15 09:16 am (UTC)

Er... are we thinking of the same Brave New World? In the one I read, John Savage is introduced as the man from outside the system, but there are two men within the system who find the world intolerable as is. John doesn't prove that the system doesn't work: he just proves that someone who was not brought up inside the system - tailored to fit with electroshock and hypnopaedia - will never fit into it. Bernard Marx and Helmholtz Watson, however, were brought up inside the system, but are still uncomfortable within it. They are the real proof that the system does not work - it is intolerable except to those who have been successfully conditioned to tolerate it.
We, also being outside that system, also tend to see it as wrong. But is that because it is somehow inherently wrong or just because it's very unlike our own system? What is the moral difference between bringing up children with a BNW reward/punishment system and the way we bring up our children with reward (like cookies) and punishment (like being sent to bed early) systems?
The distinction is in what's left open to choice. I take the moral position that it's wrong to restrict people from making choices: granted that there are some choices we can't allow children to make, because they don't yet have a proper understanding of the consequences (you can't let children make the free choice to go play on the freeway...) but we can choose to bring children up either with free and questioning minds or with closed and locked minds. Closed and locked minds are much more comfortable for parents to deal with: the problem with bringing children up to think about everything and make their own moral choices is that sooner or later, your children will make different moral choices from you.
Sorry, "just wrong" isn't good enough. It's a circular argument: "we think this is wrong because we think this is wrong". You have a lot of company there, though. A large majority of the moral philosophy I read at University boiled down to "contradicts our moral intuition", which is academese for "it's just wrong".
It wasn't actually intended as a moral judgement, more a condemnation of the simplicity that says "if I cast a spell and make it didn't happen, it doesn't matter that I did it" but if you prefer, I'll expand it. Let's suppose that you have a favourite treasured vase, and X breaks it. X has three choices: (1) to admit zie broke your vase, and resolve that in future zie will take greater care with fragile things. (2) to cast a spell making you think you hated the vase, so that you won't mind that zie broke it. (3) to turn the universe back in time by fifteen minutes so that zie never broke the vase at all.
Option (1) is the taking-responsibility option. X has learned from what happened and will break fewer vases in future. Option (2) is the diversion option: X did break the vase, but has means to make sure that zie doesn't get in trouble for it. Option (3) is the make-it-didn't-happen option: X chooses to decline to learn from zie's experience, and will continue to smash vases and keep making-it-didn't-happen.
[continued on next rock]
part 2
Date: 2003-07-15 09:16 am (UTC)

Now that's an astonishing manipulation of what I actually said, and a still more astonishing refusal to meet with what I did say. You're using the classic poor-debater strategy of setting up a straw man, claiming that the straw man represents my opinion, and knocking it down to prove that my opinion is wrong. If you wish further lessons in how to do this, may I point you at this (http://facts4god.faithweb.com/thelist.html) website?
Meanwhile, may I restate what I actually said, which is: "To return the freedom of choice is the exact reverse: it is no more a crime than Dawn telling Tara that Tara and Willow had had a quarrel the previous evening was a "crime", though that too was "external manipulation of the apparatus of choice", and Tara would have been much happier if Dawn hadn't told her." Now if you're done with the cheap rhetorical tricks, feel free to deal with what I really said. Or not, if you find it so unanswerable that the only way you can deal with it is to pretend I never said it.
But for a Homo Servus, where the loss is several centuries and many generations in the past, the situation is different. Changing the personality there isn't liberation, it's murder followed by birth of a new and very different individual.
Well, you've read the book, I haven't. I don't know if you're describing the situation accurately - if it would be possible to convince a Homo Servus that they didn't have to serve their masters except by drastic personality transplants. If you are describing the situation accurately, then I'd agree that it would be wrong to force adult Homo Servus into personality transplants against their will.
And to pull that argument a little further, changing it so that all future children to Homo Servus will be baseline Homo Sapiens, with their larger set of available choices, would be genocide of the Homo Servus subspecies. Can that really be right?
If Homo Servus is an artificial creation that has been so crippled that individuals of the species have had their freedom of choice taken away from them, then I would certainly argue that it's no worse than deliberately discouraging the breeding of artificially-created breeds of dog where purebred puppies are intentionally born crippled.
Re: part 2
Date: 2003-07-16 10:01 am (UTC)

Well, what do you know. I thought I used the debate technique of reformulating how I interpreted your statement and asking if that really was what you meant, since it seemed unreasonable. Lucky for us that you knew better, so that neither of us runs the risk of actually understanding what the other party thinks.
(no subject)
Date: 2003-07-16 10:14 am (UTC)

I fully agree that constant success would quickly become meaningless (it's pretty much the same as playing computer games in unbeatable "God Mode", which is mostly boring). The Experience Machine (I think that's the proper English term, I read all this stuff years ago and in Swedish...) scenario is a bit more sophisticated than that. The "false" reality is specified as maximally rewarding, which means that it gives the person in it just enough opposition/tension to grow and feel true accomplishment. It is part of the argument (or thought experiment, or whatever you want to call it) that the artificial existence be perfect in every sense, since the purpose of it is to try to figure out why we tend to feel that it is inferior to real life in spite of being better in every way.
(no subject)
Date: 2003-07-17 07:29 am (UTC)

Yeah, that's the right phrase. Nozick's thought experiment. There's a whole dialogue to be had about experience machines and sexbots, but I'll draw a veil over that...
FWIW I think that if an experience machine were maximally rewarding it would be indistinguishable from real life. In effect it would be real life. That's because it would engage you with an 'other' as challenging as the entire universe. If it engaged you with anything more limited (for example the imagination of an author or a team of games programmers) then it would be inferior.
(no subject)
Date: 2003-07-18 08:22 am (UTC)