soul samurai, you should play Seaman for the Dreamcast. It's not as advanced as you'd like, but it may bring out that feeling of not wanting to let them die in the game.
As long as artificial intelligence can be turned off and restarted, there are no moral questions to ask. If the time comes when this intelligence can no longer be restored to its original state after it is turned off, then it becomes an ethical issue, because that means the intelligence is already self-aware and self-sustaining.
Rainforests being burned down are an ethical issue because once they are destroyed they can no longer be returned to their original state.
AI characters being killed are not an issue because every state can be restored.
This issue is not about whether AI can actually "feel". That's beyond our ability to comprehend because "feeling" is subjective to the observer. For example, who knows whether a tree "feels" pain when it is being cut? We only assume it doesn't "feel" because our notion of "feeling" is our own alone, and we don't detect any signs that match our expectations.
Rather, this issue is simply about the restorability of whatever is impacted by our actions. Melting polar ice caps are an ethical issue because we can't bring them back to exactly the way they were without affecting some other part of the environment. Killing an AI character, at this moment, can always be reversed.
So in conclusion, it is only when actions done to an AI become irreversible that the issue becomes ethical.
Nice read. Congrats on a really deep thread. I've thought about stuff like this a lot of times as well, and what I always come back to is that I don't think I could do it. After all, what if, hypothetically, we were all simply programs running around in a computer or video game? Maybe the only reason we feel pain or emotions is because we have code and programming that tells us we do. At the end of the day it would still be real to us, and I wouldn't want to put them through it. The day A.I. gets that advanced is the day we need to start really thinking about what being alive truly means.
bugrimmar said: "So in conclusion, it is only when actions done to an AI become irreversible that the issue becomes ethical."
I would absolutely buy that game. Toying with my emotions would be awesome. Imagine if you could never erase the data either. It would always be there, just how you left it. Creepy creepy.
Here on another note: people are saying killing the AI is ethically OK if it's restorable, but then you have to look at other people raping it. I mean, honestly, could you really rape something that actually thinks like that? That's twisted. You're scarring something just the same, physical body or not.

This thread is really freaking me out. I was going to joke about killing Haley Joel Osment from the movie "A.I.", but I really don't know if I could do it.
:P Not a Christian, if that's what you're poking at. Not religious one bit, and I don't believe in heaven, so nah, doesn't work. That's also like saying, "Don't worry, Mom, when I murder you, you'll go to heaven; it all works out."

No human can accept a utopia, and if this AI thinks just like a human, I don't think it would either. After a while we would go crazy, get bored, and want out. Even if I died right now knowing I would get a utopia, I still wouldn't want it; we're bent on survival. Not only that, I can only imagine how betrayed they would feel.
As a note on the mother: I meant a real-life situation. Even if we absolutely knew a heaven existed, it would still be wrong to go murder your mother.

But who's decided what's wrong is wrong? :P Hehe, sorry, just being difficult. We should discuss it, though, seeing as we're talking about it.
If at first you don't succeed, you fail
....Basically, if they go to this utopia, you're turning them into retards by restructuring them back into a line of code. I would rather be deleted than have this. Either that or you're turning them into a Leave It to Beaver style character. You can't take away from them what they already know unless you do turn them into a retard.
"Ghouls are people too! They dream, they feel, they BLEED!....... At least, thats what they programmed me to say. Personally, I think they're a rotting maggot farm, and if I had my choice, I'd blow everyone one of them to hell..... DAMN this combat inhibitor! "
OK, let's say this AI develops relationships with other ones. Then you kill it; that creates social implications within this small virtual world. Gah, this gets into Matrix and Avalon type qualities. What if the world we're in right now is a virtual world, and there is someone out there actually playing "the game", and we're just the NPCs?
