thekitchensink said:
If I kill you, and then clone you with the same memories, feelings, relationships, etc., are you the same person?
Of course not, but a virtual A.I. will always fall outside the definitions of life, as it's... well, virtual. Now, if this discussion were about androids, that would be a different story.
But according to your definition, current A.I. also has some sort of "life" in it, since it has certain attitudes programmed into it. This is basically how simple life-forms "work". So if an A.I. is advanced enough to act differently according to the situation (and tries to save itself from virtual death by actually learning), does that mean I should be punished, as with cruelty to animals? Life doesn't start at human beings; life can be much simpler, and your definition of "life" is problematic in the sense that it includes everything that acts differently in different situations.

(Feelings are just ways to increase our chances of survival: if you're sad, the social group will help you; if you're angry, it's harder to make you scared; and so on.) So if I programmed an A.I. to act aggressively when someone beats it in-game, and to act sad when someone else dies (which just makes clear to the A.I. that it will now have trouble surviving without the other person's help), would it then be some form of human being? It would still lack a metabolism and a physical body, and it could still be replaced.

A human being can't be replaced, even with cloning. Cloning doesn't mean every memory you have will be transferred into the next body; it just restarts the process of cell division with certain DNA information. Everything else is coincidence. Restarting an A.I. program is different, though, since you can save all its "memories", etc., and then upload them again. Nothing is lost. A cloned human would have different memories, a different life, different experiences... things that can't be copied.
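The contrast drawn above (restarting a program loses nothing, while cloning a body restarts from DNA alone) can be sketched in a few lines. Everything here is a hypothetical illustration: the `Agent` class, its event names, and its hard-coded "emotional" reactions stand in for the kind of programmed attitudes the post describes, not for any real system.

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class Agent:
    name: str
    mood: str = "neutral"
    memories: list = field(default_factory=list)

    def on_event(self, event: str) -> None:
        # Hard-coded "emotional" reactions, as in the post's example.
        if event == "beaten_in_game":
            self.mood = "aggressive"
        elif event == "companion_died":
            self.mood = "sad"
        self.memories.append(event)

agent = Agent("bot")
agent.on_event("beaten_in_game")

# The agent's entire state can be serialized ("killed")...
snapshot = json.dumps(asdict(agent))
# ...and restored later, bit for bit. Nothing is lost.
restored = Agent(**json.loads(snapshot))

print(restored == agent)  # True: mood and "memories" survive the restart
```

A biological clone has no analogue of `snapshot`: only the DNA "constructor arguments" carry over, while the accumulated `memories` start empty again.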