
A.I. so smart in a game it really has feeling.

Of course I didn't post the obvious thing I should have posted: info about the Turing Test. It's only a small step towards living and feeling AI, but still a big enough step for the current level of technology (no AI has passed it yet).

http://en.wikipedia.org/wiki/Turing_test



thekitchensink said:
Louie said:
The answer is easy: People would still kill the A.I.

It's just a game in the end so the CPU can't "die" no matter what you do. You just have to re-start the game and it'll be "alive" again.

 

If I kill you, and then clone you with the same memories, feelings, relationships, etc., are you the same person?

 

Of course not, but virtual A.I. will always fall short of the definition of life, as it's... well, virtual. Now if this discussion were about androids, that would be a different story.

But according to your definition, current A.I. also has some sort of "life" in it, as it has certain attitudes programmed into it. This is basically the way simple life-forms "work". So if an A.I. is advanced enough to act differently according to the situation (and tries to save itself from virtual death by actually learning), does that mean I should be punished, as with cruelty to animals? Life doesn't start at human beings; life can be much simpler, and your definition of "life" is problematic in the sense that it includes everything that acts differently in different situations. (Feelings are just ways to increase our chances to survive: if you're sad, the social group will help you; if you're angry, it's harder to make you scared; and so on.)

So if I programmed an A.I. to act aggressive if someone beats it in-game and to act sad if someone else dies (which just makes clear to the A.I. that it will now have a harder time surviving without the other person's help), would it then be some form of human being? It would still lack a metabolism and a physical body, and it could still be replaced. A human being can't be, even with cloning. Cloning doesn't mean every memory you have will be transferred into the next body; it just restarts the process of cell division with certain DNA information. Everything else is coincidence. Restarting an A.I. program is different, though, as you can save all its "memories" and so on and then load them again. Nothing is lost. A cloned human would have different memories, a different life, different experiences... things that can't be copied.
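(Just to make the "programmed attitudes" point concrete, here's a rough sketch in Python of the kind of thing I mean. The class name, the event names and the save file are all made up for illustration, not taken from any actual game's A.I.)

import json

class SimpleAgent:
    """A toy agent with programmed 'attitudes' and savable state."""

    def __init__(self, mood="calm", memories=None):
        self.mood = mood
        self.memories = memories or []

    def on_event(self, event):
        # Programmed attitudes: react differently depending on the situation.
        if event == "attacked":
            self.mood = "aggressive"   # harder to scare, fights back
        elif event == "ally_died":
            self.mood = "sad"          # survival just got harder
        self.memories.append(event)

    def save(self, path):
        # "Turning the console off": the whole state can be serialized...
        with open(path, "w") as f:
            json.dump({"mood": self.mood, "memories": self.memories}, f)

    @classmethod
    def load(cls, path):
        # ...and restored later with nothing lost, unlike a cloned human.
        with open(path) as f:
            state = json.load(f)
        return cls(state["mood"], state["memories"])

# Example: the agent reacts, is "frozen", and comes back unchanged.
agent = SimpleAgent()
agent.on_event("attacked")
agent.save("agent_state.json")
restored = SimpleAgent.load("agent_state.json")
print(restored.mood, restored.memories)   # aggressive ['attacked']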



thekitchensink said:
Louie said:
The answer is easy: People would still kill the A.I.

It's just a game in the end so the CPU can't "die" no matter what you do. You just have to re-start the game and it'll be "alive" again.

 

If I kill you, and then clone you with the same memories, feelings, relationships, etc., are you the same person?

 

So you would commit murder every time you turn your console off? Really smart.

I wouldn't say you're committing murder every time you turn off a console; I'd consider it more like freezing the world. Deleting the data, yeah, that might be murder. But simply enjoying killing an actual thinking thing, say at gunpoint? Disturbing to me.



CURRENTLY PLAYING:  Warframe, Witcher 2

I wouldn't like interacting with or killing perfectly simulated A.I.; there are some advancements in gaming we shouldn't make. More tactical, intelligent A.I. is never a bad thing, though.



WessleWoggle said:

I would rape and murder relentlessly; they're not feeling real pain, unless the computer they're running on is organic rather than electronic.

Ha. You have just opened a can of worms, and now you'll have to put up with me.

Imagine someone invented an electronic neuron which can replace organic neurons at 100% accuracy. Then every single day for 100 days, someone undergoes surgery to replace one billion of their ~100 billion organic neurons with the electronic version.

At what point does it become OK to rape and murder this being?
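(For scale, here's a rough sketch of that replacement schedule in Python, using the made-up numbers from the scenario above: one billion neurons swapped per day out of roughly 100 billion. The point is that there's no single day where anything obvious changes.)

TOTAL_NEURONS = 100_000_000_000      # ~100 billion, the figure assumed above
REPLACED_PER_DAY = 1_000_000_000     # one billion replaced per day of surgery

for day in range(1, 101):
    fraction = day * REPLACED_PER_DAY / TOTAL_NEURONS
    # 1% per day; by day 100 the brain is fully electronic, with no
    # obvious threshold in between where "personhood" could switch off.
    print(f"day {day:3d}: {fraction:.0%} electronic")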

 



My Mario Kart Wii friend code: 2707-1866-0957

WessleWoggle said:
NJ5 said:
WessleWoggle said:

I would rape and murder relentlessly; they're not feeling real pain, unless the computer they're running on is organic rather than electronic.

Ha. You have just opened a can of worms, and now you'll have to put up with me.

Imagine someone invented an electronic neuron which can replace organic neurons at 100% accuracy. Then every single day for 100 days, someone undergoes surgery to replace one billion of their ~100 billion organic neurons with the electronic version.

At what point does it become OK to rape and murder this being?

 

They're real, and not on a screen or in a simulation, like the A.I. we were talking about.

 

 

In what way is a person real that a simulation isn't? I'd say both are very real since they both exist.

Your point seemed to circle around the organic/electronic distinction. Is that not the case any longer?

 



My Mario Kart Wii friend code: 2707-1866-0957

WessleWoggle said:

If it can move freely within the physical world and affect it, then it's 'real' and not just simulated. It can output real actions, rather than just output more code.

Organic/electronic is the distinction I use when it comes to acting without inhibition in a game world. I know organic things feel real pain, but I don't know for sure whether code interacting with other code can create real pain.

 

Well in that example I gave before, once the person's brain is fully electronic at the end of 100 days, you can easily transfer the brain into a computer (it may run slower but fundamentally it's the same thing)... Then if you really want to, you can turn it into a robot which affects the physical world.

The thought processes going on inside the computer would be exactly the same as those going on in an organic brain, just using a different computational substrate (electrical impulses instead of chemical ones). By definition it will be as self-aware as a "normal" person, capable of experiencing the same emotions, etc.

PS: What about completely paralyzed people? They can't affect the real world or move in it, but they're still human, real and capable of experiencing emotion by any definition. My point is that once you start distinguishing between organic/electronic and such, you quickly start getting into contradictions.

 



My Mario Kart Wii friend code: 2707-1866-0957

WessleWoggle said:

I really can't argue about whether electronic or chemical impulses would be the same if you simulated a consciousness, since it's unknown at this time.

But about paralyzed people, they can take up space, and use oxygen. They affect the real world in a physical way.

Yes, I would get into contradictions if I started pondering this. That's because I have nothing to go on.

 

My point is that since we're not absolutely sure, it's a good idea to take a cautious approach, which is not to rape or kill any sentient being, whether it breathes or not.

 



My Mario Kart Wii friend code: 2707-1866-0957

thekitchensink said:
Louie said:
The answer is easy: People would still kill the A.I.

It's just a game in the end so the CPU can't "die" no matter what you do. You just have to re-start the game and it'll be "alive" again.

 

If I kill you, and then clone you with the same memories, feelings, relationships, etc., are you the same person?

Assuming you were cloned exactly as you were before, in what way would you not be the same person? The only difference I would have from my pre-cloned self is that I would be pissed that you killed me. Even more so if you forgot to tape my shows for me while I was being cloned. :D

 



CIHYFS?