
Forums - General Discussion - How much is a human worth?


 

Are humans invaluable?

Yes: 35 (47.95%)
No: 21 (28.77%)
For the moment, yes: 4 (5.48%)
See results: 13 (17.81%)

Total: 73
IIIIITHE1IIIII said:


There are already robots that can think for themselves (even though they have no morals or sympathy yet), and there are estimates that in the not-too-distant future there will be computers smarter than the human brain.

Yes, we can create new species. We came out of nature, like you said, and we can change nature.

Robots can't think for themselves; you just think they do.

I don't think you understand how computers work. A robot's brain is essentially a processor, like the one you find in your computer. Until you give it a command, it won't do anything.

 

We came out of nature naturally, not because our ancestors wanted us to become humans; we just did, over time, by evolution.

 

Computers will never be smarter than humans.

 

Because intelligence is free will, which robots just can't have. Do a bit of research on how computers work.
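The "it only does what it's told" picture above can be made concrete with a toy stored-program machine (a minimal, hypothetical sketch; no real instruction set is implied): the processor mechanically fetches and executes instructions, and does nothing without one.

```python
# Toy stored-program machine: the "processor" fetches and executes
# instructions one by one; it never acts without an instruction.
def run(program):
    acc = 0   # single accumulator register
    pc = 0    # program counter
    while pc < len(program):
        op, arg = program[pc]
        if op == "ADD":
            acc += arg
        elif op == "MUL":
            acc *= arg
        elif op == "HALT":
            break
        pc += 1
    return acc

# (0 + 2 + 3) * 4 = 20
print(run([("ADD", 2), ("ADD", 3), ("MUL", 4), ("HALT", 0)]))  # prints 20
```

With an empty program it simply returns 0: no instructions, no behaviour.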



IIIIITHE1IIIII said:
Troll_Whisperer said:
Obviously a human is the most valuable thing, because we humans are the only ones that can think in terms of 'value'. Things have the value that we as humans give them. These values would change if we as a society changed our value model.


It is possible, though, to create life with greater morals and capabilities to save humans and themselves. Values can easily be taught as well, just as babies (probably) aren't born with them.

There are no 'greater morals', because morality is 100% subjective. I have one definition of what 'good' morals are, and you have another. Now, these definitions are usually close, because we live in a society that raises people in a certain way, and it is usually societies with a certain set of morals that become successful and survive (social selection).

Therefore many moral choices seem like a given, but the truth is they are still 100% subjective. We just shun or jail those who don't follow society's standards.

Now, if we were to create beings that we deem 'greater' than us, or whatever, perhaps we would think of them as 'more valuable', but again, that would be a value that we, as a people, set as a standard. Society's mentality would have changed. As you said, values can be taught, or learnt, and those are what set the value of something; there is nothing more concrete or objective.

 

I don't know if I'm making any sense. I know I'm being extremely relativist here but there really isn't a straight, objective answer to this.



No troll is too much for me to handle. I rehabilitate trolls, I train people. I am the Troll Whisperer.

snakenobi said:

Robots can't think for themselves; you just think they do.

I don't think you understand how computers work. A robot's brain is essentially a processor, like the one you find in your computer. Until you give it a command, it won't do anything.

 

We came out of nature naturally, not because our ancestors wanted us to become humans; we just did, over time, by evolution.

 

Computers will never be smarter than humans.

 

Because intelligence is free will, which robots just can't have. Do a bit of research on how computers work.


Or maybe you should do a little research on how computers will work a thousand years from now?

 

Computers can have cameras - sense of sight: check

Computers can have microphones - sense of hearing: check

Computers can have heat and touch sensitive features - sense of touch: check

Computers can mimic the brain's neural cells - sense of taste and smell: check

Computers have loudspeakers: they can talk

Computers have a (potentially massive) memory: they can learn

 

What is stopping computers from being equal or greater than humans? That's right: Technology. And technology makes progress every day.

I won't even start on the nature argument, as literally everything that happens, happens by nature.
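The "memory means they can learn" point above has a classic minimal illustration: a perceptron that adjusts stored weights from examples instead of being told the answers outright (a sketch with made-up training data, here the logical AND function):

```python
# Tiny perceptron learning logical AND from labelled examples.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = [0.0, 0.0]   # stored weights, adjusted from experience
b = 0.0          # bias term
lr = 0.1         # learning rate

for _ in range(20):  # a few passes over the examples
    for (x1, x2), target in data:
        out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
        err = target - out          # on error, nudge the stored weights
        w[0] += lr * err * x1
        w[1] += lr * err * x2
        b += lr * err

print([1 if w[0] * x1 + w[1] * x2 + b > 0 else 0 for (x1, x2), _ in data])
# prints [0, 0, 0, 1] - predictions now match the targets
```

Whether weight-tuning of this sort counts as "understanding" is exactly what the two posters are arguing about.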



Troll_Whisperer said:

There are no 'greater morals', because morality is 100% subjective. I have one definition of what 'good' morals are, and you have another. Now, these definitions are usually close, because we live in a society that raises people in a certain way, and it is usually societies with a certain set of morals that become successful and survive (social selection).

Therefore many moral choices seem like a given, but the truth is they are still 100% subjective. We just shun or jail those who don't follow society's standards.

Now, if we were to create beings that we deem 'greater' than us, or whatever, perhaps we would think of them as 'more valuable', but again, that would be a value that we, as a people, set as a standard. Society's mentality would have changed. As you said, values can be taught, or learnt, and those are what set the value of something; there is nothing more concrete or objective.

 

I don't know if I'm making any sense. I know I'm being extremely relativist here but there really isn't a straight, objective answer to this.


Hehe, amen to that last part. The question, though, was whether you'd consider it morally correct [by your own moral standards] to save a vastly superior being over a human being, knowing that the superior life form would help and support more humans, and possibly others of its kind, than the human would have. If the answer is "yes", then the human has become the dog in the 'save the human or the dog' scenario.

 

(Damn this is some complex shit x)



IIIIITHE1IIIII said:
Troll_Whisperer said:

There are no 'greater morals', because morality is 100% subjective. I have one definition of what 'good' morals are, and you have another. Now, these definitions are usually close, because we live in a society that raises people in a certain way, and it is usually societies with a certain set of morals that become successful and survive (social selection).

Therefore many moral choices seem like a given, but the truth is they are still 100% subjective. We just shun or jail those who don't follow society's standards.

Now, if we were to create beings that we deem 'greater' than us, or whatever, perhaps we would think of them as 'more valuable', but again, that would be a value that we, as a people, set as a standard. Society's mentality would have changed. As you said, values can be taught, or learnt, and those are what set the value of something; there is nothing more concrete or objective.

 

I don't know if I'm making any sense. I know I'm being extremely relativist here but there really isn't a straight, objective answer to this.


Hehe, amen to that last part. The question, though, was whether you'd consider it morally correct [by your own moral standards] to save a vastly superior being over a human being, knowing that the superior life form would help and support more humans, and possibly others of its kind, than the human would have. If the answer is "yes", then the human has become the dog in the 'save the human or the dog' scenario.

 

(Damn this is some complex shit x)

Ah, so it's one of those 'would you eat shit to save your mum's life' kind of questions.

I guess I would go for saving the superior being, as long as I don't personally know the person being sacrificed.

And would mankind accept a secondary role? I don't know about this. It depends on how these beings treat humans. History tells us that people would rather have at least some degree of freedom, even if it costs human lives.




IIIIITHE1IIIII said:


Or maybe you should do a little research on how computers will work a thousand years from now?

 

Computers can have cameras - sense of sight: check

Computers can have microphones - sense of hearing: check

Computers can have heat and touch sensitive features - sense of touch: check

Computers can mimic the brain's neural cells - sense of taste and smell: check

Computers have loudspeakers: they can talk

Computers have a (potentially massive) memory: they can learn

 

What is stopping computers from being equal or greater than humans? That's right: Technology. And technology makes progress every day.

I won't even start on the nature argument, as literally everything that happens, happens by nature.

They can't perceive or make sense of things.

 

A computer stores information, but it doesn't know what the information means; only you can understand what the information means.
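The point that stored information carries no intrinsic meaning for the machine can be shown directly: the same four bytes "mean" different things depending purely on how a program is told to interpret them (a minimal sketch):

```python
import struct

raw = b"Hi!!"                         # four bytes sitting in memory
as_text = raw.decode("ascii")         # interpreted as ASCII characters
as_int = struct.unpack("<I", raw)[0]  # same bytes as a little-endian 32-bit int

print(as_text)  # prints Hi!!
print(as_int)   # prints 555837768
```

The machine holds the bits either way; the "meaning" lives entirely in the interpretation we ask for.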



snakenobi said:

They can't perceive or make sense of things.

 

A computer stores information, but it doesn't know what the information means; only you can understand what the information means.


All of that is true when it comes to present-day computers.



IIIIITHE1IIIII said:
snakenobi said:

They can't perceive or make sense of things.

 

A computer stores information, but it doesn't know what the information means; only you can understand what the information means.


All of that is true when it comes to present-day computers.


No, that applies to all computers:

 

mechanical computers

transistor computers

molecular computers

quantum computers



snakenobi said:
IIIIITHE1IIIII said:
snakenobi said:

They can't perceive or make sense of things.

 

A computer stores information, but it doesn't know what the information means; only you can understand what the information means.


All of that is true when it comes to present-day computers.


No, that applies to all computers:

 

mechanical computers

transistor computers

molecular computers

quantum computers


This is stupid, really. I'm talking about computers hundreds or even thousands of years in the future. Those computers will definitely be several times smarter than the human brain and will be able to run themselves (unless humans stop them or make them incapable of doing that, which they probably will in most cases).



Depends on how much money you make, I guess, as sad as that is.