| the2real4mafol said: Aren't robots essentially slaves? Also, they have no morality or conscience at all. So, no, they shouldn't have rights. Robots are the same as any other manufactured good as far as I'm concerned. |
"Morality" is a human construct and differs from person to person; there are no fixed "laws" of morality. If artificial intelligence radically evolves (into what David Chalmers calls AI+ or AI++), then AI will have capabilities far beyond humans and will be able to self-improve and self-replicate at a rapid rate. If this AI is goal-driven (and it will be, if only to self-improve and self-replicate), then the human race goes bye-bye. A machine would not have to consider the human condition when pursuing its goals, chiefly acquiring scarce resources and using them to the detriment of humans.
But of course, for now, this is all just fantasy. Even if we do design an AI that reaches human-level intelligence, there is no reason to conclude that it will understand itself well enough to design a superior successor, because there is no verifiable proof that a designed intelligence can improve itself.