A lot of people believe that a person with consciousness deserves certain rights, no matter the circumstances. If consciousness or intelligence is the deciding factor for individual rights, will sufficient advancement in these areas give robots rights too? Before continuing, we need a definition of consciousness so we are on the same page. I have provided one in the next paragraph (from Wikipedia, of course). If you have a different definition, then I would love to hear it.
"Consciousness is the quality or state of being aware of an external object or something within oneself. It has been defined as: sentience, awareness, subjectivity, the ability to experience or to feel, wakefulness, having a sense of selfhood, and the executive control system of the mind."
Some people might argue that a robot will never be as conscious as a human, and therefore doesn't deserve the same rights. I disagree with this for two reasons:
(1) Consciousness is so immeasurable that it's impossible to prove humans are more conscious than robots. It's not actually possible to look inside someone's mind and see whether they are genuinely experiencing consciousness, or whether it just looks like they are. Sure, we can dig into their brain and observe physical and chemical processes. But those processes can't tell us whether the person is sincerely having experiences in their mind, or whether there are just physical processes causing behavior that appears to be directed by consciousness.
The only thing we can examine for sure is behavior that resembles the effects of consciousness. We can tell from behavior roughly how aware a person is. No, we cannot examine their minds. But if a person acts as if they are conscious, we can safely assume they are conscious, since actions are all we have to judge them by. We can program robots to act with similar behavior. They can act as if they are conscious in much the same way humans do. If this is the case, then they should likewise have the same rights.
(2) Even if robots did have a lower level of consciousness than humans, that wouldn't mean they don't deserve individual rights, since even people with minimal consciousness (infants, for example, or patients in a coma) still have individual rights.
Others might say human consciousness is inherently more valuable than robotic consciousness, even if the two produce the same behavior.
I don't see the logic here at all. What makes biological processes that cause behavior any more valuable than electrical processes that cause the exact same behavior? Why does the combination of oxygen, nitrogen, and carbon in the brain of a human have more inherent value than the iron, copper, and other elements in the circuits of a robot? (I'm no scientist, so these chemicals are probably very wrong.)