**Answer the Damn Question!**

| Answer | Votes | Percent |
| --- | --- | --- |
| Yes | 32 | 41.03% |
| No | 46 | 58.97% |
| Total | 78 | |
Jay520 said:
I see. However, I think our race as a whole tends to stick together when like-minded, and is fearful of competition from others. Robots, something not even living, having the same rights that it took humanity over two thousand years to achieve? Not gonna happen.
In the future, xenophobia will be the new racism, and it will never go out of style. Humanity FTW.
| Jay520 said: What have I misrepresented? I asked you, "But how do you know other humans and animals have consciousness? What proof do you have of this that cannot also be applied to a robot?" You answered, "Because every human, when presented with the question of 'do you exist' responds in the same affirmative manner." And I responded, "Computers can do that today," which is true. They can answer in an affirmative manner to the question "do you exist." Moreover, some humans are actually incapable of responding in an affirmative manner to the question "do you exist." Would you say they don't deserve rights? How does the definition in my post necessitate a biological brain? |
1. Because you are only addressing single points and pretending that's my entire argument. I've made many statements here, and they should be taken as a continuous response.
2. A computer is a slave to its programming. If it is asked about its existence, it can check whether its memory bank of responses pertains to the question, but it cannot rationalize a response of its own volition.
3. You also completely ignored my animal example, despite it being directly applicable to humans.
4. If you are referring to vegetables, then they are not conscious.
5. You don't acknowledge the scientific fact that consciousness comes from the brain?
If they have free will, personal preferences, and are at least as intelligent as the least intelligent human being, my answer would undoubtedly be yes.
We can't base our rights on what body a mind is contained in.
Robots are not conscious. They are parts doing as they are told. They can't love or think freely like we do. They will always just do what their programming tells them to do. They don't have a will of their own. At best they would have programs that allow them to overwrite their original program or constantly update it, but that rewriting is itself programming, pre-established with certain rules.
Thinking machines are the stuff of sci-fi fantasy. They aren't possible as thinking and feeling entities that are aware of their existence.
| Veracity said: 1. Because you are only addressing single points and pretending that's my entire argument. I've made many statements here, and they should be taken as a continuous response. 2. A computer is a slave to its programming. If it is asked about its existence, it can check whether its memory bank of responses pertains to the question, but it cannot rationalize a response of its own volition. 3. You also completely ignored my animal example, despite it being directly applicable to humans. 4. If you are referring to vegetables, then they are not conscious. 5. You don't acknowledge the scientific fact that consciousness comes from the brain? |
1. I only responded to what I felt needed a response. But just for your sake, I will repost your points and respond to all of them.
| Veracity said: (a) Because every human, when presented with the question of "do you exist," responds in the same affirmative manner. (b) An extension to animals would be their ability to avoid predators, realizing "they" are and need to avoid threats. (c) It is a byproduct of a brain. (d) Trees are not conscious at the macro level. |
(a) I responded to this and this conversation is ongoing.
(b) I can give the same answer here that I gave for (a). Computers can be programmed to do it.
(c) What is your point? Are you arguing that because something is a byproduct of the brain, it cannot be produced by other means? Moreover, I addressed this in the OP, so I didn't feel it needed a response.
(d) This is a true statement. Not sure why you brought this up.
2. Humans also need to look back on their memory bank of facts, experiences, etc. to rationalize. Sure, they don't have preset answers to questions. But I have no reason to believe that robots won't advance far enough that they too will look back on their memory bank of facts and experiences to rationalize, rather than just responding with preset answers.
3. It's basically the same as your human example with a different application. I don't see why both need a response.
4. I'm talking about severely mentally disabled humans who are unable to speak and have low awareness. Do you believe they have rights?
5. No, I don't deny that. Just because something comes from the brain, that doesn't mean it can only come from the brain.
| SxyxS said: 1) Even (most) animals have (almost) no rights, but you ask for rights for a fucking piece of silicon and metal? You really care about the important problems. 2) We have to find out what consciousness is. You can't code and simulate something you don't understand. 3) You must be a 100% pervert bastard to create an artificial being and code a program that makes it suffer. As the famous Nexus-6 Roy Batty once said (or was it Pris?): "All this pain and fear was so irrational." 4) There will be rights for androids as soon as they start to dream of electric sheep. 4b) Considering the perverts who run the USA, they will give rights to androids, not out of empathy, but as a way to protect their war machine robots, to make them more accepted by the population AND to protect the military. If such a robot goes crazy, the military has to pay; some guy (e.g. a general, coder, manufacturer, or the guy who remote-controls the robot) has to go to jail. But as soon as a robot has rights, the robot itself will be accused and sentenced to prison. |
1. Uhhh...sure, I guess you could say that.
2. This is true. Do you think we will never understand consciousness?
3. Pain and fear are effective feelings that help beings avoid danger. I don't see how they're irrational.
4. Okay.
Personally, my answer to this question depends solely on whether or not it will ever actually happen.
Rights for robots shouldn't exist if their actions reflect nothing more than their creators' programming.
Rights for robots should exist if it is possible for them to become aware of their actions and become about as able to "make decisions" as we are (whether we ourselves are truly making decisions or merely following pre-set reactions is itself debatable).