
Robot Rights

If the human mind were replicated in computer form, with all of the human properties that one would expect (emotions, desires, curiosity, etc.), would that mind, by extension, also have human rights?

MShek (Level 3) · Aug 30



We haven't even given equal rights to all humans. I doubt rights for robots will be realistically considered for some time, even if they met those standards.


I wanted to welcome you here and enjoyed the sweet pic of your child! About your robot, I do not see the day when emotions can be programmed! Human emotions arise from each individual's exposure to stimuli and are stored in their nervous system... which happens according to each person's particular nervous system characteristics (no two people are alike)!

So if you program a computer housed in the body of a robot to do exactly that, learn, does it become on par with a human? Does it not, as a function of its unique experiences after its creation, become the same as us: a product of its environment and experiences? What happens if its computer "brain" contains greater-than-human potential? I am curious, as the history of technology is one of exponential growth, and quantum computing and the advance of AI will very soon make questions like this vital to our survival as a species.

@Quarm Well, for the future survival of humans, I sure hope not! Just look at the 'dark' side of technology today!

@Freedompath There is a Chinese curse: "May you live in interesting times." I think we are all in for very interesting times indeed.

@Freedompath And I do agree with you; just read about the development of AI. The most chilling line in a documentary I watched was: "The people working on AI do not even really understand what they are doing."


Why don't we worry about human and animal rights first?

As long as humans eat animals they will not have rights. I have no plans to stop eating them.

@dahermit That has absolutely nothing to do with the humane treatment and killing of the animals you do eat.

@Sticks48 Then why did you bring up "animal rights"?

@dahermit Eat a dick! That is made of meat. You know what I am talking about. Troll somebody else, you ignorant fuck. ☺

@Sticks48 Dunning-Kruger Effect.


Because we have done such a fabulous job with rights for women, gays, minorities, and the elderly.


I like Asimov's view on it in Bicentennial Man.
Any intelligence that is evolved enough to understand the concepts and to ask for freedom and rights deserves them and can't be denied them.


I honestly don't think we have the right to invent more beings until we can figure out how to properly take care of those that already exist.

Human rights are all about what we can do, not whether we should do it. I agree with you in theory, but in practice that will not stop Amazon or Apple, to name a few, from creating a "Living Alexa!" or the iPerson and selling them online.

@Quarm I don't think we have a right to do something just because it's physically possible. We are able to kill people, but we don't have the right to.

@Punkrockgirl77 Personally I agree as a matter of ethics, yet I would ask you: are rights gained through action, or through some kind of universal law? Is there some structure to morality or personal rights beyond what we as individuals and groups define as such? Sadly, I think most rights change and adapt to very simple realities such as availability of resources, access to education, environment, etc.

Let's say, for example, that you and your loved ones were in a survival situation where a third party, say another group of people, stood in the way of your survival. Say they were hostile and refused negotiation. Would you kill them so your group could survive, either through direct action or by denying them a limited vital resource? Or would you allow some or all of your group to die? Who defines your rights in that situation? In America you as a citizen do not have the right to kill someone... unless they threaten your life or the lives of your loved ones, and even then it depends on many factors. For me, I would honestly kill to protect my family if I decided through reason that it was necessary. I would not take joy or satisfaction in it, but I would. Many would call me evil; many would not.

We can debate ethics and morality, but that is a luxury many in the world do not have. The saddest thing to me about American culture is how we allow, through action and inaction, our citizens to die for no reason other than greed and bigotry. If we gave up just a small portion of our wealth we could do so much.


Not human rights, maybe cyborg rights??


How would it ever be proven that robots have all the human properties that one would expect? They may pass the Turing test and appear to have such properties, but you can't put emotions in a test tube. I think humans will deny that robots are truly "human" long after they appear to have attained human equivalency, and so will deny them equal rights until forced to do so by the robots themselves.


If a real Human consciousness were somehow preserved (not just duplicated) and constituted an actual person, we would then be talking "sentient being" or essentially, intelligent life. In that scenario, rights could apply.


Human beings do not have human rights because they have certain capacities. If rights were linked to capacities or faculties like reason, consciousness, or the sense of being a person, or whatever, severely mentally handicapped people would not have "human rights", since they lack all of those capacities that are specific to humans.
Therefore: human rights pertain to humans, and only to humans, regardless of any properties.


There was a great episode of the series Black Mirror which touched on a similar theme. I believe the episode was called “White Christmas”. Worth checking out if you get the chance.

But to address the topic, I guess the foundational question is on sentience. Once that has been achieved, whether organic or artificial in nature, there must at least be an acknowledgement of an evolving intelligence.

Where the interactions go from there would be up to the application and versatility of the technology. Though I hesitate to say any intelligence realised, even a copy of one, would accept slavery or neglect or threats. If it valued its own survival and could learn through experience, I find it hard to imagine any way it could be denied a claim to rights (publicly, anyway).

Great Black Mirror episode! Also see Blade Runner, of course.


Would powering it down kill it? Can it use past experiences to improve? Can it replicate? Can it make an emotional decision? Does it regard its own life?


I don’t think it’s possible for consciousness to be created. No one truly understands what consciousness is. How could you write a program and create something when you don’t know what it is you are to create?

There is a slightly plausible other way, which I present in my book, “The Staggering Implications of the Mystery of Existence”, available in the Kindle store, however the book contains woo and is unfit reading for proper atheists. 😟


Much better to have something in writing as a baseline... based on our Bill of Rights, by Founding Fathers who understood first-hand what living without such rights was all about! Do you think it would be good to wrangle endlessly each time something came up, with no baseline to even start from?


To me, it depends on a few factors. What is a "right?" Does the created mind have the current or future capability to support itself by its own actions? Does the created mind have the capacity to reason, or is it more like an animal, who may have emotions, desires, and curiosity?

A "right", as originally used in the founding documents of the US, is a moral principle defining what actions people, by their nature, can and need to take in society without the forced interference from anyone, including the government. Something cannot be a "right" if it violates the "rights" of others. For a created mind to have "rights", those "rights" cannot come at the expense of others.

But if the created mind is rational and can take actions to support itself, I think it should have rights.



WeaZ (Level 7) · Aug 31, 2018

Why would you make a computer that could feel pain?


Consider that if true A.I. were to be achieved, it would think faster and be smarter than the average human, and therefore superior. Also consider that human emotions could not be allowed into A.I., inasmuch as emotions are counter to logical thought. Also, there would have to be an element of self-preservation programmed in. However, if humans developed A.I. sans emotion, the A.I.'s logical "mind" would likely conclude that humans were an ultimate danger (destroying the planet, able to shut them down at any time) to the A.I. life form(s). Would not a population of totally logical A.I.s then seek to destroy humans in self-defense?

Yup. And we would never see it coming either.

"Smarter" DEFINITELY does not mean "superior".

@Punkrockgirl77 Please explain beyond just a bumper-sticker, throwaway comment. Please make your case (explain your position).

@dahermit Um, I'm replying to the unsubstantiated "bumper sticker" claim that "A.I. would be smarter, and therefore better" than people. If you look through history, serial murderers are never the dumbest. I'm not saying that unintelligent people are better than intelligent ones; I'm saying that level of intelligence, whether low or high, has nothing to do with goodness. It doesn't determine it either way. You can be a smart asshole, a dumb asshole, a kind smart person, or a kind dumb person. Put differently, do you claim to be qualitatively or morally superior to people you regard as less intelligent than you?

@dahermit And by the way, YOU made the claim that A.I. would be superior because it would be smarter. YOU did not substantiate YOUR claim. So please, "make your case (explain your position)."

@Punkrockgirl77 So, you define "superior" as "morally" superior then?

@Punkrockgirl77 "Superior" in my value system is higher intelligence. For instance Einstein, Hawking, et. al., were, in my view, "superior" to some knuckle dragger sitting in a bar covered in tattoos.

@dahermit Well, that's the crux of what I said from the outset: I do not at all define "superiority" merely by mental intelligence.

@dahermit Let me put it this way: by your definition, Hitler would be superior to a mentally disabled person, just because of I.Q.

@Punkrockgirl77 If "superior" is measured by I.Q. alone, then yes he was. His I.Q. allowed him to feed himself, whereas a severely mentally disabled person would likely starve to death if not cared for. The difference between the "superiority" of I.Q. vs. "morally" superior is that I.Q. can be measured whereas morally superior is purely subjective. Consider that the conversation is about would something as subjective as morality be installed in A.I.?

@dahermit Morality could not be installed in A.I.; I'm not claiming it could. All I'm saying is that superiority is not and should not be defined merely by I.Q. I'd even question whether intelligence merely means I.Q.

@Punkrockgirl77 For want of a better definition, I.Q. is the de facto default measure of human intelligence in our society.

I think most of our disagreement centers on what we consider to be meant by "superior". A human is limited by his or her experiences, whereas A.I. (in theory) could draw from any number of databases of information. Therefore an entity (A.I.) that could be an expert on law, physics, languages, medicine, biology, mathematics, etc., would be "superior" (in my view at least) to a mere human, who is limited to only those things he or she had time to study in a single human lifetime.


If animals can, then I don't see why not.


I 100% believe it MUST have rights. AI, if true consciousness is achieved, must be granted equal rights. But of course humans haven't figured that out (popularly at least) about animals, despite mounds of evidence that point to the reality of their "minds". So we will culturally argue about our RIGHT to dominate and subjugate other intelligences far into the future, even though it falls, for most of us, under the definition of "morally wrong". Our descendants will look back on such a time with shame, or they will look back at it from the position of having lost the battle and been subjugated themselves, or be unable to look back at all because their willful ignorance and arrogance caused their own extinction...

But hell, climate change may solve that problem even more quickly!


Maybe in the next hundred years there will be cases about it running in supreme courts everywhere and at the UN.


We fear robots rising up against us because we treat everyone like shit, not just robots. With a cold, objective look at our goal of equal rights versus our track record, they would quickly surmise that we can't accomplish shit, and they would take matters into their own hands. The AI uprising won't be their fault; it'll be ours.

To answer your question, every living thing should have equal rights. As soon as robots can tell us that they're alive, they should too. Will we do it? Fuck no.


A more interesting question is how one would go about testing a robot for those qualities (emotions, desires, curiosity, etc.).

These are qualities that are ultimately first-hand. Most of us assume that other humans possess them as well, but I don't think we can extend the same courtesy to machines of our own making.

But to answer your question, of course they should have human rights.


Anything intelligent and sentient should have "human" rights. If an intelligent machine isn't sentient, there's nothing in it that needs to be protected from suffering, and fortunately, practically all robots will be non-sentient, so they will be completely selfless.


No, they too will be exploited unless humanity gets its collective act together.
