Agnostic.com

If a machine is programmed to have thoughts, emotions, and critical thinking, plus self-awareness, would it be alive? Should it be given legal autonomy?

StevenMichael 5 Aug 13

8 comments


It shouldn't, because it could still be programmed with a great deal of bias triggered by circumstance, like genes switching on or off.

Perhaps they'd come up with a standard for "non-biological life" with a looser definition. If it could think, I would imagine it might end up asking for "robot rights" and legal autonomy, which would be amusing.

If a machine achieves sentient self-awareness (i.e. can think for itself, make autonomous decisions, imagine new ideas, and understand and have emotions), it absolutely is alive and should therefore be given the same rights as any other sentient species. We do not get to decide what life is for anyone but ourselves. Of course, like all sentient species, it would have to find its own source of "food", shelter, etc. without infringing on anyone else; otherwise it would be a criminal and would/should be treated as such. But since you can't kill something so rare, morally speaking, prison time would most likely be the sentence, depending on the severity of the crime. Barring it turning to a life of crime, it should have every right to life, liberty, and the pursuit of happiness that everyone else has.

Didn't Stephen Hawking warn us about artificial intelligence becoming smarter than us? He said it would be the end of us, so it is not a road we should travel too far down, in my opinion.

It would be a very alien creature. So much of what we learn is tied to our other senses (touch, taste, etc.) that a purely logical creature with no such senses might struggle in our world.

Saudi Arabia granted citizenship to a robot (Sophia) last year. Soon, AI will be designing its own neural networks to meet requirements it has established. The singularity is coming whether we want it to or not. There will come a time, probably in our lifetimes, when AI, having already surpassed human beings in processing capacity, becomes self-aware. Try getting the nations of the world to agree to hold back. It's a global competition, and if it's not the US, it's China or Russia or Singapore or Japan or Korea or Germany or Finland, etc.

Why does anyone consider this a good idea? We can barely take care of each other. So along with the steadily increasing population of meat bags, we're adding silicon-based sentience to the list. We don't deserve them. And why add emotions? Because they've worked out so well for us? Put that crap on the back burner until we sort ourselves out.

Aside from sci-fi, no one knows how to program a machine to think, have emotions, or do critical thinking, let alone be self-aware. We really don't know what any of those things actually are yet. Computers can be made to look smart or even self-aware, but they are still nothing more than player pianos with the ability to execute this piece of code or that piece of code depending on data conditions. The reasoning is all in the mind of the software designers.
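
A minimal sketch of that "player piano" point, in Python (the keywords and canned replies here are invented purely for illustration): the program below can look responsive, but it is only choosing between branches the designer wrote in advance, based on data conditions.

    # An apparently "empathetic" reply is just pre-written branches selected
    # by string matching; all of the reasoning belongs to the designer.
    def reply(message: str) -> str:
        text = message.lower()
        if "sad" in text or "lonely" in text:    # condition the designer anticipated
            return "I'm sorry you're feeling that way."
        if "happy" in text or "great" in text:
            return "That's wonderful to hear!"
        return "Tell me more."                   # designer-chosen fallback

    print(reply("I feel sad today"))  # seems caring, but it is only keyword matching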
