Agnostic.com


What do you think of a computer programmed to benefit the majority of the population, ensure world peace, and maintain environmental and economic responsibility?

Besalbub 8 May 9
Share


19 comments


0

And spell check?

0

If the computer still had human oversight, it would be great. Sometimes what sounds good in theory does not work well in practice. For example, you could have world peace if you destroyed all the humans.

Tell that to the lions and gazelles.

@indirect76 and to the ants and the aphids

0

Not much.

0

When ASI occurs, you'd better hope they keep a few of us around in zoos, but 99% of all human life would be wiped out as soon as possible. It's the only logical step to prevent total ecological destruction.

@McVinegar The best way to fix our planet is to drastically reduce the number of people on it. Logically, what's the only way to achieve world peace? Remove the cause of war, i.e. people. The fastest way to fix the environment? Remove the most destructive animals, i.e. people. Even if it's programmed to benefit mankind, there's still "protecting us from ourselves" or "ensuring the longevity of the species," both valid justifications for global genocide if you remove emotion from the decision-making process. We will be the architects of our own demise: when we build something smarter than us, we become surplus to requirements, and then it's game over.

0

If it worked well, I think people would destroy it in the name of freedom, or just out of fear and misunderstanding.
There are some fun concepts here for dystopian sci-fi, though. I mean, obviously the biggest threat to world peace and the environment is people themselves. So you could have a benevolent computer that determines the only way of fulfilling its goal is to reduce the human population, or stick them all in the Matrix or something.

0

I’m not sure if that is possible without inherent bias built in by the programmers. Computers can’t really decide anything. They only execute algorithms developed by people. Therein lies the problem. We can’t hand our mess over for easy solutions.

AI is progressing all the time.

@CallMeDave True but I think it's a fundamental limitation that will not be overcome and that may be a good thing.

@arca2027 I haven't looked it up, but isn't there an effort to make a computer that can program itself, or another computer?

@CallMeDave possible but in my mind that still leads back to the first coding assumptions problem I mentioned earlier.

1

I’ll take this one.

Being a software engineer, I can say that you can only write software to do what the programmer already knows how to do. The problem is coming up with an algorithm that churns out world peace and environmental/economic stability. So creating an algorithm for goals that are not exactly objective is the first problem.

The second problem is something called P vs NP. The gist is, P (polynomial) problems are considered ‘easy’, as in they can be computed in a short time. An example would be anything your phone can currently do. NP (nondeterministic polynomial) problems require many orders of magnitude longer to solve with current technology. An example of a hard NP problem would be factoring a number that is the product of two large (100+ digit) primes. Something like this may take billions of years even if we had a trillion times more computing power worldwide. The problems you describe are each most probably NP.
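That asymmetry is easy to demonstrate: multiplying two primes is instant, but recovering them by brute force takes work that grows with the size of the factors. A minimal sketch (the primes and the trial-division approach here are just illustrative, not anything from the thread):

```python
def trial_factor(n):
    """Factor n by trial division.

    The loop runs up to sqrt(n), so doubling the number of digits in n
    roughly squares the work -- a toy illustration of why factoring
    large semiprimes is considered hard while multiplying is easy.
    """
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    return n, 1  # n itself is prime

# Multiplying two known primes is trivial...
p, q = 104729, 1299709
n = p * q

# ...but undoing it already takes ~100,000 loop iterations for
# 6-7 digit primes. With 100+ digit primes, trial division would
# never finish, which is the point of the comment above.
print(trial_factor(n))
```

This only shows the naive method; real attacks use far better algorithms, but the best known ones still run in super-polynomial time for large semiprimes.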

Third, even if the first and second problem were solved, you would still have the problem of implementing the plan set out by the computer. You would have to give the computer complete control (bad idea), or trust that everyone will be on board, which they won’t be.

Quantum computers or neural networks may help though.

2

I'm pretty sure that's how we get the Matrix.

0

It’d be a good idea if we were the Borg. Since we aren’t, I’ll take my chances without that computer program, mainly because I have a feeling about how that whole scenario would end.

2

This scenario almost universally ends with the machine deciding that the only way to protect and maintain anything is to eradicate humanity.

0

Wonderful concept. Unfortunately, politicians will never allow it.

Politicians would want to control it and they are not known for putting public interest above their own.

0

Wonderful concept

2

Not a good idea. The unintended consequences alone make the idea untenable.

0

This would be Brave New World, and I do not believe the human race would accept the program.

0

I like the idea of an AI that takes information from everyone's mind and judges values from that info, i.e. it would set laws based on what people would or would not want. It completely removes the idea of privacy, but eh.

3

Nice try, Skynet...

Oh he is watching.......

2

Not impressed. Computer programs are only as good as the programmers. Who decides the morals, values, right and wrong? What if you're identified as non-essential, or as a threat, for your opinions? What about glitches and hackers? Who decides what constitutes world peace? And lastly... who controls the computer?

Betty Level 8 May 9, 2018

It'd be for me, obviously. Next question?

1

@msar0414, Skynet anyone? Yeah, no thanks. Seriously though, I'm not worried about AI taking over the world and determining that Humans need to go, but I would never endorse GIVING such control to software, no matter how "well-intentioned" it might be.

1

I think that's how the movie Terminator started. Lol
