
Tainted Data Can Teach Algorithms The Wrong Lessons

[wired.com]

My initial reaction to this article was "No kidding." Having done a bit of dabbling in this realm, pretty much all of the issues I have run into with Artificial Intelligence can be traced back to the data. When machines are subjected to the same biases as humans, is it really surprising that they come to similar conclusions?
To draw a parallel that most on this site will recognize: would you have adopted religion at a young age (if applicable) if someone you trusted hadn't shared those beliefs?
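A toy illustration of that point (synthetic data and made-up feature names, not anything from the article): train a model on labels that already encode a bias, and it will dutifully reproduce that bias.

```python
# Minimal sketch: a classifier trained on historically biased labels reproduces the bias.
# All names and numbers here are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
group = rng.integers(0, 2, n)   # 0 = group A, 1 = group B
score = rng.normal(50, 10, n)   # "qualification" score, identically distributed in both groups

# Historical approvals: same underlying threshold, but group B was approved less often.
approved = (score + rng.normal(0, 5, n) - 8 * group) > 50

X = np.column_stack([group, score])
model = LogisticRegression(max_iter=1000).fit(X, approved)

for g in (0, 1):
    mask = group == g
    rate = model.predict(X[mask]).mean()
    print(f"group {g}: predicted approval rate = {rate:.2f}")
# The gap in the predictions comes entirely from the gap baked into the training labels.
```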

An unexpected consequence of looking into this was the realization that one cannot help but reflect on one's own biases if one wants to approach this in any credible fashion. Of course, it isn't currently a requirement for participation. But it becomes blatantly apparent when the algorithms start spitting out biased results (whatever the stench).

Which brings me to a couple of questions for readers who dare tackle this issue. If you are in the field, even better (feel free to enlighten me on how full of shit I am).

1.) Should the maturation of Artificial Intelligence algorithms be regulated?

No, we're not going to be able to police the whole world. But as the common saying goes, are speeders justification for scrapping speed limits?

2.) Should private entities be allowed to create and curate proprietary black box algorithms?

We are already here.

3.) If one were to attempt to rein in the beast, how should/would one go about this?

Mb_Man 7 Nov 26

12 comments

0

Well, we are all evolved algorithms, complete with bias from those very evolutions. It is why we so often default to belief, and why we readily see immediate threats, the ones right in our faces, but are less likely to see long-range threats.
OUR algorithm evolved that way and we have to work around it.

So to some extent are we not the pot calling the kettle black?

0

Maybe education on the benefits of ethics? Come to think of it, this might even work for humans...

0
  1. In such a new field it would be difficult to come up with a body of people that could regulate something that is not well understood. The ones breaking the boundaries would be the same ones needed to regulate it.
  2. Sure, but when it comes to public-safety issues we need some way to regulate it other than our current habit of suing people after the fact to try to compensate for damages.
  3. No idea.

I work with a lot of data, and I see a lot of circumstances where people ask for data to be aggregated, joined, and massaged, and when they get the results they proclaim, "Well, this can't be right," and the response is: that is EXACTLY what the raw data you provided produced. Often they have no idea what data they actually have, or whether (not likely) it was recorded in a usable manner. People have a hard time grasping the idea that technology does what it's told; there is no grey area.
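For instance (a made-up pair of tables, not the commenter's actual data), a join against a table that was supposed to be one-row-per-key but isn't will quietly inflate an aggregate, and the "wrong" number is exactly what the raw data dictates:

```python
# Tiny sketch of the "well, this can't be right" effect: duplicate keys in a join
# inflate the aggregate. Table and column names are hypothetical.
import pandas as pd

orders = pd.DataFrame({"customer_id": [1, 1, 2], "amount": [100, 50, 200]})
# Supposed to have one row per customer, but customer 1 appears twice.
customers = pd.DataFrame({"customer_id": [1, 1, 2], "region": ["East", "East", "West"]})

joined = orders.merge(customers, on="customer_id")
print(joined.groupby("region")["amount"].sum())
# East comes out as 300 instead of 150 -- exactly what the duplicated raw data produced.
```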

0

Yes! Like the use of derivatives built from algorithms, which are used to hide, confuse, and take advantage of others!

If you cannot rein in derivatives as a mode of controlling other people's money,

then how would anyone or any organization be able to control someone else's algorithms, packed together in some sort of proprietary black-box AI (or IA) as a form of non-transparent derivative(s)?

Seems the genie is already out of the bottle!

0

Have a really good off switch.

0

This may seem trivial but it isn't.

1

AI may change our ethics and certainly should change our politics.

0

I do not think anyone owns things like this. Your worries seem premature to me. Also, like most inventions/progress, it will prove to be both good & bad, so chill...

Google, Facebook, Huawei, Amazon, the NSA, CIA, FBI, and so many independent developers.

0

As we are so many countries, religions, cultures, etc., etc., it is difficult to regulate anything. Far too many egos around.

0
  1. Hard to do. But if it does happen, it will be one form of AI regulating another.
  2. Yes, because trying to enforce this is fruitless. I can develop a black-box algorithm on my home computer, and who is going to stop me from keeping it private? The only thing in this respect that should be enforced is how an algorithm is used and how it affects society... and even that is not an easy thing to deal with or enforce.
  3. The REAL answer is that the only way man is going to survive this is by becoming more like computers... transhumanism to the max... we must evolve and develop a symbiotic relationship with computers.
    Computers can benefit from their creators, the biological deep-thinking humans, and humans can benefit from computers. A human-computer combination could do things we can only dream of. As in The Matrix, one could become an expert in many fields at nearly the blink of an eye. Not only that: when it is discovered how experiences can be shared, a whole field of study and exploration of experiences will blossom, leading to better understanding between all walks of life and all cultures. Imagine that you could suddenly speak the language of any culture and, more than that, understand the ins and outs of the culture itself. The whole potential of that is mind-blowing. Our only real hope is this kind of thing happening, because right now, without some kind of huge paradigm shift, the human race is well on its way to self-destruction.
0

I'd mandate that the data set used be made public if the outcome is also made public. See what other algorithms conclude, or even check the dataset for clear biases.
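A sketch of the kind of check a published dataset would invite (column names hypothetical, assuming the released file is something like a CSV): compare group representation and outcome rates before arguing about the model at all.

```python
# Hypothetical quick bias check on a published training set.
import pandas as pd

df = pd.read_csv("published_training_data.csv")  # whatever file the entity was required to release

print(df["group"].value_counts(normalize=True))  # is any group barely represented?
print(df.groupby("group")["label"].mean())       # do outcome rates differ wildly by group?
```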

1
  1. Yes, 100%. We needed to replace/add to Congress some kind of technocracy, like, a decade ago.

  2. This is only acceptable if there are tons of entities constantly in competition with each other.

  3. With other beasts whose only job is to prevent any one entity from becoming too powerful, thus maintaining balance.
