http://www.bbc.co.uk/news/technology-19354994
Just read this article, and it seems we're getting closer and closer to the possibility of true AIs.
As a human race, I think we're incapable of developing these in a controlled environment, since a well-designed AI is expected to be capable of improving itself. Being technology at its finest, the risk of it surpassing all the constraints of human intelligence is very real.
Then what happens when an AI actually becomes smarter than a human? It would be impossible for any human to fully understand, because it would just keep improving itself as much as it possibly can, which could make it stupidly hard to predict, maybe grasping the laws of logic far better than any human ever could.
What does everyone else think about this? I hope I'm not the only one who gives a shit!