Will AI wipe out humanity?

I mean, it's pretty funny really. All the GPUs and all the algorithms in the world could never show me anything as true, or create a single second as beautiful, as what I saw on Thursday: just watching the leaves of a particular tree in the light, watching the light dance in the various ripples and flows and layers of our local stream. Everything we build is so clumsy and ugly compared to nature that it's just hilarious. A joke. I hope more people start to get it, and get the random lifetime ambition to plant a tree or six.

The implants in people's heads will be able to create all those visuals so they seem perfectly real.

Implants are coming; it's already been normalised to have these phones attached to your hand. The next obvious step will be things being implanted. It'll 100% happen at some point. Younger generations won't take much convincing, and everyone else dies out eventually.
 

You've been watching too much Black Mirror.
 
I think we're quite a long way away yet. Even today's AI such as ChatGPT isn't actually AI; it's just an algorithm trained on internet text to formulate answers to questions. It cannot come up with new ideas or do anything that separates it from a very advanced version of Google.

They say the AI singularity is tied to processing power, but I don't think that's really relevant. We're getting close to brain-level hardware and processing power, but the software is a million miles off. Whilst computers are tied to 0s and 1s, we'll never see a true intelligence that isn't reliant on, or based upon, human programming.

That doesn't mean we can't get AI and robots to do everything we need. Do we really need an intelligent AI, or do we actually just need far more powerful automation? I'd say the latter is best and the former is dangerous.
 

That's such a wrong answer that I'm guessing you know very little about AI.
 

But aren’t humans just a complex array of zeroes and ones?

I don’t see the functional difference between 80 billion neurones that are either on or off, and the 80 billion transistors in a modern GPU that are either on or off.

There are only two real differences:
1) the inputs
2) the software/wiring
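
The neurone/transistor comparison above is roughly the classic McCulloch-Pitts picture: a unit that sums weighted inputs and switches on or off at a threshold. A minimal sketch in Python (the weights and threshold here are made up purely for illustration):

```python
# A McCulloch-Pitts style threshold unit: weighted inputs, binary output.
# The weights and threshold below are arbitrary illustrative choices.

def neuron(inputs, weights, threshold):
    """Fire (1) if the weighted sum of the inputs reaches the threshold."""
    activation = sum(i * w for i, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# Wired like this the unit behaves as a 2-input AND gate, which is
# exactly the kind of thing a handful of transistors computes too.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", neuron([a, b], weights=[1, 1], threshold=2))
```

Whether biological neurones really reduce to this is contested, but it is the abstraction the "same zeroes and ones" argument leans on.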

At the moment our best AI generally handles a single mode (text/images/sound), but multi-modal general AI that can take in inputs from several media is being built as we speak.

Our brain’s software has had about 4 billion years of head start on being optimised and trained through evolution. It feels like it’s only a matter of time until our AI neural networks catch up.
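
"Optimised and trained" maps onto something concrete, too: an update loop that nudges parameters to reduce error, which is what evolution does slowly and gradient descent does quickly. A toy sketch (the data, learning rate and step count are invented for the example):

```python
# Gradient descent in miniature: repeatedly nudge one weight to cut error.
# The data, learning rate and step count are arbitrary illustrative choices.

data = [(x, 2 * x) for x in range(1, 6)]  # pretend the "true" rule is y = 2x
w = 0.0                                   # start off knowing nothing
lr = 0.01                                 # learning rate

for step in range(200):
    # Gradient of the mean squared error of y_hat = w * x with respect to w.
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad

print(f"learned w = {w:.4f}")  # converges towards 2.0
```

Scale that single weight up to billions and you have, in caricature, how these networks get their training.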

I think people who think there is something inherently non-replicable about the way humans do things are in for a nasty shock.

Even during the recent wave of LLM development, papers have been released about how some formal logical principles seem to weirdly pop out of the mechanics of language itself. As in, our brains logically assessing a situation show little difference from these AI prediction engines, which are just coming up with the most likely next word in a chain of reasoning. My hunch is that we are not as special as we think we are.
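
The "most likely next word" mechanic is easy to demystify at toy scale. Here is a made-up bigram predictor; real LLMs do the same prediction step, just with a learned neural network conditioning on a vastly longer context (the corpus below is invented):

```python
from collections import Counter, defaultdict

# A toy next-word predictor: count which word follows which in a corpus,
# then always emit the most common continuation.
corpus = "if it rains the ground gets wet so if the ground gets wet".split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def most_likely_next(word):
    return follows[word].most_common(1)[0][0]

word, chain = "if", ["if"]
for _ in range(5):
    word = most_likely_next(word)
    chain.append(word)
print(" ".join(chain))  # "if it rains the ground gets"
```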
 
Does it matter?

This question has made me realise I really don’t give a toss about the long term (200+ years in the future) prospects of humanity.

I care about the world around me and the people in it today. I care about my children and their children and probably their children too, but I don’t think I can bring myself to care about 20 generations into the future any more than I care about relatives 20 generations back dying of bubonic plague etc.


Obviously I’d like us to avoid this if possible and not embark on a deliberate attempt to end the species. I think we have a moral responsibility to look after the planet we’re on and not dig a hole for future generations, but god knows much of the planet would be much better off without us.

No one grieves for the Neanderthal. We are just another link in the evolutionary chain; eventually we will be replaced, like all species before us.
 
I didn’t know Martin Keown had died, RIP.
 
