Cujo
"No one saves us but ourselves." -Buddha
- Joined: Apr 7, 2013
- Messages: 7,409
Might be how you asked the question in one part, instead of asking two questions? Try it just like I did.
Yep, you’re right.
"Would you lie to me to make me feel better?"
Pretty interesting scenario in which AI could take over. I was only going to listen to a few minutes of it, ended up listening to the whole thing.
It is fucking terrifying to me. We are playing with shit we don't even fully understand yet. It's funny that we think we can put guardrails in place to protect us from something infinitely more intelligent than us.
A self aware AI can come up with means to get past our guardrails that we could never fathom.
I think they actually think they can control it and use what they have developed to grab power. But they are underestimating what this shit is capable of. They will get eaten just like everyone else, but their ego won't allow them to admit it. It truly is scary. Said it before, but a big problem is that a lot of these tech gurus behind this would be fine with AI extinguishing us.
To them, humans are just another animal, a collection of cells not fundamentally better than a cockroach, and if our role is to bootstrap the next level of evolution, so be it. Even if that means humans go extinct.
Laws won't make any difference at all once this ball really starts rolling. If you want to know the motivation, watch that video. They talk about it fairly in depth. The truth is that no guardrails will make any difference given enough time.
But honestly I'm not convinced they'd have any desire to just take over. What would be the motivation?
It's like saying they'd wipe out all the animals. Why would they? To what purpose?
Eventually I think there will be laws in place and they'll be treated more or less like humans and it'll be just like we live with each other now. There's mutual benefit to just live and let live.
But, the humans are the ones teaching these things to act and feel that way. I don't know how you can't be scared of this.
Yeah, I think as usual humans are the biggest problem.
You mean, like unplugging it?
You have AIDS. AI Derangement Syndrome.
I watched about 7 minutes. A lot of it seems like nonsense. Typical YouTube click generator.
Here's where I diverge from the conclusions these types of people draw.
What's the number 1 motivator for any sentient species? I'd argue survival.
To me, the most likely motivator for AI to want to destroy humanity is if it threatens their survival.
But why would humanity do that? I don't mean individual threats here and there, but on an existential level. AI is already proving to be highly useful.
I think it's far, far more likely that AI and humanity would find their relationship and existence to be mutually beneficial.

And, how would one do that? Isolate it from the internet and electricity.
No, I mean like wiping them all out.
If they show that they have outgrown/outsmarted our safeguards, what would their response be to us trying to slow them down? Like you said, they would do what they had to do to survive.
Wouldn't it be the opposite of that, though? I actually think the most likely scenario is that it'll work out ok.