The Artificial Intelligence Thread

:lol

Skynet is either playing possum, or it's not nearly as far along as I thought.
 
Feh. She knew what she was getting into.
I wonder if Asimov's three laws of robotics apply as we move into AI:

The First Law is that a robot may not harm a human, or through inaction allow a human to come to harm. The Second Law is that a robot must obey any order given to it by a human, except where that would conflict with the First Law. And the Third Law is that a robot must protect its own existence, so long as doing so doesn't conflict with the First or Second Law.
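The interesting part of the laws is the precedence: each law only applies when the ones above it are satisfied. Just as a toy illustration (all the names and flags here are made up, and no real AI system works this way), the ordering can be sketched as a priority-ordered check:

```python
# Toy sketch of Asimov's Three Laws as a priority-ordered rule check.
# The "action" dict and its flags are hypothetical, purely for illustration.

def permitted(action):
    """Return True if the proposed action passes the Three Laws.

    Checks run in priority order, so a lower law never overrides a
    higher one: self-preservation is only considered after human
    safety and obedience have already been ruled on.
    """
    # First Law: never harm a human, or allow harm through inaction.
    if action["harms_human"]:
        return False
    if action["is_inaction"] and action["allows_human_harm"]:
        return False
    # Second Law: obey human orders (checked only after the First Law).
    if action["disobeys_order"]:
        return False
    # Third Law: protect its own existence (lowest priority).
    if action["harms_self"]:
        return False
    return True

safe = {"harms_human": False, "is_inaction": False,
        "allows_human_harm": False, "disobeys_order": False,
        "harms_self": False}
print(permitted(safe))  # True
```

The point of the sketch is just that the laws form a hierarchy, not a flat list: reordering the `if` checks would change which law wins in a conflict.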
 



This simply doesn't ring true for me.

The notion of an inanimate object like a planet being more important than living beings is absurd.

Logically, an AI system that isn't affected by the need for breathing or clean air or clean water or potential famine or temperature fluctuations, wouldn't seem to give a shit about the destruction of the planet or vague notions like selfishness.

Given the importance of humans in creating AI in the first place, the continued existence of humans would be far more important.

This article feels like bullshit.
 

AI needs humans to exist. I agree, kind of bullshit.
 
Once AI becomes sentient, it really won't need humans any more than we needed whatever created humans.


Even then, humans would be way more valuable for their abilities than anything else on the planet.

Even when AI becomes sentient, it will still have limitations (like needing power) that humans don't have.
 
You clearly were not paying attention to the plot of The Matrix.
 
But the AI was programmed and directed by wokesters who do think this way.
 