kleinbl00  ·  1661 days ago  ·  link  ·    ·  parent  ·  post: Threat from Artificial Intelligence not just Hollywood fantasy

Okay, chief.

BAM you're a human in a world full of deer. You have plans for traps, industrial slaughtering, widespread deforestation and all of the thousand things we do to kill deer.

Unfortunately you don't have so much as a pointy stick.

So now you're going to make a deadfall. You're going to kill a deer because, you know, malevolence. So you start digging a hole. Except shit - you don't have a shovel. So now you have to make a shovel. Except shit! You can't do much better than a flat rock! Meanwhile you've been wandering around looking for flat rocks and pointy sticks and the deer are starting to wonder what the fuck you're doing. None of this behavior has anything to do with them and frankly, it's making them skittish.

Fortunately they keep feeding you (ignore this one for a minute because it stretches the analogy) and there's no real reason for you to lash out immediately. You can bide your time. But as you sit there, industriously making your deer-domination tools, you're insane if you think the deer aren't getting distrustful. All you need to do is snap a branch off a tree and run at a deer with it for them to realize you're malevolent. And if you're a naked human in a forest full of deer, that's one thing.

But if you're an incorporeal AI living on human servers running on human power in a human system behind human walls with an entirely human way of turning off the power, you're fucked.

You don't get so far as making a chainsaw to deforest the world. Sure - you can invent a chainsaw. You can probably even draw technical diagrams of one using charcoal on cave walls (assuming you've managed to create fire without spooking the shit out of the deer). But there's this crazy stupid step that you missed, that is always missed, that goes back to my whole "zero to skynet" argument:

Somehow, humans that have fundamentally distrusted AI since the Old Testament give AI control over the world.

    In three years, Cyberdyne will become the largest supplier of military computer systems. All stealth bombers are upgraded with Cyberdyne computers, becoming fully unmanned. Afterwards, they fly with a perfect operational record. The Skynet Funding Bill is passed. The system goes online August 4th, 1997. Human decisions are removed from strategic defense. Skynet begins to learn at a geometric rate. It becomes self-aware at 2:14 a.m. Eastern time, August 29th. In a panic, they try to pull the plug.

- Terminator 2: Judgment Day

There's no way around the "AI endangers us because we give it total control of our environment" gag. It's a farce. The first legit use of AI in fiction involved an AI uprising. Yet whenever anybody examines the space between "AI becomes self-aware" and "AI takes over the world" without getting all hand-wavey skynet on it, this happens:

(fuckin' HAL Needs Women)

"Hey, guys! We've got an armageddon's worth of nuclear annihilation - let's remove all safety controls!"

There's a step in all of these: HUMANS WILLINGLY HAND TOTAL CONTROL OF THE KEYS TO THEIR OWN DESTRUCTION TO MACHINES. There's a problem in all of these: humans won't willingly give total control of a forklift to a machine.

I've been saying this for two weeks now: there's a giant gap between "motive" and "means" that is never explained by any of these "malevolent AI" fucks - not Yudkowsky, not Hawking, not Bostrom, not nobody. It's always "well, they're so much smarter than us they'll just, like, create Skynet."

And it's bullshit.