usualgerman  ·  322 days ago  ·  link  ·    ·  parent  ·  post: You Are Not a Parrot

I think a huge problem for the debate on whether robots and AI are conscious in any meaningful sense is less about the capacity of the machine in question and much more about the fact that consciousness is a hard problem in itself. We simply lack a good enough definition of consciousness to devise meaningful tests for it that are grounded in real theory.

The general philosophical definition is that a conscious being has a subjective experience of the world. Or, to quote the common question, “is it like something to be an X?” Does it have an internal thought process, a will, wants, and desires? Does it experience things subjectively? Does the robot experience something like pain when it falls off a platform?

But how do you define pain? An amoeba will react negatively to one stimulus and be attracted to another. If it encounters water too hot for it and moves away, is that a biological equivalent of machine learning, or is it pain? Keep in mind that amoebas have no central nervous system and no brain, just a single cell. To my mind the amoeba could be doing either. It could be doing exactly what the robot does when it falls off a platform: “This event is negative, avoid.” Or it could be experiencing pain.
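A minimal sketch of that “this event is negative, avoid” rule, for comparison; every name and value here is illustrative rather than anything from the post, but it shows how little machinery the robot's version of the behavior actually needs:

    # Minimal "this event is negative, avoid" rule: no pain and no subjective
    # experience, just a record of stimuli that preceded bad outcomes.
    avoided = set()  # stimuli the agent has learned to steer away from

    def experience(stimulus: str, outcome: float) -> None:
        """Record a stimulus; a negative outcome marks it as one to avoid."""
        if outcome < 0:
            avoided.add(stimulus)

    def react(stimulus: str) -> str:
        """Move away from anything previously tagged as negative."""
        return "move away" if stimulus in avoided else "approach"

    experience("too-hot water", -1.0)   # the amoeba's too-hot water
    print(react("too-hot water"))       # -> move away
    print(react("food gradient"))       # -> approach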


kleinbl00  ·  322 days ago  ·  link  ·  

    To my mind

THIS is the huge problem: laypeople presuming that their fuzzy understanding of consciousness is appropriate to the discussion of large language models.

I am not an expert on consciousness. I've read enough experts on consciousness and AI to know that the people who do this shit for a living? See no controversy at all.

usualgerman  ·  321 days ago  ·  link  ·  

It’s called the “hard problem” for a reason. Consciousness and free will are extremely hard to provide good definitions for, and in fact there are good philosophical arguments that we — humans — may not really have either one. Now if we can’t be sure that WE are conscious, that WE have the ability to exercise free will, it’s really not possible to make coherent arguments about whether anyone or anything else does.

It ends up something like arguing about souls, and it’s amazing how often groups of living things we didn’t historically see as “equal to us” were seriously considered to maybe not even have a soul. There were arguments in the pre-Civil War era about whether Black people had souls. We argue in much the same way about animals: are animals “conscious”? Which, TBH, is a stand-in for the discussion people don’t want to have, the one about rights. If you can deny a soul, or its modern equivalent, consciousness, to a being, whether it’s an amoeba, a cow, a robot, an alien, or a human, then you don’t have to give it rights.

My interaction on the topic is fairly shallow. Mostly reading about it, although I’ll admit that science fiction has shaped my thinking as well.

kleinbl00  ·  319 days ago  ·  link  ·  

    To my mind

Philosopher walks into a bar. Says to the bartender "I have a proposal for you. I will stand in the middle of the bar with a full glass of beer. Whenever someone says 'drink' I will take a sip. I will then walk halfway to the wall. I will pay you for the first glass of beer if you pay to keep my glass full."

Bartender says "Oh no you don't, pesky philosopher! For I am an educated man, and I have learned that if you walk halfway towards something with every step, you will only approach it asymptotically and never truly reach it, and I am far too clever to provide you with endless beer!"

An engineer, deep in his cups, says "Charge him for three beers." The bartender and philosopher both turn to him, annoyed, because engineers exist to annoy everyone.

"This room is what, forty feet? He's gonna go about twenty the first time, about ten the second time, about five the third time, about two and a half the fourth time, about a foot and change the fifth time, about six inches the sixth time and by the seventh time he won't be able to raise his glass without touching the wall."

The engineer then raises his hand to quell the objections of the educated. "Yeah yeah yeah, shut up. He's a drunk, he's already wobbly on his feet, he can't stand up straight to within half an inch, and he's got eight sips between the middle of the room and the wall. Unless his sips are truly heroic, he's going to have a hell of a time drinking more than a pint before his nose brushes the wall."

The philosopher changes the subject to the qualia of color and the engineer grumbles "fucking green" into his beer.
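For anyone who wants to check the engineer's arithmetic, here is a quick sketch in Python. It assumes he is counting down from the full forty-foot width of the room rather than from the midpoint, which is how his numbers read:

    # The philosopher covers half of the remaining gap with every sip,
    # so his distance to the wall shrinks geometrically.
    gap_ft = 40.0  # assumed starting distance: "This room is what, forty feet?"

    for sip in range(1, 9):
        gap_ft /= 2.0  # each sip covers half of what's left
        print(f"after sip {sip}: about {gap_ft:.2f} ft ({gap_ft * 12:.1f} in) from the wall")

By the seventh sip the gap is under four inches, which is the engineer's point: never mind the asymptote, he runs out of room after a pint or so.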

__________________________________________________________________________

This is not a philosophical problem. It is an engineering problem. It is an engineering problem by virtue of being proposed, investigated, and executed by engineers. There are metrics. There are boundary conditions. There is data. The way to answer "do machines have souls" is to define "soul," not to take your existing data and use it as an argument to neutralize the humanity of humans.

Am I alive? Yes.

Are you alive? Yes.

Are bugs alive? Yes.

Is software alive? No. Not by any definition we have ever used. Ever. In the history of life on earth.

If your interaction on this topic were not shallow, you'd see there's no controversy at all. None. QED: if your conclusions are different from the experts', your conclusions are wrong.

Devac  ·  319 days ago  ·  link  ·