kleinbl00  ·  434 days ago  ·  post: AutoGPT

punters used to like to point to Asimov's three laws of robotics and go "look what an excellent set of commands to give future artificial intelligences" without recognizing that Asimov's staggering oeuvre is basically nothing but a smorgasbord of paradoxes prompted by the inherent ambiguities of his three laws.

I'm a really shitty coder. Abstractions are my Achilles heel. I'm really good with mechanical shit though - when you can't abstract it, I can build it in my head no problem. So the "how does it do what it does" with LLMs is abstractly opaque to me but concretely crystal-clear: it's playing Family Feud.

The answers on Family Feud aren't correct, they're popular. It's a game of consensus, not accuracy. So, also, are the answers out of ChatGPT: determining whether an answer is correct or incorrect is not a part of its core programming. You can bend it that way, but only within limits. For example, GPT detectors are more likely to flag non-native speakers as bots than native speakers. That means its training data is looking for the unspoken rules of English. It doesn't need to codify them, it just needs them on its LUT. Can you also code it with Strunk & White? Indubitably. At which point ChatGPT becomes a handy damn engine for turning your learned-in-India pseudo-Queen's English into California slang. That will help talented people get the work they deserve. I'm a fan. But the coding that allows you to go "Siri, how many songs are there on Leonard Cohen's 5th album?" is the same one that allows you to go "Siri, how do I prevent fan death?"
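To make the Family Feud point concrete, here's a toy sketch (my illustration, not how GPT is actually built): a "model" that answers by ranking continuations purely by how often they appear in its training data. Truth never enters the computation; the popular wrong answer beats the correct rare one every time.

```python
from collections import Counter

# Toy "training data": the popular continuation of the prompt is
# wrong, the correct one appears only once.
corpus = [
    ("the capital of Australia is", "Sydney"),    # popular but wrong
    ("the capital of Australia is", "Sydney"),
    ("the capital of Australia is", "Sydney"),
    ("the capital of Australia is", "Canberra"),  # correct but rare
]

def family_feud_answer(prompt, data):
    """Return the most frequent continuation seen after the prompt.

    There is no notion of correctness anywhere in here: it's a
    popularity contest over the training data, like the survey
    board on Family Feud.
    """
    counts = Counter(answer for p, answer in data if p == prompt)
    return counts.most_common(1)[0][0]

print(family_feud_answer("the capital of Australia is", corpus))  # -> Sydney
```

Survey says: "Sydney." The model isn't lying, and it isn't mistaken; "mistaken" is simply not a state it can be in.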

Asimov worked out the Three Laws around 1940 (they first appeared in print in 1942's "Runaround"), and they were already inconsistent with his earlier robot stories. Turing presented the Imitation Game in 1950 thusly:

    I propose to consider the question, "Can machines think?" This should begin with definitions of the meaning of the terms "machine" and "think." The definitions might be framed so as to reflect so far as possible the normal use of the words, but this attitude is dangerous. If the meaning of the words "machine" and "think" are to be found by examining how they are commonly used it is difficult to escape the conclusion that the meaning and the answer to the question, "Can machines think?" is to be sought in a statistical survey such as a Gallup poll. But this is absurd. Instead of attempting such a definition I shall replace the question by another, which is closely related to it and is expressed in relatively unambiguous words.

    The new form of the problem can be described in terms of a game which we call the "imitation game." It is played with three people, a man (A), a woman (B), and an interrogator (C) who may be of either sex. The interrogator stays in a room apart from the other two. The object of the game for the interrogator is to determine which of the other two is the man and which is the woman. He knows them by labels X and Y, and at the end of the game he says either "X is A and Y is B" or "X is B and Y is A." The interrogator is allowed to put questions to A and B thus:

    C: Will X please tell me the length of his or her hair?

Notably: Turing was then nine years out of a broken engagement to a woman he'd told he was gay, and two years away from dating a man, which ended badly for him, as we all know.

    Now suppose X is actually A, then A must answer. It is A's object in the game to try and cause C to make the wrong identification. His answer might therefore be:

    "My hair is shingled, and the longest strands are about nine inches long."

    In order that tones of voice may not help the interrogator the answers should be written, or better still, typewritten. The ideal arrangement is to have a teleprinter communicating between the two rooms. Alternatively the question and answers can be repeated by an intermediary. The object of the game for the third player (B) is to help the interrogator. The best strategy for her is probably to give truthful answers. She can add such things as "I am the woman, don't listen to him!" to her answers, but it will avail nothing as the man can make similar remarks.

    We now ask the question, "What will happen when a machine takes the part of A in this game?" Will the interrogator decide wrongly as often when the game is played like this as he does when the game is played between a man and a woman? These questions replace our original, "Can machines think?"

The basic drive of the Imitation Game is "can an observer determine objective truth without objective observation?" It wasn't "machines will be able to think"; it was "we'll never be able to answer whether machines think because fuckin' hell, we'll never be able to determine who's really a man or a woman."

By the way, Turing saw ChatGPT coming from miles and miles away:

    We also wish to allow the possibility that an engineer or team of engineers may construct a machine which works, but whose manner of operation cannot be satisfactorily described by its constructors because they have applied a method which is largely experimental.

Turing's larger point wasn't "it's a thinking machine when it can fool the observer"; it was "if it walks and quacks like a duck, call it a duck." The stakes for ducks, of course, are lower.