This is an older Nautilus article, but man they write well...


    Conversely, people whose facial muscles are immobilized by Botox injections can’t mirror other people’s expressions, and have less empathy. No mechanics, no emotion, it seems.

"Well, that's vaguely terrifying," said the guy who lives on the west side of LA.

    Why, for instance, should we ascribe sadness to a particular piece of music? “There’s nothing intrinsically sad about this music, so how do we extract sadness from that?”

Ooo, ooo, ooo, I got this one. For a project in multivariate calculus back in college I modeled sine chords in 3D space (using papier-mâché - we were old school back then). The more minor the interval, the longer the cycle of repetition. Simple waveforms are harmonic and major. Complex, beating waveforms are disharmonic and minor. We associate pure tones with happiness. Don't ask me why.
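A rough sketch of that observation (my assumption here is just-intonation frequency ratios, not the original papier-mâché model): two sine waves at frequency ratio p:q in lowest terms jointly repeat after q cycles of the lower tone, so the more dissonant the interval, the longer its repetition cycle.

```python
from fractions import Fraction

def repetition_cycles(ratio: Fraction) -> int:
    # Two sines at frequency ratio p/q (lowest terms) line up again
    # after q cycles of the lower tone; Fraction reduces automatically.
    return ratio.denominator

# Just-intonation intervals, roughly ordered from consonant to dissonant.
intervals = {
    "octave (2:1)": Fraction(2, 1),
    "perfect fifth (3:2)": Fraction(3, 2),
    "major third (5:4)": Fraction(5, 4),
    "minor third (6:5)": Fraction(6, 5),
    "minor second (16:15)": Fraction(16, 15),
}

for name, r in intervals.items():
    print(f"{name}: repeats every {repetition_cycles(r)} cycles of the lower tone")
```

The octave repeats every cycle; the minor second takes fifteen, which is the "longer cycle of repetition" the dissonant, minor-sounding intervals have.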

    By other definitions, the trait exists in all sorts of animals, with most people willing to ascribe feelings to creatures that closely resemble humans. So does he believe a computer could be emotional? “As emotional as a crocodile, sure. As emotional as a fish, yes. As emotional as a dog, I can see that.”

Sherry Turkle, based on 30+ years of research, has argued that the way machines model emotion will always be so different, and the mechanisms by which machines use emotion will always be so alien, that it's purest fallacy to pretend that machines will ever be "alive" the way we are. She also points out that because of the way humans are hard-wired to respond to each other, we won't give a shit. If a robot presents about 20% of the cues of human interaction, humans will go the other 80% of the way with nary a regret, even though they know they're interacting with a machine. We're so used to providing context that when ELIZA sits across from us applying semantic rules, we imagine the human on the other end.

Asimo with fur right there. Everybody knows it. But Paro is revolutionizing healthcare in Japan.
