kleinbl00  ·  3358 days ago  ·  link  ·    ·  parent  ·  post: How long until a robot cries?

    Conversely, people whose facial muscles are immobilized by Botox injections can’t mirror other people’s expressions, and have less empathy. No mechanics, no emotion, it seems.

"Well, that's vaguely terrifying," said the guy who lives on the west side of LA.

    Why, for instance, should we ascribe sadness to a particular piece of music? “There’s nothing intrinsically sad about this music, so how do we extract sadness from that?”

Ooo, ooo, ooo, I got this one. For a project in multivariate calculus back in college I modeled chords built from sine waves in 3D space (using papier-mâché - we were old school back then). The more minor the interval, the longer the repetition cycle of the combined waveform. Simple waveforms are consonant and read as major; complex, beating waveforms are dissonant and read as minor. We associate pure tones with happiness. Don't ask me why.
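
A rough Python sketch of the same idea (just-intonation frequency ratios and pure sine components - nothing as cool as papier-mâché): the period of the summed waveform is set by the denominator of the ratio, so the more dissonant the interval, the longer it takes to repeat.

    from math import gcd

    def repetition_period_ms(num, den, base_freq=220.0):
        # Two sines at base_freq and base_freq * num/den (ratio in lowest terms)
        # sum to a waveform whose fundamental is their common divisor frequency,
        # base_freq / den. Bigger denominator = longer repeat cycle.
        assert gcd(num, den) == 1, "expected the ratio in lowest terms"
        return 1000.0 * den / base_freq

    intervals = {
        "octave (2:1)":         (2, 1),
        "perfect fifth (3:2)":  (3, 2),
        "major third (5:4)":    (5, 4),
        "minor third (6:5)":    (6, 5),
        "minor second (16:15)": (16, 15),
    }
    for name, (num, den) in intervals.items():
        print(f"{name}: repeats every {repetition_period_ms(num, den):.1f} ms")

An octave on A220 repeats every ~4.5 ms; a minor second takes ~68 ms, and in between is where the beating lives.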

    By other definitions, the trait exists in all sorts of animals, with most people willing to ascribe feelings to creatures that closely resemble humans. So does he believe a computer could be emotional? “As emotional as a crocodile, sure. As emotional as a fish, yes. As emotional as a dog, I can see that.”

Sherry Turkle, based on 30+ years of research, has argued that the way machines model emotion will always be so different, and the mechanisms by which machines use emotion will always be so alien, that it's purest fallacy to pretend that machines will ever be "alive" the way we are. She also points out that, because of the way humans are hard-wired to respond to each other, we won't give a shit. In her studies, if a robot presents about 20% of the cues of human interaction, humans will go the other 80% of the way with nary a regret, even though they know they're interacting with a machine. We're so used to providing context that when ELIZA sits across from us applying nothing but simple pattern-matching rules, we imagine a human on the other end.
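
ELIZA's whole act, for reference, is a stack of pattern-matching rules roughly like this (a toy Python sketch, nowhere near Weizenbaum's actual script):

    import re

    # A handful of pattern -> template rules. The point is how little
    # machinery it takes before we start filling in the rest ourselves.
    RULES = [
        (re.compile(r"\bI feel (.+)", re.I),             "Why do you feel {0}?"),
        (re.compile(r"\bI am (.+)", re.I),               "How long have you been {0}?"),
        (re.compile(r"\bmy (mother|father) (.+)", re.I), "Tell me more about your {0}."),
    ]

    def respond(line):
        for pattern, template in RULES:
            m = pattern.search(line)
            if m:
                return template.format(*m.groups())
        return "Please, go on."

    print(respond("I feel like nobody listens to me"))  # Why do you feel like nobody listens to me?
    print(respond("my mother never calls"))             # Tell me more about your mother.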

Paro is basically Asimo with fur. Everybody knows it. But it's revolutionizing healthcare in Japan.

user-inactivated  ·  3358 days ago  ·  link  ·  

Rosalind Picard's group at the Media Lab has been working on figuring out what people are feeling for 20 years, and they've gotten very good at it. You don't need to know what boredom is to recognize bored faces. Kismet proved you don't need to do much to make the right expression in response. Being able to fake it well enough is as good as it gets in AI. I go back and forth on whether faking it well enough is really very different from the real thing. I fake the right facial expression all the time, after all. I'm pretty sure everyone does.
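
That's also basically the shape of the code. A toy sketch (Python with scikit-learn, made-up landmark-style features - not anything from Picard's group): the classifier never holds a concept of boredom, it just learns which regions of feature space humans labeled "bored".

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)

    # Pretend feature vectors - eyelid droop, gaze angle, mouth curvature,
    # head tilt - whatever a face tracker spits out. Purely synthetic here.
    def fake_faces(n, bored):
        base = np.array([0.7, 0.1, -0.2, 0.3]) if bored else np.array([0.2, 0.5, 0.4, 0.0])
        return base + 0.15 * rng.standard_normal((n, 4))

    X = np.vstack([fake_faces(200, bored=True), fake_faces(200, bored=False)])
    y = np.array([1] * 200 + [0] * 200)   # 1 = labeled "bored", 0 = not

    clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

    # No concept of boredom anywhere - just a learned mapping from face
    # features to the labels humans attached.
    print(bool(clf.predict(fake_faces(1, bored=True))[0]))   # almost certainly True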

rezzeJ  ·  3357 days ago  ·  link  ·  

Could it be argued that, with regard to human interaction, faking the right facial expression could still be seen as somewhat genuine? To fake the right expression, one has to have at least partly empathised with the other party. In turn, even though the expression isn't a knee-jerk response, conscious processing of the situation and consideration of the correct response has still taken place.

To provide an analogy: imagine you were asked a question. To answer it you searched Google, deduced the correct response, and gave it to the person who asked. Even though you didn't intrinsically have that knowledge, you still displayed knowledge and tact in relaying the correct response. That answer isn't any less genuine than if you'd known it off the top of your head.

kleinbl00  ·  3357 days ago  ·  link  ·  

This is kind of the crux of the issue. David Levy says yes. Sherry Turkle says no. Levy's argument is "good enough is good enough" while Turkle's argument is "a human response does not imply a human motivation yet that's what we're banking on and it's gonna fuck us, here's a giant ream of data proving so."

To your analogy: you ask me a question because you want my answer, not because you want a Google search. If I have no idea and tell you I have no idea, you have my answer ("I have no idea") and that is a datapoint in our relationship. If I have no idea and tell you I'll look it up, you have a different answer ("I don't know, but I like you enough to put in some research"). If I have no idea, look it up surreptitiously and then tell you, you have still a different answer ("I know everything"). It doesn't take too much extrapolation to see that these three responses cover a whole bunch of relational nuance and provide a wide fan of outcomes... yet from a machine intelligence standpoint, you only get the 2nd... and you're mistaking "IF A THEN B" for "I like you enough to put in some research".

rezzeJ  ·  3357 days ago  ·  link  ·  

Thanks for the books. I really must motivate myself to set aside some time to read instead of mindlessly trawling the web as much as I do. I've succeeded in reading a good few articles over the last few days that I never would have bothered with, so hopefully that's on the up. I've been reading 'Our Mathematical Universe' for too long; I need to finish that.

I agree with your deconstruction of my analogy. I have no counter for now.

user-inactivated  ·  3357 days ago  ·  link  ·  

    You don't need to know what boredom is to recognize bored faces.

I thought the article's example of a robot noticing when someone's throat tightens was very good.

rezzeJ  ·  3357 days ago  ·  link  ·  

    We associate pure tones with happiness. Don't ask me why.

I saw this as a perfect cue to google the problem and provide an answer. But it turns out no-one really knows for sure! I'm currently looking for a topic for my master's thesis, and this is definitely going down on the list of potential selections. Thanks for the inadvertent suggestion.

kleinbl00  ·  3357 days ago  ·  link  ·  

Dude, throw down. Modeling waves is so cool looking. You could even get into the emotional response to sine vs. square vs. saw... I have a gut feeling, utterly unbacked by any scientific knowledge, that transient peaks and low repeatability are "scary" to us because they sound like roaring while sinusoidal harmonics are "soothing" to us because they sound like lullabies. If I hadn't gone into mechanical engineering I might have done the research myself.
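
If you do take it on, the stimulus side is trivial to generate. A quick Python/NumPy sketch of the three classic waveshapes, plus a crude stand-in for "harshness" (share of energy above the fundamental - my own rough proxy, nothing rigorous):

    import numpy as np

    SR = 44100          # sample rate, Hz
    FREQ = 220.0        # A3
    t = np.arange(0, 1.0, 1.0 / SR)
    phase = FREQ * t    # cycles elapsed at each sample

    sine   = np.sin(2 * np.pi * phase)
    square = np.sign(np.sin(2 * np.pi * phase))      # odd harmonics only
    saw    = 2.0 * (phase - np.floor(phase + 0.5))   # every harmonic

    def high_freq_share(signal, fundamental=FREQ, sr=SR):
        # Fraction of the signal's energy sitting above the fundamental.
        spectrum = np.abs(np.fft.rfft(signal)) ** 2
        freqs = np.fft.rfftfreq(len(signal), 1.0 / sr)
        return spectrum[freqs > 1.5 * fundamental].sum() / spectrum.sum()

    for name, sig in [("sine", sine), ("square", square), ("saw", saw)]:
        print(f"{name:6s}: {100 * high_freq_share(sig):.1f}% of energy above the fundamental")

The sine comes out near zero, the square with roughly a fifth of its energy up in the harmonics, the saw with roughly two fifths - which at least lines up with which ones people call buzzy or harsh.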

rezzeJ  ·  3357 days ago  ·  link  ·  

It is now on the list, along with a discussion on the usefulness of the pursuit of originality and some sort of study into inspiration. The master's I'm doing is more practice-based, so the dissertation is only 10,000 words, but that still obviously allows a lot of room for an interesting investigation. Your suggestions are great, thanks. I shall add them to the document!

coffeesp00ns  ·  3357 days ago  ·  link  ·  

To throw another wrench into the "major = happy / minor = sad" idea, there are things like the use of minor keys for other emotions, most famously the solemnity of Christmas in a lot of Baroque works.

You can also look at the modes, and their relations to happy/sad. Some of them sound more major, but sadder, and others more minor, but happier.

Then you could look at how some keys used to have certain emotions/scenarios implied (some still do), and then you could look at how older tuning systems might have influenced that...

Sorry. I get excited about this kind of thing.

rezzeJ  ·  3356 days ago  ·  link  ·  

No need for the apology, I dig the enthusiasm. Noted all these ideas down!

coffeesp00ns  ·  3356 days ago  ·  link  ·  

If you want some cool shit, look into how harpsichords are/used to be tuned. It really emphasizes why people felt certain things about certain keys. Because they were tuned to a "key" and weren't well-tempered, certain keys (like Ab in F) got REAL squirrelly. But even once they got into "well-tempered" tunings, they still used some interesting and weird tunings.
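
Here's a rough Python sketch of how squirrelly, using quarter-comma meantone (just one of the pre-well-tempered tunings) against equal temperament. Eleven fifths get narrowed to keep the major thirds pure, and the twelfth "wolf" fifth eats the leftover:

    import math

    def cents(ratio):
        return 1200 * math.log2(ratio)

    pure_fifth     = 3 / 2          # just intonation, ~702 cents
    equal_fifth    = 2 ** (7 / 12)  # equal temperament, exactly 700 cents
    meantone_fifth = 5 ** (1 / 4)   # quarter-comma meantone, ~696.6 cents

    # Twelve fifths must stack up to seven octaves. With eleven fifths
    # tempered narrow, the last ("wolf") fifth has to absorb the difference.
    wolf_fifth = 2 ** 7 / meantone_fifth ** 11

    for name, r in [("pure fifth (3:2)", pure_fifth),
                    ("equal-tempered fifth", equal_fifth),
                    ("meantone fifth", meantone_fifth),
                    ("meantone wolf fifth", wolf_fifth)]:
        print(f"{name:22s}: {cents(r):.1f} cents")

The wolf lands around 737 cents - roughly 36 cents wide of a pure fifth - which is why keys that ran into it got avoided, or leaned on when someone wanted exactly that sour color.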

kleinbl00  ·  3357 days ago  ·  link  ·  

And that's why I follow you.

demure  ·  3357 days ago  ·  link  ·  

    the way machines model emotion will always be so different

Sure! But if the end goal is to provide an interaction with a robot that is comparable to that of a human, is that at all a problem?

We are past the level of Kismet and ELIZA - maybe someday a robot will cry...