    Analysis: Google no longer understands how its "deep learning" decision-making computer systems have made themselves so good at recognizing things in photos.

    This means the internet giant may need fewer experts in future as it can instead rely on its semi-autonomous, semi-smart machines to solve problems all on their own.

    The claims were made at the Machine Learning Conference in San Francisco on Friday by Google software engineer Quoc V. Le, in a talk in which he outlined some of the ways the content-slurper is putting "deep learning" systems to work. (You can find out more about machine learning, a computer science research topic, here [PDF].)

(not my title)

kleinbl00:

Naaah. They don't understand machine intelligence.

Take a look at this abstract real quick. What it says, basically, is that machine intelligence has determined that certain colors predominate in child pornography. This means that the machine can use color skew in identifying child porn.

That's like the "shredder" discussion - show a computer a million pictures of paper shredders and it will come up with some statistical truisms about pictures of shredders. They tend to be gray and have buttons on the right, for example. Or "often located next to waste paper bins."

Machine intelligence is awesome for statistical discoveries like the above - "if the background is red, it's 50% more likely to be child porn than if the background is blue." It sucks for things like "generate me an image of child porn" or "show me child porn images with yellow backgrounds."
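That kind of "statistical discovery" is just a conditional-frequency comparison over labeled examples. A minimal sketch, using the shredder example; the feature, the counts, and the function name here are all invented for illustration:

```python
# Toy sketch of a machine-learned "statistical truism": how much more common
# the target class is when some feature is present vs. absent.
# All counts below are invented for illustration.

def likelihood_ratio(hits_with, total_with, hits_without, total_without):
    """Ratio of class frequency with the feature to frequency without it."""
    p_with = hits_with / total_with
    p_without = hits_without / total_without
    return p_with / p_without

# Say 400 of 1000 gray-cased objects in the training photos were shredders,
# but only 100 of 1000 objects without gray casings were.
ratio = likelihood_ratio(400, 1000, 100, 1000)
print(f"gray casing makes 'shredder' {ratio:.0f}x more likely")
```

This is the whole trick, repeated across millions of features: the machine never decides what gray casings *mean*, it only counts how often they co-occur with the label.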

Apply this to the shredder. If you want machine intelligence to not find your shredder, put a Mr. Yuck sticker on it. Attach a feather. Put it under your desk backwards. Machine intelligence excels at projecting trends and sucks at classifying outliers as anything other than "outliers." All the machine can do is go "I think this isn't a shredder because it has a feather on it human please check." And, since most people don't put feathers on their shredders, that's plenty good enough.
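The "human please check" behavior above can be sketched as a thresholded score with an explicit hand-off band in the middle. The feature weights and cutoffs are invented for illustration; real systems learn both from data:

```python
# Toy sketch of trend projection with outlier hand-off: score known features,
# and route anything that doesn't fit the learned trend to a human.
# Weights and thresholds are invented for illustration.

WEIGHTS = {"gray": 0.4, "buttons_on_right": 0.3, "next_to_waste_bin": 0.3}

def classify(features):
    # Unknown features (a feather, a Mr. Yuck sticker) simply contribute
    # nothing to the score - the machine has no statistics for them.
    score = sum(WEIGHTS[f] for f in features if f in WEIGHTS)
    if score >= 0.7:
        return "shredder"
    if score <= 0.3:
        return "not a shredder"
    return "outlier - human please check"

print(classify({"gray", "buttons_on_right", "next_to_waste_bin"}))
print(classify({"gray", "feather"}))
```

Note what the feather does: it doesn't fool the classifier into a wrong answer, it just drops the score into the uncertain band, which is exactly the "I think this isn't a shredder, human please check" outcome.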

There's no cognition in machine intelligence, nothing like "someone is trying to hide their shredder." The Skynet reference in the article is actually already in use by the CIA: IF "walking around in the dark" AND "skulking" AND "in Pakistan" AND "opening car trunks" THEN "send a Reaper by to take a look-see." This beastie surveils an area roughly the size of Ohio every time they put it up - and if you think they aren't running its every visual through machine intelligence, you're delusional.

The downside is that in order to foil stuff like this, you have to not act like an insurgent. High-level targets are never taken out by routine sweeps; they're always cooked off by interdisciplinary ops that start with HUMINT. Machine intelligence is really good at going "I think you should look at this" and terrible at "I've learned something new." Guaranteed - as soon as the "child porn is red" paper came out, everyone you want busted started shooting in blue rooms.
