theadvancedapes  ·  3804 days ago  ·  link  ·    ·  parent  ·  post: Can We Live Forever?

    An interesting question along that line is that of Emergent Intelligence.

I think intelligence is an emergent property of the universe (like many other things e.g., complex chemistry, life, multicellularity, etc.).

    Some AI researchers believe we simply have to create a big enough neural network, and an intelligence will "emerge," similar to how intelligence "emerged" via evolution.

Yes, all dominant theories I'm aware of discuss superintelligence as an emergent property. However, my supervisor, Francis Heylighen, believes that A.I. theorists are mistaken to think artificial general intelligence will emerge from robotics (he wrote about this in a paper called "A brain in a vat cannot break out"). He believes the only way superintelligence can arise is through the collective network we are creating on the Internet.

I'm not sure where I stand in regards to his criticism of A.I. I think we should take the possibility of artificial general intelligence seriously; however, I think in any scenario the A.I. would be dependent on our system and our Internet, and that it would almost certainly be "friendly" because of that: it would be in its best interest to be altruistic toward a system as massive as ours. And in any scenario, the Internet itself will be deeply rooted in humanity's biology (as long as artificial general intelligence arises post-2035 or so). So the emergence of artificial general intelligence would probably just accelerate our own transformation into robots (via brain-interface technology).

But I think that because, as I said to thenewgreen, I believe there is an inherent unity between singularity and global brain theories.

    Others believe emergence is fantasy, and we'll have to actually write a large portion of the intelligence.

Hm. I'm personally skeptical of anyone who thinks that it wouldn't be emergent. Intelligence requires evolution in an environment. I think if we get AGI it will be from evolutionary robotics.
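To make "evolution in an environment" concrete, here's a toy sketch of the kind of loop evolutionary robotics builds on: a population of candidate "genomes" is scored by a fitness function, the best survive, and mutated copies replace the rest. This is purely illustrative (the fitness target and all names are my own invention, not anything from the thread); a real evolutionary-robotics setup would score each genome by running it as a robot controller in a physical or simulated environment.

```python
import random

# Hypothetical toy example: a genome is a list of numbers, and fitness
# rewards genomes close to TARGET. In evolutionary robotics, fitness()
# would instead evaluate the genome as a controller acting in an environment.
TARGET = [0.5, -1.0, 2.0]

def fitness(genome):
    # Negative squared error to the target: higher is better, 0 is perfect.
    return -sum((g - t) ** 2 for g, t in zip(genome, TARGET))

def mutate(genome, sigma=0.1):
    # Add small Gaussian noise to every gene.
    return [g + random.gauss(0, sigma) for g in genome]

def evolve(generations=500, pop_size=20, seed=0):
    random.seed(seed)
    population = [[random.uniform(-2, 2) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]  # selection: keep the top half
        # Refill the population with mutated copies of random parents.
        population = parents + [mutate(random.choice(parents)) for _ in parents]
    return max(population, key=fitness)

best = evolve()
```

The point of the toy is the structure, not the result: no individual is "designed", yet the population reliably climbs toward the objective, which is the sense in which people expect intelligence to "emerge" rather than be written.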

    Minsky has some fantastic papers on intelligence, from the perspective of an AI researcher.

Which ones? I'd love to read them.

rob05c  ·  3804 days ago  ·  link  ·  

mit.edu has a great collection of Minsky's papers.

One of my favorites is Communication with Alien Intelligence, wherein he attempts to demonstrate that it will be possible for us to communicate with any alien intelligence we meet. In doing so, he reveals some very interesting deductions about intelligence itself.

    the only way superintelligence can arise is through the collective network we are creating on the Internet

I'm inclined to believe that intelligence as we understand it requires input. But I'm not convinced the "massive network" is necessary. If natural evolution produced intelligence with only physical sensory input, why couldn't an artificial intelligence be achieved with only software, some motors, and a camera?

    it would almost certainly be "friendly"

I think an artificial intelligence and the environment that produces it will be so complex as to make predictions implausible. I think predicting any aspect of an AI's personality, including hostility, would be as complex as predicting the stock market or hurricanes.

On the tangent of AI personalities, the comic Dresden Codak has a great AI story with an unusual twist. Rather than being benevolent or malevolent, the AI simply has its own interests and humanity becomes redundant. The definitive line of the Mother AI is, "We can give you anything you want, save relevance." The beginning is here; the defining quote is here.

theadvancedapes  ·  3803 days ago  ·  link  ·  

    One of my favorites is Communication with Alien Intelligence, wherein he attempts to demonstrate that it will be possible for us to communicate with any alien intelligence we meet

Oh, that interests me tremendously.

    If natural evolution produced intelligence with only physical sensory input, why couldn't an artificial intelligence be achieved with only software, some motors, and a camera?

I think Heylighen's point is that you would need an entire system of artificial intelligences to evolve together in an environment - you can't just have intelligence "arise in a vat," so to speak. So he is arguing that the natural system-intelligence of the Internet is a better candidate for the emergence of superintelligence than robotics is. Again, I'm quite committed to the perspective that robotics and the Internet are both going to produce higher intelligence levels.

    I think an artificial intelligence and the environment that produces it will be so complex as to make predictions implausible.

Fair. Of course.

    "We can give you anything you want, save relevance."

Wow, one of my favourite quotes now. Thanks. I'd be scared of it if I didn't think that we will be that intelligence.