throwaway12  ·  2676 days ago  ·  link  ·    ·  parent  ·  post: Weaponised AI. Davey Winder asks the industry - is that a thing yet?

    it's very significantly different, and astronomically better soon, than the human brain.

We have trouble training AI that can reliably distinguish cats from washing machines, and the resources we devote to learning even simple tasks like that are enormous. The human brain runs thousands of such tasks, all going at once, in real time, on less energy than a lightbulb.

That is, at least, decades away from where we are today. And that's being really optimistic.

    It will be executed from deep knowledge of the human brain derived by human beings, but soon with significantly differences too, including very simple performance improvements like light speed which will make it unbelievably better than poor biological human, with our cranium size and our biologically slow performance speed mucking it up.

You are very confident in something that has yet to happen. Very often, in matters like these, the layman's ideas about what is possible simply don't translate into practice. "Light speed" may sound cool, just like jetpacks and solar roadways and flying cars, but I'm almost certain that when you get down to the nuts and bolts, the humble little chemical signal in the neuron is one of the best ways to build the "arbitrary function models" that we call AI today.

Our brain size is constrained, in part, by resources. It is not enough to be smart; you have to be smart and powerful, smart and observant. Any AI will likely need a ratio of "thinking" to "acting" parts similar to ours if it wants similar success, or it will have to depend on human society to be its body, of sorts.

Remember that the number one constraint on learning is data, not intelligence or raw thinking power. We, society and our minds, are optimized not to think as much as possible, but to collect and filter as much data as possible. Discoveries are made by accident, or with a sudden realization, not because we sat and put endless brainpower into the topic. Not most of the time, anyway.

There's an assumption at the core of the singularity argument: that you can produce knowledge in a vacuum. I don't see that as being very likely.

Exponential growth is easy to see if you cannot also see the things that limit that growth. Life itself should, in theory, be capable of growing forever at exponential rates. If I were an alien who didn't understand overpopulation, I'd fear exactly that result from humanity.
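The difference is easy to show numerically: the same growth rate that runs away when unconstrained flattens out once you add a resource cap. A minimal sketch (the standard logistic model, with a made-up carrying capacity `K` for illustration):

```python
def exponential(p0, r, steps):
    """Unconstrained growth: population increases by rate r every step."""
    p = p0
    for _ in range(steps):
        p += r * p
    return p

def logistic(p0, r, K, steps):
    """Same growth rate r, but damped as the population nears capacity K."""
    p = p0
    for _ in range(steps):
        p += r * p * (1 - p / K)
    return p

# Same starting population and growth rate; only the limit differs.
print(exponential(10, 0.5, 50))        # climbs into the billions
print(logistic(10, 0.5, 1000, 50))     # levels off near K
```

The alien watching only the early steps sees two identical curves; the limiting term only becomes visible as the population approaches the cap.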

    But the AI is going to win the horrific war and it's not even close -- and I think it's very possible in our lifetime.

War is not productive. Any AI, or theoretical all-powerful being, would see that a mass sterilization program, or simply making it infeasible or economically senseless to have kids while making them easy to avoid, would be a far better way to exterminate our little species.

Hell, the being of human creation that came to control us (society) is already doing just that, reining in populations now that we've exited the era in which having more and more humans was productive. Legalized abortion, less demonization of gay people, encouraged birth control, more education, more choice about whether to have kids, higher costs of having them, more women working, and so on. We've already laid the groundwork for our own decline, no AI needed; all the AI has to do is tweak the numbers.

Note: all the things above are good and lead to a better society. They aren't bad, and they shouldn't be opposed.