wasoxygen  ·  2652 days ago  ·  post: Superintelligence: The Idea That Eats Smart People

It's fairly certain that malaria will continue to kill people. It is very likely that your contributions to AMF will reduce this bad outcome, buying some time until a better solution is found. (Previous improvements in our response to polio and smallpox give reasonable hope for such progress.)

Meanwhile, if the risk of AI catastrophe is 1%, then it is 99% certain that resources dedicated to averting that problem will be wasted (disregarding side benefits of the research, which could occur with malaria research as well).
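
To make the arithmetic explicit, here is a toy comparison in Python; the donation size and cost-per-life figures are invented placeholders, not AMF estimates:

    # Toy comparison of two uses of one donation. Every number here is hypothetical.
    donation = 1_000_000                      # dollars

    # Malaria: the outcome is close to certain.
    cost_per_life = 5_000                     # assumed cost to avert one death
    lives_averted = donation / cost_per_life  # ~200, with high confidence

    # AI catastrophe: the spending only addresses a real problem in the 1% branch.
    p_catastrophe = 0.01
    p_wasted = 1 - p_catastrophe              # 0.99, the "99% certain to be wasted" above

    print(lives_averted, p_wasted)            # 200.0 0.99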

There is also some concern that a project like OpenAI could increase the risk of a disaster.

Asteroid impact could render all these problems trivial; it's hard to prioritize giant problems that have tiny probabilities.

I agree that a lot of the essay is not very rigorous, but I think it makes some salient points:

· It is not clear what "hyperintelligence" means, and it is not obvious that anything can be vastly more intelligent than people.

· We are not good at "baking" robust reliability into complex systems; we make gradual improvements through trial and error, and such improvements are easily defeated, often unintentionally.

· The cats and emus demonstrate that superior intelligence does not guarantee the ability to dominate inferiors.

enginerd  ·  2651 days ago

How many existential risks are competing for our attention, brainpower, and funding? Let's brainstorm.

* Asteroid impact

* Solar flare

* Epidemic of an infectious pathogen

* Climate change

* Artificial intelligence

* Nuclear war

That's all I got, and yes, I think we should prepare for all of them.

wasoxygen  ·  2648 days ago

The hard part is deciding what "prepare" means. Money and time devoted to the asteroid threat are money and time denied to pathogens, including malaria.

If the only criteria for spending resources on a problem are 1) it could cause humans to go extinct and 2) we cannot prove that it is impossible, then the list will grow endlessly, with no guarantee that we have thought of everything:

· supervolcano

· grey goo

· nearby supernova/hypernova

· anoxic event

· particle accelerator mishap

· hostile alien invasion

· wrath of a supreme being

It's not easy, but I think we must do some kind of cost-benefit analysis before dedicating significant resources to improbable doomsday scenarios.
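
For what it's worth, the shape of that calculation is simple even when the inputs are not; in this Python sketch every input is a placeholder, and producing defensible values for them is the actual hard part:

    # Crude expected-value comparison: deaths averted per dollar of mitigation.
    def deaths_averted_per_dollar(p_event, deaths_if_event, cost, effectiveness):
        # effectiveness = fraction of the risk the spending actually removes (a guess)
        return p_event * deaths_if_event * effectiveness / cost

    # Hypothetical inputs, only to show the form of the comparison:
    malaria  = deaths_averted_per_dollar(1.0, 400_000, 1e8, 0.5)   # near-certain, ongoing
    asteroid = deaths_averted_per_dollar(1e-7, 7e9, 1e9, 0.5)      # civilization-ending, very rare

    print(f"malaria:  {malaria:.1e}")   # 2.0e-03
    print(f"asteroid: {asteroid:.1e}")  # 3.5e-07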
