A working knowledge of human cognition and its predictive frailties seems an eminently useful thing. I spend a fair amount of time reading Slate Star Codex and Less Wrong, two blogs dedicated, in part, to rationality. Both make reference to Eliezer Yudkowsky's book Rationality: From AI to Zombies, and I'm curious whether anyone here has read it.

https://intelligence.org/rationality-ai-zombies/

What thoughts do you have on the book, their blogs, or the project of self-improvement via the study of human cognition?


Odder:

I don't plan to. Yudkowsky isn't remotely qualified to write a book on rationality; he knows nothing about philosophy, computer science, or cognition. He's just a sci-fi nerd with delusions of grandeur and no formal training in anything, and no serious philosopher, computer scientist, or psychologist takes his work seriously.

I have a very negative view of both LessWrong and Slate Star Codex. LessWrong is Yudkowsky's blog, of course, and he would have done better to read some philosophy before trying to teach it, instead of just deciding that Bayes' theorem was the answer to everything. Slate Star Codex strikes me as more irrational and reactionary than rationalist, mistaking fear, paranoia, and lack of empathy for "cold, hard logic." I'd be concerned for anyone who took what they read there too seriously; it seems like a gateway to nasty places like the TheRedPill subreddit.


posted by blackbootz: 709 days ago