A working knowledge of human cognition and its predictive frailties seems an eminently useful thing. I spend a fair amount of time reading Slate Star Codex and Less Wrong, two blogs dedicated, in part, to rationality. Both make reference to Eliezer Yudkowsky's book Rationality: From AI to Zombies, and I'm curious whether anyone here has read it.
https://intelligence.org/rationality-ai-zombies/
What thoughts do you have on the book, their blogs, or the project of self-improvement via study of human cognition?
https://wiki.lesswrong.com/wiki/Sequences
you can pick through these without reading the whole thing; taken as a whole it's pretty long. you're probably already familiar with most of it, or maybe not
there are a lot of very good points in there, and some neat ideas.
rationality as a concept isn't really anything on its own, though. just read books and learn things. probably spend less time on hubski