A working knowledge of human cognition and its predictive frailties seems an eminently useful thing. I spend a fair amount of time reading Slate Star Codex and Less Wrong, two blogs dedicated, in part, to rationality. They both make reference to Eliezer Yudkowsky's book Rationality: From AI to Zombies, and I'm curious whether anyone here has ever read it.
https://intelligence.org/rationality-ai-zombies/
What thoughts do you have on the book, their blogs, or the project of self-improvement via study of human cognition?
> I'd be concerned for anyone that took anything they read there too seriously, as it seems like a precursor to nasty places like theredpill subreddit.

Uh, have you ever actually read Slate Star Codex? This strikes me as a singularly bizarre interpretation of his writing. What, exactly, led you to this conclusion?
Maybe, and while I think that's something to look at, it's far from definitive. I mean, Odder refers to SSC as "reactionary," despite one of SSC's longest posts being an anti-reactionary FAQ.
> Yudkowsky isn't remotely qualified to write a book on rationality

Not remotely? Then what constitutes qualification in your eyes? To be frank, I'm not picking up anything more than your mere contempt for both of these authors. I'm unconvinced that Yudkowsky's problem is that he hasn't bothered to read some philosophy. That just seems too unspecific to be helpful. And granted, I've read a fraction of the total content of SSC and Less Wrong, but if what Scott Alexander writes typifies "irrationality," then show me your paragon of rationality.
> No, I will not tell you how I did it. Learn to respect the unknown unknowns.

My beef with Yudkowsky is that he's only got a hammer, so all the world's a nail. It's not that he doesn't know other tools exist. It's not that he hasn't seen screwdrivers on the shelves of others. It's that he's a zealous proselytizer of the Church of Hammers, and all other tools are sacrilegious at best and witchcraft at worst. More than that, he uses rhetorical sleight-of-hand to make the casual observer think the screwdriver in his hand is actually a hammer, therefore all tasks can be accomplished with hammers. And should you call him on it, you're a heathen.

And take it from a studied and practiced vomiter of word salad: SSC is a vomit of word salad. If you have a bright enough blizzard of bullshit you can hide false equivalencies, groundless assertions and appeal-to-authority fallacies well enough to confuse the casual reader into believing they're ingesting facts and well-reasoned arguments. This is the basic problem of the whole LessWrong microcosm: "It's not my opinion that women are irrational, it's a scientific fact. Here, let me snow you." "The world is really simple, it's just that everyone else is doing it completely fucking wrong." It all boils down to variations of "things will be different when I'm in charge," and the fact that uncredentialed sci-fi nerds are bloviating about human nature and sociology is A-OK with other uncredentialed sci-fi nerds, but FOR FUCK'S SAKE, this shit? This is NEVER going to help you:

- win an argument
- convince anyone of anything
- figure out what your girlfriend is thinking
- help you understand why poor white men voted Trump
- get you laid
- help you buy a car
- teach you how to poach an egg

...or anything else useful. The phrase "deep research by Wikipedians far and wide" should scare the fucking socks off of you, not convince you you're imbibing from the font of wisdom.
I have a sense that being versed in the predictable biases of human cognition is useful only as a soft or secondary skill. By itself, the knowledge is pretty inert and can even be paralyzing. There's no primary benefit, except to those who have managed to market and bill themselves as some sort of bias-spotter. But it's not useless. I think my exposure to some of this literature has imparted humility in how I conduct myself, both in text and in person. No one wins an argument by pointing and shouting, "You're committing the base rate fallacy, a typical form of extension neglect." But just as I've grown to trust those who are aware of their own limitations -- something I recall you value, i.e. the acoustical engineer lecturer who admitted his ignorance during a conference you attended -- I think humility is security, and can be dead sexy. I can't point to it and say that's why I get laid, but I think it plays a role.

I'm not saying that anyone who's opposed to Yudkowsky et al. is bashing rationality as an ideal good. But I'm at a loss as to who, then, you all read or aspire to emulate (if anyone is even still in that part of their lives). Furthermore, it's a bit unsettling. Apparently Yudkowsky and SSC are quite literally something close to evil, and here I am, unawares, asking if anyone else likes their bible. I'm in denial that I can be this far off the trail.
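Since I name-dropped the base rate fallacy, here's a worked toy example of what I mean -- a minimal sketch in Python with invented numbers, not anything from Yudkowsky's book:

```python
# Base rate fallacy, made concrete with invented numbers:
# a condition affects 1 in 1,000 people, and a test catches 99% of
# true cases but also false-alarms on 5% of healthy people.
prior = 0.001          # P(condition) -- the base rate
sensitivity = 0.99     # P(positive | condition)
false_positive = 0.05  # P(positive | no condition)

# Bayes' theorem: P(condition | positive test)
p_positive = sensitivity * prior + false_positive * (1 - prior)
posterior = sensitivity * prior / p_positive

print(f"P(condition | positive) = {posterior:.3f}")  # ~0.019, not 0.99
```

Neglecting the prior makes a positive test feel like near-certainty when the true odds are about 2%. Knowing that won't win you an argument either, but it's the cleanest case I know of where the bias literature actually cashes out.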
bootzie: I'm with you. I haven't read Yudkowsky, but I have occasionally read interesting stuff on SSC. You say this:

> I have a sense that being versed in the predictable biases of human cognition is useful only as a soft or secondary skill.

Point 1: Being versed in the biases of our own cognition is very, very useful. Our biases lead us to all kinds of dangerous, relationship-destroying assumptions. I don't want to get into a spitting match with kb, but the link that he called "this shit" has some important and useful biases to be aware of. Knowing our biases might not help us win an argument, but it will help us avoid a shitload of arguments that we don't need to get into in the first place.

Point 2: So awareness of cognitive biases is good. It's probably reasonable to assume that everything we think and do is a result of biases learned from our culture and society. Everything needs to be questioned. "Why do we believe what we believe?" What biases lead us to believe it? I've been quietly wondering: what Kool-Aid am I drinking? My definition of Kool-Aid: a sweet sugary drink that will kill you.
> I think humility is security, and can be dead sexy. I can't point to it and say that's why I get laid, but I think it plays a role.

Wow, you are so right about this. Why? Because if you don't assume that you know everything, you have a chance to learn stuff. You can ask your lover what she or he likes - you need humility to even ask.
Thanks, lil. I can't take credit for the insight about the sexiness inherent in humility. We all intuit it, I suspect, but a friend distilled it as: a measure of revealed vulnerability is one of the best things that can happen to a relationship.

My fear in starting this discussion was that I was running the risk of so tame and tepid a starting point as "rationality is good" that no one would bite. Instead I'm feeling like I need to get my head checked, because some of my sources of rationality and self- & outer-inquiry are so far from some of hubski's ideals that I run the danger of literally falling in with bad people for it.
You missed the basic point about "the acoustical engineer lecturer," which is really problematic, as the last time I quoted it, it was legit in a takedown of The Last Psychiatrist:

> The basic point of it is this: trust the people who tell you they don't know.

It's this simple: SSC, Yudkowsky and the whole LessWrong constellation aren't saying "I don't know. Does anybody know?" They're saying "I know beyond any doubt, and anybody who says I don't is the enemy." If you put forth a hypothesis you are inviting scrutiny, criticism and exploration of that hypothesis by others. In this way the hypothesis is tested and, regardless of the outcome, someone's getting an education. The knowledge base of the local ecosystem will increase. If you instead put forth a maxim you are silencing dissent, discouraging investigation and drawing battle lines between schools of thought.

Leo Beranek? Here's this thing I know. Let me explain it. Things I don't know I'm eager to learn. Eliezer Yudkowsky? If I'm talking about it, I know it, and if you disagree, you're wrong.

We've done this before: I know you love the guy so I'll hold back, but this is weapons-grade bullshit. I coulda sworn I've posted SSC before. Apparently I've only ever commented on it once. It's not that I think they have nothing to say - the Yudkowsky box experiment is primal Faustian-level intrigue and I love the idea of it. But they're just so fucking full of themselves and so far beyond the ability to see the limits of their insight that reading their diatribes in a non-critical frame of mind makes you think you've learned a fact when you've learned an opinion. And it's not that they don't defend their opinions. It's that they view their opinions as beyond the need of defense.

That graph you linked? That's taxonomy. That's a naming-of-things, not a knowledge-of-things. Right - no one has ever won an argument by pointing and shouting "base rate fallacy" except on the Internet. The Internet is a place that confounds "argument from authority" and "appeal to authority fallacy" in pretty much every Reddit thread ever. Taxonomy without insight allows choads to think that someone is unqualified to speak about something if they quote their own expertise. Boys can name a hundred different dinosaurs. Doesn't mean they'd fare any better on the plains of the Cretaceous.
This is so offensively wrong it's infuriating me.
SSC often makes highlight posts of comments pulled from the discussion, often containing critical commentary. That, and the regular open threads, strike me as the behavior of someone open to doubt, forming new opinions, revising old ones. He often verbalizes his reticence about stating a point, or about the validity of the data he's using. I'm practically unfamiliar with Yudkowsky, but it seems like he's the main perpetrator of this kind of weaponized certainty. Actually, scratch that. That title definitely goes to TLP. But the thrust of your comment would apply if I had indeed made you think that I value certainty over self-conscious doubt. I strive not to, as comforting as certainty can be sometimes.
It's interesting you mention this, because as I was wrapping up to go populate a rack and terminate 18 runs of CAT5, I had an insight into myself, in particular:

- Why I don't blog
- Why I don't trust bloggers

Let's unpack:

> SSC often makes highlight posts of comments pulled from the discussion, often containing critical commentary. That, and the regular open threads, strike me as the behavior of someone open to doubt, forming new opinions, revising old ones.

Right - the author gets to pick and choose from among those who interact with him, while deleting and censoring those he chooses not to. He has the mic, he chooses who he hands it to, and he turns it on and off at will. That's not free-ranging discussion - that's shaped discussion, and it's a very different thing. Wade into /r/The_Donald and take a look at the "commentary" there. Do you see much dissent? The header of your linked post:

> [Content note: Gender, relationships, feminism, manosphere. Quotes, without endorsing and with quite a bit of mocking, mean arguments by terrible people. Some analogical discussion of fatphobia, poorphobia, Islamophobia. This topic is personally enraging to me and I don't promise I can treat it fairly.]

I comment. I often comment where I'm not welcome, and I often say negative things. Put it this way: I'm just as interested in capital-T Truth as anybody else, I just don't think I have it. And in 40 years of searching, I don't think anybody else does, either. Those who are willing to share their journey are interesting and trustworthy. Those who share their arrival are suspect.

And for me, I'm not interested in publishing "this is what I think about something," despite the fact that people have been asking me to do so for a decade or more. Because really? "This is what I think RIGHT NOW" is closer to the truth. "This is what I think IN RESPONSE TO THAT" is another caveat. And the blogging platform strikes me as fundamentally dishonest: I mean, yeah - I can pick a bunch of comments that disagree with my nuance and answer them as an illustration of my open-mindedness. But nobody wants to read me putting forth a firmly-held notion and then getting into a pissing match with someone else who disagrees. Who learns from that? And who can trust it?

I think bloggers are open to having their nuances discussed so that they can better shape their message for their admirers. I think the pseudo-intellectual "rationalist" sphere is particularly guilty of this - if you disagree, you're irrational and not worthy of debate. However, if you disagree on nuance, on taxonomy, on particulars... well, you're increasing their pagerank, so all aboard.
> SSC often makes highlight posts of comments pulled from the discussion

[COMMENT THREAD CLOSED GO AWAY]
I bought a Squarespace site last year in order to blog with it, and after an intro post I never posted again. Blogging has evolved. I think bloggers of 2004 would be astonished to hear that blogs are now the more permanent features of the internet, and that the place for ephemera and free-flowing discussion is actually forums or social media, where the stakes and standards are lesser. A blog feels like it needs to be more polished and formal, and that friction was enough for me to not use it. In the meantime, I've been all over hubski. Go figure.

It seems like we completely disagree about SSC (which I also want to distinguish from LessWrong). So far I know of only two posts whose comment threads are closed, and it's for reasons of personal comfort. There are hundreds more posts where the comments are completely open. I'm not saying that no editorializing happens as a result of how Scott Alexander decides to interact with or showcase the discussion. But for the platform, he goes above and beyond what I would ever expect or demand of someone else. Furthermore, I don't see him viewing his own opinions as beyond the need of defense. I get the sense repeatedly that he went out, amassed data, and only formed opinions after the fact. I certainly disagree with some of his conclusions and arguments, but I don't see what apparently you, Odder, and a handful of others are convinced of, which is his and Yudkowsky's rank irrationality, reactionaryism, and downright wickedness.
kb also touches upon the subject of blogging, which I'd like to share my thoughts about. Because this is blogging: sharing thoughts. Communication is sharing thoughts; some of them just happen to be based on education the other person doesn't have, academic or personal. Nobody has The Truth, and some people even recognize it - but people still want to share what they know. Some do so in hopes that it will help another human being who hasn't yet arrived at the conclusion they've reached, and who might benefit from it or from the process of its birth in the blogger's mind (see Raptitude by David Cain for an example of such sharing). Nobody can give you The Ladder, but sharing a stair or two is nice, especially when those don't run out.

I won't comment on how Eliezer and his followers treat different topics, in comments or otherwise, because I don't have enough experience with them for such an analysis. I just want to point out that the old man kb isn't holding the cup of truth, either, generalizing about bloggers as if they're a homogeneous bunch. Just like audio engineers and writers, one would have to presume. Meanwhile:

> Redditors tend to overestimate my knowledge about things because I only comment on things I know and understand.

Apparently, this doesn't apply to rationalists or, the more I see it, anyone else whose Internet handle isn't "kleinbl00". Keep in mind: this is from his personal subreddit, where he shares links about himself.
> This is, in essence, the problem with auto-didacts like Yudkowsky and Scott Alexander. They highlight what they believe to be important, and they ignore that which is difficult for them to understand.

I don't think it's possible or preferable to eliminate people sometimes acting irrationally in favor of "total rationality." Not to redraw our rationalists' boundaries, but I don't think being "unemotional" is even the goal. I think a large goal of rationality is awareness: awareness of the factors leading to and contributing to decision making. The outcomes of different judgment calls have real-world consequences, like a doctor evaluating a CT scan for cancer, and those outcomes can be influenced to an astonishing degree by factors that ought to be irrelevant. But not every judgment or decision need be evaluated in such a utility-maximizing way, if only because this evaluative process is tiring and incredibly time-consuming.

But that seems to be what you're reading in Yudkowsky et al., and I don't, so there seems to be an irreconcilable difference. How do you know what they find difficult to understand, and then choose to ignore? I appreciate you bringing this all up. It's what I asked for, and your thoughts are valued.
I value your detailed response, but I feel obliged, despite possibly being on the wrong footing, to reply with a few corrections and, at times, counter-points of my own. I'd like to make them not from the position of supporting LessWrong, SSC and/or any associated media or persons, but from the position of actual rationality, which, I'm sure, you'll adhere to as much as I do.

> For all the supposed fallacies of "ad hominem" and "appeal to authority" <..> directing attacks at the argument maker and trusting experts are both pretty damned reliable.

Ad hominem is only fallacious when it avoids tackling the argument; if the argument is about you and your abilities to do X, then attacking the argument maker is the only possible way to resolve the argument. Appeal to authority is only fallacious when the person in question is not a true expert in the given field (say, asking medical advice of a lawyer); trusting experts is not a fallacy, provided the experts can provide enough credentials. Perhaps it isn't a misunderstanding on your part. Perhaps it is, indeed, the overly zealous followers of rationality that make you think this is how things are with informal fallacies. I don't know. I would simply like to point out that bending the definitions of logical fallacies to suit one's interests is, in itself, an error in rational thinking, thus not belonging anywhere near a rational argument.

> These hyper-rational science fanboys think that the greatest problem with the human race is that we don't sufficiently resemble the Vulcans from Star Trek. Never mind that rationality is at most a minor footnote in philosophy, and that there can be no values, metaphysics, or reason for living with just syllogisms and Bayes' Theorem alone.

I think there's a stark misrepresentation here of what value systems are, in themselves, and what's supposed to arise from adhering to them. One can achieve no values and no reason for living from physics, economics or art alone, either. To damn rationality as a useless subject of study is to prove yourself blind to the complexity of life that we, human beings with higher thinking, came to lead. I never understood it when people criticized a philosophical movement for not being full or perfect, or even for being lacking on one side or another (say, the value of free will, personal responsibility or the following of a higher force). It is as if people truly expect one set of rules to solve their problems, which has never been the case. Most creations, whether physical or ideal, have been a product of influence from previous creations; it's the combination of ideas that led to them being a complete set of traits for a particular purpose. Therefore, one can never rely on a single set of rules, philosophical or otherwise. It's unwise to dedicate yourself to a single cause blindly, without experiencing many. Let alone the fact that under a new stratagem you are acting without the previous borders, meaning that you're now vulnerable to a different set of dangers. To say that rationality alone will save us is dangerous, because no philosophical movement can embrace every aspect of living (yet), and rationality is certainly not one of the most embracing. However, I'd be equally foolish to argue that it has no place in our lives.

There's also a misunderstanding of where rationality is best applied. Purging emotions will make us more productive in pragmatic matters, sure, but it will also strip us of our nature as complex animals, and I don't know about you, Yudkowsky or any of the LessWrong commenters, but I wouldn't want to lose it, if only because I have no guarantee that the other system will make me feel better, which is, ultimately, what we all strive for. Away from this overly zealous kind of thinking lies true rationality - the kind that improves our thinking without stripping us of our nature. It does so by optimizing our decision-making process by removing emotions from the equation. Think of the time you said something you'd come to regret later, and you'll understand what I mean. Now imagine one's emotions affecting bigger things (i.e., things that took a lot of time, effort and/or resources to establish). Applying rational thinking here is a method that allows us to improve our state of being and our feelings about ourselves, not letting volatile emotions compromise the process itself. Getting angry is rational if you're an actor playing an angry character. In most other situations, intentionally getting angry will only cause trouble - therefore, it's irrational.

> there isn't even a "rational" justification for rationality itself

I think I've just made one, as long as we agree that emotions are not something to escape completely.

> Why is a state of affairs where everyone acts according to rationality inherently more desirable than one where people sometimes act irrationally?

I think I've just answered that, as well: to feel better about ourselves, in one way or another (having curiosity satisfied, being proud of achieving something, being relieved at having another's misfortune pass, etc.). Feel free to argue that, of course: I'd be happy to further this discourse.

> Without any guidance from mentors or criticism from peers, they imagine that everyone else is just hung up on unimportant problems that don't matter, and that they, by avoiding outside influence, have managed to discern the truth.

I completely agree with you on this one. Expanding on what I've said above, it's dangerous to blindly adhere to one vision of things or another, because it leaves you blind to other possibilities, other views and, subsequently, the potential for better things to be made (say, by finding a satisfying compromise that benefits all sides). I don't think rationality is a panacea for all the problems human beings have, but I strongly believe that, taken in moderation, it's a great tool for forging our minds into more capable mechanisms of thinking and decision-making.
https://wiki.lesswrong.com/wiki/Sequences

you can pick through these without reading the whole thing; taken as a whole it's pretty long. you're probably already familiar with most of it, or maybe not. there are a lot of very good points in there, and some neat ideas. rationality as a concept isn't really anything, though. just read books and learn things. probably spend less time on hubski
I started it, but didn't find it especially interesting or persuasive. I mean, there's something to the idea of cognitive bias, but I'm wary of anyone who takes a "this one thing is the source of all our ills" approach to ... well, anything. Plus, his book reads like it was written by a 14-year-old who just discovered the Wikipedia article on epistemology and Nizkor's summary of logical fallacies on the same day, and now he has all the answers.

Take for example when he lectures about how the "selling hope" interpretation of lotteries is still bad because it's a "waste of emotional energy." He argues that "maybe" (cue weasel-words note a la Wikipedia) people would instead dream about going back to school or improving themselves in some way. He then adds:

> Their dreaming brains might, in the 20th visualization of the pleasant fantasy, notice a way to really do it. Isn't that what dreams and brains are for?

Life, according to Yudkowsky, is just a matter of maximizing specific bars a la The Sims. If you choose to maximize a different bar, that's a "stupid" or "biased" decision. But he never really makes a compelling case for why his bars are actually better. And at some point, this so-called "rationalist" approach just removes the human portion of life. As the saying goes,

> Life is not a problem to solve, but a reality to experience.

In the alternative, XKCD nicely sums up some of my objections.

But I'm not really sure why Yudkowsky thinks that what he's arguing is in some way different or new, especially on things like ethics. He says at one point that

> So my values are not strictly reducible to happiness: There are properties I value about the future that aren't reducible to activation levels in anyone's pleasure center; properties that are not strictly reducible to subjective states even in principle.

This idea (said less pretentiously) has been part of pretty much every human ethical system since the dawn of time. Side note: I couldn't help but laugh when he says, in the same post, "When I'm deciding where to steer the future, I take into account not only the subjective states that people end up in, but also whether they got there as a result of their own efforts" (emphasis mine).

More generally, what I can glean of his ethical thought seems to go around in circles a lot (but of course, this is the reader's fault), and I'm still not 100% sure what his ethical system actually is. And, of course, his writings on religion (e.g. here) are so laughably wrong-headed that they border on Poe's Law territory. I'm reminded of the opening line from a review of The God Delusion:

> Imagine someone holding forth on biology whose only knowledge of the subject is the Book of British Birds, and you have a rough idea of what it feels like to read Richard Dawkins on theology.

To summarize, my impression is of someone who's trying desperately to systematize the whole of human thought so that (a) he can be correct at will, and (b) he can ignore emotions he doesn't want to deal with.
Those are some great points; thanks for illustrating. I hate to admit it, but my reading comprehension is heavily affected by presentation. If someone I trust prefaces a link with "this stuff is brilliant and changed how I think," I can be reliably taken in by it. And just now, you pointing out to me to look for a bit of sanctimony and assumed-but-unexplained premises primed me well to find them. These predilections might be so obvious as to be mundane. I mean, we all judge books by their covers even though we know better. I'm at the very beginning of Yudkowsky's book, where the author is on his best behavior to woo, so I'm not bored yet. But it's over 1000 pages, and I have a backlog a mile long, so we'll see.
> And just now, you pointing out to me to look for a bit of sanctimony and assumed-but-unexplained premises primed me well to find them.

I think that's pretty much everyone :) I take that approach to anything trying to teach me something. But some of that is my own priming; my background is in history and then law, both of which train you to be a very skeptical and analytical reader. For fiction I don't care, so I don't really worry about it as much.

As you allude to, it's ultimately a question of where to spend your limited hours. If you're getting something out of reading it (even if it's just refining your own thoughts via your disagreements with Yudkowsky), then there's nothing wrong with doing so. And who knows, even a blind squirrel can find nuts, so you may find one or two ideas there that are useful. If someone's right, they're right, even if they come across as self-important and shallow.