- Rationalist groups tend to be functional to the extent that their activities involve learning to program, writing papers for general publication, building a giant dome on the playa, or otherwise interacting with the real world. Rationalist groups tend to be dysfunctional to the extent that their activities involve very long conversations about human psychology and social dynamics, especially dynamics within the group itself. Relatedly, the clearest-cut cases of rationalists being right have involved external events in the world and not the nature of human beings.
Fucking lol how did I miss this

So let's start here: Eliezer Yudkowsky is a homeschooled kid who wrote 660,000 words to prove just how much worse Harry Potter could be so he could surround himself with other homeschooled weirdos who think the rest of the world is doing it wrong. Who could predict what happened next?

The Coen brothers couldn't write that. It's too incisive for Monty Python. Neal Stephenson would consider it too nerdy to bother with, and Nick Harkaway wouldn't bother with antagonists that were such pussies. The rationalists defy parody. A logical outcome when you think humanity is, on balance, wrong and humanity, on balance, thinks you're assholes.

Minus the rapey-stabby bits. For the record, Aleister Crowley's Thelema cult turned out to involve zero murder sprees. Plenty of people had a great time living under Rajneesh. And I am unaware of any other philosophical movements that directly led to the embezzlement of $50b.

This is, not to put too fine a point on it, not true. It also applies squarely to Scientology, which is absolutely positively 100% a cult.

That's because his entire life is online, so that's where his cult lives. Which is why he banned discussion or mention of Roko's Basilisk for five years.

Similar things have been said about Spahn Ranch.

Not AT ALL cult-like.

What makes a not-cult is a mission to improve the lives of the vulnerable. What makes a cult is a practice of exploiting the vulnerable through in-group jargon and gnostic practices. "Religion" would like a word. The friends of Bill W would like a word.

Citation needed.

Fuckin' NOT AT ALL A CULT GUYZ.

Queso this shit all started on LessWrong, from which the Zizians, who were all about the future outweighing the present, and the effective altruists, who were all about the future outweighing the present, both sprung, but EA HAS NOTHING TO DO WITH ZIZ.

Stories like this abound around Jim Jones.

So if you suck at it, you aren't a cult?

NXIVM called, they want their guru back.

The Apocalypse cometh and that right soon has NEVER figured into cults before. It's enough to make you move to a commune in Guyana.

Reader, it does not.

"I can quit whenever I want."

Fuckin' motive doesn't matter? AYFKM.

10/10 no notes. Out of a cannon, into the sun.

The rationalist community was drawn together by AI researcher Eliezer Yudkowsky's blog post series The Sequences, a set of essays about how to think more rationally.
One is Black Lotus, a Burning Man camp led by alleged rapist Brent Dill, which developed a metaphysical system based on the tabletop roleplaying game Mage: The Ascension.
I myself am a rationalist, and the rationalist community is closely knit.
The rationalist community as a whole is remarkably functional.
The Sequences make certain implicit promises. There is an art of thinking better, and we’ve figured it out. If you learn it, you can solve all your problems, become brilliant and hardworking and successful and happy, and be one of the small elite shaping not only society but the entire future of humanity.
Multiple interviewees remarked that the Sequences create the raw material for a cult. To his credit, their author, Eliezer Yudkowsky, shows little interest in running one.
He surrounds himself with people who disagree with him,
However, if Brent hadn’t been there, Black Lotus would have been fine. One interviewee said that, when Brent wasn’t there, Black Lotus led to beautiful peak experiences that he still cherishes: “Brent surrounded himself with people who built the thing he yearned for, missed, and couldn’t have.”
Worse, the promise of the Sequences is more appealing to people who have very serious life problems they desperately need to solve.
People in vulnerable positions are both more likely to wind up mistreated and less likely to be able to leave. Elizabeth Van Nostrand, who knows many members of dysfunctional groups both rationalist and non-rationalist, said, “I know people who've had very good experiences in organizations where other people had very bad ones. Sometimes different people come out of the same group with very different experiences, and one of the major differences is whether they feel secure enough to push back or leave if they need to. There isn't a substitute for a good BATNA.”1
“I was totally coming out of a super depressive and dysfunctional phase in my life, and this was a big upswing in my mood and ability to do things. We were doing something really important. In retrospect, I feel like this is the sort of thing you can't do forever. You burn out on it eventually.”
One interviewee observed that the early rationalist community had been more supportive of less functional rationalists, perhaps because it was smaller. While it wasn’t capable of transforming them into a superhumanly rational elite (no one can do that), it helped them learn useful skills and become independent. This interviewee said that, once the early rationalists became functional, they pulled the ladder up behind them. They (understandably) only wanted to hang out with people who already had their shit together.
When discussing dysfunctional or abusive groups, many academics treat their beliefs as secondary.
It’s difficult to understand the internal dynamics of the Zizians. They don’t have former members, and members tend to isolate themselves from their former friends. So anything I say about them is inherently speculative.
Jessica Taylor, an AI researcher who knew both Zizians and participants in Leverage Research, put it bluntly. “There’s this belief [among rationalists],” she said, “that society has these really bad behaviors, like developing self-improving AI, or that mainstream epistemology is really bad–not just religion, but also normal ‘trust-the-experts’ science. That can lead to the idea that we should figure it out ourselves. And what can show up is that some people aren't actually smart enough to form very good conclusions once they start thinking for themselves.”

Or, as Jessica Taylor said, “They do outsource their thinking to others, but not to the typical authorities.”
In and of itself, that dynamic is bad but not necessarily seriously so. Many effective altruists–members of a community closely linked to the rationality community–similarly defer to more experienced effective altruists. While effective altruists have widely critiqued this habit, it results only in poorly thought out viewpoints about charity evaluation, not in violent crime.
One Black Lotus member wanted to emphasize to me that Black Lotus wasn’t all bad.
One interviewee said, “One kind of cult you can have is when you and ten of your closest friends all live in a house together and you have the blackout curtains drawn and a lot of MDMA, and you sit around and talk about the implications of the whatever.” The rationalist community keeps spawning groups like this. Most are nothing but a (possibly fun) waste of time. But when the conversations become fraught and obsessively inward-facing, it can spawn Leverage Research, or Black Lotus, or the Zizians.
Brent Dill convinced some people that he was an extraordinary genius who would be capable of fantastic achievements, just as soon as he stopped being depressed.
Several interviewees noted the particular risks of reasoning in which enormous long-term benefits are used to justify overriding present harms. “The issue is that ‘something, something dead babies’ justifies an awful lot,” said one interviewee. The long-term benefit that rationalists tend to be most worried about is AI. Many rationalists believe that an artificial general intelligence (AGI) will be developed very soon: for example, a mostly-rationalist team wrote the forecast AI 2027. Many of them also expect that, without heroic effort, AGI development will lead to human extinction.
The overwhelming stakes of AGI can lead to a dangerous sense of grandiosity. “It’s a story in which they matter and in which it is justified for them to do weird stuff and stand up for themselves,” said an interviewee familiar with the Zizians. “Every action has great meaning, and that hooks into people in two ways. One of which is that it's empowering, and the other of which is that it's a great trigger for becoming obsessed with whether you're a bad person.”
Early rationalist writing, such as the Sequences and the Harry Potter fanfiction Harry Potter and the Methods of Rationality, emphasized the lone hero, standing defiantly against an uncaring world. But the actual process of saving the world is not very glamorous. It involves filling out paperwork, making small tweaks to code, running A/B tests on Twitter posts.
When asked how people could tell that their project wasn’t a cult, one interviewee said, “You leave the house regularly. Or if there's an office, you leave both the office and the house regularly.”
One of my interviewees speculated that rationalists aren’t actually any more dysfunctional than anyone else; we’re just more interestingly dysfunctional. Dysfunctional workplaces, rape, abuse, and even murder aren’t unusual. People are more interested in rape or murder when it has a complicated and unusual philosophical justification, but we shouldn’t confuse being more fun to talk about with being more common.
Ozy Brennan is an animal welfare researcher and science fiction author. They blog at Thing of Things.
Glad to see you commenting on this, because pretty much everything I know about these guys started with you posting about the Zizians a while back. I thought it was a great read, as is your response. Where should I go to find out more about rationalist groups summoning demons (and how much were they influenced by the Laundry Files)?
So there was a time when someone at the Daily Dot wanted me to explain how and why, exactly, Violentacrez got doxed to Gawker. I gave them a fifteen-thousand-word hyperlinked charlie-at-the-board.jpg oral history of the tedious little ingroups of Reddit when it basically came down to "one of my friends, who is Facebook friends with one of my enemies, doxed him for the lulz." It was hyper-important to me, and I'd edited that document four times to get it shorter, but I could not boil it down to something interesting to external observers because, fundamentally? It wasn't.

Book four of The Story of Civilization is entitled "The Age of Faith." It covers the era of European history we generally regard as "the Dark Ages" and, in so doing, gives chapter and verse on why we generally don't study this period of history - it's not that nothing happened, it's that it was such a pointless, meaningless, return-to-zero mess that you really can walk into the Renaissance as a tabula rasa with nothing that came before included as context. The Age of Faith is 1086 pages, with a 60-page bibliography and a 70-page index. My copy used to belong to Gore Vidal and I doubt I will ever read it again.

I have never been a member of the TESCREAL community in any way, shape or form. They always struck me as pompous homeschooled assholes whose interest largely lies in rubbing their boogers on you. I have, however, been subjected to their tedium since the drop - because I am a bloviating asshole on the Internet, which means fans of bloviating assholes think we're alike. Blackbootz was one of the worst offenders in that I was constantly being asked for opinions about this, that or the other on Slate Star Codex, The Last Psychiatrist or LessWrong.

So the first thing I will say to "where should I go to find out more" is "don't." The dramas of online communities (even those with meatspace shenanigans) are self-referential, ephemeral and tedious. The second thing I will say is go here.

Cults can only erupt in a specific social environment under specific conditions, and I thank my lucky stars that I'd already read Helter Skelter and done some deep dives on Jim Jones when I was in those conditions. The first thing you need is spiritual people who have rejected other forms of spirituality. They need to be broken. Their life has to be missing something, and it needs to be something big - let's call them "seekers." The next thing you need is a seeker with charisma but without humility. One person looking for truth is no problem - I knew a guy who determined Christianity was the One True Faith because it has the most documented miracles on YouTube. A bunch of people looking for truth isn't necessarily a problem either... so long as they don't start turning to that one guy. Even then it isn't necessarily a problem if he has the self-awareness to go "why are they looking at me? What's going on here? How did I end up being the person everyone turns to? How can I help them?" Every cult you've ever seen or heard of - including several that are now considered major religions - hit that turning point and the charismatic dude went "how can I use this to my advantage?"

That Wired article used to be an article, not a podcast. Wired likely culled it for audio-only because my god, what a read. That article had links to Jack LaSota's blog, which Wired likely culled because my god, what a rabbit-hole. Look:

Those who have felt the Shade and let it break their minds into small pieces each snuggling in with death, that cannot organize into a forbidden whole of true agency, are zombies. They can be directed by whoever controls the Matrix. The more they zone out and find a thing they can think is contentment, the more they approach the final state: corpses. Those who have seen horror and built a vessel of hope to keep their soul alive and safe from harm are liches. Christianity's Heaven seems intended to be this, but it only works if you fully believe and alieve. Or else the phylactery fails and you become a zombie instead. For some this is The Glorious Transhumanist Future. In Furiosa from Fury Road's case, "The Green Place". If you've seen that, I think the way it warps her epistemology about likely outcomes is realistic.

How far have you sailed from the shore for that to seem like a rational, reasonable argument? He's basically saying "seek enlightenment," but in this weird-ass Agent Smith Shadowrun Harry Potter dipshit patois so dense that he's managed to hide the point from himself. The medium is the message. You can't observe it without swimming in it, and you shouldn't swim in it.

I'll say this: Émile Torres cares a lot about the TESCREAL posse; he coined the phrase. Like most zealots, he's a heretic fallen from the fold. Ozy Brennan? They've got beef with individuals but they still buy the belief ("we are better than you, the math proves it"). The fact of the matter is, you are your symbology, and when you start talking about demons, liches, monsters and gods, you're going to end up with a pentagram on the floor, take it from me. It's the fundamental pareidolia of human nature, the need to make sense out of noise. The original version of this had a Star of David, not a unicorn.

So. Charlie Stross worship? I've seen stupider. I wouldn't bother digging deeper because all you're going to see is a poorly-read opportunist attempting to keep one step ahead of his flock. I'll go one further and hypothesize that the TESCREAL posse is subject to so much schism because they all think they're smarter than everyone else, which makes them harder to keep on top of than, say, Squeaky Fromme.

The reason these fuckers matter is that they have fucktons of techbro money. Jim Jones mattered because he had the support of the liberal political machine of San Francisco, so he could do a lot more damage than, say, Heaven's Gate. A whole lotta dipshits in Silicon Valley and adjacencies are firmly of the belief that they know better than you, the math proves it, and frankly, gettin' stabby is often the only thing that separates cults from power.

Hahaha fuck me that Wired article is a ride. Where to begin? Not to, I suppose. Thank you (x10) for taking the time to provide such a detailed response. Am I correct in inferring these people come from backgrounds that are privileged as fuck? I can't imagine how they stayed out of prison for so long, otherwise.
Well, the local paper brought it up by mentioning that two of the ringleaders are Lakeside grads, so. MIRI was founded with money from these guys before Peter Thiel came on board; Thiel's grievance parade hides the fact that his family is rich AF going back like four generations. Sam Bankman-Fried and Caroline Ellison are the kids of tenured Ivy League professors, and no, there aren't a lot of TESCREAL dipshits of color.

FWIW, the best cult documentary by far is Wild Wild Country. On the face of it, it has nothing to do with TESCREAL dipshits, but in a way, it really really does.