Contra EA and Rationality
The least charitable argument against these movements I'm comfortable mustering.
The following constitutes a lower bound on my assessment of EA and Rationality as movements. In other words, it is roughly the worst assessment of them I could be persuaded to believe. As such, it is mostly a narrative for how one could arrive at this negative assessment. Because it is a lower bound, I welcome counter-arguments, as long as their proponents are willing to entertain how someone could place some probability mass on this less charitable interpretation. You are permitted to yell at me in the comments or on X about this, although I cannot guarantee that this would update me most efficiently.
Eliezer Yudkowsky is a moral anti-realist (via value fragility, and orthogonality).
EA cannot hope to carry out its agenda on this philosophical foundation.[1]
Moral anti-realism is probably wrong, anyway. By this point we already have enough for an overall negative outlook.
CFAR threw in the towel, declaring a failure of the form “we failed, but our core philosophy is right, therefore no one could have succeeded.”
LessWrong and the EA Forum have relatively bad consensus-making mechanisms, even by today’s standards.[2]
“Doomers” are arguably closed-minded, in the sense that their perspective and political orientation do not follow even from what their own attempts at consensus-finding have uncovered.
Points 5-8 imply that some degree of “cultishness” is in fact present, which I define as the tendency to favor confirmation bias in order to preserve one’s sense of status via group identification.
As for the “narrative” that attempts to tie all this together: Moral anti-realism is a prerequisite for, rather than a conclusion of, Sequences-based rationality. It helps to justify why a movement is necessary in the first place: In short, faulty human “values” might be largely responsible for biases and other mistakes in human cognition, as well as for the tendency of humans to unjustifiably commit to ideologies and religions on ostensibly “moral realist” grounds. In shorter: Moral anti-realism acts as a ‘high-status’ signal (we’re better than you because you falsely think you’re doing the right thing).
One should actually do a tad more than just notice the skulls.
[1] This is either left as an exercise for the reader, or to me, should I eventually decide to cover this topic. But I think it is possible to immediately think of reasons why this would be so.
[2] I’m leaving this as an exercise for me, so stay tuned.