Critiquing rationalism

or: “so what are you doing about the skulls?”

I love rationalists. Literally — I married one!

So why don’t I call myself a rationalist? Approximately:

  • I don’t think they’re very successful;

What do I think rationalists should do differently?

  • Lightcone
    • Incorporate and run as a for-profit. Migrate your impact equity to regular shares of stock. Charge for services (you’re already doing this with Lighthaven, which is pretty good).
    • Rescue the LessWrong offshoots (EA Forum, Alignment Forum, Progress Forum)
    • Get off Slack; build good realtime chat directly into LessWrong. (I always found it embarrassing that so much discussion about Manifold happened on our Discord instead of on our native platform.)
  • More broadly
    • Learn to communicate and form coalitions with people not like yourselves.
      • I don’t just mean EA — though definitely repairing some of the EA/rat split stuff would be good, like whatever is up with OpenPhil/Dustin and rationality
      • Really, people who are “low decouplers”, people with different values, people who are less intelligent-in-the-way-rationalists-prize
        • If you want to take AI safety into the mainstream, then it becomes really important to understand and work with all kinds of folks
      • Rationalists have been able to develop their own “stack” — their own news sources, funding sources, coordination mechanisms, physical communities, Lighthaven as a campus. It’s one of the closest working examples of a network state.
        • But then you can easily stop getting feedback from the outside world
    • Think about what barriers to entry an extremely literate codex imposes on potential contributors, or what kinds of skillsets your movement lacks as a result
      • “Getting things done”
      • Interpersonal charisma
      • Communication in other forms (video; animation; memes; tweets)
      • Art? Music? (I actually like a lot of the rationalist stuff tho)
  • Lessons from EA
    • For all that you shun EA-style college recruiting — it sure does do something with regard to getting smart, ambitious, up-and-coming folks excited by the mission
  • Lessons from startups
    • Move fast
    • Make things
  • Lessons from religions
    • More regularity (Sunday mass)
    • Love everyone, help everyone
    • (rationalism by default is selfish and individualistic)

Excesses of truthseeking

  • Just because you like a particular way of communicating does not mean that it is a better expression of truth
    • See: Ask vs Guess vs Tell culture, and the virtues of Guess Culture
    • To take a tiny example, consider the rationalist norm of trying to be precise in everyday conversation, in response to “how are you?” or “what have you been doing lately?”
      • I imagine a straw rationalist as trying to respond with exacting precision, like (TOTALLY MADE UP EXAMPLE) “oh, in the last week my day-to-day happiness ratings have averaged 5.5/10, which is about 1 point lower than my set point” instead of “fine”, but:
        • This comes at the cost of succinctness
        • This ignores built-in societal norms, which most of the rest of the population expects you to abide by
      • Ignoring the norm in favor of what you see as correct or more “true” is defecting, and perhaps even less truth-conveying, because your answer is interpreted differently than you intended

Places where I love rationalism

  • When it comes to my defense, e.g. over Manifest 2024
  • I like that somebody out there cares about truth!
    • Maybe rationalism is best relegated to “the truthseeking arm of Society”, with the understanding that it is but one of many philosophies in a pluralistic congregation

See also:

  • Rationalism is systematized winning
  • Yes, we have noticed all the skulls
  • Where are all the successful rationalists?
  • Richard Ngo on the importance of truthseeking