The whole article is interesting, but I thought that this was a good summary of the various rationalist subcultures:
But people who are drawn to the rationalist community by the Sequences often want to be in a cult. To be sure, no one wants to be exploited or traumatized. But they want some trustworthy authority to change the way they think until they become perfect, and then to assign them to their role in the grand plan to save humanity. They’re disappointed to discover a community made of mere mortals, with no brain tricks you can’t get from Statistics 101 and a good CBT workbook, whose approach to world problems involves a lot fewer grand plans and a lot more muddling through.
-
Does this mean people attracted to rationalist cults are looking for a way to avoid critical thinking of their own?
-
IMO approximately nobody wants to avoid critical thinking; it's just easy to fall into the trap of not examining yourself for errors in thinking. It's kind of like this XKCD:
The only way to improve over time is to have a genuine interest in becoming less wrong (à la the lesswrong.com people). Human nature is to get lazy after a while, and the progress you made while initially becoming less wrong just gives you better tools for defending your errors in thinking.
-
We do love a good heuristic