Would you care to elaborate on this? I have my own criticisms of LW and without mentioning what they are, I want to see if other people's match up with mine.
Edit/Disclaimer: I do feel like I benefited intellectually from reading the original Sequences on LessWrong. The general idea that strong AI might represent a risk to humanity seems plausible, although I'm not sure how credible the specifics are.
Well, I don't feel intelligent or well educated enough to critique most of what they (EY and supporters) say point by point. However, one thing I recall seems indicative of some of the systemic problems with how he/they think about the world:
EY once wrote that he tried "exercise" (unspecified, but presumably steady cardio) and found that it "didn't work". His conclusion was that he was inherently unable to improve his physical fitness due to some genetic trait that curses a minority of the population.
That is so breathtakingly arrogant and foolish that I was taken aback and it was one of the many small things that led me to question his 'rationality'.
Presumably, EY was a baby once, and like all other babies, gradually developed increased muscular strength and coordination while learning to walk. Thus, his muscles are capable of responding to stress and adapting by getting stronger. If EY had the grit to actually try rigorous training, such as progressively loaded barbell squats, I'm quite sure he would experience at least a modest, but measurable, increase in physical fitness. Instead, he rationalized his physical weakness and chose the easy road. Plenty of people do this, but they're not so 'rational' as to try to intellectually justify it publicly on their own website!
* I don't feel like searching LW to try and find a citation for this - but does anyone really doubt it? Just look at a picture of the guy.
IIRC his attempted exercise was walking around the Bay Area, which left him winded and sore. Agreed that it sounded woefully ignorant of the science of fitness. He would have done a lot better to hire a physical trainer who could kick his ass a little bit, boot-camp style.
I see LW as part of the cult of reason - at risk of strawmanning, it's the idea that pretty much everything is subject to perfect logical deduction from first principles.
It's important to study and understand human biases, and doing so can help you overcome many struggles, since most of the time you're your own worst enemy. But the philosophy that you're inherently flawed and should put up a constant effort to be "less wrong" is a recipe for disaster, IMO.
Absolutely, take time every now and then to reflect on life, on whether any biases and assumptions about the world are impairing your wellbeing and happiness, and on whether it might be worth changing them -- but in everyday life, listen to your impulses, intuition and feelings. Don't be blind, don't be stupid, but also don't constantly second-guess yourself.
I should totally write a self help book. Or at least make some inspirational Facebook cover photos.
> but in everyday life, listen to your impulses, intuition and feelings
The idea, if I understand it correctly, is that those are the things that are supposed to end up "less wrong." You're not supposed to be consciously thinking all the time about how your thinking is broken; you're supposed to practice a few tricks for a while, internalize them, and then your impulses/intuition/feelings will be (less) broken.
I think that's right, and it's probably even pretty compatible with what I suggest.
What I think is dangerous is adopting the underlying philosophy that your intuition is inherently wrong and in need of salvation from reason.
In this, as with every other area of human improvement, there's a balance to be struck between recognising your current state as "good enough" (even that has a derogatory ring to it) and not insulating yourself from the fact that there's almost always nearly infinite room for improvement. And I think the cult of reason, and LW in particular, is bad at recognising and respecting a "good enough" state.
Or put another way, imagine if the most popular software engineering website was "YoureNotAsGoodAsJohnCarmack.com".
I think they have taken this idea of being logical and internalized it as a status symbol.
The word "rational" apparently applies only to them, so by criticizing LW, I assume, you become irrational.
Is it useful for every human to overcome the biases on their list? Or are these just criticisms that one group can use to distinguish themselves from other people? [1] I don't even believe all of these biases exist; some could instead be attributed to a discrepancy between the formal and colloquial definitions and usage of language. Further, although this has possibly been mentioned elsewhere, I think it is possible to suffer from a "cognitive bias" cognitive bias, where belief about overcoming cognitive bias causes a new cognitive bias.
Check out this link [2]; I think the Harry Potter character is associated with Eliezer himself.
In short, I don't think a bunch of people who claim to be rational are completely ego free.
edit: I don't mean this to sound as if I am arguing that there is no such thing as cognitive bias, or that nobody can really be rational, but rather that ego can and has gotten in the way of discussion about it.