Naturally, this is at odds with my tendency to view acquiring true beliefs and knowledge as a good thing.
Basilisks are a strict subset of knowledge that has unintended consequences. Sometimes you learn something that ends up making you better. Sometimes it has an effect that's hard to judge on the good or bad scale, but is nevertheless there. The whole field of psychological priming is an example of that.
Sometimes I read things and wonder, "Did I just look a basilisk in the eye?" A warning: if you're genuinely afraid of these things, you should probably stop reading and go back to whatever it was you were doing. Anyway, as an example, there have been some interesting studies about willpower. One of them addresses whether willpower is a finite or infinite resource. The study showed that people who think willpower is infinite act like it's infinite! They're generally more motivated than average, highly productive, and get stuff done. And people who think willpower is finite, that it's limited, act that way as well. With those people, you can watch willpower deplete over the course of a day as they make choices. Every decision we make depletes some of our willpower, if we believe it's a resource to be depleted. If you set yourself up to make fewer decisions in a day, you'll likely be more productive, since you have more willpower to spare.
The basilisk part of that is: if you previously believed willpower was infinite, and you read that study, what if you start to believe willpower is finite and your productivity suddenly drops? That's no good, you should have avoided that study! Rational ignorance!
Another potential basilisk I ran into is the idea (not a study, since it's pretty vague) of "awesome points", which is fairly similar to the willpower issue. Everyone has a certain number of "awesome points" over their lifetime, points they can spend to do awesome things like building a company or doing a cool school assignment well. If you accept that and start thinking in terms of "do I want to spend awesome points on this?", then you'll start doing only the important/really awesome things instead of wasting your early years and burning out before you're 30. I thought it was a potential basilisk: what if people believed they had an infinite supply of awesome points, and did more awesome things that way? The poster anecdotally said he didn't notice a decrease in the number of awesome things he did after he realized they were finite, so at least there's some reason to believe this fuzzy pseudo-fiction of awesome points really is finite regardless of our beliefs about it. One can only do so much before burning out.
I also have an idea of partial basilisks: pieces of knowledge that are fine by themselves but become deadly when coupled with other pieces of knowledge, or pieces of knowledge that are deadly by themselves but become fine when coupled with other pieces of knowledge. For example, consider a recently converted atheist who suddenly believes life has no meaning and isn't worth living, and kills himself. Knowledge of atheism must be bad! But he lacked other knowledge that could have saved him. Another: suppose your best friend tells you he just had awesome sex. Okay, fine. Later you learn it was with your spouse. Uh oh. (I'm not saying you'll go on a murderous rampage and hurt others, but just by learning that piece of information your emotional state can drastically change for the worse, which is why it's something of a basilisk.) If you're already irrational, learning new things can really hurt your attempts to become more rational. Suppose you learn a list of common cognitive biases, but instead of applying them to yourself, trying to become more aware of your flaws so you can fix them, you use them as fully general counterarguments against anything you don't like! The same goes for people who memorize lists of fallacies. "Appeal to authority!" "Ad hominem!" and the rest replace actual discussion, and the truth of the matter is buried under the new argument over who is committing which fallacies.
I'm genuinely afraid of the really bad knowledge basilisks; the lesser ones I'm less afraid of, but they still bother me. Apart from the major one alluded to at the top, which would have had really serious consequences (as opposed to minor basilisks like the willpower one or even the spouse-cheating one), I don't think I've gotten any better at detecting them before looking at them. Only after looking at something do I think, "Was that a basilisk I just saw?" When I can make even a slight argument for "yes", it worries me that I didn't get any warning. How can I improve my basilisk-sensing skills? The "DO NOT TOUCH" buttons are usually pretty obvious, and those are the ones to be most afraid of, but I want to sense the lesser ones as well, since they can be almost as damaging long-term.
Posted on 2011-12-02 by Jach
Trackback URL: https://www.thejach.com/view/2011/12/knowledge_basilisks