# Knowledge Basilisks

I've been somewhat curious about knowledge basilisks for the past year or so. They're downright scary. The idea is that there are some pieces of knowledge, true or false, that hurt or even kill you and/or others around you just by learning them. I almost discovered one a year ago and fortunately had the sense to turn back before it was too late. I didn't look the basilisk in the eye but I could feel it in the room.

Naturally, this is at odds with my tendency to view acquiring true beliefs and knowledge as a good thing.

Basilisks are a strict subset of knowledge that has unintended consequences. Sometimes you learn something that ends up making you better. Sometimes it has an effect that's hard to judge on the good or bad scale, but is nevertheless there. The whole field of psychological priming is an example of that.

Sometimes I read things and wonder "Did I just look a basilisk in the eye?" A warning: if you're genuinely afraid of these things, you should probably stop reading and go back to whatever you were doing. Anyway, as an example, there have been some interesting studies about willpower. One of them addresses whether willpower is a finite or infinite resource. The study showed that people who think willpower is infinite act like it's infinite! They're generally more motivated than average, highly productive, and get stuff done. And people who think willpower is finite act that way as well: with those people you can track a daily depletion of willpower as they make choices. Every decision depletes some of your willpower, if you believe it's a resource to be depleted. So if you set yourself up to make fewer decisions in a day, you'll likely be more productive, since you have more willpower to spare.

The basilisk part of that is: if you previously believed willpower was infinite, and you read that study, what if you started to believe willpower was finite and your productivity suddenly dropped? That's no good; you should have avoided that study! Rational ignorance!

Another potential basilisk I ran into is the idea (not a study, since it's pretty vague) of "awesome points", which is fairly similar to the willpower issue. Everyone has a certain number of "awesome points" over their lifetime, points they can spend to do awesome things like building a company or doing a cool school assignment well. If you accept that and start thinking in terms of "do I want to spend awesome points on this?", then you'll start doing only the important, really awesome things instead of wasting your early years and burning out before you're 30. I thought it was a potential basilisk: what if people who believed they had an infinite supply of awesome points did more awesome things because of that belief? The poster anecdotally said he didn't notice a decrease in the number of awesome things he did after he came to see them as finite, so there's at least some reason to believe this fuzzy pseudo-fiction of awesome points really is finite regardless of our beliefs about it. One can only do so much before burning out.

I'm genuinely afraid of the really bad knowledge basilisks; the lesser ones I'm less afraid of, but they still bother me. Apart from the major one with really serious consequences alluded to at the top (as opposed to minor basilisks like the willpower one or even the spouse-cheating one), I don't think I've gotten any better at detecting them before looking at them. It's only after looking at something that I think "was that a basilisk I just saw?" When I can make even a slight argument for "yes", it worries me that I didn't get any warning. How can I improve my basilisk-sensing skills? The "DO NOT TOUCH" buttons are usually pretty obvious, and those are the ones to fear most, but I want to sense the lesser ones as well, since they can be almost as damaging long-term.

#### Posted on 2011-12-02 by Jach

Tags: philosophy
