DRMacIver's Notebook
Install cut out switches in your mind
Let me tell you my favourite joke. It comes from Scott Aaronson’s “Quantum Computing Since Democritus”:
There’s a joke about a planet full of people who believe in anti-induction: if the sun has risen every day in the past, then today, we should expect that it won’t. As a result, these people are all starving and living in poverty. Someone visits the planet and tells them, “Hey, why are you still using this anti-induction philosophy? You’re living in horrible poverty!” “Well, it never worked before…”
A different Scott recently wrote about confirmation bias. His view is very Bayesian, which I’m not that into, but the basic premise is sound: if you come across a piece of evidence that contradicts your worldview, it’s probably right to discount it.
In a recent discussion of one of my favourite papers, a friend pointed out that its arguments depended quite critically on certain values (obviously sensible ones like “racism bad”, which he agreed with, but even so), and that this made it difficult to use the paper properly for breaking out of your own frame of reference.
The common thread in all of these cases is an underlying concern: “But what if my fundamental worldview is wrong in some critical way? How would I break out of that?” The arguments being presented were fine, but they intrinsically rested on unjustified, and possibly unjustifiable, assumptions.
I recently reread “Zen and the Art of Motorcycle Maintenance”, which is a deeply flawed book that is much more interesting in its flaws than in the things people normally get out of it. One of the things I found fascinating was observing the author’s thought process (the author is schizophrenic, and this was very obvious to me when reading it). It repeatedly provoked the following reaction:
- Yes, this is a very familiar line of reasoning.
- Now comes the bit where you stop.
- …why aren’t you stopping?
He was getting drawn into thought loops that were obviously deeply unproductive, and he wasn’t stopping. This was quite distressing to watch, because of how carefully I have cultivated a series of cut out switches in my own thought processes to prevent exactly this from happening.
The most basic one is this: if you are concerned that your worldview might be fundamentally wrong in some way… Eh, don’t worry about it. It’s probably fine. Put that to one side and come back to it later if you really feel like it.
Fundamentally, everything you do grounds out in basic premises about how the world works and what the appropriate values are, and those premises are arbitrary, contingent, and not even necessarily that consistent. If you allow yourself to reevaluate your worldview as part of every basic thought process, you will constantly spiral into unproductive areas of thought.
You should, of course, occasionally consider that your fundamental worldview might be wrong, but mostly the way to do that is by talking to people who are different from you and learning how they see the world. Getting trapped in thought spirals will not do this, and you need to install cut out switches to block off those spirals when you run into them.