The virtue of tsuyoku naritai, “I want to become stronger”, is to always keep improving—to do better than your previous failures, not just humbly confess them.
Yet there is a level higher than tsuyoku naritai. This is the virtue of isshokenmei, “make a desperate effort”. All-out, as if your own life were at stake. “In important matters, a ‘strong’ effort usually only results in mediocre results.”
And there is a level higher than isshokenmei. This is the virtue I called “make an extraordinary effort”. To try in ways other than what you have been trained to do, even if it means doing something different from what others are doing, and leaving your comfort zone. Even taking on the very real risk that attends going outside the System.
But what if even an extraordinary effort will not be enough, because the problem is impossible?
I have already written somewhat on this subject, in On Doing the Impossible. My younger self used to whine about this a lot: “You can’t develop a precise theory of intelligence the way that there are precise theories of physics. It’s impossible! You can’t prove an AI correct. It’s impossible! No human being can comprehend the nature of morality—it’s impossible! No human being can comprehend the mystery of subjective experience! It’s impossible!”
And I know exactly what message I wish I could send back in time to my younger self:
Shut up and do the impossible!
…People ask me how likely it is that humankind will survive, or how likely it is that anyone can build a Friendly AI, or how likely it is that I can build one. I really don’t know how to answer. I’m not being evasive; I don’t know how to put a probability estimate on my, or anyone else’s, successfully shutting up and doing the impossible. Is it probability zero because it’s impossible? Obviously not. But how likely is it that this problem, like previous ones, will give up its unyielding blankness when I understand it better? It’s not truly impossible, I can see that much. But humanly impossible? Impossible to me in particular? I don’t know how to guess. I can’t even translate my intuitive feeling into a number, because the only intuitive feeling I have is that the “chance” depends heavily on my choices and unknown unknowns: a wildly unstable probability estimate.
Short aside on Yudkowsky’s famous AI-Box experiment.
509. Crisis of Faith
When should you stage a Crisis of Faith?
Again, think of the advice you would give to a theist: If you find yourself feeling a little unstable inwardly, but trying to rationalize reasons the belief is still solid, then you should probably stage a Crisis of Faith. If the belief is as solidly supported as gravity, you needn’t bother—but think of all the theists who would desperately want to conclude that God is as solid as gravity. So try to imagine what the skeptics out there would say to your “solid as gravity” argument. Certainly, one reason you might fail at a crisis of faith is that you never really sit down and question in the first place—that you never say, “Here is something I need to put effort into doubting properly.”
If your thoughts get that complicated, you should go ahead and stage a Crisis of Faith. Don’t try to do it haphazardly, don’t try it in an ad-hoc spare moment. Don’t rush to get it done with quickly, so that you can say “I have doubted as I was obliged to do.” That wouldn’t work for a theist and it won’t work for you either. Rest up the previous day, so you’re in good mental condition. Allocate some uninterrupted hours. Find somewhere quiet to sit down. Clear your mind of all standard arguments, try to see from scratch. And make a desperate effort to put forth a true doubt that would destroy a false, and only a false, deeply held belief.
Elements of the Crisis of Faith technique have been scattered over many posts:
- Avoiding Your Belief’s Real Weak Points—One of the first temptations in a crisis of faith is to doubt the strongest points of your belief, so that you can rehearse your good answers. You need to seek out the most painful spots, not the arguments that are most reassuring to consider.
- The Meditation on Curiosity—Roger Zelazny once distinguished between “wanting to be an author” versus “wanting to write”, and there is likewise a distinction between wanting to have investigated and wanting to investigate. It is not enough to say “It is my duty to criticize my own beliefs”; you must be curious, and only uncertainty can create curiosity. Keeping in mind Conservation of Expected Evidence may help you Update Yourself Incrementally: For every single point that you consider, and each element of new argument and new evidence, you should not expect your beliefs to shift more (on average) in one direction than another—thus you can be truly curious each time about how it will go.
- Cached Thoughts and Pirsig’s Original Seeing, to prevent standard thoughts from rushing in and completing the pattern.
- The Litany of Gendlin and the Litany of Tarski: People can stand what is true, for they are already enduring it. If a belief is true you will be better off believing it, and if it is false you will be better off rejecting it. You would advise a religious person to try to visualize fully and deeply the world in which there is no God, and to come, without excuses, to the full understanding that if there is no God then they will be better off believing there is no God. If they cannot accept this on a deep emotional level, they will not be able to have a crisis of faith. So you should put in a sincere effort to visualize the alternative to your belief, the way that the best and highest skeptic would want you to visualize it. Think of the effort a religionist would have to put forth to imagine, without corrupting it for their own comfort, an atheist’s view of the universe.
- Make an Extraordinary Effort, for the concept of isshokenmei, the desperate convulsive effort to be rational that it would take to surpass the level of Robert Aumann and all the great scientists throughout history who never let go of their religions.
- The Genetic Heuristic: You should be extremely suspicious if you have many ideas suggested by a source that you now know to be untrustworthy, but by golly, it seems that all the ideas still ended up being right. (E.g., the one concedes that the Bible was written by human hands, but still clings to the idea that it contains indispensable ethical wisdom.)
- The Importance of Saying “Oops”—it really is less painful to swallow the entire bitter pill in one terrible gulp.
- Singlethink, the opposite of doublethink. See the thoughts you flinch away from, that appear in the corner of your mind for just a moment before you refuse to think them. If you become aware of what you are not thinking, you can think it.
- Affective Death Spirals and Resist the Happy Death Spiral. Affective death spirals are prime generators of false beliefs that it will take a Crisis of Faith to shake loose. But since affective death spirals can also get started around real things that are genuinely nice, you don’t have to admit that your belief is a lie in order to resist the halo effect at every point: refuse false praise even of genuinely nice things. Policy debates should not appear one-sided.
- Hold Off On Proposing Solutions until the problem has been discussed as thoroughly as possible without proposing any; make your mind hold off from knowing what its answer will be; and try for five minutes before giving up, both generally, and especially when pursuing the devil’s point of view.
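The Conservation of Expected Evidence principle invoked above can be checked numerically: for any hypothesis H and any possible observation E, the prior P(H) must equal the expectation of the posterior over the possible observations. A minimal sketch, with made-up illustrative numbers:

```python
# Conservation of Expected Evidence: the prior P(H) equals the
# posterior averaged over possible observations, weighted by how
# likely each observation is. All probabilities below are invented
# purely for illustration.

prior_h = 0.3           # P(H)
p_e_given_h = 0.8       # P(E | H)
p_e_given_not_h = 0.4   # P(E | not H)

# Marginal probability of observing E
p_e = p_e_given_h * prior_h + p_e_given_not_h * (1 - prior_h)

# Posteriors after seeing E, or after seeing not-E (Bayes' theorem)
post_given_e = p_e_given_h * prior_h / p_e
post_given_not_e = (1 - p_e_given_h) * prior_h / (1 - p_e)

# Expected posterior: P(E) * P(H|E) + P(not E) * P(H|not E)
expected_posterior = p_e * post_given_e + (1 - p_e) * post_given_not_e

print(round(expected_posterior, 10))  # equals the prior, 0.3
```

Seeing E shifts belief up (to about 0.46) and seeing not-E shifts it down (to 0.125), but the shifts cancel exactly in expectation; you cannot expect, in advance, to end up more confident than you started, which is why genuine curiosity about each new argument is possible.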
And these standard techniques are particularly relevant:
- The sequence on The Bottom Line and Rationalization, which explains why it is always wrong to selectively argue one side of a debate.
- Positive Bias and motivated skepticism and motivated stopping, lest you selectively look for support, selectively look for counter-counterarguments, and selectively stop the argument before it gets dangerous. Missing alternatives are a special case of stopping. A special case of motivated skepticism is fake humility where you bashfully confess that no one can know something you would rather not know. Don’t selectively demand too much authority of counterarguments.
- Beware of Semantic Stopsigns, Applause Lights, and the choice to Explain/Worship/Ignore.
- Feel the weight of Burdensome Details; each detail a separate burden, a point of crisis.
But really there’s rather a lot of relevant material, here and there on Overcoming Bias. The Crisis of Faith is only the critical point and sudden clash of the longer isshokenmei—the lifelong uncompromising effort to be so incredibly rational that you rise above the level of stupid damn mistakes. It’s when you get a chance to use your skills that you’ve been practicing for so long, all-out against yourself.
I wish you the best of luck against your opponent. Have a wonderful crisis!
Very insightful comment by Roko:
“I suspect that there are many people in this world who are, by their own standards, better off remaining deluded. I am not one of them; but I think you should qualify statements like ‘if a belief is false, you are better off knowing that it is false’.
It is even possible that some overoptimistic transhumanists/singularitarians are better off, by their own standards, remaining deluded about the potential dangers of technology. You have the luxury of being intelligent enough to be able to utilize your correct belief about how precarious our continued existence is becoming. For many people, such a belief is of no practical benefit yet is psychologically detrimental.
This creates a ‘tragedy of the commons’ type problem in global catastrophic risks: each individual is better off living in a fool’s paradise, but we’d all be much better off if everyone faced up to the dangers of future technology.”