501. Rationality quotes
This is a pretty gloomy post.
…Looking back on the magnitude of my own folly, I realized that at the root of it had been a disbelief in the Future’s vulnerability—a reluctance to accept that things could really turn out wrong. Not as the result of any explicit propositional verbal belief. More like something inside that persisted in believing, even in the face of adversity, that everything would be all right in the end.
…But we don’t live in that world. We live in the world beyond the reach of God.
In this world everything can happen, be it ever so cruel and evil.
In the world where God exists, does God intervene to optimize everything? Regardless of what rabbis assert about the fundamental nature of reality, the take-it-seriously operational answer to this question is obviously “No”. You can’t ask God to bring you a lemonade from the refrigerator instead of getting one yourself. When I believed in God after the serious fashion of a child, so very long ago, I didn’t believe that.
Postulating that particular divine inaction doesn’t provoke a full-blown theological crisis. If you said to me, “I have constructed a benevolent superintelligent nanotech-user”, and I said “Give me a banana,” and no banana appeared, this would not yet disprove your statement. Human parents don’t always do everything their children ask. There are some decent fun-theoretic arguments—I even believe them myself—against the idea that the best kind of help you can offer someone, is to always immediately give them everything they want. I don’t think that eudaimonia is formulating goals and having them instantly fulfilled; I don’t want to become a simple wanting-thing that never has to plan or act or think.
…But clearly, there exists some threshold of horror awful enough that God will intervene.
…Belief in a fair universe often manifests in more subtle ways than thinking that horrors should be outright prohibited: Would the twentieth century have gone differently, if Klara Pölzl and Alois Hitler had made love one hour earlier, and a different sperm fertilized the egg, on the night that Adolf Hitler was conceived?
For so many lives and so much loss to turn on a single event, seems disproportionate. The Divine Plan ought to make more sense than that. You can believe in a Divine Plan without believing in God—Karl Marx surely did. You shouldn’t have millions of lives depending on a casual choice, an hour’s timing, the speed of a microscopic flagellum. It ought not to be allowed. It’s too disproportionate. Therefore, if Adolf Hitler had been able to go to high school and become an architect, there would have been someone else to take his role, and World War II would have happened the same as before.
But in the world beyond the reach of God, there isn’t any clause in the physical axioms which says “things have to make sense” or “big effects need big causes” or “history runs on reasons too important to be so fragile”. There is no God to impose that order, which is so severely violated by having the lives and deaths of millions depend on one small molecular event.
…Once upon a time, I believed that the extinction of humanity was not allowed. And others who call themselves rationalists, may yet have things they trust. They might be called “positive-sum games”, or “democracy”, or “technology”, but they are sacred. The mark of this sacredness is that the trustworthy thing can’t lead to anything really bad; or they can’t be permanently defaced, at least not without a compensatory silver lining. In that sense they can be trusted, even if a few bad things happen here and there.
Yudkowsky recounts the story of his “Bayesian Enlightenment”, which began with an old probability puzzle: a mathematician with two children says, “at least one of them is a boy” — what is the probability that both are boys?
So I pointed this out, and worked the answer using Bayes’s Rule, arriving at a probability of 1/2 that the children were both boys. I’m not sure whether or not I knew, at this point, that Bayes’s rule was called that, but it’s what I used.
And lo, someone said to me, “Well, what you just gave is the Bayesian answer, but in orthodox statistics the answer is 1/3. We just exclude the possibilities that are ruled out, and count the ones that are left, without trying to guess the probability that the mathematician will say this or that, since we have no way of really knowing that probability—it’s too subjective.”
I responded—note that this was completely spontaneous—“What on Earth do you mean? You can’t avoid assigning a probability to the mathematician making one statement or another. You’re just assuming the probability is 1, and that’s unjustified.”
To which the one replied, “Yes, that’s what the Bayesians say. But frequentists don’t believe that.”
And I said, astounded: “How can there possibly be such a thing as non-Bayesian statistics?”
That was when I discovered that I was of the type called ‘Bayesian’. As far as I can tell, I was born that way. My mathematical intuitions were such that everything Bayesians said seemed perfectly straightforward and simple, the obvious way I would do it myself; whereas the things frequentists said sounded like the elaborate, warped, mad blasphemy of dreaming Cthulhu. I didn’t choose to become a Bayesian any more than fishes choose to breathe water.
But this is not what I refer to as my “Bayesian enlightenment”.
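Both answers in the quoted exchange can be checked by direct enumeration. The sketch below assumes the standard form of the puzzle (a parent of two children says “at least one is a boy”) and, for the Bayesian answer, the further assumed reporting model that the parent picks one child uniformly at random and states its sex — the point of the exchange being that the frequentist count silently assumes the statement was certain to be made:

```python
from fractions import Fraction

# Four equally likely two-child families, written (eldest, youngest).
families = [("B", "B"), ("B", "G"), ("G", "B"), ("G", "G")]

# Frequentist-style count: rule out the families with no boy,
# then count two-boy families among those that remain.
with_boy = [f for f in families if "B" in f]
freq_answer = Fraction(sum(f == ("B", "B") for f in with_boy), len(with_boy))

# Bayesian answer under the assumed reporting model:
# P(parent says "boy" | family) = fraction of boys in the family.
def p_says_boy(family):
    return Fraction(family.count("B"), 2)

prior = Fraction(1, len(families))
posterior_bb = (prior * p_says_boy(("B", "B"))) / sum(
    prior * p_says_boy(f) for f in families
)

print(freq_answer)   # 1/3
print(posterior_bb)  # 1/2
```

Under the reporting model, “BB” families say “boy” with probability 1 while mixed families say it only half the time, so BB gets twice the posterior weight and the answer rises from 1/3 to 1/2 — exactly the disagreement in the dialogue.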
The somewhat depressing ending:
That scream of horror and embarrassment is the sound that rationalists make when they level up. Sometimes I worry that I’m not leveling up as fast as I used to, and I don’t know if it’s because I’m finally getting the hang of things, or because the neurons in my brain are slowly dying.