Yudkowsky admits he made a mistake in his last post:
…after publishing yesterday’s post I realized a similar flaw on my own—this one concerning Occam’s Razor and psychic powers:
If beliefs and desires are irreducible and ontologically basic entities, or have an ontologically basic component not covered by existing science, that would make it far more likely that there was an ontological rule governing the interaction of different minds—an interaction which bypassed ordinary “material” means of communication like sound waves, known to existing science.
… telepathy, and the ability to influence events just by wishing at them, and precognition, would all, if discovered, be strong Bayesian evidence in favor of the hypothesis that beliefs are ontologically fundamental. Not logical proof, but strong Bayesian evidence.
If reductionism is correct, then any science-fiction story containing psychic powers, can be output by a system of simple elements (i.e., the story’s author’s brain); but if we in fact discover psychic powers, that would make it much more probable that events were occurring which could not in fact be described by reductionist models.
Which just goes to say: The existence of psychic powers is a privileged probabilistic assertion of non-reductionist worldviews—they own that advance prediction; they devised it and put it forth, in defiance of reductionist expectations. So by the laws of science, if psychic powers are discovered, non-reductionism wins.
This post offers a nice definition of the power of an optimization process, a term Yudkowsky uses often:
Put a measure on the state space – if it’s discrete, you can just count. Then collect all the states which are equal to or greater than the observed outcome, in that optimization process’s implicit or explicit preference ordering. Sum or integrate over the total size of all such states. Divide by the total volume of the state space. This gives you the power of the optimization process measured in terms of the improbabilities that it can produce – that is, improbability of a random selection producing an equally good result, relative to a measure and a preference ordering.
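The recipe above can be sketched directly for the discrete case. This is a minimal illustration, not Yudkowsky's own code: the function name, the toy state space, and the choice to report the result in bits (taking the negative log of the fraction) are my assumptions.

```python
import math

def optimization_power_bits(state_space, preference_key, observed_outcome):
    """Measure an optimizer's power over a discrete state space:
    count the states equal to or better than the observed outcome
    (under the given preference ordering), divide by the size of the
    state space, and report the improbability in bits. This says how
    unlikely a random selection would be to do at least as well."""
    observed_value = preference_key(observed_outcome)
    # Collect all states equal to or greater than the observed outcome
    # in the preference ordering; for a discrete space, just count them.
    at_least_as_good = sum(
        1 for s in state_space if preference_key(s) >= observed_value
    )
    # Fraction of the state space that is at least as good.
    fraction = at_least_as_good / len(state_space)
    # Improbability expressed in bits.
    return -math.log2(fraction)

# Hypothetical example: an optimizer that prefers larger numbers steers
# a 1000-state space (0..999) to the outcome 998. Only two states (998
# and 999) are at least that good, so a random draw matches it with
# probability 2/1000.
power_bits = optimization_power_bits(range(1000), lambda s: s, 998)
```

Here `power_bits` comes out to about 8.97 bits, i.e. hitting that outcome is as improbable as a random selection succeeding one time in 500.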
Good to know:
When I think you’re a powerful intelligence, and I think I know something about your preferences, then I’ll predict that you’ll steer reality into regions that are higher in your preference ordering. The more intelligent I believe you are, the more probability I’ll concentrate into outcomes that I believe are higher in your preference ordering.
Yay. Nice post about Yudkowsky’s childhood.
My parents always used to downplay the value of intelligence. And play up the value of—effort, as recommended by the latest research? No, not effort. Experience. A nicely unattainable hammer with which to smack down a bright young child, to be sure. That was what my parents told me when I questioned the Jewish religion, for example. I tried laying out an argument, and I was told something along the lines of: “Logic has limits, you’ll understand when you’re older that experience is the important thing, and then you’ll see the truth of Judaism.” I didn’t try again. I made one attempt to question Judaism in school, got slapped down, didn’t try again. I’ve never been a slow learner.
Whenever my parents were doing something ill-advised, it was always, “We know better because we have more experience. You’ll understand when you’re older: maturity and wisdom is more important than intelligence.”
If this was an attempt to focus the young Eliezer on intelligence uber alles, it was the most wildly successful example of reverse psychology I’ve ever heard of.
…The moral I derived when I was young, was that anyone who downplayed the value of intelligence didn’t understand intelligence at all. My own intelligence had affected every aspect of my life and mind and personality; that was massively obvious, seen at a backward glance.
Please memorize the next two paragraphs:
…“Intelligence has nothing to do with wisdom or being a good person”—oh, and does self-awareness have nothing to do with wisdom, or being a good person? Modeling yourself takes intelligence. For one thing, it takes enough intelligence to learn evolutionary psychology.
…We are the cards we are dealt, and intelligence is the unfairest of all those cards. More unfair than wealth or health or home country, unfairer than your happiness set-point. People have difficulty accepting that life can be that unfair, it’s not a happy thought. “Intelligence isn’t as important as X” is one way of turning away from the unfairness, refusing to deal with it, thinking a happier thought instead. It’s a temptation, both to those dealt poor cards, and to those dealt good ones. Just as downplaying the importance of money is a temptation both to the poor and to the rich.
Aww, this is distilled awesomeness! God, I hate all those modest, enlightened, oh-so-progressive people: those fucking hypocritical liars who profess that IQ/intelligence isn’t important or meaningful, or that intelligence doesn’t vary at all and has nothing to do with morality. Guys, when was the last time you saw a lizard reading Kant? What do you think the mean IQ of a bunch of Nazis is?
I really despise those people. Srsly. There’s a question on OkCupid (yeah, I know, I’m sorry…) that reads “Would the world be a better place if people with low IQs were not allowed to reproduce?” And of course almost everyone answers “no”. Well, just so you know: if you’re one of them, I hope you burn in hell. So much for my morality.