There's a certain grand irony to the whole thing. Eliezer writes a sprawling intellectual epic called The Sequences that puts a moral spin on General Semantics, Popperian Science, Behavioral Economics, Evolutionary Psychology, and Bayesian Statistics, weaving it all into one cohesive worldview whose conclusion is that the world is about to be destroyed by people's inability to absorb the astounding facts of creation. He tells readers they have to embrace the real world they live in, that they should be more confused by fiction than by reality. Then, almost as a joke, he writes a Harry Potter fanfiction, and it attracts a deluge of people who are in love with the aesthetics, tropes, and devices of fiction. They love cute stories about how stuff works, novelty, cleverness taken to its limits, and they turn the edifice of this 'rationality' into a social club for a particular sort of brilliant slacker.
The grand master of this club is of course Eliezer himself.
“Rational!Harry has already taken the Unbreakable Vow. Rational!Voldemort, especially if he doesn't have his horcrux, and knowing that there's no negotiated way to escape the confrontation, will set up a deadman switch that destroys the world in the event of his own death, and tell Rational!Harry so in Parseltongue. I won't call it checkmate, but Rational!Harry cannot do things past this point that run the *risk* of destroying the world, which is a pretty severe condition. He has to be certain he's disabled Voldemort's kill-switch. Voldemort has also already thought of that, and will tell Harry in Parseltongue that he has set up more than one kill-switch, but not say how many, and that he knows he's obliviated at least one of them from his own memory, but he doesn't know how many.
If you want to continue the story past this point, it's plausible Harry could walk into the Hall of Prophecies and find a list of all Voldemort's kill-switches, or a further prophecy that the world will definitely end if Harry dies. Which, of course, Harry could also tell Voldemort in Parseltongue. I think at that point Voldemort literally screams in frustration, but he still refuses to take down his own kill-switch. Past *that* point I suspect both of them have primarily switched to mentally thinking of it as a fight against the Voice of Time.
In anything more closely resembling a straight-up fight with no prophecies or blackmail, the older Tom Riddle wins. The younger Tom Riddle knows this at this point, and his first priority is to run, not fight. If Harry can figure out the Mirror inside a month, he has a pretty solid refuge and one where Time can be made to run faster. The older Tom Riddle may or may not respect the younger Tom Riddle enough to anticipate any strategy like that; if he's been vanished away before Harry slew all his Death Eaters, he doesn't quite know what he's dealing with yet.”
(Source: Canon!Harry and Rational!Harry vs. Canon!Voldemort and Rational!Voldemort)
“I have already remarked that nothing is inherently mysterious—nothing that actually exists, that is. If I am ignorant about a phenomenon, that is a fact about my state of mind, not a fact about the phenomenon; to worship a phenomenon because it seems so wonderfully mysterious, is to worship your own ignorance; a blank map does not correspond to a blank territory, it is just somewhere we haven’t visited yet, etc. etc...
Which is to say that everything—everything that actually exists—is liable to end up in “the dull catalogue of common things”, sooner or later.
Your choice is either:
Decide that things are allowed to be unmagical, knowable, scientifically explicable, in a word, real, and yet still worth caring about;
Or go about the rest of your life suffering from existential ennui that is unresolvable.
(Self-deception might be an option for others, but not for you.)
This puts quite a different complexion on the bizarre habit indulged by those strange folk called scientists, wherein they suddenly become fascinated by pocket lint or bird droppings or rainbows, or some other ordinary thing which world-weary and sophisticated folk would never give a second glance.
You might say that scientists—at least some scientists—are those folk who are in principle capable of enjoying life in the real universe.”
(Source: Joy in the Merely Real)
If you look at what Eliezer spends most of his time discussing on Reddit as of this article's publish date, it's a rarely interrupted mix of:
- Lengthy Harry Potter theorizing
- Cryptocurrency speculation
- Defending his authorship of HPMOR
It's frankly bizarre that the guy pounds out these elaborate word-of-god amendments to his giant Harry Potter fiction while insisting that only people who can get over magic have a shot at living in the real world. For the sake of fairness, his Twitter isn't quite so dorky, but the Reddit history really makes me wonder what he does all day.
“I'm not sure if the following generalization extends to all genetic backgrounds and childhood nutritional backgrounds. There are various ongoing arguments about estrogenlike chemicals in the environment, and those may not be present in every country...
Still, for people roughly similar to the Bay Area / European mix, I think I'm over 50% probability at this point that at least 20% of the ones with penises are actually women.
A lot of them don't know it or wouldn't care, because they're female-minds-in-male-bodies but also cis-by-default (lots of women wouldn't be particularly disturbed if they had a male body; the ones we know as 'trans' are just the ones with unusually strong female gender identities). Or they don't know it because they haven't heard in detail what it feels like to be gender dysphoric, and haven't realized 'oh hey that's me'. See, e.g., http://sinesalvatorem.tumblr.com/…/15-regarding-the-4chan-t… and http://slatestarcodex.com/…/typical-mind-and-gender-identi…/
But I'm kinda getting the impression that when you do normalize transgender generally and MtF particularly, like not "I support that in theory!" normalize but "Oh hey a few of my friends are transitioning and nothing bad happened to them", there's a *hell* of a lot of people who come out as trans.”
“So instead, by dint of mighty straining, I forced my model of reality to explain an anomaly that never actually happened. And I knew how embarrassing this was. I knew that the usefulness of a model is not what it can explain, but what it can’t. A hypothesis that forbids nothing, permits everything, and thereby fails to constrain anticipation.
Your strength as a rationalist is your ability to be more confused by fiction than by reality. If you are equally good at explaining any outcome, you have zero knowledge.
We are all weak, from time to time; the sad part is that I could have been stronger. I had all the information I needed to arrive at the correct answer, I even noticed the problem, and then I ignored it. My feeling of confusion was a Clue, and I threw my Clue away.
I should have paid more attention to that sensation of still feels a little forced. It’s one of the most important feelings a truthseeker can have, a part of your strength as a rationalist. It is a design flaw in human cognition that this sensation manifests as a quiet strain in the back of your mind, instead of a wailing alarm siren and a glowing neon sign reading “EITHER YOUR MODEL IS FALSE OR THIS STORY IS WRONG.””
(Source: Your Strength as a Rationalist)
The conclusion in the first quote above is strange, and more than a little frightening. It's premised on the idea that being trans is an intrinsic trait people have, which is itself partially justified by the small number of trans people in existence. If you suddenly witness a large increase in trans people while believing the trait is intrinsic, you should notice you're confused and seek alternative explanations. Instead, Eliezer fits an improbable model that privileges his existing ideas.
To give some idea of just how improbable: his 20% figure would make trans women alone roughly four times as common as the entire LGBT demographic of California (4.8%) at the time of writing.
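For reference, the arithmetic behind that "four times" figure, assuming the comparison is the quoted 20% set directly against the 4.8% statewide LGBT rate:

$$\frac{20\%}{4.8\%} \approx 4.2$$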
“Q: An omniscient source offers to provide a truthful answer to a single question. What would be the most beneficial question to ask?
A: Will you either answer this question in the negative, or become my good-genie servant for eternity?”
(Source: An omniscient source offers…)
“The jester reasoned thusly: “Suppose the first inscription is true. Then the second inscription must also be true. Now suppose the first inscription is false. Then again the second inscription must be true. So the second box must contain the key, if the first inscription is true, and also if the first inscription is false. Therefore, the second box must logically contain the key.”
The jester opened the second box, and found a dagger.
“How?!” cried the jester in horror, as he was dragged away. “It’s logically impossible!”
“It is entirely possible,” replied the king. “I merely wrote those inscriptions on two boxes, and then I put the dagger in the second one.””
(Source: The Parable of the Dagger)
The question does not specify that:
- The entity must answer your question no matter how ludicrous
- The entity is somehow bound to any course of action by its answer
- The answer must be 'logically consistent' in a rigid way
If I were the hypothetical entity discussed, I would slay someone who asked me this on the spot.
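To spell out the trap the question is going for, here is a minimal sketch in Python, assuming roughly the conditions the list above points out are never actually granted: the entity must give a yes/no answer, the answer must be truthful, and the entity is somehow bound to make its answer come true.

```python
# A minimal sketch of the intended trap in the "good-genie" question.
# Assumptions (none of which the question actually secures): the entity must
# give a yes/no answer, the answer must be truthful, and the entity is bound
# to make whatever its answer asserts come true.

def disjunction_is_true(answered_in_negative: bool, becomes_servant: bool) -> bool:
    """Truth value of 'you answer this question in the negative OR you become my servant'."""
    return answered_in_negative or becomes_servant

consistent_outcomes = []
for answer in ("yes", "no"):
    for becomes_servant in (False, True):
        truth = disjunction_is_true(answer == "no", becomes_servant)
        # A truthful "yes" requires the disjunction to be true; a truthful "no" requires it false.
        if (answer == "yes") == truth:
            consistent_outcomes.append((answer, becomes_servant))

print(consistent_outcomes)  # [('yes', True)]
```

Under those assumptions the only consistent outcome is that the entity answers "yes" and serves forever; drop any one of them, as the king in the parable effectively does, and the supposed checkmate evaporates.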
““Because the circumstances under which you’re invoking meta-honesty have something to do with how I answer,” says Harry (who has suddenly acquired a view on this subject that some might consider implausibly detailed). “In particular, I think I react differently depending on whether this is basically about you trying to construct a new mutually beneficial arrangement with the person you think I am, or if you’re in an adversarial situation with respect to some of my counterfactual selves (where the term ‘counterfactual’ is standardly taken to include the actual world as one that is counterfactually conditioned on being like itself). Also I think it might be a good idea generally that the first time you try to have an important meta-honest conversation with someone, you first spend some time having a meta-meta-honest conversation to make sure you’re on the same page about meta-honesty.”
“I am not sure I understood all that,” said Dumbledore. “Do you mean that if you think we have become enemies, you might meta-lie to me about when you would lie?””
(Source: Meta-Honesty: Firming Up Honesty Around Its Edge-Cases)
The excerpt above shows an almost OCD, scrupulosity-level obsession with 'not lying'. Some of the reason not to lie is game-theoretic, but most of it is moral. Further, it is difficult for me to imagine a more 'Spock'-sounding dialogue than the one presented in the linked post.
What's the slacker club interest in this stuff anyway? Marketing professor David Gal recently asked why behavioral economics is so darn popular in an op-ed for the New York Times. He ends up blaming an addictive mix of pop psychology and borrowed prestige from economics (both of the two ounces it has). Hacker News, itself mostly populated by the sort of middle-class technician that enjoys this stuff, has a long and interesting thread analyzing the question. One comment in particular stuck out to me.
Hacker News user 'kolbe' writes:
“It's easy to understand and relate to. To even comprehend contemporary research in most scientific disciplines, you need a seriously strong understanding of math or chemistry--a level so high that it cannot be 'popular'. Behavioral Economics only require remedial algebra, statistics and literacy, and the topics they address are usually familiar to everyday people's lives.”
And in this regard I think he's hit the nail on the head. Rationality is a subject with prerequisites weak enough to attract every useless crank that bounced off anything requiring more actual effort. Especially the LessWrong flavor, which contains enough extropian science fiction content to fill out the greatest metal album yet to be recorded.