(This was a daily beemail from 2018 and I keep finding occasion to refer to it – most recently in the context of a new forum thread on satisficing vs maximizing – so I’m finally putting it in the forum!)
In case anyone else is into this, I was debating the concept of rationality today with one of you. There’s a Beeminder tie-in, of course, if a bit tangential. The thesis is that irrationality is really bad. I might be preaching to the choir here so let me know if any of this sounds remotely persuasive or misguided or just platitudinous!
(I’ve fictionalized the debate a bit, btw.)
“It feels wrong to prioritize money over other values.”
Leaving aside all the philosophy of what money is, it’s possible to hate some kinds of work enough for it to be rational to refuse them no matter how lucrative. But when one puts off certain simple adulting tasks and costs oneself thousands of dollars, for example, there’s no way to rationalize that. We both seem to be guilty of that kind of irrationality.
“I don’t have a requirement that my actions be rational.”
Opinion: It’s a moral imperative to always reject irrationality. You can fall short of rationality 6 ways from Tuesday but you can never endorse or embrace irrationality. You can acknowledge your shortcomings and work around them, find ways to cope with them, etc. But throwing up your hands and saying that irrationality is ok is shameful.
Because irrationality is by its very definition wrongness, not-ok-ness.
When you’re arguing against rationality what you mean is that you disagree about what actions are most rational. You can’t argue against rationality itself. That’s like arguing against logic. No, worse. It can be rational to argue for intuition and emotion over logic, in some senses. Arguing against rationality is like arguing against Rightness.
To be clear, I’m also extremely irrational. I’m so deeply irrational that I created Beeminder. The use of Beeminder is incontrovertible proof of one’s irrationality. No rational person would ever purposefully limit their future options and set up a penalty to pay for no reason other than changing their mind about their priorities. It’s insane. But it’s also the epitome of not accepting irrationality – of doing whatever it takes to fix it.
“I do not have to prove my rationality to you, only to myself!”
Yes, you can have absolutely any utility function under the sun. No preference is verboten. (In the mathematical sense this even allows for preferences that really are bad, like pedophilia or thinking that the earth would be better off without humans.) But suppose you find an inconsistency in your utility function (hyperbolic discounting is Beeminder’s focus; others include the sunk cost fallacy, uncalibrated predictions, and scope insensitivity; there’s an enormous list, and all humans are susceptible). Well, there’s still such a thing as bounded rationality, where the cognitive cost of being more rational exceeds the benefit. But it’s just super icky to me to be like “oh well, I’m irrational but I like it that way”. Maybe it’s mainly a shameful lack of intellectual curiosity. Like don’t you want to understand what the different factors are that make your choices rational despite seeming irrational? (And they must ultimately be rational for you to endorse them. As you say, you’re at least proving your rationality to yourself.)
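To make the hyperbolic discounting point concrete, here’s a toy calculation. (This is a standard textbook illustration, not something from the original beemail; the discount function and dollar amounts are arbitrary choices for the sake of the example.) The signature feature of hyperbolic discounting is preference reversal: the smaller-sooner reward wins when it’s imminent, but the larger-later reward wins when both are viewed from a distance.

```python
def hyperbolic_value(amount, delay_days, k=1.0):
    """Present value of a delayed reward under hyperbolic discounting:
    value = amount / (1 + k * delay).  k is the discount rate (made up here)."""
    return amount / (1 + k * delay_days)

# Deciding right now: $55 immediately beats $100 tomorrow.
now_small = hyperbolic_value(55, 0)    # 55.0
now_large = hyperbolic_value(100, 1)   # 50.0

# The same pair of rewards viewed 30 days in advance: the preference flips,
# and $100 (now 31 days away) beats $55 (now 30 days away).
later_small = hyperbolic_value(55, 30)   # about 1.77
later_large = hyperbolic_value(100, 31)  # 3.125
```

Your calm, month-ahead self prefers the bigger reward; your in-the-moment self grabs the smaller one. A precommitment device like Beeminder is a way for the month-ahead self to bind the in-the-moment self.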
In conclusion, always be fighting the good fight against irrationality, I guess. Like by beeminding!
Bringing it back to the money question that started this debate, I view that as a question of quantifying preferences, another pet topic of Bee’s and mine. So I’ll end with a cute and potentially relevant vignette from Eliezer Yudkowsky. I’m so deep in this way of thinking that I honestly can’t tell if any of this sounds idiotic, heretical, immoral, or obvious to normal people. I would not be surprised by any of those reactions so I’m curious what yours is!
Let me try to clear up the notion that economically rational agents must be cold, heartless creatures who put a money price on everything. There doesn’t have to be a financial price you’d accept to kill every sentient being on Earth except you. There doesn’t even have to be a price you’d accept to kill your spouse. It’s allowed to be the case that there are limits to the total utility you know how to generate by spending currency, and for anything more valuable to you than that, you won’t exchange it for a trillion dollars.

Now, it does have to be the case for a von Neumann-Morgenstern rational agent that if a sum of money has any value to you at all, you will exchange anything else you have – or any possible event you can bring about – at some probability for that sum of money. So it is true that as a rational agent, there is some probability of killing your spouse, yourself, or the entire human species that you will cheerfully exchange for $50.

I hope that clears up exactly what sort of heartless creatures economically rational agents are.
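The von Neumann-Morgenstern point in that vignette can be sketched with a toy expected-utility calculation. (The utility numbers below are entirely made up for illustration; the only thing that matters is that $50 is worth something positive and the disaster is terrible but finite on the scale.)

```python
# Made-up utilities on an arbitrary scale (not from the post):
U_STATUS_QUO = 0.0      # refuse the gamble, nothing happens
U_PLUS_50 = 1.0         # gaining $50 is worth something > 0
U_DISASTER = -1e9       # terrible, but finite

def expected_utility(p_disaster):
    """EU of the gamble: disaster with probability p, else status quo + $50."""
    return p_disaster * U_DISASTER + (1 - p_disaster) * U_PLUS_50

# At a one-in-a-trillion risk, the gamble beats refusing (EU of refusal is 0):
eu_tiny = expected_utility(1e-12)   # about 0.999
# At a one-in-a-million risk, it doesn't:
eu_big = expected_utility(1e-6)     # about -999
```

So the agent isn’t pricing the disaster in dollars; it’s just that for any positively valued sum of money, there’s some probability of the disaster small enough that the trade is worth it.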