Hey Dan, nice article. It reminds me of a couple contracts I’ve made that
you might find interesting.
First, my failure payoff recipient on Stickk is always a friend. Friend
choice is obviously important. I tried making my Mom the recipient once but
knew the entire time that losing $100 would just mean her getting me gifts
totaling $100 over the course of the following months. Not too motivating
to stay on track. (Now that I think about it, that sounds like a good way
to make purchases you can’t justify to your significant other – just
purchase them indirectly through your “I can’t accept your money” friend
after you intentionally lose a bet to them. Although it also sounds like a
good recipe for one or more unhealthy relationships.)
The friend that I settled on agreed to do something irresponsible with any
money he earns from our contracts. Last December I failed a contract which
ended up costing me $300 (ouch). We were at a conference with a casino at
the time, so his plan was to go up to a roulette wheel and put all the
money on black. If he lost, so be it. If he won, it meant free dinner and
drinks all night for everyone in the party except for me (the idea being
that I shouldn’t benefit from my own failure). The result: he won, spent
the winnings frivolously all night, and still had a quarter of it left by
the end. Which he proceeded to put on black again. And won again.
Effectively I funded two nights of reckless entertainment for my friends.
While I admire his follow-through on his commitment to irresponsible
behavior, I can’t say it actually gave me any more motivation to meet my
contract; as far as I was concerned, it was his money to lose at that
point. If anything, it lessened the pain of losing the money: at least
in the moment, I received more value out of watching his high-stakes antics
than I would have had he just gone home and bought groceries. Maybe I
wouldn’t feel the same way had he lost.
On another note, I’ve recently found the “public humiliation” contracts to
be very effective. Two of my friends and I entered into a race to run 100
total miles, where the losers have to erect nine-square-foot shrines to the
winner and change their Facebook profiles for a month to pictures of
themselves worshiping said shrine. To avoid the lazy outcome where nobody
finishes the
race, anyone not finishing in 50 days has to post a video of themselves
dunking a basketball to YouTube (or rather, trying to dunk and failing
miserably).
There ended up being some pretty interesting strategizing going on (e.g.,
When should I report my miles? How will my reporting affect others’
behavior?), but in the end this contract taught me most about the
importance of getting the rules right to induce the desired behavior. For
example, we set no upper bound on the amount that could be run each day,
which gave the bet an unwelcome Cold War feel, with miles-per-day counts
escalating over time to avoid the pain of social media embarrassment. At
the same time, to sustain 10 miles per day without injuring ourselves, we
slowed our “running” pace to a barely-acceptable crawl, which took still
more time away from more important matters. All this demonstrates what I’m
sure is
already known: if you want to maximize social efficiency, choosing the
contract failure consequences is only part of the battle.
Finally, regarding the article specifics, I have just one comment:
“In fact, if you set up a commitment device where you destroyed your
computer if you didn’t stay on your yellow brick road then I actually would
call that kind of immoral, even though it was your own computer. Because
you could’ve instead given it away and not wasted anything. Burning money,
in contrast, doesn’t waste anything (except some paper).”
I don’t think I consider destroying money any less immoral than destroying
a physical good. Whether you destroy a $500 computer or $500 cash, in both
cases, you’re giving up the opportunity to give away a $500 computer (it’s
just that in the second case you’d have to buy it first). Maybe destroying
the money could even be considered more immoral, since the person that
would hypothetically get the computer might prefer to have a cash donation.
BTW, I would definitely use that money-shredding alarm clock.
Best,
Eric
On Wed, Mar 6, 2013 at 1:31 AM, Daniel Reeves dreeves@beeminder.com wrote:
“prevent things that don’t make the world worse” approximately equals
“prevent things that make the world better” or “encourage things that make
the world worse.” I fear you accidentally used too many negatives here.
Oh, I actually meant it that way but it was a terrible choice for a
pullquote because you’re right that it sounded backwards out of
context.
But in a way it summed up the whole thesis of the post: it’s possible
to be very motivated by penalties that are to you very bad but that
don’t harm the world in any way. Namely the threat of having to pay
Beeminder money!
In other words, by definition your self-imposed penalty has to reduce
your own utility. Just make sure to transfer that utility elsewhere
instead of letting it be destroyed.
PS: I like “Commitment Devices that Don’t Make the World Worse” –
can’t decide if it’s too late to change the title.
–
http://dreev.es – search://“Daniel Reeves”
Goal tracking + Commitment contracts == http://beeminder.com
–
You received this message because you are subscribed to the Google Groups
“Akratics Anonymous” group.