# Post-mortem: A silly Whittle Down alternative for goals with difficulty curves

Hey everyone. I’ve been skimming the forums a bit, catching up with all the awesome goals and ideas over the last few years.

I just wrapped up a goal that was a little crazy/silly but figured some of you might find it interesting.

I had a goal that any sane person would set up as a Whittle Down goal (a pretty significant amount of data entry). I knew that the first half of it would go quickly, the last third would take quite a bit more time, and the last few percent would be all the complicated bits and would take a while.

I was inspired by a little game I sometimes play on a treadmill to keep myself going - I count down how much I have left based on the fraction remaining: “1/3rd left!” “1/4th left!” “1/5!” “1/6!” “1/7!”… etc. As I get towards the end, I get a nice Zeno-like rush as the fractions tick by quicker and quicker (which makes the last part of the run feel like it’s going by faster than it actually is). And, well, if you focus on just the denominators (the 3 in 1/3, the 4 in 1/4), you can reframe the whole thing as an odometer “Do More” goal where you’re increasing the denominator of the fraction representing how much you have left to do.

It sounded silly, but I went ahead and tried it anyway. Some thoughts:

• This worked well for me because the odometer value was easy to compute automatically (total / (total - howMuchCompletedSoFar)), so I didn’t need to do math when entering in progress.
• The numbers increased faster than I expected. It meant that a rate of 0.5 a day could be appropriate today but crazy slow a week later. I retro-ratcheted up a few times, but that didn’t help as much as I had hoped.
• I found a much more effective/fun way of retro-ratcheting my goal was to compute how fast I’d expect to be going in a week, and then set that as the new rate. So for example, if I was at 0.5 a day and had a week of buffer, I might set the rate to 10 a day. That created some pressure to keep at it through the week, so that by the time the new rate took effect I was far enough along that the rate was actually achievable.
• The graph in the end wasn’t super meaningful (although it was meaningful for most of the goal).
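For concreteness, the odometer value from the first bullet can be sketched like this (function and variable names are my own, not from the original goal setup):

```python
def odometer_value(total: float, completed: float) -> float:
    """Denominator of the 'fraction remaining' - e.g. with 2/3 done,
    1/3 remains, so this returns 3. This number is what gets entered
    as the Do More datapoint."""
    remaining = total - completed
    if remaining <= 0:
        raise ValueError("goal already complete")
    return total / remaining

# e.g. a hypothetical 600-item data-entry goal:
print(odometer_value(600, 300))  # halfway done -> 2.0
print(odometer_value(600, 400))  # two thirds done -> 3.0
print(odometer_value(600, 594))  # 99% done -> 100.0
```

Note how the last one percent of real work moves the odometer from 100 toward infinity, which is exactly the “numbers increased faster than I expected” effect described above.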

I don’t think I’d recommend this kind of experiment in general but it was a fun way for me to dive into what was a daunting project.

5 Likes

Pretty clever!

I might try something like this the next time I have an appropriate goal, one where I expect the difficulty/aversiveness of the task to rise as I progress.

In a sense, this is a manual implementation of the (long since removed) exponential road feature that Beeminder used to have. As a feature, exponential roads just weren’t pulling their weight, but a lightweight trick like this one seems like a very worthwhile way to accomplish the same ends.

2 Likes

Yeah; it’s very much a manual exponential goal.

I haven’t done the exact math on how close it is to something like e^x. The way I kept gauging how fast the curve grew was to calculate the distance between successive numbers:

1/3 - 1/4 = 1/12
1/4 - 1/5 = 1/20
1/(n-1) - 1/n = 1/((n - 1)n)
etc.

2 Likes

The Euler–Mascheroni constant is the difference, in the limit, between the partial sums of the harmonic series and the natural logarithm.

The amount of work you need to do on day $n$ is $\frac{1}{n}$, so the total amount of work done in the first $n$ days is $\sum_{k=1}^{n} \frac{1}{k}$: the $n$th partial sum of the harmonic series, which, as mentioned, approaches $\ln n$ plus a constant (approximately 0.5772) in the limit.
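That approximation is easy to check numerically (a quick sketch; the constant is just hard-coded to a few digits):

```python
import math

def harmonic(n: int) -> float:
    """n-th partial sum of the harmonic series: 1 + 1/2 + ... + 1/n."""
    return sum(1 / k for k in range(1, n + 1))

GAMMA = 0.5772156649  # Euler-Mascheroni constant, truncated

for n in (10, 100, 1000):
    # The difference harmonic(n) - ln(n) shrinks toward GAMMA as n grows.
    print(n, harmonic(n), math.log(n) + GAMMA)
```

At n = 1000 the two values already agree to about three decimal places.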

4 Likes