Subtle change to self-destructing datapoints

Thanks to @alys’s brilliant Beeminder Advent Calendar post about using Pessimistic Presumptive Reports on do-more goals, we were inspired to make this feature a bit less ugly, and we figure it won’t hurt to do it in a fully backward-compatible way.

Starting today [UPDATE: UVIs #4328 and #4329] you can use any of the following magic strings in a datapoint’s comment to mark it as self-destructing, meaning that a new datapoint on the same day will replace the self-destructing one:

  1. PESSIMISTIC PRESUMPT (which must appear at the very start of the comment)
  2. #SELFDESTRUCT
  3. #THISWILLSELFDESTRUCT

All of those are case-sensitive – it only works if they’re all caps like that.

(Ideas for something more pithy but still reasonably self-documenting welcome.)
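For concreteness, here’s a minimal sketch (in Python, with made-up names; this is not Beeminder’s actual code) of what “marked as self-destructing” amounts to under the rules above:

```python
# Hypothetical sketch, not Beeminder's actual code: what "marked as
# self-destructing" means for a datapoint's comment, per the rules above.

def is_self_destructing(comment: str) -> bool:
    """True if the comment marks its datapoint as self-destructing."""
    if not comment:
        return False
    # "PESSIMISTIC PRESUMPT" only counts at the very start of the comment.
    if comment.startswith("PESSIMISTIC PRESUMPT"):
        return True
    # The hashtags can appear anywhere in the comment. Note there's no case
    # folding: the match is deliberately case-sensitive, so a lowercase
    # #selfdestruct does nothing.
    return "#SELFDESTRUCT" in comment or "#THISWILLSELFDESTRUCT" in comment
```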

While changing this, we spotted a possible Chesterton’s fence that we wanted to mention to you all, just in case someone has some spacebar-heating reason to prefer the status quo; otherwise we’re going to rip it out. Namely, in the current implementation, if a datapoint is marked as self-destructing but the replacement datapoint is also self-destructing, then we have a special case to override the self-destruction and keep both datapoints. That makes no sense, right? Also, death to if-statements! Speak now or forever hold your peace?

4 Likes

If there are multiple SELFDESTRUCT data points, do they all go away when a new data point is added?

1 Like

Affirmative! (That’s what the code seems to be saying; would be good to confirm. The latest code is now officially deployed.)

Well, this is funny. I think I now understand the purpose of the Chesterton’s fence. If you try to add multiple self-destructing PPRs at once, they confusingly annihilate each other. Now I’m not sure whether to bite the bullet and call that expected behavior!

But for now it means the answer to @byorgey’s astute question is that it’s moot. There’s no way to ever have multiple SELFDESTRUCT datapoints, because each new one destroys the previous one, and if you try to add them simultaneously they mutually self-destruct and you get none at all!

Double-oops, I was all wrong: we’d just introduced a bug, and for an hour or so there, SELFDESTRUCT datapoints were always self-destructing. It was impossible to add them at all. :person_facepalming:

UPDATE: Ok, the bug is fixed, but it’s so embarrassing. The Chesterton’s fence should not in fact have been ripped out! The new status quo is that everything works as advertised, except we currently still have the special case that SELFDESTRUCT datapoints do not cause other SELFDESTRUCT datapoints to self-destruct.

5 Likes

Out of curiosity, why does that end up being necessary? It’s not immediately obvious (or even a few-minutes-of-thinking-obvious) why it’s critical to the self-destruct functionality :thinking:

1 Like

Since you asked, here’s the note from the internal gissue:

See the forum thread for the embarrassing way this played out. Hint: we should not have ripped out the Chesterton’s fence! Who would’ve thought?? I still think the special case is dumb, where self-destructing datapoints fail to self-destruct because the replacement is also self-destructing. That still makes no sense. But we do need a special case, sort of. Namely, when a new datapoint comes in and we’re iterating through the existing datapoints looking for any that should self-destruct, DO NOT INCLUDE THE NEW REPLACEMENT DATAPOINT IN THAT LIST. We only want to delete any already existing datapoints marked as self-destructing.

When we naively removed that Chesterton-fencey special case, we started doing this:

  1. Add a new self-destructing datapoint, D
  2. For each datapoint X with the same date as D, delete X if X is self-destructing
  3. Oops, D itself was one of those X’s so we just deleted D no matter what

In conclusion, let’s nix the dumb special case but still exclude the replacement datapoint itself when doing the self-destructing.

So that is now done, thanks to @bee, who scoped the query properly so we don’t include the newly added datapoint in the candidates for self-destruction.
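For anyone curious what that scoping amounts to, here’s a minimal sketch in the same hypothetical Python as above, reusing the is_self_destructing helper; the names and data structures are illustrative, not Beeminder’s actual schema:

```python
# Hypothetical sketch of the fixed self-destruct sweep; names and data
# structures are illustrative, not Beeminder's actual schema.
from dataclasses import dataclass
from typing import List

@dataclass
class Datapoint:
    id: int
    daystamp: str   # e.g. "20231106"
    value: float
    comment: str = ""

def add_datapoint(datapoints: List[Datapoint],
                  new_dp: Datapoint) -> List[Datapoint]:
    """Add new_dp and delete any self-destructing datapoints on the same day."""
    # Step 1: the new datapoint is saved like any other.
    datapoints = datapoints + [new_dp]
    # Step 2: sweep same-day datapoints for self-destructors, but scope the
    # candidates so the newly added datapoint is excluded -- otherwise a
    # self-destructing new datapoint would immediately delete itself (the bug
    # described above).
    return [
        dp for dp in datapoints
        if dp.id == new_dp.id                      # never the replacement itself
        or dp.daystamp != new_dp.daystamp          # only same-day datapoints
        or not is_self_destructing(dp.comment)     # only ones marked to go
    ]
```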

The only possible weirdness now is that if for some reason you try to add multiple self-destructing datapoints at once, only the last one in the list actually gets added; all the others self-destruct. That makes sense when you think about it, so I’m inclined to call it as-designed. It also makes for an easy answer to @byorgey’s question: there’s no such thing as multiple self-destructing datapoints, ever.
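Continuing the same hypothetical sketch, adding several self-destructing datapoints one after another (which is presumably what a batch add reduces to) plays out like this:

```python
# Hypothetical continuation of the sketch above: each new self-destructing
# datapoint wipes out its predecessor, so only the last one remains.
dps: List[Datapoint] = []
for i in range(3):
    dps = add_datapoint(dps, Datapoint(id=i, daystamp="20231106",
                                       value=0.0, comment="#SELFDESTRUCT"))
assert [dp.id for dp in dps] == [2]  # only the last one survives
```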

2 Likes

Ahh, that totally makes sense! Super neat, thanks for sharing! :smiley:

1 Like

Loving this! Like @alys, I also liked to use the self-destructing “pessimistic presumptive” datapoints for non-presumptive reasons. (E.g., I had an IFTTT recipe that added one every day, to force me to add regular datapoints on non-cumulative goals.) I like being able to add the #selfdestruct tag since it’s more descriptive for the purpose and, more importantly, puts a hashtag that I can visibly see on the graph to remind me.

2 Likes

Challenge accepted.

3 Likes