math ftw, equation describing universal pessimistic presumptive reports (over-the-top beeminder nerdery)


#1

Repeating an anecdote from our internal mailing list, since it’s relevant to the weight loss question Jake Hofman brought up. (Oh, ha, and I hadn’t even noticed till now that Jake is a prime example of the claim below about physicists!)

This is why physicists are better programmers than computer scientists are. Math FTW!

The amount of grief and future debugging we just saved ourselves by spending 20 minutes drawing graphs and writing equations on paper: kinda staggering.

Here’s the answer to pessimistic presumptive reports: r - yaw*lnw
[lnw = lane width, yaw = +1 if good side of road is up, -1 if down, r = daily rate of the YBR]

That works for ALL graph types, including Do More, where a PPR doesn’t even make sense (the formula yields zero as the PPR, which we’ll special-case: if PPR == 0 then suppress it altogether).

In the case of PPRs for Do Less, where yaw = -1 and the lane width equals the daily rate (lnw = r), we get r - yaw*lnw = r - (-1)r = 2r, which is how we implemented it originally.
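As a sketch, the rule above (including the special case of suppressing a PPR of zero) might look like the following in Python. The function names are illustrative only, not Beeminder’s actual code, and it assumes, as the examples in this post imply, that for Do More and Do Less goals the lane width equals the daily rate:

```python
def ppr(r, yaw, lnw):
    """Pessimistic presumptive report for a day with no data.

    r   -- daily rate of the yellow brick road (YBR)
    yaw -- +1 if the good side of the road is up, -1 if down
    lnw -- lane width
    """
    return r - yaw * lnw

def maybe_ppr(r, yaw, lnw):
    """Return the PPR, or None to suppress it entirely.

    The PPR == 0 special case covers Do More goals, where a
    pessimistic presumptive report doesn't make sense.
    """
    value = ppr(r, yaw, lnw)
    return None if value == 0 else value

# Do More: good side is up (yaw = +1); with lnw = r the PPR is
# r - r = 0, so it gets suppressed.
assert maybe_ppr(5, +1, 5) is None

# Do Less: good side is down (yaw = -1); with lnw = r the PPR is
# r - (-1)*r = 2r, matching the original implementation.
assert maybe_ppr(5, -1, 5) == 10
```

The point of the post in code form: one equation replaces the per-graph-type case analysis, with a single special case layered on top.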

Squee! I sure like math!

I was doing Project Euler problems the other day and thinking how preposterously far removed it was (writing little programs to solve those contrived math problems) from day-to-day getting actual shit done on real web apps. But this reminds me that it’s surprisingly relevant.

Aaron Parecki chimes in: “in very specific cases.”

Me again: But it’s more often than you’d think, is the point. Like we could easily have just plowed forward writing code to make the right things happen in all the various cases without it ever occurring to us that there was an elegant mathy generalization that could replace all that code with an equation.

PS: Here are the 4 pages of pictures and equations that would’ve all been reams of code we’d have to maintain if we hadn’t stepped back, done the math, and turned it into “r - yaw*lnw”:

PPS: And of course once you stare at “r - yaw*lnw” for a minute it all becomes perfectly obvious: the PPR follows the YBR from one day to the next (the “r” part) but then pulls back by one lane width in the bad (as in bad side of the road) direction.


#2

On Mon, Dec 2, 2013 at 9:53 PM, Daniel Reeves dreeves@beeminder.com wrote:

The amount of grief and future debugging we just saved ourselves by
spending 20 minutes drawing graphs and writing equations on paper:
kinda staggering.

nice—a great example of the value of thinking before coding!

my advisor used to refer to this as “code at the last possible
minute”, which he recently reminded himself of publicly:

https://twitter.com/chrishwiggins/status/398105787807313921

-j


#3

Yes, yes, yes!

I have come to believe that (“abstract”) math is relevant to actually
getting stuff done — more elegantly, simply, maintainably, and concisely
— more often than you’d think, even when taking this statement into account. =) The trick—as it seems you have found from experience—is
that it often actually requires stepping back, thinking carefully, and
asking the right questions to see it. In my humble opinion, if anyone
thinks that math is only applicable “in very specific cases”, it’s only
because they either (a) don’t know enough math or (b) don’t spend enough
time asking the right questions / thinking about how it might be
applicable. If you had only ever seen duct tape used for taping ducts, and
hadn’t ever given much thought to other possible uses of duct tape, you
could be forgiven for thinking that duct tape is only applicable “in very
specific cases”.

In a sense, this is what a lot of my research is all about. See this
paper (http://www.cis.upenn.edu/~byorgey/pub/monoid-pearl.pdf) for an
example of how you can take a simple mathematical idea and get a
whole lot of mileage out of it for actually Getting Stuff Done (in this
case, producing vector graphics: http://projects.haskell.org/diagrams).
Some computer scientists actually do like math—though not nearly enough.
I think physics, as a discipline, is mature enough that the essential
nature of mathematics has become obvious to everyone in the field; computer
science is just a baby discipline and has some catching up to do. 50 years
or so from now, I think you will be laughed out of an interview for a
coding job if you admit that you don’t really know that much math (whereas
now I imagine you can have a good laugh with the interviewer about how
useless all that silly abstract math nonsense is).

-Brent