Making a Beeminder GPT

I’m looking into this, to do things like talk you through your goals, offer alternate visualizations, even “gate” goals (as in, you have to paste a status report into the GPT to be able to increment a goal).
However, I’m having a hard time getting past the (previously discussed) bad decision to only support API auth via a URL parameter.
Can we get the ability to pass it in via the Authorization header? Please?


Well, so the basics of doing this are pretty simple, but because we issue two different types of auth tokens (and look up & verify the owner in different ways depending on the type of token), header-based authorization becomes more of a thinker. Currently we distinguish them by the parameter name they’re passed with (i.e. auth_token vs access_token). Some options:
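For context, the current URL-parameter scheme looks something like this (the token values below are made up, and the endpoint is just one example from the API):

```python
# Sketch of the two existing query-parameter auth styles.
# Token values here are placeholders, not real credentials.
from urllib.parse import urlencode

base = "https://www.beeminder.com/api/v1/users/me.json"

# Personal auth token, passed as auth_token:
personal = base + "?" + urlencode({"auth_token": "abc123"})

# OAuth access token, passed as access_token:
oauth = base + "?" + urlencode({"access_token": "xyz789"})

print(personal)
print(oauth)
```

The server can tell the two apart only because the parameter *name* differs; a bare `Authorization: Bearer <token>` header carries no such hint, which is the whole problem.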


  1. try to look up the user by both types of token {if (lookup A) else (lookup B)}
  2. only allow one type of token in header auth
  3. invent some new syntax where you have to include “Bearer access:” or “Bearer auth:” or whatever
  4. shunt personal-token auth into “basic auth”, where you use username:token concatenated together as if the token were a password, and the bearer token type is only for access tokens?
  5. differentiate token type based on characteristics of the token (brittle, and we probably don’t want to commit to token formats staying the same indefinitely).

This might be silly, but maybe a different base URL?



My favorite solution would be:

  1. Deprecate auth tokens
  2. Give the user the ability to create an arbitrary number of named access tokens (which secretly create single-user oauth apps behind the scenes)
  3. Only support access tokens in the Authorization header.

I accept this is an unreasonable amount of scope creep, though.


What I’m leaning toward is to update v1 auth to accept only access_tokens in the header, with the intention of later deprecating auth_tokens and doing the arbitrary named access token thing. That makes it possible (though slightly more complicated) for @patimen to move on with making the Beeminder GPT, without doing something too egregiously gross/brittle/complex or scope-creepy right now.
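In pseudo-server terms, the plan amounts to something like the following sketch (token stores and values are invented; this is not the real implementation):

```python
# Hypothetical sketch of the chosen approach: the Authorization header is
# checked against access tokens ONLY; personal auth_tokens must still come
# in as a query parameter. All data below is made up for illustration.
ACCESS_TOKENS = {"xyz789": "alice"}
AUTH_TOKENS = {"abc123": "bob"}

def authenticate(headers, params):
    """Return the owning user, or None if no valid credential was supplied."""
    auth = headers.get("Authorization", "")
    if auth.startswith("Bearer "):
        # Header auth: access tokens only, per the plan above.
        return ACCESS_TOKENS.get(auth[len("Bearer "):])
    # URL-parameter auth keeps working for both token types, as before.
    if "access_token" in params:
        return ACCESS_TOKENS.get(params["access_token"])
    if "auth_token" in params:
        return AUTH_TOKENS.get(params["auth_token"])
    return None

print(authenticate({"Authorization": "Bearer xyz789"}, {}))  # -> alice
print(authenticate({}, {"auth_token": "abc123"}))            # -> bob
print(authenticate({"Authorization": "Bearer abc123"}, {}))  # -> None: auth_tokens are rejected in the header
```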


Okay, I updated things so that you can send access_tokens (but not your personal auth_token) in the Authorization header.


Do you have an example of how this should look?

I tried this:

> curl -H 'Authorization: Bearer my-access-token' -s ''
{"errors":{"message":"Token missing or incorrect token.","token":"no_token"}}

Hmm, that looks correct, and the error is unexpected. I am looking through git history to see if/when we might have broken it.


Welp, that was a nice little lesson plan past me set up for present me.

Did we break something after the fact with other changes we’ve made lately (like changing how we handle missing “Accept” headers, or how we handle glomarization)?

/me digs through git history

Nope! Turned out the whole thing was broken from the start due to a combo of two dumb errors. So sloppy, right!?

Well, only sorta. It turns out I actually wrote quals for this when I made the change… so what’s going on?

A few minutes later, I see that the quals were not testing what I thought they were testing, and so they happily returned “success”-es that were meaningless. :woman_facepalming: So anyway, fixed the easy-peasy-sloppy-queasy errors, fixed the quals to actually test the Authorization header, and there’s a deploy on its way out in just a little bit.
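The trap here generalizes. A minimal sketch (with a deliberately broken, made-up authenticator standing in for the real bug): a test that supplies the token in *both* the header and the query params can go green even when header handling is entirely dead.

```python
# Contrived illustration of a test that passes for the wrong reason.
# ACCESS_TOKENS and the bug below are invented, not Beeminder's code.
ACCESS_TOKENS = {"xyz789": "alice"}

def authenticate_broken(headers, params):
    # Bug (contrived): the Authorization header is silently ignored.
    return ACCESS_TOKENS.get(params.get("access_token"))

# Sloppy qual: token in BOTH places -- green, but it proves nothing
# about the header path.
assert authenticate_broken({"Authorization": "Bearer xyz789"},
                           {"access_token": "xyz789"}) == "alice"

# Honest qual: token in the header ONLY. Against the broken code this
# returns None instead of "alice", which is exactly the failure you
# want the test to surface.
print(authenticate_broken({"Authorization": "Bearer xyz789"}, {}))  # -> None
```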

I relate all this in such detail because I thought it was an interesting lesson. Writing quals is good and valuable, but also not sufficient on its own to prevent bugs or verify your code. Which I knew, but anyway, it’s a good allegory.


Very true. I generally try to write my tests before I make them pass so I can see them fail, for this very reason.


Thanks for the thorough investigation and write up! I can confirm this is now working.


So you’re using GPTs for your beeminders now?

Mind sharing your implementation? Sounds interesting…


No, I just meant the Authorization header part. But that hopefully unblocks other people looking into GPT integration if they want to.