Greedy Sadistic Pirates

Similar to Greedy Pirates but a bit more difficult — perhaps. I’m actually not sure yet what the correct answer is.

A bunch of pirates have stolen a treasure chest full of gold and they decide to divvy it up as follows:

The shortest pirate proposes an allocation and they all vote on it (including the proposer). If the proposed allocation passes the vote they go with it. If not – or even if there’s a tie – they heave the shortest pirate overboard and start over with the new shortest pirate.

Assume that when they kill the short person they feel a tinge of glee, but they’ll gladly let them live if they stand to gain so much as a doubloon by it.

So if you’re the shortest pirate what allocation should you propose?


Ha, I was considering my version an error and just edited it to exactly what you’ve posed here, but I like this better! I’m undoing my edit to the original now!

Actually, I realized there’s something else we were taking for granted but which seems reasonable: there is one thing pirates care about even more than money, and that is staying alive.


A very small hint:

Even in this version of the problem, with three pirates the third one should not appease the second pirate with a single coin.


With two pirates, even if the second pirate offers the first pirate all the money, the second pirate’s gonna die.

So with three pirates, the second pirate will vote for anything where she stays alive. So the third pirate can propose to keep all the money and the second pirate will vote for it.

With four pirates, the fourth pirate needs three votes, and no matter what, the third pirate’s vote can’t be won, because if the fourth pirate’s allocation is voted down, the third gets all the money plus one death, which is one death better than anything the fourth pirate can offer.

But if the fourth pirate dies, the first and second get nothing, so all she needs to offer them is one gold piece each. I’ll write this as (1, 1, 0, x-2), where x is the total number of coins.

With five pirates, the fifth pirate needs three votes, so he needs to pay off the third pirate with one gold piece and either the first or second with two: (0, 2, 1, 0, x-3) or (2, 0, 1, 0, x-3).

Here it gets tricky, because the first and second don’t know for sure whether they’ll get 0 or 2 if the sixth pirate dies.

With six pirates, the sixth pirate needs four votes, so she has to pay off three other pirates. The fourth pirate can be paid off with 1 for sure. The third pirate can be bought off with 2 for sure. But the first and second may be optimistic sadistic pirates who think they’ll be chosen for a payoff, or realistic sadistic pirates who think the choices will be randomly made.

If they’re realistic, they both expect 1 gold piece on average, plus one death, if the sixth pirate dies. So the sixth pirate will need to offer one of them 2.

So it’s either (2, 0, 2, 1, 0, x-5) or (0, 2, 2, 1, 0, x-5).

With 7 pirates, the 7th needs three other votes, and can get them with (2, 0, 0, 2, 1, 0, x-5) or (0, 2, 0, 2, 1, 0, x-5). He just needs to offer the fourth pirate 2, the fifth pirate 1, and either the first or second 2.

With 8 pirates, the 8th needs 4 other votes. The 6th costs 1, the 3rd costs 1, and the 1st, 2nd, and 5th all cost 2, so she needs to pay out two of those three.
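
In case it’s useful, here’s a tiny Python sketch (my own code, with a made-up 100-coin pot) for double-checking the vote arithmetic in proposals like these. It assumes a non-proposing pirate votes yes only if the offer strictly beats the coins they expect to end up with once the proposer goes overboard, which is the “realistic” reading above.

    # A pirate (other than the proposer, listed last) votes yes only if the
    # offer strictly beats their expected coins after the proposer is killed.
    # (The stay-alive clause never comes up in this particular check.)
    def passes(offer, fallback):
        yes = 1  # the proposer votes for their own plan
        yes += sum(now > later for now, later in zip(offer[:-1], fallback[:-1]))
        return yes * 2 > len(offer)  # strict majority; a tie means a splash

    # The 8-pirate proposal above, out of a pot of 100: pirates 3 and 6 get
    # one coin, pirates 1 and 2 get two coins each, pirate 8 keeps 94.
    offer    = [2, 2, 1, 0, 0, 1, 0, 94]
    # Expected coins if pirate 8 dies and pirate 7 proposes: pirates 1 and 2
    # each expect 1 on average, pirate 4 gets 2, pirate 5 gets 1, pirate 7
    # keeps 95, and pirate 8 gets nothing (being dead).
    fallback = [1, 1, 0, 2, 1, 0, 95, 0]
    print(passes(offer, fallback))  # True: pirates 1, 2, 3, 6 plus the proposer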

@zedmango That’s about as far as I had gotten too. But I’m not sure I buy the argument about how the sixth pirate can pay off the first and second pirates. It seems like it depends on how risk-averse the pirates are. Also it makes me wonder whether there are weird scenarios in which the pirates make coalitions and promise votes in exchange for favorable allocations, or whether that could possibly make a difference. Maybe they don’t trust each other though.


I don’t really think of pirates as being risk-averse :wink:

But yeah, to be on the safe side, the sixth pirate may need to offer 3 gold to the first or second pirate, which is definitely more than they’d get from the fifth pirate no matter what.

I’m not seeing how coalitions or promises could make a difference - can you elaborate?


Well, for example, suppose there are six pirates and the sixth offers two coins to the first pirate and none to the second. Now suppose that before the vote, the fifth pirate secretly goes to the first pirate and promises to give them two coins if they just vote against the sixth pirate’s proposal. If the first pirate trusts this, it is clearly in their interest to vote against the sixth’s proposal, since they will get the same amount of money but one extra death. In this scenario, the sixth pirate has to offer three coins to one of the first two, regardless of how risk-averse they are.

Of course, by the same token, the fifth pirate could secretly promise to allocate three or more coins to the first pirate if they vote against the sixth pirate. It is less clear that the first pirate should trust this promise, since once the sixth pirate is dead the fifth pirate doesn’t necessarily have an incentive to keep their promise.


Ooh I like this!

I think we should assume that the pirates’ code of honor requires them to keep all agreements on penalty of death.


Oh dear. I think this changes things quite drastically. :laughing:


To clarify, do you mean “give them two coins no matter what” or “propose an allocation in which they get two coins if the allocation is accepted”?


I meant the latter. But I suppose pirates who aren’t in a position to make a proposal could still promise to give coins to other pirates out of the coins they get when another pirate’s proposal is accepted. (Let’s assume that pirates don’t have other, pre-existing coin stashes out of which they could pay off other pirates.) Do you have an example in mind where this makes a difference?


The problem requires a reasonable assumption about how pirates feel about risk and violence.
Basically, would they take a sure 2 coins over a murder plus a 2/3 chance of getting 2 coins?

If we assume that they’d take 2 coins in this situation, then the rest is simple.

If there’s one pirate, he gets everything.
If there are 2 pirates, the tallest still gets everything and the second tallest dies.
If there are 3 pirates, the third tallest gets everything. The second tallest would support him purely out of self-preservation.
If there are 4 pirates, the shortest can give the tallest and second-tallest one coin each, and take the rest.
Now, if there are 5 or more pirates, the following pattern ensues: the shortest takes most of the money, the second-shortest gets nothing, and the third-shortest gets one coin. This gives us 2 votes, uses 3 pirates, and leaves a pool of N-3 pirates. We need half of the votes in the pool if it’s even-sized and a majority if it’s odd-sized. We get them by offering the appropriate number of pool pirates 2 coins each.
They have to agree, because if they decline there’s around a 50% chance that they’ll get nothing at all and around a 50% chance that they’d get the same 2 coins. We’ve assumed that the loss is not worth the murder.

R
R, 0 (no winning play here unfortunately)
0, 0, R
1, 1, 0, R
[2x1/2] 1, 0, R
[2x2/3] 1, 0, R
[2x2/4] 1, 0, R
[2x3/5] 1, 0, R
[2x3/6] 1, 0, R

with [2xa/b] meaning a pirates out of b get 2 coins and R meaning the remainder of the treasure.

However, if the pirates are especially bloodthirsty and risk-seeking - so that they would take a 50% chance of 2 coins with a side dish of murder over a plain 2 coins any day - we’d have to appease them with 3 coins instead. Then the solution will be:

R
R, 0 (no winning play here unfortunately)
0, 0, R
1, 1, 0, R
[2x1/2] 1, 0, R
[3x1/2] 2, 1, 0, R
[3x1/3] 2, 1, 0, R
[3x2/4] 2, 1, 0, R
[3x2/5] 2, 1, 0, R
[3x3/6] 2, 1, 0, R
[3x3/7] 2, 1, 0, R
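
For what it’s worth, here’s a small Python sketch (my own code; the function name and layout are just illustrative) that reproduces both lists from the vote counting above, with bribe=2 for the risk-neutral pirates and bribe=3 for the bloodthirsty ones:

    def scheme(n, bribe=2):
        # Rows in the same notation as above: [price x a/b] plus a fixed tail.
        base = {1: "R",
                2: "R, 0 (no winning play)",
                3: "0, 0, R",
                4: "1, 1, 0, R"}
        if n in base:
            return base[n]
        votes_needed = n // 2 + 1          # strict majority; a tie means death
        if bribe == 2 or n == 5:           # the 5-pirate row is the same in both lists
            tail, sure_votes, price = "1, 0, R", 2, 2     # proposer + third-shortest
        else:
            tail, sure_votes, price = "2, 1, 0, R", 3, 3  # plus the fourth-shortest
        pool = n - len(tail.split(", "))   # the tallest pirates, paid `price` or nothing
        a = votes_needed - sure_votes      # how many of the pool must be paid
        return f"[{price}x{a}/{pool}] {tail}"

    for n in range(1, 12):
        print(scheme(n, bribe=3))          # bribe=2 reproduces the first list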

Here’s my solution, adapted from my solution to “Greedy Remorseful Pirates”. I’m assuming no enforceable contracts. That makes this a noncooperative game theory problem. (I don’t yet know how to solve the cooperative case – what @byorgey and @zedmango were discussing – where you do have enforceable contracts but I agree it could be very interesting!)

Also, contra @zedmango’s and @daerdemandt’s solutions, I’m assuming no uncertainty. To achieve that I’m using one more assumption about how pirates break ties: they always favor pirates closest to their own height. I don’t think that assumption is strictly needed but it avoids a big mess of multiple Nash equilibria.

Also I’m ordering the pirates shortest to tallest. So pirate 1 is the shrimpy proposer.

In the 2-pirate case the short pirate (the proposer) is utterly doomed. There’s no way to appease the tall pirate. Even if you give them all the gold, they don’t lose any gold by killing you so they do.

In the 3-pirate case it’s the total opposite: the shrimpy proposer can have everything. Why? Because if the proposer is killed, the 2nd pirate becomes the shortest pirate in the 2-pirate case. Not an enviable position, as we just saw. Since that person will die if you die, they’ll vote yes and you have your majority. No need to give the tallest pirate anything.

In the 4-pirate case, you as shortest pirate have no way to win the 2nd pirate’s vote. They get everything if you die, by becoming the proposer in the 3-pirate case. So don’t give that pirate anything; they’re a lost cause. Instead, you need to win over the 2 tallest pirates (3 and 4) to get a majority. Giving them 1 coin each does the trick. That’s 1 coin better than they each get by killing you, so they’re on board.

In the 5-pirate case, the 2nd pirate is still a lost cause and you still need to appease 2 other pirates. Pirate 3 gets nothing if you die (they become pirate 2 in the 4-pirate case) so 1 coin suffices for them. But pirates 4 and 5 each get 1 coin if you die (by becoming pirates 3 and 4 in the 4-pirate case). So pick one of them – breaking the tie in favor of the shorter one, 4 – and give them 2 coins. That’s your 3rd of 5 votes so no need to waste any money on pirate 5.

Now we can see the pattern emerge. The shortest pirate allocates nothing to pirate 2, 1 coin to pirate 3, 2 coins to pirate 4, 0 coins to pirate 5, 1 coin to pirate 6, and continues alternating 0 and 1 coin for the remaining pirates.

Here’s a table showing the cases up to 9. Each column shows the allocation for a given number of pirates. A * means the remainder of the treasure, ! means certain death, and the tick marks are the yes votes. You can look up and to the left from each cell to see what that pirate gets if the proposer dies (if that’s worse, they vote yes).

   1  2  3  4  5  6  7  8  9
1  *' !' *' *' *' *' *' *' *'
2     *  0' 0  0  0  0  0  0
3        0  1' 1' 1' 1' 1' 1'
4           1' 2' 2' 2' 2' 2'
5              0  0  0  0  0
6                 1' 1' 1' 1'
7                    0  0  0
8                       1' 1'
9                          0

For a proper proof we just need to consider the allocation for the n-pirate case – {*, 0, 1, 2, 0, 1, 0, 1, 0, 1, …} – and show that a majority vote yes. We get 1 vote from the proposer (pirate 1), 1 vote from pirate 3, and 1 from pirate 4. That’s 3 out of 4 votes so far. For pirates 5 through n we get votes from all the 1’s in the allocation (each such pirate would get 0 if the proposer died), which is half of them if n is even, so we’re set. If n is odd then the nth pirate is getting 0 and voting no. In that case we have 3 of 4 yeses, then half yeses for pirates 5 through n-1, then a no. Grouping the last no with the first 4, that’s still 3 out of 5 and then half of the rest, so still, barely, a majority. QED.
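
And in case anyone wants to check the table mechanically, here’s a short Python sketch of that backward induction (the code, the 100-coin pot, and the function name are mine, not part of the puzzle). It encodes exactly the assumptions above: a pirate votes yes only to stay alive or for strictly more coins than they’d get once the proposer goes overboard, and the proposer bribes the cheapest pirates, breaking ties in favor of shorter ones.

    TREASURE = 100   # illustrative pot; assumed big enough to cover the bribes

    def allocation(n, treasure=TREASURE):
        """coins[i] is pirate i+1's take (pirate 1 = shortest = proposer);
        dead says whether the proposer gets thrown overboard."""
        if n == 1:
            return [treasure], False
        if n == 2:
            # The tall pirate loses nothing by voting no, so the proposer dies.
            return [0, treasure], True
        # If the proposer dies, pirate i becomes pirate i-1 in the (n-1)-pirate game.
        next_coins, next_dead = allocation(n - 1, treasure)
        # Price of pirate i's vote: free if they'd die next round, otherwise
        # one coin more than their continuation payoff.
        prices = []
        for i in range(2, n + 1):
            would_die = next_dead and i == 2
            prices.append((0 if would_die else next_coins[i - 2] + 1, i))
        bribes_needed = n // 2             # strict majority minus the proposer's own vote
        coins = [0] * n
        for price, i in sorted(prices)[:bribes_needed]:   # cheapest first; shorter wins ties
            coins[i - 1] = price
        coins[0] = treasure - sum(coins)   # proposer keeps the rest
        return coins, False

    for n in range(1, 10):
        print(n, allocation(n)[0])

With a 100-coin pot this prints, for example, [95, 0, 1, 2, 0, 1, 0, 1, 0] for nine pirates, matching the last column of the table.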

PS: Other places around the internet where a version of this puzzle has been discussed:

  1. https://en.wikipedia.org/wiki/Pirate_game
  2. https://fivethirtyeight.com/features/can-you-solve-the-puzzle-of-the-pirate-booty/
  3. https://waitbutwhy.com/2016/06/pirate-booty-puzzle.html
  4. https://www.thebigquestions.com/2015/09/10/are-you-smarter-than-google-part-three/ (points out the multiple equilibria problem, where everyone can vote against their own interest because everyone else is and so no one’s vote matters; the concept of trembling-hand perfect equilibrium fixes that)