A Shapley example

I've been wondering lately why we don't think about Shapley values more. The argument is, in brief, that whatever their flaws, they're the best algorithm we have for computing something of great interest: how to attribute the value generated by a whole to the parts of that whole.

Consider the "glove game": there is one left-handed glove and two right-handed gloves. A matched pair is worth one unit of value. How much does each glove contribute to that unit of value?

It turns out that the left-handed glove is assigned 2/3 of the unit and each right-handed glove 1/6. This is already pretty surprising to me: I might have guessed 1/2-1/4-1/4 instead.
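This is easy to verify by brute force. Here's a minimal Python sketch (the function names `shapley` and `glove_value` are mine, not from any library) that computes exact Shapley values by averaging each player's marginal contribution over all orderings:

```python
from fractions import Fraction
from itertools import permutations

def shapley(players, value):
    """Exact Shapley values: average each player's marginal
    contribution over every ordering of the players."""
    vals = {p: Fraction(0) for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = set()
        for p in order:
            before = value(coalition)
            coalition.add(p)
            vals[p] += Fraction(value(coalition) - before, len(orders))
    return vals

def glove_value(coalition):
    """A coalition is worth the number of matched pairs it can form."""
    lefts = sum(1 for p in coalition if p.startswith("L"))
    rights = sum(1 for p in coalition if p.startswith("R"))
    return min(lefts, rights)

values = shapley(["L", "R1", "R2"], glove_value)
for player in sorted(values):
    print(player, values[player])  # L 2/3, R1 1/6, R2 1/6
```

Enumerating all orderings is exponential, so this only works for toy games, but for a toy game that's exactly what we want.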

Now let's generalize the situation: more right-handed gloves join the team, so that there is one left-handed glove and N right-handed gloves. Figuring out the Shapley values here is a nice elementary exercise. It turns out that the left-handed glove is assigned N/(N+1) and each right-handed glove is assigned 1/(N(N+1)).

I would not have guessed this (at least not before I'd lived with Shapley values for a while). More concretely: if two more right-handed gloves show up, the single left-handed glove gets much more valuable: its Shapley value goes from 2/3 to 4/5. And each of the right-handed gloves gets much less valuable: they each go from 1/6 to 1/20.
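Both the closed form and that jump can be checked numerically. Here's a brute-force sketch in Python (the helper names are my own, and the enumeration is exponential, so it's only for tiny games):

```python
from fractions import Fraction
from itertools import permutations

def shapley(players, value):
    """Exact Shapley values via all orderings (fine for tiny games)."""
    vals = {p: Fraction(0) for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = set()
        for p in order:
            before = value(coalition)
            coalition.add(p)
            vals[p] += Fraction(value(coalition) - before, len(orders))
    return vals

def glove_value(coalition):
    """A coalition is worth the number of matched pairs it can form."""
    lefts = sum(1 for p in coalition if p.startswith("L"))
    rights = sum(1 for p in coalition if p.startswith("R"))
    return min(lefts, rights)

# One left glove and n right gloves: the left glove should get
# n/(n+1) and each right glove 1/(n(n+1)).
for n in range(1, 6):
    v = shapley(["L"] + [f"R{i}" for i in range(n)], glove_value)
    assert v["L"] == Fraction(n, n + 1)
    assert all(v[f"R{i}"] == Fraction(1, n * (n + 1)) for i in range(n))
```

At n = 2 this gives the 2/3 and 1/6 from before; at n = 4 the left glove's value has climbed to 4/5 while each right glove has dropped to 1/20.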

One more time: this isn't a claim that Shapley values are perfect. But the uniqueness claim is so strong. Interpreting the requirements of that uniqueness claim in this application:

  1. The null player assumption requires us to assign the other (non-glove) objects in the room a value of 0.
  2. The efficiency assumption requires the values to add up to 1.
  3. The symmetry assumption requires us to give the same value to each right-handed glove (and similarly for left-handed gloves, if we generalize the problem to add those).
  4. For the linearity assumption, suppose we make a matching pair of gloves today and tomorrow, assigning one unit of value to each pair. The assumption requires us to give the same result whether we consider these as two different value-creation events or as one event spread out over two days. (We can't redistribute value by playing metaphysical tricks in individuating those events.)
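Three of those assumptions are easy to watch in action with a brute-force computation. In this Python sketch I add a hypothetical valueless object, `junk`, to play the null player (the setup and names are mine, not part of the original game):

```python
from fractions import Fraction
from itertools import permutations

def shapley(players, value):
    """Exact Shapley values via all orderings (fine for tiny games)."""
    vals = {p: Fraction(0) for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = set()
        for p in order:
            before = value(coalition)
            coalition.add(p)
            vals[p] += Fraction(value(coalition) - before, len(orders))
    return vals

def glove_value(coalition):
    """A coalition is worth the number of matched pairs it can form.
    Anything that isn't an L- or R-glove contributes nothing."""
    lefts = sum(1 for p in coalition if p.startswith("L"))
    rights = sum(1 for p in coalition if p.startswith("R"))
    return min(lefts, rights)

v = shapley(["L", "R1", "R2", "junk"], glove_value)
assert v["junk"] == 0        # null player: non-glove objects get 0
assert sum(v.values()) == 1  # efficiency: values add up to the pair's worth
assert v["R1"] == v["R2"]    # symmetry: interchangeable gloves, equal values
```

The gloves keep their 2/3 and 1/6 from before; adding a null player changes nothing else.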

Those are very mild assumptions. And if you accept them, the Shapley assignment of values is unique.

A few lessons jump out at me here:

  1. The left-handed glove's Shapley value gets bigger when more right-handed teammates show up.
  2. Team size matters a lot.
  3. Especially if you're doing highly interchangeable labor with those teammates.

Finally, a meta note: This reminds me so much of poker, where highly simplified "toy games" are used as intuition builders. There's a real skill to using these toy games correctly. Most players never take them seriously. A few spend much too much time thinking about them. Finding the right balance here is an underrated cognitive skill. (My former co-host Andrew is one of the best at this.)
