We discussed "Defense Independent Goalie Rating" (DIGR) for a little while last week, and I wanted to revisit it briefly. DIGR falls in the general realm of "shot quality", which means there's a tendency for people to accept that it's meaningful without going through the process of determining how much talent is involved.
People calculate shot quality differently, but the simplest way to assess its existence is to look at shot distance. (I'm going to restrict myself to even-strength shots to put everyone on a level playing field.) Yes, shot angle, shot type, odd-man rushes, rebounds, etc. all impact the likelihood that a shot goes in, but fundamentally, wouldn't a poorly defending team, one that allows higher-quality scoring chances on average, let opposing shooters get closer to the net?
So I took the average distance of the shots each team allowed on the road and divided the data into 1-, 3- and 6-year periods. This table shows the standard deviation of average shot distance across teams and the slope of the line of best fit between the average distances for even- and odd-numbered shots (a split-half reliability estimate):
| Period | Stdev | M | Talent (Stdev × M) |
|---|---|---|---|
| 1 year | 1.141 | 0.748 | 0.854 |
| 3 years | 0.875 | 0.838 | 0.733 |
| 6 years | 0.802 | 0.864 | 0.693 |
Multiply the first two columns together and we get the talent portion of the standard deviation of distance for shots allowed – which appears to be converging to approximately 0.7 feet. Converting that to save percentage, we get a standard deviation of 0.2%. The observed standard deviation of save percentage is 0.86%. Long-time readers know what that means: shot distance – and by extension, long-run shot quality allowed – accounts for roughly (0.2/0.86)² ≈ 5% of the variance in save percentage. Worst-case, I suppose it could be 10%, but it's very small.
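The split-half procedure above can be sketched as follows. Everything here is simulated stand-in data – the team names, sample sizes, and distance distribution are assumptions for illustration, not the real play-by-play files:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated stand-in for real play-by-play data: each of 30 teams allows
# 1,200 even-strength road shots, with a true-talent spread of ~0.7 feet
# in average distance and ~12 feet of shot-to-shot noise.
shot_logs = {f"team{i:02d}": rng.normal(35 + rng.normal(0, 0.7), 12, size=1200)
             for i in range(30)}

# Average distance for even- and odd-numbered shots, per team.
even_means = np.array([d[0::2].mean() for d in shot_logs.values()])
odd_means = np.array([d[1::2].mean() for d in shot_logs.values()])

# Observed spread of team average shot distance (the "Stdev" column).
stdev = np.array([d.mean() for d in shot_logs.values()]).std(ddof=1)

# Slope of the best-fit line between the two halves (the "M" column):
# noise is uncorrelated between halves, so the slope is pulled toward
# zero by exactly the non-talent share of the variance.
m = np.polyfit(even_means, odd_means, 1)[0]

# Talent portion of the spread (the "Talent" column).
talent_sd = stdev * m
print(f"stdev={stdev:.3f}  M={m:.3f}  talent SD={talent_sd:.3f} feet")
```

The slope does the same job as a regression-to-the-mean correction: the noisier the halves, the flatter the line, and the smaller the talent share of the observed spread.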
The other thing you always want to do with a talent is identify who's good at it. What teams do you think gave up shots closest to and furthest from the net on average over the last six seasons? We can't use home arena data because it's swamped by recording bias (e.g. MSG always has the closest shots on goal), and we can't even use raw road data because teams that play at MSG have their shot distances distorted. So I took every shot taken in every arena and normalized it so that the average shot distance in each rink was equal to the league-wide shot distance. Here are the best and worst:
| Team (closest) | Year | Dist (ft) | Team (furthest) | Year | Dist (ft) |
|---|---|---|---|---|---|
| nj | 2007 | 31.56 | nyr | 2009 | 38.09 |
| nyi | 2007 | 32.64 | ott | 2009 | 37.82 |
| tor | 2009 | 32.67 | fla | 2011 | 37.58 |
| nyi | 2009 | 32.73 | ott | 2006 | 37.45 |
| pho | 2009 | 32.76 | ana | 2006 | 37.31 |
| min | 2011 | 32.79 | nyr | 2006 | 37.17 |
| pho | 2010 | 32.97 | fla | 2006 | 37.08 |
| det | 2009 | 33.00 | tor | 2011 | 37.00 |
| det | 2011 | 33.04 | buf | 2008 | 37.00 |
| edm | 2007 | 33.21 | tb | 2011 | 36.91 |
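A minimal sketch of that rink normalization, assuming an additive shift per arena (the post doesn't specify shifting versus rescaling, and the tiny shot table below is made up for illustration):

```python
import pandas as pd

# Made-up shot table for illustration: one row per shot, with the arena
# where the scorer recorded it and the defending team.
shots = pd.DataFrame({
    "arena":    ["MSG", "MSG", "MSG", "ACC", "ACC", "ACC"],
    "def_team": ["bos", "nj",  "nyr", "bos", "tor", "tor"],
    "distance": [28.0,  30.0,  26.0,  38.0,  40.0,  36.0],
})

league_mean = shots["distance"].mean()

# Shift each arena so its average recorded distance equals the league-wide
# average, washing out rink-to-rink recording bias before comparing teams.
shots["adj_distance"] = (shots["distance"]
                         - shots.groupby("arena")["distance"].transform("mean")
                         + league_mean)

# Average adjusted shot distance allowed, per defending team.
team_avgs = shots.groupby("def_team")["adj_distance"].mean()
print(team_avgs)
```

After the shift, every rink's adjusted mean equals the league mean, so whatever spread remains in a team's average is the part that travels with the team rather than with the home scorer.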
If there's a pattern here, I don't see it. And of course, if you use a different normalization method, you get different results. So a big chunk of our 5-10% of save percentage due to shot distance is scorer bias – the true talent portion is not nothing, but it doesn't appear to be a major driver of save percentage overall. I'd like to see someone run the numbers and figure out whether DIGR-adjusted save percentage is a better predictor of future save percentage than even-strength save percentage on its own.
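The test proposed in the last sentence could be set up as a year-over-year correlation comparison. Everything below (goalie counts, talent and noise levels) is simulated placeholder data, not a real result:

```python
import numpy as np

def year_over_year_r(year1: np.ndarray, year2: np.ndarray) -> float:
    """Correlation of a goalie metric across consecutive seasons.
    Whichever metric has the higher r is the better self-predictor."""
    return float(np.corrcoef(year1, year2)[0, 1])

# Simulated placeholder: 60 goalies with a true even-strength talent
# level, observed with sampling noise in each of two seasons.
rng = np.random.default_rng(1)
talent = rng.normal(0.920, 0.004, 60)
raw_y1 = talent + rng.normal(0, 0.008, 60)   # raw ES save pct, year 1
raw_y2 = talent + rng.normal(0, 0.008, 60)   # raw ES save pct, year 2

r_raw = year_over_year_r(raw_y1, raw_y2)
print(f"raw ES sv% year-over-year r = {r_raw:.3f}")

# With real data you'd compute the same number for DIGR-adjusted save
# percentage: if the adjustment removes noise rather than adding it,
# its year-over-year r should come out higher than the raw metric's.
```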