FanPost

A Look at Shot Quality

I wanted to take a crack at shot quality, so I looked at shots recorded by ESPN's GameCast system for the 2009-10 and 2010-11 regular seasons. Only shots on goal and goals were considered (so missed shots were excluded). Penalty shots and empty-net shots were also excluded. There were 74,300 shots analyzed from 2009-10 and 73,774 from 2010-11.
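For concreteness, here is a minimal sketch of that filtering step, assuming the events sit in a pandas DataFrame; the column names (event, is_penalty_shot, is_empty_net) are hypothetical, not what GameCast actually provides.

```python
import pandas as pd

def filter_shots(events: pd.DataFrame) -> pd.DataFrame:
    """Keep shots on goal and goals; drop misses, penalty shots, and empty-net shots."""
    on_net = events["event"].isin(["SHOT", "GOAL"])                 # missed shots excluded
    regular = ~events["is_penalty_shot"] & ~events["is_empty_net"]  # other exclusions noted above
    return events.loc[on_net & regular].copy()
```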

 

In these data, for each shot we know the team taking the shot, the team and the goalie defending it, the strength of the shooting team (Even, Powerplay, or Shorthanded), the type of shot (slap, snap, wrist, wraparound, deflection, backhand, tip-in), and the recorded x- and y-coordinates. As Ken Krzywicki has done (link and link), I mapped all shots to a single offensive zone.
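One common way to do that mapping, sketched below under the assumption that the coordinates are centered at center ice (this is not necessarily the exact transform used here), is to reflect every shot recorded in the far end so that all attempts point at the same net.

```python
import numpy as np

def fold_to_one_zone(x: np.ndarray, y: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Reflect shots taken toward the negative-x net so every shot attacks one zone."""
    flip = x < 0
    return np.where(flip, -x, x), np.where(flip, -y, y)
```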

Adjustments were made for shots taken at Madison Square Garden. I did this by mapping the cumulative shot distribution of the originally recorded x-coordinates at MSG to the cumulative shot distribution of all shots not taken at MSG. (Formally it is a probability integral transform, but nobody knows what I mean when I write that.) I did the same for the y-coordinates.
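A sketch of that correction for one coordinate, assuming arrays of recorded values; it is the empirical version of the probability integral transform mentioned above.

```python
import numpy as np
from scipy.stats import rankdata

def quantile_map(msg_coords: np.ndarray, other_coords: np.ndarray) -> np.ndarray:
    """Map MSG coordinates so their distribution matches the non-MSG distribution."""
    # Empirical CDF value of each MSG coordinate within the MSG sample
    u = rankdata(msg_coords) / (len(msg_coords) + 1)
    # Pull those probabilities back through the non-MSG quantile function
    return np.quantile(other_coords, u)
```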

 

I then followed the same methodology for getting at the average shot quality as I did for the Defense Independent Goalie Rating (DIGR) that I developed previously. Here is the link to that paper:

DIGR Paper

and to a media article with some critiques of that paper:

Article

 

The basic idea is that I fit a nonparametric smooth surface to the ice, with the response being the probability of a goal, like the graph below. (This particular graph is for Martin Brodeur for slap shots taken at even strength during the 2009-10 regular season.) Red indicates a high probability of a goal, while blue indicates a lower probability of a goal.

[Figure: smoothed goal-probability map for Martin Brodeur, even-strength slap shots, 2009-10]
I fit such a model for each type of shot and for each shooting-team strength, so there are 21 total maps/models. In the DIGR paper I made a map for each goalie and then predicted from those maps the expected save percentage if each goalie faced every shot in the league for a given season. Here I created a map for the league (based on all of the shots in a given year) and then predicted the probability of a goal for a given team's set of shots. There is a nice mathematical framework in the DIGR paper that justifies how to do this. The nonparametric part is that our probability model does not assume a particular form (e.g. linear, log-linear, or quadratic) for the relationship between the probability of a goal and the x- and y-coordinates; the data determine the form of the relationship. Having fit each team's shots to our model, I average these predicted values to get an average predicted shot probability, or shot quality against. Call this SQA.
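To illustrate the nonparametric idea with a generic example (this is a simple kernel smoother, not necessarily the estimator used in the DIGR paper), the goal probability at any point on the ice can be estimated as a distance-weighted average of nearby shot outcomes, and team SQA is then the mean of the league surface evaluated at that team's shots.

```python
import numpy as np

def goal_prob_surface(shot_xy: np.ndarray, is_goal: np.ndarray, bandwidth: float = 8.0):
    """Return p(x, y): a kernel-smoothed estimate of the probability of a goal."""
    def predict(query_xy: np.ndarray) -> np.ndarray:
        # Squared distance from each query location to every observed shot
        d2 = ((query_xy[:, None, :] - shot_xy[None, :, :]) ** 2).sum(axis=2)
        w = np.exp(-0.5 * d2 / bandwidth ** 2)  # Gaussian kernel weights
        return (w * is_goal).sum(axis=1) / w.sum(axis=1)
    return predict

# SQA for one team (one shot type, one strength) would then be something like:
# sqa = goal_prob_surface(league_xy, league_goals)(team_xy).mean()
```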

 

I calculated the SQA for both the 2009-10 and 2010-11 seasons, with a league-average mapping that was different for each year. The following graph shows these values for all 30 teams for shots taken at even strength. It is clear from this graph that there is a year-to-year relationship in SQA at the team level. The correlation here is 0.75.

 

[Figure: team even-strength SQA, 2009-10 vs. 2010-11]

The correlation, 0.75, is pretty high. Though it might be hard to read, each team's location is plotted with an abbreviation for that team. Minnesota (min) sits at the upper right of this graph, and Chicago (chi) actually improved over last year. Tampa Bay (tam) was also a big mover here. Some of these teams are not surprising: New Jersey (njd) and Boston (bos) on the high end, the Islanders (nyi) and St. Louis (stl) on the lower end. Others are surprising: Dallas (dal) and Calgary (cgy). This correlation does seem high to me, so there are two possible problem areas: a problem with the analysis or a problem with the data.
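The year-to-year comparison itself is simple once each shot carries its league-model predicted goal probability; here is a sketch, with hypothetical column names (defending_team, pred_goal_prob).

```python
import pandas as pd

def team_sqa(shots: pd.DataFrame) -> pd.Series:
    """Average predicted goal probability of the shots each team faced."""
    return shots.groupby("defending_team")["pred_goal_prob"].mean()

def year_to_year_corr(shots_0910: pd.DataFrame, shots_1011: pd.DataFrame) -> float:
    both = pd.concat({"s0910": team_sqa(shots_0910), "s1011": team_sqa(shots_1011)}, axis=1)
    return both["s0910"].corr(both["s1011"])  # Pearson r; about 0.75 here at even strength
```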

 

Let's start with the analysis. I went back and looked at Ken Krzywicki's model from his analysis of SQA for the 2009-10 season. I was able to reconstruct some of the additional variables that Ken used by creating angle and distance values from the x- and y-coordinates. I don't have whether or not a shot was a rebound in my data. This is a weakness, but I was able to fit the regression without the rebound indicator and without the indicator of whether or not the shot came after a giveaway by the opposing team. So my logistic model was not identical to Ken's, but it was of the same basic form and had 4 of the 6 predictors that Ken's model had: distance, absolute value of the angle, strength of the shooting team, and shot type. Having fit that model, I did the same thing as I had done for the nonparametric one: I predicted the probability of a goal for each team's shots based upon a league-average model of shot probability. Below is a graph that shows the SQA(logit) for each team for the two seasons in question. There are some changes in how the teams rank between the SQA and the SQA(logit); that is to be expected, as we have different models. (Personally, I think there is a strong case for the SQA being a superior model to the SQA(logit), and when I get a chance this can be evaluated empirically, e.g. by looking at residual deviance.)
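Before the graph, here is a rough sketch of that kind of logistic fit. The column names are hypothetical, and the net is assumed to sit at x = 89 feet after the zone mapping; this is an illustration of the model form, not Ken's exact specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def fit_sqa_logit(shots: pd.DataFrame):
    """Fit logit(P(goal)) on distance, |angle|, shooting-team strength, and shot type."""
    df = shots.copy()
    dx, dy = 89 - df["x"], df["y"]  # vector from the shot location to the net
    df["distance"] = np.hypot(dx, dy)
    df["abs_angle"] = np.degrees(np.abs(np.arctan2(dy, dx)))
    model = smf.logit("is_goal ~ distance + abs_angle + C(strength) + C(shot_type)", data=df)
    return model.fit()
```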

 

[Figure: team SQA(logit), 2009-10 vs. 2010-11]

 

The correlation between SQA(logit) for the 2009-10 and 2010-11 regular seasons was 0.82. Many of the teams are in roughly the same region of the graph as they were for the SQA.

 

So it is not the model.  It could be the data.

 

Well, ESPN GameCast totals and NHL Play-by-Play totals are different, and the NHL ones are to be taken as the ground truth, no doubt. So some of the shots could be fishy. Previously I had looked at distributions of x- and y-coordinates for the 2009-10 season and found only that the MSG games were anomalous, hence the MSG correction mentioned above. So that was not it. I also looked at counts of shots faced by team. For 2009-10, the teams facing the fewest shots per game in the NHL data were the Blackhawks, Devils and Kings; in the GameCast data it was also the Blackhawks, Devils and Kings. Staying with that same season, the teams facing the most shots were Florida, Edmonton and Anaheim in both the GameCast data and the NHL data. Similar results held for the 2010-11 season. The number of shots faced per team was reasonable. I've previously done other quality-control checks on both of these years' worth of data when I was getting them into the proper format for analysis in the DIGR paper. The data are accurate in the sense that they represent the values that ESPN GameCast presented.

 

So what are some of the other possibilities?

 

One. This was a fluke. The relationship between the shot quality measures for 2009-10 and 2010-11 is anomalous. An outlier.

 

Two. The metrics that we are using for shot quality are flawed. No doubt they are imperfect. (Since they are essentially expected values, they collapse distributions into single numbers. At some point, looking at the shot intensity maps would be a good idea.) There are possible predictors that are missing, such as the speed of the shot, whether or not the goalie was screened, or the game score differential. This analysis looks only at Even, Powerplay and Shorthanded, not at the specifics of, say, 5v4 versus 5v3. That is a limitation.

 

Three. The data are correct in the files but not accurate. We know that there are issues with the humans recording the shots. The NHL produces highly imperfect data. The x- and y-coordinates are no doubt approximate at best.

 

These are three pretty important concerns, and yet the results suggest a moderate to strong correlation, r=0.75, for SQA from one year to the next. That does not suggest a fluke. But this is testable. (Some time in the future I'll get the 08-09 data into the proper format and add it to this analysis.) Certainly both SQA metrics are not without their flaws, but I think they are generally good. AND they come to roughly the same conclusion. Likewise with the data: it is incomplete, and there is measurement error, but it is the best we currently have.

 

In the end, based upon this analysis, I think that shot quality can be impacted by a team. I'll leave its prediction for another day. But is that impact meaningful? Well, we can say that about 56% (r^2) of the variability in 2010-11 even-strength team SQA can be explained by knowing the 2009-10 even-strength SQA. The difference in even-strength SQA between the top team and the bottom team for both years is about 1.5%. Over about 2,000 shots, which is roughly the number of even-strength shots a team faces in a season, the difference from top to bottom is about 30 goals. That's 10 points in the standings. That would seem meaningful.
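A quick check of that back-of-the-envelope arithmetic (the roughly three-goals-per-standings-point conversion is a common rule of thumb, not something estimated in this analysis):

```python
sqa_gap = 0.015          # top-to-bottom spread in even-strength SQA
ev_shots_faced = 2000    # approximate even-strength shots a team faces in a season
goals = sqa_gap * ev_shots_faced   # about 30 goals over a season
points = goals / 3                 # about 10 standings points
print(goals, points)               # 30.0 10.0
```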

 

Postscript:

 

Some other correlations that I found:

Between SQA Even Strength and Raw Save Pct: 2009-10 (r=0.11), 2010-11 (r=0.12).

 

Between SQA Even Strength and SQA Powerplay: 2009-10(r=0.46), 2010-11(r=0.49).

 

Between SQA Even Strength and Total Shots Faced: 2009-10 (r= -0.07), 2010-11 (r= -0.24).

 

Between Total Shots Faced 2009-10 and Total Shots Faced 2010-11 (r=0.35).
