*Nothing drives pageviews like portraits of Victorian-era statisticians*
The 2013 NFL Season
As promised, here is the 2013 season update. To shed more light on my methodology, I will break down the rankings and win-loss records at a team level below, so we can see exactly how right (and wrong) each ranking system was.
As a reminder, I am looking at the rankings after week 4 of the regular season and seeing how they correlate with each team's win-loss record over weeks 5-16. The ranking systems I'm comparing are: the ESPN power rankings, Football Outsiders' DVOA rankings, the Advanced NFL Stats team efficiency model, my betting market rankings, and the Simple Ranking System.
[Table: Weeks 5-16 win-loss records alongside Week 4 rankings]
The team with the best win-loss record from weeks 5-16 was the Carolina Panthers. Who saw that coming? DVOA had the Panthers at a respectable #6, while the Simple Ranking System had them as the #3 team in the league (the Panthers had played only three games at that point: two losses by a combined 6 points, and one win by a margin of 38; SRS loves it when you do that). The rest of the rankings had the Panthers in the middle of the pack.
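The parenthetical about SRS is worth unpacking. As commonly described, the Simple Ranking System sets each team's rating to its average point margin plus the average rating of its opponents, solved by iteration, which is why one 38-point blowout can outweigh two narrow losses. Here's a toy sketch; the four-team schedule and scores are hypothetical, loosely mimicking Carolina's two-close-losses-one-blowout profile, not the real 2013 results.

```python
# Simple Rating System sketch: rating = avg point margin + avg opponent rating,
# solved by fixed-point iteration. Schedule below is invented for illustration.

def srs(games, iters=200):
    """games: list of (team, opponent, point_margin), one entry per team's view."""
    teams = {t for g in games for t in (g[0], g[1])}
    ratings = {t: 0.0 for t in teams}
    played = {t: [g for g in games if g[0] == t] for t in teams}
    for _ in range(iters):
        new = {}
        for t in teams:
            gs = played[t]
            avg_margin = sum(m for _, _, m in gs) / len(gs)
            avg_opp = sum(ratings[o] for _, o, _ in gs) / len(gs)
            new[t] = avg_margin + avg_opp
        # re-center so the league average rating stays at zero
        mean = sum(new.values()) / len(new)
        ratings = {t: r - mean for t, r in new.items()}
    return ratings

# Hypothetical round robin: CAR drops two close ones, then wins a laugher.
games = [
    ("CAR", "SEA", -2), ("SEA", "CAR", 2),
    ("CAR", "BUF", -4), ("BUF", "CAR", 4),
    ("CAR", "NYG", 38), ("NYG", "CAR", -38),
    ("SEA", "BUF", 3),  ("BUF", "SEA", -3),
    ("SEA", "NYG", 7),  ("NYG", "SEA", -7),
    ("BUF", "NYG", 10), ("NYG", "BUF", -10),
]
ratings = srs(games)
# CAR comes out on top despite a 1-2 record in this toy schedule
print(sorted(ratings.items(), key=lambda kv: -kv[1]))
```

The re-centering step pins the league average at zero, so a rating reads directly as points better (or worse) than an average team.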
Each ranking seemed to have its share of hits and misses. The market had the 49ers pegged right, despite their slow start. The Advanced NFL Stats model was the only one to predict the Eagles' strong finish. And nobody anticipated just how awful the Texans were going to be this year.
Combined 2007-2013 Results
But our eyeballs can only get us so far in evaluating which ranking system was best, which is why we have Spearman's rank correlation coefficient. Here are the results, alongside the 2007-2012 results from my previous post. The higher the percentage, the more accurate the ranking.
[Table: Week 4 Ranking Correlation to Future Wins, 2007-2013]
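For readers who want the metric spelled out: Spearman's rho is just the Pearson correlation of the two rank vectors. A minimal self-contained sketch, with made-up numbers rather than the actual 2013 data (note the post correlates rankings, where lower is better, against wins, which flips the sign; the example below uses ratings, where higher is better, so the correlation comes out positive):

```python
# Spearman's rank correlation from scratch: rank both series (averaging ties),
# then take the Pearson correlation of the ranks. Sample data is invented.

def ranks(values):
    """Convert raw values to ranks (1 = highest), averaging ties."""
    order = sorted(range(len(values)), key=lambda i: -values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank for the tied block
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(xs, ys):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = ranks(xs), ranks(ys)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical week-4 ratings vs. wins over weeks 5-16 for six teams.
ratings = [95, 88, 80, 71, 64, 50]
wins = [9, 10, 6, 8, 4, 3]
print(round(spearman(ratings, wins), 3))  # → 0.886
```

In practice `scipy.stats.spearmanr` does the same thing in one call; the hand-rolled version is just to show there's nothing mysterious behind the percentages in the table.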
What surprised me was how low the percentages were this year compared to prior seasons. It has already been pointed out that this year's playoffs were not very random, with the consensus preseason top four teams all reaching the conference championships. But the regular season appears to have been just the opposite: all five rankings had their worst or second-worst year in 2013. I'm at a bit of a loss to explain why, as this season didn't strike me as particularly "shocking." I suspect the unusually poor performance of the usually playoff-bound Falcons and Texans had something to do with it.
The market rankings continue to be the most accurate of the bunch, although a 37% correlation is nothing to brag about. SRS had a rather poor season this year, although it is still number two, by a slim margin, in the 2007-2013 average.
Mid-season Ranking Accuracy
I can also apply this same methodology to the week 8 rankings to see whether any models improved (or regressed) with an additional four weeks of data. The challenge is that the more weeks I give the rankings to figure things out, the fewer weeks I have left to test their accuracy. The later into the season I go, the noisier my measurements of future accuracy become, due both to small-sample volatility and to strength-of-schedule imbalances.
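The small-sample half of that point fits in a back-of-envelope calculation: if we model each game for a true .500 team as an independent coin flip, the standard deviation of its observed win percentage over n games is sqrt(p(1-p)/n), so shrinking the evaluation window fattens the noise.

```python
# Back-of-envelope: noise in a team's observed win percentage over n games,
# treating each game as an independent coin flip for a true .500 team.
p = 0.5
for n in (12, 8, 4):
    sd = (p * (1 - p) / n) ** 0.5
    print(f"win% std over {n:2d} games: {sd:.3f}")
```

Rank correlations inherit this noise: over a short enough window, even a perfect rating would correlate imperfectly with realized wins.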
Here are the 2013 results, alongside the previously published 2007-2012 data.
[Table: Week 8 Ranking Correlation to Future Wins, 2007-2013]
You'll notice that ESPN's correlation actually got worse with more data, while the rest of the rankings improved. This is roughly consistent with prior seasons, a phenomenon I pointed out in my previous post on this topic. ESPN's mid-season regression illustrates the folly of ranking teams by win-loss record, which is essentially all the ESPN power ranking is, with the occasional deviation thrown in to make it look as if they're trying (not to single out ESPN; nearly every major sports site's power rankings take the same approach). Bill Parcells may disagree, but a team is more than its win-loss record.