Sunday, January 31, 2010

Who's Better, Who's Best - Manning, Favre and Brees

Last Sunday we were all treated to some good football.  The AFC Championship featured the surprising NY Jets against the Peyton Manning-led Indianapolis Colts, and the NFC Championship showcased the Drew Brees-led New Orleans Saints, who played host to Brett Favre and his Minnesota Vikings.


Not only are these outstanding football teams, but three of the four featured three of the game's best quarterbacks in Manning, Favre and Brees.  I thought it would be illustrative to take a look at each of the three quarterbacks individually, and then compare them side by side to get a sense of who may be better, and perhaps even who may be best.  In evaluating their relative merits, I will only consider their ability to throw the football, not their ability to win games, how well they run, whether they are clutch performers, or how many Super Bowl victories they have.


Here are the career NFL Passer Ratings for these quarterbacks:


Manning is clearly the best.  Well, at least that's what looking at the career NFL Passer Rating will tell you.  These figures differ from what you would find at the Pro Football Hall of Fame (as of January 23rd, 2010); the figures listed on the Pro Football Hall of Fame site were as of the beginning of the 2009 season, while mine run through the end of the just-completed 2009 season.

For reasons that I have chronicled in the past (see here, here, and here), the NFL Passer Rating formula in and of itself is of little use in determining who among the three might be best.  The three quarterbacks very well might end up being ranked this way when all is said and done, but that doesn't justify the rating system.

In order to complete our analysis, we have to make several adjustments to the NFL's passer rating system.  

The first is that we need to take each player's passer rating and relate it to some baseline, so that we may account for players playing in different eras (yes, I know, it's a little odd to talk about different eras when comparing three current quarterbacks, but Brett Favre has been throwing since 1992, while Drew Brees has only been throwing since 2002).  And since the average NFL passer rating has been increasing over time (see here), we need to make an adjustment for that.  The method that I have chosen is to relate a given quarterback's performance to the mean for that year, using the standard deviation in that year as the measuring stick.  In other words, take the difference between the quarterback's actual performance and the mean, and divide it by the standard deviation.  As I described here, we then convert each of these resulting measures, also called z-values, into percentile ranks using the standard normal distribution.
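For concreteness, here is a minimal sketch of that conversion in Python (the function name and the numbers in the example are mine, purely illustrative):

```python
from statistics import NormalDist

def percentile_rank(rating, league_mean, league_sd):
    """Convert a single-season rating into a percentile rank.

    The z-value is how far the rating sits from that year's league mean,
    measured in that year's standard deviations; the standard normal CDF
    then maps it onto a 0-100 scale.
    """
    z = (rating - league_mean) / league_sd
    return 100 * NormalDist().cdf(z)

# Example: a 95.0 rating in a year where the league averaged 83.0 with a
# standard deviation of 10.0 sits at roughly the 88th percentile.
print(round(percentile_rank(95.0, 83.0, 10.0)))
```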



The second is that we need to account for the fact that they've each played a different number of seasons.  While I used a rather simplistic approach last year, I will present a more sophisticated approach starting with this post.  This method will need further refinement over time, since I am early in my thought process on this particular topic, but it is an improvement over last year's simplistic approach.  Actually, the method used last year (average deviation) doesn't take the number of years into account at all.  When I evaluated Brett Favre's career, I simply compared him to a group of quarterbacks who had played a like number of years, and ignored any comparisons to players who had played far fewer seasons.  That method won't work in this case, as we need to compare three quarterbacks with vastly different career lengths.  All else being equal (and it rarely, if ever, is), a player with a 10-year career should get a higher rating than a player with a 7-year career (even if, and especially if, they both had the exact same ratings in every year they played).  But the question becomes: how much higher?

The third is that we need to somehow account for the fact that each of the three quarterbacks has played not only a different number of seasons, but that those seasons represent different parts of their careers.  In some respects, since the last 8 seasons have been common to all three, we could simply look at the last 8 seasons.  But that would mean comparing Brett Favre in his 11th through 18th seasons, Peyton Manning in his 5th through 12th seasons, and Drew Brees in his first 8 seasons.  As I have shown in the past, the early years for a quarterback tend to be the "learning years", and hence this comparison would be inappropriate without the proper adjustments.  Brett Favre is also 40 years old, and we have seen performance decline as quarterbacks age (actually, performance improves at first as they gain experience, and then declines as the physical toll takes effect).  One method is to look at each player's first 8 years, since in that case the comparisons would be over identical portions of their careers.  And perhaps we could compare Favre and Manning over their first 12 years.


Last, we need not assume that the NFL Passer Rating is the measure to be used to evaluate who might be the better passer.  The NFL Passer Rating system has a lot of flaws.  I will use two measurement methods.  The first is the concept of the Adjusted Passer Rating.  As I have previously mentioned, the average NFL Passer Rating has been steadily increasing every year.  The Adjusted Passer Rating normalizes the passer rating so that the average for each year is 66.7.  This is very helpful in evaluating players who played in different years (as is the case here).  Last year, I introduced a simple measure of passing efficiency - CMI.  I now call this CMTI (or, simplistically, Completions Minus Thrice Interceptions).  It is not only simple, but it has an 86% correlation with the NFL Passer Rating.  And because it explains so much of the variance in the NFL Passer Rating, and because it is so easy to compute, use and explain, I will use it here.
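To make the measure concrete, here is a small sketch that computes CMTI from a stat line and checks its correlation with the official rating across a handful of qualifiers (the stat lines below are made up for illustration, not actual season data):

```python
import numpy as np

def cmti(completions, attempts, interceptions):
    """Completions Minus Thrice Interceptions: completion percentage
    minus three times interception percentage."""
    return 100 * (completions - 3 * interceptions) / attempts

# Made-up qualifier lines: (completions, attempts, interceptions, NFL rating)
qualifiers = [
    (363, 541, 11, 109.6),
    (371, 544, 16, 99.9),
    (293, 531, 7, 107.2),
    (261, 509, 22, 68.4),
]

cmti_values = [cmti(c, a, i) for c, a, i, _ in qualifiers]
ratings = [r for _, _, _, r in qualifiers]

# Pearson correlation between CMTI and the official NFL Passer Rating
print(np.corrcoef(cmti_values, ratings)[0, 1])
```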


So, to recap, we will make the following adjustments:


  1. Relate performance in a given year to the mean and standard deviation for that year, then convert to a percentile rank
  2. Use a 'better' measure than NFL Passer Rating.  We will show that Adjusted NFL Passer Rating is 'better', and CMTI is simpler
  3. Take into account differing number of seasons played by each quarterback
  4. Take into account quarterbacks who are at different stages of their career


Here are the respective #s for each quarterback by season:


Brett Favre




Drew Brees




Peyton Manning




What do these performance measures say?  Well, there's a lot of information here, so let's take these one at a time.


The first is that the career averages shown (actually, all of the averages shown) are simple arithmetic averages across the years, not true weighted averages.  You can see that these simple averages are close enough to the true averages by looking at the career average NFL passer ratings shown in these three exhibits and comparing them to the career averages for each of the quarterbacks shown earlier.


The second is that regardless of what measure we use, whether it is the NFL Passer Rating, Adjusted NFL Passer Rating, CMTI, first 8 years, last 12 years or what not, it should be very clear that Peyton Manning has performed at a truly superior level compared to the other 2 players (or anyone else for that matter).  Just look at CMTI over the past 8 years, for example.  He has averaged being in the 90th percentile over the past 8 years.  Basically, that suggests that he is, on average, in the Top 3 every year.  Truly remarkable.  In Peyton Manning, we are watching one of the all-time best.  A once-in-a-generation quarterback.  In case you didn't notice, the rows highlighted in green indicate a performance that is in the 85th percentile or higher.  The areas in blue are 95th percentile or higher.


Ok,  Manning is best.  That was easy.  Now comes the difficult part.  Who's better, Favre or Brees?


The case for Favre:


On an Adjusted Passer Rating basis, Brett Favre's first 8 years are comparable to, if not better than, Drew Brees' first 8 years (70th percentile versus 68th percentile).  We can't directly compare their two careers, since one has had an eighteen-year career while the other has had only eight.  It is very evident that, but for the last year, Favre's performance has been quite pedestrian over the past 10 years.  Well, we can't necessarily penalize him for that, because we don't know how Brees might do in his next 10 years.  We don't even know whether he is likely to play another 10 years.  Given history, however, it is fairly safe to assume that he won't.  In my database, going back to 1932, and including both the AAFC in the 1940's and the AFL in the 1960's, I have a total of 416 quarterbacks who qualified in at least 1 year.  206 of these quarterbacks qualified in only 1 or 2 years.  The average is 4.125 years.  There are only 6 quarterbacks who qualified in 15 or more years.  Y.A. Tittle qualified in fifteen different seasons.  John Elway, Dan Marino and Johnny Unitas each qualified in sixteen different years.  And then there are the ultimate marathon men, Fran Tarkenton and Brett Favre, who both qualified on 18 different occasions.


Michael J. Schell, in his book Baseball's All-Time Best Hitters, makes an adjustment to player batting averages, called a late-career adjustment.  In essence, he recognizes that a player's performance declines as he ages, and so he makes an adjustment for it.  Schell basically caps the at-bats of a player at 8,000.  All at-bats beyond 8,000 are ignored.  This way, he can compare each player through their first 8,000 at-bats.  Doing that type of adjustment would be extremely difficult in the case of football.  We know that the average number of passes attempted by a player per game has increased over time (see here and here), so limiting to a fixed number of attempts would be difficult and meaningless at best.


We could limit the player performances to a given number of seasons.  Look at the three graphs below.




The first is the average percentile rank of CMTI for all 416 quarterbacks by age.


And the second is the same as the first, except with the x-axis denoting the number of seasons in which they qualified instead of their age.


It appears that there is a learning curve at first (and an elimination process, as presumably the worst players end up having shorter careers), followed by a decline after about 12 years.  Keep in mind that this first graph shows the career percentile rank.  What we're really interested in is whether performance declines as a player ages.  So, what we really want to see is the average percentile rank for a given year of a player's career, no matter how long they have played.
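As a sketch of that calculation, assuming a table with one row per quarterback-season (the column names and values here are mine, not real data):

```python
import pandas as pd

# Hypothetical layout: one row per quarterback-season, with the season number
# within that player's career and his percentile rank for that season.
seasons = pd.DataFrame({
    "player":        ["A", "A", "A", "B", "B", "C"],
    "career_season": [1, 2, 3, 1, 2, 1],
    "pct_rank":      [40.0, 55.0, 60.0, 35.0, 50.0, 70.0],
})

# Average percentile rank in career season n, across every quarterback who
# reached season n, regardless of how long his career eventually lasted.
print(seasons.groupby("career_season")["pct_rank"].mean())
```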


The third graph shows exactly that.  It's actually a bit surprising.  There doesn't appear to be a significant drop-off.  As a matter of fact, out in the 15th-18th years, their performance seems to have gotten better!  Ok, so in the outer years we're not talking about a lot of players (for years 17 and 18 it's just 2 players - Tarkenton and Favre), so anomalies should be expected.  So what do we do?


I mentioned earlier that the average number of years in which a quarterback qualifies is 4.125 for all 416 quarterbacks.  The table below shows the data in more detail.




There are a total of 77 quarterbacks who have played (qualified) in 8 or more years.  That would be 77/416 = 18.5% of the eligible list.  The average number of years for this group is 10.8, and the median is 11.  It would seem reasonable, then, to look at a minimum of 8 years as a cut-off.  But how do we recognize players who have played a lot longer?  The method that I have chosen is to look at a given player's best 8 years.  This means several things.  First, the obvious - it eliminates from consideration any player who has qualified in fewer than 8 years.  Second, it rewards longevity: playing additional seasons can't hurt a player, and the less a player has played, the greater the "penalty".  Third, given the small sample of players playing beyond 14 years, it is quite possible that one or more of them had good years in those later years, so why exclude those years?
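Here is a minimal sketch of that best-8-years calculation (the percentile ranks in the example are made up):

```python
def best_n_average(pct_ranks, n=8):
    """Average of a player's n best seasonal percentile ranks.

    Players with fewer than n qualifying seasons are excluded, which is
    exactly the cut-off described above.
    """
    if len(pct_ranks) < n:
        return None
    return sum(sorted(pct_ranks, reverse=True)[:n]) / n

# Made-up percentile ranks for a long career versus a short one
print(best_n_average([85, 60, 90, 88, 92, 70, 55, 50, 45, 62, 58, 40]))
print(best_n_average([68, 72, 80]))   # None: fewer than 8 qualifying years
```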


So, going back to the tables where I showed each of the three quarterbacks and their individual seasons, you can see that Favre's best 8 years are better than Brees' 8 years.  This gives credit to Favre's lengthy career.  Remember, less than 20% of the qualified quarterbacks even get to 8 years total, let alone be fortunate enough to have 8 good years.  Perhaps over time, Brees might very well exceed Favre.  And, if and when he does, we'll give him credit for it at that time.


The case for Brees:


Although comparable to Favre in his first 8 years based on Adjusted Passer Rating, using CMTI gives Brees the nod.  Also notice that Brees has had four years in which he was better than the 85th percentile, while Favre has had only one.  Brees' last four years in New Orleans have been especially strong.  In each of years 5-8, Brees outperforms Favre, while in the first four years, Favre outdoes Brees in three of the four.  Favre, it appears, built his legend largely on the success of his first year, and perhaps even starting with his first game.  Check out the following chart.




It shows the average percentile rank, for each year played, for all quarterbacks who qualified in 8 or more years.  Actually, it's not quite the number of years they played so much as the number of years in which they threw enough passes to qualify for a rating.  That's the blue line, and it includes data from the previously mentioned group of 77 quarterbacks.  The blue line in this graph differs from the line shown in the third graph earlier, in that it only includes data for those quarterbacks qualifying in 8 or more years.  The red line is the data for Drew Brees.  The green line is the data for Favre.  Clearly, outside of his first year and his 3 MVP years (years 3-5 for him), Favre hugged the average for this group of quarterbacks.  Some years he did better, and some he did worse.


We're almost done.  One more graph.




This one puts a slight twist on the notion of the best years.  Instead of just looking at the best 8 years, why don't we look at the best n years, for every number of years n that the quarterback has played?  This way, we can look at the best year for each quarterback, the best two years, the best three years, and so on.  For good measure, I also threw in Manning's #s.
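As a sketch, the same idea generalized: the average of the best n seasonal percentile ranks, for every n up to the length of the career (the input is again made up):

```python
def best_n_curve(pct_ranks):
    """Average of the best n seasonal percentile ranks, for n = 1..career length."""
    ranked = sorted(pct_ranks, reverse=True)
    curve, total = [], 0.0
    for n, value in enumerate(ranked, start=1):
        total += value
        curve.append(round(total / n, 1))
    return curve

# Illustrative eight-year career
print(best_n_curve([68, 75, 88, 91, 86, 79, 83, 90]))
```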


So what do we conclude from all of this?


Again, if there was any doubt before, this should put it to rest.  Manning is clearly the best.


Between Favre and Brees, an argument could be made for either.  In terms of a total career, Favre gets the nod, simply because of the number of years he's played.  Brees has put up some impressive numbers so far.  If he continues this for a few more years, he will most likely finish ahead of Favre, but that remains to be seen.  Based on the evidence so far, however, he is doing at least as well as, if not better than, Favre did in his first 8 years.  Perhaps we'll check in again in a few years when Brees retires.  And, speaking of retirement, in my next post I'll take a look at Kurt Warner's career.



Sunday, January 17, 2010

The Adjusted NFL Passer Rating Revisited

This posting is an update and an enhancement to a previous posting that discussed the concept of the Adjusted NFL Passer Rating.  Back in 1971, when Don Smith, an executive with the Pro Football Hall of Fame, and his team developed the current NFL Passer Rating formula, he suggested that the intent of the formula was such that an 'average' Passer Rating would be equal to 66.7.  In essence, he decided that an average performance for each of the four components that make up the passer rating system should be equal to 1.0, with an exceptional performance defined as a score greater than 2.0, and a poor performance getting a 0.  Since the formula adds the scores for the four components, divides by 6 and multiplies by 100, an average performance works out to 4/6 x 100 = 66.7.

Since we have seen that the average NFL passer rating has increased over time (see my previous post on the topic here), I was curious what the 2009 average looked like and, more importantly, what the actual NFL passer ratings would have looked like if we adjusted each of the components such that the averages for 2009 were in fact 1.0.

Recall that in the formula, based on data from the 1960's, adjustments are made to each of the four components such that the average for each component works out to be 1.0.  The adjustments, call them k1, k2, k3 and k4, are:

  • k1 = 30
  • k2 = 3
  • k3 = 1
  • k4 = 1

Again, these factors were arrived at by looking at data from the 1960's.  And these same factors are applied to the formula regardless of when a quarterback actually played.  This has the benefit, of course, of creating a single formula that measures all quarterbacks against a fixed standard.  The problem, of course, arises when the overall standard changes over time, and that is exactly what has happened over the past 80 years or so.

Well, we don't have to use data from the 1960's.  We will use each year as a stand-alone year.  So, for 2009, the factors that make each component equal to 1.0 for an average performance turn out to be:

  • 2009k1 = 40.90
  • 2009k2 = 2.98
  • 2009k3 = 1.20
  • 2009k4 = 1.78
Likewise, the factors for 2008 were:

  • 2008k1 = 41.00
  • 2008k2 = 2.94
  • 2008k3 = 1.28
  • 2008k4 = 1.95
And, for good measure, the actual factors for the NFL for 1969 were:
  • 1969k1 = 32.64
  • 1969k2 = 2.99
  • 1969k3 = 0.99
  • 1969k4 = 1.12
These 1969 factors are very close to the k1, k2, k3 and k4 values used in the formula.
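To make the mechanics concrete, here is a sketch of the rating calculation with the k factors pulled out as parameters.  How k3 and k4 scale the touchdown and interception components is my own reading, chosen so that plugging in the 1971 constants (30, 3, 1, 1) reproduces the official formula; treat it as an illustration rather than the exact implementation behind the tables below.

```python
def passer_rating(cmp_pct, yds_per_att, td_pct, int_pct,
                  k1=30.0, k2=3.0, k3=1.0, k4=1.0):
    """NFL passer rating with the per-year constants exposed as parameters.

    With k1=30, k2=3, k3=1, k4=1 this is the official 1971 formula.  How the
    year-specific k3 and k4 enter the touchdown and interception components
    is an assumption on my part.
    """
    clamp = lambda x: max(0.0, min(2.375, x))   # official per-component cap
    a = clamp((cmp_pct - k1) * 0.05)            # completion percentage
    b = clamp((yds_per_att - k2) * 0.25)        # yards per attempt
    c = clamp(td_pct * 0.2 * k3)                # touchdown percentage
    d = clamp(2.375 - int_pct * 0.25 * k4)      # interception percentage
    return (a + b + c + d) * 100 / 6

# The same stat line rated with the 1971 constants and with the 2009 constants
line = dict(cmp_pct=65.0, yds_per_att=7.5, td_pct=5.0, int_pct=2.5)
print(passer_rating(**line))                                        # about 93.8
print(passer_rating(**line, k1=40.90, k2=2.98, k3=1.20, k4=1.78))   # about 80.0
```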

The table below shows the 2009 Adjusted Passer Ratings for the 32 quarterbacks that threw enough passes this past season to qualify.  The table also shows their actual 2009 NFL Passer Ratings.  You can see that the Adjusted Passer Ratings are clearly lower (as a group) than the actual NFL Passer Ratings.  That's because the actual NFL average passer rating in 2009 was 83.0, and, obviously, the average Adjusted Passer Rating for the NFL for 2009 was 66.7 (because we forced it to be so by adjusting the formula such that the average value for each component is equal to 1).



Although the Adjusted Passer Ratings themselves are a lot lower as a group than the actual NFL Passer Ratings, you'll notice that the rankings for each are essentially the same.  Why is that?  Very simply, since both calculations use the same essential formula with only the constants changed, all this really does is 'normalize' the actual ratings around a value of 66.7 as opposed to 83.0.  And yes, since the relative values of the four components now change, there should be some movement in the rankings.  Interestingly, I also included my CMTI calculations (and their respective ranks), and CMTI also correlates highly with both the Adjusted Passer Rating rankings and the NFL Passer Rating rankings.

Just so that you can see for yourself, I've included the table (shown below) that shows the average NFL passer rating by year from 1932 to 2009 using the current NFL passer rating formula (where k1 = 30, k2 = 3 and k3=k4=1 for all years), as well as the adjusted k1-k4 factors for each year.  There is no need to show the Adjusted Passer Rating for the NFL in each year, since it is the same in every year - 66.7.  If you look at the bottom of the table, for years 1960-1969, you'll see where Don Smith derived his k values from.



Looking at this table, you should be able to understand why the current NFL passer rating system is of little to no use when you need to compare quarterbacks from one era to another.  It can truly only be used to compare one passer against another in any given year, not across years.  It stands to reason, then, that the current NFL passer rating formula isn't much use when evaluating a quarterback's career, especially one who has had a long career (you can see the career NFL passer rating leaders here).  In future posts, I will discuss this in more detail.

Sunday, January 10, 2010

2009 NFL Season - Passer Ratings

We now have data for the entire 2009 season, and I can calculate the best and worst performers for the season using my standard measure, CMI - or, more appropriately, standard deviations from mean CMI.  CMI is easily calculated, as it is simply [pass completion percentage] - 3 * [interception percentage].  I can calculate that at any point during the season.  In order to calculate the standard deviations from the mean CMI, I like to wait until the season is over (actually, even this can be calculated at any point during the season, but it's a lot simpler for me to just do it at the end of the season - perhaps next year I can do this after every week of the regular NFL season).  For those of you familiar with statistics, the number of standard deviations from the mean is also called a z-score (or z-value).  And, furthermore, a z-score can be very easily converted to a percentile rank (assuming that the population is normally distributed; in a later post, I will show that z-scores for CMTI and the NFL passer rating are normally distributed over the years).  The beautiful thing about a percentile rank is that it is always a number between 0 and 100.  What's not to like about that!
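As a sketch of that end-to-end calculation for a single season (the stat lines are hypothetical, not actual 2009 numbers):

```python
from statistics import NormalDist, mean, pstdev

def cmi(completions, attempts, interceptions):
    """CMI: completion percentage minus three times interception percentage."""
    return 100 * (completions - 3 * interceptions) / attempts

def season_percentile_ranks(stat_lines):
    """Z-score each qualifier's CMI against that season's mean and standard
    deviation, then map the z-scores to percentile ranks via the normal CDF."""
    values = [cmi(c, a, i) for c, a, i in stat_lines]
    mu, sigma = mean(values), pstdev(values)
    return [round(100 * NormalDist().cdf((v - mu) / sigma), 1) for v in values]

# Hypothetical qualifier lines: (completions, attempts, interceptions)
print(season_percentile_ranks([(363, 541, 11), (293, 531, 7), (261, 509, 22)]))
```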

A couple of quick notes here regarding CMI and many of the calculations that I will be using from here on out.  While I will go through the exercise of creating arduous, complex and sometimes arbitrary (and even unnecessary) calculations, I will strive for simplicity whenever possible when I present my ideas, as I believe they are more likely to be accepted that way.  The use of percentile ranks is an example of this.

Last year, when I first developed CMI, and especially as I looked back in time, I had to figure out how many players to use each year to determine the number of qualifiers.  I went through some elaborate mechanisms to calculate the number of passes attempted during the season (per team per game), and some qualifying standard.  As you are aware, the NFL uses 14 passes per game as the standard.  And yes, while it is simple to use, I am not a big fan of that standard since the average # of passes attempted per game has changed (increased) over time, as you can see from the table below.



The standard today should be closer to 16.

Last year, I arbitrarily assigned a factor close to 50% (with adjustments for the earlier years) to the total # of passes attempted by each team per game to determine the # of qualifiers.  I realized this year that if I simply use the # of teams in the league as the cutoff for determining the # of qualifiers, I get fairly close to my original method.  And because it is simple, is easy to explain, and does a better job (in my opinion, anyway) than the current NFL standard, I will use that as the cutoff.  The table below shows the # of qualifiers by season using the NFL standard, my original standard, and my current standard.




As you can see, the # of qualifiers, regardless of which method you select, is fairly close in each year, with the exceptions being the earliest years, where the NFL method simply excludes too many passers (in my opinion, anyway).  Also note that the AAFC is excluded by the NFL altogether, while the NFL combines the AFL and NFL for the years 1960-1969.  This doesn't make sense to me, since each league played a different # of games (and had a different # of teams).  Regardless, if you combine the AFL and NFL results from my method, you get close to the #s used by the NFL.  While I think that my method works, I also recognize that this standard may not be the most appropriate for future NFL seasons.
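As a sketch of that cutoff, assuming you have season attempt totals per passer (the names and numbers are invented):

```python
def qualifiers_by_team_count(attempts_by_passer, n_teams):
    """Take the top n passers by attempts, where n is the number of teams in
    the league that season, instead of the NFL's 14-attempts-per-game rule."""
    ranked = sorted(attempts_by_passer.items(), key=lambda kv: kv[1], reverse=True)
    return [passer for passer, _ in ranked[:n_teams]]

# Invented 4-team league
attempts = {"QB A": 541, "QB B": 509, "QB C": 233, "QB D": 531, "QB E": 58}
print(qualifiers_by_team_count(attempts, n_teams=4))
```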

OK, back to the passer ratings.

Because I will discuss other passer rating systems and calculations related to CMI in later posts, I will take this moment to rename CMI as CMTI (the acronym stands for Completions Minus Three times Interceptions).  The definition itself hasn't changed - just the acronym.

The table below shows each of the qualifying quarterbacks with their 2009 season statistics, their 2009 NFL passer rating, the percentile rank of that rating (in other words, the converted z-score), their 2009 rank in terms of NFL passer rating, their 2008 NFL passer rating along with its percentile rank and rank, their 2009 CMTI and its percentile rank, their 2008 CMTI along with its percentile rank and ranking, and finally, their average 2-year CMTI percentile rank (again, a # between 0 and 100), together with that ranking.

I don't think there's any question as to who the elite quarterbacks are.  Also notice the struggles of first-year quarterbacks in each year.


Happy New Year!

Hi Everyone.

Happy New Year to you all.

It's been a while since I've posted.  The reasons are many, so I shall not go into them.  Regardless, I am back posting.

I have been able to do some additional research in the meantime.  I have added data from the 1932-1939 NFL seasons to my database.  I have also added the 1946-1949 AAFC seasons as well as the 1960-1969 AFL seasons.  And, of course, I have updated the database for the just-concluded 2009 NFL season.

I think that adding the data from the AAFC in the late 1940's as well as the AFL during the 1960's makes for a much richer database.  There are those however, who do not believe that including that data makes sense.  I disagree.

Having spent some time thinking about my pursuit here (creating a new passer rating system for the NFL), I've created new statistical measures in addition to those I created last year that I will discuss in upcoming posts.  I've also improved upon my original concept of CMI.

Anyhow, I'm glad to be back posting and look forward to a terrific 2010.

Cheers,
Kiran