I looked at this question earlier this week. The stat I used to measure pitcher performance was RSAA, which tells us how many runs a pitcher saved above average (it is also park adjusted). But it is affected by the quality of the fielders. Now I will look at the issue by trying to create a defense independent stat to rate the pitchers, following the idea of Voros McCracken, who found that pitchers have little influence over what happens on balls in play. The measure I use won't be as sophisticated as his, but it will give us some idea of which pitchers did better than we might have expected based on their age, while removing the influence of the fielders.

First, I found all pitcher seasons with 150 or more IP from 1920-2005. Then I ran a regression with each pitcher's ERA relative to the league average as the dependent variable and strikeouts, HRs, and walks (all relative to the league average) as the independent variables (data is from the Lee Sinins Complete Baseball Encyclopedia). Here is the regression equation:

(1) Rel ERA = .658 + .279*BB + .242*HR - .201*SO

Again, these are all relative to the league average. Each pitcher's stats can be plugged into this formula to get their defense independent relative ERA (the correct term might be fielding independent, but I'm not sure). In any case, this is their expected relative ERA based only on stats the pitcher himself is responsible for. Once I had this for all pitchers in the study, I cut the total down to only pitchers who had at least 10 seasons with 150+ IP. Then I found the relationship between this predicted relative ERA (or defense independent ERA or DIPS ERA) and age. The graph below shows the relationship.
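Equation (1) is simple to apply in code. Here is a minimal sketch (the function name and argument names are mine; the coefficients are the rounded regression coefficients above):

```python
def dips_rel_era(rel_bb, rel_hr, rel_so):
    """Predicted relative ERA from equation (1), given a pitcher's walk,
    HR, and strikeout rates, each expressed relative to the league
    average (1.0 = exactly league average)."""
    return 0.658 + 0.279 * rel_bb + 0.242 * rel_hr - 0.201 * rel_so

# A pitcher exactly league average in all three components:
print(round(dips_rel_era(1.0, 1.0, 1.0), 3))  # → 0.978
```

Note that with the rounded coefficients, an exactly league-average pitcher comes out at .978 rather than 1.00; the unrounded regression would land closer to average.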

This graph covers ages 22-44 for pitchers who had at least 10 seasons with 150+ IP. There were a few seasons at younger and older ages, but not many, and the relationship was actually stronger when only these ages were included. The numbers in the graph are the average DIPS ERA for each age in this sub-group of pitchers.

I could have just used the average DIPS ERA at each age for all the pitchers. But that simple average can be misleading, because a pitcher probably has to be pretty good to be used at a very young or very old age; the average at the oldest ages can look artificially good because only good pitchers are still around. By looking only at pitchers who had at least 10 seasons with 150+ IP, we get a more realistic aging pattern: since everyone in this group pitched long enough to be pretty good, we don't have to worry about survivorship distorting the oldest ages.

The equation showing the relationship between DIPS ERA and AGE is:

(2) DIPS ERA = .00052*AGE^2 - .0322*AGE + 1.3755

But this was for the pitchers who had at least 10 seasons with 150+ IP. For all the pitchers, I assumed the same aging pattern but moved the intercept up to account for their inferior quality compared to the smaller group. The shift in the intercept was equal to the difference in average DIPS ERA between the whole group of pitchers (.9466) and the smaller group (.9011), which is .0455. Adding that to 1.3755 makes the intercept 1.421. So the equation to predict DIPS ERA becomes

(3) DIPS ERA = .00052*AGE^2 - .0322*AGE + 1.421
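Equation (3) can be sketched the same way (names are mine; note that with these rounded coefficients the output may differ slightly from the figures quoted elsewhere in the text):

```python
def age_predicted_dips(age):
    """Expected DIPS ERA (relative to league average) for a pitcher of a
    given age, per equation (3): the 10-season aging curve shifted up by
    .0455 to cover the full pool of pitchers."""
    return 0.00052 * age ** 2 - 0.0322 * age + 1.421

# The fitted parabola bottoms out (best expected DIPS ERA) near age 31:
best_age = 0.0322 / (2 * 0.00052)  # vertex of the quadratic, ≈ 31.0
```

One consequence of the quadratic form: expected DIPS ERA improves until roughly age 31 and worsens symmetrically afterward.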

Now each pitcher's age was plugged into equation (3) to get a predicted DIPS ERA, which then got compared to the value from equation (1). Take Dazzy Vance in 1925, for example (age 34). His predicted DIPS ERA based on that age would be .9042 (plugging an AGE of 34 into equation (3)). But his DIPS ERA from equation (1), based on his relative strikeouts, HRs and walks, was .4546. So he was .4496 better than his age predicted.

The next step was to see how many runs he saved as a result. He pitched 265.33 innings, which makes 29.5 complete games. The league ERA that year was 4.27, so his age-predicted ERA would be 3.86 (.9042*4.27), while the ERA predicted from his own stats was 1.94 (.4546*4.27). The difference is 1.92 (3.86 - 1.94), and that times 29.5 gives 56.6 runs saved. This was the most runs saved once AGE was taken into account and only the pitcher's own stats were considered. Here are the top 25 age adjusted seasons in runs saved based solely on the pitcher's stats.
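The Vance arithmetic above can be collected into one small function (a sketch; the function and argument names are mine, and the inputs are the figures quoted in the text):

```python
def age_adjusted_runs_saved(ip, league_era, age_dips, stat_dips):
    """Runs saved beyond what the pitcher's age alone would predict.
    age_dips comes from equation (3), stat_dips from equation (1);
    both are ERAs relative to the league average."""
    complete_games = ip / 9.0                     # e.g. 265.33 / 9 ≈ 29.5
    age_predicted_era = age_dips * league_era     # e.g. .9042 * 4.27 ≈ 3.86
    stat_predicted_era = stat_dips * league_era   # e.g. .4546 * 4.27 ≈ 1.94
    return complete_games * (age_predicted_era - stat_predicted_era)

# Dazzy Vance, 1925:
print(round(age_adjusted_runs_saved(265.33, 4.27, 0.9042, 0.4546), 1))  # → 56.6
```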

These stats are not park adjusted. That is why Pete Donohue is up there; he pitched in a low-run park. Clemens of 2005 is not here. He would rank only 215th.

I used one other method last week to find the best age adjusted seasons. I subtracted the normal peak age from each pitcher's age and took the absolute value. That got multiplied by his runs saved (not age adjusted as in the explanation above; here it was simply (IP/9)*(the difference between the league ERA and the ERA predicted by equation (1))). The peak age was 29.89, which I found by averaging the ages of the top 250 seasons in predicted relative ERA (using equation (1)). For example, Dazzy Vance in 1930 was 39. That minus 29.89 is 9.11. His relative ERA was predicted by equation (1) to be .6237. That times the league ERA of 4.97 is 3.10, so the difference from the league ERA is 1.87. He pitched 258.667 innings, which divided by 9 is 28.75, and that times the 1.87 difference gives 53.75 runs saved. Multiplying by 9.11 gave him 489.67 "age points." The top 25 in "age points" are listed below.
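The "age points" calculation works out to a one-liner plus the runs-saved step. Here is a sketch (names are mine; the inputs are the Vance 1930 figures from the text):

```python
def age_points(age, ip, league_era, stat_dips, peak_age=29.89):
    """'Age points': runs saved (not age adjusted) multiplied by the
    pitcher's distance from the peak age in either direction."""
    predicted_era = stat_dips * league_era            # e.g. .6237 * 4.97 ≈ 3.10
    runs_saved = (ip / 9.0) * (league_era - predicted_era)
    return abs(age - peak_age) * runs_saved

# Dazzy Vance, 1930:
print(round(age_points(39, 258.667, 4.97, 0.6237), 2))  # → 489.67
```

Because of the absolute value, a 24-year-old and a 36-year-old with the same runs saved earn roughly the same points, which is exactly the symmetry described below.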

Clemens makes this list. Notice that there are some relatively young pitchers here. By using absolute value, the farther a pitcher is from the "peak age," the more points he would get. So this puts a guy 10 years over peak on the same footing as a guy who is 10 years under.