For much of baseball history, if a starter couldn't cut it in the rotation, he found himself out of baseball. With the decline in complete games and the rise of the relief pitcher throughout the last few decades, many failed starters have found a home in the bullpen. Pitchers like Dennis Eckersley, Eric Gagne, Jason Isringhausen and John Smoltz, among many others, have thrived as relievers when injuries or ineffectiveness forced them from starting.
In general, relievers have better numbers than starters. They strike out more batters, give up fewer hits, and have a better ERA (somewhat surprisingly, they also walk more batters). In The Book, MGL, Tango and Andy Dolphin calculated that the average pitcher will have an ERA .8 runs better as a reliever than as a starter. Conventional wisdom suggests pitchers perform better in relief than when starting because they're expected to pitch fewer innings and can throw harder. In this post, we set out to test that conventional wisdom using the Pitch F/X data.
Let me pause a moment (as I always do when writing these things) to offer a disclaimer. I know they're boring, but these things need to be said. If you want to, you can read this paragraph in the really fast voice that comes on at the end of bank commercials (you know, the "protected by FDIC" guy). That might make it a little more interesting for you.
I'm relying on the Pitch F/X internal pitch classification system to identify what type of pitch is thrown. Mike Fast took a look at this during the first week of the season and found it less than impressive for classifying pitches. Since then, there have been some changes to the algorithm, and it appears the classification system is more accurate now. However, there is some uncertainty in the pitch types used in this study. Also, I'm relying on a very small subset of information, so sample size will be an issue. We'll get into that more when I start talking about the methodology.
Now that that's out of the way, let's get to the actual study. I identified the 70 pitchers who had made at least one start and one relief appearance this season (as of June 25). From that list, I picked those pitchers who had at least 200 pitches in starts (roughly 2-3 starts) and 100 pitches in relief (2+ appearances). That brought the list down to 26 pitchers. Finally, I split that sample into fastball and junkball pitchers. Fastball pitchers were those who threw some variety of fastball on at least 60% of their pitches as starters; junkball pitchers were those who threw fastballs less than 60% of the time. In our sample of 26, there were 17 fastballers and 9 junkballers.
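For the curious, here's a rough sketch of that selection logic in Python. The field names, toy pitch counts, and thresholds are my own illustration, not the actual Pitch F/X data schema:

```python
# Sketch of the sample-selection rules described above, on made-up data.
# The dictionary field names are assumptions, not the real data schema.

def split_sample(pitchers, min_start=200, min_relief=100, fb_cutoff=0.60):
    """Keep pitchers meeting both pitch-count minimums, then split them
    into fastballers (>= 60% fastballs as starters) and junkballers."""
    fastballers, junkballers = [], []
    for p in pitchers:
        if p["start_pitches"] < min_start or p["relief_pitches"] < min_relief:
            continue  # not enough pitches in one of the two roles
        fb_pct = p["start_fastballs"] / p["start_pitches"]
        (fastballers if fb_pct >= fb_cutoff else junkballers).append(p["name"])
    return fastballers, junkballers
```

Run against the real pitch logs, this is the filter that got the list from 70 pitchers down to 26.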
There are two main ways pitchers can throw harder in relief: they can either throw a higher percentage of fastballs, or they can throw each pitch harder. If I had to guess, I'd bet both contribute to the improved ERA for relievers.
First, let's explore whether pitchers changed their approach in relief by throwing a higher percentage of fastballs. The average pitcher in the sample increased his fastball percentage by 2 percentage points when relieving. That breaks down to 1 point for fastballers and 4 points for junkballers. While there were some major swings in pitch selection (Josh Banks increased his fastball percentage 24 points in relief), there's no real indication that pitchers throw a substantially different mix of pitches when starting versus relieving.
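The arithmetic behind that comparison is simple; a sketch, with the pitch counts invented for illustration:

```python
def fb_pct(fastballs, total):
    """Fastball share of all pitches thrown."""
    return fastballs / total

def usage_change(start_fb, start_total, relief_fb, relief_total):
    """Change in fastball share from starting to relieving,
    in percentage points (positive = more fastballs in relief)."""
    return 100 * (fb_pct(relief_fb, relief_total) - fb_pct(start_fb, start_total))

# Example: 60% fastballs as a starter, 62% as a reliever -> +2 points,
# matching the sample average reported above.
change = usage_change(start_fb=120, start_total=200, relief_fb=62, relief_total=100)
```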
If it's not more fastballs being thrown, then it's got to be the pitchers putting more behind each pitch, right? The logic makes sense: if pitchers know they're not going to need to throw as many pitches, they can devote more energy to each pitch. But is that really the case?
Before we dive into the data, we need to discuss how we'd identify a pitch being thrown harder. The easiest way is to compare pitch speeds - higher should equal harder. Beyond pitch speed, we can look at movement. Here's where things get difficult. I'll admit I'm not sure I've got the physics correct, but let's give it a try anyway. The Pitch F/X data gives us information about both the horizontal break (pfx_x) and the vertical break (pfx_z). For a much better description than I could possibly give, please refer to Dr. Alan Nathan's explanation of what they mean. I compared each pitcher's horizontal and vertical break as a starter and as a reliever to determine when he had more break. My thought is that, if we hold speed constant, greater break means the pitch is thrown harder (because it moves more with the same initial velocity).
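Under that assumption, one way to summarize "total" break is the magnitude of the (pfx_x, pfx_z) vector. A minimal sketch, with the break averages made up for illustration:

```python
import math

def total_break(pfx_x, pfx_z):
    """Overall movement magnitude, in inches, from the two break components."""
    return math.hypot(pfx_x, pfx_z)

def more_break_in_relief(start_xz, relief_xz):
    """True if a pitch type's average break was larger in relief than in starts.
    Inputs are (pfx_x, pfx_z) tuples of average break for the same pitch type."""
    return total_break(*relief_xz) > total_break(*start_xz)
```

Note this collapses the two components into one number; the tables below instead track the horizontal and vertical components separately.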
I think it's time to get into some results. First, pitch speed. There wasn't much deviation in pitch speed across our sample. Most pitchers were within 2 MPH in either direction, which doesn't seem like it should make a major difference in outcomes (fastball speed of 87-88 while starting going to 88-89 while relieving). The pattern holds when looking at the pitcher's primary non-fastball pitch as well.
Because we saw little to no change in pitch speed from starting to relieving, I'm going to make a simplifying assumption that's almost certainly the absolute wrong thing to do mathematically. I'm going to assume speed is held constant when looking at movement. This allows me to apply my rule of thumb from before - where more movement = harder thrower.
Using that rule of thumb, we don't see much of a pattern when we look at fastball movement.
Category | Pitchers
---|---
+H, -V | 7
-H, +V | 6
+H, +V | 7
-H, -V | 6
In this table, H represents horizontal movement and V represents vertical movement. A plus indicates the pitcher increased the movement in that direction, while a minus indicates he reduced it. I'm going to go out on a limb and call this table inconclusive.
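Tallying those four buckets can be done mechanically. Here's a sketch; the break averages are invented, and comparing absolute values of pfx_x (which flips sign with pitcher handedness) is my own assumption about what "more horizontal movement" means:

```python
from collections import Counter

def movement_category(start_xz, relief_xz):
    """Label a pitcher +/-H, +/-V by whether each break component grew
    or shrank from starting to relieving. Inputs are (pfx_x, pfx_z)
    averages; absolute values are compared because pfx_x flips sign
    with pitcher handedness (my assumption)."""
    h = "+H" if abs(relief_xz[0]) > abs(start_xz[0]) else "-H"
    v = "+V" if abs(relief_xz[1]) > abs(start_xz[1]) else "-V"
    return f"{h}, {v}"

# Tally a toy three-pitcher sample into the table's four buckets.
sample = {
    "A": ((-6.0, 8.0), (-7.5, 7.0)),  # more horizontal, less vertical
    "B": ((4.0, 5.0), (3.0, 6.5)),    # less horizontal, more vertical
    "C": ((2.0, 9.0), (2.5, 9.5)),    # more of both
}
counts = Counter(movement_category(s, r) for s, r in sample.values())
```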
But what about breaking balls? We already saw there wasn't much of a speed difference, but what about a difference in movement?
Category | Pitchers
---|---
+H, -V | 7
-H, +V | 7
+H, +V | 9
-H, -V | 3
So we do tend to see more movement on breaking pitches (and R.A. Dickey's knuckleball) in relief than when starting. It's by no means definitive, but in this small sample, and according to my assumptions, pitchers do seem to be throwing slightly harder in relief than when starting.
But, if in fact pitchers are throwing harder as relievers, the effect shows up in pitch movement rather than pitch speed - which is somewhat contrary to conventional wisdom. And is this effect enough to explain the .8 run difference in ERA? It doesn't seem plausible to me. But if how hard a pitcher throws is not the cause, what can we attribute it to?
Other possible causes include only facing any given batter once per game instead of 2-3 times (the familiarity effect), not being charged with runs when you allow inherited runners to score (the not-my-fault effect), or possibly even facing more pinch hitters (because of the pinch hitter penalty outlined in The Book).
We started this article attempting to confirm the conventional wisdom that pitchers perform better in relief than as starters because they throw harder in relief. Surprisingly, that doesn't seem to be the case. Looking at the sample of 26 pitchers who have been both starters and relievers in 2008, the evidence that they throw harder when relieving is scanty at best. There's a very minor (less than 1 MPH on average) difference in pitch speed, mixed results on pitch movement that may hint at pitchers throwing a little harder on breaking balls, and little to no change in the percentage of fastballs thrown. While these may all contribute to the .8-run ERA difference uncovered in The Book, it seems unlikely they explain the whole effect. Follow-up work should examine a larger sample of pitchers and attempt to quantify the run effect of a difference in pitch speed. And someone who has taken a physics class in the past decade should probably either confirm or reject my assumptions.
The data I used for this study can be found here.