On Sunday night Jake Arrieta no-hit the Los Angeles Dodgers in a truly dominant performance; by Game Score it was the second-strongest start of the season, behind only Max Scherzer's 16-strikeout shutout of the Brewers in June.
Arrieta's no-hitter was the second time the Dodgers had been no-hit in a ten-day span, and the sixth no-hitter of the 2015 season. The sixth! Having a no-hitter happen roughly once each month of the season is a much higher frequency than one would expect. Regardless, six no-hitters places the 2015 season into a tie for third with the 1969 season in terms of no-hitters produced. One more no-hitter this season and 2015 will jump into a tie for second with two others, and with three more no-hitters this season, which is probably asking a bit much at this point, 2015 will match the 1990 season at the top of the charts:
[Table: seasons ranked by no-hitters thrown; only the final row survived extraction: "20 tied with 3"]
As you can see in the table, there are a few recent seasons at the top of the charts. Analyzing on a season-by-season basis provides some information, but grouping seasons into half-decade bins is interesting for examining larger trends. The plot below shows this information, using standard half-decade endpoints. I should note that these data were obtained using the Baseball Reference Play Index, running a query for team games since 1961 in which no hits were allowed. The date cutoff is arbitrary but begins with the expansion era of the game, and I queried team games to ensure that any game in which a staff combined for a no-hitter was captured. Perfect games are also included in the dataset.
As you can see, the last five seasons have cumulatively produced more no-hitters than any half-decade block over the last 55 years. For those interested, or annoyed with the arbitrary common half-decade endpoints, using a rolling sum over five-year trailing bins across the entire period does not significantly alter the pattern, although the most recent time period is no longer the high mark:
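The rolling-sum idea is simple enough to sketch. The counts below are illustrative stand-ins, not the Play Index dataset used in the article; the function just sums each season with the four seasons before it.

```python
# Made-up per-season no-hitter counts, for illustration only
# (the article's actual counts come from a Baseball Reference Play Index query).
no_hitters = {2011: 3, 2012: 7, 2013: 3, 2014: 5, 2015: 6}

def trailing_sum(counts, window=5):
    """Sum of no-hitters over each trailing `window`-year bin.

    Years missing from `counts` contribute zero, so early bins are
    effectively truncated rather than dropped.
    """
    years = sorted(counts)
    return {
        y: sum(counts.get(y - k, 0) for k in range(window))
        for y in years
    }

print(trailing_sum(no_hitters)[2015])  # sum for the 2011-2015 bin -> 24
```

Unlike fixed half-decade endpoints, every season gets its own trailing bin, which is why the rolling version smooths out the arbitrariness of where a half-decade happens to start.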
Due to the team expansion that MLB enacted over the time frame considered here, looking at raw totals for the number of no-hitters is not ideal. Rather, the rate (no-hitters in the season divided by total games played in the season), expressed as a percentage, is a better measure. Even so, computing rates (and five-year rolling rates) does not really change the pattern:
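For concreteness, the rate measure works out as below. The example numbers are my own, not the article's dataset; note that because the query counted team games, a full modern season is 4,860 team games (30 teams times 162), not 2,430.

```python
# No-hitter rate as a percentage of team games, as described above.
# The example figures are illustrative, not the article's data.
def no_hitter_rate(no_hitters, team_games):
    """Percentage of team games in a span that ended as a no-hitter."""
    return 100.0 * no_hitters / team_games

# e.g. 5 no-hitters across 4,860 team games (30 teams x 162 games)
print(round(no_hitter_rate(5, 4860), 3))  # -> 0.103 percent
```

A rate of about 0.10 percent of team games is what five no-hitters in a modern season looks like, which is the neighborhood of the peak periods discussed below.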
If nothing else, what these figures show is that there have been three 'peak' periods for no-hitters over this time frame, in which 0.10 to 0.12 percent of games ended as a no-hitter. That may not sound like a lot, but it represents around five no-hitters in a season, almost one per month. To be clear, the periods with high rates are the late '60s / early '70s, the early '90s, and the last five seasons. What could be driving this trend?
The most obvious framework that should affect the proportion of games ending in a no-hitter is the run environment, which is shaped by factors such as on-base percentage, strikeout rates, and defense. The expansion of the '60s, the height of the mound, and the lack of a designated hitter in the American League all contributed to low run totals. The early '90s saw an increase in reliever specialization and a hangover from the contact-and-speed game of the '80s.
Currently, strikeout rates are at an all-time high, the result of an expanding strike zone and ever more specialized relief pitching. We also know that out-of-the-ordinary defensive alignments are deployed more frequently and affect batters, although a crude measure like batting average on balls in play suggests that hitters are more successful now than they were many years ago. That, however, simplifies the issue and is tangential to the hypothesis being explored here: that throwing a no-hitter should be easier in low-run-environment seasons.
It turns out that this is mostly the case. Using the runs-per-game totals listed on Baseball Reference's batting seasons page, I determined the run environment (R/G) for each of the rolling five-year bins shown in the figures above, weighting each season's R/G by the number of games played before averaging. Correlating run environment with no-hitter rate reveals a negative relationship, with 54.7 percent of the variance in no-hitter rate explained by run environment. This is shown below:
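The two computations in that paragraph can be sketched as follows. The season figures here are placeholders, not the Baseball Reference data: a games-weighted average gives the bin's run environment, and the squared Pearson correlation gives the share of variance explained (the article's 54.7 percent corresponds to r^2 = 0.547).

```python
def weighted_rpg(rpg_by_year, games_by_year):
    """Run environment for a bin: each season's R/G weighted by games played."""
    total_games = sum(games_by_year.values())
    return sum(rpg_by_year[y] * games_by_year[y] for y in rpg_by_year) / total_games

def r_squared(xs, ys):
    """Squared Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov * cov / (vx * vy)

# Illustrative two-season bin: equal schedules, so this is just the mean R/G.
print(weighted_rpg({2014: 4.07, 2015: 4.25}, {2014: 2430, 2015: 2430}))  # -> 4.16
```

The weighting matters for bins that span strike-shortened or expansion seasons, where games played differ substantially from year to year.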
This helps explain the high number of no-hitters we have seen over the last few seasons. Given that we are in a relatively low run environment (~4.2 R/G), it should be easier for pitchers to throw no-nos; it is certainly not an easy feat, but it is easier than it would be in a high run environment. And it is not just no-hitters. For example, there have already been 19 one-hitters this season, more than we saw in total in 2012 or 2013 and just three shy of the 2014 mark. The 1988 season holds the top spot with 24 one-hitters. I ran a similar analysis for one-hitters and the results are much the same. I will spare you another set of figures (if you really want to see them you can click here and here), but I will note that the peaks occur at similar time points, and the relationship between run environment and one-hitter rate is even stronger (R^2 = 0.8) than it was for no-hitter rate.
Perhaps this was much ado about something fairly obvious, but the analysis provides some context for why we are seeing so many no-hitters lately. It is getting to the point that, given the frequency of the accomplishment, no-hitters may be losing some of their appeal, which is an odd thing to say about something that still happens quite rarely.
Altogether, we are four-fifths of the way through the season and have seen a nearly historic number of no-hitters and one-hitters. There is still a month to go, so there is a good chance we will see 2015 climb the ranks in each category; if the present rates hold, we should see one more no-hitter and three or four more one-hitters. The low run environment may take away some of the excitement that comes with baseball, but it provides other kinds that we should appreciate.