Growing up, my favorite player was Cal Ripken Jr. Why did that happen? Well, it probably had to do with the fact that he played for the Baltimore Orioles, which were the closest major league team to me at that point. Also, it didn't hurt that I began realizing that I could watch baseball on television around September of 1995.
Once I started getting a bit more into baseball numbers, I naturally began delving into the numbers of my favorite player. The first number that comes up about Ripken is 2,632, referring, of course, to his consecutive games streak. If Woody Allen was right that showing up is most of success, Ripken had that in spades.
Unfortunately, by the time I started watching baseball, Ripken was on the backside of his career. While there would still be some shining moments (his 6-for-6 game against Atlanta in 1999 comes to mind), his best days were behind him at that point.
It brings to mind an interesting question: Did all the games Ripken played get to him later in his career? Granted, Ripken was 35 in 1995, but still. Does a high number of games played cause players to crash to earth earlier?
So, to look at this, I took data going back to 1980. I classified player-seasons into three categories: players who played in at least 99% of their team's games (160+ games), players who played in at least 95% of their team's games (154+ games), and all other players who at least qualified for a batting title.
Now, in order to look at the change players experience, I wanted a rate statistic that isn't directly affected by the number of plate appearances. So, I decided to look at wOBA, specifically a player's percent change in wOBA. For two seasons to be compared, both seasons involved in calculating the percent change in wOBA had to be qualified. This led to 3,059 total data points.
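The two building blocks described above (bucketing player-seasons by games played, and computing the year-over-year percent change in wOBA) can be sketched as follows. This is a minimal illustration, not the actual code used for the article; the function names and the use of raw game counts as cutoffs are my assumptions.

```python
def classify_season(games_played):
    """Bucket a player-season by games played, using the article's
    cutoffs: 160+ games (~99% of team games), 154+ (~95%), and the
    rest of the batting-title qualifiers."""
    if games_played >= 160:
        return "160+"
    elif games_played >= 154:
        return "154+"
    else:
        return "153-"

def woba_pct_change(woba_year1, woba_year2):
    """Percent change in wOBA from one qualified season to the next."""
    return 100.0 * (woba_year2 - woba_year1) / woba_year1
```

For example, a player whose wOBA fell from .350 to .315 between two qualified seasons would register a 10% decrease.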
So there are a few ways to look at whether a high number of games played affects performance later in career. The first way could be to look at the immediate, next year performance of a player. So let's start there. What do seasons look like a year after a player plays in at least 95% (154+ games) of his team's games?
As a side note, the data is fitted using nonparametric regression, which estimates the average percent change in wOBA as a function of age. The fits are given below, with the fit for the 95%+ group in red.
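The article doesn't specify which nonparametric smoother was used, but the general idea can be sketched with a simple Nadaraya-Watson kernel smoother: at each age, the fit is a weighted average of observed wOBA percent changes, with nearby ages weighted more heavily. The bandwidth value here is an arbitrary assumption for illustration.

```python
import math

def kernel_smooth(ages, pct_changes, age_grid, bandwidth=2.0):
    """Nadaraya-Watson estimate of the mean wOBA percent change at each
    age in age_grid, using a Gaussian kernel. A stand-in for whatever
    nonparametric fit was actually used."""
    fits = []
    for a in age_grid:
        weights = [math.exp(-0.5 * ((x - a) / bandwidth) ** 2) for x in ages]
        total = sum(weights)
        fits.append(sum(w * y for w, y in zip(weights, pct_changes)) / total)
    return fits
```

Running this separately on each games-played group produces the per-group age curves compared in the plots.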
So, it looks like the average percent change for the 154+ group dips below that of the 153-or-fewer group as early as age 26. As players age, the 153- group eventually catches up to the 154+ group. But between the ages of 28 and 36, playing in 154 or more games in a season leads, on average, to a greater decrease in the subsequent season's wOBA.
What if we go to an even higher rate of games played? What happens when a player plays in at least 99% of his team's games (160+ games)? We see a similar pattern to the 154+ group.
Here, we see similar average wOBA percent changes for both groups until age 31. After that point, the 160+ group bottoms out quickly. If we look at all three groups at once, we see the same pattern.
So, this confirms what seems to be clear: As a player gets older, it gets tougher for them to recover from long seasons. We see this reasonably clearly in all 3 plots.
However, this really doesn't answer the initial question: Does playing a large number of games (in full-season chunks) early in a career lead to earlier decreases in performance? To look at that, I found the players who had at least four seasons of 154+ games played before the age of 30. Then, I once again looked at their change in wOBA over their careers.
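The selection step above can be sketched as follows. The tuple layout and function name are assumptions for illustration, not the actual data format used.

```python
from collections import defaultdict

def heavy_usage_players(seasons, min_seasons=4, games_cut=154, age_cut=30):
    """seasons: iterable of (player_id, age, games_played) tuples.
    Returns the set of players with at least min_seasons seasons of
    games_cut+ games played before reaching age_cut."""
    counts = defaultdict(int)
    for player, age, games in seasons:
        if age < age_cut and games >= games_cut:
            counts[player] += 1
    return {player for player, n in counts.items() if n >= min_seasons}
```

The wOBA percent changes for the players this returns are then compared against everyone else, age by age.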
Here, the pattern isn't as clear as before. It does seem that the players who played 154+ games in four or more seasons before age 30 start to decline faster at age 31. However, the difference does not appear large enough to say that a large number of games early in a career severely affects performance later on.
Now granted, there is a survivorship problem here. It's entirely possible that large groups of players simply didn't last far into their 30s, adding a bit of extra variability to the problem. Regardless, there does appear to be a slight association between the two variables.
Of course, there's the cloud that hangs over all of this: the steroid era, which allowed players to stave off decline much longer than in years past. The difference between groups here may not be as great as it seems, especially if the 153-or-fewer games group was the group on steroids.
However, once the steroid era has, hopefully, been washed out of the data in 10 years, we may find that games played has much less long-term effect on player performance.