

Does the Home Run Derby "Curse" Really Exist?

Earlier in July I remember watching an episode of SportsCenter in which former major league manager Dusty Baker was debating whether or not Alex Rodriguez should participate in the 2007 Home Run Derby.

I specifically remember the phrase "home run derby curse" being mentioned by a SportsCenter anchor (whose name escapes me) as a reason why A-Rod should be careful in deciding whether or not to partake in the derby.

A myth that seems to surface around the time of each season's All-Star Game is the theory that players who participate in the home run derby (predominantly the finalists) will see a significant decline in their second-half production, specifically in the power department.

Rodriguez, of course, did not participate, and Vladimir Guerrero ended up winning the derby, defeating Alex Rios in the final round.

So, was the decision not to take part in the derby a wise one for Rodriguez?  Did he avoid the home run derby "curse"?

Let's take a look.  Here is a list of the twenty-two home run derby finalists from the last eleven seasons, along with their production in the first and second halves of their respective seasons:

(Note:  OPS+ is relative to Major League OPS for the split)
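Taking the note above at face value, the split-relative OPS+ here is just the player's OPS divided by the league's OPS over the same half, scaled to 100. A minimal sketch of that calculation (the .400/.600 and .340/.430 split lines below are made-up numbers, not real data, and this simple ratio differs from the park-adjusted OPS+ you'd see on Baseball-Reference):

```python
def ops(obp, slg):
    """On-base plus slugging."""
    return obp + slg

def split_ops_plus(player_obp, player_slg, league_obp, league_slg):
    """Player OPS relative to league OPS for the same split, scaled so 100 = league average."""
    return 100 * ops(player_obp, player_slg) / ops(league_obp, league_slg)

# Hypothetical first half: player hits .400 OBP / .600 SLG
# while the league as a whole hits .340 OBP / .430 SLG.
print(round(split_ops_plus(0.400, 0.600, 0.340, 0.430)))  # 130
```

An OPS+ of 130 by this measure means the player's OPS was 30 percent better than the league's over that half.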

A sample of only twenty-two players is quite small, but the event hasn't been around forever (it became part of the All-Star festivities in 1985), and the total number of home runs hit in the derby skyrocketed in the 1990s, so we can still get an idea of whether this supposed curse exists.

Here's what we find from the chart:

*Eight players saw improvement in second-half OPS+, thirteen saw decline, and one (Giambi in 2002) saw neither improvement nor decline.

*Nine players saw improvement in second-half AB/HR ratio, thirteen saw decline.

*As a whole, HR/AB ratio declined slightly in the second half.

When looking at things from an OPS+ perspective, we need to consider the concept of regression.

Larry Walker saw a significant decrease in his second half OPS+ back in 1997, but was it even reasonable to expect that he could repeat his first half production (.398/.496/.741) in the second half?  Probably not.  

Walker still had an outstanding second half that season (.328/.397/.695), but he goes down as a guy who saw a decrease in production following his participation in the derby.

The same type of thing can be said for Mark McGwire in 1996 (.332/.496/.796 first half vs. .292/.437/.665 second half) and Luis Gonzalez in 2001 (.355/.443/.745 first half vs. .290/.412/.620 second half).

For some of these players, it's not so much that the home run derby altered their swing and caused a second-half decline; it's more that they were playing over their heads in the first half and simply fell back to earth a bit.

When I combined all the finalists' pre- and post-All-Star at-bats and home runs, I found that the average finalist sees his AB/HR rate increase, but only slightly (i.e., home runs come marginally less often).
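That combined calculation can be sketched as follows. The at-bat and home run totals below are invented for illustration only, not the actual derby-finalist data; the point is the method of pooling everyone's totals before dividing, rather than averaging each player's individual rate:

```python
# Hypothetical (at_bats, home_runs) for each finalist, split by half.
first_half = [(320, 28), (305, 24), (298, 21)]
second_half = [(280, 20), (270, 19), (260, 16)]

def pooled_ab_per_hr(splits):
    """Sum everyone's at-bats and home runs, then take the ratio.

    Pooling the raw totals keeps a player with few at-bats from
    skewing the group rate the way averaging per-player rates would.
    """
    total_ab = sum(ab for ab, hr in splits)
    total_hr = sum(hr for ab, hr in splits)
    return total_ab / total_hr

before = pooled_ab_per_hr(first_half)    # AB per HR before the break
after = pooled_ab_per_hr(second_half)    # AB per HR after the break
print(round(before, 1), round(after, 1))  # 12.6 14.7
```

In this made-up example the group needed about two more at-bats per home run after the break, which is the direction (though not the magnitude) of the small shift described above.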

The difference isn't really significant, but we have seen a few standouts in derby history.

Players like David Wright in 2006, Jeromy Burnitz in 1999, Garret Anderson in 2003, and Bobby Abreu in 2005 all saw massive declines in their home run rates following the derby, whereas Ivan Rodriguez in 2005, Miguel Tejada in 2004, and Sammy Sosa in both 2000 and 2001 each saw significant improvement in their home run frequencies.

Once again, players have tended to see declines in their home run rates following participation in the derby, but the combined difference is quite small.

So, was A-Rod wise in passing up the offer to participate in the derby?

It probably wouldn't have made a difference.  Some players have performed even better in the second half after taking a few swings in the derby, while others have not.  There just isn't an extreme pattern here.