I ran across something odd when looking at the GB/FB relationship for pitchers. First, here's a look at the overall effect on runs per nine when R/9 is plotted against GB/FB rate (GB/(GB+FB), to be exact):
As you can see, there appears to be an ever-so-slight decrease in runs as a pitcher's GB/FB rate goes up. It's when you split the pool into relievers and starters that it gets interesting:
Here, relievers show a slight increase in runs allowed as GB/FB rate goes up, while starters show the opposite trend: runs decrease as GB/FB rate goes up. Neither trend is very strong, and the low R values suggest that GB/FB rate has little predictive value for an individual pitcher's R/9, but the sheer number of pitchers in each group (1800+ for relievers, 1100+ for starters) should mean we're really seeing something real here; with samples that large, even a weak correlation is statistically significant (a quick check is sketched below). I don't quite know what to make of this.
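To make that "large n" point concrete, here's a minimal sketch of the standard t-test for a Pearson correlation coefficient. The charts' exact R values aren't quoted above, so the r = 0.10 used here is purely a hypothetical stand-in:

```python
from scipy import stats

def corr_pvalue(r: float, n: int) -> float:
    """Two-sided p-value for a Pearson correlation r over n observations,
    using the usual t-test with n - 2 degrees of freedom."""
    t = r * ((n - 2) / (1 - r**2)) ** 0.5
    return 2 * stats.t.sf(abs(t), df=n - 2)

# Hypothetical r -- illustrative only.
print(corr_pvalue(r=0.10, n=1800))  # relievers: p ~ 2e-5
print(corr_pvalue(r=0.10, n=1100))  # starters:  p ~ 9e-4
```

So even a correlation too weak to predict any one pitcher's R/9 would be very unlikely to appear by chance in samples this size.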
Just for disclosure: the data are taken from Fangraphs for the period of 2002 through 2010, covering all pitcher-seasons of 20 IP or more. As stated, I'm using GB/(GB+FB), and the two groups were made up of those who strictly relieved and those who strictly started (a sketch of that filtering is below). FWIW, the mixed group, those who both started and relieved in the same season, has a trendline almost exactly like the first chart's.
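For anyone who wants to replicate the setup, here's roughly how the filtering and per-group trendlines could be done in Python. The file name and column names (IP, GB, FB, G, GS, R9) are assumptions for illustration; a real Fangraphs export may label things differently:

```python
import pandas as pd
from scipy import stats

# Hypothetical export of 2002-2010 pitcher-seasons from Fangraphs.
df = pd.read_csv("fangraphs_pitchers_2002_2010.csv")

df = df[df["IP"] >= 20].copy()                    # 20 IP minimum
df["GB_rate"] = df["GB"] / (df["GB"] + df["FB"])  # GB/(GB+FB)

# "Strictly" one role: every appearance was a start, or none were.
starters  = df[df["GS"] == df["G"]]
relievers = df[df["GS"] == 0]

for label, grp in [("starters", starters), ("relievers", relievers)]:
    fit = stats.linregress(grp["GB_rate"], grp["R9"])
    print(f"{label}: slope={fit.slope:.2f}, r={fit.rvalue:.3f}, "
          f"p={fit.pvalue:.4f}, n={len(grp)}")
```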
I thought I'd post this here because there are a lot of smart baseball people at BtB, and someone might be able to shed some light on this. What's going on here? Why would batted balls have a different effect for relievers than for starters? Is there some kind of bias creating these divergent results?