So what have we learned about the simulations' general accuracy?
- In 2010, the Yankees finished one game and the Rays two games shy of the average projection. The Red Sox finished off by six games, but no projection can — and no projection should — project a season like that.
- In 2009, the Red Sox finished one game better than projected, but the Yankees overperformed by seven games.
- In 2008, the projections generally punted. The Rays were 12 games better than projected, the Yankees were six games worse, and the Red Sox were three games better. Only the Blue Jays were projected more or less accurately.
- In 2007, no projection came closer than four games to the Red Sox' ultimate record, but they pretty much nailed the Yankees'.
- In 2006, the Sox underperformed by three games, the Blue Jays overperformed by four, and the Yankees were seven games better than the average projection.
- In 2005, the projections nailed it, placing the Sox and Yankees each one game off, one on either side of the clubs' ultimate first-place tie.
Of the 16 AL East team-seasons projected over the course of the last six years, I count seven successes (within two games of the actual record) — a 43.8 percent success rate. As for significant misses (four games or more off), I also count seven. And twice, the projections were three games off, which is right on the border between being fairly close and pretty obviously not close.
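The tally above is easy to verify. A quick sketch, using the per-team errors from the yearly bullets — note that three entries aren't given exactly in the text and are assumed here: the 2008 Blue Jays ("more or less accurately") as 1, the 2007 Red Sox ("no closer than four games") as 4, and the 2007 Yankees ("nailed") as 0:

```python
from collections import Counter

# Games off the average projection, per team, from the yearly bullets.
# Entries marked (assumed) are not stated exactly in the text.
errors = [1, 2, 6,      # 2010: Yankees, Rays, Red Sox
          1, 7,         # 2009: Red Sox, Yankees
          12, 6, 3, 1,  # 2008: Rays, Yankees, Red Sox, Blue Jays (assumed)
          4, 0,         # 2007: Red Sox (assumed), Yankees (assumed)
          3, 4, 7,      # 2006: Red Sox, Blue Jays, Yankees
          1, 1]         # 2005: Red Sox, Yankees

def bucket(err):
    """Within two games is a success; four or more off is a miss;
    exactly three sits on the border."""
    if err <= 2:
        return "success"
    if err >= 4:
        return "miss"
    return "border"

counts = Counter(bucket(e) for e in errors)
print(counts)  # 7 successes, 7 misses, 2 on the border
print(counts["success"] / len(errors))  # 7/16 = 0.4375, i.e. 43.8 percent
```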
Then there's the other, arguably more important facet, which is whether they got the standings right, even if one team ended up significantly better or worse than projected. Being seven games off the Yankees' record doesn't matter nearly as much if the Yanks were expected to finish in first anyway, for example.
On that score, the projections got it right in 2005 and 2009. That's it.
They get partial credit for 2006 in correctly projecting the division winner but swapping second and third.
They missed entirely in 2007, in 2008 (though the projections correctly had the Sox in second, I don't count it as a success if you miss on the first- and third-place teams), and in 2010, when all three teams finished differently than projected.
So the projections were successful, or at least got the division winner right, three times, and missed the boat three times. Taken together with the roughly 50/50 success rate on raw record projections, it seems we can be about 50 percent certain of the projections being correct in any given season.
As Hudson noted on one of the previous threads, it seems like we could randomly pick two numbers between 92 and 98, assign the higher number to the Yankees and the other to the Red Sox, and have just as good a chance of correctly projecting the AL East as 1,000 simulations of whatever top-flight projection systems are out there today.
If we did that, we would get it right in 2005 and 2009, get the division winner correct in 2006, and miss it entirely in 2007, 2008 and 2010. Just like the projections.
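Hudson's back-of-the-envelope alternative amounts to a few lines of code. A minimal sketch — the 92-to-98 range and the team assignments come from the thread, while the function name and uniform draw are my own assumptions:

```python
import random

def naive_projection(low=92, high=98):
    """Draw two win totals uniformly at random from [low, high],
    then hand the higher one to the Yankees and the lower one
    to the Red Sox, per Hudson's suggestion."""
    a = random.randint(low, high)
    b = random.randint(low, high)
    return {"Yankees": max(a, b), "Red Sox": min(a, b)}

proj = naive_projection()
print(proj)  # e.g. {'Yankees': 97, 'Red Sox': 93}
```

Any draw it produces respects the one constraint Hudson's method encodes: the Yankees' win total is at least the Red Sox', and both land between 92 and 98.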
But that doesn't make them any less fun, which of course is the entire point.