davidmarver wrote: Is it me, or do bullpens and pinch hitters make a giant difference in the outcome of 1-run games?

I think some of it can be attributed to luck, but a lot of 1-run victories are such because of other factors.

This is true, but a team with a higher run differential is usually better.

I posed this question on another board about sports in general. This is a more football-oriented explanation, but just change all the terms to baseball terms and it makes sense (more, in fact, since baseball is governed by probability to an even greater degree):

"Wow. To continue with baseball: Every batted ball has a set of probabilities governing it. Aside from very minimal control of direction via swing speed and timing, the batter can't control where the ball goes. As such, and depending on the defense, it may or may not fall in for a hit. Because of this, every batted ball of a given velocity has roughly the same chance of falling in for a hit, given that the defensive team remains the same (no stamina loss, etc.). What makes a good hitter, then, is simply putting more balls in play, and harder.

However, it's still always a probability issue. If you're a .333 hitter, every at-bat gives you a one in three chance that you'll end up on base (or hitting a home run). You can't control that any more than I can control a coin flip -- it's going to come out relatively even, given enough chances, but that doesn't mean that it's not going to land heads fifty times then tails fifty times. Every time you swing, it's the same probability.
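To make the coin-flip point concrete, here's a quick Python sketch (purely illustrative -- the .333 rate and 600 at-bats are just stand-in numbers): every at-bat is an independent weighted flip, yet streaks and slumps still show up.

```python
import random

random.seed(1)

# A hypothetical .333 hitter: every at-bat is an independent
# one-in-three chance of reaching base -- a weighted coin flip.
P_ON_BASE = 1 / 3
AT_BATS = 600  # roughly a full season's worth

results = [random.random() < P_ON_BASE for _ in range(AT_BATS)]
rate = sum(results) / AT_BATS

# Track the longest run of outs: cold streaks appear even though
# the underlying probability never changes.
longest_slump = 0
current = 0
for reached_base in results:
    current = 0 if reached_base else current + 1
    longest_slump = max(longest_slump, current)

print(f"season rate: {rate:.3f}, longest hitless streak: {longest_slump}")
```

The season rate lands near .333, but the streaks inside it look anything but even.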

Though there's not the same element of ball-goes-anywhere-at-even-probability in football, if you assume static conditions, a 60% completion rate is always 60%. You have a three-in-five chance of completing a pass. Again, the coin analogy.

Now, let's take the coin analogy further, and say that heads is scoring, tails is not scoring. Each team has a given chance of scoring in any given situation.

Say a team gets four flips per quarter/inning/whatever. In football, that's 16 per game. Let's say Team A has a 75% score rate, and Team B has a 50% score rate. Game 1, A runs cold and goes 8-8 (scores on only 8 of its 16 flips), while B goes 7-9. Game 2, A runs hot and goes 16-0, B goes 9-7. It continues this way, hot games and cold games, throughout the entire season, and against their respective schedules both teams can finish at .500. But which is better? The one who scored 50% of the time, or the one who scored 75% of the time?

Okay, same setup, same assumptions, new matchup: Team B now has a 9/16 chance of scoring, and Team C has a 50% chance. In expectation, every single game B wins 9-8 over C. Is B now a superior team, even though its scoring rate is 56.25% compared to A's 75%?
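Here's a sketch of that B-versus-C matchup in Python (just the 9/16 and 50% rates above with 16 flips per game, numbers mine): on average B wins 9-8, but in any individual game C can run hot.

```python
import random

random.seed(42)

FLIPS = 16
GAMES = 10_000

def game_score(p):
    """Number of successful flips (scores) in one simulated game."""
    return sum(random.random() < p for _ in range(FLIPS))

# Team B converts 9/16 of its chances, Team C converts 8/16.
# The averages say 9-8 every night, but individual games vary:
# B wins more often than not, yet C still takes a healthy share.
b_wins = sum(game_score(9 / 16) > game_score(1 / 2) for _ in range(GAMES))
print(f"B won {b_wins / GAMES:.1%} of {GAMES} simulated games")
```

So "B wins 9-8 every game" only holds on average; the edge is real, but far from automatic.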

Remember, we're assuming absolutely static conditions, so probabilities are all that matter. A is going to be considered the better team.

Now realize, we've already introduced variability into the environment. In that first example, A had good days and bad days. Sure, it's never going to happen in quite that manner -- over the course of multiple seasons, they're more and more likely to go 12-4. But sometimes they'll go 8-8, and sometimes they'll go 16-0. Remember: same exact team, same exact players, same exact skill level. The only difference between years is that players have different peaks and valleys, and if all of those players have a bad day at the same time, it's going to cause the team to lose."
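The good-years/bad-years point can be sketched the same way (again with hypothetical numbers: treat each of 16 games as a 75% coin flip for a win, and replay many seasons):

```python
import random
from collections import Counter

random.seed(3)

SEASONS = 10_000
GAMES_PER_SEASON = 16
P_WIN = 0.75  # same team, same skill, every single season

# Tally how each simulated season's win total comes out.
records = Counter(
    sum(random.random() < P_WIN for _ in range(GAMES_PER_SEASON))
    for _ in range(SEASONS)
)

# 12-4 is the most likely record, but 8-8 and even 16-0 seasons
# show up with no change whatsoever in underlying skill.
for wins in sorted(records, reverse=True):
    print(f"{wins}-{GAMES_PER_SEASON - wins}: {records[wins]} seasons")
```

Same exact team every year; only the flips differ.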

Your wisemen don't know how it feels to be thick as a brick...