Ok – so this isn’t a sports blog, but I think this crosses over sports/nerd, so here goes. Big news in the NFL this week is the Patriots being accused of cheating by having balls inflated approximately 2 PSI below the league minimum, which many people feel would be an advantage to the offense in poor weather conditions. A common response from Pats fans is that the temperature difference between the locker room and the field would explain the measured difference. But since this is ‘merica, and no one can do the maths, nowhere have I seen an actual estimate of the size of this effect. So I busted out my old friend PV=nRT. OK, so here’s the setup:

P1 = 11.5 PSI (chosen as the minimum – most favorable Pats assumption)

T1 = 70 Deg F

T2 = 30 Deg F

P2 = This shit ain’t known. The whole purpose of this exercise is to find this. Get with the program.

That’s right, I’m using imperial units. Deal with it, bitches.

Locker room:

P1V=nRT1

Field:

P2V=nRT2

A little algebra gives us:

P2=P1(T2/T1) since everything else is constant.

But… we have to work in metric, because all this imperial shit doesn’t make sense, and – more importantly – we need to use absolute temperature, so Kelvin. (Strictly speaking, the pressure units cancel out in the ratio, so the temperature conversion is the part that actually matters.)

Which means:

P1 = 0.0793 MPa

T1 = 294.26 K

T2 = 272.04 K

So P2 = 0.0733 MPa = 10.63 PSI
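The arithmetic above fits in a few lines of Python. This is just a sketch of the post’s calculation as written – note it plugs the quoted PSI figures straight into the gas-law ratio, even though PV=nRT strictly wants absolute pressure (gauge + ~14.7 PSI of atmosphere), which is a common back-of-envelope shortcut:

```python
# Ideal-gas estimate of the football pressure drop from locker room to field.
# NOTE: treats the quoted (gauge) PSI as if it were absolute, as the post does;
# strictly, PV=nRT requires absolute pressure (gauge + ~14.7 PSI atmospheric).

def f_to_kelvin(temp_f):
    """Fahrenheit -> Kelvin (absolute temperature, required by PV=nRT)."""
    return (temp_f - 32.0) * 5.0 / 9.0 + 273.15

PSI_TO_MPA = 0.00689476  # 1 PSI = 6894.76 Pa

p1_psi = 11.5            # starting pressure, the most favorable assumption
t1 = f_to_kelvin(70)     # locker room, ~294.26 K
t2 = f_to_kelvin(30)     # field, ~272.04 K

# Volume and moles of air are constant, so P2 = P1 * (T2 / T1)
p2_psi = p1_psi * (t2 / t1)

print(f"P1 = {p1_psi * PSI_TO_MPA:.4f} MPa, P2 = {p2_psi * PSI_TO_MPA:.4f} MPa")
print(f"P2 = {p2_psi:.2f} PSI  ->  drop of {p1_psi - p2_psi:.2f} PSI")
```

Running it reproduces the numbers above: P2 comes out around 10.63 PSI, a drop of just under 1 PSI.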

So, I’ll give the Pats about 1 PSI of difference, but 2? Running the same math backwards, dropping from 11.5 to 9.5 PSI by kickoff would require the balls to start out around 133 Deg F – a temperature swing of over 100 degrees. So unless these balls were stored in a sauna, I’m calling bullshit on this one.
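That last claim can be sanity-checked by inverting the same relation: for a full 2 PSI drop ending at the 30 Deg F field temperature, solve T1 = T2 × (P1 / P2). A quick sketch, under the same assumptions as the calculation above:

```python
# How hot would the balls have to start to lose a full 2 PSI by kickoff?
# Same shortcut as above: quoted (gauge) PSI fed straight into the ratio.

def f_to_kelvin(temp_f):
    return (temp_f - 32.0) * 5.0 / 9.0 + 273.15

def kelvin_to_f(temp_k):
    return (temp_k - 273.15) * 9.0 / 5.0 + 32.0

p1_psi, p2_psi = 11.5, 9.5    # a full 2 PSI drop
t2 = f_to_kelvin(30)          # field temperature, ~272.04 K

# Invert P2 = P1 * (T2 / T1):  T1 = T2 * (P1 / P2)
t1_required = t2 * (p1_psi / p2_psi)
print(f"Required starting temp: {kelvin_to_f(t1_required):.0f} Deg F")
```

This puts the required starting temperature north of 130 Deg F – sauna territory.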