The Tension Index, first launched 4 years ago on this site, has now been updated with a more solid mathematical underpinning. The purpose of the Tension Index was to identify games of "maximum uncertainty" in which the outcome remains in doubt for as long as possible. It was inspired by Game 7 of the 2016 NBA Finals, in which the score differential never exceeded 7 points and neither team's win probability exceeded 73% prior to Kyrie Irving's step-back three with 53 seconds left.

The original Tension Index formula was a kludgey attempt at quantifying how far a game's win probability strayed from 0.50, the idea being that games are more "tense" the more uncertain the outcome is. Here is the original spit-and-bubblegum formula:

Tension Index = 1 − 2 × average(|win probability − 0.50|)
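In code, the original formula is a one-liner. This is a minimal sketch using a hypothetical list of per-play win probabilities (the function name and sample numbers are mine, for illustration):

```python
from statistics import mean

def original_tension_index(win_probs):
    """Original Tension Index: 1 minus twice the average distance
    of the win probability from 0.50."""
    return 1 - 2 * mean(abs(p - 0.5) for p in win_probs)

# A game that hovers near 50/50 scores close to 1:
print(original_tension_index([0.48, 0.52, 0.50, 0.55, 0.45]))  # ≈ 0.944
```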

As it turns out, there is a much better way to quantify uncertainty, and it has its roots in information theory, a discipline Claude Shannon practically invented with his 1948 paper "A Mathematical Theory of Communication."

Shannon quantified the concept of information by connecting it to probability. The lower the probability of the outcome a message conveys, the higher the information content of that message. In other words, surprising messages convey more information.

If you turned off a basketball game with two minutes left and the home team up by 12 (those were the days), being told later that the home team won the game conveys very little actual information because the probability of that outcome was 0.999. Conversely, if you were told the home team somehow lost that game, that would be a high information content message.

More formally, information is defined as:

information = −log2(p), where p is the probability of the event that occurred

For binary outcomes (like basketball games), a log base of two is the natural choice: it makes the unit of information the familiar bit.
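The two messages from the example above can be checked directly. A minimal sketch (the function name is mine):

```python
import math

def information_content(p):
    """Shannon information, in bits, of an event with probability p."""
    return -math.log2(p)

# The near-certain outcome carries almost no information...
print(information_content(0.999))  # ≈ 0.0014 bits
# ...while the upset is a high-information message:
print(information_content(0.001))  # ≈ 9.97 bits
```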

Taking this one step further, we can calculate what is known as information entropy: the expected information content of a message. Knowing the outcome of a game in which the home team was up by 12 with 2 minutes left has an information entropy of:

-(0.999) x log2(0.999) - (0.001) x log2(0.001) = 0.011

But what if the home team was up by just two points with two minutes left? Knowing the outcome of that game would have a much higher expected information content:

-(0.678) x log2(0.678) - (0.322) x log2(0.322) = 0.907

Here is what information entropy looks like as a function of probability:

Maximum expected information content comes from a binary event with 50/50 probability (e.g. the proverbial coin flip).

This is exactly what the Tension Index was attempting to quantify, but more clumsily. Replace the smooth curve above with a pointy hat that peaks at 0.5, and you have the original Tension Index formula.

The Tension Index is now calculated by taking the win probability at each play of the game, calculating the information entropy at that point, and then taking the overall average of those numbers.
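The new calculation can be sketched in a few lines. This assumes the game's win probabilities are available as a simple list, one value per play (the function names and sample lists are hypothetical):

```python
import math

def binary_entropy(p):
    """Expected information, in bits, of a binary outcome with probability p."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def tension_index(win_probs):
    """New Tension Index: the average information entropy of the
    win probability across every play of the game."""
    return sum(binary_entropy(p) for p in win_probs) / len(win_probs)

# A game that stays near 50/50 throughout out-tenses a blowout:
close_game = [0.50, 0.52, 0.48, 0.55, 0.45]
blowout    = [0.50, 0.70, 0.85, 0.95, 0.999]
print(tension_index(close_game) > tension_index(blowout))  # True
```

Unlike the original "pointy hat," this version rewards probabilities near 0.50 along the smooth entropy curve rather than a linear one.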

With the new calculation, the overall ranking of high tension games didn't change much. You can use this site's Top Games Finder to find the highest tension games of the play-by-play era.

The game with the highest Tension Index of all time was a January 2007 contest between the Timberwolves and the Clippers:

If we restrict to just playoff games, the highest tension playoff game of the past 23 seasons was Game 1 of the 1997 NBA Finals between the Jazz and the Bulls. This was the infamous "Mailman doesn't deliver on Sundays" game in which Karl Malone missed two free throws with 11 seconds left with the score tied. This set up Michael Jordan's game winning jumper from the left wing over Bryon Russell. No, not that one, the other one.