Betting on Logic: Understanding Expected Value in Everyday Life
From lotteries to flight plans, how simple math can help you make smarter, risk-savvy decisions.
*This is Part 3 in a series on How Not To Be Wrong by Jordan Ellenberg*
"Expected value is a great way to figure out the right price of an object whose true value isn't certain."
The name expected value doesn’t quite capture the full meaning of the mathematical concept. It describes the long-run average outcome you’d see given enough samples, not the result to expect on any single try. The concept really clicks when viewed through the lens of gambling or lottery tickets. The formula is simple: multiply each possible outcome’s value by its probability, then add up the results.
Let’s walk through a quick example. Say each turn in a game has these odds:
20% chance of winning $10
50% chance of winning $5
30% chance of losing $3
The total expected value (EV) is the sum of all those possible outcomes:
(0.2 × $10) + (0.5 × $5) + (0.3 × -$3) = $3.60
Notice that $3.60 isn’t one of the actual outcomes—it's the average amount you'd expect to win per play over time. So if each play costs $3, would you play?
Of course you would. On average you'd take in $3.60 per play against the $3 cost, so you'd come out roughly 60 cents ahead per turn, as long as you stick with it for enough plays. If this sounds too good to be true in the real world, think again. Essentially that scenario played out with a lottery game called Cash WinFall, run by the state lotteries of Massachusetts and Michigan (sadly now retired).
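If you'd rather check that claim than take it on faith, here's a minimal Python sketch (mine, not the book's) of the made-up game above. It computes the exact expected value from the probability and payout pairs, then simulates plays and tracks how the average winnings per play drift toward $3.60:

```python
import random

# The toy game from above: (probability, payout) pairs.
outcomes = [(0.20, 10.0), (0.50, 5.0), (0.30, -3.0)]

# Exact expected value: sum of probability x payout over every outcome.
ev = sum(p * v for p, v in outcomes)
print(f"Expected value per play: ${ev:.2f}")   # $3.60

def play_once():
    """Sample one play of the game according to the probabilities above."""
    r = random.random()
    cumulative = 0.0
    for p, v in outcomes:
        cumulative += p
        if r < cumulative:
            return v
    return outcomes[-1][1]

# The more plays, the closer the observed average sits to the expected value.
for n in (100, 10_000, 1_000_000):
    average = sum(play_once() for _ in range(n)) / n
    print(f"Average over {n:>9,} plays: ${average:.2f}")
```

At 100 plays the average still bounces around; by a million plays it sits almost exactly on $3.60, which is all that "expected value" actually promises.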
When the jackpot reached a certain threshold, the prize money would "roll down" to smaller matches—like matching 3 or 4 numbers instead of all 6—making those tickets worth more than their price. In some cases, the expected value of a $2 ticket would rise dramatically. Savvy groups caught on, pooling resources to buy hundreds of thousands of tickets. By understanding expected value, they turned the game into an investment—earning up to a 15% return on their money.
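To see how a roll-down flips the math, here's a hedged sketch. The match probabilities below follow from the Massachusetts game's pick-6-of-46 format; the roll-down prize amounts, though, are illustrative placeholders rather than the actual WinFall payout table, and the jackpot and smallest prizes are ignored:

```python
from math import comb

# Cash WinFall drew 6 numbers out of 46, so the chance of matching exactly
# k of your 6 numbers is a counting exercise.
def p_match(k, picks=6, pool=46):
    return comb(picks, k) * comb(pool - picks, picks - k) / comb(pool, picks)

# Placeholder roll-down prize table (NOT the real WinFall amounts): on
# roll-down weeks the lower-tier prizes were inflated because the jackpot
# money was redistributed to them.
rolldown_prizes = {5: 25_000, 4: 800, 3: 25}

ticket_price = 2.00
ev = sum(p_match(k) * prize for k, prize in rolldown_prizes.items())
print(f"Expected winnings on one ticket: ${ev:.2f}")
print(f"Expected return on a ${ticket_price:.2f} ticket: {ev / ticket_price - 1:+.0%}")
```

With a prize table anywhere in that ballpark, a $2 ticket is worth more than $2 in expectation, which is precisely the window the betting groups exploited.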
"The doctrine of expected utility is appealing, straightforward, and simple: presented with a set of choices, pick the one with the highest expected utility."
It may be the closest thing we have to a mathematical theory of individual decision-making.
When we’re being rational (a big if), we should base our decisions on utility. In this context, utility means value, often measured in time or money. For example, you might value an hour of your life at 1 util (a unit of measure for utility). The best decision, then, is the one that gains the most utils, or wastes the fewest.
Let’s say you’re trying to decide how early to arrive at the airport. The question becomes: how much time should you trade for a lower risk of missing your flight?
We can use expected value here too, with a bit of estimation:
Arrive 2 hours early, miss 2% of flights
Arrive 1.5 hours early, miss 5% of flights
Arrive 1 hour early, miss 15% of flights
Missing a flight costs you an extra 6 hours, plus the hassle of rebooking. Let's calculate the expected cost (in time lost) for each option:
Option 1: -2 + 0.02 × (-6) = -2.12 utils
Option 2: -1.5 + 0.05 × (-6) = -1.8 utils
Option 3: -1 + 0.15 × (-6) = -1.9 utils
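Here is the same arithmetic as a short Python sketch; the arrival times and miss rates are the rough estimates from the list above, not data:

```python
# Rough estimates: hours spent waiting at the airport, and the chance of
# missing the flight at that arrival time.
options = {
    "2.0 hours early": (2.0, 0.02),
    "1.5 hours early": (1.5, 0.05),
    "1.0 hour early":  (1.0, 0.15),
}
MISS_COST = 6.0  # extra hours lost (plus hassle) if you miss the flight

for label, (hours_early, p_miss) in options.items():
    expected_utils = -hours_early + p_miss * -MISS_COST
    print(f"{label}: {expected_utils:+.2f} utils")
```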
Based on expected value, arriving 1.5 hours early minimizes lost time. But keep in mind—context matters. If you fly once a year, you may never miss a flight. But if you’re trying to catch a critical connection or international flight, even a 5% risk might be unacceptable.
This is where variance and confidence intervals come into play. Expected value gives us a single number, but the spread of possible outcomes matters too. Arriving 1.5 hours early may minimize the expected time lost, yet showing up only 1 hour early exposes you to far more of those costly 6-hour misses. Expected value is just one piece of the puzzle; understanding the variance and probability spread helps you assess the full risk picture. It's not just about the average, it's about how sure you are that reality will land near that average.
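To put a rough number on that spread, extend the sketch above. Each option has only two outcomes per trip (make the flight or miss it), so the standard deviation of the time lost is just the 6-hour penalty scaled by how uncertain the miss is; the inputs are the same assumptions as before:

```python
from math import sqrt

MISS_COST = 6.0  # hours lost to a missed flight, as above
options = {
    "2.0 hours early": (2.0, 0.02),
    "1.5 hours early": (1.5, 0.05),
    "1.0 hour early":  (1.0, 0.15),
}

for label, (hours_early, p_miss) in options.items():
    mean_utils = -hours_early - p_miss * MISS_COST
    # Two-outcome distribution: -hours_early with prob 1 - p_miss,
    # or -hours_early - MISS_COST with prob p_miss.
    spread = MISS_COST * sqrt(p_miss * (1 - p_miss))
    print(f"{label}: expected {mean_utils:+.2f} utils, spread (std) {spread:.2f} utils")
```

The expected losses sit close together, but the spread grows quickly as you cut it closer: leaving later doesn't just cost a little more on average, it makes the painful outcome meaningfully more likely.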
So while math helps guide decisions, the real world requires context. If you're wealthy, a missed flight is an annoyance. If you're stretched thin financially, it could be a disaster. Risk tolerance isn't just math—it's personal.
Let’s return to lotteries to explore risk and variance more deeply. Variance refers to the range of different outcomes you could experience—even if the average (expected value) is the same. Two lottery groups that made millions off Cash WinFall used different ticket strategies. One picked numbers randomly using a computer, while the other carefully selected tickets by hand.
It turns out that choosing numbers randomly isn't optimal. Why? Because of overlap.
When selecting tickets at random, you increase the chance that multiple tickets carry the same or similar number combinations, creating redundant risk. But if you handpick the combinations so that no two tickets share 5 of their 6 numbers, you reduce the risk of many tickets failing together on the same draw. That lowers variance and raises your odds of at least some success, without changing the expected value.
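Here's a toy illustration of that idea (a miniature lottery of my own, not WinFall's real structure): the house draws 3 numbers from 1 to 7, with made-up prizes for matching 2 or 3. The handpicked tickets are the seven lines of the Fano plane, a design in which any two tickets share exactly one number and every pair of numbers appears on exactly one ticket, so no draw leaves the whole batch empty-handed. Any seven distinct tickets have the same expected winnings; the difference shows up entirely in the variance:

```python
import random
from itertools import combinations
from statistics import mean, pstdev

# Hypothetical prizes for the toy lottery: match all 3 numbers, or exactly 2.
JACKPOT, SMALL = 100, 10

# Handpicked tickets: the 7 lines of the Fano plane on the points 1..7.
fano = [{1, 2, 3}, {1, 4, 5}, {1, 6, 7}, {2, 4, 6}, {2, 5, 7}, {3, 4, 7}, {3, 5, 6}]

# Comparison set: 7 distinct tickets chosen at random from all 35 possibilities.
all_tickets = [set(c) for c in combinations(range(1, 8), 3)]
random_set = random.sample(all_tickets, 7)

def winnings(tickets, draw):
    """Total prize money a ticket set collects on one draw."""
    total = 0
    for t in tickets:
        hits = len(t & draw)
        total += JACKPOT if hits == 3 else SMALL if hits == 2 else 0
    return total

def summarize(name, tickets):
    # Enumerate all 35 equally likely draws instead of sampling.
    payouts = [winnings(tickets, set(d)) for d in combinations(range(1, 8), 3)]
    print(f"{name:>10}: mean ${mean(payouts):.2f}, "
          f"std ${pstdev(payouts):.2f}, worst draw ${min(payouts)}")

summarize("Fano plane", fano)
summarize("random", random_set)
```

Run it a few times: both sets average the same amount per draw, but the random set's standard deviation is noticeably larger and its worst draw pays far less, sometimes nothing at all.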
So it’s not just about buying hundreds of thousands of tickets—it’s about buying the right ones.
"In order to play the lottery without risk, it's not enough to play hundreds of thousands of tickets; you have to play the right hundreds of thousands of tickets."
Lotteries are often called "sucker bets"—and yes, they usually are. From a pure expected value perspective, the average payout doesn’t justify the ticket cost.
But that misses the point.
People aren’t irrational for playing the lottery. In fact, they're just being rational in a different way. A $6 weekly loss over 30 years probably won’t bankrupt anyone. But a jackpot win? That would change your life. And that slim chance—however unlikely—is worth the price of admission for many. Add in the fact that lottery money often goes to public goods like libraries and parks, and it’s not a terrible trade.
You’re buying hope, not just a chance. And sometimes, that’s worth every penny.