Generating Intuitions for Exponential Growth

You’ve probably heard of the Rule of 70:

To estimate the doubling time of an exponential function, just divide 70 by the growth rate.

For some rates, this works really well. At 2% annual growth, the rule gives 35 years, and the actual value is 35.003 years. Other times it fails horribly. At 70% growth, the rule predicts doubling in one timestep, but it actually takes 1.3.

How does the heuristic perform in general? Not that well. It’s accurate at 2% growth, but then quickly converges to being off by 0.3 timesteps.

An alternative, the Rule of 72, performs a bit better, converging to being off by 0.28, which is still not great:

So why was the rule ever popular? An early version is attributed to the 15th-century Italian accountant Luca Pacioli, coincidentally the same man who failed to teach Leonardo da Vinci math. In Summa de arithmetica, he writes:

In wanting to know of any capital, at a given yearly percentage, in how many years it will double adding the interest to the capital, keep as a rule [the number] 72 in mind, which you will always divide by the interest, and what results, in that many years it will be doubled. Example: When the interest is 6 percent per year, I say that one divides 72 by 6; 12 results, and in 12 years the capital will be doubled.

For 6 percent, the error is only 0.1, which is not too bad. In general, it’s helpful to think of the Rule of 72 as a heuristic that works decently within a certain range of values.
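These error figures are easy to verify. A quick sketch, comparing both rules against the exact doubling time log(2)/log(1 + r):

```python
import math

def doubling_time(rate_pct):
    """Exact doubling time for a growth rate given in percent."""
    return math.log(2) / math.log(1 + rate_pct / 100)

for rate in [2, 6, 70]:
    exact = doubling_time(rate)
    print(f"{rate:>3}%  exact={exact:7.3f}  "
          f"rule70 err={70 / rate - exact:+.3f}  "
          f"rule72 err={72 / rate - exact:+.3f}")
```

At 6%, the Rule of 72 is off by about 0.1 timesteps; by 70% growth, the Rules of 70 and 72 undershoot by about 0.31 and 0.28 respectively.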

That might sound like a crippling limitation, but for typical investments it’s actually a decent range. Hedge funds average around 7.5%, and the S&P has returned an annualized 9.81% since 1994. Unless you’ve invested your money with Byrne Hobart, your returns are likely in this modest range.

Luca Pacioli, a good mathematician, bad teacher, and responsible investor who failed to foresee the rise of meme stocks.

I’m giving heuristics a bad name. Truth be told, they’re far, far better than your intuition.

From Stango and Zinman’s publication on Exponential Growth Bias and Household Finance:

Exponential growth bias is the pervasive tendency to linearize exponential functions when assessing them intuitively.

This has real implications:

exponential growth bias can explain two stylized facts in household finance: the tendency to underestimate an interest rate given other loan terms, and the tendency to underestimate a future value given other investment terms. Bias matters empirically: More-biased households borrow more, save less, favor shorter maturities, and use and benefit more from financial advice, conditional on a rich set of household characteristics.

It’s not that being off by a small timestep will ruin you. It’s that the errors compound, such that the longer your time horizon, the more horrendously skewed your linearized intuition gets:
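To see how badly linearization compounds, take a hypothetical loan of $1,000 at 20% annual interest (made-up numbers): linear intuition adds $200 every year, while the true balance multiplies.

```python
principal = 1_000.0   # hypothetical loan
rate = 0.20           # 20% annual interest
years = 20

# Linearized intuition: add 20% of the original principal each year
linear = principal * (1 + rate * years)

# True compound growth: multiply the balance by 1.2 each year
actual = principal * (1 + rate) ** years

print(linear, round(actual, 2))
```

After 20 years, linear intuition predicts $5,000; the actual balance is about $38,338, off by a factor of more than seven.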

If you work in startups, finance or anything adjacent, this might be your time to feel smug. Middle America makes poor financial decisions, but surely your experiences have improved your savvy? A paper on Misperception of exponential growth would tend to disagree:

This group of professional decision makers did not show less underestimation than naive subjects… Underestimation appears to be a general effect which is not reduced by daily experience with growing processes.

Equivalent papers are likely being written as we speak on the inability of both lay and expert analysts to properly evaluate and intuit viral growth during the COVID-19 crisis. [1] Already, we have the London School of Economics on the public’s inability to understand log scales. [2]

Being able to wrap our minds around growth matters a lot, and will matter increasingly in the wacky world of pandemics, tech startups and bitcoin. Exponential growth rules everything around me, and none of us can make sense of it. Sam Altman was right: “Everyone’s intuition for exponential growth sucks, so do the math.”

Still, I can’t break out a spreadsheet every time I want to think about a decision. It’s useful to have tools for thought we can actually fit in our heads. The charts above suggest a slight correction to the Rule of 70:

To estimate the doubling time of growth greater than 12%, divide 70 by the growth rate. Then add 0.3.

This gives us a heuristic that works decently well, never exceeding an error of 0.1:
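We can check that claim numerically by sweeping integer growth rates from 12% up to 1000% (the bound matters, as we’ll see in a moment):

```python
import math

def doubling_time(rate_pct):
    """Exact doubling time for a growth rate given in percent."""
    return math.log(2) / math.log(1 + rate_pct / 100)

def corrected_rule(rate_pct):
    """Rule of 70 plus the 0.3 correction, for rates above 12%."""
    return 70 / rate_pct + 0.3

worst = max(abs(corrected_rule(r) - doubling_time(r)) for r in range(12, 1001))
print(round(worst, 3))
```

Across 12% to 1000%, the worst error is about 0.08 timesteps, comfortably under 0.1.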

But how does the error compound? It might not sound like a big deal to be off by a tenth of a timestep. But again, in the world of exponentials, that can be a big deal. Off by one timestep could mean off by an entire doubling, easily the difference between riches and ruin.

This chart also makes it clear that the heuristic is not quite as consistent as previously suspected. Remember when I said it converges? That was a lie. Here’s a more cosmopolitan view of error for really big growth rates:

It does actually start to level off at the end, so if you have to deal with growth rates over 1000%, there’s another correction you could make.

This might all sound silly, but in a post-Covid world, it shouldn’t. Daily cases in the UK grew 400% this last month. Depending on your timescale, you might think of that as about 46% weekly growth, or as 24,000,000,000% annual growth. [3] [4]
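The timescale conversions are just exponent arithmetic; a quick check, treating a month as 30 days:

```python
monthly = 5.0  # "grew 400%" means a 5x multiplier

weekly = monthly ** (7 / 30)  # ~1.456, i.e. ~46% weekly growth
annual = monthly ** 12        # 244,140,625x year-over-year

print(f"weekly: {weekly - 1:.1%}, annual: {annual:,.0f}x")
```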

So far we’ve been looking solely from the perspective “given a growth rate, what’s the doubling time?” But these problems can come from a few different angles:

  1. Given trends with different initial values and growth rates, how long until one overtakes the other?
  2. If a growth rate increases, how much does doubling time shorten?
  3. Given a doubling time, what’s the 1000x time?

#3 is trivial (just multiply by ten, since 2^10 ≈ 1000), #2 just requires using a heuristic twice and comparing, but #1 is really hard to do in your head. I shared some weird examples of this in The Unreasonable Effectiveness of Starting Over. Honestly, the best “heuristic” is to have Wolfram Alpha at the ready, and input α0^x / α1^x = Δ, where Δ is the ratio of initial values, α0 and α1 are the growth multipliers (e.g. 1.1 for 10% growth), and x is the number of timesteps until they cross over.
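The same crossover equation can also be solved in a few lines. A sketch with made-up numbers, where trend B starts 100x smaller but grows faster:

```python
import math

# Hypothetical: trend A starts at 100, growing 10% per timestep;
# trend B starts at 1, growing 50% per timestep.
a0, alpha_a = 100.0, 1.10
b0, alpha_b = 1.0, 1.50

# Solve a0 * alpha_a^x = b0 * alpha_b^x  =>  (alpha_b / alpha_a)^x = a0 / b0
x = math.log(a0 / b0) / math.log(alpha_b / alpha_a)
print(round(x, 1))  # timesteps until B overtakes A
```

Despite the 100x head start, B catches A in about 15 timesteps, which is exactly the kind of answer linearized intuition gets wrong.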

There are also cases where you’re trying to estimate growth, in conjunction with other factors. To write Golden Handcuffs, I had to model a case where:

  • A software engineer makes $X/year
  • Generates savings after taxes and cost of living
  • Invests that capital at some growth rate
  • Does so continuously over several years

In this case, you’re not just considering the growth rate of some lump sum. You’re considering the growth rate of a growing pool of capital, modulated by a dynamic tax rate. If you want to play with this yourself, I have a model for State + Federal + FICA taxes here.
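A minimal sketch of that model, with made-up numbers for salary, effective tax rate, and cost of living (the linked model handles the actual tax brackets):

```python
def savings_after_years(salary, tax_rate, cost_of_living, growth, years):
    """Compound a growing pool: invest each year's savings at the given return."""
    balance = 0.0
    annual_savings = salary * (1 - tax_rate) - cost_of_living
    for _ in range(years):
        balance = balance * (1 + growth) + annual_savings
    return balance

# Hypothetical inputs: $150k salary, 30% effective tax, $40k living costs, 7% returns
print(round(savings_after_years(150_000, 0.30, 40_000, 0.07, 10)))
```

Ten years of $65k annual savings would be $650k linearly, but comes to roughly $900k once each year’s savings start compounding.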

This is a key consideration for the Financial Independence, Retire Early (FIRE) community. Over at Scattered Thoughts, Jamie Brandon has a great visualization of the non-linearities in saving:

The point being, when you grow your savings from 0.2M to 0.4M, your runway doesn’t just double, it skyrockets! At 30k in annual spending, a 0.2M nest egg will only get you 8 years, but 0.4M extends your runway to 57 years, and at 0.41M you’re financially self-sustaining indefinitely!
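A sketch of the runway calculation, assuming $30k annual spending and a hypothetical 5% real return (Jamie’s model uses its own assumptions, so the exact figures will differ):

```python
def runway_years(nest_egg, spending=30_000, growth=0.05, cap=100):
    """Years until the money runs out (withdraw, then grow); cap = effectively forever."""
    years = 0
    while nest_egg >= spending and years < cap:
        nest_egg = (nest_egg - spending) * (1 + growth)
        years += 1
    return years

for egg in [200_000, 400_000, 600_000, 700_000]:
    print(f"{egg:,}: {runway_years(egg)} years")
```

Under these assumptions the tipping point sits at $630k (spending × 1.05 / 0.05); the higher returns the post’s figures imply would pull it down to around $410k. Either way, doubling the nest egg far more than doubles the runway, because the pool’s own returns cover a growing share of spending.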

One upshot of this discussion is that while intuitions are okay, heuristics are better, and math is unreasonably effective, what we really need are tools to rapidly model weirder and more complex scenarios.

Or maybe one day we’ll just develop better notation. As Alexey Guzey quotes Bret Victor:

…back in the days of Roman numerals, basic multiplication was considered this incredibly technical concept that only official mathematicians could handle … But then once Arabic numerals came around, you could actually do arithmetic on paper, and we found that 7-year-olds can understand multiplication. It’s not that multiplication itself was difficult. It was just that the representation of numbers — the interface — was wrong.

Can we run this process in any kind of systematic fashion? Michael Nielsen has an extraordinary example, alongside a longer exploration with Andy Matuschak.

But that’s a story for a different post.

See Also
Neil Hacker – Compounding


[1] The Stango/Zinman paper was written in 2009, just after the financial collapse. Similarly, the Misperception paper came out in 1975, catching the tail end of the 1970s recession and oil crisis, when unemployment soared to 9%, then considered a historic level.

[2] Full paper available here, and a response from Andrew Gelman: Let them log scale.

[3] I wasn’t expecting a number that big either, but I guess that’s the whole point of the post. Sanity check: 5x M/M growth = 5^12 Y/Y = 244,140,625x = 24,414,062,400% growth.

[4] Of course, eventually Covid runs out of people to infect.

If you want to play around with heuristics and growth rates, you can clone the model here.