Have Algorithms Improved?

Software gets better (and worse) over time, and computers get faster every year. But underlying these changes are algorithms, which are improved or replaced over time, too.  Generally speaking, we don’t bother rewriting code unless the new algorithm is better than the old.  But new or improved algorithms are pretty rare.

This year, Yash Sherry and Neil C. Thompson investigated the interesting question, “How Fast Do Algorithms Improve?” [1].

They examined research papers and textbooks to assemble a history of “families” of algorithms that solve the same problem, over the past 80 years or so.  The textbooks were used to decide which algorithms were “consequential” enough to consider.  The 57 textbooks point to some 113 families of algorithms, which is quite a few.  (Many algorithms have very specific applications, so most of us do not actually work with anywhere near that many, certainly not at one time.)

Already this historical timeline is kind of interesting.  After all, this is the bedrock, the basic value, the essence of Computer Science.  In our history we have created 113 big ideas.  (OK, we’ve created millions of little ideas along the way. :-) )

In this dataset, half of all the algorithm families date from the first 20 years of computing, and almost 90% were invented or discovered by the 1970s.  This confirms my own intuition that very little that was really new was invented during my own career.

During this period, algorithms were improved (i.e., new, better members were added to a family). The study found that about 20% or a little more of the families were improved each decade through the 1990s, with the rate dropping substantially in the twenty-first century.

But how much was the “improvement”?

The researchers identify a handful of algorithms that experienced massive improvement, more than 1,000 times, and 40% of families experienced speedups “faster than Moore’s Law”.  Notably, large improvements were more common for larger input sizes.  This corresponds to my own intuition that small problem sizes have improved via Moore’s Law and larger ones via algorithm improvements.
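Why would larger inputs see bigger gains from algorithmic improvement? Because an asymptotic improvement yields a speedup that itself grows with input size. A minimal sketch (a hypothetical illustration, not from the paper) comparing an O(n²) algorithm to an O(n log n) replacement:

```python
import math

def speedup(n):
    """Ratio of step counts for a hypothetical O(n^2) algorithm
    versus an O(n log n) replacement: n^2 / (n * log2(n))."""
    return (n * n) / (n * math.log2(n))

# The speedup from the *same* algorithmic improvement grows with n:
for n in [100, 10_000, 1_000_000]:
    print(f"n = {n:>9,}: ~{speedup(n):,.0f}x fewer steps")
```

At n = 100 the better algorithm saves a factor of about 15; at n = 1,000,000 it saves a factor of about 50,000. A fixed hardware speedup, by contrast, helps every input size equally, which is why algorithmic progress dominates at scale.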

Overall, the “median” algorithm family improved substantially, but less than Moore’s Law for moderate size problems, and more than Moore’s Law for large problems. (Note, though, that Moore’s Law has been known to completely conquer smaller problems.  When you have a big enough memory, you can precompute all possible answers, store them in a table, and just read out the answer.)
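The “precompute all possible answers” trick can be sketched concretely. This is a hypothetical example of my own (not from the paper), using 16-bit popcount, a problem small enough that a full answer table fits trivially in memory:

```python
# With enough memory, precompute every answer once and replace the
# algorithm with a table lookup: O(1) per query, regardless of how
# clever (or not) the original algorithm was.
POPCOUNT = [bin(i).count("1") for i in range(1 << 16)]  # 65,536 entries

def popcount16(x):
    """Number of set bits in a 16-bit value, by table lookup."""
    return POPCOUNT[x & 0xFFFF]

print(popcount16(0b1011))  # 3
```

For small enough input domains, the table makes algorithmic improvement moot, which is exactly the sense in which Moore’s Law “conquers” small problems.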

“We find enormous heterogeneity in algorithmic progress, with nearly half of algorithm families experiencing virtually no progress, while 14% experienced improvements orders of magnitude larger than hardware improvement”

([1], p. 8)

This is a really neat study. 

Of course, the algorithm families are only a subset of the algorithms and algorithm-like processes implemented in software.  This study limited itself to “exact algorithms with exact solutions” (p. 8).  There are slews of algorithms that generate approximate or imprecise results, as well as slews of important methods that are not generally considered formal algorithms (e.g., the organization of large bodies of code and data).

I’ll also note that contemporary architectures are vastly more complicated than a theoretical Turing machine, with many bottlenecks that have nothing to do with the number of steps.  For example, a large system may contain thousands or millions of functional units, with a variety of capabilities, connected by a complex hierarchy of networks.  The kind of improvements discussed in this paper may be utterly irrelevant to performance on a real-world machine.  Many performance improvements have come about by cleverly mapping algorithms—possibly more than one algorithm—to specific hardware configurations.

What I am driving at here is that this history of improvement is more of a conservative baseline estimate than a complete characterization of how important algorithms are for performance.  In fact, overall we probably do better than this study indicates, if only because most of what we do is not “exact”.


  1. Yash Sherry and Neil C. Thompson, “How Fast Do Algorithms Improve?” Proceedings of the IEEE, pp. 1–10, 2021. https://ieeexplore.ieee.org/document/9540991
