Wednesday, August 30, 2023

Premature Optimization: Universally Misunderstood

Milen Dzhumerov (Mastodon):

It has been commonly interpreted as “don’t think about performance in the beginning, you can fix any performance problem later”. This interpretation is completely and categorically wrong.

[…]

With the additional context, the quote takes on a significantly different meaning: it’s making a statement only about micro-optimisations (“small efficiencies”), not about performance in general.

Hoare was simply advising against micro-optimisations without finding the hotspots first: the “premature” part refers to lacking measurements.

[…]

As tech stack fundamentals, access patterns, data dependencies and data structures are baked into a design, it’s not possible to hotspot your way out of a slow architecture.
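
That last point is easy to demonstrate. Here is a minimal C sketch (illustrative, not from the linked post): two functions that do identical work, where the access pattern fixed by the design, not any single hot line, determines the speed.

    #include <stddef.h>

    #define N 4096

    /* Both functions perform the same N*N additions, so a profiler
       shows a single hot loop either way. The row-order walk streams
       through memory one cache line at a time; the column-order walk
       strides N floats between touches and can run several times
       slower. No tweak to the hot line fixes that, because the
       traversal order is part of the design. */
    float sum_rows(const float (*m)[N]) {
        float s = 0.0f;
        for (size_t i = 0; i < N; i++)
            for (size_t j = 0; j < N; j++)
                s += m[i][j];          /* sequential accesses */
        return s;
    }

    float sum_cols(const float (*m)[N]) {
        float s = 0.0f;
        for (size_t j = 0; j < N; j++)
            for (size_t i = 0; i < N; i++)
                s += m[i][j];          /* strided accesses */
        return s;
    }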


3 Comments

Given the level of performance and efficiency we see in most modern software and websites, it's obvious that nearly everyone should be thinking *a whole lot more* about efficiency and optimizations than they are.

The "premature optimization" is a quote from Knuth's 1974 paper "Structured Programming with go to Statements", where Knuth talks about efficiency, e.g. by rewriting machine code to use one operation less, before giving this warning:

"Programmers waste enormous amounts of time thinking about, or worrying about, the speed of noncritical parts of their programs, and these attempts at efficiency actually have a strong negative impact when debugging and maintenance are considered. We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil."

1974. Machine code. Low level. No frameworks or software stacks. No architecture or software design, just manual optimization by rewriting single lines of code. This is the level at which this quote should be understood.
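
For flavor, here is a hypothetical C rendering of the kind of single-instruction saving Knuth was writing about (illustrative, not taken from the paper):

    /* Straightforward version: compare i against n every iteration. */
    long sum_up(const long *a, long n) {
        long s = 0;
        for (long i = 0; i < n; i++)
            s += a[i];
        return s;
    }

    /* Hand-tweaked version: count down to zero so the loop test can
       reuse the flags set by the decrement, saving an instruction per
       iteration on many 1970s machines. Modern compilers make this
       kind of transformation automatically. */
    long sum_down(const long *a, long n) {
        long s = 0;
        while (n-- > 0)
            s += a[n];
        return s;
    }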

Old Unix Geek

Having programmed in machine code (the assembler was your brain; you typed in hex) on a 1 KB machine, I can say that optimization and hard thinking were essential to having a useful product. At least I had a keyboard and didn't have to use punch cards...

"Just manual optimization" was a true art, poorly understood these days. Some of us still can do it, but on modern CPUs it requires understanding the full CPU internal architecture and writing a simulator to ensure that one's code can flow as fast as possible.

It's rare to need that, even when programming new algorithms that deal with vast amounts of data. Algorithmic complexity matters more, as does reducing RAM traffic. And most youngsters just throw more threads at things rather than using SIMD.
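
For the curious, a minimal sketch of the SIMD route in C, using x86 AVX intrinsics (assumes an AVX-capable CPU and, for brevity, an array length that is a multiple of 8):

    #include <immintrin.h>   /* x86 AVX intrinsics; compile with -mavx */
    #include <stddef.h>

    /* Scalar baseline: one addition per element. */
    float sum_scalar(const float *a, size_t n) {
        float s = 0.0f;
        for (size_t i = 0; i < n; i++)
            s += a[i];
        return s;
    }

    /* SIMD version: eight additions per instruction using 256-bit
       registers. Note the reassociated summation order can change
       the floating-point result slightly. */
    float sum_avx(const float *a, size_t n) {
        __m256 acc = _mm256_setzero_ps();
        for (size_t i = 0; i < n; i += 8)
            acc = _mm256_add_ps(acc, _mm256_loadu_ps(a + i));
        float lane[8];
        _mm256_storeu_ps(lane, acc);
        return lane[0] + lane[1] + lane[2] + lane[3]
             + lane[4] + lane[5] + lane[6] + lane[7];
    }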

Few people program in a way that requires much thought these days. They mostly glue library calls together. The speed difference between early computers and modern ones hides most sins. Even in ML, few people actually work with CUDA or invent new algorithms.
