The quote is credited to Donald Knuth. Here is the full quote from his 1974 Turing Award lecture, "Computer Programming as an Art": "The real problem is that programmers have spent far too much time worrying about efficiency in the wrong places and at the wrong times; premature optimization is the root of all evil (or at least most of it) in programming."
Premature optimization is the act of trying to make things more efficient at a stage when it is too early to do so. For example, premature optimization could involve someone spending a lot of time and money picking out the best possible gear for a certain hobby, despite the fact that they haven’t actually tried out that hobby to make sure they enjoy it.
Premature optimization can often end up backfiring, and cause you to waste a lot of resources, such as time, money, and effort, while also increasing the likelihood that you will create future problems. Accordingly, understanding what premature optimization is and how to avoid it can be beneficial in many areas of life.
As such, in the following article you will learn more about this concept, and see some beneficial guidelines that will help you figure out when a certain optimization is needed, and when it is premature.
People make the mistake of trying to optimize things prematurely in many areas of life. This includes, for example:
As noted above, there are some situations where optimizing things early on might be necessary, and in those situations such optimizations are considered appropriate rather than premature. However, in most cases the optimizations described in these examples are premature, and it would be preferable to postpone them until a later stage.
The concept of premature optimization was first made prominent in the field of software engineering. It is attributed to Sir Tony Hoare, though it was popularized by Donald E. Knuth, who said that:
“There is no doubt that the holy grail of efficiency leads to abuse. Programmers waste enormous amounts of time thinking about, or worrying about, the speed of noncritical parts of their programs, and these attempts at efficiency actually have a strong negative impact when debugging and maintenance are considered. We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil.
Yet we should not pass up our opportunities in that critical 3%. A good programmer will not be lulled into complacency by such reasoning, he will be wise to look carefully at the critical code; but only after that code has been identified.”
— Structured Programming with go to Statements (1974)
This presents the argument against trying to make premature optimizations, while at the same time acknowledging that it’s nevertheless important to identify areas where optimizations can be necessary, and to then implement those optimizations.
There are several reasons why premature optimization is problematic:
There are various reasons why people optimize things prematurely:
Note: a related concept which has similar roots is called bikeshedding; this represents a phenomenon where people spend a disproportionate amount of resources dealing with relatively minor issues.
So far, we have seen what premature optimization is, why it's an issue, and why people are prone to it. Next, you will see what you can do in order to avoid optimizing things prematurely.
Essentially, when figuring out whether or not you should optimize something, there are several factors you should consider, and several important questions that you should ask yourself:
Based on these criteria, you can prioritize the different tasks that you have to complete, and figure out which ones you should work on at which stage, in order to ensure that you avoid making any premature optimizations.
However, note that you don't have to ask yourself all of these questions each time you evaluate a potential task. This is especially true if a certain task is relatively minor, since it might take less time and effort to simply complete a trivial 2-minute task than to evaluate it using all of these criteria.
Rather, the important thing is to be aware of these considerations, and use them, at least to some degree, to evaluate tasks when necessary. The larger a task appears to be, based on the resources that it will require or the effects that it will have, the warier you should be, and the more you should use these criteria to evaluate it.
It’s important to remember that avoiding premature optimization doesn’t mean that you should avoid optimization entirely. Rather, it simply means that you should think carefully before you decide to spend your resources optimizing something.
This is crucial, since people often repeat the idea that “premature optimization is the root of all evil”, without acknowledging the full quote, which states that “we should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%”.
This means that it can be entirely reasonable to assess a situation and decide that you should optimize something, even if it’s at a relatively early stage. This might happen for a variety of reasons, such as because you believe that a small modification could give you a significant benefit, or because the optimization will allow you to deal with a bottleneck in your work, or because avoiding the optimization might lead to significant technical debt later on.
In the original quote on the topic, this concept was said to apply to roughly 3% of cases, but your cutoff for what a valid optimization is can be higher or lower than that. For example, a common guiding principle is the 80/20 rule (also known as the Pareto Principle), which in this case suggests that 80% of the positive outcomes that you experience will come from 20% of the work that you do.
Overall, to make sure that you avoid optimizing things prematurely, you should always assess the situation first, and determine whether or not the intended optimization is necessary at that point in time. However, this approach shouldn’t become an excuse to avoid optimization entirely, but should rather serve as a way to prioritize tasks as effectively as possible.
One of the hardest parts of software development is knowing what to work on. Good developers are also expensive and in short supply. One of the biggest challenges is making sure we are making good use of our time. The last thing we want is to ship code that our users don’t like or that doesn’t work. How much time we should dedicate to performance tuning and optimization is always a balancing act.
If we don’t do any performance tuning or optimization, our new product launch could be a complete disaster.
Sep 14, 2016
Let's face it, the quote from the title has been used to advocate bad programming far too often. Sometimes it's complemented by the numbers from the full quote, as if they were a permanent truth, or at the very least a well-measured observation:
We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%.
Sometimes it's just backed by the authority of Donald Knuth, the original author of the quote. However, Knuth himself attributes the "evil quote" to Hoare, and Hoare attributes it back to Knuth, so it's more of a nobody's child.
The thing is, it was first printed 45 years ago, in 1974, in an article devoted to justifying the use of the "goto" statement in structured programming. In terms of black and white, it was pro-optimization, not contra. And if you read TAOCP, you know that Donald Knuth himself puts algorithms and the machine far before paradigm, be it structured, functional, or any other type of programming. He publishes code examples in MMIX (formerly MIX) machine code instead of high-level languages because that is what really matters. The algorithm and its machine implementation were the core of computer programming 45 years ago, and there was never a good reason for this to change.
However, nowadays it is common knowledge that you should build things to work and only then optimize them to work fast. To me it sounds like: "let's build a house of straw and beeswax, and call the fire brigade when it catches fire." That would be absurd in every engineering discipline apart from software engineering. We are here to anticipate and prevent problems, not to spawn them ignorantly because of 45 years of self-deception.
In 1974, optimization indeed often meant sacrificing code clarity for a few percent of performance improvement. But sometimes you manage to sacrifice less and gain more, which is good. In the very same article from which the "evil quote" is taken, Knuth also published actual results for one such optimization:
The improvement in speed from Example 2 to Example 2a is only about 12%, and many people would pronounce that insignificant. The conventional wisdom shared by many of today's software engineers calls for ignoring efficiency in the small; but I believe this is simply an overreaction to the abuses they see being practiced by penny-wise-and-pound-foolish programmers, who can't debug or maintain their "optimized" programs. In established engineering disciplines a 12% improvement, easily obtained, is never considered marginal; and I believe the same viewpoint should prevail in software engineering. Of course I wouldn't bother making such optimizations on a one-shot job, but when it's a question of preparing quality programs, I don't want to restrict myself to tools that deny me such efficiencies.
And this was in 1974. Today, precisely because of decades of performance negligence, there is so much low-hanging fruit that you don't really have to scramble your code to gain significant performance improvements. Here are some anecdotes from my practice as evidence.
Once we got a 10% speedup in one routine by changing a size_t loop counter. It happened to be the inner loop of a matrix transformation, and apparently the compiler we used at the time didn't unroll it solely because of the counter type. That's rare, but compilers still fail to optimize things every now and then, which is why it's a good idea to check their output with a disassembly tool, at least for the most intense parts of the system. It's not even that hard; in fact, anyone can read disassembly.
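The loop-counter anecdote can be sketched as follows. This is an illustrative reconstruction, not the original routine: the function names and the row-summation body are invented for the example. The two functions are functionally identical; only the counter type differs. Whether one gets unrolled or vectorized and the other doesn't is entirely compiler-specific, which is exactly why the generated assembly (e.g. g++ -O2 -S) is worth inspecting for hot inner loops.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical inner loop of a matrix transformation.
// Same computation in both variants; only the counter type differs.
// Depending on the compiler and version, one form may be unrolled
// or vectorized and the other not -- only the disassembly tells.
double sum_row_int(const std::vector<double>& row) {
    double acc = 0.0;
    for (int i = 0; i < static_cast<int>(row.size()); ++i)
        acc += row[i];
    return acc;
}

double sum_row_size_t(const std::vector<double>& row) {
    double acc = 0.0;
    for (std::size_t i = 0; i < row.size(); ++i)
        acc += row[i];
    return acc;
}
```

Both functions return the same value for the same input; the interesting comparison is between their generated assembly, not their results.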
The other time we got a 50% speedup by simply caching an operation that was considered cheap and hence not worth caching. Perhaps that was true some 10 or 20 years ago, but nowadays the difference between cache and RAM read latency is too drastic to ignore. Machines change, and we can only account for these changes by re-testing the facts we know, including things that were considered improbable in the past.
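The kind of caching described can be sketched with a simple memoization wrapper. Everything here is invented for illustration: the operation, its cost model, and all the names. The call counter makes it observable how often the underlying work actually runs.

```cpp
#include <unordered_map>

// Stand-in for an operation that was assumed "too cheap to cache"
// but in reality does nontrivial work. The counter tracks how many
// times the real computation executes.
static int g_calls = 0;

int costly_op(int x) {
    ++g_calls;      // stands in for the real cost (RAM access, etc.)
    return x * x;   // placeholder computation
}

// Memoized wrapper: pay the cost once per distinct argument,
// serve every repeat from the cache.
int cached_op(int x) {
    static std::unordered_map<int, int> memo;
    auto it = memo.find(x);
    if (it != memo.end())
        return it->second;
    int r = costly_op(x);
    memo.emplace(x, r);
    return r;
}
```

Calling cached_op(7) twice computes the result once and serves the second call from the cache, which is the whole point of the anecdote: measure before assuming an operation is too cheap to be worth caching.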
We also got a 100% speedup by eliminating an extra call to a rather heavy procedure. It should have been called once, but because of an architectural quirk it was called twice in the same frame. We spotted this accidentally while profiling a completely different routine. The moral is, you don't necessarily know what's going on under the hood unless you peek there periodically. A profiler helps not only when looking for bottlenecks while the whole thing's on fire, but as an investigation tool in its own right.
And that one time we got a 400% speedup by rearranging the image processing pipeline. Images were stored on disk in one of several integer formats, and when loaded they were converted to double-precision floating point and then back to integers, only to apply some minor transformations. The whole thing was losing most of its performance simply by creating and filling intermediate structures. Alas, since the processing pipeline was rather heavy and versatile, it wouldn't have been wise to apply it to every image value separately because of the cost of dispatch. So we statically generated all the possible combinations and dispatched once per image, enabling cheap per-value processing without any intermediate structures. The resulting code happened to be about the size of the original. Well, sometimes it pays to know your way around metaprogramming.
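The dispatch-once-per-image idea can be sketched with C++ templates. Everything here is hypothetical (the pixel formats, the brighten transform, and all the names); the point is the structure: the per-value loop is compiled separately for each format, and the format branch is taken a single time per image rather than once per value, with no intermediate conversion to floating point.

```cpp
#include <cstddef>
#include <cstdint>

// Hypothetical per-value transform, compiled once per pixel type.
// Inside the loop there is no dispatch and no format conversion.
template <typename Pixel>
void brighten(Pixel* img, std::size_t n, int delta) {
    for (std::size_t i = 0; i < n; ++i)
        img[i] = static_cast<Pixel>(img[i] + delta);
}

enum class Format { U8, U16 };

// One dispatch per image: select the statically compiled loop once,
// then process every value in place, cheaply.
void brighten_image(Format f, void* data, std::size_t n, int delta) {
    switch (f) {
    case Format::U8:
        brighten(static_cast<std::uint8_t*>(data), n, delta);
        break;
    case Format::U16:
        brighten(static_cast<std::uint16_t*>(data), n, delta);
        break;
    }
}
```

With more formats and more transforms, the set of instantiations grows multiplicatively, which is presumably what "statically create all the possible combinations" refers to; templates make the compiler do that generation.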
But the largest performance boost I ever acquired came from a very simple thing. I used the default .NET facility for matrix multiplication, the Matrix.TransformPoints method, and it was very slow, so I decided to re-implement it on site. After all, it's just a vector-matrix multiplication, a simple operation. This re-implementation easily gave me a 20000% improvement! When I peeked under the hood to see how the original operation works, I saw this:

- System.Drawing.Drawing2D.Matrix.TransformPoints:
  - System.Drawing.SafeNativeMethods.Gdip.ConvertPointToMemory,
  - System.Drawing.SafeNativeMethods.Gdip.ConvertGPPOINTFArrayF:
    - System.Drawing.UnsafeNativeMethods.PtrToStructure:
      - System.Drawing.Internal.GPPOINTF..ctor (which is empty),
      - System.RuntimeType.CreateInstanceSlow:
        - System.Runtime.InteropServices.Marshal.PtrToStructure.
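A plain re-implementation of such a point transform might look like the sketch below. This is a generic 2D affine transform using the usual GDI+/System.Drawing element order (m11, m12, m21, m22, dx, dy), not the author's actual code, and the type names are invented for the example.

```cpp
#include <vector>

struct Point { double x, y; };

// 2D affine transform; coefficients follow the GDI+ Matrix layout
// (m11 m12 m21 m22 dx dy), with points treated as row vectors:
//   x' = x*m11 + y*m21 + dx
//   y' = x*m12 + y*m22 + dy
struct Affine {
    double m11, m12, m21, m22, dx, dy;
    Point apply(Point p) const {
        return { m11 * p.x + m21 * p.y + dx,
                 m12 * p.x + m22 * p.y + dy };
    }
};

// Transform points in place: no marshaling, no per-point allocation,
// just the vector-matrix multiplication itself.
void transform_points(const Affine& m, std::vector<Point>& pts) {
    for (auto& p : pts)
        p = m.apply(p);
}
```

Each point costs four multiplications and four additions, which is why a direct loop like this can outrun a path that round-trips every point through native interop and reflection-based marshaling.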
This, ladies and gentlemen, is exactly what the fruit of performance unawareness looks like. The perfect specimen! It has conversions and marshaling, constructors and creators, and only somewhere deep at the bottom does it have the matrix multiplication I was looking for, which doesn't even show up on the profiler.
My point is, it's all bad. It is the fear of "evil" that brought us to the point where even the most basic desktop software is sluggish as hell. At work I use Visual Studio, and Word, and Excel, and Outlook, and it's all tremendously bad. Sometimes, when Visual Studio goes "busy", I have to load my trusty Vim just to continue working. And I'm not even a Vim fan; it's only good for me in that it keeps working while everything else hangs.
And on the Internet it's even worse. A guy writes a message board engine in assembly and everybody is amazed at how fast it is. Well, it isn't. There's no magic in assembly per se; it's just that all the other implementations are horribly slow. Assembly only brings performance awareness up front.
So, it is bad. It's all bad. But it's also good. For me. Because I make money and reputation by making things better. I fear no evil walking through the darkest valley, and it pays well in the end. So to all those who cover their own negligence with Donald Knuth's "evil quote", I have one thing to say.
If you correct this bias in your logic, you can say that the opposite of your title is true based on your observations. The fact that there is little to no profitable/valuable software on the market that was optimized right from the start might mean that "(premature)? optimization" might indeed be 'the…
When people hide behind this saying, it's probably because they are misquoting it, the same way people misquote the saying "The love of money is the root of all kinds of evil". (The quote this article is about is an adaptation of this biblical saying.) You see, it's not money, it's the love of money. Big difference.

Love it. I wish I could put this article in the face of everyone who, whenever I ask a question about optimization on a forum, responds with: "Why do you need that? Are you sure you need that much speed? Today's computers are…...
You're making a logical error there: survivorship bias. You're earning money by optimizing already-built solutions. The amount of money you earn is indicative of how 'valuable' that slow and stupid piece of code is to your employer. This actually shows that you can neglect not only "premature" but also "timely" optimization in your software, and still…
The last but not least: Technology is dominated by two types of people: those who understand what they do not manage, and those who manage what they do not understand. ~Archibald Putt, Ph.D.
Copyright © 1996-2021 by Softpanorama Society. www.softpanorama.org was initially created as a service to the (now defunct) UN Sustainable Development Networking Programme (SDNP) without any remuneration. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License. Original materials copyright belong to respective owners. Quotes are made for educational purposes only in compliance with the fair use doctrine.
FAIR USE NOTICE This site contains copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available to advance understanding of computer science, IT technology, economic, scientific, and social issues. We believe this constitutes a 'fair use' of any such copyrighted material as provided by section 107 of the US Copyright Law according to which such material can be distributed without profit exclusively for research and educational purposes.
This is a Spartan WHYFF (We Help You For Free) site written by people for whom English is not a native language. Grammar and spelling errors should be expected. The site contains some broken links, as it develops like a living tree...
Last modified: June 02, 2021