TITLE: Software Has Its Own Gresham's Law AUTHOR: Eugene Wallingford DATE: June 14, 2015 9:17 AM DESC: ----- BODY: Let's call it Sustrik's Law, as its creator does:
Well-designed components are easy to replace. Eventually, they will be replaced by ones that are not so easy to replace.
This is a dandy observation of how software tends to get worse over time, in a natural process of bad components replacing good ones. It made me think of Gresham's Law, which I first encountered in my freshman macroeconomics course:
When a government overvalues one type of money and undervalues another, the undervalued money will leave the country or disappear from circulation into hoards, while the overvalued money will flood into circulation.
A more compact form of this law is, "Bad money drives good money out of circulation."

My memory of Gresham's Law focuses more on human behavior than government behavior. If people value gold more than a paper currency, even when the currency denominates a specific amount of gold, they will use the paper money in transactions and hoard the gold. The government can redenominate the paper currency at any time, but gold will always be gold. Bad money drives out the good.

In software, bad components drive good components out of a system for a different reason. Programmers don't hoard good components; a component has no particular value when it is not being used, even in the future. The process is simply pragmatic. If a component is hard to replace, then we are less likely to replace it. It will remain a part of the system over time precisely because it's hard to take out. Conversely, a component that is easy to replace is one that we may well replace.

We can also think of this in evolutionary terms, as Brian Foote and Joe Yoder did in "The Selfish Class": a hard-to-replace component is better adapted for survival than one that is easy to replace. Designing components to be better for programmers may make them less likely to survive in the long term. How is that for the bad driving out the good?

When we look at this from the perspective of the software system itself, Sustrik's Law reminds us that software is subject to a particular kind of entropy, in which well-designed systems with clean interfaces devolve toward big balls of mud (another term coined by Foote and Yoder). Programmers do not yet have a simple formula to predict this entropy, such as the Gibbs entropy formula for thermodynamic systems, and we may never have one. But then, computer science is still young. There is a lot we don't know.

Ideas about software have so many connections to other disciplines. I rely on those connections to help me think about software, too.
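For readers who don't remember it from physics, the Gibbs entropy formula mentioned above expresses the entropy of a thermodynamic system in terms of the probabilities p_i of its microstates:

    S = -k_B \sum_i p_i \ln p_i

where k_B is the Boltzmann constant. Nothing in software engineering yet plays the role of the p_i, which is part of the point: we can observe the decay, but we can't quantify it.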
Hat tips to Brian Rice for retweeting this tweet about Sustrik's Law, to Jeff Miller for reminding me about "The Selfish Class", and to Henrik Johansson for suggesting the connection to Gibbs's formula. -----