"A while back," Steven Victor wrote, "I was asked to look at an issue where numerical data 'kept getting more and more inaccurate' in newer versions of a software product."
"After some searching, I came across some code that converted an integer into a string representation of the value. It used the common "itoa" function, and since it was pretty run-of-the mill data meant to be interpreted by humans, the string was supposed to a base-ten representation.
No big deal, right?
Steven continued, "Whoever wrote this code must have been studying up on design guidelines that favored 'avoiding magic numbers,' as the code I came across looked like this:"
    char sTypelib_version[3];
    ...
    itoa(typelib_version, sTypelib_version, VERSION_CODE);
"After looking for the value of VERSION_CODE, I finally found it in a global constants header file:"
    //#define VERSION_CODE 10 //2005
    //#define VERSION_CODE 11 //2006
    //#define VERSION_CODE 12 //2007
    #define VERSION_CODE 13   //2008
"Apparently," Steven said, "this code worked extremely well in 2005 when the tenth version of the product was released."