[Image: NOAA Central Library Card Catalog]

The year was 2015. Erik was working for LibCo, a company that offered management software for public libraries. The software handled inventory, customer tracking, fine calculations, and everything else a library needed to keep track of its books. This included, of course, a huge database of every book title known to the library system.

Having been around since the early 90s, the company had not originally built in any Internet connectivity. Instead, updates were mailed out on physical media (originally floppies, then CDs). The librarian would load the media into the library's only computer, which would then update the catalog. Because libraries could choose how often to update, these disks didn't contain just a differential; they contained the entire catalog all over again, and the import replaced the whole database's contents. That way, the database always ended up with that month's data, even if it hadn't been updated in a year.
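To make the approach concrete, here is a minimal sketch of what such a wipe-and-reload importer might look like. It assumes a SQLite database with a catalog table keyed by ISBN and a CSV export on the update media; those names and formats are illustrative guesses, not LibCo's actual schema.

```python
# Hypothetical wipe-and-reload importer: delete everything, then reinsert
# the entire catalog from this month's disk. The catalog(isbn, title, author)
# table and the CSV layout are assumptions for illustration only.
import csv
import sqlite3

def full_reload(db_path: str, catalog_csv: str) -> None:
    """Replace the whole catalog with the contents of the update media."""
    conn = sqlite3.connect(db_path)
    try:
        with conn:  # single transaction: wipe the table, then reload every row
            conn.execute("DELETE FROM catalog")
            with open(catalog_csv, newline="", encoding="utf-8") as f:
                rows = ((r["isbn"], r["title"], r["author"])
                        for r in csv.DictReader(f))
                conn.executemany(
                    "INSERT INTO catalog (isbn, title, author) VALUES (?, ?, ?)",
                    rows,
                )
    finally:
        conn.close()
```

Every run pays the full cost of the entire catalog, whether one title changed or a million did.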

Time marched on. The book market grew exponentially, especially with the advent of self-publishing, and the Internet really caught on. Now libraries had dozens of computers, all of them connected to the Internet. Suddenly there was the possibility of weekly, maybe even daily, updates, all through the magic of the World Wide Web.

For a while, everything Just Worked. Erik was with the company for a good two years without any problems. But when things went off the rails, they went fast. The download and update times grew longer and longer, creeping ever closer to that magic 24-hour mark where a library's system would never finish updating, because a new update would be out before the last one completed. So Erik was assigned to find some way, any way, to speed up the process.

And he quickly found such a way.

Remember that whole drop-the-database-and-replace-the-data thing? That was still happening. Over the years, faster hardware had been concealing the issue, but the exponential catalog growth had finally outstripped Moore's Law: even the newest library computers couldn't keep up with downloading the whole thing every day. Not on library Internet plans.

Erik took it upon himself to fix the issue once and for all. It took him only two days to come up with a software update, and within 24 hours it was running in libraries across the country. The total update time afterward? Only a few minutes. All he had to do was rewrite the importer/updater to accept lists of changed database entries, which numbered in the dozens, instead of full data sets, which numbered in the millions. After all, libraries were no longer skipping updates, so each day's changes stayed small.
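A differential importer along those lines might look something like the sketch below: instead of reloading millions of rows, it applies only the handful of adds, updates, and deletes since the last run. The change-list format (an op column plus the record fields) and the assumption that isbn is the table's primary key are illustrative, not taken from the article.

```python
# Hypothetical delta importer: apply only the changed entries.
# Assumes the same illustrative catalog(isbn, title, author) table as above,
# with isbn as the primary key so INSERT OR REPLACE can update rows in place.
import csv
import sqlite3

def apply_changes(db_path: str, changes_csv: str) -> None:
    """Apply a list of changed entries instead of reloading the full catalog."""
    conn = sqlite3.connect(db_path)
    try:
        with conn:  # one transaction for the whole (small) change list
            with open(changes_csv, newline="", encoding="utf-8") as f:
                for row in csv.DictReader(f):
                    if row["op"] == "delete":
                        conn.execute("DELETE FROM catalog WHERE isbn = ?",
                                     (row["isbn"],))
                    else:  # "add" or "update"
                        conn.execute(
                            "INSERT OR REPLACE INTO catalog (isbn, title, author) "
                            "VALUES (?, ?, ?)",
                            (row["isbn"], row["title"], row["author"]),
                        )
    finally:
        conn.close()
```

With dozens of rows per update instead of millions, both the download and the import shrink from hours to minutes.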

Erik's reward for his hard work? A coupon for a free personal pizza, which he suspected his manager clipped from the newspaper. But at least it was something.
