*ding* Customer Process Manager Error - Unable to Close Order Line 255731

Rick ignored the error. The Customer Process Manager was a rickety ASP web app floating on an Oracle DB designed by a drunken sailor. It was more likely to produce errors than the desired result. The users, who had been using the application for a decade before Rick was tasked with keeping it hobbling along, were used to its odd behavior and knew when to expect errors.

Rick's inbox dinged again, and this time released a flood of similar error messages. Within a few minutes, there was a deluge of unclosed order lines. And that's when the phone rang. "I think we've got a problem," the user said, "and I think it was caused by this weekend's upgrade."

"I doubt that," Rick said, "but I'm looking into it."

The upgrade in question was a RAM and CPU bump on the DB server. The incompetently designed database schema performed badly no matter what you threw at it. The goal of the upgrade was to take it from "morbidly-obese sweat-pant wearer in Wal*Mart" speeds to "morbidly-obese sweat-pant wearer in a buffet line" speeds. Their testing had shown that the beefy CPU and half-ton of RAM had improved their benchmarks.

Confident that the upgrade couldn't be the cause, Rick dug into the errors. The underlying cause was a primary-key violation on the INVENTORY_TRANSACTION table. He looked at what fields the primary key included and grimaced. Aside from having six fields in the key, it didn't start too badly: "N_CUSTOMER, N_ITEM, N_TRANS_TYPE". But dangling from the end of the key like an angry Klingon hung "D_TIMESTAMP".
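
In DDL terms, the constraint looked roughly like the sketch below. The constraint name is a guess, and the two key columns Rick didn't bother naming are left as a comment:

```sql
-- Rough reconstruction of the six-column primary key; only four of the
-- columns were named, so the remaining two are represented by a comment.
ALTER TABLE INVENTORY_TRANSACTION
  ADD CONSTRAINT INVENTORY_TRANSACTION_PK PRIMARY KEY (
    N_CUSTOMER,
    N_ITEM,
    N_TRANS_TYPE,
    -- two more columns, unremarkable enough not to warrant a mention
    D_TIMESTAMP
  );
```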

An INSERT trigger ensured that D_TIMESTAMP, a date field, always started as SYSDATE. Prior to the upgrade, at Wal*Mart waddling speed, the application trickled through the database table, and that meant very little happened in any given second. But after the upgrade, order lines processed in rapid succession, and when an order had the same item on two lines, both rows could now land in the same second with identical key values, and the transaction exploded. Roughly 50% of the time that an order had duplicate lines, it now failed.
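
The trigger behind it would have been a one-liner along these lines (the trigger name here is a guess). Since Oracle's DATE type only resolves to whole seconds, any two rows sharing the same customer, item, and transaction type within a single second now collided with an ORA-00001:

```sql
-- Sketch of the sort of trigger described: every new row gets stamped
-- with SYSDATE, which carries no fractional seconds.
CREATE OR REPLACE TRIGGER TRG_INVENTORY_TRANSACTION_INS
  BEFORE INSERT ON INVENTORY_TRANSACTION
  FOR EACH ROW
BEGIN
  :NEW.D_TIMESTAMP := SYSDATE;
END;
/
```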

"Well," Rick sighed, "the user was right." The upgrade was the culprit. He considered adding a dbms_lock.sleep call to slow the process back down to waddling-through-the-aisles speed, but in the long run, that would only make the application more embarassing to support. Besides, the users really liked the snappier response time. He didn't want to add a new, unique column to the table, only because there were so many undocumented processes doing undocumentable things to the table that he feared breaking one of the hundreds of scheduled PL/SQL procedures that depended on the schema looking exactly how it did today. For the same reason, he didn't want to change the timestamp to an actual timestamp datatype, which supported fractional seconds. Instead, he scanned the list of columns, looking for something to add to the already overburdened key.

It turned out to be easier than he expected. At the end of the table, undocumented and unreferenced in any process he could find, was an unfamiliar column: N_TRXN_ID. The name was promising, and a quick survey of the data showed that it was a unique, auto-numbered column, populated by a trigger and guaranteed never to be null. For over a decade, it had sat there, unused and unloved, while D_TIMESTAMP valiantly tried to do a job it was unqualified for.
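
The fix, then, was a matter of rebuilding the constraint with N_TRXN_ID tacked onto the end, something along the lines of the sketch below (again, the constraint name and the unnamed middle columns are stand-ins):

```sql
-- Rebuild the overburdened key with the long-ignored, genuinely unique
-- N_TRXN_ID appended, so same-second duplicates no longer collide.
ALTER TABLE INVENTORY_TRANSACTION DROP CONSTRAINT INVENTORY_TRANSACTION_PK;

ALTER TABLE INVENTORY_TRANSACTION
  ADD CONSTRAINT INVENTORY_TRANSACTION_PK PRIMARY KEY (
    N_CUSTOMER,
    N_ITEM,
    N_TRANS_TYPE,
    -- the same two unnamed columns from the original key
    D_TIMESTAMP,
    N_TRXN_ID
  );
```

Strictly speaking, N_TRXN_ID alone would have made a perfectly good key, but swapping the key out wholesale risked exactly the kind of undocumented breakage Rick was trying to avoid.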
