• (nodebb)

    Sounds like a (for this site) happy ending all around. Charles's solution seemed complex but not a total WTF (other than using Excel, of course), so he gets to parade around with those laurels, and Liam and the submitter both got what sounded like praise for making things faster. The company got better and faster reports.

    Seems like everyone "won", at least as much as you win in the usual Daily WTF story.

  • ray10k (unregistered)

    Liam and the submitter should have put in a delay. Having that long runtime suddenly disappear just makes the end users suspicious of the numbers even if they check out.

  • Decius (unregistered)

    Replicating a process without understanding the slightest bit of why it runs just makes it run faster. To make it run better, rather than asking "Where does the frist cell in column A come from?" you need to ask "What SHOULD the frist chunk of information represent?"

  • (nodebb)

    By my calculations, 40 minutes is merely two orders of magnitude faster than 30 seconds (1.9 if you want to be really pedantic).
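
    Spelling out that arithmetic (a quick check in plain Python; the 40 minutes and 30 seconds are the only inputs, both from the article):

        import math

        old_runtime = 40 * 60   # the old report: 40 minutes, in seconds
        new_runtime = 30        # the new report: 30 seconds

        speedup = old_runtime / new_runtime
        print(speedup)              # 80.0
        print(math.log10(speedup))  # ~1.903, i.e. just shy of two orders of magnitude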

  • Vault_Dweller (unregistered)

    Another version of "Security by Obscurity", where security in this context means job security.

  • MiserableOldGit (unregistered)

    Been there, done that more times than I care to remember. Including the (very) odd contract where "Charles" was still there sabotaging my efforts, undermining me, and even physically threatening me, because we both knew this was his artificial pension guarantee, but his PHBs had no idea.

    I remember one of the first of these I did was for a call centre wanting performance stats ready for the middle managers every morning, so they knew who to shout at. The main report had crept up to tying up a clustered SQL server for more than 12 hours each night, so the necessary numbers weren't getting delivered in time for the early morning motivational bollocking. The firm had invested in whatever iteration of Business Objects was wowing everyone in 2000, learned the hard way that it doesn't do what it says on the tin, and dumped the report authoring on some junior developer who only had experience with VB or C++ or something and didn't know databases, let alone BI and SQL. It took me a day to find that the automated SQL generation had a bug and was missing ONE BRACKET in the WHERE clause. Once I put the bracket in, the report generation time dropped to around a minute. I think I get the three-orders-of-magnitude prize for that one.
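
    To show the kind of damage one missing bracket can do, here is a minimal stand-in in Python rather than the actual generated SQL (which is long gone): "and" binds tighter than "or", exactly as AND binds tighter than OR in a WHERE clause.

        # Hypothetical rows; the real schema is not in this comment.
        rows = [
            {"branch": 7, "status": "open",   "priority": True},
            {"branch": 3, "status": "closed", "priority": True},  # wrong branch
        ]

        for r in rows:
            # What the report author meant:
            intended = r["branch"] == 7 and (r["status"] == "open" or r["priority"])
            # What the generator emitted with the bracket missing; it parses as
            # (branch == 7 and status == "open") or priority:
            generated = r["branch"] == 7 and r["status"] == "open" or r["priority"]
            print(r["branch"], intended, generated)

    The second row is the killer: the intended filter says False, the bracketless one says True. In SQL terms the broken clause sweeps in rows from every branch, and once the filter no longer narrows the scan, a one-minute report can easily become a twelve-hour one.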

  • bvs23bkv33 (unregistered) in reply to kazitor

    it is more than six orders of binary magnitude

  • dusoft (unregistered)

    How about three orders of gratitude, if magnitude does not meet that?

  • Andrew Scott (unregistered)

    I once inherited something just like this, right down to the Access. It would generate data on hidden sheets which were referenced from visible sheets. Then it would replace the references in the visible sheets with the actual values and delete the hidden sheets. If someone asked where a value came from, the answer was "I don't know." It ran countless queries and stored intermediate steps in ranges of cells which were later deleted.

    The developer was, for lack of a better word, an idiot who claimed that any program worth writing could and should be written in Excel.

    As a new developer I remember the sinking feeling I got when I asked him how he handled exceptions. He turned his chair around, looked me in the eyes, and said, "Error handling is a bad idea." He went on to explain that every time anything goes wrong, the user should see the little VBA error pop-up with its meaningless error code so they'd know to call him, and he could get them to step through the code over the phone.

    This was the senior developer I was to learn from. He understood nothing. Really, nothing. Over the next few years we abandoned VBA for .NET (which was new) and he left to do whatever he does somewhere else. I can't find a trace of him anywhere.

  • Si (unregistered) in reply to Decius

    I made that mistake at my last job. Too much detail to go into here, but suffice it to say every single thing about the new process was improved — it ran in a matter of seconds rather than hours, failures took a few minutes to resolve at most rather than the hours or days they did before (meaning the company didn't grind to a halt every time there was a blip), nobody needed to come in at 6am every day just to manually run through the Excel behemoth, and crucially all redundant/duplicate/contradictory/incorrect/confusingly-named stuff had been removed/resolved/corrected/renamed.

    All changes and, crucially, differences in the results were signed off by the appropriate report consumers, and the new reports were fully documented in addition to me writing a changeover guide to explain differences between the old and new reports (especially changes in the results of fields shared by both reports, which were all down to "old figure was incorrect, so difference is because new figure is correct"). Everybody's happy, we're good to switch over.

    Till the director overseeing all of the report consumers' departments decides to do her own reconciliation piece. All of my documentation is ignored — I'm actually okay with this, because having a critical pair of eyes doing naive checks and challenging me on the differences could be really useful for identifying anything I'd missed, and test the comprehensibility of my documentation when I explained the differences. Unfortunately the reasons for the differences were not deemed to be acceptable — as she kindly screamed in my face as she stormed out of the meeting, it wasn't good enough; the old figures couldn't possibly be wrong because the business had been using them to make business decisions for years.

    I made it clear to her manager and the managers under her that I wouldn't be working on anything that required me to communicate with her anymore and they'd need to find somebody else if they wanted a BI professional to knowingly provide incorrect data to an entire business.

  • Si (unregistered) in reply to Si

    It was okay though; after a few months we were ready to source the data from the "newly"-implemented and years-overdue SAP installation, so all of the line-level data was incorrect in the source system and nothing could be reconciled to the real world anymore.

  • Anonmouse (unregistered)

    Yes, it is not what you did in your past jobs that matters, it is how you make it sound on your resume. When applying for my first engineering co-op job during university, we had advisers on how to write our resumes. Sweeping the floor at the local burger joint became "Maintaining a clean working environment", etc...

  • eric bloedow (unregistered)

    Is this a rerun? It sounds VERY familiar...

  • DQ (unregistered)

    When I started my job 15 years ago, I was Charles. Luckily I evolved into Liam.

  • (nodebb)

    On my first real (not via a temp agency) job, 20-odd years ago in the Windows 3.11 era, I inherited an XL (why does Excel acronymify into this?) sheet that contained data (to be expanded over time) as well as reports. At least my department chief was sensible enough to let me rewrite the 'app' in XS (why does Access acronymify into this?) so data entry/maintenance as well as reporting became a lot easier (I have to say, simply reporting from XS beat most reporting apps at the time). I also could then add cute graphs way more easily, so the chief was mightily happy he gave the go on the rewrite.

    Of course, nowadays, I'm usually stuck with 'grown up' databases and whatever reporting tool is available. Sigh...

  • (nodebb)

    My first real job out of college was running field tests on a wireless long-distance (500-1000 miles) WAN - mostly performance, but also testing that the software and firmware in the transmitters worked. The performance data was collected on paper strip charts (signal strength and connection duration) and fan-fold dot-matrix printouts of the test text data that was sent along with transmit params and IDs.

    The data was analyzed, collated and reported on by hand - at the end of that process the numbers were put into a spreadsheet. The process took months.

    I took one look and proposed that we replace the process with an A/D card and software to capture what the strip chart captured, and a terminal emulator to capture what the text printout captured, then run some analysis software to analyze and collate the results. This did in minutes what previously took months, such that we knew day to day whether a test was good or had to be rerun. It saved months of testing time and hundreds of thousands of dollars.

    At least four orders of magnitude savings of time.

  • tbo (unregistered) in reply to Decius

    To be fair, "Where does this number come from?" can mean both, "How is it currently calculated?" and "How should it be calculated?" depending on context.

  • (nodebb) in reply to JiP

    While I don't doubt that there were still plenty of people using Windows 3.11 in ~1999, I wouldn't really call it the Windows 3.11 era.

  • Officer Johnny Holzkopf (unregistered)

    NB: The AS/400 is not a mainframe. It's a midrange system. CAVE TERMINOLOGIAE ET NUNC! :-)

  • Gallowglass (unregistered) in reply to jakjawagon

    I can verify that when I started my job in 1997 they were still on Windows 3.11, and remained so until after the Y2K crossover.

  • sizer99 (google)

    When I worked for the government (briefly, thank gawd), specifically the USDA, I actually got officially reprimanded for doing this. They had a two-week schedule, but being lazy I automated everything and was getting their two weeks of work done in 5 minutes - yes, with the right answers.

    Forget disturbing Charles's gravy train, I had just upended the lazy gravy train for the entire department. No longer could they just plod slowly through their two weeks of mindless copying and pasting, of writing a custom CICS query and program for every new job even if the only thing that changed was the type of tree or fish.

    So they did the reasonable thing and buried it, said it was rogue overreach, officially reprimanded me and warned me not to pull anything like that again. And I didn't, basically sat there twiddling my schlong like the rest of them (actually, I was writing games and doing personal projects) for the rest of the summer - I needed the money for college.

  • Ryan of Tinellb (unregistered) in reply to kazitor

    Are orders of magnitude necessarily base 10?

  • Dave (unregistered)

    I can see why the first version ended up like that - Charles had to start from scratch with a simple task for a basic spreadsheet, which then accreted years and years of "what about adding this" and "can we make it do that" and "the CEO wants it to do this by the shareholder meeting at 2pm". Liam didn't have any of this; he just had to replicate the functionality of an existing working system. Given the chance to do a clean-sheet rewrite, I'd hope it performed better than the first version.

    And it's not really a WTF, that's how a lot of in-house software evolves, except usually there's never a chance to do it over.

  • Hasseman (unregistered)

    Kids. I started with Windows 2.11.

  • (nodebb)

    You forgot to add that the protagonist of the story was also happy in the end:

    [quote]Management was happy; business was happy; developers were happy; Business Intelligence was happy.[/quote]

  • MiserableOldGit (unregistered) in reply to Andrew Scott

    I think that might be an ex-boss of mine. He did try and use other languages when I knew him, but generally coded himself into a corner and handed the mess to me to "just finish off because I have a new project".

    It's kind of true that you can do most business-type stuff in Excel; most people with a little skill and self-awareness quickly learn the limitations of that when they're made to try it, or inherit a pile of pain from someone else who did. What I've found funny is how so many businesses/departments have some sort of allergy to compiled code due to past horrors, but then don't seem to see that the problems and pain caused by amateurs stuffing macros into MS Office are just as damaging and risky.

  • Feeling lucky (unregistered) in reply to kazitor

    Indeed, Minutes must be magnitudes faster than Seconds, as seconds are just too small to be fast!

  • (nodebb) in reply to Gallowglass

    Indeed, including a last-minute patch to the File Explorer because the unpatched version didn't handle post-Y2K dates very well...

  • TheBestKindOfCorrect (unregistered) in reply to Ryan of Tinellb

    All orders of magnitude are base 10.

  • Argle (unregistered)

    I remember my very first ever contract job. A startup had a program that did some custom filtering on files made by OCR. It fixed certain common errors (like "Califomia", as these were California legal documents). The WTF was that it was written in assembly for the 286. I told them there was no way I was going to try to patch someone else's assembly code; I could do a full rewrite in C in less time. I warned them that it might be slower, but they didn't care. It had half-hour run-times on their files, so they just ran the program and walked away. When I finally threw a full file at my own program, it finished in 30 seconds. There was astonishment all around (including me). You might ask "just how bad WAS that assembly code?" I don't know. My curiosity to see it never overcame my dread of seeing it.

    Upside: I inherited an Everex 286 computer. It was my "George Washington's Ax" computer.* I technically didn't retire it until last month, when I simply bought a replacement. But in all these past 30 years, it's had dozens of hard-drive upgrades, new cases, motherboards, etc. The 5 1/4" floppy finally bit the dust around 2005. I still claimed it was an Everex 286.

    * In case you don't know the joke: a collector pays good money to buy an ax from an old farmer who assures him it once belonged to George Washington. "It's in good shape," says he. "Should be," says the farmer, "it's had 5 new handles and 2 new heads since."

  • (nodebb) in reply to kazitor

    If we're being pedantic, by my calculation, 40 minutes is two orders of magnitude slower than 30 seconds.

  • (nodebb) in reply to Argle

    I actually had a 5¼" drive for longer than that... whether it actually worked was another matter, since I stopped using the disks well before that...

  • veryverypedantic (unregistered) in reply to kazitor

    1.903089986991944 to be really really pedantic

  • Some Ed (unregistered)

    So clearly I'm not special. At the end of my last while-in-college job, my boss told me they needed some help from somebody who could computer.

    They were a branch office of a national manufacturing company. Corporate had a BI process they ran each month that basically ran the whole month to get some predictions for the next month. But this branch had their own BI process, which ran in a week, and was much more accurate.

    Corporate had just fired the guy who was running this report. So I had to figure this thing out and teach a secretary to do it in the three weeks before school started for my final year. The only hope I had was that he'd literally just been fired but wasn't forced to leave before his shift was over, so he was able to explain it to me.

    It worked by having somebody download the data from the corporate mainframe and load it into Excel. This required splitting it into 6 files, because this was the 90s. Once loaded, each file had to be filtered to just their branch, and then the pieces could be spliced back together. Except, thanks to this being so fast and so much better than corporate's, the branch had grown to the point where their piece was now larger than Excel could handle in one file.

    The guy couldn't really explain how he worked around that in a way I could understand, but I wasn't worried. I was not at all a fan of MS, but I knew of Access, and knew it didn't have the row limit Excel did.

    I spent one week porting it to Access, via heavy Help usage, 5 minutes running it on the data, and then had two weeks to polish and train.

    Corporate next tried hiring more programmers to improve their process, but they couldn't improve it enough to be as fast or as accurate. But eventually, they found the computer that was running it, and confiscated it, claiming they were going to study it to improve the corporate report, but of course they didn't.
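
    For anyone curious, the heart of the Access port was simply "filter before you load". A minimal sketch of the same idea in Python (the file names, column name, and branch code are all made up; the original was Access queries over a mainframe extract):

        import csv

        BRANCH = "0042"  # hypothetical branch code

        # Stream the extract and keep only this branch's rows, so nothing
        # ever has to fit under a spreadsheet-style row limit.
        with open("extract.csv", newline="") as src, \
             open("branch_only.csv", "w", newline="") as dst:
            reader = csv.DictReader(src)
            writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
            writer.writeheader()
            for row in reader:
                if row["branch_code"] == BRANCH:
                    writer.writerow(row)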

  • 🤷 (unregistered)

    "Damn," our submitter muttered. "Something's wrong, it must have died or aborted or something."

    I know that feeling. At my first job, I inherited a reports process. It ran for like 4 hours. It wasn't really fast, but since it was only needed once a month, and not at a very specific time ("During the first few days of the new month is good enough!"), it didn't really matter. Plus, the guys who maintained it before me had years of job experience, while I was fresh out of uni, so I didn't question their methods too much. But since the report sometimes failed and I needed to manually fix the errors in the database, it bothered me enough that I tried to optimize it.

    First, I added some logging, because there was none, to identify the bottlenecks. No use shaving a few seconds off a report that runs for hours, right?

    After a few days of finding more places where I had to add logs, I finally found it: from a table that contained millions of rows (i.e., a normal table in a normal database), the report selected everything. There was no WHERE clause. Then, in the program, it would throw away every row it didn't need by inserting the data into a temp table and running a few queries over it. I put the WHEREs of those queries into the main query. All of a sudden the report finished in 10 minutes. I thought I must have done something wrong, or I'd only got half the data or whatever. Just like the submitter of this story. But the numbers were exactly the same as they were on the old report.
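
    The fix in miniature, using sqlite3 with a made-up table (the real schema is long gone, so every name below is hypothetical):

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE sales (branch TEXT, month TEXT, amount REAL)")
        conn.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                         [("A", "2024-01", 10.0), ("B", "2024-01", 20.0),
                          ("A", "2024-02", 30.0)])

        # Before: SELECT everything, then throw most of it away in the program.
        rows = conn.execute("SELECT branch, month, amount FROM sales").fetchall()
        before = [r for r in rows if r[0] == "A" and r[1] == "2024-01"]

        # After: push the WHERE into the query, so the database only ships
        # the rows that are actually needed.
        after = conn.execute(
            "SELECT branch, month, amount FROM sales WHERE branch = ? AND month = ?",
            ("A", "2024-01")).fetchall()

        assert before == after  # same numbers, a fraction of the work

    Same numbers, far less data shipped out of the database; on millions of rows that is the difference between four hours and ten minutes.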
