• (cs)

    Some people's solutions always look like massive hacks and workarounds. They think they're brilliant and everyone around them is tearing their hair out.

  • Dan Krüsi (unregistered)

    Sidenote: very effective form of censorship.

  • shame (unregistered)

    I did the same thing in REXX, though I had to pipe my SQL to mysql.exe since it didn't fit on the command line. And I was better at parsing the output.

    There's no ODBC driver for REXX however.

  • (cs)

    I admit it. I've done this. I had to develop a perl script that needed to pull stuff out of an Oracle database, and I just knew that I could never, ever rely on the people who wanted to run the script being able to install DBD::Oracle on their own machines. So I shelled out to SQL*Plus. It was evil, but it served its purpose.
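    The pattern described above can be sketched in Python rather than Perl (the article's own language). The "client" here is a stand-in echo command so the sketch is runnable without a database; the real invocation and SQL*Plus flags aren't shown in the thread, so treat the command list as a placeholder:

```python
import csv
import io
import subprocess
import sys

def query_via_cli(cmd, sql):
    """Run a command-line database client, feeding it SQL on stdin,
    and parse whatever it prints to stdout as CSV rows.
    Raises CalledProcessError on a non-zero exit code."""
    result = subprocess.run(
        cmd, input=sql, capture_output=True, text=True, check=True
    )
    return list(csv.reader(io.StringIO(result.stdout)))

# Stand-in client that just echoes its stdin; in real use, cmd would be
# something like a sqlplus invocation with the appropriate SET commands.
echo = [sys.executable, "-c", "import sys; sys.stdout.write(sys.stdin.read())"]
rows = query_via_cli(echo, "alice,42\n")
```

    The fragile part is exactly what the commenters complain about: everything depends on the client's output format staying machine-parsable.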

  • (cs) in reply to Licky Lindsay
    Licky Lindsay:
    I admit it. I've done this. I had to develop a perl script that needed to pull stuff out of an Oracle database, and I just knew that I could never, ever rely on the people who wanted to run the script being able to install DBD::Oracle on their own machines. So I shelled out to SQL*Plus. It was evil, but it served its purpose.

    Why admit? I do this quite often for all kinds of ad-hoc reporting. First I write some SQL scripts, which generate easy-to-parse output (mostly CSV). Then I use a Perl script which does some nifty transformations and outputs another CSV file, which the poor bastards from Legal & Compliance can trawl through.

    It's fast prototyping; I mostly junk the scripts after use, since I can rewrite them in the time it takes me to find them and adapt them to today's requirements.

    I even once handed over such a contraption to operations; it was easy enough, since all the customization is done in the SQL part, which they understand. Leave the Perl code alone, it works...

    Of course I could probably rewrite the whole thing as a complex

    SELECT FROM GROUP BY CUBE
    statement and use iReport to format the output, but that is more engineering than anybody will underwrite. And then, should one really run OLAP statements on a live OLTP DB?
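    The middle stage of such a pipeline (CSV in, reshaped CSV out) can be sketched in a few lines of Python; the column names and the flagging rule are invented for the example:

```python
import csv
import io

def transform(reader, writer):
    """Reshape a spooled CSV report: pass the columns through and
    append a derived one. Account/amount columns are hypothetical."""
    out = csv.writer(writer)
    for account, amount in csv.reader(reader):
        flagged = "REVIEW" if float(amount) > 1000 else "OK"
        out.writerow([account, amount, flagged])

# Simulate the CSV spooled by the SQL side.
src = io.StringIO("A-1,250.0\nA-2,1500.0\n")
dst = io.StringIO()
transform(src, dst)
```

    The appeal for ad-hoc reporting is clear: the SQL side and the transform side can each be handed to the people who understand that half.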

  • Da' Man (unregistered)

    I recently talked to a guy who had to write a program that accesses a 250 GB XML database directly, using only standard C. No external libraries allowed!

    So he wrote a "dumb" XML parser similar to the one in the story, and when he found that parsing the whole file would take ages on that machine, he added an index file that tells the parser where to look.

    Voilà: the result was very fast, much, much faster than any "proper" database server could have been. Not very elegant, though, but fast!

    Captcha: "consequat"... whatever you say, darling!

  • (cs) in reply to Mr. Impartial Pants
    Mr. Impartial Pants:
    Why admit?

    I think my reluctance to admit it is because the standard command-line database tools for MS-SQL and Oracle (the only two databases I've ever needed to do anything like this for) are not really well suited for use as script-slaves.

    As far as I can tell they were intended mainly either for interactive use, or for running SQL programs (like daily maintenance scripts to clean old data out of tables where it accumulates) whose output doesn't really matter.

    Now, if they provided a tool whose stated design goal was be used in scripting, especially with regards to retrieving data, I would use it proudly. But as far as I can tell neither SQL*Plus or isql is one of those.

    Is mysql (or something else) better about this?

  • (cs)

    I had to stare at the blank emptiness for a while, thinking "So the WTF is that there's nothing here?" before realizing that there probably was code, but as a screenshot.

    (Having an alt attribute present, but with an empty string, hides images that aren't loaded, for some people such as me. A pain in the ass, but I guess it's okay for a repost.)

  • (cs)

    I took a Perl developer to task for this very type of thing. I told him: If you're going to develop in Perl, then develop in Perl! If you're going to develop in shell script, then develop in shell script! Just figure out what the hell you are doing and do it.

    He went on to develop some pretty neat Perl programs.

  • Some DBA (unregistered) in reply to Da' Man
    Da' Man:
    I recently talked to a guy who had to write a program that accesses a 250 GB XML database *directly*, using only standard C. No external libraries allowed!

    So he wrote a "dumb" XML parser similar to the one in the story, and when he found that parsing the whole file would take ages on that machine, he added an index file that tells the parser where to look.

    Voilà: the result was very fast, much, much faster than any "proper" database server could have been. Not very elegant, though, but fast!

    Captcha: "consequat"... whatever you say, darling!

    So the Real WTF(TM) is an XML Database then. A "proper" database server should be much faster than the hacked solution of which you speak.

    Also: I'd like to see how much time is spent updating this index after a few "rows" change in this XML nightmare!

  • (cs)

    The code looks pretty good. It makes you wonder if there was some problem with the environment that made installing the mysql libs an issue.

    I've had to push things through the mysql cli more than once; it's a nasty hack, but it's all I had at the time.

  • SomeCoder (unregistered) in reply to Some DBA
    Some DBA:
    Da' Man:
    I recently talked to a guy who had to write a program that accesses a 250 GB XML database *directly*, using only standard C. No external libraries allowed!

    So he wrote a "dumb" XML parser similar to the one in the story, and when he found that parsing the whole file would take ages on that machine, he added an index file that tells the parser where to look.

    Voilà: the result was very fast, much, much faster than any "proper" database server could have been. Not very elegant, though, but fast!

    Captcha: "consequat"... whatever you say, darling!

    So the Real WTF(TM) is an XML Database then. A "proper" database server should be much faster than the hacked solution of which you speak.

    Also: I'd like to see how much time is spent updating this index after a few "rows" change in this XML nightmare!

    Yeah, that's definitely a real WTF, but I have to laugh a bit at "No external libraries allowed!" Where do you think those libraries come from, the sky? :)

    No, they are probably written in C and writing a basic XML parser in C is not as hard as you might think. Granted, I'd rather do it in C++ using STL (or better yet, pre-built libraries) but if the requirement is to write it yourself in C, that's not that bad :)

  • (cs)

    I was in a similar situation where it was more work to build the SSL and LDAP extensions into PHP than to just invoke the command-line utilities. Yeah, it's something you shouldn't do, but in a pinch it works.

  • jeff (unregistered) in reply to shame

    @Shame: fellow REXXer here. Here's yer REXX-ODBC connector:

    http://rexxsql.sourceforge.net/

  • zoips (unregistered) in reply to SomeCoder
    SomeCoder:
    Some DBA:
    Da' Man:
    I recently talked to a guy who had to write a program that accesses a 250 GB XML database *directly*, using only standard C. No external libraries allowed!

    So he wrote a "dumb" XML parser similar to the one in the story, and when he found that parsing the whole file would take ages on that machine, he added an index file that tells the parser where to look.

    Voilà: the result was very fast, much, much faster than any "proper" database server could have been. Not very elegant, though, but fast!

    Captcha: "consequat"... whatever you say, darling!

    So the Real WTF(TM) is an XML Database then. A "proper" database server should be much faster than the hacked solution of which you speak.

    Also: I'd like to see how much time is spent updating this index after a few "rows" change in this XML nightmare!

    Yeah, that's definitely a real WTF, but I have to laugh a bit at "No external libraries allowed!" Where do you think those libraries come from, the sky? :)

    No, they are probably written in C and writing a basic XML parser in C is not as hard as you might think. Granted, I'd rather do it in C++ using STL (or better yet, pre-built libraries) but if the requirement is to write it yourself in C, that's not that bad :)

    If you leave out validation and entity expansion (other than the default entities and numeric character references), XML parsing can be done by an extremely simple state machine: read a character, go to a state; presto, XML parser. Nothing fancy.
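    A toy version of that state machine, in Python rather than C: no validation, no attribute handling beyond passing markup through, only the default entities. Note the replacement order, so that &amp;amp; is expanded last:

```python
# Toy tokenizer: emits ("tag", markup) and ("text", data) events.
# Order matters: "&amp;" must be expanded last, or "&amp;lt;"
# would wrongly come out as "<" instead of the literal "&lt;".
DEFAULT_ENTITIES = [("&lt;", "<"), ("&gt;", ">"),
                    ("&quot;", '"'), ("&apos;", "'"), ("&amp;", "&")]

def unescape(text):
    for entity, char in DEFAULT_ENTITIES:
        text = text.replace(entity, char)
    return text

def tokenize(xml):
    """Read character -> go to state: either inside markup or inside text."""
    events, buf = [], []
    for ch in xml:
        if ch == "<":                  # text ends, markup begins
            if buf:
                events.append(("text", unescape("".join(buf))))
            buf = []
        elif ch == ">":                # markup ends, text begins
            events.append(("tag", "".join(buf)))
            buf = []
        else:
            buf.append(ch)
    return events

events = tokenize("<a>x &amp; y</a>")
```

    The quirks David Schwartz warns about further down the thread (CDATA, comments, processing instructions, encodings) are exactly what this toy ignores.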

  • Paul (unregistered) in reply to Licky Lindsay
    Licky Lindsay:
    I admit it. I've done this. I had to develop a perl script that needed to pull stuff out of an Oracle database, and I just knew that I could never, ever rely on the people who wanted to run the script being able to install DBD::Oracle on their own machines. So I shelled out to SQL*Plus. It was evil, but it served its purpose.

    I'm assuming this was a *NIX system (mainly because the Windows command shell looks like such a WTF that I could never bring myself to learn it, so this may be Windows and I'm just showing my ignorance), so surely it would have been easier to write an install script to install DBD::Oracle than to do this?

    Maybe I've been lucky so far in my 18 years of industry experience, but every company I've worked for has had a procedure for writing install scripts (e.g. using rpms) that means all the user has to do is run the script, and the prerequisites are installed automagically from the third-party depot.

  • Congo (unregistered)

    I had to do something similar: shell to SQLPlus and parse text files, because the evil bastards that administer the servers wouldn't install the database drivers for Perl. Net Techs and Server Admins are natural enemies of Application Developers and Users.

  • (cs) in reply to Paul
    Paul:
    I'm assuming this was a *NIX system (mainly because the Windows command shell looks like such a WTF that I could never bring myself to learn it, so this may be Windows and I'm just showing my ignorance), so surely it would have been easier to write an install script to install DBD::Oracle than to do this?

    It was Windows. Since all the perl script did was execute sqlplus and then open the text file that sqlplus created, there wasn't much interaction with the shell involved, really. I'm sure I could have done an auto-install kind of thing, but at the time what I did seemed less invasive.

    In an unrelated application, I did once have to write a batch file that would install Oracle drivers from a network drive, run the one lone application that needed Oracle, and then remove Oracle after the app had finished running.

  • (cs) in reply to Congo
    Congo:
    I had to do something similar: shell to SQLPlus and parse text files, because the evil bastards that administer the servers wouldn't install the database drivers for Perl. Net Techs and Server Admins are natural enemies of Application Developers and Users.

    I ran into that one as well. The server admins were trying to discourage people from using Perl for the internal web apps (in favor of servlets). They couldn't really get rid of Perl from the system, but they found they could at least limit its usefulness by refusing to install database drivers for it.

    I eventually got around that by setting up my own little Linux server under my desk, where I could install anything I damn well pleased.

    There is in fact no variation of this WTF that I haven't used myself, probably. I eventually needed a program on my Linux machine to get some data from a Microsoft database. So I stuck an ASP file over on the nearest convenient Windows server, which ran the queries and returned the results as CSV. The program on the Linux box called it with lynx --dump.

  • Da' Man (unregistered) in reply to Some DBA
    Some DBA:
    So the Real WTF(TM) is an XML Database then. A "proper" database server should be much faster than the hacked solution of which you speak.

    Also: I'd like to see how much time is spent updating this index after a few "rows" change in this XML nightmare!

    I daresay no; binary search in memory is very fast, and a file seek to a specified location is also quite fast... I don't think any DBMS can keep up with this.

    Updating the index takes about 2.5 hours, though. Luckily that only had to be done once a month :-)
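    A minimal sketch of that index-plus-seek scheme, in Python for brevity. It assumes newline-delimited records with the key as the first comma-separated field, which is an invention for the example; the original used a C parser over XML:

```python
import bisect
import tempfile

def build_index(path):
    """One pass over the file: collect (key, byte_offset) pairs,
    sorted by key so they can be binary-searched in memory."""
    index = []
    with open(path, "rb") as f:
        offset = 0
        for line in f:
            key = line.split(b",", 1)[0]   # key = first field (assumed)
            index.append((key, offset))
            offset += len(line)
    index.sort()
    return index

def lookup(path, index, key):
    """Binary search the in-memory index, then seek straight to the record."""
    i = bisect.bisect_left(index, (key, 0))
    if i == len(index) or index[i][0] != key:
        return None
    with open(path, "rb") as f:
        f.seek(index[i][1])
        return f.readline().rstrip(b"\n")

with tempfile.NamedTemporaryFile(delete=False, suffix=".txt") as tmp:
    tmp.write(b"bob,2\nann,1\ncid,3\n")
idx = build_index(tmp.name)
rec = lookup(tmp.name, idx, b"ann")
```

    Some DBA's objection shows up here too: the index is only valid until the data file changes, which is why the monthly 2.5-hour rebuild was tolerable in that setup.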

  • David Schwartz (unregistered)

    The problem with writing your own "quick and dirty" XML parser (don't ask me how I know) is that for the rest of your life, you'll be fixing odd quirks that are already fixed in every other XML parser.

  • Alex (unregistered) in reply to Licky Lindsay

    This is a WTF on this web site I'm afraid. Anyone who has done support work as well as coding knows that this is a perfectly reasonable piece of code.

    Support Code: Target Audience: Non-coding Support Staff woken up at 2am to fix a problem. Language: Shell Script/Perl/Other scripting language - designed to be understood and maintained easily without an IDE. You have (generally) no built in libraries - and you don't want them - the aim is to automate a set of command line tasks which the support staff can test out and run manually if required.

    Application Code: Target Audience: Coding Staff. Language: Java/C# etc - designed to be maintained with an IDE that knows about all the libraries. The aim is to have something that is robust, can be easily extended and puts common logic in the same place.

    Application Code uses database drivers, strongly typed column types etc. Support code does not - but it can all just be run from the command line and edited as text. Which would you rather use if you were maintaining a batch job that has just failed at 2 in the morning over a VT220 terminal - and that has to be run by the morning?

    OK - the only thing muddying the issue is that Python can handle DB drivers properly - whether or not to use them depends on whether or not the target audience understands them (a lot of support staff don't - their primary concern is maintaining their Solaris zones or whatever), and whether this piece of code has to be 100% resilient just in case someone registers as - Fred D"Onfrio, the first (sic).

    As a Java coder I still use sqlplus scripts and calls as part of the database build for two reasons: a) The scripts can be emailed to the DBA and data migration people directly - and they can understand and run them; b) SQLPlus still offers much greater functionality than the ant sql task.

  • Anonymous (unregistered)

    Wait, so where is the WTF?

    If the driver isn't already installed on the box (as is likely if it's stud or idi.ntnu.no), a student probably won't be able to get it installed.

    This might have been necessary to be able to run the script at NTNU, rather than his own computer.

    Regarding the code quality, the student may have just picked up the language from the course TDT4120. Perhaps it was his/her first application :)

    Oh, and the SQL server is probably either mysql.stud.ntnu.no or mysql.idi.ntnu.no - those are not really a secret. Way to go, censoring.

    Sorry if I ruined a good WTF :)

  • (cs) in reply to Licky Lindsay
    Licky Lindsay:
    I admit it. I've done this. I had to develop a perl script that needed to pull stuff out of an Oracle database, and I just knew that I could never, ever rely on the people who wanted to run the script being able to install DBD::Oracle on their own machines. So I shelled out to SQL*Plus. It was evil, but it served its purpose.
    I'm dealing with the results of that approach right now. A four day long end-to-end test, relying on a script (admittedly ksh) that shells out to SQL*Plus, running randomly-hacked, batched sql scripts that vary wildly between incremental releases of the "product;" in general, don't match the schema in any obvious way; and occasionally don't even exist. Oh well, back to Day One.

    Solution:

    (1) Rewrite it in Perl. Ugly, but available.
    (2) Persuade your friendly local Scottish nuclear physics graduate to rewrite the pathetic shell-outs as DBD::Oracle.
    (3) Use a very large and very sharp knife to behead the DBA. Allah is all-merciful, but even Allah cannot abide ludicrous solutions to a non-existent database problem.

    Selah.

    Do it now. The poor sods that follow you will be grateful.

  • (cs) in reply to Mr. Impartial Pants
    Mr. Impartial Pants:
    Licky Lindsay:
    I admit it. I've done this. I had to develop a perl script that needed to pull stuff out of an Oracle database, and I just knew that I could never, ever rely on the people who wanted to run the script being able to install DBD::Oracle on their own machines. So I shelled out to SQL*Plus. It was evil, but it served its purpose.

    Why admit? I do this quite often for all kinds of ad-hoc reporting. First I write some SQL scripts, which generate easy-to-parse output (mostly CSV). Then I use a Perl script which does some nifty transformations and outputs another CSV file, which the poor bastards from Legal & Compliance can trawl through.

    It's fast prototyping; I mostly junk the scripts after use, since I can rewrite them in the time it takes me to find them and adapt them to today's requirements.

    I even once handed over such a contraption to operations, it was easy enough since all the customization is done in the SQL part, which they understand. Leave the perl code alone, it works...

    Of course I probably could rewrite the whole thing as a complex

    SELECT FROM GROUP BY CUBE
    statement, and use iReport to format the output, but that is too much engineering that nobody underwrites, and then of course should one really do OLAP statements on a live OLTP DB?
    Um ... no. Not really. No.

    I don't think so.

    No. Not at all. Never.

  • (cs) in reply to Alex
    Alex:
    This is a WTF on this web site I'm afraid.

    Application Code uses database drivers, strongly typed column types etc. Support code does not - but it can all just be run from the command line and edited as text. Which would you rather use if you were maintaining a batch job that has just failed at 2 in the morning over a VT220 terminal - and that has to be run by the morning?

    That depends, Alex. Do I want to get back to sleep in a couple of hours or so, or do I want to save the company a couple of million or so?

    There are hacks at 2am, and then there are hacks. I was in the zone for around six years, and I learned not to piss around with what are, effectively, free SQL queries with no audit trail. Don't go there. VT220 or worse (and, believe me, I've been there), there are better ways to fix things. And whether or not you can email the script to the DBA afterwards is generally irrelevant.

    Mind you, I did have a good laugh at the comparison between SQLPlus and the ant sql task. Dr Johnson springs immediately to mind...

  • Duff (unregistered) in reply to Alex
    Alex:
    This is a WTF on this web site I'm afraid. Anyone who has done support work as well as coding knows that this is a perfectly reasonable piece of code.

    I'm the guy who writes the support code at a little startup in Austin, and this is a WTF.

    Mind you, I've had the same need (to be able to write shell scripts and similar small tools that trivially wrap databases) -- but I wrote my own Oracle frontend (in Python, using cx_Oracle) built for scriptability. Unlike SQLPlus, it actually exits on errors instead of giving the user an invisible prompt. Unlike SQLPlus, it makes proper distinctions between stdout and stderr. Unlike SQLPlus, it allows arbitrary textual arguments to queries to be passed in on argv. Unlike SQLPlus, it has native support for CSV (or tab-separated, or many other forms of) output. Unlike SQLPlus, its output formats are well-defined and were selected for machine readability.

    The few hours it took to write that code have more than paid off. I can incorporate it in mission-critical pieces of code (like backup scripts) and know that it won't hang or quit with a successful exit code after an error has occurred. When I'm piping output to a UNIX shell, IFS=$'\t' and tab-separated output work, presuming there aren't any tabs in the input (with another parameter, substrings matching the separator can be squashed; of course, this is moot when CSV is in use). And managerial types like getting their queries back in a format Excel can natively read.

    There's good hacks, and then there's poorly-thought-out hacks that'll cause more trouble down the road. This is the latter.
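    The behaviours Duff lists fit in a very small wrapper: parameterised query in, CSV on stdout, diagnostics on stderr, non-zero status on failure. A sketch using sqlite3 as a stand-in for cx_Oracle (his actual tool isn't shown in the thread, so the names here are invented; the shape of the DB-API calls is the same):

```python
import csv
import io
import sqlite3
import sys

def run_query(conn, sql, args=(), out=sys.stdout):
    """Execute one parameterised query. On success, emit CSV rows to
    `out` and return 0; on failure, print to stderr and return 1, so a
    calling shell script can abort instead of parsing garbage."""
    try:
        writer = csv.writer(out)
        for row in conn.execute(sql, args):
            writer.writerow(row)
        return 0
    except sqlite3.Error as exc:
        print("query failed: %s" % exc, file=sys.stderr)
        return 1

# Tiny in-memory schema so the sketch is self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE usr (username TEXT, guid TEXT)")
conn.execute("INSERT INTO usr VALUES ('fred', 'abc-123')")
buf = io.StringIO()
status = run_query(conn, "SELECT guid FROM usr WHERE username = ?",
                   ("fred",), buf)
```

    The point of the exit code and the stderr split is exactly the backup-script scenario: a shell caller can test `$?` and stop, instead of treating an error banner as data.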

  • - (unregistered)

    The real WTF is why someone would black out the passwords but leave the bottom and top parts visible. Anyone with a few minutes of spare time and the same font could probably cut down the possible passwords to a handful.

  • Alex (unregistered) in reply to Duff
    Duff:
    Unlike SQLPlus, it actually exits on errors instead of giving the user an invisible prompt.

    try using the command WHENEVER SQLERROR EXIT

    Duff:
    Unlike SQLPlus, it makes proper distinctions between stdout and stderr.

    try using spool

    Duff:
    Unlike SQLPlus, it allows arbitrary textual arguments to queries to be passed in on argv.

    try using anonymous PL/SQL or declare variables

    Duff:
    Unlike SQLPlus, it has native support for CSV (or tab-separated, or many other forms of) output.

    try looking up the colsep setting, along with pagesize, linesize, define etc.

    Duff:
    Unlike SQLPlus, its output formats are well-defined and were selected for machine readability.

    SQLplus was producing text-based reports when Python was just a dream in Guido's head. RTFM before you criticise it. I am not saying it can do everything Python can do, but I am saying it can extract data from an Oracle DB a helluvalot easier - and the number of people that know sqlplus (obviously not including you) is 1000 times greater than the number who know Python.

  • (cs) in reply to Alex
    Alex:
    Duff:
    Unlike SQLPlus, it actually exits on errors instead of giving the user an invisible prompt.

    try using the command WHENEVER SQLERROR EXIT

    Duff:
    Unlike SQLPlus, it makes proper distinctions between stdout and stderr.

    try using spool

    Duff:
    Unlike SQLPlus, it allows arbitrary textual arguments to queries to be passed in on argv.

    try using anonymous PL/SQL or declare variables

    Duff:
    Unlike SQLPlus, it has native support for CSV (or tab-separated, or many other forms of) output.

    try looking up the colsep setting, along with pagesize, linesize, define etc.

    Duff:
    Unlike SQLPlus, its output formats are well-defined and were selected for machine readability.

    SQLplus was producing text-based reports when Python was just a dream in Guido's head. RTFM before you criticise it. I am not saying it can do everything Python can do, but I am saying it can extract data from an Oracle DB a helluvalot easier - and the number of people that know sqlplus (obviously not including you) is 1000 times greater than the number who know Python.

    "We are many, and you are few."

    Excellent debating point, but irrelevant. You're right: Alex appears to be unaware of the elegant subtleties embedded into the sqlplus interpreter. Ho hum, Alex, you lose.

    An interesting sideline from the 1000/1 comparison arises when you consider exactly what that 1000 is composed of. Mostly cut'n'paste morons, if you're lucky, I would guess.

    Now, I'm not saying that you can't do anything or everything with sqlplus (insert appropriate throat-clearing exercises such as "exit whenever error" here), because you can.

    It's a bit of a shoddy way to build a production system that has to be maintained for the next few years or so, though, isn't it?

    Python, Perl, JDBC, Ruby on Acid, I don't really care. Gimme programmatic control. Please don't force me to fight through two days of an "end-to-end" test that involves a batched sql file that no longer exists. I do not want to do this.

    And, btw, Mr RTFM, why would you assume that Duff doesn't "know" sqlplus?

    I mean, it's not like having sex with your cousin. It's a lot simpler than that.

  • Hans (unregistered) in reply to Da' Man
    Da' Man:
    I recently talked to a guy who had to write a program that accesses a 250 GB XML database *directly*, using only standard C. No external libraries allowed!

    So he wrote a "dumb" XML parser similar to the one in the story, and when he found that parsing the whole file would take ages on that machine, he added an index file that tells the parser where to look.

    Voilà: the result was very fast, much, much faster than any "proper" database server could have been. Not very elegant, though, but fast!

    Sure, you can read from something like this, but is it also possible to write any data? Or are these "databases" intended to be read-only?

    If you have to write some new data and it happens to be longer than the old data, with a structure like this your only option is to basically write the whole thing out again, isn't it? Unless you pre-allocate space in all your fields (that's a good way to get 250GB of 'data' of course)...

    At work we are also going through a phase like this. First they had an Oracle database. Then they built the meta-database anti-pattern (also on this site, somewhere) on top of the Oracle database. Then they did exports from the Oracle database to text files to allow faster reading by applications (the joins were killing them). Then the text files themselves were made writeable, because making a change in the database and then doing an export was still considered too slow. And now we are translating the text files to XML to allow things like validation and embedded spaces. But this time, they swear, the XML files will really be read-only. Sure...

    In the meantime I'm just thanking God that I'm not on that project...

  • Adamerator (unregistered) in reply to Da' Man

    Ok....so who put 250 Gigs of data into an XML file?

  • Duff (unregistered) in reply to Alex
    Alex:
    SQLplus was producing text-based reports when Python was just a dream in Guido's head. RTFM before you criticise it. I am not saying it can do everything Python can do, but I am saying it can extract data from an Oracle DB a helluvalot easier - and the number of people that know sqlplus (obviously not including you) is 1000 times greater than the number who know Python.

    I don't doubt that I can write a big 'ol wrapper in SQL to try to make SQLPlus behave well, and some of the scripts around here use precisely the tricks you mention -- but given the choice between wrapping a decades-old monstrosity and writing a 100-line Python script that does the right thing by default (and, in newer versions, respects our local conventions -- we have a database versioning mechanism which the Python script automates the use of), I think using the smaller, more well-defined solution is appropriate.

    Let me make the point. Give me a SQL script that does this

    USER_GUID="$(runQuery \
      'select guid from usr where username=:1' \
      "$USERNAME")"
    ...and let's see how long it is. Exit on errors with a non-zero return code (so the calling shell script knows that a failure occurred and can abort nicely), set all your variables appropriately for machine-parsable output, and everything else you said SQLPlus could trivially handle.

    Or even better...

    runQuery -dCSV 'select * from usr'
    Go on; I'm waiting to see how many lines of code it takes you. Surely much more than a one-liner in either case.

  • Duff (unregistered) in reply to Adamerator
    Adamerator:
    Ok....so who put 250 Gigs of data into an XML file?
    "XML database" != "XML file". For places which are actually *sane* (unlike the previous poster's employer), a 250GB XML database isn't necessarily unreasonable on its face.

    There are a bunch of XML database engines. They use XQuery, XUpdate and similar protocols; support indexing and query optimization; but can be much more free-form than traditional databases about the content of the data being stored (unless, of course, they're configured to validate -- and that can be done too). I don't have the volume of data being discussed, and eXist works for me right now -- and it's getting more scalable every year. For places that need larger volumes of storage now, there are commercial engines available.

    Me, I like being able to dump "lshw -xml" output for all of my systems into a database and use XPath to do a single query that tells me where the non-Dell servers with over 2GB of RAM are without needing to transform lshw's output at all. It makes even more sense if you're using XForms for data entry and can just POST your completed documents straight to the database's REST interface.

  • Pete (unregistered)

    I wonder if that code dated back to early Perl 4. Back then, CPAN didn't exist/wasn't popular yet, so people rolled their own by second nature.

    TBH, if anything, that script looks more readable than most of the more recent Perl scripts I've seen.

  • Gerry (unregistered) in reply to Pete

    The code is Python, not Perl...

    Although CPAN didn't exist ISTR people champing at the bit for Tim Bunce's DBI stuff.

    Gerry, a former 'Big Perl' on DOS user.

  • Rafal (unregistered)

    It's a quirk, but sometimes it is the most efficient way to populate a database with a large amount of data. One day I processed XML files of 500MB and more using PHP and MySQL, and I used this technique due to:

    • hosting limits (the script cannot take more than 32MB of memory)
    • the mysql command-line tool performing insert operations more efficiently than the PHP mysql extension.

    Weird, but in some cases it is a good way...
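    The memory cap is the interesting constraint. One way to stay inside it (sketched in Python rather than PHP; the element, table and column names are invented) is to stream the XML with an incremental parser and emit INSERT statements that can be piped to the mysql command-line client:

```python
import io
import xml.etree.ElementTree as ET

def stream_inserts(xml_file, out):
    """Walk <row> elements one at a time, emit an INSERT per row, and
    clear each element so memory stays bounded regardless of file size."""
    for event, elem in ET.iterparse(xml_file, events=("end",)):
        if elem.tag == "row":
            name = elem.findtext("name", "").replace("'", "''")
            out.write("INSERT INTO t (name) VALUES ('%s');\n" % name)
            elem.clear()  # drop the processed subtree

# Small in-memory stand-in for a multi-hundred-MB file.
src = io.StringIO("<rows><row><name>ann</name></row>"
                  "<row><name>bob</name></row></rows>")
sql = io.StringIO()
stream_inserts(src, sql)
```

    In the scenario described, the generated statements would be fed to the mysql CLI, which handles the bulk loading more efficiently than issuing them one by one through the language's database extension.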

  • Python Coder (unregistered) in reply to Pete
    Pete:
    I wonder if that code dated back to early Perl 4. Back then, CPAN didn't exist/wasn't popular yet, so people rolled their own by second nature.

    TBH, if anything, that script looks more readable than most of the more recent Perl scripts I've seen.

    That is the most concise indictment of Perl and validation of Python I have ever read - considering it came from someone who writes Perl!

    I wrote Perl for many years, and when I found Python and got over the 'indentation is significant' hurdle, there was no looking back. Every now and then I have to maintain some ancient Perl code, and I wince every time.

Leave a comment on “Classic WTF: Because database drivers can be too complex”
