• programer001 (unregistered)

    select * from death

  • brokentone (unregistered) in reply to trollsolo

    Yes, placing the pages inside the same barn would be really helpful!

  • Just guessing... (unregistered)

    Something to do with fraud detection, maybe? We have a query about two pages long that chews recursively through member accounts and activities doing multiple deep joins and counts to find webs of collusion / fraudulent activity, and assign scores to each potential network.
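A query like that often boils down to a graph walk done in SQL. A minimal sketch of the idea, using an invented `links` table and Python's built-in sqlite3 (not the commenter's actual schema): a recursive CTE starts from one suspect account and transitively collects everything linked to it.

```python
import sqlite3

# Hypothetical schema: 'links' records that two member accounts share a
# payment method, device, or address. Starting from one suspect account,
# a recursive CTE walks the link graph to find its whole network.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE links (account_a TEXT, account_b TEXT);
    INSERT INTO links VALUES
        ('alice', 'bob'), ('bob', 'carol'), ('dave', 'erin');
""")

rows = conn.execute("""
    WITH RECURSIVE network(account) AS (
        SELECT 'alice'
        UNION
        SELECT CASE WHEN l.account_a = n.account
                    THEN l.account_b ELSE l.account_a END
        FROM links l
        JOIN network n ON n.account IN (l.account_a, l.account_b)
    )
    SELECT account FROM network ORDER BY account
""").fetchall()
network = [r[0] for r in rows]
print(network)  # alice's ring: alice, bob, carol; dave/erin untouched
```

The `UNION` (rather than `UNION ALL`) deduplicates accounts, which is what stops the recursion on a cyclic link graph. Scoring each network would then be a matter of joining the result back against activity tables.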

  • Look upon me and despair (unregistered) in reply to C-Octothorpe
C-Octothorpe:
    frits:
        C-Octothorpe:
            frits:
                Looking at that image gave me a similar feeling to watching Van Damme dancing in the movie Kickboxer.
            What, nausea?
        I think the colloquial term is "douche chills".
    I totally forgot about that scene. Thanks for undoing hundreds of hours of therapy, frits...

    Wallow in it: youtube.com/watch?v=CE8XKeN0zk4 - Kickboxer Jean Claude Van Damme Dance [HD]
  • Aleksey Tsalolikhin (unregistered)

Use HTSQL to collapse the query. HTSQL is a fifth-generation language, an abstraction on top of SQL. http://htsql.org

  • jcromartie (unregistered) in reply to Canthros

    Pfft... right here in front of me sits a query with 25 unions in it. It is ~1800 lines long, making it 37 printed pages in Calibri 10.

    Beat that.

  • jcromartie (unregistered) in reply to Ben

    I'm looking at my monster query here: 92 joins over 25 unions.

  • mgl (unregistered)

A really nasty piece of SQL code, definitely not meant for human consumption. What do you think about tools that can decipher and visualize such complex queries in a more structured way, like DBClarity (http://www.microgen.com/dbclarity/)? Have you used anything similar recently?

  • Bah (unregistered)

Reformat it with tabs and line feeds... half the work done already.

  • Shet (unregistered) in reply to Matt

You're probably right that no human wrote that.

  • artuc (unregistered)

    What was he coding: the world?

  • Eddy (unregistered)

This must be a query generated by MS Access.

  • Larry Sheldon (unregistered)

I knew a COBOL programmer who wrote a major application program that used DMS1100 and which contained exactly one period after the one ending the PROCEDURE DIVISION. header.

No, unfortunately I don't have a copy of the code listing.

  • The His (unregistered)

    the his

  • BarofFoo (unregistered) in reply to seriously?

    6 pages with lots of formatting and whitespace is ok.

    6 pages with no line breaks, formatting or whitespace is blurgh.

  • AJ (unregistered)

Probably took longer to run than to debug.

  • Abraham Crego (unregistered)

Try it with DISTINCT, or replace the whole thing with a "HELLO WORLD" (like many good Microsoft solutions).

  • select name from users where (unregistered) in reply to My Name Is Missing

    Oh. Dear. FAIL.

    You wrote a query to extract a ton of data, most of which you weren't interested in, and sent it over the wire (slow), in order to run your own unproven search algorithm against it.

You have, in one stroke:

1. Removed the ability of the solution to scale. The amount of data sent across the wire will determine the fastest possible speed of the query - and as that base set of 'relevant data' likely grows as a multiple of the desired result set, the query is only going to get slower.

    2. Removed the ability of the solution to scale. If the data exceeds your available RAM allocation, you're now writing to disk anyway. A disk on a web app platform that is probably already busy enough. Even slower.

3. Removed the ability of the solution to scale. That 'relevant data' set is being sent over the wire every time the query is run. A web app you say? I hope you won't be surprised when your user base grows and your network grinds to a halt. Really slow.

4. Removed the ability of the solution to scale. A good SQL engine optimises its query execution plan according to the underlying data. It will use indexes when it makes sense, cache results when it makes sense, rewrite sub-queries into joins when it makes sense. As the underlying dataset changes, so does the optimisation. You're now at the mercy of your algorithm, your memory allocation system, and your network bandwidth.

I suggest you read "The Art of SQL" and get a clue. There were obviously some things wrong with the system before you touched it (e.g. schema flaws, or SQL engine configuration, either of which could prevent optimisation), but you didn't fix them. You applied a band-aid that will fail.

    Some people are best not allowed to do anything with a relational database.
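The complaint above can be made concrete with a toy sketch (invented table and column names, using Python's built-in sqlite3): both approaches return the same rows, but the first ships the entire table to the client before discarding most of it, while the second pushes the predicate into the engine, which can use an index and send only the matching rows.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users (name TEXT, status TEXT);
    CREATE INDEX idx_users_status ON users(status);
    INSERT INTO users VALUES ('ann', 'active'), ('bob', 'banned'),
                             ('cid', 'active'), ('dot', 'banned');
""")

# Anti-pattern: drag every row over the wire, filter in application code.
everything = conn.execute("SELECT name, status FROM users").fetchall()
app_filtered = sorted(name for name, status in everything
                      if status == 'active')

# Better: push the predicate into the query; only matching rows travel,
# and the engine is free to use idx_users_status.
db_filtered = [row[0] for row in conn.execute(
    "SELECT name FROM users WHERE status = 'active' ORDER BY name")]

print(app_filtered, db_filtered)  # same answer, very different data volume
```

On four rows it makes no difference; on four million, the first version transfers and materialises the whole table on every call, which is exactly the scaling failure described above.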

  • daniel (unregistered) in reply to Steve The Cynic

All the fields from every joined table are written out in the SELECTs. Temporarily simplifying that to check the WHERE joins or qualifiers would be useful in debugging.
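One way to apply that simplification (a sketch with invented tables, via Python's sqlite3): collapse the giant projection to `COUNT(*)`. If the row count is wrong, the bug is in the joins or the WHERE clause, not in the hundreds of selected columns.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, customer_id INTEGER);
    CREATE TABLE customers (id INTEGER, name TEXT);
    INSERT INTO customers VALUES (1, 'ann'), (2, 'bob');
    INSERT INTO orders VALUES (10, 1), (11, 1), (12, 2);
""")

# Instead of debugging a SELECT that lists every column of every joined
# table, replace the column list with COUNT(*) and verify the join
# cardinality first. One row per order is expected here.
(count,) = conn.execute("""
    SELECT COUNT(*)
    FROM orders o
    JOIN customers c ON c.id = o.customer_id
""").fetchone()
print(count)  # 3: one per order, so the join isn't fanning out
```

Once the count matches expectations at each join step, the full column list can be restored with some confidence that the shape of the result is right.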

  • Kavius (unregistered) in reply to select name from users where
    select name from users where:
    You wrote a query to extract a ton of data, most of which you weren't interested in, and sent it over the wire (slow), in order to run your own unproven search algorithm against it.

I couldn't agree more. I was debugging an app once (it was taking 8 hours to pull up a week's payroll, and 50% of the time it failed) and isolated the problem down to a query. The query looked something like this printout.

When I requested time to study the query, I was told the query could not be the problem because the guy who wrote it had been a genius. As I studied the query, I realized the problem was that his "hand crafted" query had been pulling all the records from the database, running them through a Java filter, and writing them back to the database. I rewrote the query in my spare time (with a WHERE clause to only look at the current pay period); it was a fraction of the size, formatted to be readable, and it took less than 2 seconds to run. I quit the next day with the statement "filter early, filter often".

    When I found out the original author was working for my prior employer, I phoned them and recommended immediate termination.

  • Dwain A Wuerfel (unregistered)

If he is able to find the line of code causing the error, I would have to say he has a unique and very marketable skill.

  • whar (unregistered) in reply to veggen

    Liz, is that you?

  • JRaymond (unregistered)

The last place I worked had a 76-page-long stored procedure... that added one entry to the database. Not computer generated, but rather shoddy-offshore-contractor generated.

  • Josiah (unregistered)

    That is amazing. I am sure we will never know how that came to be.

  • Chris s. (unregistered) in reply to SilentRunner

That's an SQL query, which should be declarative - in other words, it should describe what you want to get out of the system. Such queries are usually short. This one seems to run over 10 foolscap pages. In short, it's a nightmare - so bad that I wonder how the query engine even parsed it, let alone ran it...

  • cy (unregistered)

Well, at least the pages are numbered. I rarely use abbreviations, but for this I'll make an exception. OMG

  • Cynep (unregistered)

It is a fun way to demonstrate to outsiders how tough your work is.

As someone who debugs such queries almost every day, I can only say: "Deal with it!"

  • SQLdummy (unregistered)

Looks bad, and I feel sorry for him. Such things usually end up on my desk too. I would start by getting some structure into it with a SQL formatting tool. If he is lucky, there might be a data model that helps make some sense of the join conditions. Then slowly and systematically break it up into chunks. Not an easy or sexy task, but achievable. Maybe the error message gives a hint too.
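"Break it up into chunks" can be done inside the query itself. A minimal sketch (invented tables, via Python's sqlite3): naming each intermediate step as a CTE lets every piece be run and checked on its own before being stitched back together.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount INTEGER);
    INSERT INTO sales VALUES ('north', 5), ('north', 7), ('south', 2);
""")

# Each CTE is one testable "chunk": per_region can be selected and
# eyeballed by itself, then big_regions layered on top of it.
rows = conn.execute("""
    WITH per_region AS (
        SELECT region, SUM(amount) AS total FROM sales GROUP BY region
    ),
    big_regions AS (
        SELECT region, total FROM per_region WHERE total > 3
    )
    SELECT region, total FROM big_regions ORDER BY region
""").fetchall()
print(rows)  # only 'north' clears the threshold, with total 12
```

Untangling a six-page monolith into a chain of named steps like this is tedious, but each extracted chunk can be verified against the original's intermediate results as you go.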

  • Chuck (unregistered)

    Query shmeery. I want to see the execution plan.

Leave a comment on “The Query of Despair”
