• Gunner (unregistered) in reply to Mr B
    Lack of organisational structure is the first wtf, compounded by simply going along with it for the sake of an easy life. Typical wtf'er really, complaining about the state of things and yet unwilling to do anything about it.

    Have you ever had the pleasure of getting a job where your primary task was supporting and updating a system that was designed 5+ years before by people who probably didn't understand databases as well as they should have? If not, then you are a lucky man.

    Sometimes you can't just change the schema because you know it's wrong, because it would take months of work to rewrite thousands of lines of code and SQL. Sometimes this sort of thing actually has a big enough benefit that you can justify the cost, but usually it doesn't, so you learn to live with the quirks.

    Remember, we're talking real life here, where time and money are actual commodities. I'm all for fixing the design when possible, but even I wouldn't think of rewriting an entire schema just because one stored procedure has to be written badly.

  • Gunner (unregistered) in reply to Devil's Advocate
    Devil's Advocate:
    Because usually pointing it out will get you fired. In this case the code was written by a "senior developer" who has more clout than anybody else, so if you say "Hey this code is shit" then YOU will be the one fired.

    The sad part about our profession is that talent isn't rewarded, hackery and good BS skills are. I've seen a good developer let go without notice from a mediocre team because the good person was trying to encourage the right way to write software, but nobody else was interested in doing it the "right" way.

    Please quit reading my fortune. It's bad enough knowing I'm headed this way sooner or later, I don't need others reminding me.

  • Medezark (unregistered) in reply to MainCoder

    I very recently had to code something similar, because: a) SPs were required; b) the query was against an Oracle linked server; and c) depending on the parameters, the schema of the tables queried was different.

    No matter how much I argued with my boss or our DBAs, they insisted it be an SP rather than coded in the application.

  • (cs) in reply to Gunner
    Gunner:
    Lack of organisational structure is the first wtf, compounded by simply going along with it for the sake of an easy life. Typical wtf'er really, complaining about the state of things and yet unwilling to do anything about it.

    Have you ever had the pleasure of getting a job where your primary task was supporting and updating a system that was designed 5+ years before by people who probably didn't understand databases as well as they should have? If not, then you are a lucky man.

    Sometimes you can't just change the schema because you know it's wrong, because it would take months of work to rewrite thousands of lines of code and SQL. Sometimes this sort of thing actually has a big enough benefit that you can justify the cost, but usually it doesn't, so you learn to live with the quirks.

    Remember, we're talking real life here, where time and money are actual commodities. I'm all for fixing the design when possible, but even I wouldn't think of rewriting an entire schema just because one stored procedure has to be written badly.

    It's that sort of defeatist attitude which I am talking about.

    I'm not just talking about rewriting the schema; I'm talking about introducing an interface/façade which hides the complexity of the schema. For a shifting schema design this usually takes the form of programmatically created views (such as those used in Remedy).

    Gandhi once said "Be the change you want to see in the world" - it also applies to the broken-windows phenomenon.

    Part of my current remit is a cross-team code-debt review: essentially buying time from the business, across the company, to fix these broken windows, with the promise of quicker delivery of new functionality, quicker turnaround of bug fixes, and more business content delivery options (MVVM/MVC used extensively to open up some nice WPF development work, which keeps the business happy because it looks nice, and keeps the developers happy because it's new-ish technology).

    Time and money are indeed valuable but, long-term, doing this saves both - and in the present climate that can only be a good thing.

    shrugs

    This wasn't meant to turn into an "all companies are shit, mine is great, aren't I wonderful" rant - but a few years ago I was in the same position as many of you are now, putting plasters on gaping wounds and sweeping stuff under the carpet - then I had an epiphany, a moment of clarity, and it was all so obvious what had to be done.

    Out went BeyondCompare, and in came NAnt and some build scripts. Out went dynamic SQL, and in came some programmatically generated stored procs and views to deal with the shifting DB schemas. (1,000 stored procs to write against 1,000 tables? If they follow a structure they can be automated, and indeed they were.)

    End result? Two months work and the release process was reduced from 2 man weeks (each release took two developers one week working together) down to 5 mins (including compile time). Overnight data load process went from 7 hours (using dynamic sql) to 2 hours (using a series of code-generated stored procs).
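    The generation step itself can be sketched in a few lines (the proc naming and the per-table boilerplate here are hypothetical, and each CREATE PROCEDURE has to run in its own batch):

    ```sql
    -- Emit a boilerplate wrapper proc for every user table, driven by the
    -- catalog views, instead of writing 1,000 procs by hand.
    DECLARE @sql nvarchar(max);
    SET @sql = N'';
    SELECT @sql = @sql
        + N'CREATE PROCEDURE dbo.usp_Get_' + t.name
        + N' AS SELECT * FROM ' + QUOTENAME(SCHEMA_NAME(t.schema_id))
        + N'.' + QUOTENAME(t.name) + N';' + CHAR(13) + CHAR(10)
    FROM sys.tables AS t;

    -- Review the script and run it batch-by-batch; note PRINT truncates
    -- long strings, so a real version would write each statement out
    -- separately.
    PRINT @sql;
    ```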

    shrugs again

    :)

  • kfarmer (unregistered) in reply to Tim
    Tim:
    I realise this isn't a database forum but does anyone have a feel for the quantitative performance improvement from using stored procedures?

    The alleged perf benefit hasn't existed to any legitimate extent for several versions of SQL Server.

    SQL Server will cache the query and cache the execution plan.

    End result: sprocs and dynamic SQL pretty much differ only in the time it takes to transmit a few extra characters across the wire.

    Just use dynamic SQL where they allow you to, and make damn sure it's parameterized.
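    The parameterized flavour can be sketched like this (table and column names are hypothetical):

    ```sql
    -- Parameterized dynamic SQL via sp_executesql. Because the statement
    -- text is identical on every call, SQL Server caches a single execution
    -- plan for it, just as it would for a stored procedure.
    DECLARE @sql nvarchar(4000);
    SET @sql = N'SELECT col1, col2 FROM dbo.myTable WHERE col3 = @p1';

    EXEC sp_executesql
        @sql,
        N'@p1 int',   -- parameter declaration list
        @p1 = 42;     -- the value travels as data, never spliced into the text
    ```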

  • kfarmer (unregistered) in reply to Bytemaster
    Bytemaster:
    Now in .NET we have LINQ to SQL, which is also available for other platforms such as DB2. It optimizes the queries for you and uses parameterized queries automatically. It even applies other optimizations which we as programmers usually skip for lack of time, or to avoid cluttering up the code, so its insert and update statements are faster than normal.

    It'll even detect possible truncations and complain about them. Have you also seen the difference in paging support between SQL2k and 2k5? :)

    LINQ to SQL doesn't so much optimize the queries for you as it's systematic about how it writes them. It will try to make efficient queries, and may detect some situations you may not normally see, but it's not doing anything you couldn't do by hand. Obviously -- it's using ADO.NET underneath.

    Unfortunately we didn't put batch inserts/updates into LINQ to SQL out of the box, as statics off the data context - I wish we had, but time is time.

  • (cs) in reply to nick
    nick:
    SlyEcho:
    It's surprising that no one has fixed this yet. Here's my take:

    Aren't you still concatenating strings and sending them to EXEC? I thought that was the source of the injection vulnerability.

    Notice that I replaced EXEC with sp_executesql, which allows executing parameterized queries. The trivial parameters are passed safely this way. The one parameter that is concatenated into the string is escaped securely with QUOTENAME.

    So there is no injection risk any more, and each variation of the generated SQL string has its query plan cached, so there is no performance penalty. I reduced the number of variations to the minimum by using a parameterized query.

    I recommend anyone reading this to take a look at this excellent overview of the subject: The Curse and Blessings of Dynamic SQL
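    A minimal sketch of the pattern being described (object names hypothetical):

    ```sql
    -- The search value is passed as a real parameter; the identifier that
    -- has to be concatenated (here a sort column) is escaped with QUOTENAME
    -- so it cannot break out of the bracketed name.
    DECLARE @sortCol sysname;
    DECLARE @sql nvarchar(4000);
    SET @sortCol = N'col2';
    SET @sql = N'SELECT col1, col2 FROM dbo.myTable WHERE col1 = @p1 ORDER BY '
        + QUOTENAME(@sortCol);

    EXEC sp_executesql @sql, N'@p1 int', @p1 = 7;
    ```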

  • Lee (unregistered) in reply to Single User

    Exec is handy when you have to construct SQL dynamically based on data extracted from tables (often due to someone else's poor database design).

  • AndyB (unregistered) in reply to James L
    James L:
    Joon:
    James L:
    Unfortunately, if the application was written in ASP.NET, then this style of coding is the only way to get SPs to work when using SqlDataConnection objects.

    That's some good trolling. Well done!

    Let's see how many people fall for it and start saying "But to the front end this proc would look no different no matter how the back-end does it..."

    Examples of this WTF used in real life, too:

    http://social.msdn.microsoft.com/Forums/en-US/adodotnetdataproviders/thread/70af8b07-8e1a-44c7-9f27-6ec6dc0900f6/

    So the real WTF is ASP.NET?

    Fair enough :)

  • Zark (unregistered) in reply to redbeard0x0a

    redbeard0x0a:
    The wonderful world of Microsoft SQL Server is where you find only RTRIM and LTRIM with no TRIM in sight. Don't forget they didn't have DATE or TIME datatypes until SQL Server 2008; you had to make do with DATETIME...

    I always used YYYYMMDD-formatted integers for dates in SQL Server; the DATETIME datatype completely sucked...
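    For what it's worth, the round trip between that convention and DATETIME is a one-liner each way (CONVERT style 112 is ISO yyyymmdd):

    ```sql
    -- 2008-06-15 stored as the integer 20080615, and back again.
    DECLARE @asInt int, @asDate datetime;
    SET @asInt  = CONVERT(int, CONVERT(char(8), GETDATE(), 112));
    SET @asDate = CONVERT(datetime, CONVERT(char(8), @asInt), 112);
    ```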

  • (cs) in reply to James L
    James L:
    Examples of this WTF used in real life, too:

    http://social.msdn.microsoft.com/Forums/en-US/adodotnetdataproviders/thread/70af8b07-8e1a-44c7-9f27-6ec6dc0900f6/

    Oh gods, I don't know what an "MB Limited" is supposed to be, but that's just so bad. I bet this is one of those people who calls himself a Guru... but there are ways to write that query in an at least halfway decent manner.
  • WildKarrde (unregistered) in reply to Muppet

    If this is SQL Server (and I think it is), stored procedures do not get a performance boost over SQL commands because of compile once, run many.

    Since SQL Server 2000 (possibly SQL 7), EVERY sql statement (procedure or query) is compiled and cached without regard to its type.

    The only performance benefit one will see between a SQL query and a sproc with the same execution plan will come from the network, because it takes less time to transfer 'exec mySproc' than 'select col1, col2, col3, col4, col5 from myTable where col6 = @param'

  • milop (unregistered) in reply to James L

    What are you talking about? Your applications should have at least three tiers: data access, business logic, and UI. The ASP.NET app calls business functions (passing params), the business functions call data functions (passing params), and the data functions call SPs in the actual database (passing params).

    You're not coding everything in the ASP.NET page itself, are you?

  • SQL DBA (unregistered) in reply to Smash King
    1. correct
    2. SQL Server 2000 doesn't have "trim", it has ltrim and rtrim. Same with some other DBMS's.
    3. It's possible that some stray data was in the other tables, and joining them ensured only rows which matched in all tables would be returned. "inner join" makes its OWN "where" clause... where the rows match!
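    The inner-join point can be sketched with two hypothetical tables: any Orders row without a matching Customers row simply drops out of the result.

    ```sql
    SELECT o.OrderID, c.CustomerName
    FROM Orders AS o
    INNER JOIN Customers AS c
        ON c.CustomerID = o.CustomerID;  -- the ON clause acts as an implicit filter
    ```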

    Dynamic SQL, Exec() and xp_cmdshell are necessary for many DBA tasks, such as scripting index regeneration, table checks, etc. It's possible you've never managed a large DB or data warehouse before so you may not have the need. One example is nightly scripting of every object in every DB to text files. Other examples include rsyncing data on a schedule, or FTP'ing a file using a predefined script as a template.
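    A sketch of that kind of maintenance task, using only the standard catalog views (the string-concatenation aggregation used here is a common shorthand, not behaviour guaranteed by the engine):

    ```sql
    -- Generate ALTER INDEX ... REBUILD for every named index, then run it.
    DECLARE @sql nvarchar(max);
    SET @sql = N'';
    SELECT @sql = @sql
        + N'ALTER INDEX ' + QUOTENAME(i.name)
        + N' ON ' + QUOTENAME(SCHEMA_NAME(t.schema_id)) + N'.' + QUOTENAME(t.name)
        + N' REBUILD;' + CHAR(13) + CHAR(10)
    FROM sys.indexes AS i
    INNER JOIN sys.tables AS t ON t.object_id = i.object_id
    WHERE i.name IS NOT NULL;  -- skip heaps, which have a NULL index name

    EXEC (@sql);
    ```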

  • SQL DBA (unregistered) in reply to Medezark

    Putting SQL code in an app is just bad form. You can't change it easily, and the DBA can't use it to determine what's affected in the case of a schema change.

    Let me give you an example. My largest view is of many tables, each having between 215 million and 250 million rows, one table for each calendar year quarter. When I wanted to create a separate DB (and filegroup) just for this data, I modified TWO views. If someone had coded adds/deletes/selects in the applications, this wouldn't have been nearly as easy.

    At any time I can find any word in any table name, field name, view name / content, sp name / content, or UDF name / content. In less than a second. You can't do that in your code editor. I can print / output every one of these so I can test query execution plans. You can't do that in your code editor either. And I can put this in a DBA-level revisioning archive, which you can't do because your code and the raw SQL it calls are hopelessly intertwined.

  • SQL DBA (unregistered) in reply to WildKarrde

    A stored procedure called many times, each with different parameters, will be treated as a single query and weighted as more likely to be pinned in memory automatically. Those same statements sent as raw SQL will be stored as separate statements, all of them vying for the same memory, which greatly reduces their chances of being found precompiled the next time they are used. This is true for at least SQL Server 6.5 through 2005.

    Dynamic SQL is fine in my book -- there are too many things you cannot do without it. There are also some statements which perform better as dynamic SQL than precompiled (because the execution plan is wrong for some combinations of optional parameters, or because intermediate processes within the SP are invoked depending on those optional parameters or parameter values).

Leave a comment on “All Pain, No Gain”
