• Pablo (unregistered)

    Hell yeah, let time put each guy in his place!

  • dhromed (cs)

    But then why do they keep saying that offloading things to the DB is quicker than using page scripts?

  • Zan (unregistered)

    I count a WTF-line/line ratio of 0.666, not so bad.

    (the WTFs per line must be around 2, I guess)

  • White Echo (unregistered)

    The worst part is I am sure the programmer is proud of himself.

    Unfortunately there are many, many bad programmers like that out there, and their employers don't always realize just how much they suck.

  • C User (unregistered)

    Puh-leaze. Get rid of C? Are you insane? Go back to Visual Basic, where life is simple, straightforward, and utterly inane and boring!

    The header file you mention--complex.h--describes complex numbers; it doesn't mean that C is complex. Although I suppose, for the small of mind and intelligence, C is complex.

  • C programmer with a sense of humor (unregistered) in reply to C User

    I think the original poster is well aware of what complex.h is... ;)

  • Glenn Lasher (unregistered)

    See, now, the real WTF is that he should have used Class::DBI.

    /me ducks

  • danixdefcon5 (cs) in reply to C User

    I C dead people ...

    That's what I think I saw after reading that code ...

    I've seen a guy who has written far worse code in VB to do things the language was not meant to do. Some of his "libraries" look somewhat like this.

    I wish I had that source code. Most of it would be WTF-worthy... like the non-LIFO "stack" class.

  • Ron (unregistered)
    chop($NOW=`date +%\Y\m%\d`)

    Is somebody suggesting it's actually a good idea to assume the OS has a "date" command, and to shell out to it like that?

  • rmr (cs) in reply to C User
    C User:
    Puh-leaze. Get rid of C? Are you insane? Go back to Visual Basic, where life is simple, straightforward, and utterly inane and boring!

    The header file you mention--complex.h--describes complex numbers and not that C is complex. Although I suppose for the small of mind and intelligence C is complex.

    Wow. I'm speechless. Did you even read past the first line?

  • Ryan (unregistered)

    This actually makes some sense if the application can't rely on the application server and database server running with synchronized clocks (or even in the same time zone). The database clock is the "master" clock of the application. It's a poor man's application-level NTP.

    It would, of course, be nice to require that all machines have UTC-synchronized clocks and have their time zones properly set. If that's not possible, it would make sense to get the time once an hour from the DB server and cache an offset to the local clock (in milliseconds) to calculate timestamps locally.
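Ryan's offset-caching scheme can be sketched in a few lines of Python. This is a hypothetical illustration, not code from the article; the fetch callback stands in for whatever query (e.g. `SELECT CURRENT_TIMESTAMP`) would actually hit the DB server:

```python
import time

class ServerClock:
    """Cache the offset between a remote 'master' clock and the local
    monotonic clock, so timestamps can be computed without a round trip."""

    def __init__(self, fetch_server_time, refresh_secs=3600):
        self._fetch = fetch_server_time  # callable: server time, epoch seconds
        self._refresh = refresh_secs
        self._offset = 0.0
        self._last_sync = None
        self._sync()

    def _sync(self):
        local = time.monotonic()
        self._offset = self._fetch() - local  # the one round trip to the DB
        self._last_sync = local

    def now(self):
        local = time.monotonic()
        if local - self._last_sync > self._refresh:
            self._sync()
        return local + self._offset  # server-relative epoch seconds

# Demo with a fake "server" clock running 120 seconds ahead of us:
clock = ServerClock(lambda: time.monotonic() + 120.0)
drift = clock.now() - time.monotonic()
print(round(drift))  # → 120
```

Only the periodic `_sync()` costs a round trip; every other timestamp is computed locally against the cached offset, which is the point of Ryan's suggestion.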

  • zbigg (unregistered)

    I bet he had problems with timezones. The DB was set to one timezone, but his script ran in another, and to do some calculations he had to synchronize.

    It's a WTF ... but it really is very hard to know three things:

    • what a timezone is
    • how to change/use a timezone
    • what timezone I should use

    ;).

  • Dave (not that one) (unregistered)

    Let's say you have a database server in Oklahoma, but it is queried by clients all over the world. A user in Tokyo creates a transaction that includes a date, and writes it to the database. A user in New York later reads that transaction and needs to know the time it occurred, in the New York time zone. How would you store and manipulate date/time in the database to ensure this could be done? Assume there are no stored procedures, and client code needs to determine the time.

    Hint: It may be useful to know the time on the SQL Server.
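One conventional answer to Dave's puzzle (the store-UTC camp's answer, not necessarily his): persist the instant in UTC and convert only for display. A sketch using Python's standard zoneinfo module (assumes Python 3.9+ with a system tz database; the timestamps are invented for illustration):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# The Tokyo user records a transaction at 9:00 AM local time.
tokyo_local = datetime(2007, 4, 18, 9, 0, tzinfo=ZoneInfo("Asia/Tokyo"))

# Store the instant in UTC; the database never sees a local time.
stored_utc = tokyo_local.astimezone(timezone.utc)
print(stored_utc.isoformat())  # → 2007-04-18T00:00:00+00:00

# The New York user reads it back and renders it in their own zone.
ny_view = stored_utc.astimezone(ZoneInfo("America/New_York"))
print(ny_view.isoformat())     # → 2007-04-17T20:00:00-04:00
```

Note that the same instant lands on the previous calendar day in New York, which is exactly why the conversion belongs in the display layer rather than in storage.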

  • MacNugget (unregistered) in reply to Ron
    Comment held for moderation.
  • zbigg (unregistered)

    LOL. 3 timezone-related posts in a one-minute period... it must be telepathy ;)

  • Animator (unregistered)

    The real WTF is using chop instead of chomp!

  • n (unregistered)

    This reminds me of a story my boss likes to tell me occasionally (each time I pretend not to have heard it before). Apparently, once upon a time, a new hire was told to write a Perl script that logged to a file. When it was running in production, it started taking up vast amounts of CPU for a small task. It took them a while to figure out what was going on, but eventually it became clear that it was calling

    date
    every time it logged.
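The expense in that story is the fork, not the formatting: every log line paid for spawning a `date` process. A hypothetical Python sketch of the two approaches, assuming a Unix-like system with `date` on the PATH:

```python
import subprocess
import time

def stamp_by_fork():
    # What the script effectively did: spawn a `date` process per log line.
    out = subprocess.run(["date", "+%Y%m%d"], capture_output=True, text=True)
    return out.stdout.strip()

def stamp_in_process():
    # The cheap alternative: format the current time without forking.
    return time.strftime("%Y%m%d")

print(stamp_in_process())  # e.g. 20070418 -- no subprocess involved
```

Both return the same eight-digit stamp; the in-process version just skips the fork/exec that was eating the CPU.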

  • Troubled porter of code (unregistered)
    Comment held for moderation.
  • Anonymous (unregistered) in reply to Dave (not that one)

    That's why time should always be handled and manipulated in UTC (GMT) time. Time zones are strictly a user interface issue, or should be.

  • AnonY (unregistered)

    Has anyone tried to run

    date +%Ym%d
    from the cmd line (assuming that you have a system with the command on it)? Well on my mac it gives me 2007m18. m is always my favorite month of the year :)

  • wade b (unregistered) in reply to Dave (not that one)

    I don't know Perl so I can't comment if that code is written correctly.

    But I always allow the database server to generate date/time values.

    I don't really have to explain to you all why that is a good idea, do I?

    Dave mentioned one very valid reason.

    The other is that we often don't have control over the client workstations accessing the database; therefore synchronizing their clocks with NTP is not an option.

    Unless this code is badly written, the concept is extremely valid and is actually a "best practice" as far as I'm concerned.

    Go ahead and try to convince me otherwise - I'll still do things this way.

    Date/times are extremely important in transactional environments so you'd best get those values from a known stable clock.

    Why fool with sync'ing many client clocks (which can still go wrong if the user has hosed the NTP service or if the client network is acting up)?

    Show me the WTF please?

  • wade b (unregistered) in reply to Anonymous
    That's why time should always be handled and manipulated in UTC (GMT) time. Time zones are strictly a user interface issue, or should be.

    Wrong-o. See my post above.

    You realize the user can set the time to anything they want, right?

    In a transaction environment, you ALWAYS want to get the time from the server, from a clock that you maintain and control.

    This solution is nice, easy, predictable, and controllable.

    Why make things tougher than they have to be?

  • Brent Ashley (unregistered)

    You're saying time and/or localtime would have been a better alternative?

    So tell me, how exactly do you get the database time on a remote db server using time and localtime?

    Have you considered an app that uses a db on another server?

    Have you considered a db instance that's set to a different timezone than the server it's on?

    WTF?

  • Rank Amateur (cs) in reply to Ryan

    Application time, database time, but with user-centered design, what matters is the user's time. Use a web cam and artificial vision to read the user's wristwatch. --Rank

  • Ron (unregistered) in reply to MacNugget
    Comment held for moderation.
  • Some Dude (unregistered) in reply to Dave (not that one)
    Dave (not that one):
    Let's say you have a database server in Oklahoma, but it is queried by clients all over the world. A user in Tokyo creates a transaction that includes a date, and writes it to the database. A user in New York later reads that transaction and needs to know the time it occurred, in the New York time zone. How would you store and manipulate date/time in the database to ensure this could be done? Assume there are no stored procedures, and client code needs to determine the time.

    Hint: It may be useful to know the time on the SQL Server.

    Just use UTC and be done with it. Solutions need not be overly complex.

  • Dr. Whoski (unregistered)

    In Soviet Russia, time synchronises YOU!

  • fennec (cs)
    There's even a header file called complex.h, if you need proof.
    That's cute.

    On a related note, the WTFs here are complex: they have both real and imaginary parts...

  • JMC (unregistered)

    are you insane?? most of the programs you use on a day-to-day basis are written in C...

  • Winter (unregistered) in reply to wade b
    wade b:
    I don't know Perl so I can't comment if that code is written correctly.

    But I always allow the database server to generate date/time values.

    I don't really have to explain to you all why that is a good idea, do I?

    Dave mentionned one very valid reason.

    The other is that we often don't have control over the client workstations accessing the database; therefore synchronizing their clocks with NTP is not an option.

    Unless this code is badly written, the concept is extremely valid and is actually a "best practice" as far as I'm concerned.

    Go ahead and try to convince me otherwise - I'll still do things this way.

    Date/times are extremely important in transactional environments so you'd best get those values from a known stable clock.

    Why fool with sync'ing many client clocks (which can still go wrong if user has hosed the NTP service or if the client network is acting up.)

    Show me the WTF please?

    I vote we make this whole post the new WTF.

    Especially this bit:

    wade b:
    Go ahead and try to convince me otherwise - I'll still do things this way.

    That is how a lot of these WTFs come to be.

  • Ghost Ware Wizard (cs)

    What a wish! It won't happen, though, as C is a foundational language for introducing noobs to programming.

    If you want them to start thinking correctly about programming, you have to start somewhere, and C is perfect.

  • John (unregistered)

    Right, he was explaining that that is the easiest way to do something stupid.

  • THC (unregistered)

    He needs a large table and he could even move the bad old '+' into good SQL .. "SELECT SUM(id) FROM tbl WHERE id='1' OR '3' "

  • Milkshake (unregistered) in reply to Ghost Ware Wizard
    Ghost Ware Wizard:
    What a wish! It won't happen, though, as C is a foundational language for introducing noobs to programming.

    If you want them to start thinking correctly about programming, you have to start somewhere, and C is perfect.

    Besides, there's no time to replace it. We're all too busy eating babies to control the overpopulation problem.

  • PS (unregistered) in reply to Ghost Ware Wizard
    Ghost Ware Wizard:
    What a wish! It won't happen, though, as C is a foundational language for introducing noobs to programming.

    If you want them to start thinking correctly about programming, you have to start somewhere, and C is perfect.

    Exactly! That's why unis are switching to Java :p

  • (unregistered)

    Not a WTF at all!! Reading the time from the server is a best practice for me, if we are talking about client-server code!!!!

  • real_aardvark (unregistered) in reply to wade b
    Comment held for moderation.
  • Animator (unregistered) in reply to wade b

    The thing is, this isn't about transactions. If you need a stable clock then you should look up the time in the very query where you use it.

    That is, instead of first selecting it into a variable in your own code and then interpolating it into the query you actually need it in.

    That is, not insert into some_table (some_field) values ($some_time) but insert into some_table (some_field) values (getdate()) (or whatever you want.)

    If it is just for displaying/setting defaults on a form/... Then querying the database is just plain silly and creates extra, unneeded overhead. Use the local clock.
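Animator's suggestion can be sketched with Python's built-in sqlite3 (SQLite spells the server-side clock `CURRENT_TIMESTAMP` rather than `getdate()`, and the table here is invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE audit_log (event TEXT, logged_at TEXT)")

# Let the database stamp the row itself: no round trip to fetch the time
# first, and no chance of a stale or skewed client-side value.
conn.execute(
    "INSERT INTO audit_log (event, logged_at) VALUES (?, CURRENT_TIMESTAMP)",
    ("user_login",),
)

event, logged_at = conn.execute(
    "SELECT event, logged_at FROM audit_log"
).fetchone()
print(event, logged_at)  # e.g. user_login 2007-04-18 12:34:56
```

The client never handles the timestamp at all, which is exactly the property the "get the time from the server" camp is arguing for.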

  • Falkan (unregistered)

    The last great work written in C is Schubert's 8th Symphony. ;-)

  • wade b (unregistered) in reply to Winter
    I vote we make this whole post the new WTF.

    Especially this bit:

    What have you added to the discussion?

    I clearly stated I do not know Perl. Other posters have pointed out how badly implemented the Perl code is.

    That is not what I was talking about.

    I merely said that if you want a good, stable clock source you get it from the server.

    And yes, you don't want to create a round-trip just to get the time.

    That is elementary.

    UTC or not, if you need accurate, stable time you get it from a clock source that YOU control.

    Capiche?

  • wade b (unregistered) in reply to real_aardvark
    The original poster (whose name you've chopped, or possibly chomped) is correct. If you are going to do this sort of thing, then store a UTC value (or equivalent, if you need dates before 1970 or after 2038). It is up to the user interface part of the code to convert this to the correct time-zone, locale, format, etc.

    I don't disagree with you at all. Again, I don't know Perl.

    I had no idea the guy was storing a date/time as a string in the DB.

    That's just horrible and violates domain integrity in the same way as storing an integer value in a varchar does.

    My post was aimed more at the people who were making comments about acquiring the time from the client.

    In that context, I stand by what I have said.

  • Elephant (unregistered)

    In reply to the "just use UTC" guys: that works part of the time. A problem situation is an appointment scheduled for the future. You schedule an appointment for October 10th, 9:00 AM, six months in advance. In July, your government decides that this year, daylight savings time starts on October 1st rather than October 15th as it has been.

    If you stored it in UTC, your appointment just got rescheduled.
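Elephant's point deserves a concrete illustration: a future appointment should be stored as wall-clock time plus a zone name, and resolved to an instant only when needed, so a later change to the zone's DST rules moves the instant rather than the appointment. A hypothetical Python sketch (the zone is chosen for illustration):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Store what the user actually meant -- a wall-clock time plus a zone name --
# rather than a pre-computed UTC instant.
appointment = {"wall_time": "2007-10-10T09:00", "zone": "Europe/Amsterdam"}

def resolve(appt):
    # Resolve to a concrete instant using whatever DST rules are in force
    # at resolution time. If the rules change, re-resolving yields the
    # corrected instant while the 9:00 wall time stays put.
    naive = datetime.fromisoformat(appt["wall_time"])
    return naive.replace(tzinfo=ZoneInfo(appt["zone"]))

when = resolve(appointment)
print(when.utcoffset())  # → 2:00:00 (CEST under the 2007 rules)
```

Had the UTC instant been stored instead, a rule change between scheduling and October would silently shift the appointment's wall-clock time, which is exactly the failure Elephant describes.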

  • Still coding in C++ (unregistered) in reply to AnonY
    AnonY:
    Has anyone tried to run

    date +%Ym%d
    from the cmd line (assuming that you have a system with the command on it)? Well on my mac it gives me 2007m18. m is always my favorite month of the year :)

    I also get 2007m18 when running date +%Ym%d in MinGW.

  • Dave (not that one) (unregistered)

    UTC or not, if you need accurate, stable time you get it from a clock source that YOU control.

    Those of you advocating UTC are correct that it is the same value all over the world right now. As Wade says, though, how do you have any confidence in the client's time and its relation to the server's time? Even if the server's time is off by a bit, it will be consistently off rather than randomly and unpredictably off as you'll get with client times.

    By getting the server time, the client can use it as a benchmark to adjust the times it reports to the server. Obviously, if the time represents the time of the transaction the best thing to do is let the server insert the time--assuming your server can do that. Depending on the situation, the code in the original post may have been the best solution to the clock problem.

  • not so sure (unregistered) in reply to Pablo

    Time to defecate...

  • devnull (unregistered) in reply to Still coding in C++

    I think you wanted "date +%Y%m%d" instead.
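The bug generalizes beyond the shell: any strftime-style formatter passes an unescaped character through literally. AnonY's "2007m18" can be reproduced in Python with a pinned date, making the result deterministic:

```python
from datetime import date

d = date(2007, 4, 18)

# The typo: without a % before the m, strftime emits a literal 'm'.
print(d.strftime("%Ym%d"))   # → 2007m18

# What was intended:
print(d.strftime("%Y%m%d"))  # → 20070418
```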

  • steve (unregistered)

    The real WTF is the quality of comments on this post and that I read them all.

    There is a lot of context missing in this example - and I agree with Wade that, for the purpose of a trusted clock, getting the date from the server is the right way to proceed.

    A better implementation, though, would be to embed the getdate() function into all your stored procedures so that they automatically handle all your date inserting/updating. This way you will never need to bring dates into the Perl/C/Java world except for display purposes, and it won't lead to the confusion surrounding this post.

  • cklam (cs) in reply to Ryan
    Ryan:
    This actually makes some sense if the application can't rely on the application server and database server running with synchronized clocks (or even in the same time zone). The database clock is the "master" clock of the appliction. It's a poor man's application-level NTP.

    It would, of course, be nice to require that all machines have UTC-synchronized clocks and have their time zones properly set. If that's not possible, it would make sense to get the time once an hour from the DB server and cache an offset to the local clock (in milliseconds) to calculate timespamps locally.

    I've seen a lot of shops where the time synchronization was all over the place (read: nonexistent). It became better with win2k3/xp/2k environments, but then they often forgot the Unix/Linux servers .... where the databases were running. So, IMHO, the second code snippet makes sense in a weird way ...

  • Sleepy Programmer (unregistered)

    I once worked with a high-priced Oracle consultant. He was actually really good, but had a "hands off" approach (meaning that he never wanted to touch a keyboard; he would dictate typing during the initial phase and then, once he gauged that the developer knew what they were doing, would change to a more task-oriented approach).

    During my initial time spent with him, he asked me to open SQLPlus and type "SELECT (500 / 15) * 100 FROM dual" - or some such numbers, basically using Oracle as a calculator. Besides just being able to do the math in my head, I commented that I had a calculator, or I could just use Perl. He wasn't so amused.

    Captcha: waffles (mmmmm. waffles)

  • betaray (unregistered) in reply to Elephant
    Comment held for moderation.

Leave a comment on “Time to Deprecate”
