• Zygo (unregistered) in reply to wade b

    Having a single time source in a distributed application is a good thing; however, using the 'date' command and a database server's clock interchangeably does not constitute using a single time source (unless the DB server is running on the same machine as the perl script, in which case it just looks like the coder is clueless. Well, more clueless...).

    Another wacky thing is that the code shown retrieves dates formatted as text strings, not times. I wonder if the application breaks in subtle ways for a few hours near midnight each day?

  • X (unregistered)

    This is the most non-wtf wtf I've seen on here. The date of the database is most certainly not guaranteed to be the same as the box the script is running on. Granted, the code could have been better written, but it's hardly an unusual technique.

  • shellboy (unregistered) in reply to Still coding in C++

    Try:

    date +%Y%m%d

    It was probably a typo that the percent before "m" was forgotten. Did any of you try a 'man date' to see what the issue might be?
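
    For reference, a minimal sketch of the corrected backtick version in Perl, assuming a Unix date(1) on the PATH:

    chomp( my $NOW = `date +%Y%m%d` );   # e.g. "20070416"; chomp strips the trailing newline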

  • server-side (unregistered) in reply to

    Right, and you, as an expert, have seen tons of perl on client-server applications. And btw, have you ever heard about UTC?

  • Mogri (unregistered)

    I could be wrong, but it seems like the WTF is that he is using both pieces of code (perhaps interchangeably). If he is doing each of them in context on purpose, this isn't a WTF. Needs more explanation.

    In conclusion, Wade's one-sentence-per-line-speak hurts my head.

  • RIP VAN WINKLE (unregistered)

    I think it would run faster than the pages on this site... yikes!!

  • jrockway (unregistered) in reply to Glenn Lasher

    Class::DBI is junk. Try DBIx::Class instead.

  • jrockway (unregistered)

    $dbh = new Sybase::DBlib "$user", "$password", "$server";

    The real WTF is that they used the indirect method call syntax.

  • (cs)

    One of the easiest ways to figure out now is to just use chop($NOW=date +%\Y\m%\d)

    Errr ... one of the easiest ways is to use strftime

    #!/usr/bin/perl
    use strict;
    use warnings;
    use POSIX qw( strftime );
    my $str = strftime( "%Y%m%d", localtime() );
    print $str, "\n";
    
  • (cs) in reply to AnonY

    try date +%y%m%d - '%' in front of 'm'. It may work as expected. Then again, the example quoted works just as you would expect, if read correctly. My computer does what I tell it, not what I want it to do!

  • Laie Techie (unregistered) in reply to Falkan
    Falkan:
    The last great work written in C is Schubert's 8th Symphony. ;-)

    I thought it was written in B#, but my MIDI software can transpose it to other keys.

  • (cs) in reply to wade b
    wade b:
    In a transaction environment, you ALWAYS want to get the time from the server, from a clock that you maintain and control.

    This solution is nice, easy, predictable, and controllable.

    Why make things tougher than they have to be?

    No way, pulling the time off the server for client-side calculations is a complete WTF. If you're just displaying it, fine. But really, do you want to make a complete DB round-trip just to display the server time?!?

    In a truly time-sensitive environment you never do any time calculations on the client at all. You can never account for the network latency accurately. -Me

  • (cs) in reply to Elephant
    Elephant:
    In reply to the "just use UTC" guys: That works part of the time. A problem situation is with appointments scheduled to the future. You schedule an appointment to October 10th, 9:00AM, six months in advance. In July, your government decides that this year, daylight savings time starts on October 1st rather than October 15th as it has been.

    If you stored it in UTC, your appointment just got rescheduled.

    Bwahahaha! Bwahahahaha!

    What, "that works part of the time?" Which part? The middle two nibbles, perchance?

    I was going to make a sarky comment about the Julian calendar (Muscovite Old Believers, anyone?), the French Revolutionary calendar, the potential of any government (cf the early American debate about whether or not to adopt German as the national language) to really screw things up by changing your terms of reference ... and wouldn't that be a huge WTF for all those happy folk abusing regexps in Perl out there?

    But I'm not going to.

    If you have a government like that, then missing an appointment by an hour is the least of your problems.

    Don't kill this one yet, Alex: I want to study its movements ...
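
    For anyone who wants to see the scheduling point made concrete, a minimal sketch assuming the CPAN DateTime module (the zone and wall-clock values below are made up): store the agreed local time plus its Olson zone name, and resolve it to a UTC instant only when you actually need one, so whatever tz rules are in force at that moment get applied.

    use DateTime;
    # illustrative values: what the user agreed to, in their own zone
    my $zone = 'Australia/Perth';
    my @when = ( 2007, 10, 10, 9, 0 );             # year, month, day, hour, minute
    # resolve to an instant only when needed, under the rules in force *then*
    my $dt = DateTime->new(
        year => $when[0], month => $when[1], day => $when[2],
        hour => $when[3], minute => $when[4], time_zone => $zone,
    );
    print $dt->epoch, "\n";                        # seconds since the epoch, i.e. UTC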

  • (cs) in reply to shellboy
    shellboy:
    Try:

    date +%Y%m%d

    It was probably a typo that the percent before "m" was forgotten. Did any of you try a 'man date' to see what the issue might be?

    Some did, no doubt; some didn't. I have to admit that the typo caused me to completely misconstrue the intent of the perl one-liner, which all of a sudden looked like a regexp with '%' substituted for '/'. That's part of the problem with TMTOWTDI.

    No-one so far, unless I've skipped a bit, seems to have noticed the other WTF on this line, which is that it only prints out the day (again, in locale-specific format). It is not a timestamp.

    It seems to me that this renders the (exceptionally silly and indefensible) argument about retrieving a timestamp in charvar format from the (presumably not replicated) server somewhat moot. Unless, of course, your database server is running on a particularly under-powered Commodore 64.
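
    (For what it's worth, a full client-side timestamp is hardly any harder than the day alone; a sketch using the same POSIX strftime quoted earlier in the thread:)

    use POSIX qw( strftime );
    my $stamp = strftime( "%Y%m%d%H%M%S", localtime() );   # date and time, no server round-trip needed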

  • WhiskeyPete (unregistered)

    The real wtf is that he's using perl! (ducks)

    Seriously, let's all use a language with as many control characters and cryptic, non-standard keywords as possible, and rely heavily on regular expressions to do... EVERYTHING!! It will be GREAT!! And it's sooooo fast...

    captcha: sanitarium (welcome home? indeed.)

  • A Non-Mouse (unregistered) in reply to White Echo
    White Echo:
    The worst part is I am sure the programmer is proud of himself.

    Unfortunately there are many, many bad programmers like that out there, and their employers do not always realize they suck.

    There was a fascinating psych study done a few years back called 'Unskilled and Unaware of It' which drew the conclusion that complete idiots do not even possess the skill necessary to realize that they are idiots. You really need to learn a little first to see what a moron you are.

  • Franz Kafka (unregistered) in reply to Dave (not that one)
    Dave (not that one):
    Let's say you have a database server in Oklahoma, but it is queried by clients all over the world. A user in Tokyo creates a transaction that includes a date, and writes it to the database. A user in New York later reads that transaction and needs to know the time it occurred, in the New York time zone. How would you store and manipulate date/time in the database to ensure this could be done? Assume there are no stored procedures, and client code needs to determine the time.

    Hint: It may be useful to know the time on the SQL Server.

    Solution 1: use GMT. This assumes that the Tokyo user has his timezone set up right.

    Solution 2: if it's a datestamp, use server side triggers to set the field to sysdate or whatever.
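
    A minimal client-side sketch of Solution 1, assuming a platform where the TZ environment variable is honoured (Win32 handles TZ differently): store epoch seconds, and only localise when rendering for the reader.

    use POSIX qw( strftime tzset );
    my $epoch = time();                            # store this; epoch seconds are UTC by definition
    $ENV{TZ} = 'America/New_York';                 # the reading user's zone
    tzset();
    print strftime( "%Y-%m-%d %H:%M:%S %Z", localtime($epoch) ), "\n";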

  • (cs) in reply to Laie Techie
    Laie Techie:
    Falkan:
    The last great work written in C is Schubert's 8th Symphony. ;-)

    I thought it was written in B#, but my MIDI software can transpose it to other keys.

    At last: two sane, interesting and presumably intelligent people in this thread.

    Strictly speaking, Schubert's 8th symphony was not "written in C;" it was unfinished. As a previous poster has so cogently remarked, this means that its completion date lies in the future, and who knows what key transposition Frankie would have made? I'm sure we can all imagine "Schubert's 8th Symphony in UTC..." However, if the second movement relies on a database server for its time signature, it's going to be one mother of a slow movement.

    Falkan was, I believe, thinking of the Great, not the Unfinished -- ie number 9. Just to complicate matters, number 7 was originally a sketch, occasionally causing number 8 to be referred to as number 7, and number 9 as number 8. In which case Falkan would be absolutely correct. Even more complicatedly, number 9 is sometimes referred to as number 7, which would of course make number 8, number 8 again. In which case Falkan would, sadly, be incorrect.

    Further details are available from those noted musicologists and baseball fans, Abbott and Costello.

    ... and I hope that Laie T puts the transposition of the "Unfinished" into C on the web somewhere. I would really love to hear such a monstrosity. I'll bet it's a bugger to play for the woodwind.

    A propos nothing, and entirely self-contained: there's a reason that not many symphonies were written in C. They tend to be exceptionally boring. For reference, you may wish to sample Wagner's one and only symphony. I say "sample," because if you listen to it all the way through, you'll be brain-dead before the final thirteen thuds of the tutti are halfway complete.

  • (cs) in reply to A Non-Mouse
    A Non-Mouse:
    White Echo:
    The worst part is I am sure the programmer is proud of himself.

    Unfortunately there are many, many bad programmers like that out there, and their employers do not always realize they suck.

    There was a fascinating psych study done a few years back called 'Unskilled and Unaware of It' which drew the conclusion that complete idiots do not even possess the skill necessary to realize that they are idiots. You really need to learn a little first to see what a moron you are.

    Go on.

    There's me and about a quintillion other WTF readers who want to know the URL of this study, or at least some way to find it.

    It sort of encapsulates 95% of the WTFs on the site... and a depressingly large number of the comments (including mine).

  • THC (unregistered) in reply to real_aardvark

    so, did you figure out that google+'Unskilled and Unaware of It' = Win yet?

  • WTF Batman (unregistered) in reply to zbigg
    zbigg:
    I bet he had problems with timezones. The DB was set to one timezone, but his script ran in another, and to do some calculations he had to synchronize.

    It's a WTF ... but it's really very hard to know three things:

    • what a timezone is
    • how to change/use a timezone
    • what timezone I should use

    ;).

    Mix in Daylight Saving Time, and how different timezones change on different dates, chill, and serve. Makes 4 portions.

    Captcha: dubya (WTF?)

  • (cs) in reply to real_aardvark
    real_aardvark:
    There's me and about a quintillion other WTF readers who want to know the URL of this study, or at least some way to find it.

    It's called Google, people; it will suck the information right out of the internet tubes and pump it to your own computer. Like Magic!

    Geez, he even gave you the name of the article.... -Me

  • (cs) in reply to A Non-Mouse
    A Non-Mouse:
    There was a fascinating psych study done a few years back called 'Unskilled and Unaware of It' which drew the conclusion that complete idiots do not even possess the skill necessary to realize that they are idiots. You really need to learn a little first to see what a moron you are.

    There's a corollary that states the more you know about a given subject, the more you realize your own limits of knowledge, and how much more there is to know about the subject.

    So it can be fairly accurate to say anyone who rates themselves as an "expert" in a given subject is probably a complete idiot/novice, but someone who rates themselves as being just-below-expert is probably a guru-level master....

    -Me

  • anonny (unregistered) in reply to PS
    PS:
    Exactly! That's why unis are switching to Java :p
    What, precisely, do you think java was built with? What do you imagine the latest greatest next-generation language is going to be written in when people become bored with java? C has a peculiar combination of powerful, portable, and freestanding that's very hard to beat.
  • oncogenesis (unregistered) in reply to Animator
    Animator:
    The real WTF is using chop instead of chomp!

    If you assume a normal Unix/Linux environment, then chop is correct. The result of date(1) will end in a newline. chop will definitely, er, chop it off, whereas chomp may chop it off depending on the value of $/ (the input record separator).

    Isn't Perl kewl?!
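
    A three-line sketch of the difference, assuming date(1) and an unmodified $/:

    my $raw = `date +%Y%m%d`;          # e.g. "20070416\n"
    chomp( my $a = $raw );             # strips the trailing newline only if it matches $/
    chop(  my $b = $raw );             # strips the last character, newline or not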

  • verisimilidude (unregistered) in reply to Falkan
    Falkan:
    The last great work written in C is Schubert's 8th Symphony. ;-)

    And has there ever been a great work written in C#?

  • REy (unregistered) in reply to real_aardvark

    I'd like to take a moment to point out that the term UTC refers to a time string, what we like to call "wall clock time", and not a seconds count. Though it is true that you may represent UTC as a seconds count, there are additional decisions that must be made before converting a UTC time into a seconds count, chief among them selecting your epoch start date/time. It is common among programmers to use the UNIX epoch, 1 Jan 1970 00:00:00:000, as the epoch start date/time, since this allows us to easily convert UTC into a seconds count using the ctime library. It is helpful to remember that by converting the string using the ctime library we are returning a seconds count that yields the desired UTC "time string" when converting back to a string using these same library calls; this seconds count is not actually Unix time, as Unix time is actually some leap seconds ahead of UTC (if I remember correctly, 25 leap seconds for all times after 1 Jan 2005 00:00:00:000).
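
    A sketch of the string-to-seconds conversion described above, assuming the core Time::Local module (which, like the rest of the Unix epoch machinery, ignores leap seconds):

    use Time::Local qw( timegm );
    # 16 Apr 2007 12:00:00 UTC -> seconds since 1 Jan 1970 00:00:00 UTC
    my $secs = timegm( 0, 0, 12, 16, 3, 2007 );    # sec, min, hour, mday, month (0-based), year
    my @utc  = gmtime($secs);                      # round-trips to the same UTC wall-clock fields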

  • sf (unregistered) in reply to real_aardvark
    real_aardvark:
    Laie Techie:
    Falkan:
    The last great work written in C is Schubert's 8th Symphony. ;-)

    I thought it was written in B#, but my MIDI software can transpose it to other keys.

    At last: two sane, interesting and presumably intelligent people in this thread.

    Wow! You must be in MENSA.
  • (cs) in reply to oncogenesis
    oncogenesis:
    Animator:
    The real WTF is using chop instead of chomp!

    If you assume a normal Unix/Linux environment, then chop is correct. The result of date(1) will end in a newline. chop will definitely, er, chop it off, whereas chomp may chop it off depending on the value of $/ (the input record separator).

    Isn't Perl kewl?!

    Perl is kewl. Very kewl. You can make all sorts of beautiful, elegant, wonderful, readable code in Perl. You can also make an incoherent unmaintainable mess. Of course, you can do that in any language, but Perl's messes are usually messier.

    No one in their right mind sets $/ to anything but its default value for anything more than a six-line script. In other news, using the default variable ($_) for much of anything other than a one-line loop/map/etc is also a bad idea.

    Of course... there are fewer people in their right mind than there ought to be. Otherwise, this site wouldn't be here.

  • (cs) in reply to verisimilidude
    verisimilidude:
    Falkan:
    The last great work written in C is Schubert's 8th Symphony. ;-)

    And has there ever been a great work written in C#?

    Well, since you ask, yes. Several. I would personally recommend Hans Pfitzner's Symphony in C# minor. (Since re-released under the Ballmer label as "Hans Pfitzner's Symphony in C# SO MAJOR YOU WOULDN'T BELIEVE IT!!! I MEAN, THIS GIVES A WHOLE NEW MEANING TO THE PENTATEUCHIC SCALE!" soon to be in a vista version near you. in 2.5 d.)

    Or you could try Shostakovich's Symphony 15. Let's face it, he'd programmed 14 great solutions before, but project managers from Stalin on downwards decided that the other key signatures wouldn't work. Gotta be good.

    (Actually, it is. I recommend it.)

    Other than that, a fair amount of Chopin, and the odd bit by Rachmaninov: no, you're right. There isn't much.

    Possibly this is the first and only time that Microsoft have got something wrong.

  • rgz (unregistered) in reply to danixdefcon5
    danixdefcon5:
    Most of it would be WTF-worthy... like the non-LIFO "stack" class.
    I introduce here a new adjective:

    WTF-worthy = WorthTF

  • (cs) in reply to n
    n:
    this reminds me of a story my boss likes to tell me occasionally. (each time I pretend not to have heard it before.) ...
    You had better own up to hearing it occasionally, or he may think you're senile.
  • (cs) in reply to its me
    its me:
    real_aardvark:
    There's me and about a quintillion other WTF readers who want to know the URL of this study, or at least some way to find it.

    It's called Google, people; it will suck the information right out of the internet tubes and pump it to your own computer. Like Magic!

    Geez, he even gave you the name of the article.... -Me

    Gosh, thanks, "its me." I have never heard of this strange "google" device. Does one buy it from a specialist shop? I look forward to asking it all sorts of important questions, such as "Does God exist?" and "Is He going to play the Hollywood Bowl any time soon?" and "If you're going to give yourself a moniker like 'its me', should you or should you not use an apostrophe?".

    The last of these is easy. I don't even have to telephone God, which is just as well, because I understand that his pay-as-you-go mobile has just run out of money.

    Listen, dimwit: there is no "it" to which you belong.

    And yes, I do google. Constantly. This particular time, I felt like having a light-hearted conversation, instead.

    I am beginning to dread what the future will look like.

    Oh, and for "its" and all the other cretins out there, you're probably not looking for Google; you're looking for the Wikipedia. Authoritative answers to every single question you could possibly ask. Up to and including the ones to do with God (any flavour: I'll take triple vanilla, with a Cadbury flake).

    Just make sure you wear tin foil on your head whilst you google. Well, it couldn't hurt.

  • (cs) in reply to wade b
    wade b:
    I vote we make this whole post the new WTF.

    Especially this bit:

    what have you added to the discussion?

    I clearly stated I do not know Perl. Other posters have pointed out how badly implemented the Perl code is.

    That is not what I was talking about.

    I merely said that if you want a good, stable clock source you get it from the server.

    And yes, you don't want to create a round-trip just to get the time.

    That is elementary.

    UTC or not, If you need accurate, stable time you get it from a clock source that YOU control.

    Capiche?

    You said that one needs a single authoritative source of time (whether it's a "server" or not is immaterial). You replied to someone who said you should use a single time zone, and offset time only when output. I saw no mention of using multiple sources of time, where did you? As far as I can tell, your flame is totally orthogonal to the comment you flamed.

  • (cs) in reply to anonny
    anonny:
    PS:
    Exactly! That's why unis are switching to Java :p
    What, precisely, do you think java was built with? What do you imagine the latest greatest next-generation language is going to be written in when people become bored with java? C has a peculiar combination of powerful, portable, and freestanding that's very hard to beat.
    Java is written in, er, Java. The standard library of the language is written in the language itself. The standard compiler is written in Java. The bytecode interpreter itself is typically written in C++ (sometimes C or assembly depending on the needs of the platform), yes, but that's not the Java language at all. The language itself requires absolutely no knowledge of the working of the underlying C++ engine. (Unless you're trying to make it do some of the things it was specifically designed not to do.)

    By your reasoning, anything that runs on x86 is written in x86 machine code, because that's what it runs as.

  • (cs) in reply to sf
    sf:
    real_aardvark:
    Laie Techie:
    Falkan:
    The last great work written in C is Schubert's 8th Symphony. ;-)

    I thought it was written in B#, but my MIDI software can transpose it to other keys.

    At last: two sane, interesting and presumably intelligent people in this thread.

    Wow! You must be in MENSA.
    No, but you obviously are.

    Count the bananas one more time.

  • (cs) in reply to real_aardvark
    real_aardvark:
    its me:
    real_aardvark:
    There's me and about a quintillion other WTF readers who want to know the URL of this study, or at least some way to find it.

    It's called Google, people; it will suck the information right out of the internet tubes and pump it to your own computer. Like Magic!

    Geez, he even gave you the name of the article.... -Me

    Gosh, thanks, "its me." I have never heard of this strange "google" device. Does one buy it from a specialist shop? I look forward to asking it all sorts of important questions, such as "Does God exist?" and "Is He going to play the Hollywood Bowl any time soon?" and "If you're going to give yourself a moniker like 'its me', should you or should you not use an apostrophe?".

    The last of these is easy. I don't even have to telephone God, which is just as well, because I understand that his pay-as-you-go mobile has just run out of money.

    Listen, dimwit: there is no "it" to which you belong.

    And yes, I do google. Constantly. This particular time, I felt like having a light-hearted conversation, instead.

    I am beginning to dread what the future will look like.

    Oh, and for "its" and all the other cretins out there, you're probably not looking for Google; you're looking for the Wikipedia. Authoritative answers to every single question you could possibly ask. Up to and including the ones to do with God (any flavour: I'll take triple vanilla, with a Cadbury flake).

    Just make sure you wear tin foil on your head whilst you google. Well, it couldn't hurt.

    That's funny. its a joke, thank you for getting it. Also the software doesn't allow the "'" character in names....

    And while you may Google all the time, some other cretins here obviously don't....

    Wikipedia's fine, but Google is right in my toolbar, vastly easier.... and it will find anything in Wikipedia or anywhere else on the net at the same time.

    I never take off my tinfoil hat, otherwise the Government will take over my thoughts! I'll start thinking horrible things like Bush actually "won" the last two Presidential elections....

    Oh, god is that a hole in my hat? -Me

  • (cs) in reply to foxyshadis
    foxyshadis:
    anonny:
    PS:
    Exactly! That's why unis are switching to Java :p
    What, precisely, do you think java was built with? What do you imagine the latest greatest next-generation language is going to be written in when people become bored with java? C has a peculiar combination of powerful, portable, and freestanding that's very hard to beat.
    Java is written in, er, Java. The standard library of the language is written in the language itself. The standard compiler is written in Java. The bytecode interpreter itself is typically written in C++ (sometimes C or assembly depending on the needs of the platform), yes, but that's not the Java language at all. The language itself requires absolutely no knowledge of the working of the underlying C++ engine. (Unless you're trying to make it do some of the things it was specifically designed not to do.)

    By your reasoning, anything that runs on x86 is written in x86 machine code, because that's what it runs as.

    I am beginning to think that the daily WTF is a sinister plan to justify out-sourcing. Sure, you get unworkable crap. What did you expect? But at least you get unworkable crap at 20% of the cost ...

    Look. Think. Read (and yes, I know this is difficult for the average techie). The original post, at which you cavil, has a perfectly reasonable point. Moreover, not one which suggests that there is anything wrong with Java. (Sigh.)

    I'm going to have to give up my PowerBook, aren't I? I do so wish it ran on x86, but (until recently) it doesn't. Gotta stop working with Solaris, as well: SPARC ain't i86.

    Java is not, and was never, written in Java. (Although a lot of the crappy libraries have been.) The JVM is written in whatever comes to hand and is most natural. For all I know, if you run it on an IBM mainframe, it's written in Cobol.

    Why the hell would any of us care?

    Yes, you can ignore the underlying C, C++, or whatever. But it's still there. Someone has to write it, debug it, and test it. Rather difficult to do if you only understand Java, I would have thought.

    One more thing: as Annony points out --

    What do you imagine the latest greatest next-generation language is going to be written in when people become bored with java

    Sadly, I don't think it's ever going to happen. However, if it did, the Silver Bullet language creators have two choices:

    (1) Front-end it to C, so it's portable.
    (2) Replicate everything that the Gnu C++ folk (or equivalent) have done in order to support various platforms.

    Personally, I'd prefer (1). A bunch of other loonies have already done (2), so I will gratefully build on their efforts.

    However, good luck with the Java thing. Let me know when Sun stops supporting it (because, frankly, it was a stupid and non-commercial idea, and Microsoft are going to bite their nuts off).

    ... and, when that happens, remember. Kernighan and Ritchie. Not the world's greatest book, but, when you're on skid row, any hand looks like a helping hand.

  • (cs) in reply to its me
    its me:
    That's funny. its a joke, thank you for getting it. Also the software doesn't allow the "'" character in names....

    And while you may Google all the time, some other cretins here obviously don't....

    Wikipedia's fine, but Google is right in my toolbar, vastly easier.... and it will find anything in Wikipedia or anywhere else on the net at the same time.

    I never take off my tinfoil hat, otherwise the Government will take over my thoughts! I'll start thinking horrible things like Bush actually "won" the last two Presidential elections....

    Oh, god is that a hole in my hat? -Me

    Holes in the hat are quite the style this season in Sweden. And you're obviously buying the wrong sort of tin-foil. May I recommend the Fair Trade Bolivian Supremo Panama Tin Foil hat?

    Had you worn such a hat, of course, you would know that Bush did indeed win the 2004 election.

    (I'm sorry, I just had to go to the bathroom back there. Memo to self: buy a big paper roll and strong detergent.)

    The one in 2000 is questionable, though. As are elections at all times. I mean, McGovern in 1972? Against Nixon? Lost his home state of South Dakota, and only won Massachusetts because, well, those people are at least half-way sane.

    Unlike Gore. Hell, yes. Do you, or do you not want the most popular Democratic President since Kennedy to campaign for you in Florida, Pennsylvania and Missouri? And nowhere else?

    Difficult question, that.

    It amazes me that anyone takes Al seriously.

    Ummm... what was the question? I don't need this job, really. Actually, I just want to be a DB Admin writing perl wrappers around T-SQL scripts. I can hum Schubert's 10th, you know.

    (Slightly off-tune, but that just makes it sound more like Java than C.)

  • SuzieQ (unregistered) in reply to real_aardvark
    real_aardvark:
    If you have a government like that, then missing an appointment by an hour is the least of your problems. ...

    The West Australian Government has been holding out against Daylight Saving for many years. Then, with 2 weeks warning, it decided that it was going to have a Daylight Saving 'trial' this year. What's more, they decided this several weeks after all the other Australian states that have daylight saving (1 state and 1 territory don't) had already changed over.

    Yep - there really are Governments like that!

    Captcha: craaazy - how true

  • worthawholebean (unregistered)

    The REAL WTF is that complex.h is a file that defines complex numbers... perfectly logical...

    I'm assuming that was a joke.

    captcha: smile?!?!?

  • regeya (unregistered) in reply to real_aardvark
    real_aardvark:
    the potential of any government (cf the early American debate about whether or not to adopt German as the national language) to really screw things up by changing your terms of reference

    http://www.snopes.com/language/apocryph/german.htm

    and minus a million points from the Strunk & White for using 'snarky' and 'perchance'. Archaic words are a linguistic WTF.

  • brian (unregistered)

    OP's

    chop($NOW=date +%\Y\m%\d)
    won't work outside of the Unix world. Perl is multi-platform. The POSIX-using comment with strftime will work on Win32 as well as Unix.

    Something to consider when you use a popular language like Perl. You can't assume that the OS of the server you get next year will be the same as the OS on the server you have now.
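
    (Another portable sketch, assuming a Perl new enough to ship Time::Piece in core:)

    use Time::Piece;                               # core in recent Perls; also on CPAN
    my $NOW = localtime()->strftime('%Y%m%d');     # no external date(1), so Win32 works too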

  • (cs) in reply to C User
    C User:
    Puh-leaze. Get rid of C? Are you insane? Go back to Visual Basic, where life is simple, straightforward, and utterly inane and boring!

    The header file you mention--complex.h--describes complex numbers and not that C is complex. Although I suppose for the small of mind and intelligence C is complex.

    Puh-leaze. Get a sense of humor. Although I suppose for the small of mind and intelligence this humor might have been difficult to understand.

  • Weave Jester (unregistered) in reply to wade b

    wade b: But I always allow the database server to generate date/time values.

    It is good practice to use the database to record time. However, as I'm sure you know, you'd have the server set the time on incoming data; there's no need to download the time from the database to the client.
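
    A minimal sketch of that, assuming DBD::Sybase and a hypothetical orders table with a created_at column; the server stamps the row and the client clock never enters into it.

    use DBI;
    # $server, $user, $password and $id are assumed to be set elsewhere
    my $dbh = DBI->connect( "dbi:Sybase:server=$server", $user, $password );
    $dbh->do( "INSERT INTO orders (id, created_at) VALUES (?, getdate())", undef, $id );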

    wade b: The other is that we often don't have control over the client workstations accessing the database; therefore synchronizing their clocks with NTP is not an option.

    Why would you need to do that? Even if, for some reason, the client needed to know the server's time (which it shouldn't), you'd query the NTP server directly via the client program. You don't need to alter the system clock just to be able to read the server's time.

    In any case, why are client workstations that you don't have control over directly accessing the database?

    wade b: Go ahead and try to convince me otherwise - I'll still do things this way.

    I'm sure you meant to add, "Unless people demonstrate a better way of doing things."

  • Jeff (unregistered) in reply to C User
    C User:
    Puh-leaze. Get rid of C? Are you insane? Snipped for Brevity Although I suppose for the small of mind and intelligence C is complex.
    Wow. A bonus WTF in the comments.
  • JL (unregistered) in reply to THC
    THC:
    He needs a large table and he could even move the bad old '+' into good SQL .. "SELECT SUM(id) FROM tbl WHERE id='1' OR '3' "
    Funnily enough, you might actually want to move "+" into SQL, though not in the method you describe.

    As many have pointed out, there are situations where you want to get your time from the database server rather than from the local system. Similarly, there are actually times when you want to do math on the database server rather than in the local system. Some programming environments calculate floating point values to different levels of accuracy, some round in different directions for money units (rounding up, rounding down, rounding toward or away from zero, rounding toward even digits, etc.). So calculations done on a database server might give different results than apparently identical calculations done in the local programming environment.

    If you need results in the database to be utterly consistent with the numbers your application displays, it is safest to do all calculation on the server, which may involve making seemingly silly calls to the database for individual sums or roundings. Thus you can end up with a best practice that on first appearance looks like a WTF, just like today's WTF.
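
    A sketch of such a seemingly silly call, assuming an existing DBI handle and a dialect (Sybase/T-SQL, for instance) that allows a SELECT with no FROM clause; $amount is a placeholder.

    # ask the server to do the money rounding, so it agrees exactly with what gets stored
    my ($rounded) = $dbh->selectrow_array( "SELECT ROUND(?, 2)", undef, $amount );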

  • Matthew (unregistered) in reply to wade b

    The WTF, then, is calling date to get the system date. You can safely ignore the SQL stuff and you still have a pretty good WTF.

  • (cs) in reply to regeya
    regeya:
    real_aardvark:
    the potential of any government (cf the early American debate about whether or not to adopt German as the national language) to really screw things up by changing your terms of reference

    http://www.snopes.com/language/apocryph/german.htm

    and minus a million points from the Strunk & White for using 'snarky' and 'perchance'. Archaic words are a linguistic WTF.

    In all honesty I see little point in slavishly following Strunk & White (I had to look it up -- American, isn't it?) or Partridge (English). These are guides, not an immutable collection of stringent laws.

    That said, I would have thought that "perchance" is easily intelligible to the average native speaker of English. "Snarky" is less so, I agree -- I had to look that one up, too -- but then, the word I actually used was "sarky," an English abbreviation for "sarcastic."

    Nice of you to ignore the first two scenarios of governmental interference, pass straight by my inference at the end of the tricolon, and concentrate on the truly important issue. Do I look up every assertion, no matter how casually made and ignoring that, in context, it is clearly meant as an illustrative allusion, in Snopes? No, I don't. Anal retention is not quite my style.

    The "one vote" thing is still a bloody good story. If you change the definite article of "the official language" to the indefinite, it's quite close to being accurate. To the average lay person not versed in the procedural intricacies of the legislature, the defeat of a motion to adjourn by one vote is not, in essence, substantially different from the defeat of the actual proposal by one vote.

    Certainly not enough for regeya to get snarky about.

  • ctcrmcou (unregistered)

    Well, it's also time to get rid of the combustion engine. Unfortunately, unless the planet is wiped clean by a nuclear blast, or some other God-like cleansing, it will always be around.
