• Andrey (unregistered) in reply to Code Dependent

    Not everyone knows that data is the plural of datum. Why oh why is Latin 101 not part of the curriculum?

    Captcha: the plural of facilisus.

  • (cs) in reply to Franz_Kafka
    Franz_Kafka:
    Justification? Sure - warnings are usually errors, so treat them as such until proven otherwise.
    As evidence for one unsubstantiated assertion you have provided another unsubstantiated assertion (and called it a "rule of thumb"). Good effort, but you didn't make your sale today.
  • ShatteredArm (unregistered) in reply to Franz_Kafka
    Franz_Kafka:
    ShatteredArm:
    JD:
    amischiefr:
    JamesQMurphy:
    JD:
    Professionals treat all warnings as errors. Amateurs ignore warnings. Idiots suppress warnings.
    Nice. That's going on my wall.
    Both of you are idiots. What if you are using Java 1.5 or higher, and for some reason you cannot use Generics for a data type. So you're an idiot for suppressing the warnings for this? lrn2Java nubs
    Don't be a retard. There will always be a limited number of legitimate reasons to suppress certain warnings and these should be judged on a case by case basis. But suppressing a deprecation warning because your solution "works incredibly well" is completely idiotic and, for someone in my position, completely unacceptable. It is perfectly obvious that the developer suppressed this warning just to make it disappear, since that is easier than fixing it. I'm sorry, but professionals treat all warnings as errors. You may not fall into the category of "professional" but maybe one day...

    If you're doing some revenue-generating operation, why would you throw out the whole thing if you retrieved the information you want, but with a warning? Sorry, you're wrong, professionals don't treat all warnings as errors. They log all warnings, and then handle specific warnings on a case-by-case basis, depending on whether it's a show-stopping situation. And then they make a business decision on whether previously unencountered warning situations should break the entire operation, or if they too should simply be logged. Then, they monitor the logs, especially during the test and beta phases. That's what professionals do. They don't naively throw out any and every result because there was a non-critical warning.

    Revenue generating compile cycles? What are you on and where can I get some?

    There are also warnings that arise during execution (for example, on results from external systems), not just the compiler ones.

    But yeah, there are great reasons to suppress compiler warnings. For example, auto-generated code that has warnings. Or in .NET, suppressing the documentation requirements on individual enum values:

    /// <summary>
    /// This is gender!!!
    /// </summary>
    public enum Gender
    {
        /// <summary>
        /// You really are an idiot if you need this description
        /// </summary>
        Male,

        /// <summary>
        /// This too
        /// </summary>
        Female
    }

    Or if something becomes deprecated but you don't want to spend days updating old code--in which case, it'd be fine for a professional to simply ignore the warning and tackle it during down time.
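    In Java, the narrowly-scoped version of that looks something like this (just a sketch; the class and method here are invented):

        import java.util.Date;

        class LegacyReportDates {
            // Suppress the deprecation warning for this one method only, until the old
            // code can be reworked; the rest of the codebase still gets the warning.
            @SuppressWarnings("deprecation")
            static Date legacyCutoff() {
                return new Date(108, 9, 28);   // deprecated Date(int, int, int): 28 Oct 2008
            }
        }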

    Or how about the warnings where you catch an exception like so, for debugging purposes:

        } catch (Exception exc) {}   // This will result in a warning!

    instead of:

        } catch (Exception) {}       // No warning here!

  • (cs) in reply to Fister
    Fister:
    Aaron:
    Do you have any justification for this statement...?
    Just do a literature search. Most good books on how to write good software (including that classic, "Writing Solid Code") recommend treating warnings as errors, and only when you can't write warning-free code (but you've convinced yourself that the warning is unavoidable) should you suppress it in that particular instance.
    I've noted from past experience that those with the flimsiest arguments tend to resort to the "go look it up" defense. I've read plenty of books on the subject and I can't say that many of them recommend this as some definitive "best practice" that can be applied to all languages for all applications.

    Aside from that, there are so many weasel words in your argument ("most", "good" x2, "recommend", "convinced yourself") that it's next to impossible to derive any practical information from it.

    Paraphrasing something you read in an obscure and outdated (sorry, "classic") Microsoft C book and saying that one would find the same advice in "most" "good" books about "good" software just doesn't do it for me.

  • Pendant, redux (unregistered) in reply to Tom
    Tom:
    Have you ever considered that some people might not be native English speakers?
    Most of the non-native English speakers in my ken have better grammar and spelling skills than I do.
    Tom:
    We don't have a language compiler and I really don't see why everyone MUST be able to type perfect English in order to stop being considered The Idiot of The Universe by some dumb ass nazi.
    Sigh.

    It may come as a surprise to you but Nazis didn't go around correcting people's grammar.

    Nazis went around exterminating Jews, gays, gypsies, and other "undesirables" in some of the most heinous fashions possible.

    To compare someone's minor grammar flame to the systematic murder of millions is execrable in the extreme.

    The comparison is odious.

    Apparently some are not only lacking in grammatical skills but also in a working knowledge of history.

  • SomeCoder (unregistered) in reply to ShatteredArm
    ShatteredArm:
    There are also warnings that arise during execution (for example, on results from external systems), not just the compiler ones.

    But yeah, there are great reasons to suppress compiler warnings. For example, auto-generated code that has warnings. Or in .NET, suppressing the documentation requirements on individual enum values:

    /// <summary>
    /// This is gender!!!
    /// </summary>
    public enum Gender
    {
        /// <summary>
        /// You really are an idiot if you need this description
        /// </summary>
        Male,

        /// <summary>
        /// This too
        /// </summary>
        Female
    }

    Or if something becomes deprecated but you don't want to spend days updating old code--in which case, it'd be fine for a professional to simply ignore the warning and tackle it during down time.

    Or how about the warnings where you catch an exception like so, for debugging purposes:

        } catch (Exception exc) {}   // This will result in a warning!

    instead of:

        } catch (Exception) {}       // No warning here!

    First of all, we're talking about compiler warnings, not run-time warnings that the OS or programmer created. That's quite a bit different.

    Second, I've never seen .NET throw warnings because of a lack of documentation (maybe you're using some setting that I typically didn't use back when I was working with .NET, or maybe it does that in .NET >3.0?).

    Third, I have no idea why you would ever use:

    catch (Exception exc)

    rather than

    catch (Exception)

    for debugging purposes. If you're debugging, you can get all the information about the exception without actually creating a variable as in the first example; Visual Studio is very handy for that. Also, why would you leave that kind of code in when compiling in Release mode anyway? That's something where I could see ignoring the warning while debugging, but certainly you'd clean it up before you released.

    Fourth, a rule of thumb doesn't mean an absolute, hard-and-fast, always-do-it-this-way RULE. It means: use judgment, but USUALLY treating warnings as errors is better.

  • Thunder (unregistered) in reply to Bobble
    Bobble:
    If your code will never be used outside of a given culture, what is the point of coding for it? QA isn't going to test against it and business is going to look at you like you are insane when you suggest that you need time in the project plan for ti-ET encodings for your internal app.
    Um... ok, your code is being used to get a date from a freeform field on a web site. That's just one of a huge number of scenarios - that's why data gets sanitized.
  • 182 (unregistered) in reply to Mike

    The function of the <pre> tags in the comments depends on the knowledge of Java of the original developer.

    a) <BLINK>He, like you, didn't know about javadoc, and decided to put them there for no reason</BLINK> b) He occasionally lets javadoc spit out some HTML because his boss told him to

  • (cs) in reply to ShatteredArm
    ShatteredArm:
    Franz_Kafka:
    ShatteredArm:
    JD:
    amischiefr:
    JamesQMurphy:
    JD:
    Professionals treat all warnings as errors. Amateurs ignore warnings. Idiots suppress warnings.
    Nice. That's going on my wall.
    Both of you are idiots. What if you are using Java 1.5 or higher, and for some reason you cannot use Generics for a data type. So you're an idiot for suppressing the warnings for this? lrn2Java nubs
    Don't be a retard. There will always be a limited number of legitimate reasons to suppress certain warnings and these should be judged on a case by case basis. But suppressing a deprecation warning because your solution "works incredibly well" is completely idiotic and, for someone in my position, completely unacceptable. It is perfectly obvious that the developer suppressed this warning just to make it disappear, since that is easier than fixing it. I'm sorry, but professionals treat all warnings as errors. You may not fall into the category of "professional" but maybe one day...

    If you're doing some revenue-generating operation, why would you throw out the whole thing if you retrieved the information you want, but with a warning? Sorry, you're wrong, professionals don't treat all warnings as errors. They log all warnings, and then handle specific warnings on a case-by-case basis, depending on whether it's a show-stopping situation. And then they make a business decision on whether previously unencountered warning situations should break the entire operation, or if they too should simply be logged. Then, they monitor the logs, especially during the test and beta phases. That's what professionals do. They don't naively throw out any and every result because there was a non-critical warning.

    Revenue generating compile cycles? What are you on and where can I get some?

    There are also warnings that arise during execution (for example, on results from external systems), not just the compiler ones.

    So what? The aphorism at the top of this cascade is about compilers.

  • (cs) in reply to Aaron
    Aaron:
    Franz_Kafka:
    Justification? Sure - warnings are usually errors, so treat them as such until proven otherwise.
    As evidence for one unsubstantiated assertion you have provided another unsubstantiated assertion (and called it a "rule of thumb"). Good effort, but you didn't make your sale today.

    I can't help it if you're unfamiliar with the field, and I'm not going to do your research.

  • DS (unregistered) in reply to Code Dependent

    I think the "s" in "datas" stands for the object type (String in this case).

  • John (unregistered)

    Unless I'm missing something, the "corrected" code is broken -- it loops through each format then ignores it and does the same thing on each pass.

  • j0ney3 (unregistered) in reply to Code Dependent
    Code Dependent:
    What disturbs me is the parameter name: "datas". Since "data" is already the plural of "datum", what is datas? Some kind of fourth dimensional measurement?

    Pffft! You mean what ARE datas! Clearly YOU wouldn't know!

  • Joel (unregistered) in reply to Mike
    Mike:
    I love the other WTF in the code.. Using the miscellaneous "stop" boolean to detect if the inner for loop has been abandoned, just so that the outer loop can be broken out of! Instead of just using a labelled break.....

    Yes, it is completely unforgivable that someone missed out on the chance to label the loop "dance" so that one can break out of it with a snappy "break dance".
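    (For anyone who hasn't seen one, a labelled break looks roughly like this. It's only a sketch: the "matching" test is a stand-in, not the article's parsing logic.)

        class BreakDance {
            static String firstMatch(String[] formats, String[] inputs) {
                String found = null;
                dance:
                for (String format : formats) {
                    for (String input : inputs) {
                        if (input.length() == format.length()) {   // stand-in for a real parse attempt
                            found = format;
                            break dance;   // exits both loops at once; no stop flag needed
                        }
                    }
                }
                return found;
            }
        }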

  • ShatteredArm (unregistered) in reply to SomeCoder
    First of all, we're talking about compiler warnings, not run-time warnings that the OS or programmer created. That's quite a bit different.
    Fair enough.
    Second, I've never seen .NET throw warnings because of a lack of documentation (maybe you're using some setting that I typically didn't use back when I was working with .NET, or maybe it does that in .NET >3.0?).

    I haven't used <3.0 in a while, so maybe it's a new warning. But it raises a warning for any class, public property, enum value, public method, etc. that lacks a summary.

    Third, I have no idea why you would ever use:

    catch (Exception exc)

    rather than

    catch (Exception)

    for debugging purposes. If you're debugging, you can get all the information about the exception without actually creating a variable as in the first example; Visual Studio is very handy for that. Also, why would you leave that kind of code in when compiling in Release mode anyway? That's something where I could see ignoring the warning while debugging, but certainly you'd clean it up before you released.

    Without the actual variable, you only get the "Exception Helper," which doesn't give you the full information, such as the stack trace, inner exceptions, etc. It's more or less useless without full access to the variable. And yeah, maybe you could clean it up before release, but it's really quite a pain if you're just stepping through without having isolated the issue: you get stuck with just the Exception Helper and have to create the variable, recompile, run it, and reattach the debugger, all because the unused variable creates a warning.

    Fourth, a rule of thumb doesn't mean an absolute, hard-and-fast, always-do-it-this-way RULE. It means: use judgment, but USUALLY treating warnings as errors is better.

    What we have on our project is the "treat warnings as errors" on, and disable warnings on a case by case basis using #pragma compiler instructions, usually on things like enum values and autogen code.

  • SomeCoder (unregistered) in reply to ShatteredArm
    ShatteredArm:
    First of all, we're talking about compiler warnings, not run-time warnings that the OS or programmer created. That's quite a bit different.
    Fair enough.
    Second, I've never seen .NET throw warnings because of a lack of documentation (maybe you're using some setting that I typically didn't use back when I was working with .NET, or maybe it does that in .NET >3.0?).

    I haven't used <3.0 in a while, so maybe it's a new warning. But it raises a warning for any class, public property, enum value, public method, etc. that lacks a summary.

    Third, I have no idea why you would ever use:

    catch (Exception exc)

    rather than

    catch (Exception)

    for debugging purposes. If you're debugging, you can get all the information about the exception without actually creating a variable as in the first example; Visual Studio is very handy for that. Also, why would you leave that kind of code in when compiling in Release mode anyway? That's something where I could see ignoring the warning while debugging, but certainly you'd clean it up before you released.

    Without the actual variable, you only get the "Exception Helper," which doesn't give you the full information, such as the stack trace, inner exceptions, etc. It's more or less useless without full access to the variable. And yeah, maybe you could clean it up before release, but it's really quite a pain if you're just stepping through without having isolated the issue: you get stuck with just the Exception Helper and have to create the variable, recompile, run it, and reattach the debugger, all because the unused variable creates a warning.

    Fourth, a rule of thumb doesn't mean an absolute, hard-and-fast, always-do-it-this-way RULE. It means: use judgment, but USUALLY treating warnings as errors is better.

    What we have on our project is the "treat warnings as errors" on, and disable warnings on a case by case basis using #pragma compiler instructions, usually on things like enum values and autogen code.

    I seem to remember the Exception Helper being more... well, helpful than you describe.

    If >=3.0 really has lack of documentation as warnings, then I'm glad I got out when I did. That would be the FIRST warning I turn off on any new installation of VS :)

  • (cs) in reply to 182
    182:
    The function of the <pre> tags in the comments depends on the knowledge of Java of the original developer.
    

    a) <BLINK>He, like you, didn't know about javadoc, and decided to put them there for no reason</BLINK> b) He occasionally lets javadoc spit out some HTML because his boss told him to

    That answers the basic question, but fails to address why <pre> appears as the first thing in the doc-comment. All it achieves is to make the output look rubbish by forcing a newline and not closing the <pre> when the summary sentence is used, and by being broken into lines stupidly. Obviously, the “engineer” didn't go in for actually running javadoc and reading the output…

  • -- (unregistered) in reply to JD
    JD:
    Professionals treat all warnings as errors. Amateurs ignore warnings. Idiots suppress warnings.

    That's how it should be.

    Unfortunately you are wrong.

  • JB (unregistered)

    How does one "find a date into given datas" anyways? I might give you "find a date OUT OF given data", but TRWTF is that this guy couldn't write English.

    Or write code.

    Or fail to suck at life.

  • Michael Rutherfurd (unregistered)

    One good reason to treat warnings as errors is to get you to remove them from the compile output. The reason to do this is so that you don't miss important "warnings" intermingled among all the unimportant ones. Modern IDEs can help with this, but it is still hard to spot problems when a project has hundreds of ignored messages.
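    With javac, for example, that policy is just a couple of flags on the compile (a typical invocation; adapt it to your own build tool):

        javac -Xlint:all -Werror MyClass.java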

  • (cs) in reply to Code Dependent

    He's obviously using postfix Hungarian notation to indicate that this is data in a string. I'd have thought that was perfectly obvious. Of course it should really have been datasz, so it's not too surprising that you didn't spot it.

    Addendum (2008-10-27 20:56): uh... it IS obvious that I'm not being serious here, right? Right?

  • (cs) in reply to 123456
    123456:
    vt_mruhlin:
    Those pres in there are part of the comments, and I'm not sure why they're in there.

    You've really never heard of javadocs?

    Ok, but the "pre" tags are not necessary.

    I presume he wanted to preserve his line breaks in the HTML while still having readable comments in the code? At the only Java job I ever had, people just put a br tag for all their line breaks. The pre is probably a little easier.
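    For reference, the two styles being compared look roughly like this (the comment text is made up):

        class JavadocStyles {
            /**
             * Keeping line breaks with a pre block:
             * <pre>
             * Finds a date in the given strings.
             * Tries each known format in turn.
             * </pre>
             */
            void withPre() { }

            /**
             * Keeping line breaks with br tags:<br>
             * Finds a date in the given strings.<br>
             * Tries each known format in turn.
             */
            void withBr() { }
        }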

  • Doug (unregistered)

    I used to work for a company that wrote software that parsed third-party logs generated by web servers, web proxies, email servers, etc. Log files often contained one log entry per line.

    Some third-party apps always generated date/times in the same format, such as YYYY/MM/DD hh:mm:ss. Some didn't.

    In the "didn't" category, you could find absolutely everything: apps that used the localised date/time (bearing in mind that the computer generating the log is not necessarily localised the same as the computer that is parsing the log), apps that let the user configure the date/time format, apps that embedded the date in the name of the log file rather than recording it in each log line, apps that left out the year. And many, many more.

    So our custom date/time parser looked at a whole bunch of log lines, and tried to figure out the format. "10/28/08"? Probably MM/DD/YY. "28/Oct/08"? Definitely DD/MMM/YY. "1/11/2008"? Well, it's either MM/DD/YYYY or DD/MM/YYYY; keep reading forward in the log file, maybe you'll figure it out.
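    A toy version of that kind of guessing might look like this (purely a sketch, not their parser; the real thing kept reading lines until the ambiguity was resolved):

        import java.text.ParseException;
        import java.text.SimpleDateFormat;

        class FormatGuesser {
            static String guess(String sample) {
                String[] candidates = { "dd/MMM/yy", "MM/dd/yyyy", "dd/MM/yyyy", "MM/dd/yy", "dd/MM/yy" };
                for (String pattern : candidates) {
                    SimpleDateFormat fmt = new SimpleDateFormat(pattern);
                    fmt.setLenient(false);   // so 28 gets rejected as a month, etc.
                    try {
                        fmt.parse(sample);
                        return pattern;      // first pattern that fits; "1/11/2008" can still match more than one
                    } catch (ParseException ignored) {
                        // try the next candidate
                    }
                }
                return null;                 // nothing fit
            }
        }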

    Of course, for apps where the date/time format was consistent, we'd just use that. And the user could specify a mask. But we had to provide the "auto" option, too.

    Our entire team developed very strong opinions on date/times: they should always use a four-digit year; they should never be middle-endian (MM-DD-YYYY); and if there's any chance of either multiple time-zones or daylight saving being involved, each and every timestamp should also include the timezone in which it was captured.

  • Tama (unregistered) in reply to Aaron
    Aaron:
    Franz_Kafka:
    Justification? Sure - warnings are usually errors, so treat them as such until proven otherwise.
    As evidence for one unsubstantiated assertion you have provided another unsubstantiated assertion (and called it a "rule of thumb"). Good effort, but you didn't make your sale today.

    I agree with Franz_Kafka on this one. The reason to treat (compiler) warnings as errors is not merely "because it doesn't look nice on my IDE"; it's because they are compiler hints highlighting very probable semantic mistakes or harmless omissions that could hurt in the future, e.g. (in Java):

    1. If you use deprecated constructors/methods (for instance one of the constructors of Date), you will end up with something that is most likely going to fail at some point (configuration issue, thread safety, ...). There is a reason these constructors/methods are deprecated.

    2. If you forget to define a static version field on a Serializable object, you'll get a warning. Sure, you'll be fine... until one day when you serialize your object, change the class, and try to deserialize back.

    3. You'll get a warning if you use unparametrized generic types, like a raw List. Usually you can (and you should, because you will get compile-time guarantees on your code) refactor your code using parametrization. And if your code does not easily lend itself to that, maybe there's something wrong somewhere in your design...

    4. You'll get a warning if you have unused local variables, potentially uninitialized final variables, etc. This could just be bad style, leftovers from an attempt at doing something, or just plain omission. In any case, you should fix it.

    In all the above examples, the warnings are accurate detections of potential programming errors by the compiler. While you can certainly make your application kinda run with these warnings, you're much better off fixing them (or turning them off selectively after careful consideration if they turn out to be unwarranted), because it might very well save you a lot of time on debugging later on.
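    For instance, a few lines that trip most of the warnings above at once (just a sketch; the names are invented, and the serialVersionUID and unused-variable complaints typically come from the IDE rather than from javac itself):

        import java.io.Serializable;
        import java.util.ArrayList;
        import java.util.Date;
        import java.util.List;

        class WarningSoup implements Serializable {   // 2: no serialVersionUID field
            void demo() {
                Date d = new Date(108, 9, 28);        // 1: deprecated Date(int, int, int) constructor
                List names = new ArrayList();         // 3: raw type instead of List<String>
                names.add("Bob");                     //    unchecked call warning
                int leftover = 0;                     // 4: unused local variable
            }
        }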

    Acceptable cases for turning off some compiler warnings for me are the ones coming from calls to third-party frameworks; example: Hibernate will return you unparametrized lists, and there's nothing wrong with forcing the cast to a parametrized list matching your query, and turning off that specific warning.
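    Something like this, for example (a sketch assuming Hibernate's classic Session API; Customer is a stand-in entity):

        import java.util.List;
        import org.hibernate.Session;

        class Customer { }   // stand-in entity, just so the example is complete

        class CustomerDao {
            @SuppressWarnings("unchecked")   // one known-safe cast, suppressed at the narrowest scope
            List<Customer> loadAll(Session session) {
                return (List<Customer>) session.createQuery("from Customer").list();
            }
        }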

    As far as I'm concerned, I always produce compiler warnings-free code (or try my best to).

    Pendant:
    Tom:
    We don't have a language compiler and I really don't see why everyone MUST be able to type perfect English in order to stop being considered The Idiot of The Universe by some dumb ass nazi.
    Sigh.

    It may come as a surprise to you but Nazis didn't go around correcting people's grammar.

    Nazis went around exterminating Jews, gays, gypsies, and other "undesirables" in some of the most heinous fashions possible.

    To compare someone's minor grammar flame to the systematic murder of millions is execrable in the extreme.

    The comparison is odious.

    I think you deserve a Godwin point.

  • RT (unregistered) in reply to Bobble

    I was once on a one-year project to build infrastructure for a startup company. We were using Java, and planned to use Unicode and NLS libraries everywhere, but were overridden by management. They told us that the company had no interest in work overseas, that the DBA had told them that Unicode would require double the storage space, and that their experts said that using the multilingual libraries would add too much overhead and development time.

    One month after launch, they put in a requirement to enable Spanish and Japanese support.

  • BobbyT (unregistered) in reply to Code Dependent

    Maybe it's like the double-plural of person. You can have one person, a group of persons ("people"), or a group of groups--like for a poll where you want to distinguish normal Americans from various hyphenated-Americans. Just use "peoples"!

    So if you make a struct of structs, you have datas!

  • Ballman (unregistered)

    An even simpler way to do this is to use the Apache Commons DateUtils parseDate method:

    Date date = DateUtils.parseDate(rawTimestamp,formats);
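    For reference, a self-contained version might look like this (assuming Commons Lang 2.x on the classpath; the pattern list is only an illustration):

        import java.text.ParseException;
        import java.util.Date;
        import org.apache.commons.lang.time.DateUtils;

        public class ParseExample {
            public static void main(String[] args) throws ParseException {
                String[] formats = { "yyyy-MM-dd", "dd/MM/yyyy", "yyyy-MM-dd HH:mm:ss" };
                Date date = DateUtils.parseDate("2008-10-28", formats);
                System.out.println(date);
            }
        }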
    
  • Marshall (unregistered)

    I work on the rule of thumb that if the boss signs off on "The application in question is only used in one locale" in blood, then I've probably got about 24 hours before s/he tells me that it now has to run across five continents or they are moving the server to Moscow.

    So I can understand why an "engineer" would at least try to allow for future change (weird collection of formats though - and it's never going to handle the mm/dd/yyyy versus dd/mm/yyyy problem where dd <= 12).

    I have code that is signed off in blood to never be ported to another database. But every database does its own "improvements" on SQL, and I can't find anything that says there is an across-the-board date format for "standard" (whatever that is) SQL, and America is (surely?) the only place in the world that uses mm/dd/yyyy.

    So I have a function called SQLFormat, which returns the format. The first time it is called, it fires some test SQL at the database asking for a count of all records in a config (single record) table where a known date field is equal to 31st December 1999. The date is a TDateTime (Delphi) constant that is converted via one of a table of formats (plus the local machine's "short" format). As soon as it gets an answer other than an Exception, it stores the format on the heap and then exits. Subsequent calls just return the answer from the heap.

    I admit that the first request it makes uses the format that the manual says will work for the chosen database :-)

    And ... it throws a SysError and closes down with an appropriate message pointing to where the table of possible formats needs to be expanded if it fails to find a working template/format.

    Is there some Java reason why s/he uses flags instead of just "return"ing when s/he (thinks) s/he's found the format? The code would then unconditionally return null if it reaches the end of the outer loop.

  • return of the spelling nazi (unregistered) in reply to Tom
    Tom:
    Regarding your "friends" (sorry, but imaginary ones don't count, so we need the quotes around that word)
    Isn't it a little hypocritical for the myspace guy to be lecturing others on what constitutes true friendship?
    Tom:
    Forget everything else that you're doing, throw your children in the trash can, eat your dog and kill your spouse with a corkscrew
    Well...I can't say I agree with most of what you're suggesting there. On the other hand, I'm pretty pleased to see someone spelling "you're" correctly on here.
  • Newman (unregistered)

    And who the heck invented SQL? They are the ones who started the built-in date parsing.

  • (cs)

    I'm an engineer, you insensitive clod!

  • Brompot (unregistered) in reply to Code Dependent
    Code Dependent:
    What disturbs me is the parameter name: "datas". Since "data" is already the plural of "datum", what is datas? Some kind of fourth dimensional measurement?

    Try person->people->peoples. It's the same thing.

  • IkS (unregistered) in reply to Date This

    What's wrong with that? It's clearly September 7th, 2008.

  • IkS (unregistered) in reply to IkS
    IkS:
    What's wrong with that? It's clearly September 7th, 2008.

    What i meant to quote was:

    Date This:
    I think there are some date formats missing in the original code... lots of them.

    Which reminds me, we need some special kind of torture for people who represent dates such as 07/08/09.

    stupid mornings.

  • synp (unregistered) in reply to JD
    JD:
    Professionals treat all warnings as errors. Amateurs ignore warnings. Idiots suppress warnings.

    No, we don't!

    :)

  • Yanman.be (unregistered) in reply to JD
    JD:
    Professionals treat all warnings as errors. Amateurs ignore warnings. Idiots suppress warnings.
    I suppress all errors.
  • Steve (unregistered)
    JD:
    Professionals treat all warnings as errors. Amateurs ignore warnings. Idiots suppress warnings.
    One of the many mantras of solid software development. There are exceptions to every rule but that doesn't make this rule any less worthwhile... or any less true.
  • yah (unregistered) in reply to Yanman.be
    Yanman.be:
    JD:
    Professionals treat all warnings as errors. Amateurs ignore warnings. Idiots suppress warnings.
    I suppress all errors.
    Now if only we could find some way of suppressing all idiots...
  • (cs) in reply to Date This
    Date This:
    I think there are some date formats missing in the original code... lots of them.

    Which reminds me, we need some special kind of torture for people who represent dates such as 07/08/09.

    I think some people actually died because of formats like that. If it's impossible for a human to distinguish the date format, it is definitely impossible for a computer.

    You can make assumptions though, based on locale and date separator characters.

    The dots are used mainly in Germany IIRC, so the format should be parsed as dd.mm.yy, whereas the slashes mean it's English - and in the US this means mm/dd/yy but in the UK this means dd/mm/yy. In Holland we use dashes, and this always means dd-mm-yy.
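    A rough sketch of that separator-based guess (it obviously can't resolve the US/UK slash ambiguity on its own):

        class SeparatorHeuristic {
            static String patternFor(String sample) {
                if (sample.contains(".")) return "dd.MM.yy";   // dots: typically German usage
                if (sample.contains("-")) return "dd-MM-yy";   // dashes: Dutch usage, per the comment above
                if (sample.contains("/")) return "MM/dd/yy";   // slashes: US vs UK needs more context
                return null;
            }
        }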

  • (cs)

    By the way, I try to use the yyyy-mm-dd format whenever possible; in filenames this gives me the ability to sort by name and still get them in chronological order. And it's immediately clear what format has been used, so parsing errors should be a thing of the past.
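    In Java that boils down to something like this (a sketch):

        import java.text.SimpleDateFormat;
        import java.util.Date;

        public class SortableName {
            public static void main(String[] args) {
                // An ISO-style prefix sorts lexicographically in chronological order.
                String prefix = new SimpleDateFormat("yyyy-MM-dd").format(new Date());
                System.out.println(prefix + "_report.txt");   // e.g. 2008-10-28_report.txt
            }
        }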

  • (cs) in reply to Aaron
    Aaron:
    As evidence for one unsubstantiated assertion you have provided another unsubstantiated assertion (and called it a "rule of thumb"). Good effort, but you didn't make your sale today.

    Warnings are a sign that something might be wrong, and so should be looked into. You want to stop the compiler bringing up warnings that you've already looked into so that you notice when new ones show up. Usually it's very easy to just change the code to not generate the warning.

  • (cs) in reply to Gabelstaplerfahrer
    Gabelstaplerfahrer:
    By the way, I try to use the yyyy-mm-dd format whenever possible; in filenames this gives me the ability to sort by name and still get them in chronological order. And it's immediately clear what format has been used, so parsing errors should be a thing of the past.
    Until some retard in your company begins using the awesome yyyy-dd-mm brand... a WTF per se, which has already been shown in this very forum

    Addendum (2008-10-28 08:00): right here: http://forums.thedailywtf.com/forums/p/10064/180885.aspx#180885

  • Dascandy (unregistered) in reply to JD
    JD:
    Professionals treat all warnings as errors. Amateurs ignore warnings. Idiots suppress warnings.

    Warning: C4251, C4996: Microsoft is an idiot.

    If a given warning makes no sense whatsoever, working around it might be worse than suppressing it. Try using MSVC++ once and you'll see that some warnings exist only to lock you into their platform.

    The warning numbers above are no coincidence.

  • (cs) in reply to Smash King
    Smash King:
    Gabelstaplerfahrer:
    By the way, I try to use the yyyy-mm-dd format whenever possible; in filenames this gives me the ability to sort by name and still get them in chronological order. And it's immediately clear what format has been used, so parsing errors should be a thing of the past.
    Until some retard in your company begins using the awesome yyyy-dd-mm brand... a WTF per se, which has already been shown in this very forum

    Addendum (2008-10-28 08:00): right here: http://forums.thedailywtf.com/forums/p/10064/180885.aspx#180885

    That is horrifying, but I'm glad they used slashes instead of dashes. That should at least serve as a warning to developers.

  • (cs) in reply to Mike5
    Mike5:
    Well, clearly you are NOT an engineer...

    But they are a language geek, which is just as good :)

  • methinks (unregistered) in reply to JZ
    JZ:
    Yes, and before toilet paper was invented, using grass was a typical way of wiping your ass.

    this should of course read:

    "... using grbutt was a typical way of wiping your butt."

    ;o)

  • methinks (unregistered) in reply to Tom
    Tom:
    Have you ever considered that some people might not be native English speakers?

    We had a similar discussion here before (quite a long time ago) and sadly it appears that (American) native speakers are the ones who tend to mix up "there"/"their"/"they're", "than"/"then" etc. the most.

    I'm not a native speaker, and I don't.

    And please, stop using the term "nazi" in such contexts - this really is not a term to be used frivolously.

  • Staz (unregistered)

    Uh... doesn't the simple version fail to parse a lot of strings that the original version handles correctly?

    Do we have any evidence that none of these other formats are actually used?

    If not, isn't the real WTF the attempt to simplify code that is not properly understood?

  • (cs) in reply to Tama
    Tama:
    Pendant:
    *snip*
    I think you deserve a Godwin point.
    Nice application of Muphry's Law, too.

    Unless, of course, the redundancy was deliberate, in which case, it depends.

  • (cs) in reply to j0ney3
    j0ney3:
    Code Dependent:
    What disturbs me is the parameter name: "datas". Since "data" is already the plural of "datum", what is datas? Some kind of fourth dimensional measurement?

    Pffft! You mean what ARE datas! Clearly YOU wouldn't know!

    Sorry, my fault. By the time I typed that, I was tired of cluttering up my text with quotes. However, since you weren't capable of grasping the meaning without them, I'll rework it. Here you go.

    Code Dependent:
    What disturbs me is the parameter name: "datas". Since "data" is already the plural of "datum", what is "datas"? Some kind of fourth dimensional measurement?
