• Michael R (unregistered)

    Sacré bleu!

  • (nodebb)

    I don't understand the obsession some people have with giving something non-existent or unknown a specific value. I get the biological component of not being able to grasp one's own non-existence, but people are really taking their self-replication-indoctrinated fears to another level when coding. Or maybe it has nothing to do with biology; who knows at this point.

    Accept nulls and handle them accordingly; they are part of the universe. There's no point in trying to conjure up some sort of magic religious value in the sky and pretending they don't exist :-)

  • De Gaulle (unregistered)

    It was a very adequate choice. Whoever wrote this code should be made to experience the guillotine.

  • (nodebb)

    I wouldn't be surprised if the developer thought along these lines: what value do I use as the token for "null"? Hmm, minimum date is not a bad idea. The minimum value is January 1, 0001, which is based on the birth of Christ. I'm French, we have a secular republic, and in the history of my great republic there are a number of dates more important than when some Jew was born in Nazareth millennia ago. Bastille Day sounds perfect!

  • Staircase27 (unregistered)

    Given the names used, I think this implies something worse than "loads of stringly typed operations": it looks to me like they're building SQL queries by concatenating the values straight into the query string, which makes SQL injection attacks very easy.
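
    For illustration, here is the difference (table and column names invented, using the PostgreSQL driver mentioned elsewhere in the thread):

        using Npgsql;

        class QueryDemo
        {
            static void Vulnerable(NpgsqlConnection conn, string userInput)
            {
                // Concatenation: an input like "'; DROP TABLE orders; --"
                // is executed as SQL.
                using var cmd = new NpgsqlCommand(
                    "SELECT * FROM orders WHERE customer = '" + userInput + "'", conn);
                using var reader = cmd.ExecuteReader();
            }

            static void Parameterized(NpgsqlConnection conn, string userInput)
            {
                // Parameters: the driver sends the value separately from the
                // query text, so it can never be parsed as SQL.
                using var cmd = new NpgsqlCommand(
                    "SELECT * FROM orders WHERE customer = @customer", conn);
                cmd.Parameters.AddWithValue("customer", userInput);
                using var reader = cmd.ExecuteReader();
            }
        }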

  • MaxiTB (nodebb) in reply to Mr. TA

    Minimum date is also super dangerous.

    Ever tried DateTime.MinValue.ToUniversalTime()? You'd expect an exception in certain cases, but it actually just silently returns DateTime.MinValue - so when you convert it back to local time, the result is wrong half the time.
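
    A minimal sketch of the failure mode, assuming a machine in a timezone east of UTC:

        using System;

        class MinValueDemo
        {
            static void Main()
            {
                // East of UTC, local -> UTC would underflow below year 1;
                // instead of throwing, .NET silently clamps to DateTime.MinValue.
                DateTime utc = DateTime.MinValue.ToUniversalTime();

                // Converting back adds the UTC offset, so the round trip
                // no longer equals the original sentinel.
                DateTime roundTrip = utc.ToLocalTime();
                Console.WriteLine(roundTrip == DateTime.MinValue); // False east of UTC
            }
        }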

  • John Melville (unregistered)

    I don't think this is a WTF. You needed a sentinel value. In some languages (like C#) a nullable date is actually a few bytes bigger than a simple date, so picking a sentinel date makes sense, and Bastille Day is as good as any other. (The advantage of not having 0 as your sentinel is that when you see it, you can be pretty confident it was an explicitly set sentinel value and not just uninitialized memory - see the sketch below.)

    Because I am a solo coder I often use my birthdate in tests and such. The date is absolutely indistinguishable from a million others to anyone but me. That way when I see it I know it came from my test cases.
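
    For the curious, the size difference is easy to check on modern .NET (the sentinel date here is just the article's, for illustration):

        using System;
        using System.Runtime.CompilerServices;

        class SizeDemo
        {
            // An explicitly set, recognizable sentinel - not mistakable
            // for zeroed/uninitialized memory:
            static readonly DateTime NoDate = new DateTime(1789, 7, 14);

            static void Main()
            {
                // Nullable<DateTime> adds a HasValue flag, and 8-byte
                // alignment pads the struct to double the size:
                Console.WriteLine(Unsafe.SizeOf<DateTime>());  // 8
                Console.WriteLine(Unsafe.SizeOf<DateTime?>()); // 16
            }
        }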

  • (nodebb)

    TRWTF is that they're using types and values specific to Oracle, but (apparently) with a PostgreSQL driver.

  • Steve_The_Cynic (nodebb) in reply to MaxiTB

    "so when you convert it back to local time"

    It's a sentinel value that indicates "no date/time available", so if you convert it to local time, you become TRWTF. (This is the advantage of a nullable type: the "sentinel" becomes self-evidently not convertible to local time.)
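
    In other words, with a nullable type the compiler forces the check before any conversion can happen (a sketch; the field name is invented):

        using System;

        class NullableDemo
        {
            static void Print(DateTime? lastSeen)
            {
                // "No value" is no longer a real instant, so there is
                // nothing to accidentally pass to ToLocalTime():
                if (lastSeen.HasValue)
                    Console.WriteLine(lastSeen.Value.ToLocalTime());
                else
                    Console.WriteLine("never");
            }

            static void Main()
            {
                Print(new DateTime(2024, 7, 14, 12, 0, 0, DateTimeKind.Utc));
                Print(null); // the old DateTime.MinValue case
            }
        }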

  • MaxiTB (nodebb) in reply to Steve_The_Cynic

    Oh yeah, I totally agree. But keep in mind, whenever I saw this in the wild, it was because someone thought it was a smart way to stop checking for invalid values, just let everything flow through the business logic, and hope for the best. Hence I consider using DateTime.MinValue at all an anti-pattern since .NET 2.0, when we got nullable value types.

  • Not French (unregistered)

    Everyone is complaining about using some arbitrary date as a sentinel value, but nobody points out the bonus WTF where said date is a hard-coded string that gets converted into a DateTime object only to be converted back into a string in a different format. Why not just write the sentinel value in the required format? A simple string must not have looked 'professional' enough :)
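
    Roughly the shape being described (the format strings are guesses, not the article's actual code):

        using System;
        using System.Globalization;

        class RoundTripDemo
        {
            static void Main()
            {
                // Hard-coded string -> DateTime -> string in another format:
                string roundTripped = DateTime
                    .ParseExact("14/07/1789", "dd/MM/yyyy", CultureInfo.InvariantCulture)
                    .ToString("yyyy-MM-dd");

                // The detour buys nothing; a literal already in the target
                // format is equivalent:
                Console.WriteLine(roundTripped == "1789-07-14"); // True
            }
        }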

  • Peter D (unregistered)

    I once used 1752-09-11. I needed to write to a log table to track errors. The date field was NOT NULL, but I needed to record 'Date not given'. That date never existed, at least in what is now the USA: the British Empire skipped 3-13 September 1752 when it switched from the Julian to the Gregorian calendar.
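
    Worth noting: .NET's DateTime uses the proleptic Gregorian calendar with no gap, so the "impossible" date still constructs without complaint - which is exactly what makes it usable as a value for a NOT NULL column:

        using System;

        class SkippedDateDemo
        {
            static void Main()
            {
                // Civil calendars in the British Empire jumped straight from
                // 2 to 14 September 1752, but .NET's calendar has no such gap:
                var notGiven = new DateTime(1752, 9, 11);
                Console.WriteLine(notGiven.ToString("yyyy-MM-dd")); // 1752-09-11
            }
        }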
