Sacre bleu!
I don't understand some people's obsession with giving something non-existent or unknown a specific value. I get the biological component of not being able to grasp one's own non-existence, but people really take their self-replication-indoctrinated fears to another level when coding. Or maybe it has nothing to do with biology; who knows at this point.
Accept nulls and handle them accordingly; they are part of the universe. There's no point in praying to some sort of magic religious value in the sky and pretending they don't exist :-)
It was a very apt choice. Whoever wrote this code should be made to experience the guillotine.
I wouldn't be surprised if the developer thought along the lines of: what value do I use as the token for "null"? Hmm, the minimum date is not a bad idea. The minimum value is January 1, 0001, which is based on the birth of Christ. I'm French, we have a secular republic, and in the history of my great republic there are a number of dates more important than when some Jew was born in Nazareth millennia ago. Bastille Day sounds perfect!
Given the names used, I think this implies something worse than "loads of stringly typed operations": it looks like they're building SQL queries by concatenating the values straight into the query string, which makes SQL injection attacks very easy.
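A minimal C# sketch of the difference, assuming an ADO.NET-style driver such as Npgsql (the table and variable names here are invented):

    using Npgsql;

    static void FindOrders(NpgsqlConnection conn, string customerName)
    {
        // The suspected pattern: the value is spliced into the SQL text itself,
        // so input like "'; DROP TABLE orders; --" becomes part of the statement.
        using var bad = new NpgsqlCommand(
            "SELECT * FROM orders WHERE customer = '" + customerName + "'", conn);

        // Parameterized version: the driver sends the value separately from the
        // query text, so it can never be reinterpreted as SQL.
        using var good = new NpgsqlCommand(
            "SELECT * FROM orders WHERE customer = @customer", conn);
        good.Parameters.AddWithValue("customer", customerName);
    }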
The minimum date is also super dangerous.
Ever tried DateTime.MinValue.ToUniversalTime()? You'd expect an exception in certain cases, but it silently returns DateTime.MinValue, so when you convert it back to local time, the result is wrong half of the time.
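A minimal repro, relying only on documented DateTime behavior:

    using System;

    // DateTime.MinValue has Kind == Unspecified, so ToUniversalTime() treats it
    // as local time. In any UTC+ zone the true result would fall before MinValue,
    // and instead of throwing, .NET silently clamps it to MinValue.
    DateTime min = DateTime.MinValue;
    DateTime utc = min.ToUniversalTime();   // still 0001-01-01 00:00:00, Kind == Utc
    DateTime back = utc.ToLocalTime();      // MinValue plus the local UTC offset

    Console.WriteLine(min == back);         // False anywhere east of UTC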
I don't think this is a WTF. You needed a sentinel value. In some languages (like C#) a nullable date is actually a few bytes bigger than a plain date, so picking a sentinel date makes sense, and Bastille Day is as good as any other. (The advantage of not using 0 as your sentinel is that when you see it, you can be pretty confident it was an explicitly set sentinel value and not just uninitialized memory.)
Because I am a solo coder, I often use my birthdate in tests and such. The date is absolutely indistinguishable from a million others to anyone but me, so when I see it I know it came from my test cases.
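To put numbers on the size argument (figures assume a typical 64-bit .NET runtime), and to suggest that if you do go the sentinel route you should at least name it (NoDate is an invented name):

    using System;
    using System.Runtime.CompilerServices;

    // Nullable<DateTime> adds a HasValue flag, padded out to the next 8 bytes:
    Console.WriteLine(Unsafe.SizeOf<DateTime>());   // 8
    Console.WriteLine(Unsafe.SizeOf<DateTime?>());  // 16

    // A named sentinel beats a magic literal scattered through the code:
    static class Sentinels
    {
        public static readonly DateTime NoDate = new DateTime(1789, 7, 14);
    }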
TRWTF is that they're using types and values specific to Oracle, but (apparently) with a PostgreSQL driver.
It's a sentinel value that indicates "no date/time available", so if you convert it to local time, you become TRWTF. (This is the advantage of a nullable type, because your sentinel value becomes self-evidently not-convertible-to-local-time.)
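In C# terms, a sketch of that advantage:

    using System;

    // With DateTime? the "no value" state cannot be converted by accident:
    DateTime? lastSeen = null;

    // lastSeen.Value would throw InvalidOperationException here, instead of
    // quietly producing a bogus local time the way MinValue does.
    string display = lastSeen.HasValue
        ? lastSeen.Value.ToLocalTime().ToString("u")
        : "never";

    Console.WriteLine(display);  // prints "never"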
Oh yeah, I totally agree. But keep in mind, whenever I saw this in the wild, it was because someone thought it was a smart way to stop checking for invalid values, just let them flow through the business logic, and hope for the best. Hence I consider using DateTime.MinValue at all an anti-pattern since .NET 2.0, when we got nullable value types.
Everyone is complaining about using some arbitrary date as a sentinel value, but nobody points out the bonus WTF where said date is a hard-coded string that gets converted into a DateTime object only to be converted back into a string with a different format. Why not just write the sentinel value in the required format? A simple string must not have looked 'professional' enough :)
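A guess at the shape of the code; the literal and both formats are invented here, but the round trip would look something like this:

    using System;
    using System.Globalization;

    // The bonus WTF: parse a hard-coded literal just to reformat it...
    string roundTripped = DateTime
        .ParseExact("14/07/1789", "dd/MM/yyyy", CultureInfo.InvariantCulture)
        .ToString("yyyy-MM-dd");

    // ...when the sentinel could simply be written in the target format:
    string direct = "1789-07-14";

    Console.WriteLine(roundTripped == direct);  // True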