• (cs) in reply to Pete

    Anonymous:
    What reason could any non-administrative application have to change the date?
    I've seen certain systems and setups where you need administrative privileges to change the date and time, and it makes total sense to me. If an application actually needs to change the date and time, it had better be an admin running it. That said, it's clear that the problem is not VBA or VB, but the underlying OS.

    Well, you can lock it down in Active Directory... so the real WTF is that the network admin hadn't done that.

  • dave (unregistered) in reply to Hexar
    Anonymous:

    The Real WTF(tm) is that Europe uses DD/MM/YYYY for dates.  Which is kinda like saying The Real WTF is that the rest of the world uses the metric system.  Go USA!


    You really need to cut down on crack. The US is the only country that uses the brain-dead, illogical month, day, year format, which makes no sense to anybody at all (the only argument I've ever heard is that it's more akin to how it's spoken - which is total bobbins: you say 1st March, not March the 1st - of course I'm talking English here, not the twisted dialect known as American).

    So the real WTF is that the US still insists on using a crappy date format compared to the rest of the world.

  • (cs) in reply to xcor057
    Anonymous:

    Actually, there is a *small* use for the Date$ function.  In Excel, I have numerous macros that run on certain days, creating files that have a time-date stamp, open files that were made a previous week, etc. On some occasions (like after holidays), I have to set my system clock back and run the macros as if they were running on that day.  This statement makes it possible to automate that procedure.

    This, of course, is no excuse for the way the Date$ function was used in TDWTF.

    Maybe Michael J. Fox has a DeLorean you can borrow.  It's just about as risky.
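
    For the curious, a minimal VBA sketch of that save-and-restore workflow (BuildDailyFiles is a hypothetical stand-in for the real macros, and changing the clock still needs sufficient OS privileges):

    Sub RunAsOf(asOfDate As Date)
        Dim realDate As Date
        realDate = Date              ' remember the true date
        On Error GoTo CleanUp        ' make sure the clock gets put back on failure
        Date = asOfDate              ' wind the system clock to the target day
        BuildDailyFiles              ' hypothetical macro that stamps files with "today"
    CleanUp:
        Date = realDate              ' restore the real date either way
    End Sub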

  • Franz Kafka (unregistered) in reply to dpm
    dpm:


    What's your point?  I was discussing the name, not whether the usage is restricted.

    ok
    dpm


    That the date command manipulates dates and requires you to be explicit when setting them?

    dpm:

    Unix systems are different, in that *usually* users do _not_ have physical access to the actual computer.  Windows of course is the opposite.

    ok
    dpm


    This has nothing to do with the system being unix or windows. If you're using a unix box as a workstation, then you probably have physical access to it.
  • Franz Kafka (unregistered) in reply to dave
    Anonymous:
    which is total bobbins: you say 1st March, not March the 1st - of course I'm talking English here, not the twisted dialect known as American).


    Yeah, twisted dialect, huh? WTF is total bobbins, anyway?
  • SwordfishBob (unregistered) in reply to HitScan

    I have a client whose (old, but off-the-shelf) DOS-based workshop management & billing software assumes everything happens "today". If you want to print or reprint an invoice for a job you did yesterday, it will print with today's date on it. Thankfully the program contains a handy function to change the date, making it easier to backdate job entries. Those pesky Windows 2000 and XP machines though, they periodically sync their date from the server - even while the user is running the app!

    They're now down to 1 Win98 machine to do all their backdated entry and printing.

    wtf: captcha = wtf ?!

  • Arbol Famileel (unregistered) in reply to dave
    Anonymous:

    You really need to cut down on crack. The US is the only country that uses the brain-dead, illogical month, day, year format, which makes no sense to anybody at all (the only argument I've ever heard is that it's more akin to how it's spoken - which is total bobbins: you say 1st March, not March the 1st - of course I'm talking English here, not the twisted dialect known as American).

    So the real WTF is that the US still insists on using a crappy date format compared to the rest of the world.



    I mean no disrespect--I completely enjoy the nuances of the language used across the pond--but here in the States, we don't say "March the first" very often...and certainly never "March the twenty-second." It's the simpler "March first" or "March twenty-second." All of which only becomes illogical when we add the year on the end: "March twenty-second, two thousand seven."

    Of course, our military and our genealogists have adopted the superior date notation of the rest of the world.

  • (cs) in reply to Pete
    Anonymous:

    You know... it makes more sense to have smallest/medium/largest time values than medium/smallest/largest? Of course, largest/medium/smallest would make the most sense in terms of easy parseability.

    Actually, both are similar to the way one speaks: July Eleventh, 2006 or the Eleventh of July, 2006.
  • anonymous (unregistered) in reply to triso

    I think YYYY-MM-DD is better, but my country uses DD/MM/AAAA :(

    YMD is better because it has a stable order. DMA is chaos.

    --Tei

  • Migala (unregistered) in reply to dpm
    dpm:
    Franz Kafka:
    dpm:
    That says a lot about Windows WTFery, right there:  the user has the ability to reboot the machine but not set the clock.  Story!


    Unix is no different. I can reboot any unix box that I have physical access to, but I can't set the date.


    Unix systems are different, in that *usually* users do _not_ have physical access to the actual computer.  Windows of course is the opposite.


    That's because no user wants to come near a unix system. It is meant to be administered remotely by BOFHs and such, not to be used by actual persons.
  • ajk (unregistered) in reply to Pete

    The YYYY-MM-DD format makes sense since it allows for easy sorting of date strings.
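
    A quick VBA illustration of the point: plain string comparison agrees with date order for ISO-style strings, but not for US-style ones.

    Sub SortDemo()
        ' Lexicographic order matches chronological order for YYYY-MM-DD...
        Debug.Print "1999-12-31" < "2006-07-12"    ' True, and chronologically correct
        ' ...but not for MM/DD/YYYY:
        Debug.Print "07/12/2006" < "12/31/1999"    ' also True, but wrong by seven years
    End Sub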

  • (cs) in reply to Colin

    "Your crazy high-tech solution would put at least one consultant out of a job because he couldn't handle dates."

    Wouldn't be the first time a programmer had trouble...getting...a...date.

    I mean, all of high school and four years of college to boot, dude.  You get used to it, eventually.

  • Magnus (unregistered) in reply to Tom
    Anonymous:
    Whoever decided to call the SetSystemDate function simply DATE had very little programming experience. Descriptive naming anyone?

    It's not a function. It's a string, and it's really descriptive:
    PRINT DATE$ : ' this prints the date.
    DATE$ = "7/11/47" : ' this sets the date.

    How can you not think that is descriptive? This is BASIC we're talking about. It's supposed to be really easy (hint - look up what the B stands for). How are you supposed to remember "SetSystemDate"? That's far too long and cumbersome.

    As someone posted earlier, the DATE$ variable goes back to (at least) IBM BASIC, so it's been around for at least 25 years. I'd say anyone who's not heard of it has very little programming experience (at least in BASIC).

    /Magnus
  • (cs) in reply to Mark H

    Anonymous:
    HitScan:
    Now that languages come with multi-megabyte SDK's, 600-page manuals, and it takes ten classes to say Hello World, what language can one offer to children to play around with?


    does Logo still exist?

    http://www.softronix.com/logo.html


  • (cs) in reply to Sgt. Zim

    Sgt. Zim:
    Anonymous:
    as soon as y'all stop describing how much people weigh using "stones" (WTF is that, anyway?) then we can talk about the metric system.

    as for fahrenheit -- it runs from intolerably cold at 0 degrees to intolerably hot at 100. we won't be losing that for some arbitrary system based on chemistry any time soon.

    "Tolerable" is relative.  I just came back from lunch, and it's something like 118° outside, and scheduled to top 125°, but at about 4% humidity.  I can tolerate that.  Transplant me somewhere that's only 90° and 50%+, and I can't breathe; I call that intolerable.  The point is, they're both arbitrary systems.  What's wrong with using the one that's closer to "absolute" numbers?

    I propose a new system, because the '°' is too difficult to type:  The freezing temperature of beer will be called 'a', while the temperature inside the car after 6 hours in the July sun will be "z."

    And just to finish stirring the shit-pot, I'm American, I prefer the metric system, and I tend to write dates as yyyy-mm-dd because there's little chance of mistaking it.

    Talking about temperature scales, I actually hate them all.  Though I do sort of like the BeerPoint temperature above.  As an engineer, I had to learn five different temperature scales, plus having to convert between any and all of them (from memory, of course).  I think that the European temperature scale is the worst, because as soon as you call it one of the names, someone corrects you with the other one:  I am speaking of Celsius/Centigrade.  Then there's the one I was raised with, and actually have a fondness for, which is Fahrenheit.  [Pop quiz:  What is the Fahrenheit reading if it is minus forty degrees Celsius?]  However, then we have the odd one, called Reaumur, after some Frenchman with too much time on his hands.  But the two that stand right next to Celsius and Fahrenheit are Kelvin and Rankine.  Those two are based on zero being absolute zero, and thus make more sense than any of the other temperature scales.  Besides, I sort of like thinking of 300 degrees Kelvin being room temperature.

    I hate the metric system, but only because everyone who is raised on it seems to know nothing about fractions.  Besides which, I'd rather order a pint of beer than a half-liter of beer.  Just sounds like you're getting more by using a full unit rather than half of one.


  • John (unregistered) in reply to xcor057

    Anonymous:
    Actually Celsius was defined using the specific heat of water. Specific heat was originally a chemistry metric.

    So was Fahrenheit: 212 (the boiling point of water at sea level) - 32 (the freezing point of water) = 180, or half a circle in degrees. Where zero was set is the main issue with it.

    Perfectly sensible in sexagesimal, on which our clocks (60 seconds, 60 minutes) and geography (latitude, longitude) are based.

    Base-10 metric can only easily be factored with 2 and 5, while base 60 is easy to divide by 2, 3, and 5; also by 4 (2x2), 6 (2x3), 10 (2x5), and 12 (2x2x3) as easy non-prime factors.

  • tomandlu (unregistered)

    Heh - a lot of script-based programming on an app I work on is done in the US (I'm in the UK). As soon as users tell me they've got an intermittent problem, the first things I look for now are any hand-rolled functions involving date manipulation....

  • SilverDirk (unregistered) in reply to John

    OK John, that's got to be the weirdest thing I've ever heard.

    But it's also wrong, because from freezing to boiling spans 244 degrees.

    Anyway, I would also point out that 100 Fahrenheit is body temperature (off by 1.4 degrees from the modern standard), for those missing the connection to a real-world value. 0? Well, that's just darn cold ;-) but I think the way I remember it is that Fahrenheit was holding an inches-ruler up to a tube of mercury, to mark points on it.

    As for number systems, I agree that it is good to be able to divide evenly by more things than just 2 (hands) and 5 (fingers). "The Real WTF" is that we use a decimal system for the silly reason of being able to count it on our hands, rather than a duodecimal system based on 12, which would have all the same benefits number-wise.

    Sidenote: if you count binary on your fingers you can reach 1023.

  • (cs) in reply to kalasz

    Anonymous:
    Well... in my country we use YYYY. MM. DD. It's our system, it's our habit. It is at least more logical than DD/MM/YY. Notice that the YYYY. MM. DD. format contains ordinals - one should read it as 2006th year, 7th month, 12th day. Logical, isn't it?

    Ahh, good. I see you are of the group who agree that this century did not start until 2001-01-01.

  • PHP hater (unregistered) in reply to Walrus

    Because YYYY-MM-DD sorts correctly.

  • rob_squared (unregistered) in reply to Ken
    Ken:

    Anonymous:
    Wouldn't it be great if an ordinary application did not have the privilege of changing the system's date? Can you dare to imagine such a revolutionary concept?

    An *ordinary* application?  What exactly entails an *ordinary* application?

    And what would you do if you DID need to change the system date?  Set ExtraOrdinary = True?



    Erm, no?  You'd be able to set at the OS level which applications have the ability to modify the system time.  If it's something trivial, any user can set it.  If it's enterprisey, only someone with admin privileges could run it.

    captcha = \0
  • (cs) in reply to ISO
    Anonymous:
    Carnildo:
    Anonymous:

    The Real WTF(tm) is that Europe uses DD/MM/YYYY for dates.  Which is kinda like saying The Real WTF is that the rest of the world uses the metric system.  Go USA!


    The *real* WTF is how few people use ISO standard dates. 20060712 forever!

    As I said before, yes, that's the correct way. If only some people would accept ISO's authority as the world standards organization... but nooo, let's continue measuring length in how long parts of your body are, rather than something that actually makes sense. Lots of ISO units fit together neatly (for example, one cubic meter is 1000 liters, and that much water weighs about 1000 kilograms).

    Even without that, don't you think it's of some value to have ONE and ONLY ONE standard for things?

    What's the use of the ISO with all its standardizing if it confuses Americans? I say the ISO and the whole world should adopt the standard used by God's chosen nation.

    And while we're at it, let us correct binary integer representation to match the superior date format. From now on, when I enter the decimal integer 111 into a computer, I expect its binary representation to be 10011111. Anything less is an insult to humanity. At best it panders to all those liberals from other countries and at worst is playing into the hands of terrorists!

  • rob_squared (unregistered) in reply to Hubert Farnsworth
    Hubert Farnsworth:
    Anonymous:

    The Real WTF(tm) is that Europe uses DD/MM/YYYY for dates.  Which is kinda like saying The Real WTF is that the rest of the world uses the metric system.  Go USA!

    While the Yuro way still has the order of magnitudes reversed, at least we're not mixing up the magnitudes!



    Dear god man: Euro
  • (cs) in reply to jspenguin

    jspenguin:
    It screws up on the 13th of the month? I guess Friday the 13th really is cursed...

    LOL excellent

  • Marcel (unregistered) in reply to mnature
    mnature:

    I hate the metric system, but only because everyone who is raised on it seems to know nothing about fractions.  Besides which, I'd rather order a pint of beer than a half-liter of beer.  Just sounds like you're getting more by using a full unit rather than half of one.

    No problem, in some parts of Germany you can easily order a full liter of beer.
  • (cs)

    Just curious... Does anyone know the official measurement for a bucket of worms?

  • (cs) in reply to mnature
    mnature:

    Sgt. Zim:
    Anonymous:
    as soon as y'all stop describing how much people weigh using "stones" (WTF is that, anyway?) then we can talk about the metric system.

    as for fahrenheit -- it runs from intolerably cold at 0 degrees to intolerably hot at 100. we won't be losing that for some arbitrary system based on chemistry any time soon.

    "Tolerable" is relative.  I just came back from lunch, and it's something like 118° outside, and scheduled to top 125°, but at about 4% humidity.  I can tolerate that.  Transplant me somewhere that's only 90° and 50%+, and I can't breathe; I call that intolerable.  The point is, they're both arbitrary systems.  What's wrong with using the one that's closer to "absolute" numbers?

    I propose a new system, because the '°' is too difficult to type:  The freezing temperature of beer will be called 'a', while the temperature inside the car after 6 hours in the July sun will be "z."

    And just to finish stirring the shit-pot, I'm American, I prefer the metric system, and I tend to write dates as yyyy-mm-dd because there's little chance of mistaking it.

    Talking about temperature scales, I actually hate them all.  Though I do sort of like the BeerPoint temperature above.  As an engineer, I had to learn five different temperature scales, plus having to convert between any and all of them (from memory, of course).  I think that the European temperature scale is the worst, because as soon as you call it one of the names, someone corrects you with the other one:  I am speaking of Celsius/Centigrade.  Then there's the one I was raised with, and actually have a fondness for, which is Fahrenheit.  [Pop quiz:  What is the Fahrenheit reading if it is minus forty degrees Celsius?]  However, then we have the odd one, called Reaumur, after some Frenchman with too much time on his hands.  But the two that stand right next to Celsius and Fahrenheit are Kelvin and Rankine.  Those two are based on zero being absolute zero, and thus make more sense than any of the other temperature scales.  Besides, I sort of like thinking of 300 degrees Kelvin being room temperature.

    I hate the metric system, but only because everyone who is raised on it seems to know nothing about fractions.  Besides which, I'd rather order a pint of beer than a half-liter of beer.  Just sounds like you're getting more by using a full unit rather than half of one.


    Everyone I've talked to has always very strongly stated that it's "300 Kelvin", not "300 degrees Kelvin". There's no degree sign and there's no word degrees.

  • Cyril Gupta (unregistered)

    Classic WTF!

    I am in a country where we use the dd/MM/yyyy system, so I know exactly how much havoc the American date system can cause.

  • Me (unregistered) in reply to Sgt. Zim

    The REAL wtf is that you're still American.

  • Me (unregistered) in reply to Sgt. Zim
    Sgt. Zim:
    And just to finish stirring the shit-pot, I'm American, I prefer the metric system, and I tend to write dates as yyyy-mm-dd because there's little chance of mistaking it.

    whoops, forgot to quote

  • Dwonis (unregistered)

    The real WTF is that DATE$ = "03/02/2006" has a different affects US machines differently than European machines.

  • Dwonis (unregistered) in reply to Dwonis
    Anonymous:
    The real WTF is that DATE$ = "03/02/2006" has a different affects US machines differently than European machines.


    I stand corrected: The real WTF is that we can't edit our posts.

    s/has a different//
  • (cs) in reply to My Name
    Anonymous:
    And the day will dawn when you want to write your fist system administration tool in Excel. Don't cry when you recognize that this function is missing.

    I can administer my fist system just fine without Excel.
  • evamedia (unregistered) in reply to Dwonis

    Dates suck in most languages (that I've come across anyway)

    I'm expecting a WTF that I created to pop up on here one day. I live and work in the UK, and get the odd job in Europe, which usually means code up what you can in London, get on a plane, and implement it over there.

    On my first gig with another developer we spent hours localising due to the way the dates had been stored and how they were interpreted by the German version of Windows. On my next solo gig I didn't want to have to go through that rubbish, so I stored the date as a double in the Julian format in the db.

    More work every time you wanted to use the dates, but at least I knew it was the 4th of July and not the 7th of April.
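
    For what it's worth, VB/VBA dates are already serial doubles under the hood (days since 1899-12-30), so a locale-proof round trip like the one described can be this simple. A sketch: the post says Julian, and strictly this is VB's own serial format rather than a Julian day number, but the locale-independence is the same.

    Sub SerialDateDemo()
        Dim d As Date, stored As Double
        d = DateSerial(2006, 7, 4)     ' built unambiguously: year, month, day
        stored = CDbl(d)               ' the number you'd write to the database
        Debug.Print stored             ' 38902: no day/month ordering to misread
        Debug.Print CDate(stored)      ' comes back as the same date on any locale
    End Sub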

  • element[0] (unregistered) in reply to treefrog

    I think the real WTF here is that the USA doesn't use the rest of the world's date format of dd/mm/yyyy, not to mention not using the metric system!

    Who decided the date should go median value / min value / max value
    rather than min value / median value / max value

    ?!?!!?

  • Martijn (unregistered) in reply to element[0]

    The real WTF is that it took three years for the IT guys to take an entire continent of users seriously.

    Honestly, WHAT THE F**K were you guys thinking???

    Couldn't you have just sent a guy overseas to verify that the bug, that so many people reported, actually exists? Or better yet, couldn't you have just trusted a European IT guy to check it? I honestly can't believe a smart person (or even a person not completely moronic) would be so cocksure of him/herself as to not believe an entire continent of colleagues all reporting the same bug over a three-year period.

    The real WTF is every single person in the entire IT department.

    People like these shame me to be an IT guy myself.

  • Gabe (unregistered) in reply to GoatCheez
    GoatCheez:
    Anonymous:

    GoatCheez:

    lol...
    Does anyone even expect Microsoft to implement a proper privilege system in Vista? Or document it well for that matter? I know I don't... I expect that to be happening around 2020, assuming Microsoft is still around, but that's just me... Yeah, Microsoft needs to get their act together when it comes to their OS.

    The real WTF is people who are seemingly intelligent enough to comprehend the WTFs, while not being smart enough to look into even the basics of Windows security as it has been for over 13 years!

    While the ability to configure what users have the right to change the system clock has been around since the first version of Windows NT in 1993, the interface was changed in 2000. Go to Administrative Tools|Local Security Settings|Local Policies|User Rights Assignment, and look for the right called "Change the system time". By default administrators and power users have the right, and most users run as power users or administrators by default, so most users always have the ability to change the time.



    The system clock is just the tip of the iceberg.

    The real WTF is people who are seemingly intelligent enough to comprehend the intricacies of the Windows system clock, while not being smart enough to notice the fundamental flaws in Windows security as it has been for over 13 years!

    Apparently I'm not intelligent enough to know what a "proper privilege system" looks like. Please enlighten us.

    While you're at it, why don't you let us all know what fundamental flaws are in Windows security. I know of many flaws (sometimes poor defaults, bad conventions), but none that are fundamental.

  • Program.X (unregistered) in reply to Gabe

    "On the 13th and later, it would not mess up, because it would silently swallow the 'invalid date' error. "

    Hasn't anyone realised that had the original developer not used the typical low-grade VB approach to error handling (probably "On Error Resume Next"), this would have been easy to spot?

    You're all deeply involved in a Euro-US fight, so I can understand.
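
    To make that concrete, a sketch of how the error gets eaten (assuming a US locale, where a day past the 12th cannot parse as a month):

    Sub SwallowDemo()
        On Error Resume Next            ' the classic VB error-hider
        Date = "13/02/2006"             ' no month 13: raises a runtime error here
        If Err.Number <> 0 Then         ' without this check, the failure vanishes silently
            Debug.Print "Date not set: " & Err.Description
        End If
        On Error GoTo 0                 ' restore normal error handling
    End Sub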

  • tomandlu (unregistered) in reply to Mark H
    Anonymous:
    HitScan:
    Now that languages come with multi-megabyte SDK's, 600-page manuals, and it takes ten classes to say Hello World, what language can one offer to children to play around with?


    does Logo still exist?


    Dunno, but perl does...

        my $name = <STDIN>;
        chomp $name;                             # drop the trailing newline
        print "Hello world, and hello $name\n";
  • American Arrogance (unregistered) in reply to Martijn
    Anonymous:

    The real WTF is that it took three years for the IT guys to take an entire continent of users seriously.

    Honestly, WHAT THE F**K were you guys thinking???

    Couldn't you have just sent a guy overseas to verify that the bug, that so many people reported, actual exists? Or better yet; couldn't you have just trusted a European IT guy to check it?[...]

    Well, you see, they were Americans. They knew there was nothing wrong with their code. It must have been some other code causing the problems they were experiencing. And furthermore, when the problem would crop up, someone would get on a plane later that week and make it over to one of the European clients on the 14th or so, and verify that everything was alright.

    Isn't it obvious?

    captcha = shizzle
  • fwt (unregistered) in reply to WIldpeaks
    WIldpeaks:

    jspenguin:
    It screws up on the 13th of the month? I guess Friday the 13th really is cursed...

    LOL excellent



    Apart from the fact that the 13th is the first day of every month that it doesn't screw up on...
    (I know it doesn't screw up on 1 Jan, 2 Feb etc, but I said every month)
  • me (unregistered) in reply to Gabe
    Anonymous:
    GoatCheez:
    Anonymous:

    GoatCheez:

    lol...
    Does anyone even expect Microsoft to implement a proper privilege system in Vista? Or document it well for that matter? I know I don't... I expect that to be happening around 2020, assuming Microsoft is still around, but that's just me... Yeah, Microsoft needs to get their act together when it comes to their OS.

    The real WTF is people who are seemingly intelligent enough to comprehend the WTFs, while not being smart enough to look into even the basics of Windows security as it has been for over 13 years!

    While the ability to configure what users have the right to change the system clock has been around since the first version of Windows NT in 1993, the interface was changed in 2000. Go to Administrative Tools|Local Security Settings|Local Policies|User Rights Assignment, and look for the right called "Change the system time". By default administrators and power users have the right, and most users run as power users or administrators by default, so most users always have the ability to change the time.



    The system clock is just the tip of the iceberg.

    The real WTF is people who are seemingly intelligent enough to comprehend the intricacies of the Windows system clock, while not being smart enough to notice the fundamental flaws in Windows security as it has been for over 13 years!

    Apparently I'm not intelligent enough to know what a "proper privilege system" looks like. Please enlighten us.

    While you're at it, why don't you let us all know what fundamental flaws are in Windows security. I know of many flaws (sometimes poor defaults, bad conventions), but none that are fundamental.



    http://www.google.com/search?q=windows+messaging+flaw+paget
  • Hambone (unregistered) in reply to GoatCheez
    GoatCheez:
    Anonymous:

    GoatCheez:

    lol...
    Does anyone even expect Microsoft to implement a proper privilege system in Vista? Or document it well for that matter? I know I don't... I expect that to be happening around 2020, assuming Microsoft is still around, but that's just me... Yeah, Microsoft needs to get their act together when it comes to their OS.

    The real WTF is people who are seemingly intelligent enough to comprehend the WTFs, while not being smart enough to look into even the basics of Windows security as it has been for over 13 years!

    While the ability to configure what users have the right to change the system clock has been around since the first version of Windows NT in 1993, the interface was changed in 2000. Go to Administrative Tools|Local Security Settings|Local Policies|User Rights Assignment, and look for the right called "Change the system time". By default administrators and power users have the right, and most users run as power users or administrators by default, so most users always have the ability to change the time.

    What Vista introduces is the ability to assign rights separately for changing the clock and changing the time zone. That way a laptop user can be prevented from screwing up their clock while still being able to tell their computer what time zone they have most recently traveled into.

    The only way to allow a Unix user to change the time is to give them complete 100% control over the machine. Or you could write an suid root program that has a non-standard way of assigning rights, and hope that you're a good enough suid programmer that you didn't just create a privilege elevation attack waiting to happen.



    The system clock is just the tip of the iceberg.

    The real WTF is people who are seemingly intelligent enough to comprehend the intricacies of the Windows system clock, while not being smart enough to notice the fundamental flaws in Windows security as it has been for over 13 years!

    Insulting others' intelligence is never a nice thing to do, unless it's their code that was posted ;-P

    So, to recap, some guy posted an extremely informative and complete correction of your original mistake, and your only response is to throw your toys out of the pram :)

  • (cs) in reply to Marcel
    Anonymous:
    No problem, in some parts of Germany you can easily order a full liter of beer.


    You will, however, be looked at very strangely if you actually ask for "a liter". It's called a "Maß", a measure in English. All other units are derived from this fundamental one.
  • Tragomaskhalos (unregistered)

    I refer the esteemed readers of the Daily WTF to these words under the "Date Statement" section of O'Reilly's "VB and VBA in a Nutshell":
    "Modern Windows systems are more reliant on the system date than ever before. A single machine can have literally hundreds of different applications installed, many of which will use dates in one way or another. You should respect the machine on which your application is running and only in very exceptional circumstances should you change the system date programmatically."

    Amen brother.


  • (cs) in reply to dpm
    dpm:
    That says a lot about Windows WTFery, right there:  the user has the ability to reboot the machine but not set the clock.  Story!

    ok
    dpm

    If you can find a software solution to prevent a user from yanking out the power cord...

  • Zachary Palmer (unregistered) in reply to Gabe
    Anonymous:

    The only way to allow a Unix user to change the time is to give them complete 100% control over the machine. Or you could write an suid root program that has a non-standard way of assigning rights, and hope that you're a good enough suid programmer that you didn't just create a privilege elevation attack waiting to happen.

    I'm not an expert on the topic, but can't you just set the hardware clock to UTC and then have the users set their own time zones on a per-user basis? The clock in my KDE session allows me to pick the time zone in which to view it.

    Last I checked, Windows assumes that the hardware clock is set to local time, which, IMHO, is a bad idea on laptops and other mobile computing devices. Of course, using a UTC hardware clock would require applications bright enough to realize that their OS doesn't necessarily set the clock to "local" time...

  • Robin (unregistered) in reply to rsynnott
    Anonymous:
    Also, there's more than one 'European date format'. The UK and Ireland tend to use DD/MM/YYYY, while on the continent YYYY-MM-DD seems more common.
    Coming from the continent (the Netherlands), let me assure you that I *never* come across the YYYY-MM-DD format in daily life. Furthermore, the date separator in the Netherlands is '-' instead of '/'. That's pretty useful, because the date 01-02-2003 must be in the Dutch DD-MM-YYYY format (1 February 2003) and cannot be misinterpreted as 2 January, which would be written with slashes instead of hyphens: 02/01/2003.
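
    Robin's point is easy to demonstrate in VBA. A sketch: the first call is locale-dependent by design, and gives different answers on US and Dutch machines.

    Sub AmbiguityDemo()
        ' CDate parses using the machine's regional settings, so the same
        ' string can mean different days on different machines:
        Debug.Print CDate("01-02-2003")       ' 2 January on a US locale, 1 February on a Dutch one
        ' DateSerial takes explicit year, month, day and is unambiguous:
        Debug.Print DateSerial(2003, 2, 1)    ' 1 February 2003 everywhere
    End Sub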
  • John Betonschaar (unregistered)

    Great! That'll teach them for using VB/VBA or whatever crap excuse for a 'programming language' that's based on, or borrows from, Visual Basic...

    I'd say sue the moron who came up with the great idea of introducing a global string variable to set the workstation date, using nothing but a common string assignment. To a variable with a name so common that you can find it in almost every piece of code that handles dates, in any language...

    captcha: pacman!

  • Yazeran (unregistered) in reply to mnature
    mnature:

    Talking about temperature scales, I actually hate them all.  Though I do sort of like the BeerPoint temperature above.  As an engineer, I had to learn five different temperature scales, plus having to convert between any and all of them (from memory, of course).  I think that the European temperature scale is the worst, because as soon as you call it one of the names, someone corrects you with the other one:  I am speaking of Celsius/Centigrade.  Then there's the one I was raised with, and actually have a fondness for, which is Fahrenheit.  [Pop quiz:  What is the Fahrenheit reading if it is minus forty degrees Celsius?]  However, then we have the odd one, called Reaumur, after some Frenchman with too much time on his hands.  But the two that stand right next to Celsius and Fahrenheit are Kelvin and Rankine.  Those two are based on zero being absolute zero, and thus make more sense than any of the other temperature scales.  Besides, I sort of like thinking of 300 degrees Kelvin being room temperature.

    I hate the metric system, but only because everyone who is raised on it seems to know nothing about fractions.  Besides which, I'd rather order a pint of beer than a half-liter of beer.  Just sounds like you're getting more by using a full unit rather than half of one.



    Actually, you would get more if you ordered a pint, as it is 0.568 or 0.551 l, depending on whether it's a UK or US pint (and for the US one, a dry one at that; see http://en.wikipedia.org/wiki/Pint for another WTF).

    Yours Yazeran

    Plan: To go to Mars one day with a hammer.
