Admin
This is sad, sad, but true.
It betrays a lack of foresight and understanding. For one thing, what major programming language today lacks a mechanism for getting the current date and time?
Second, what sort of programmer would assume a major programming language lacks one? How could anyone not even just f* google for it?
And what sort of programmer wouldn't stop and think, "uhmmm, what would happen to this little database-calling thingie during peak hours"?
A good friend of mine calls this "happy programming": programming for the happy case only. It worked on my computer, so it's gotta work on freaking production, too!
Admin
But who was David's phone?
Admin
Jake, can you please proofread the article before you submit it?
Admin
Only one spam message in his inbox? That's actually pretty good.
Admin
Sure, funny, haw haw. Except that there are some real problems with the JS Date object, the most notable being that it uses the local system time rather than anything "trusted", so it's only as accurate as the system clock.
If you've ever tried to rely on the system clock, you know just what a bad idea that is.
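A one-liner makes the point: `new Date()` just snapshots whatever the OS clock currently says, so any skew flows straight through. (A minimal sketch; the five-minute offset is invented for illustration.)

```javascript
// new Date() reads the local system clock; it has no trusted time source.
const now = new Date();
// Simulate a machine whose clock runs five minutes fast (invented offset):
const fastClock = new Date(now.getTime() + 5 * 60 * 1000);
// Both look like "the current time" to JS; it can't tell which one is right.
console.log(fastClock.getTime() - now.getTime()); // 300000
```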
Admin
Because time is money, of course. The more clocks you have, the more time you have; the more time you have, the more money you have. So clocks == money. Don't you get it?
Admin
You don't travel very much, do you?
Admin
did they ever think making a server ever think think
You found the problem!
Admin
There's that issue you mentioned in the post...and then there's the issue that setInterval calls do repeat until the page is closed... So the first updateClock sets a timer to call updateClock every second. Fine. The second updateClock sets a timer to call updateClock every second. So now it is being called twice a second. Every second it doubles. It's a wonder their computers didn't crash first.
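The runaway growth described above can be sketched with a tiny simulation (a hypothetical model: real timer behavior is simplified to one fire per interval per second):

```javascript
// Each call to updateClock registers *another* setInterval(updateClock, 1000),
// so the number of live intervals doubles every second.
function simulateRunawayIntervals(seconds) {
  let liveIntervals = 1;                // the interval registered at page load
  const callsPerSecond = [];
  for (let s = 0; s < seconds; s++) {
    callsPerSecond.push(liveIntervals); // every live interval fires once...
    liveIntervals *= 2;                 // ...and every call registers one more
  }
  return callsPerSecond;
}
console.log(simulateRunawayIntervals(6)); // [ 1, 2, 4, 8, 16, 32 ]
```

After a minute the page would be attempting over 500 billion calls per second, which explains the "wonder their computers didn't crash first" remark.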
Admin
I'm only an amateur coder as a hobby, but once I was working on a homebrew project where I needed to get the time from the server.
All I did was get the server time during page load and then use that as a starting point for javascript to start incrementing the time. The update was only called once per page load which seemed to do the job.
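A minimal sketch of that approach, assuming the server injects its time into the page at render (the literal timestamp below is a stand-in for the server-rendered value):

```javascript
// Baseline: the server's time at page load (hard-coded stand-in for a value
// the server would render into the page).
const serverTimeAtLoad = Date.parse('2007-03-10T12:00:00Z');
const pageLoadedAt = Date.now();

// Server time "now" = server baseline + how long the page has been open.
function currentServerTime() {
  return new Date(serverTimeAtLoad + (Date.now() - pageLoadedAt));
}
```

One request at page load, no per-second polling, and the display drifts only as much as the client's own tick does.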
If people are getting paid to code like this then there may be hope for me yet! ;)
Admin
How about having a javascript that gets the time from an NTP server once a second?
Admin
Yes, JS has a Date object.
No, you shouldn't use it.
Not when such a delicious extension to it called DateJS exists...
Admin
I certainly like to Interface with Dates. Shame they never let me at their private members.
Admin
Time is money, to get a Date, one needs time and money -> Date = Time*Money -> Date=money^2
Now, money is the root of all evil: root(evil) = money, money^2 = evil, Date=evil...
Somehow I doubt Dates are worth the time...
Admin
Another manifestation of this problem occurs when using 'rsync' for backups. I back up gigabytes of data on a Windows system every day. Apart from the initial backup, it completed in minutes until the daylight saving changeover.
Then, because the timestamp of all files on the Windows system changed, rsync took hours to complete a backup that had taken a few minutes the day before.
Admin
That would be "R you must TMF"
Admin
Not only should he have been aware of it, he should also have known what kind of impact it would have on performance. He should be reprimanded.
Admin
No, TRWTF is having gigabytes of data on Windows.
No, TRWTF is Windows.
C'mon, people, we keep giving you examples like this showing that Windows is chock full of insane fundamental design errors that haven't been fixed in decades, and still you keep dragging out excuses that it will be better next year. Get a clue already and give up! There's no hope for this clunker! There never will be.
Admin
Except that the multi-billion-dollar company I work for uses it.
We all know that bad things get better with time, just like fine wine or cheap beer.
Admin
I'm aware of changes made to the local source repository in real time, 24 hours a day, 7 days a week.
If I'm looking at that window, which is on a screen physically located in the office.
And if I drill down under the summary data, which looks like this:
Despite raising the issue in every single weekly meeting, we still aren't getting developers to write "created severe performance problem on Intranet web server" in their commit logs, even though they do write things like "fixed severe performance problem on Intranet web server caused by commit 1005231."
Admin
...and they're all WRONG, except for the ones that are slaved to an atomic clock somewhere on the network. Heck, my cell phone has a dialog under "Settings" which says things like "Obtain time from <carrier network>" and "Network time is <correct current local time>", but the cell phone's clock is still wrong by at least 3 minutes.
Admin
The first version displayed the server's local time (unless they jumped through hoops to convert to the client's time zone). That way, when your geographically sparse workforce needs to coordinate meetings in multiple time zones through your web calendar application, they can just look at part of the web page to figure out what time it is now, in the tiny little mind of the calendar app. Everyone uses the same time zone, whatever it is, and does the conversion themselves to figure out when the meeting really is.
The second version displayed the user's local time (unless they jumped through hoops to convert to the server's time zone). That way, it's no better (and possibly actually worse) than the clock in the user's system tray. The geographically sparse workforce will never have meetings at the same time ever again, unless the length of the meeting in hours equals or exceeds the number of time zones between the easternmost and westernmost attendees.
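For what it's worth, rendering one absolute instant in each attendee's own zone is straightforward; a sketch using `toLocaleString` (the zone names and meeting time are illustrative):

```javascript
// One absolute instant...
const meeting = new Date('2007-03-10T15:00:00Z');
// ...rendered in two different zones, so nobody converts by hand.
// 15:00 UTC on 2007-03-10 is 10 AM in New York (EST) and midnight in Tokyo.
const inNewYork = meeting.toLocaleString('en-US', { timeZone: 'America/New_York' });
const inTokyo = meeting.toLocaleString('en-US', { timeZone: 'Asia/Tokyo' });
console.log(inNewYork);
console.log(inTokyo);
```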
Admin
Too many clocks?! That's like saying, "Too many Irish girls". The individual words are real words, but when you string them together in that order, it just doesn't make a coherent sentence.
Just what are you doing that having a clock that is off by a minute or two causes you irreparable harm? Do you really have your schedule planned that precisely? Oh no, I had planned to go to lunch at 12:13 but here it is 12:15 and I haven't gone yet!
Admin
The strange thing about the NTP client built into Windows is that it always seems to end up settling on a time a few hundred milliseconds before or after every other NTP client I've been able to test, then drifts all over the place, occasionally applying a half-second step forward or backward to get back in line. It's like Windows doesn't even try to maintain accurate time.
Contrast with Linux systems on the same LAN that maintain their clocks within the same millisecond with each other, and a dozen ms or so over WAN links.
It's been a few years since I've had any Windows systems to test with, but we set up an NTP server hierarchy on both Windows and Linux, and any client that had a choice (Windows clients always pick their domain master) always picked the Linux servers for their more stable time.
Admin
and provided that both computers have up-to-date time zone data. Legislators like moving the transition dates around every few years...
Admin
The clock on the motherboard is a cheap, slow, inaccurate piece of crap. There are dozens of RTC chips in the field. The lowest common denominator of these devices can only return accurate time once per second, and requires quite long CPU-driven signalling sequences (a millisecond here, a millisecond there, and pretty soon your server is crawling under the load). "Accurate" here is a relative term, since some of these clocks drift by minutes per day. They're actually so bad that most operating systems reset the motherboard clock from the OS-maintained clock every few minutes; without that adjustment, the motherboard time would be off by most of an hour within a week or two.
Modern systems have cycle counters built into the CPU as well as programmable interrupt timers. The lowest common denominator of these devices has a precision of dozens of nanoseconds and can return accurate time directly from a CPU register. "Accurate" here is still pretty bad by clock standards but it's orders of magnitude better than the motherboard clock. The real win is the precision, because it can be used with an NTP server to calculate what the CPU clock rate actually is.
Even Windows, which has arguably the most primitive time keeping of any of the non-embedded modern operating systems, still has the ability to measure its local time sources (PIC and CPU cycle counters) against reference time sources (NTP servers with data from atomic clocks) and adjust the OS-maintained time keeping.
For example, the OS might program your PIC to give an interrupt every 1000us, then after several hours of tracking NTP servers the OS may discover the PIC interrupt comes every 999.962us. Once this is known, the OS can simply increment the system time by 999.962us every PIC interrupt, and keep time accurate to seconds per month even if the network clock goes away.
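The arithmetic in that example is easy to check; a quick sketch using the same illustrative numbers from the comment:

```javascript
// PIC programmed for a 1000 us tick, measured against NTP at 999.962 us.
const programmedUs = 1000;
const measuredUs = 999.962;
const ticksPerDay = (24 * 3600 * 1e6) / programmedUs; // 86,400,000 ticks/day
// If the OS naively added 1000 us per tick, the clock would gain this much daily:
const dailyErrorSeconds = ticksPerDay * (programmedUs - measuredUs) / 1e6;
console.log(dailyErrorSeconds.toFixed(2)); // about 3.28 seconds fast per day
```

Incrementing by the measured 999.962 us instead removes that systematic gain, leaving only the residual measurement error, hence "seconds per month" even without the network.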
Admin
That's why you configure your NTP server to use at least 4 upstream NTP servers operated by different organizations, at least one of which should be a local GPS receiver or your own atomic clock if your organization is large enough to own one. That way, one broken server will be voted down by all the others (though you should have an alert configured so that when it happens, you choose a new 4th upstream server to replace the one with the incompetent admin).
Your clients, of course, should speak only to your own NTP servers.
Admin
Try pressing F11 and see if you can still see the time in your taskbar.
Admin
My data is generated in a Unix-like environment but transferred to Windows because, well, that's what everyone else uses.
I'm a little tired of being considered weird and subversive because I prefer to process my data with non-Windows tools.
Tired, too, of explaining to colleagues that Word is not my text editor of choice. That, although all the data I process has to end up on Windows machines, Windows hinders rather than helps me.
I've been told I'm not allowed to complain about Windows any more.
I try not to, but the phrase "blinkered philistine pig-ignorance" comes to mind several times a day.
Admin
Yes, actually. Don't you? How else do you know when you're on time? How do you catch a bus, or turn on a radio or TV at the right time to catch a scheduled broadcast? How do you know your alibi is confirmed by security camera footage? If you set other clocks using your clock of questionable accuracy, how do you know you're not compounding errors? A minute or two can cause irreparable harm to a steak on a BBQ. A clock that can be off by a minute or two can be off by 20 minutes, or 2 hours. How would you know?
Yes, I can do whatever I need to do one minute earlier, but then I'd have to know (or guess) the time on my clock is a minute slow. What if my clock is actually two minutes slow? Or ten? Is waiting a total of an hour a day for slow clocks OK, but 70 minutes is too much? Or is 50 too much? What if I make a decision about someone else based on a clock that is fast, making me falsely believe they are late?
Generally I disable the time display feature entirely on any device with a clock that can't get its time from a network, and thereby minimize the number of clocks I have to maintain. The few that are left, e.g. the one in my stove, can detect and report failure to maintain accurate time, e.g. by blinking "12:00". Others, like the occasional mechanical clock or watch that is unable to detect that it loses an hour or two a day, I learn to ignore, because sooner or later they will tell me the wrong time (though they are useful for timing steaks on the BBQ, as long as I only use one clock for the entire cooking process).
Anyway, the size of the error doesn't matter if it's big enough to display. Whether it's 3 minutes or 30, my point about the cell phone is that if you're going to claim network time sync as a feature, and there is a difference between cell phone time and network time greater than one minute, or if the difference cannot be measured because the network is unavailable, or the error in the measurement is greater than one minute, I should be seeing some kind of error notification so that I know the clock might be wrong...otherwise, you might as well not bother putting that feature in the firmware, and use the space saved for more ringtones.
Admin
What I don't get is why they had to do
url = url + '?employee_id=' + <?=$employeeID;?>;
and fetch the timezone from the DB on every request, when they could have used, say... sessions, avoiding the DB connection and queries entirely?
Admin
Yes. Yes, I can. Admittedly I have to move the mouse down to the lower edge of the screen to see it, but I'd have to do that even without pressing F11 if I had the taskbar set to auto-hide. I don't think it's a big problem.
Admin
A man with one clock knows the time. A man with more than one clock isn't sure. Those who have their clocks sync'd with a standard are sure.
Me? I have a wall clock (a nice Radio Shack one, thank you) that gets its information from WWVB. The clocks on my computer automagically sync via NTP. Why do I worry about this? For me, having "second accurate" time is nice when you are doing EBAY stuff. Their clocks are nicely sync'd to some nice NTP server and that makes my clocks in sync with them. It is quite nice.
Then there is Windows. It isn't accurate AT ALL. I had to find an NTP update program to get it to display the time correctly. Even then, it doesn't display seconds, a major bummer if you are trying to land a last-second bid.
Then I look at the guide on my satellite TV provider. It is always a bit off, probably due to the round trip time to the satellite!
Time is such fun!
Admin
Tray Clock can sync to a timeserver, the MP3 player can sync to the computer, the phone can sync to a timeserver on the cellular network.
All my clocks are set to the same time, ±2 seconds.
Admin
Awesome - astutely put...!
Admin
It's a bit like that... although since my wristwatch died I haven't missed it. Either the PC, my phone, or the clock in the car will tell me what I need to know. I'm seldom away from all three.
Admin
Couldn't stand him, so I left. Eventually the company couldn't stand him either.
Admin
The honking big Vista desktop clock that immediately gets covered by windows? Why would anyone hate that?
Admin
i have a date function
it's very limited to "before foo" and "after bar"
but I can sort events like nobody's business
Admin
So, one day I was looking for tileset-making software for a console-style RPG; googled 'tile making software' and went to a promising result. It turned out to be a ceramic tile company, but there was something odd...
they showed the date as "March 10th, 19107".
I laughed a bit, as it had only been 7 years since the whole Y2K thing ended the world.
I followed the link to their web design company, which also showed the year as 19107... as it discussed their combined 120 years of web design experience, etc. etc.
It worked fine in IE, showing '2007', but not in Firefox...
I can't fathom how they ever got 19107...
(1900)+(107) numerically gives 2007... if it were erroneously treated as a string, "1900" + "107" should have given "1900107"...
My simple solution: don't show the time. Why bother, when there are so many other clocks around, to add a feature that might go wrong?
Admin
Windows hasn't been like that since Win95; it's just like any other operating system now.
And the dialog box exists so that users don't think they have to update the clock themselves. For that purpose it doesn't really matter whether what actually happened was "the system clock was changed" or "the offset used from the system clock to local time is now different"; dumb users need to be told this regardless.
Try "19"+"107". As in, you know, "19"+"99".
I think I can make some guesses from here.
In the JavaScript standard, getYear() returns 98, 99, 100, 101...; you are supposed to add 1900 numerically. In IE, getYear() returns 97, 98, 99, 2000, 2001, 2002. (They can't necessarily be blamed, because the standard used to do the same.)
One "obvious" solution in the IE situation (which would also apply to Netscape 4) is if(year<2000)year="19"+year
Now, what is the correct answer?
If you said "if(year<2000) year = 1900+year", you are WRONG. The correct answer is to use getFullYear().
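A sketch of why: legacy getYear() returns years since 1900 in standards-compliant engines, and string concatenation of that value is exactly what produces 19107-style dates. getFullYear() sidesteps all of it:

```javascript
const d = new Date(2007, 2, 10);  // March 10, 2007
// Legacy behavior: getYear() gives years since 1900 (107 for 2007).
const legacy = d.getYear();       // 107 in standards-compliant engines
const glued = '19' + legacy;      // "19107" -- the tile company's bug
const fixed = d.getFullYear();    // 2007, no patching required
console.log(glued, fixed);        // 19107 2007
```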