When someone mentioned to Abraham, "Our product has an auto-sync feature that fires every hour," Abraham wasn't surprised. He was new to the team and didn't know the code well, but an auto-sync back to the server sounded reasonable.
The approach, however, left something to be desired.
syncTimer = new Timer(1000);
syncTimer.Elapsed += new System.Timers.ElapsedEventHandler(syncTimer_Elapsed);
syncTimer.Enabled = true;
syncTimer.AutoReset = true;
//...
void syncTimer_Elapsed(object sender, System.Timers.ElapsedEventArgs e)
{
    if (DateTime.Now.Second == 0 && DateTime.Now.Minute == 0)
    {
        //... do what you have to do
    }
}
The .NET Timer object does exactly what you expect. In this case, 1,000 milliseconds after getting Enabled, it fires an event. Because AutoReset is true, it will fire that event every 1,000 milliseconds.
Now, this is supposed to trigger the auto-sync once every hour, but instead it fires once a second. Each second, the handler checks the time: if we're on the first second of the first minute of the hour, it fires the auto-sync operations.
There are a few problems with this. The first, and arguably most minor, is that it's just silly. Firing every second turns this timer into an "Are we there yet? Are we there yet? Are we there yet?" nuisance. It makes more sense to set the timer to 3,600,000 milliseconds and just have it fire once an hour.
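A minimal sketch of that simpler approach, with a hypothetical DoSync method standing in for the actual sync work (not the original team's code):

using System;
using System.Timers;

class HourlySync
{
    static void Main()
    {
        // One interval = one sync; no wall-clock polling required.
        var syncTimer = new Timer(3600000);   // 3,600,000 ms = one hour
        syncTimer.Elapsed += (sender, e) => DoSync();
        syncTimer.AutoReset = true;           // re-fire every hour
        syncTimer.Enabled = true;

        Console.ReadLine();                   // keep the demo process alive
    }

    // Stand-in for the real sync logic; purely illustrative.
    static void DoSync() => Console.WriteLine($"Syncing at {DateTime.Now}");
}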
The second problem is that this means nearly every client is going to sync at almost exactly the same time. All of the load is going to spike at once, at the top of the hour, every hour. That's manageable, but there's no business requirement that the sync happen at the top of the hour; any time would be fine, as long as it happens only once in any given hour.
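One way to spread that load, sketched here as an assumption rather than anything the original team did: give each client a random first delay, then settle into the hourly cadence. Setting Interval from inside the handler resets the running timer, which is what makes the switch work.

using System;
using System.Timers;

class StaggeredSync
{
    static void Main()
    {
        // A random offset within the hour, so clients don't all hit
        // the server at the same moment. (DoSync is a stand-in.)
        var firstDelay = new Random().Next(1, 3600000);

        var syncTimer = new Timer(firstDelay);
        syncTimer.Elapsed += (sender, e) =>
        {
            // After the first fire, settle into a steady hourly interval.
            ((Timer)sender).Interval = 3600000;
            DoSync();
        };
        syncTimer.AutoReset = true;
        syncTimer.Enabled = true;

        Console.ReadLine();
    }

    static void DoSync() => Console.WriteLine($"Syncing at {DateTime.Now}");
}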
But the third problem is that this code doesn't guarantee that. This is .NET code running on Windows, and the Windows system clock can have a variety of different resolutions depending on version and configuration. In newer server versions, it could be as small as 1ms, if the sysadmin configured it for high accuracy. But this ran on clients, running desktop OSes, where the number would be closer to 15.6ms, which doesn't divide evenly into 1,000ms. That means this timer isn't going to fire exactly 1,000ms apart, but closer to 1,005ms apart, and depending on what the OS is doing, it might drift even farther than that. The key point is that, in some hours, the timer won't fire during the first second of the first minute, and the sync for that hour simply never happens.
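If you want to observe this yourself, here's a rough sketch that logs the actual gap between Elapsed events; the exact numbers will vary by machine and OS configuration, but on a typical desktop they hover above the requested 1,000ms.

using System;
using System.Diagnostics;
using System.Timers;

class DriftDemo
{
    static void Main()
    {
        var stopwatch = Stopwatch.StartNew();
        long last = 0;

        var timer = new Timer(1000);
        timer.Elapsed += (sender, e) =>
        {
            // Print the real measured interval, not the requested one.
            long now = stopwatch.ElapsedMilliseconds;
            Console.WriteLine($"interval: {now - last} ms");
            last = now;
        };
        timer.AutoReset = true;
        timer.Enabled = true;

        Console.ReadLine();
    }
}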
The good news is that fixing this code so it just fires once an hour, at any time, is a lot less than an hour's worth of work. Validating that in testing, on the other hand…