Admin
Maybe they're opening a file from a network share? Maybe the file is being written to simultaneously, and it's their crude way of waiting for the write to complete?
Admin
If more people gave up, there would be fewer wars
Admin
"Opening a file is not the sort of task that's prone to transient failures"
No? External storage, from USB drives to NAS to cloud?
Admin
While the code is crude, it's not really a WTF: opening a file can fail transiently on some file systems. Given that the snippet has no backstory, there's no way to know for sure.
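For reference, a minimal sketch in Python of the kind of wrapper under discussion (the name open_with_retry, the one-second delay, and the eleven attempts are my guesses at the original snippet, which isn't shown here):

```python
import time

def open_with_retry(path, mode="r", attempts=11, delay=1.0):
    """Crude retry-on-open: try, sleep, try again, give up eventually.
    Attempt count and delay approximate the ~11 seconds discussed."""
    last_error = None
    for _ in range(attempts):
        try:
            return open(path, mode)
        except OSError as exc:  # ANY open failure triggers a retry
            last_error = exc
            time.sleep(delay)
    raise last_error  # all attempts failed; surface the last error
```

The questionable part is exactly what other comments point out: every OSError is treated as retryable, whether or not waiting could possibly help.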
Admin
One common failure on Microsoft operating systems is that the file is locked due to being opened by another process.
Admin
But how often is a file that's locked by another process going to become suddenly available within 11 seconds? I guess you'd need to know the failure modes of your own file system to decide how high that count should go before you drop out, but there's something to be said for knowing what your failure modes are. Maybe check the FileSystem object to see what is happening to the thing. Path not present? File missing? No read permission? File locked? Assume it's not going to be fixed in the next 10 seconds and drop out right away. Network error on the way to your RAID? The tape backup taking its time finding the entry point (it is an archive, after all)? Yeah, we'll wait.
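A rough Python sketch of that triage, purely illustrative (the split between "wait" and "drop out" below is my reading of the comment, not anyone's production code):

```python
import time

def open_checked(path, mode="r", attempts=10, delay=1.0):
    """Fail fast on errors that won't fix themselves; retry the rest."""
    for attempt in range(attempts):
        try:
            return open(path, mode)
        except FileNotFoundError:
            raise  # path or file missing: ten more seconds won't conjure it
        except PermissionError:
            # Muddy case: on Windows a sharing violation surfaces as a
            # PermissionError, just like a genuinely unreadable file, so
            # this is where a bounded wait can be defensible.
            if attempt == attempts - 1:
                raise
            time.sleep(delay)
```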
Admin
A common failure on ALL operating systems is that the file doesn't exist -- which gives us a perfect excuse to use that third possible boolean value of FILE_NOT_FOUND
Admin
Bananafish's WTF FTW!!
Bravo!
Admin
I knew my intense study of programming WTFery would pay off one day ;)
Admin
Ehm. Opening a file can often fail when it's locked by another process, for example. This code is totally reasonable, especially if the OpenFile method opens the file for read/write access.
Admin
Just to give an example of how unreliable opening a file can be, even on a local hard disk with no user touching it:
I once had a case where, for just a few users, a file could not be opened with read/write sharing. It turned out to be their installed McAfee. The only way to open the file was to literally try opening it four times in a row (no delay needed); until then, that suboptimal background process kept an exclusive lock on a completely harmless UTF-8 text file.
I don't know why. The anti-virus vendor never changed this behavior (I tried three times over a span of two years), so that code was still there when I left the company. I did leave a comment, though, warning every other developer that McAfee is out in the wild doing nonsensical things, as was to be expected in hindsight.
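A hedged sketch of that workaround in Python (the function name is mine; the four consecutive attempts with no delay are as described above):

```python
def open_past_scanner(path, mode="r+", attempts=4):
    """Immediate retries, no sleep: the scanner's exclusive lock on the
    file reportedly cleared by the fourth consecutive attempt."""
    for attempt in range(attempts):
        try:
            return open(path, mode)
        except PermissionError:  # exclusive lock held by the scanner
            if attempt == attempts - 1:
                raise  # still locked after four tries: give up loudly
```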
Admin
Reading through the comments justifying the retry: they all seem to be pointing at problems with accessing the device the file is stored on, rather than with accessing the file itself. Network access might have a temporary failure that should be retried. RAID or tape drives might take more time than expected. But if the hardware and network are not the problem, then a file access failure isn't likely to go away by itself. Maybe a short-lived read/write file lock would. But that doesn't seem like something that should be baked into a generic file-reading wrapper.
Admin
Are you saying other OSes, like Linux, don't have the ability to open a file for exclusive access?
Admin
Network issues, and file locks held by another process, can last for days. And we don't know whether this function is a generic wrapper meant to replace the native library calls; FWIW, it's called by code which knows it's opening a file where it may need to wait.
Admin
Really, it shouldn't be counted as an error until you can't open it and the user notices that there's something wrong after you just start supplying junk data.
Admin
For legitimate I/O errors, retrying sometimes makes sense. Note that no sharing mode is specified, so it's going to be exclusive. If someone else is trying to do the same thing, you get a lock that likely won't persist. And even if you understand sharing modes, that doesn't mean there isn't something else out there accessing the file that doesn't.
I've written retry loops more than once when a need was observed. I've even written code that wrote to a file another program believed it had exclusive control over; I made sure my write was atomic and did it fast enough that no error ever got returned to that program. It went on peacefully, unaware the network even existed.
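The atomic-write part of that trick looks roughly like this in Python; a sketch of the general pattern, not the commenter's exact code (os.replace swaps the file in one step on both POSIX and Windows, though it can still fail if the target is held open with an exclusive lock):

```python
import os
import tempfile

def atomic_write(path, data: bytes):
    """Write to a sibling temp file, then swap it into place in one
    step, so no reader ever observes a half-written file."""
    directory = os.path.dirname(os.path.abspath(path))
    fd, tmp_path = tempfile.mkstemp(dir=directory)
    try:
        with os.fdopen(fd, "wb") as tmp:
            tmp.write(data)
            tmp.flush()
            os.fsync(tmp.fileno())  # ensure the bytes hit the disk first
        os.replace(tmp_path, path)  # the atomic swap
    except BaseException:
        os.unlink(tmp_path)  # clean up the temp file on any failure
        raise
```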
Admin
The real issue is not checking WHY the file open failed. I mean, if the open failed due to a transitory error, fine, retry it. But if the error is permanent, then failing fast is better.
Like, say, the file doesn't exist: if you're trying to open a file that doesn't exist but shows up later, there are other things wrong with your program. Failing fast is fine (unless you're really trying to adapt to the WTF of waiting for a file to exist because someone else will create it).
Or the file is in the wrong format: again, if you're expecting, say, a ZIP file and someone hands you a text document, short of some WTF elsewhere in your application, that text document won't transform into a ZIP file. (There were some classic Mac OS apps that watched for file extension changes and automatically processed files that way, so you could compress a file merely by renaming it, which would cause the compression program to launch and compress it.)
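That kind of fail-fast format check is cheap to do up front; a small Python illustration (the retry-vs-fail framing is mine, but zipfile.is_zipfile really does just look for the ZIP signature):

```python
import zipfile

def open_archive(path):
    """Fail fast on a wrong format: a text file handed to us today is
    not going to turn into a ZIP file on the tenth retry."""
    if not zipfile.is_zipfile(path):  # inspects the magic bytes
        raise ValueError(f"{path} is not a ZIP archive")
    return zipfile.ZipFile(path)
```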
Admin
Yep, I was tackling the very same problem back when Windows 2000 (and XP) was a thing. Some virus scanners kept my temporary files locked for one or two seconds, and retrying was the only way to continue correctly. Later I modified the software to use memory buffers instead of temporary files (which also helped gain some speed).
Admin
Due to the lack of context, I'm free to imagine one in which OpenFile makes sense. Here's a scenario (in addition to the ones already given in other comments):
So... just send the request and then call OpenFile.
Admin
They do, but they (almost) never use it. The problem is exclusive access locks out everything, even that which perhaps shouldn't be locked out (like an admin trying to clear the lock). Exclusive access locks are at the core of why updates in Windows require a reboot even for trivial things. On Unixes, cooperative locking is much more common, and if files need more protection than that then it is usually best to wrap a service in front of them anyway. (Devices do support exclusive locks.)
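For the curious, cooperative (advisory) locking on Unix looks roughly like this in Python; a minimal sketch, and note that the lock only constrains processes that also ask for it:

```python
import fcntl

def open_cooperative(path):
    """Advisory locking: only processes that also call flock() are
    excluded; anything else can still open and read the file freely."""
    f = open(path, "r+")
    try:
        # LOCK_NB: fail immediately instead of blocking if it's held
        fcntl.flock(f, fcntl.LOCK_EX | fcntl.LOCK_NB)
    except BlockingIOError:
        f.close()
        raise  # a cooperating process already holds the lock
    return f   # closing f releases the lock
```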
Admin
The Windows API does not mandate exclusive locks, either.
Updates are a different story. Running executables are always locked, and that's well outside the scope of this article. Now, could Windows make its core executables hot-restartable? Maybe. But I don't think the obstacle is fear of making the file writable; rather, it's the complications of inter-process cooperation between its core processes. They would need very thorough testing to make sure the system stays stable in such a scenario. They did make some improvements in that direction, and I think they simply don't prioritize that heavy burden of testing (and, I'm sure, bug fixing) just to save end users from restarting the system and to be able to advertise "completely restart-free updates".
Admin
It's fun to speculate on why. It could be that a developer was debugging something that happens on step 20 of a test. They open the file to see what could be causing an issue, have that Aha! moment, switch back to Visual Studio, make the change, compile, then get back to step 20, and it bombs out because the file is still open in the background. With a change like this, instead of starting all over, they'd notice that the file didn't open immediately and have time to close it and carry on.
Admin
Historically (Windows XP and earlier), executables run from network locations were not locked by Windows. That changed with Vista. In the old days, locking the executable happened because the memory cache system assumed that disk files could be reloaded from disk and execution safely resumed; the system didn't make that assumption about memory backed by network access. Post-Vista, the same lock is about preventing malware from altering the executable while it runs, rather than just allowing more aggressive memory management.