Admin
Hopefully he was an intern at the company when he first wrote that abomination...
Admin
That's the problem with most WTFs. You can never tell if it's an act of utter incompetence, or malevolent genius.
Admin
Even writing an app like that on purpose is a dick move.
Admin
I am inclined to believe he was a genius.
captcha: feugiat
Admin
I think it's safe to assume this one was incompetence. Either 1) he lacked the level of skill required by his job, or 2) he was lazy to a fault. Either one I would classify as incompetent.
As for malevolent genius: that would be if he had siphoned off company data, for years, to sell to a competitor or something like that.
Admin
This guy is a genius!
He is just using the Lazy Allocation of Memory Eventing coding guidelines as found in many well designed applications that follow the Rigorous Enumeration Threading And Resolvable De-allocation pattern. Duh!
Admin
I once, as an intern, had to rewrite an app doing more or less the same thing. It took me 2 minutes to replicate the functionality with rsync, and 2 more hours to script something able to use the already existing conf file ... The wheel already exists!
Admin
This is not so uncommon. I translated a piece of C code a few years ago to Java 1.1 and got a 99% speedup as a result (you'd expect it to go in the other direction). The original C code appeared to be heavily optimized, because the hashing function included lots of documentation and there were cute shortcuts used throughout. The culprit was the memory management and the inner-loop allocations.
Admin
One the one hand is worth two in the bush.
Admin
Reminds me of merging a couple of batch files long, long ago. The new process took 10 - 20 minutes depending on the size of the data sets. Easily time for a quick ciggie. I'd often see the boss in there: "smoking again SR?" she'd ask - "my PC's hard at it" I'd reply.
I'm glad the place closed down before we'd migrated off DOS 6.1 and onto that horrible multitasking Windows.
Admin
So this code just returns the length of the file? You don't have to read any bytes of the file to know the file length -- no allocations/reallocations or file-reading necessary.
Admin
Maybe he's also done that, but nobody believes it was him because this code here "proves" he's too stupid to pull it off. Now that would be true malevolent genius
Admin
http://xkcd.com/303/
Enjoy
Admin
Only if you're gullible
Admin
TRWTF is that Rik V., having brought it down to under 20 seconds, expended effort to further reduce it to 13 seconds. The payback time for that last adjustment probably has to be measured in centuries.
Admin
I love the way that pBuffer is declared as an array of pointers, but actually used as an array of BYTE.
Admin
Mmmm, delicious incompetence. This isn't a deliberate slow-down loop from a lazy genius - a true lazy genius blends into his environment; he moves like a stalking tiger, codes like the shadow of a swooping eagle and then disappears into the night, leaving nothing but a fake username against his check-in history. Most importantly of all, the lazy genius does not get fired - because management doesn't even know he works here.
Admin
Captcha: vulputate, from the Dutch vulpatat, meaning stuffed potato.
Admin
The company may actually realize the time savings here within a couple of years, if he spent less than 2 hours on it. Here comes that big promotion.
Also, has anyone at this company ever heard of job scheduling?
Admin
Of course, this also assumes that humans are as time-efficient as machines, i.e. that five minutes spent on one thing are equal to five minutes spent on something else. I've never mastered that level of self-management-fu. If I have five minutes, I can get five minutes of work (or optimizations or whatever) done on whatever I'm doing, or I can spend five minutes task-switching to something else (usually it takes me longer than that to remember everything about where I left off) with no measurable, visible progress, i.e. I just wasted 5 minutes (and remember that achievement is 70% perception and 30% effort, and task-switching is always perceived as slacking).
A few years ago, I gave up trying to optimize my life like that and decided to try to minimize task switches. It's a lot simpler and less stressful, and if they want to fire me, they'll find a "reason" no matter how "efficient" I am.
Admin
Minor errors: should be *pResult = new ..., if(!*pResult) return..., and file.Read(*pResult,...).
delenit: de-len it: remove its length.
Admin
TRWTF is using Hungarian notation with verbose variable names in a 34 line function.
Admin
This almost screams the need for a speedup loop.
Admin
At least the original author optimized the code. Originally, the code seeked back to the start of the file, but then he optimized it so he just memcpy'd the results from the previous loop iteration.
Admin
You should have just trimmed a couple zeros off the speed-up loop.
Admin
I opt for genius.
Anybody who can create a good enough excuse to peacefully read his newspaper during office time can't be anything else.
Admin
class first {
    char* buf;
    void free_buf() {
        for (int i = 0; i <= 1024; ++i) buf[i] = 0;
    }
public:
    first() {
        buf = new char[256];
        int i = 0;
        while (FIRST[i]) buf[i] = FIRST[i];
    }
    ~first() { delete buf; }
    ostream& output(ostream& s) { cout << buf << "\n"; }
};
const char* first::FIRST = "First!";

int main() {
    first F;
    ofstream write("/tmp/first.txt");
    F.output(write);
    write.close();
    return system("cat /tmp/first.txt");
} // acsi
Admin
Actually there is no need to check for NULL, unless someone has incorrectly overloaded operator new.
Admin
I've seen worse/better improvements. I was enlisted as an intern at a hospital, where I was responsible for writing a search function for a website in classic ASP. Usually that would be a matter of doing a full-text search on a couple of tables, but in this case it involved searching a load of HTML and text files that the managing doctor had supplied, of course placed in a real developmestruction environment.
Before I started, another intern had spent weeks developing a search function and was finally done. Proud, she showed the manager how well it worked on her development machine, which it did. With a handful of pages. When it was deployed to production, the search would simply time out and show nothing.
Guess what: she did a live search on the file system for every query. A system with roughly 1500-2000 MS FrontPage 2000-generated HTML files. Just for the fun of it, I decided to let the thing run its full course. I disabled the timeout and started a random search in the morning. When I was going home 8 hours later, it was still running. It finished right after I got back to work the next day, meaning a simple search for the word "test" took just over 24 hours.
I managed to shave something off of that time. I indexed the full web site into a database and periodically ordered a reindex through a scheduled task. Since the reindex would only do anything for new or changed files, it would be done in seconds. And the actual search query? It sped up from 24 hours and a few minutes to about .4 seconds. I'd call that an improvement.
Admin
Or you could just run diff -r -q to get a list of filenames to copy.
Then, depending on your OS and available libraries, you either use the library-provided function to copy files, or you shell out and run cp.
Admin
This is one of the very, very few cases where I would opt for rewriting a C application, from scratch, in PHP.
Actually, even a rewrite in Assembler using INT 21h would be infinitely preferable.
Admin
Why did you write this? I really want to know why. Of all the code snippets that get dropped on this site, what stimulus can you blame this response on?
Honestly, this worries me a lot. I think TRWTF that happens to me on a daily basis is people who overcode things. It seems to be a rampant problem. I'm not sure if it's the programmer version of Dick measuring or not, but either way it's #$%@#$% annoying.
Admin
What's the non-programmer version of "Dick measuring"? And why is Dick proper?
Admin
Going meta is a useful skill. I don't care how good your Java or C++ skills are. I know you're an excellent programmer, but do you really need to write a program for this? And then there's the defensive meeting where the programmer says how efficient it will be, that we need the speed of (insert language here) to handle this effectively. The 800-pound gorilla in the room is of course (and the programmer gets the strawman out... Like, well mine will be more efficient because it's threaded), the OS already does it. When your only tool is a hammer, every problem looks like a nail.
Admin
Sorry, I'm still reading the article. I'll post a comment later. returns to digg
captcha: minim[al amount of work]
Admin
You guys don't get all obsessive compulsive over this sort of thing? What sort of programmers are you?
Admin
Some more WTFs in there:
(1) it does not check for success on the read.
(2) It assumes that the file will fit in RAM. If it requires virtual memory, that's another couple decades of slowdown.
(3) It assumes the allocations are all going to work.
(4) It reads 255 bytes, which is not a block size on any kind of disk. 255 out of 256 reads are going to be skewed and will require the kernel to copy buffers.
Admin
I would have just increased the size of the buffer from 256 bytes to 10 MB (or whatever, depending on the typical size of the files you're reading).
99% of the speedup with 1% of the effort. No need to unit test, as you haven't changed the logic.
Admin
Posting overly complicated VB source. And no one likes an improper Dick.
Admin
Barring memory-constrained environments, reading the contents of a file through a 256-byte buffer (especially when the point is to scan multiple, potentially large files) is the giveaway: it can be qualified as either sophomoric genius or inexcusable professional wtflocaust.