• sagaciter (unregistered)

    I've always disliked people like Mike. Someone writes a little thing, for their own reasons, then they put it out for the public to use once they see that there's no other tool available. And then some smartass like Mike comes along, at a later date, and gets all high and mighty.

    Who cares that a tool wasn't done the best way - when there are no other tools available. Mike is just a whiner for complaining.

  • (cs)

    Why in the hell would an Amiga coder need a third-party library to generate screenshots from his own program? Everything was bare metal access back then, so the hardest part would have been implementing an IFF compressor, which has nothing to do with reverse-engineering the OS or anything hackish like that.

  • (cs)

    I guess that the term "code review" wasn't around back then. How do you blindly put code into your program without reviewing it when you have the source?

    TRWTF is definitely Mike.

  • Leonardo Herrera (unregistered)

    The real WTF is that Mike was an asshole.

  • gnasher729 (unregistered) in reply to ¯\(°_o)/¯ I DUNNO LOL
    ¯\(°_o)/¯ I DUNNO LOL:
    It could have been worse. At least he didn't close and re-open the file after every pixel!
    Many years ago, I used a programming editor which saved files 2KB at a time, and then would do a flush operation that wrote all caches and everything to the disk (which was a floppy disk). Once my code was bigger than tiny, say 50 KB, saving files took ages.
  • Alessandro Stamatto (unregistered)

    TRWTF in this (fictitious?) story is definitely Mike...

    Before his arrogant response everything was fine; there's no problem with laughing (in private) at incredibly inefficient code.

    But shaming the creator of the code you're using?! That's sheer ingratitude. And to make things worse, Dr. John was so sad (and ashamed) that he never went public again, and stopped releasing his free programs... Way to go =p

    Anyway, it's necessary to grow a thick skin to be a programmer, because these things always happen (especially if you work with Linus - a great programmer, but a little too aggressive).

  • (cs) in reply to Anon
    Anon:
    The high point for me was MFD. Not because it was any good. It wasn't. It was absolutely terrible. But the comments were hilarious.
    And tellingly, Alex went to extraordinary lengths to stifle those very comments.
  • anonymous (unregistered)

    This reminds me of the time when I wanted to print out screenshots and other things from games that I'd written. Except my story doesn't have any crusty guy on some BBS providing free code for me to use, or asshole letters criticising people who did most of my work for me. I pored over the dot-matrix manual, figured out how to use the escape sequences to put it into pixel mode, and did my own damn screenshot code. Then I modified it to overstrike each pixel twice because the ribbon on my dot-matrix printer was getting so low on ink that the printing was hardly visible anymore.

  • anonymous (unregistered) in reply to anonymous
    anonymous:
    This reminds me of the time when I wanted to print out screenshots and other things from games that I'd written. Except my story doesn't have any crusty guy on some BBS providing free code for me to use, or asshole letters criticising people who did most of my work for me. I pored over the dot-matrix manual, figured out how to use the escape sequences to put it into pixel mode, and did my own damn screenshot code. Then I modified it to overstrike each pixel twice because the ribbon on my dot-matrix printer was getting so low on ink that the printing was hardly visible anymore.
    Oh, and I also reverse-engineered the BMP format so I could import and export monochrome bitmaps.
  • Stuart (unregistered)

    I like reading the comments.

    If someone didn't do something he's an idiot and should have done it. If someone did something then he's an arse and shouldn't have done it.

    Rather than arguing about how shit Dicksourcer is or how comicly appalling CS is maybe comments should just be turned off?

    captcha: who the fuck cares

  • Anon (unregistered) in reply to Zylon
    Zylon:
    Anon:
    The high point for me was MFD. Not because it was any good. It wasn't. It was absolutely terrible. But the comments were hilarious.
    And tellingly, Alex went to extraordinary lengths to stifle those very comments.

    How to save TDWTF:

    Step 1) bring back MFD
    Step 2) let people comment on it / post their own creations based on it (i.e. write the content for you - for free!)
    Step 3) ????
    Step 4) Profit.

  • Anon (unregistered) in reply to Stuart
    Stuart:
    I like reading the comments.

    If someone didn't do something he's an idiot and should have done it. If someone did something then he's an arse and shouldn't have done it.

    Rather than arguing about how shit Dicksourcer is or how comicly appalling CS is maybe comments should just be turned off?

    captcha: who the fuck cares

    Could have skipped your strawman comment that's for sure.

    The only thing people are complaining about is how Mike abused the guy who's free work he exploited. Not the exploiting. Not even the not doing it himself. Not the giving back his improvements. The being a total dick and humiliating the original author for no good reason.

  • Valued Service (unregistered)

    What?!?!?!

    I have to disagree with this whole article.

    A curious perversion is some code going way out of its way to accomplish (or fail to accomplish) a simple task. Or maybe having business procedures that hinder development to the point of absurdity. A curious perversion is not a program that does a simple task in exactly the way a human would THINK to do it. The code doesn't take the steps to pervert the ideal in order to more efficiently operate on a machine.

    Dr. John is guilty of not knowing the nuances of an OS and of not optimizing his code. This is not a perversion. This is a lack of polish. The only WTF I could possibly see here is how an industry can let a person get so many accolades without having adequate polish.

    Then, to top it off, Mike acts unprofessionally and bashes the Dr. by looking a gift horse in the mouth.

    Then people wonder why no one wants to contribute to open source projects.

    So if Dr. John is guilty of not having the polish to write efficient code, Mike is guilty of not having the polish to act civil.

    So, by extension, TRWTF is how an industry can allow such an unprofessional programmer to work on such an important project.

  • (cs)

    I'm sorry, "burning hundreds of floppies?" One did not burn floppies. "Burning" was a new term that came about for CD-Rs, because making those involved actually burning dye with a laser.

    Fie on you and your anachronisms.

    Addendum (2014-06-19 14:56): Also I recently came across a similar WTF in production code where I work, written by a skilled senior developer who must have had a brainfart that day. Shit happens.

  • BryanJ (unregistered)

    TRWTF is writing a letter.

  • QJo (unregistered) in reply to ¯\(°_o)/¯ I DUNNO LOL
    ¯\(°_o)/¯ I DUNNO LOL:
    It could have been worse. At least he didn't close and re-open the file after every pixel!

    I'm going to guess that if it was written in C, it did NOT use the standard library FILE* type, because that buffers output by default. From the way this sounds, he probably did an OS-level write call for each pixel.

    But even that shouldn't have been a problem if the OS was doing half-decent buffering. Rewriting an entire sector because one byte was written to an open file sounds pretty stupid, even if you're worried about the possibility that a user can hit a manual eject button at any moment. But what do I know, I was using Macintosh at the time. Maybe he called a sync/flush function after every pixel, too.

    On the other hand, I'm not surprised that a PhD in CS would be too ivory towered to comprehend the concept of how long I/O requests take to finish. As the saying goes, Piled Higher and Deeper.

    The initial version of the first usable word processor for the BBC-B was a bit slow and clunky because it wrote strings one byte at a time. It was only the mark 2 version which worked more efficiently and quickly -- and that was a direct result of a more efficiently-written OS and more memory (64k in total, that's ROM and RAM and includes the space taken up by BASIC and the rest of the OS) to play with.

    I suspect that the screenshot program may also have been written at a time when memory constraints were so severe that no such 1kB buffering capability existed. By the time Mike turned up, the OS had evolved so that the buffering was available, but Dr. John had not gone back through all the software he had written just to make more efficient use of the better I/O.

    This puts Mike's crassness into the same league as the people who ridicule Robinson Crusoe for not using his mobile phone to call for help.

  • Tux "Tuxedo" Penguin (unregistered) in reply to Paul Neumann
    Paul Neumann:
    LivingForASolution:
    ^^This
    Which?

    I would like to take your comment seriously; but given that you have failed to master commonly mistaken Reply/Quote semantics I must question your upbringing.

    Dear Paul Neumann,

    When you see "^^This" comment or similar one, without appropriate quote, you assume it was directed at comment just before it. Unless your education about commenting has been entirely confined to theory and not practice, I suspect you may be exaggerating your credentials.

  • Some Guy (unregistered)

    What I don't understand is how the screenshot toolkit code was so well written, but the first time you try to actually integrate it into a working application, it's almost unusably slow.

    So Dr. John has the skills to design and write a very functional toolkit, but never used it in a single application that would make it obvious that it was practically unusable without disk buffering?

  • OldPeter (unregistered)

    What I don't understand: Why would anyone need to resort to some third-party screenshot kit? It was included! With every Amiga came a disk with AmigaBasic (OK, a bit of a WTF in its own right) together with some very instructive sample programs. And IIRC one of the programs showed how to write and read ILBM files, which was the main Amiga graphics format at the time. Though that was Basic, it was easy to understand, and anybody should have been able to convert it into straight and fast C code in a moment. Strange.

  • Some Other Guy (unregistered) in reply to Some Guy

    Sounds like Dr. John's very functional toolkit was a proof of concept and optimization was left as an exercise for the reader. No wonder Dr. John went incommunicado after it became apparent that his very generous work was unappreciated by asshats like Mike.

  • Zapp Brannigan (unregistered) in reply to Tux "Tuxedo" Penguin

    How do you know that someone didn't insert a comment or two before "^^This" was added?

  • esse (unregistered) in reply to Tux "Tuxedo" Penguin
    Tux "Tuxedo" Penguin:
    When you see "^^This" comment or similar one, without appropriate quote, you assume it was directed at comment just before it.
    Of course, in this case that assumption would be wrong, as you can see from the "in reply to" link in the top right corner of the "^^This" in question. (That does make quoting less important here than on forums that don't directly provide that information, although it's probably a good idea to do so anyway for clarity.)
  • Norman Diamond (unregistered) in reply to balazs
    balazs:
    The Daily Happy Ending, where is my WTF??? Every second programmer does write to file byte by byte at least once at some point in their career.
    You mean doing things like putchar('H') or cout << 'w' ? If the OS buffers disk sectors or the library buffers lines or records, it's not a problem.

    More troublesome is storing a value in a char or any datatype shorter than the memory word size, forcing the controller to do a read-modify-write operation. Well over 50% of programmers do stuff like this.

  • Norman Diamond (unregistered) in reply to Steve
    Steve:
    Jim the Tool:
    Eh, at the time the term open source didn't even exist. It only came into existence in 1998 or later.
    More like 1990, that's about when the BSD and GPL licenses were created. Prior to this we just called everything public domain.
    The GPL license (or at least the gnu GPL license) existed before 1990. It wasn't public domain. Public domain means you can do whatever you want with it, but the GPL puts restrictions on what you can do. The slang word "copyleft" probably hadn't been invented yet but the meaning was there.
  • Norman Diamond (unregistered) in reply to gnasher729
    gnasher729:
    ¯\(°_o)/¯ I DUNNO LOL:
    It could have been worse. At least he didn't close and re-open the file after every pixel!
    Many years ago, I used a programming editor which saved files 2KB at a time, and then would do a flush operation that wrote all caches and everything to the disk (which was a floppy disk). Once my code was bigger than tiny, say 50 KB, saving files took ages.
    Many years ago, I used a programming editor which didn't save files 2KB at a time, and when I did a "w" operation (same as what would later be a ":w" operation in vi) it started writing all the caches, but Unix crashed and when Unix came back up it had no trace of my file, not the old version, not the new version, not a mixture. fflush() has a reason to exist. sync() has a reason to exist.

    (But sometimes you'd better not fflush because Japanese toilets have sync built in:

    [image]

    http://www.geocities.jp/hitotsubishi/suzushii.jpg)

  • Norman Diamond (unregistered) in reply to dtech
    dtech:
    Sadly a CS degree still isn't a guarantee that the holder can program his/her way out of a wet cardboard box.

    In my current master programs there's a guy who doesn't anything about programming (he didn't know what a variable or loop was) despite having completed a CS bachelor degree at a foreign university. He lifted on others for the group labs and projects, and was allowed to pass them despite nearly every partner protesting that he didn't contribute.

    For his thesis he "designed" a system for a business. It's something a first year student would've done better. He got the minimal passing grade.

    It's so sad, as it greatly devaluates the ~80% of graduates who actually are decent or better programmers.

    It's not just foreign universities. This example comes from the country where Commodore got its start.

    An undergraduate course might have around 60 students. Around 40 students deserved to pass the course. But the prof couldn't fail 33% of the students, so the prof manipulated some grades to let around 58 students pass.

    Then computer science became popular.

    An undergraduate course might have around 250 students. Around 40 students deserved to pass the course, same as before. The prof had to manipulate some grades to let around 240 students pass.

    80% of graduates are not decent or better programmers. They provide the material that makes this site exist.

  • Jeff Grigg (unregistered)

    No WTF here. Move along.

    I learned early on that most software written by most people is crap. Sharing software as listings to be typed in was common in books and magazines early on. But I learned early on that most were hardly worth typing in, they were so bad. But they were good for education and inspiration.

    So when adopting just about any software, it's a good idea to look it over and try it out, to see how well it works. It's crazy to assume that some other person or organization just happened to write just the software you wanted, just the way you'd like it done.

  • Ike (unregistered) in reply to Rupee Everet
    Rupee Everet:
    MrOli:
    I hope by "Open Source", they mean the MIT or BSD license and not GPL, because otherwise incorporating GPL software into your closed source code and releasing it without source is fairly illegal. GPL is also partially incompatible with Shareware, in that you can charge a distribution fee but no ongoing fees - so as long as one person paid you to give it to them, they can redistribute it at no cost.

    http://www.gnu.org/licenses/gpl-faq.html#DoesTheGPLAllowDownloadFee

    Ah, the GPL argument. The programmer's equivalent of Godwin's law.

    What are you, some kind of GPL argument Hitler?

  • Jonathan (unregistered) in reply to Zylon
    Everything was bare metal access back then, so the hardest part would have been implementing an IFF compressor

    So you know what a copper is, and what it does, how that can change the operation of the display hardware, when it can do so, and what services related to the display the copper helps Workbench to provide?

    Answer: display list processor, stores values to chipset registers, modes/planes per pixel/colors/video pointer/etc, at any point while drawing the display, and you can have multiple independent apps on the screen at once, stacked vertically, each with their own screen parameters.

    For that last one you need to know the OS intimately and scrape a lot of information out of the bits the OS designers never thought you'd need to touch, because saving the copper list isn't really an option. You don't need to know that your OS treats all disk writes as synchronous (spoiled kids never walked sixteen bits uphill both ways, spending all their CPU time and RAM fiddling with weak caches and going blind) -- but it helps.

  • Dominic (unregistered)

    If Mike made a public post to "thank" Dr. John and then immediately take a verbal swipe at him over an application he had nothing to do with, that is certainly TRWTF.

    Hey writers, do you think you could maybe not glorify the bad guys?

  • Friedrice The Great (unregistered) in reply to Jonathan
    Jonathan:
    Everything was bare metal access back then, so the hardest part would have been implementing an IFF compressor

    So you know what a copper is, and what it does, how that can change the operation of the display hardware, when it can do so, and what services related to the display the copper helps Workbench to provide?

    Answer: display list processor, stores values to chipset registers, modes/planes per pixel/colors/video pointer/etc, at any point while drawing the display, and you can have multiple independent apps on the screen at once, stacked vertically, each with their own screen parameters.

    For that last one you need to know the OS intimately and scrape a lot of information out of the bits the OS designers never thought you'd need to touch, because saving the copper list isn't really an option. You don't need to know that your OS treats all disk writes as synchronous (spoiled kids never walked sixteen bits uphill both ways, spending all their CPU time and RAM fiddling with weak caches and going blind) -- but it helps.

    How true! Today's PC/Mac wimps, whose systems can only run one graphic resolution/color depth at a time.

  • Very Anonymous Coward (unregistered)

    OMG, I have seen this. Not in some shareware library or cheap off-the-shelf program. In a fairly expensive (mid-6 figure $$$) ERP system. I was tasked with finding out why invoices were taking so long to generate. This was on some fairly fast hardware (for the time) - analysis of the infrastructure (my domain) showed RAM, CPU, and disk latency were all at awesome levels. I had no source code, so I used the always-awesome SysInternals tools to capture disk I/O calls to see WTF. Turns out, when generating PDFs it would always output the graphics portions... one... byte... at... a... time.

    I still get to deal with this company on a regular basis, and while they still insist on putting patches into production without testing at least I've shamed them into using source control. One of these days maybe ... just maybe ... they'll learn that it's OK to use ACLs to protect files. I'm not holding my breath, though.

  • (cs) in reply to balazs

    I never wrote my output byte-by-byte because I started on punched cards. Everyone knows that God intended records and buffers to be 80 bytes each.

  • Geoff (unregistered) in reply to html nazi
    html nazi:
    dtech:
    In my current master programs there's a guy who doesn't anything about programming...
    I think you accidentally a word there.

    CAPTCHA: abigo

    Given how much the guy didn't contribute to his team mates, I would agree that he "doesn't anything" about programming!

  • Mindwarp (unregistered)

    The real WTF here is that we're not all using Amigas. I still reckon the Amigas were the best computers I ever owned. And the A500 still works.

  • (cs) in reply to balazs
    balazs:
    The Daily Happy Ending, where is my WTF??? Every second programmer does write to file byte by byte at least once at some point in their career.

    The Java class RandomAccessFile from the official JDK has a readLine() method that reads lines byte by byte... so it is basically unusable for that task.

  • (cs) in reply to sagaciter
    sagaciter:
    I've always disliked people like Mike. Someone writes a little thing, for their own reasons, then they put it out for the public to use once they see that there's no other tool available. And then some smartass like Mike comes along, at a later date, and gets all high and mighty.

    Who cares that a tool wasn't done the best way - when there are no other tools available. Mike is just a whiner for complaining.

    Agreed. He could have just informed the original creator of his improvements without being an ass about it.

  • Anon (unregistered) in reply to Some Guy
    Some Guy:
    What I don't understand is how the screenshot toolkit code was so well written, but the first time you try to actually integrate it into a working application, it's almost unusably slow.

    So Dr. John has the skills to design and write a very functional toolkit, but never used it in a single application that would make it obvious that it was practically unusable without disk buffering?

    Well, apparently it wasn't "practically unusable" for Mike since he happily shipped it out knowing it was slow.

  • (cs) in reply to plaidfluff

    Burning up the drives. From the ridiculous activity they were being asked to perform. Not an anachronism at all.

    Addendum (2014-06-20 17:11): Nevermind. Stupid assumption. They said it killed the drives.

    Burning floppies - IS the Daily WTF

  • (cs) in reply to ¯\(°_o)/¯ I DUNNO LOL
    ¯\(°_o)/¯ I DUNNO LOL:
    Rewriting an entire sector because one byte was written to an open file sounds pretty stupid, even if you're worried about the possibility that a user can hit a manual eject button at any moment.
    It's a little bit worse than that - on the Amiga, the standard floppy disk device driver is track-based (hence the name "trackdisk.device"), so it'd be rewriting 11 sectors each time (though the driver was probably smart enough to keep them cached).
  • Meep (unregistered) in reply to balazs
    balazs:
    The Daily Happy Ending, where is my WTF??? Every second programmer does write to file byte by byte at least once at some point in their career.

    And it's entirely reasonable if your standard library buffers IO for you. The professor may have assumed it already did.

  • Val (unregistered) in reply to balazs
    balazs:
    The Daily Happy Ending, where is my WTF??? Every second programmer does write to file byte by byte at least once at some point in their career.

    In a decent language on a decent operating system you don't have to care. You should be able to write byte by byte and let the system decide how to buffer it.

  • Shaun (unregistered)

    It still happens - in 2006 I came across a similar piece of code written in C# which used pixel-by-pixel rendering of an image (direct to the screen, no less) written by an "expert". A few hours' work to use arrays and off-screen rendering before transferring the final image to the screen, and I had one very happy customer who could now page through the images as fast as they liked rather than waiting several seconds per image - I didn't milk it as much as I should have done :)

  • Not quite gray beard (unregistered)

    I remember the function call overhead for putchar being significant back in those days, even if you had a buffered stdio library.

  • Norman Diamond (unregistered) in reply to omnichad
    omnichad:
    Burning up the drives.
    I wonder if that's the method the IRS uses to destroy e-mail, destroy original records of withholding after insiders alter records to reassign refunds to identity thieves instead of the actual payers, destroy records of receiving correspondence including registered letters, and destroy records of their own forms written by their insiders.

    Here's one way to destroy a hard drive, probably involving some burning: http://www.ebay.com/itm/Computer-hard-drive-melted-down-Novelty-Yes-its-real-Paper-weight-/121058269308?pt=US_Internal_Hard_Disk_Drives&hash=item1c2fa2987c

    Another way is to use a degausser to completely demagnetize a drive. Since this destroys the servo tracks, the drive can't be reused afterwards. The IRS was a customer of my employer before I joined this company, so they probably use my co-workers' products to destroy records proving what their identity thieves did to me.

    Another way is specialized erasure by software, so the drive can be repurposed afterwards, but this isn't accepted if the drive had stored documents above the level of top secret, such as photos of torture or proof of identity theft.

    It is standard practice to issue certificates documenting secure destruction of drives. I suppose those certificates went into a highly secure paper shredder.

  • Norman Diamond (unregistered) in reply to Not quite gray beard
    Not quite gray beard:
    I remember the function call overhead for putchar being significant back in those days, even if you had a buffered stdio library.
    The macro uses more CPU time than I'd expected before reading it, but it doesn't force one byte at a time to the hard drive. (Well ... it could, since the standard doesn't say how putchar has to be implemented.)
  • Goat Oobed (unregistered)

    Maybe Dr. John did his testing on an Amiga that had a hard disk? Being a doctor, he could probably afford one.

  • Cheong (unregistered) in reply to Goat Oobed
    Goat Oobed:
    Maybe Dr. John did his testing on an Amiga that had a hard disk? Being a doctor, he could probably afford one.
    Actually, even with a hard disk this would still be slow.

    10+ years ago, I was writing code to copy raw bits from one location on a hard disk to another hard disk (think something like "dd" in *nix). In the first version I also used a byte-by-byte copy and performance was extremely slow; it got much better after I changed to a buffered approach using 256 kB copies.

  • Dominic (unregistered) in reply to Val
    Val:
    In a decent language on a decent operating system you don't have to care. You should be able to write byte by byte and let the system decide how to buffer it.
    I don't think any 80s microcomputer was "decent" by that standard since automatic I/O buffering would have taken memory away from applications.
  • anonymous (unregistered) in reply to Shaun
    Shaun:
    It still happens - in 2006 I came across a similar piece of code written in C# which used pixel-by-pixel rendering of an image (direct to the screen, no less) written by an "expert". A few hours' work to use arrays and off-screen rendering before transferring the final image to the screen, and I had one very happy customer who could now page through the images as fast as they liked rather than waiting several seconds per image - I didn't milk it as much as I should have done :)
    There's no reason why it should have taken longer to render to the screen than to render to an array. They're both just places in memory. I'm guessing that the "plot pixel" routine had some horrid overhead.

Leave a comment on “Write Universe to Disk”
