• Reply To All (unregistered)

    I think this coder is very special

  • Devi (unregistered)

    hahahahaha =D

    That's brilliant, it really terrifies me to think what the guy's render loop must have looked like to make such a terrifying hack necessary...

  • (cs)

    Now that's a creative way to install a hook to handle events :X He was probably having trouble with slow response when heavy drawing happened.

  • JTK (unregistered)

    Welcome back, my friends, to the show that never ends!

  • (cs) in reply to JTK

    We're so glad you could attend! Come inside, come inside...

  • John (unregistered)

    I've been a daily reader for a long time so maybe I've become somewhat jaded. This is the first piece of code in a long time that makes me want to climb to the top of a mountain, scream at the top of my lungs "WHAT THE FUCK?!", and then jump to a grisly death. Much like you can't un-see a certain goat-related picture, you can't un-read this code. I will have nightmares for weeks.

  • PyneJ (unregistered)

    static int called98765 = FALSE;

    Now that is just cute.

    and we have to set it also...

    called98765 = TRUE;

  • The Frinton Mafia (unregistered)

    It looks to me as if something was horribly wrong elsewhere in the program (such that you had to pump messages for a bit before rendering each frame) and this is a fairly non-stupid attempt to allow that to happen without having to accompany every call to glBegin with some horrible code.

    At least this way the rest of the program will look like ordinary OpenGL code.

  • Devi (unregistered)
    this is a fairly non-stupid attempt to allow that to happen without having to accompany every call to glBegin with some horrible code.

    Can you say "Functions"?

    CheckProcessMessages(); glBegin(someMode);

    I'd much rather have the above than have perfectly innocent OpenGL calls invisibly hijacked by that lunatic...
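
    Something like this minimal Win32 sketch, for instance (CheckProcessMessages is just a hypothetical name, and the body is the usual PeekMessage drain loop - a sketch, not anything from the original program):

    #include <windows.h>

    static void CheckProcessMessages(void)
    {
        MSG msg;
        /* drain everything currently in the queue, then return to rendering */
        while (PeekMessage(&msg, NULL, 0, 0, PM_REMOVE))
        {
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }
    }

    /* ...and at each frame boundary, visibly: */
    /* CheckProcessMessages(); glBegin(someMode); ... glEnd(); */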

  • This is nothing... really... (unregistered) in reply to The Frinton Mafia
    The Frinton Mafia:
    It looks to me as if something was horribly wrong elsewhere in the program (such that you had to pump messages for a bit before rendering each frame) and this is a fairly non-stupid attempt to allow that to happen without having to accompany every call to glBegin with some horrible code.

    At least this way the rest of the program will look like ordinary OpenGL code.

    So this code is sort of like the "Neo" of the program... all the stupidity has been concentrated here.

    To avoid spreading the stupidity like peanut-butter on the hood of a Porsche as it races down the highway, we have instead put the stupidity in a jar and set it on the gear shift.

  • Skas (unregistered)

    sweet mother of pearl, my brain!

    captcha: bathe, if they hire someone other than a monkey to write their code, we'll allow it!

  • (cs)

    3d = tridefecta?

  • (cs) in reply to This is nothing... really...
    This is nothing... really...:
    So this code is sort of like the "Neo" of the program... all the stupidity has been concentrated here.

    To avoid spreading the stupidity like peanut-butter on the hood of a Porsche as it races down the highway, we have instead put the stupidity in a jar and set it on the gear shift.

    That is probably the strangest analogy that I have ever heard.

  • James (unregistered)

    So, uh... I guess it might be because I haven't had my coffee yet, but I still have no idea what this is actually supposed to do. Does somebody want to lay it out for me?

  • (cs) in reply to PyneJ
    PyneJ:
    static int called98765 = FALSE;

    Now that is just cute.

    and we have to set it also...

    called98765 = TRUE;

    Welcome to C/C++... We don't have booleans.

  • (cs) in reply to James
    James:
    So, uh... I guess it might be because I haven't had my coffee yet, but I still have *no* idea what this is actually supposed to do. Does somebody want to lay it out for me?

    Basically it looks for messages in the queue and, if any are found, pumps them all out before executing whatever is supposed to come after the coderspecial call.

    This seems like a fix for something even worse, though. As stated, this code was in front of every call, presumably copy-pasted into each function. The next developer took the proper step of making it a function called coderspecial and simply calling that function instead.

    Why was this the proper step? He hasn't found the root cause yet, but when he does he can simply comment out the code in coderspecial once, rather than track down all the copies in order to remove the footprint. So while there is a WTF we have not yet seen, this coder actually did something right.

  • M. Dizzy (unregistered) in reply to James
    James:
    So, uh... I guess it might be because I haven't had my coffee yet, but I still have *no* idea what this is actually supposed to do. Does somebody want to lay it out for me?

    Do you really want to know? Basically all it does is ensure that at least 250 ticks pass between runs of the message pump. That is, unless the mode is -98765, in which case it processes messages immediately.

  • (cs) in reply to Mcoder
    Mcoder:
    Welcome to C/C++... We don't have booleans.

    Um, last I checked, C++ had a bool type.

  • Shinobu (unregistered) in reply to James

    Every time glBegin is called and more than a quarter of a second has passed since messages were last processed, it processes messages. Presumably this was done because sometimes a frame would take a long time to render and the program would become unresponsive. The coder didn't want to put extra ProcessMessages calls throughout the rest of his code, so he hijacked glBegin. 98765 is just a number designed not to be equal to any constant that would normally be fed to glBegin.
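
    Pieced together from the fragments quoted elsewhere in this thread (CoderSpecial, called98765, lastCheck, the 250 tick check), the hijack would have looked roughly like this. This is a speculative reconstruction, not the original code; in particular the macro placement and the Win32 pump are guesses:

    #include <windows.h>
    #include <GL/gl.h>

    static void CoderSpecial(int mode)
    {
        static int   called98765 = FALSE;   /* armed by the magic -98765 call */
        static DWORD lastCheck   = 0;
        DWORD        now         = GetTickCount();

        if ((mode == -98765) || (called98765 && (now - lastCheck > 250)))
        {
            MSG msg;
            /* drain the Win32 message queue so the window stays responsive
               even when a frame takes a long time to render */
            while (PeekMessage(&msg, NULL, 0, 0, PM_REMOVE))
            {
                TranslateMessage(&msg);
                DispatchMessage(&msg);
            }
            called98765 = TRUE;
            lastCheck   = now;
        }

        if (mode != -98765)
            glBegin((GLenum)mode);          /* fall through to the real glBegin */
    }

    /* defined *after* the wrapper so the call above still reaches the real
       glBegin; every later call site in the code base is silently redirected */
    #define glBegin(mode) CoderSpecial(mode)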

  • Devi (unregistered)
    So, uh... I guess it might be because I haven't had my coffee yet, but I still have *no* idea what this is actually supposed to do. Does somebody want to lay it out for me?

    glBegin is a function that you call while programming in OpenGL that means (I think, it's been a while) "I am about to start rendering a single frame", so every time the screen updates glBegin will be called once to prepare the renderer to start drawing. Essentially the code he's written replaces all the calls in the code base to glBegin with a call to his automagical function, which as far as I can see does the standard Windows message-handling loop every 250 ticks (a tick normally being a single frame update) before calling the actual implementation of glBegin. It also does some weird madness with a flag to make sure that the message handling only starts happening after you call CoderSpecial with the magic parameter -98765.

  • leeg (unregistered) in reply to mkb

    I come from a century throughout which C has a boolean type. It's the 21st century.

  • (cs) in reply to mkb
    mkb:
    Mcoder:
    Welcome to C/C++... We don't have booleans.

    Um, last I checked, C++ had a bool type.

    Does C99 count as "C"?

  • tehdew (unregistered) in reply to mkb
    mkb:
    Mcoder:
    Welcome to C/C++... We don't have booleans.

    Um, last I checked, C++ had a bool type.

    And last time I checked, sarcasm was not uncommon in these parts.

  • Mike (unregistered)

    Honestly, this reminds me of lots of Fortran code I've seen. In particular, a function for performing FFTs. You have to call it with a flag that says if you want to 1) perform an FFT, -1) perform an inverse FFT, or 0) set up the twiddle tables. Why not have three separate functions? Because they wanted those twiddle tables statically declared in the function, so nobody else would mess them up. I know Fortran doesn't believe in opaque data, but I'm guessing they didn't have pointers and structures (hence, parameter blocks or anything that could be made to resemble OOP) either.

    Also, apparently, this Fortran didn't have dynamically-sized arrays or anything like malloc, because you had to provide the function with not only the input and output buffers, but also an intermediate (temp) buffer.

    I hate Fortran. I don't normally hate languages I don't have to use, but the code in this WTF just screams to me "former Fortran programmer." Poor guy. Someone put him out of his misery.
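
    For the curious, here is a hypothetical C rendering of that style of interface - one entry point, a direction/setup flag, hidden static tables, caller-supplied scratch. It is not the routine Mike saw, just a naive O(n^2) stand-in that shows the calling convention (and it assumes n <= 1024 to keep the sketch short):

    #include <complex.h>
    #include <stddef.h>

    /* iflag: 1 = forward transform, -1 = inverse (unscaled), 0 = build the twiddle table.
       You must call it once with iflag == 0 before the first real transform - exactly
       the quirk described above. 'work' mirrors the mandatory caller-supplied scratch
       buffer; this naive version ignores it. */
    void fft(int iflag, const double complex *in, double complex *out,
             double complex *work, size_t n)
    {
        static double complex twiddle[1024];   /* hidden table "nobody else can mess up" */
        static size_t table_n = 0;
        (void)work;

        if (iflag == 0) {
            if (n > 1024) return;              /* keep the sketch simple */
            table_n = n;
            for (size_t k = 0; k < n; k++)
                twiddle[k] = cexp(-6.283185307179586 * I * (double)k / (double)n);
            return;
        }
        if (table_n == 0) return;              /* forgot the iflag == 0 setup call */

        for (size_t k = 0; k < n; k++) {
            double complex acc = 0;
            for (size_t j = 0; j < n; j++) {
                double complex w = twiddle[(j * k) % table_n];
                acc += in[j] * (iflag == 1 ? w : conj(w));   /* sign picked by the flag */
            }
            out[k] = acc;
        }
    }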

  • (cs)

    Actually, glBegin(GLenum) is used to tell OpenGL that we're about to define a primitive. Mainly:

    glBegin(GL_TRIANGLES);
    glVertex3f(0.4f, 0.3f, 0.2f);
    glVertex3f(1.4f, 1.3f, 1.2f);
    ...
    glEnd();

    which makes this code even MORE of a WTF. What the FUCK was this guy thinking?

    To avoid spreading the stupidity like peanut-butter on the hood of a Porsche as it races down the highway, we have instead put the stupidity in a jar and set it on the gear shift.

    I think this quote deserves a WTF front page story! Wow, this made me laugh AND think WTF at the same time!

  • (cs) in reply to mkb

    Yep, and it is an int (typedef int bool).

  • Devi (unregistered)
    Actually, glBegin(GLenum) is used to tell OpenGL that we're about to define a primitive

    Ah, that actually makes his code make a little more sense now :)

  • (cs) in reply to John
    John:
    Much like you can't un-see a certain goat-related picture, ...

    Hey, thanks for reminding. The vivid memory of it just flashed in front of my eyes...

  • (cs) in reply to Shinobu
    Shinobu:
    Every time glBegin is called and more than a quarter of a second has passed since messages were last processed, it processes messages. Presumably this was done because sometimes a frame would take a long time to render and the program would become unresponsive. The coder didn't want to put extra ProcessMessages calls throughout the rest of his code, so he hijacked glBegin. 98765 is just a number designed not to be equal to any constant that would normally be fed to glBegin.

    Ehh... that's not quite right. The right side of this:

    if ((mode == -98765) || (called98765 && (now - lastCheck > 250)))
    will never be true. Every time it gets there, called98765 will have been set to FALSE. Of course, I'm assuming that FALSE == 0. It could be 1, 7, or 65536.

  • Devi (unregistered)
    Every time it gets there, called98765 will have been set to FALSE.

    No it won't, because called98765 has been declared as local and static, meaning it will only be initialized to FALSE the first time you call the function. On subsequent calls the flag will retain whatever value it had the last time the function was called.
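
    A tiny demonstration of the point, in case anyone wants to try it (illustrative only):

    #include <stdio.h>

    static int armed_yet(void)
    {
        static int called = 0;   /* the initializer runs once, not on every call */
        int was = called;
        called = 1;
        return was;
    }

    int main(void)
    {
        printf("%d\n", armed_yet());   /* 0: first call, just initialized        */
        printf("%d\n", armed_yet());   /* 1: the static kept its previous value  */
        printf("%d\n", armed_yet());   /* 1 */
        return 0;
    }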

  • Steve (unregistered)

    Since FORTRAN has been mentioned, it should be noted that John Backus, the original developer of the language, has passed on.

    http://www.nytimes.com/2007/03/20/business/20backus.html

    For anyone who spent any time coding in Assembler or, even more to the point, raw machine code (I've done both), it is clear his contributions to the art of computer programming were monumental.

    FORTRAN is like a second language to me.

  • drdamour (unregistered) in reply to Devi

    C#/JAVA static != VB6/C++ static

  • (cs) in reply to Mike

    Maybe you're ecstatic about this then: http://developers.slashdot.org/

  • Reed (unregistered)

    I'm guessing this came as a result of porting some OpenGL program to Windows, without refactoring how other parts of the program work. Trying to get 3D render loops integrated with event-driven (and latency-laden) GUI toolkits such that both the 3D and the GUI work perfectly is a huge bitch.

  • PseudoNoise (unregistered) in reply to Reed
    Reed:
    I'm guessing this came as a result of porting some OpenGL program to Windows, without refactoring how other parts of the program work. Trying to get 3D render loops integrated with event-driven (and latency-laden) GUI toolkits such that both the 3D and the GUI work perfectly is a huge bitch.
    It seems from the problem statement that this is the canonical example for multithreading: like the spreadsheet app that does background calcs while rendering output, the word processor that does background spell checking while interacting with the user, or apparently this graphics app that drains the message pump while working. In each case, the anti-pattern is sprinkling "do some background work" function calls in the middle of the main code loop.

    Granted, I've never worked with graphics, but I have worked with win32. Any reason this won't be a good solution? Googling shows some folks have tried it.
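
    Roughly the shape it takes on Win32 - a sketch only, with made-up names (WorkerProc, WM_WORK_DONE), and setting aside the OpenGL wrinkle that a rendering context is tied to a single thread:

    #include <windows.h>

    #define WM_WORK_DONE (WM_APP + 1)   /* made-up completion notification */

    static DWORD mainThreadId;

    static DWORD WINAPI WorkerProc(LPVOID param)
    {
        (void)param;
        /* ...the long-running background work goes here... */
        PostThreadMessage(mainThreadId, WM_WORK_DONE, 0, 0);
        return 0;
    }

    int main(void)
    {
        MSG msg;
        mainThreadId = GetCurrentThreadId();
        /* force creation of this thread's message queue before the worker posts to it */
        PeekMessage(&msg, NULL, WM_USER, WM_USER, PM_NOREMOVE);

        HANDLE worker = CreateThread(NULL, 0, WorkerProc, NULL, 0, NULL);

        while (GetMessage(&msg, NULL, 0, 0) > 0)   /* the UI thread never blocks on the work */
        {
            if (msg.message == WM_WORK_DONE)
                break;
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }

        WaitForSingleObject(worker, INFINITE);
        CloseHandle(worker);
        return 0;
    }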

  • Zygo (unregistered) in reply to John
    John:
    Much like you can't un-see a certain goat-related picture, you can't un-read this code.

    DAMNIT! I just managed to forget that thing...

    <whimpers and twitches in the corner>
  • Devi (unregistered)
    It seems from the problem statement that this is the canonical example for multithreading: like the spreadsheet app that does background calcs while rendering output, the word processor that does background spell checking while interacting with the user, or apparently this graphics app that drains the message pump while working. In each case, the anti-pattern is sprinkling "do some background work" function calls in the middle of the main code loop.

    Granted, I've never worked with graphics, but I have worked with win32. Any reason this won't be a good solution?

    A lot of modern games use the approach you describe (an Xbox 360 has 6 cores for example, making multithreaded coding pretty much compulsory); however, this is a port of old code. Chances are therefore:

    1. It will be single-threaded (old-skool 3D applications couldn't afford the processor hit of multithreading; they had to run as fast as possible)
    2. The simulation logic and the render logic will be completely intertwined. That is, an object will update itself by doing its business logic (moving, colliding, etc.) and then kicking off its rendering logic. Generally this means the render code accesses the same data as the business logic extensively.

    If you wanted to separate the business and render logic properly so they could run on separate threads, you would probably end up having to either rewrite the entire application, or alternatively write so many weird and wonderful hacks that the original SOD would look like Shakespeare does C++ in comparison :)

  • nobody (unregistered)

    [Church Lady mode on] Isn't this coder special. Who made him write code like that? Could it be.... SATAN! [Church Lady mode off]

  • EmmanuelD (unregistered)

    "Think Different"?

  • (cs) in reply to Steve
    Steve:
    Since FORTRAN has been mentioned, it should be noted that John Backus, the original developer of the language, has passed on.

    http://www.nytimes.com/2007/03/20/business/20backus.html

    For anyone who spent any time coding in Assembler or, even more to the point, raw machine code (I've done both), it is clear his contributions to the art of computer programming were monumental.

    FORTRAN is like a second language to me.

    Oh my. That explains why the flags are at half-staff. Seriously, where the hell would we all be without formal grammars (Backus-Naur Form) and Fortran? It was bad enough when the GCC folks discontinued g77, but this makes me sad. Ranks right up there with Jon Postel's passing.

  • Zygo (unregistered) in reply to Mike
    Mike:
    Honestly, this reminds me of lots of Fortran code I've seen. In particular, a function for performing FFTs. You have to call it with a flag that says if you want to 1) perform an FFT, -1) perform an inverse FFT, or 0) set up the twiddle tables.

    The 1 and -1 make sense for FFT, since (depending on the implementation) you can do the same math with either value stuck into the equations as a constant.

    0 is a bit out of place though.

    I once worked with an industrial digital camera application which provided for user-supplied image processing via a DLL provided by the user. The DLL had to provide one function with 22 arguments of various types. The second argument specified the requested operation, and each operation code used various subsets of the 22 arguments (8 doubles, 12 ints, and two char *'s).

    Somewhere in the main application code (which we never saw, but had to learn about through trial, error, and disassembling it in the debugger) there was a function that looks sort of like:

      static int iv1, iv3, iv4, iv5, iv6, iv11, iv12;
      static double dv1, dv2, dv3, dv4, dv5, dv6, dv7, dv8;
      static int statusBits;
      char *fileName;    // set by "file save" dialog

    void callUserFileSaveFunction(int iv7, int iv8, int iv9, int iv10) {
      char *ucp;           // uninitialized, "reserved" according to the manual
      statusBits = dllfunc(iv1, OP_SAVE_IMAGE_TO_FILE, iv3, iv4, iv5,
                           iv6, dv1, dv2, dv3, dv4, iv7, iv8, ucp, dv5,
                           dv6, fileName, iv9, iv10,
                           iv11, iv12);
    }
    
    Mike:
    Also, apparently, this Fortran didn't have dynamically-sized arrays or anything like malloc, because you had to provide the function with not only the input and output buffers, but also an intermediate (temp) buffer.

    Actually that's fairly common for code of this type, in any language. Various SSL libraries, zlib, some image libraries, even C stdio all allow for user-specified buffer memory areas, and in some cases insist on them. On embedded systems the application has to control who gets to use the "fast" memory (some address range that decodes with higher clock rate, write-back caching, or unshared RAM in a multi-CPU environment) and who gets to use "slow" memory, and there isn't enough RAM of the right kinds to run the entire application in "fast" memory, so you don't want a library function to allocate stuff from some generic pool of memory as malloc does. You have stuff like "fast_ram_malloc(size_t)" or "malloc_from_pool(size_t, pool_id)".

    For FFTs in particular, even on conventional desktop hardware there's often some kind of cache advantage to gain or memory allocation penalty to avoid by allocating a buffer in the application and using it over and over again in the library.
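
    A toy example of that caller-picks-the-memory style, with all names hypothetical - the library never mallocs, the application decides which RAM bank pays for the scratch space:

    #include <stddef.h>

    typedef struct { unsigned char *base; size_t size, used; } mem_pool;

    /* bump allocator over a caller-chosen region */
    static void *pool_alloc(mem_pool *p, size_t n)
    {
        n = (n + 7) & ~(size_t)7;                  /* round each request up to 8 bytes */
        if (n > p->size - p->used) return NULL;    /* this pool is exhausted           */
        void *ptr = p->base + p->used;
        p->used += n;
        return ptr;
    }

    static unsigned char fast_ram[16 * 1024];      /* stand-in for the "fast" region */
    static mem_pool fast_pool = { fast_ram, sizeof fast_ram, 0 };

    /* e.g. the FFT scratch buffer:
       double *scratch = pool_alloc(&fast_pool, n * sizeof(double));           */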

  • Zygo (unregistered) in reply to Licky Lindsay
    Licky Lindsay:
    mkb:
    Mcoder:
    Welcome to C/C++... We don't have booleans.

    Um, last I checked, C++ had a bool type.

    Does C99 count as "C"?

    bool was a common C extension among implementations in the field long before C99, and fairly easy to emulate with a typedef as well.

  • Leo (unregistered)

    Oh, sure, we have booleans in C :

    typedef enum {
            FALSE = 0,
            TRUE  = 1
    } BOOL;
    
    #define bool  BOOL
    #define true  TRUE
    #define false FALSE
    

    captcha: ewww - Exactly how I feel after reading this WTF.

  • Quietust (unregistered) in reply to Zygo
    Zygo:
    char *ucp; // uninitialized, "reserved" according to the manual

    That alone is a considerable WTF - "reserved" parameters should always be set to a specific value (generally 0 or NULL), otherwise all sorts of nasty things can happen when said reserved parameter suddenly starts getting used.

  • Been here long? (unregistered) in reply to Leo
    Leo:
    Oh, sure, we have booleans in C :
    typedef enum {
            FALSE = 0,
            TRUE  = 1,
            FILE_NOT_FOUND = 2
    } BOOL;
    
    #define bool  BOOL
    #define true  TRUE
    #define false FALSE
    

    captcha: ewww - Exactly how I feel after reading this WTF.

    At least do it right!
  • Botzinger Gulm (unregistered)

    All right, so this is from some old SGI OGL program that was hurriedly ported to Windows. Quite likely it has some 'modal' windows that do OGL rendering without returning to the main app loop at all - if there IS a main app loop. Now some programmer in a big hurry, with a few managers, directors, and other evils breathing down his neck that WE'LL DEMONSTRATE THIS TOMORROW MORNING AT 8AM, figures out why the message pump in the actual main loop isn't enough, but can't go through the 52785 files to hunt through the spaghetti for all 275 occurrences of misbehaving 'modal' windows. So he makes a really ugly, desperate hack. Hopefully he has since found a much better workplace.

    It also could be a game with several different screens that each have their own convoluted logic flow outside the main loop.

    The real WTF is that in Windows you have to keep the message pump manned at all times or your program sinks.

    (All resemblance to real and imaginary people and numbers is purely imaginary.)

  • Nataku (unregistered) in reply to Devi

    The XBOX 360 only has 3 cores.

    http://en.wikipedia.org/wiki/Xbox_360#Central_processing_unit

  • Botzinger Gulm (unregistered) in reply to Zygo
    Zygo:
    For FFTs in particular, even on conventional desktop hardware there's often some kind of cache advantage to gain or memory allocation penalty to avoid by allocating a buffer in the application and using it over and over again in the library.

    That would be especially true with Win9x, which would always allocate a chunk of disk space when you asked for memory. That's why programs allocated a big block up front and handed it to the malloc/free/what-have-you runtime memory management (with a fallback of allocating more Windows heap). Rest in pieces and good riddance to that.

  • Hit (unregistered)

    Silly mortals. Obviously, this special programmer was trying to introduce AOP into the C++ world.

    It's ingenious! I think I'll call it "#define-based Aspect-Oriented Programming", or DBAOP for short.

  • (cs) in reply to Mcoder
    Mcoder:
    PyneJ:
    static int called98765 = FALSE;

    Now that is just cute.

    and we have to set it also...

    called98765 = TRUE;

    Welcome to C/C++... We don't have booleans.

    Now I see why C# was loved when it came.

    Arghhhhhhhhh.
