• Devi (unregistered)
    Nataku:
    The XBOX 360 only has 3 cores.

    http://en.wikipedia.org/wiki/Xbox_360#Central_processing_unit

    Sorry, that's true, yeah. It's just that each core has two hardware threads, so I always end up thinking of it as if it had six cores instead...

    "Three symmetrical cores, each two way SMT-capable and clocked at 3.2 GHz "

    http://en.wikipedia.org/wiki/Xenon_%28processor%29

  • Kalen (unregistered)

    Is there a Special Coder Olympics?

    This guy should win gold.

  • (cs) in reply to Mike
    Mike:
    Honestly, this reminds me of lots of Fortran code I've seen. In particular, a function for performing FFTs. You have to call it with a flag that says if you want to 1) perform an FFT, -1) perform an inverse FFT, or 0) set up the twiddle tables. Why not have three separate functions? Because they wanted those twiddle tables statically declared in the function, so nobody else would mess them up. I know Fortran doesn't believe in opaque data, but I'm guessing they didn't have pointers and structures (hence, parameter blocks or anything that could be made to resemble OOP) either.

    I once had to program against a binary library which did something similar: it exposed one single function, which one had to call with different parameters in order to do the different steps (8+) of processing data.

    The really f:d up part was that the first parameter, which decided what should be done, was a C-string. Naturally, the documentation sucked, so I ended up running `strings' on the binary library in order to figure out which "commands" were available. :-(
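
    For anyone who hasn't run into this pattern, here is a minimal C sketch of what such a single-entry-point, string-dispatched library tends to look like; the function name and the command strings below are invented for illustration, not taken from the actual library:

    #include <stddef.h>
    #include <stdio.h>
    #include <string.h>

    /* Hypothetical single entry point: the first argument selects the operation. */
    int process(const char *command, void *data, size_t len)
    {
        (void)data;
        (void)len;

        if (strcmp(command, "INIT") == 0) {
            /* set up internal tables, allocate buffers, ... */
            return 0;
        } else if (strcmp(command, "TRANSFORM") == 0) {
            /* run one step of processing on data */
            return 0;
        } else if (strcmp(command, "FINISH") == 0) {
            /* flush results, free internal state */
            return 0;
        }
        fprintf(stderr, "unknown command: %s\n", command);
        return -1;
    }

    When the documentation doesn't list the valid command strings, running `strings' on the binary really is about the only way to find them.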

  • Dustin (unregistered)

    -98765? That's just gibberish.

    FFFE7E33!?!!! AAAAHHHHHHHHHHHHHHHHHH!!!!!!!!!!!!!!

  • Jon (unregistered) in reply to Mcoder
    Mcoder:
    Welcome to C/C++... We don't have booleans.
    Hello, and welcome to 9 years ago.

    By the way, "C/C++" isn't a language.

  • anonny (unregistered) in reply to PseudoNoise
    PseudoNoise:
    It seems from the problem statement that this is the canonical example for multithreading
    Multithreading can be an incredible bitch, too. For instance, in Windows, the thread that makes OpenGL calls has to be the same thread that initialized OpenGL in the first place.
  • (cs) in reply to anonny
    anonny:
    Multithreading can be an incredible bitch, too. For instance, in Windows, the thread that makes OpenGL calls has to be the same thread that initialized OpenGL in the first place.
    Not really; IIRC you can use wglMakeCurrent to bind any OpenGL context to any thread (with the limitation that a context can only be bound to one thread at a time, and each thread can only have one "current" context).

    The same call can be used to unbind GL contexts.
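
    For reference, a minimal sketch of handing a context between threads with wglMakeCurrent; the hdc and hglrc here are assumed to have been created elsewhere (e.g. via wglCreateContext), and error handling is omitted:

    #include <windows.h>
    #include <GL/gl.h>

    /* Thread A: release the context so another thread can take it over. */
    void release_context(void)
    {
        wglMakeCurrent(NULL, NULL);        /* unbind the current context from this thread */
    }

    /* Thread B: bind the existing context, then issue GL calls from here on. */
    void take_context(HDC hdc, HGLRC hglrc)
    {
        if (wglMakeCurrent(hdc, hglrc)) {  /* context is now current on this thread */
            glClear(GL_COLOR_BUFFER_BIT);
            /* ... further GL calls ... */
        }
    }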

  • (cs)

    As many have pointed out before, this is perfectly decent code that works around issues in the language in question in generally accepted ways, using standard practice documented in more than a few well-respected sources for the platform in question. The only curious thing about it is the somewhat funny function name.

    Please Derrick, if this is the best you can come up with, I suggest you take a holiday break.

  • Teh dayli WFT (unregistered)

    Guys, it's scary how many of you didn't realize that the one who said that C and C++ don't have boolean types was SARCASTIC. Both languages do and have had booleans for quite some time now. Don't make yourself look stupid by going "oh but C++ DOES have a boolean type" or "Is that really true! C++ doesn't have booleans?"

    Sarcasm, everyone. It's how we get an idea of which of you will be giving us the next good laugh when we see your code here.

  • lackluster (unregistered)

    Old 3D code? Damn, for a second there I thought I was going to be able to read about something interesting. Where's the function to trick int 10h into Mode X or fool DOS into protected mode? I guess 'old' isn't even the word to describe code that will render 3,000 environment-mapped, Gouraud-shaded polys in real time on a 66 MHz 486.

  • (cs) in reply to This is nothing... really...
    This is nothing... really...:
    To avoid spreading the stupidity like peanut butter on the hood of a Porsche as it races down the highway, we have instead put the stupidity in a jar and set it on the gear shift.

    lol, i love it

  • pc (unregistered) in reply to Leo

    You forgot FILE_NOT_FOUND

  • Joce (unregistered) in reply to Mcoder
    Mcoder:
    PyneJ:
    static int called98765 = FALSE;

    Now that is just cute.

    Welcome to C/C++... We don't have booleans.

    Ummm.... C++ does...

    (What really freaks me about WTF is that the comments are usually worse than the WTF)

  • Magnus (unregistered) in reply to kimos

    I heard an even stranger analogy once.

    I don't recall exactly what the argument was about, but it was related to some development work that I proposed. To which the development manager replied:

    "This is like driving a car in the desert! With no doors! Anything can happen."

    I always wondered what it would be like to drive a door-less car in the desert.

  • Eternal Density (unregistered) in reply to Jon
    Jon:
    Mcoder:
    Welcome to C/C++... We don't have booleans.
    Hello, and welcome to 9 years ago.

    By the way, "C/C++" isn't a language.

    It is at my uni... well, it seems like it sometimes...

    The peanut butter analogy is one of the best I've seen for a while.

    oh, and it should be:

    typedef enum {
            FALSE          =  0,
            TRUE           =  1,
            FILE_NOT_FOUND =  2,
            YES            =  3,
            NO             =  4,
            SILVER         =  5,
            MAYBE_NOT      = -1
    } BOOL;
  • AdT (unregistered)

    That's amazing. I got the same combination on my luggage.

  • James (unregistered)

    First off, C++ has a built-in bool type. So does C99 (which, depending on your definition, may or may not count as C).

    C++'s bool is not a typedef, not a #define, not an enum; it's a built-in type. See the Guru of the Week entry for why.

    http://www.gotw.ca/gotw/026.htm
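
    For illustration, a tiny C99 example using <stdbool.h> (the body compiles unchanged as C++ if the include is dropped); the flag name echoes the called98765 flag from the article:

    #include <stdbool.h>   /* C99: defines bool, true and false on top of _Bool */
    #include <stdio.h>

    int main(void)
    {
        bool called98765 = false;          /* no home-grown TRUE/FALSE needed */
        if (!called98765) {
            called98765 = true;
            puts("first call");
        }
        return 0;
    }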

  • RogerWilco (unregistered) in reply to Steve
    Steve:
    Since FORTRAN has been mentioned, it should be noted that John Backus, the original developer of the language, has passed on.

    http://www.nytimes.com/2007/03/20/business/20backus.html

    For anyone who spent any time coding in Assembler or, even more to the point, raw machine code (I've done both), it is clear his contributions to the art of computer programming were monumental.

    FORTRAN is like a second language to me.

    May he rest in peace.

    As for FORTRAN, unfortunately it's still widely used in my line of work. Some of the programs I use have code dating back 20-30 years, and I can still find bugs in them. (Hurray for IMPLICIT NONE*; it makes the compiler find typos that have been in the code for 20 years.)

    http://www.aoc.nrao.edu/aips/
    http://www.atnf.csiro.au/computing/software/miriad/

    * Most of this code does not use it, but tries to be FORTRAN 66 compatible.
  • Mike (unregistered) in reply to T604

    For the record, no, I'm not happy about Mr. Backus' passing.

    Sure, he invented Fortran, but the damage is done and hopefully one day people will move on and leave that language behind. It did its job, which as already mentioned was "being better to code in than assembly."

    No, I'll always remember Mr. Backus as the creator of BNF (Backus-Naur Form). One of my favorite things to do ever since I learned to program has been studying computer languages. BNF was my lingua franca for that.

  • Amadan (unregistered)

    I had a Fortran project recently, and I must say that, horrid though it still is, F90 is actually usable. You can dynamically allocate, use command-line params, define structures... As long as you don't need binary file access or string manipulation, you have a good chance of surviving the encounter with your sanity relatively intact.

  • Robert (unregistered)

    My brain is just not complex enough to understand why and how one would cook up code like that.

  • zarazek (unregistered)

    It may scare you, but I've seen a lot of code like this at my former employer. The company made some strange telecom equipment for the military: highly overpriced, technologically years behind the COTS stuff that civilians use, but implementing the military's strange, proprietary (sometimes confidential) protocols, and, ahem, fully custom (the company motto: we'll make what they want without trying to explain to them that what they want is stupid).

    This stuff was usually controlled by some embedded PowerPC CPU, with software running on bare metal, without any operating system. As a consequence, essentially all applications developed there had some sort of operating system in them, all different, often interleaved with the application logic in a spaghetti manner. I've seen in-house-developed flash file systems, in-house-developed TCP/IP stacks, and creative use of page tables to make applications bigger than RAM possible. And of course CONCURRENCY. As you know, a telecom app, like one controlling a branch exchange, has to deal with a lot of concurrent events and stay responsive. But there was no support for any form of threads. So the main loop of all the apps looked like this:

    while (1) { doStuffA(); doStuffB(); doStuffC(); ... doStuffZ(); }

    Each doStuff() represented one task (mainly handling input from some peripheral), and had to be carefully coded not to do too much work at a time (most often using some form of state machine to divide the work into stages). Nothing unusual so far. But there were also so-called BACKGROUND TASKS. The name "background tasks" is misleading: in reality those tasks were "realtime tasks" or "low-latency tasks": they were doing jobs that couldn't wait for the next turn of the main loop, so you had to somehow interleave their execution with the execution of normal tasks. For example, the "network task": every time you called some network-related function, like send() or receive(), the hardware packet queues were checked to see whether something had arrived, and if so the data was copied to memory buffers. If these functions were not called often enough, the hardware buffers overflowed, and when they were called in the wrong proportion (too many send()s, too few receive()s), the memory buffer overflowed: in both cases packets were lost or (worse) screwed up.

    So what would you do if you were asked to add some functionality to a system like this? Probably you'd need to create a new task. If your task is not a time-critical one, you're lucky: you just have to write a doNewStuff() function and call it from the main loop. But if it is, you have to append your code to some frequently called library function, like malloc, and pray it will be called often enough.
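
    A minimal C sketch of that kind of cooperative main loop, with one task written as a small state machine and a "background" poll piggy-backed on a frequently called helper; all names here are invented for illustration:

    #include <stdlib.h>

    /* Hypothetical low-latency "background task": drain the hardware RX queue a little. */
    static void net_poll(void)
    {
        /* check the hardware packet queues, copy any arrived packets to memory buffers */
    }

    /* Wrapper that tasks are supposed to use instead of plain malloc, so the
       low-latency poll gets called "often enough". */
    static void *task_malloc(size_t n)
    {
        net_poll();
        return malloc(n);
    }

    /* One cooperative task: a state machine, so each call does only a small
       slice of work before returning control to the main loop. */
    static void doStuffA(void)
    {
        static enum { IDLE, WORKING, DONE } state = IDLE;

        switch (state) {
        case IDLE:                         /* wait for input from a peripheral */
            state = WORKING;
            break;
        case WORKING: {                    /* process one chunk */
            void *buf = task_malloc(64);
            /* ... work on buf ... */
            free(buf);
            state = DONE;
            break;
        }
        case DONE:                         /* publish the result */
            state = IDLE;
            break;
        }
    }

    int main(void)
    {
        for (;;) {                         /* the while (1) { doStuffA(); ... } loop above */
            doStuffA();
            net_poll();                    /* and hope this really is called often enough */
            /* doStuffB(); ... doStuffZ(); */
        }
    }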

  • Darwin (unregistered) in reply to zarazek
    zarazek:
    ... I've seen a lot of code like this ... strange telecom equipment for the military: highly overpriced, technologically years behind the COTS stuff that civilians use, but implementing the military's strange, proprietary (sometimes confidential) protocols....

    ... I've seen in-house-developed flash file systems, in-house-developed TCP/IP stacks, and creative use of page tables to make applications bigger than RAM possible. And of course CONCURRENCY. As you know, a telecom app, like one controlling a branch exchange, has to deal with a lot of concurrent events and stay responsive. But there was no support for any form of threads. ...

    But there were also so-called BACKGROUND TASKS. The name "background tasks" is misleading: in reality those tasks were "realtime tasks" or "low-latency tasks": they were doing jobs that couldn't wait for the next turn of the main loop, so you had to somehow interleave their execution with the execution of normal tasks.

    ... every time you called some network-related function, like send() or receive(), the hardware packet queues were checked to see whether something had arrived, and if so the data was copied to memory buffers. If these functions were not called often enough, the hardware buffers overflowed, and when they were called in the wrong proportion (too many send()s, too few receive()s), the memory buffer overflowed: in both cases packets were lost or (worse) screwed up.

    So what would you do if you were asked to add some functionality to a system like this? Probably you'd need to create a new task. If your task is not a time-critical one, you're lucky: you just have to write a doNewStuff() function and call it from the main loop. But if it is, you have to append your code to some frequently called library function, like malloc, and pray it will be called often enough.

    My government? That I pay for with my taxes? That's ... xevious. Also, insane.

    What would I do? I would demand that they switch to using Erlang. They would refuse, and I would quit.

  • Nemo (unregistered) in reply to Magnus
    I always wondered what it would be like to drive a door-less car in the desert.
    Sandy.
  • #define this_code stupid (unregistered)

    The real WTF here is the use of #undef. If they absolutely had to do it that way:

        #define _GL_BEGIN glBegin
        #define glBegin CoderSpecial

    Then rather than doing #undef, just call _GL_BEGIN.

    captcha: stinky, yep it sure is.
