Admin
Sorry, that's true yeah. It's just each core has 2 hardware threads, so I always end up thinking of it as if it has 6 cores instead...
"Three symmetrical cores, each two way SMT-capable and clocked at 3.2 GHz "
http://en.wikipedia.org/wiki/Xenon_%28processor%29
Admin
Is there a Special Coder Olympics?
This guy should win gold.
Admin
I once had to program against a binary library which did something similar: it exposed one single function, which one had to call with different parameters in order to do the different steps (8+) of processing data.
The really f:d up part was that the first parameter, which decided what should be done, was a C-string. Naturally, the documentation sucked, so I ended up running `strings' on the binary library in order to figure out which "commands" were available. :-(
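For flavor, here's a minimal sketch of what such a single-entry-point API looks like (all names and command strings here are invented, not the actual library's):

#include <stddef.h>
#include <string.h>

/* One exported function; the first parameter is a C-string
   selecting which of the processing steps to perform. */
int lib_call(const char *command, void *data, size_t len)
{
    if (strcmp(command, "INIT") == 0) {
        /* ... allocate internal state ... */
        return 0;
    }
    if (strcmp(command, "TRANSFORM") == 0) {
        /* ... process len bytes at data ... */
        return 0;
    }
    if (strcmp(command, "FINISH") == 0) {
        /* ... flush results, free state ... */
        return 0;
    }
    return -1; /* unknown command; with no docs, `strings` is your only map */
}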
Admin
-98765? That's just gibberish.
FFFE7E33!?!!! AAAAHHHHHHHHHHHHHHHHHH!!!!!!!!!!!!!!
Admin
By the way, "C/C++" isn't a language.
Admin
The same call can be used to unbind GL contexts.
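Without the parent comment it's not clear which call is meant, but on Windows, for one, the bind call doubles as the unbind when handed NULLs (a sketch assuming WGL):

#include <windows.h>

/* dc and ctx would come from GetDC() / wglCreateContext() elsewhere. */
void render_once(HDC dc, HGLRC ctx)
{
    wglMakeCurrent(dc, ctx);    /* bind: make ctx current on this thread */
    /* ... issue GL calls ... */
    wglMakeCurrent(NULL, NULL); /* unbind: the very same call, NULL arguments */
}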
Admin
Like many have pointed out before, this is perfectly decent code that works around issues of the language in question in a generally accepted way, using standard practice documented in more than a few well-respected sources for the platform. The only curious thing about it is the somewhat funny function name.
Please Derrick, if this is the best you can come up with, I suggest you take a holiday break.
Admin
Guys, it's scary how many of you didn't realize that the one who said C and C++ don't have boolean types was being SARCASTIC. Both languages do, and have had booleans for quite some time now. Don't make yourselves look stupid by going "oh, but C++ DOES have a boolean type" or "Is that really true? C++ doesn't have booleans?"
Sarcasm, everyone. It's how we get an idea of which of you will be giving us the next good laugh when we see your code here.
Admin
Old 3D code? Damn, for a second there I thought I was going to be able to read about something interesting. Where's the function to trick int 10h into Mode X, or to fool DOS into protected mode? I guess 'old' isn't even the word for code that renders 3000 environment-mapped, Gouraud-shaded polys in real time on a 66 MHz 486.
Admin
lol, i love it
Admin
You forgot FILE_NOT_FOUND
Admin
Ummm.... C++ does...
(What really freaks me about WTF is that the comments are usually worse than the WTF)
Admin
I heard an even stranger analogy once.
I don't recall exactly what the argument was about, but it was related to some development work that I proposed. To which the development manager replied:
"This is like driving a car in the desert! With no doors! Anything can happen."
I always wondered what it would be like to drive a door-less car in the desert.
Admin
The peanut butter analogy is one of the best I've seen for a while.
oh, and it should be:
Admin
That's amazing. I got the same combination on my luggage.
Admin
First off, C++ has a builtin bool type. So does C99 (which depending on your definition may or may not count as C).
C++'s bool is not a typedef, not a #define, not an enum, it's a builtin. See the Guru of the Week entry for why.
http://www.gotw.ca/gotw/026.htm
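And for C99 specifically, the builtin is spelled _Bool, with <stdbool.h> mapping bool/true/false onto it. A quick demonstration of why a builtin beats a typedef'd int:

#include <stdbool.h>
#include <stdio.h>

int main(void)
{
    bool flag = 42;            /* bool is the builtin _Bool in C99 */
    printf("%d\n", (int)flag); /* prints 1: conversion clamps to 0 or 1,
                                  which a plain typedef int never would */
    return 0;
}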
Admin
As for FORTRAN, unfortunately it's still widely used in my line of work. Some of the programs I use have code dating back 20-30 years, and I can still find bugs in them. (Hurray for IMPLICIT NONE, it makes the compiler find typos that have been in the code for 20 years.)
http://www.aoc.nrao.edu/aips/
http://www.atnf.csiro.au/computing/software/miriad/
Admin
For the record, no, I'm not happy about Mr. Backus' passing.
Sure, he invented Fortran, but the damage is done, and hopefully one day people will move on and leave that language behind. It did its job, which, as already mentioned, was "being better to code in than assembly."
No, I'll always remember Mr. Backus as the creator of BNF (Backus-Naur Form). One of my favorite things to do ever since I learned to program has been studying computer languages, and BNF was my lingua franca for that.
Admin
I had a Fortran project recently, and I must say that, horrid though it still is, F90 is actually usable. You can dynamically allocate, use command-line params, define structures... As long as you don't need binary file access or string manipulation, you have a decent chance of surviving the encounter with your sanity relatively intact.
Admin
My brain is just not complex enough to understand why and how one would cook up code like that.
Admin
It may scare you, but I've seen a lot of code like this at my former employer. The company made some strange telecom equipment for the military: highly overpriced, technologically years behind the COTS stuff civilians use, but supporting the military's strange, proprietary (sometimes confidential) protocols, and, ahem, fully custom (the company motto: we'll make what they want without trying to explain to them that what they want is stupid).
This stuff was usually controlled by some embedded PowerPC CPU with software running on bare metal, without any operating system. As a consequence, essentially every application developed there had some sort of operating system in it, each one different, often interleaved with the application logic in a spaghetti manner. I've seen in-house flash file systems, in-house TCP/IP stacks, and creative use of page tables to make applications bigger than RAM possible. And of course CONCURRENCY. As you know, a telecom app, like one controlling a branch exchange, has to deal with lots of concurrent events and stay responsive. But there was no support for any form of threads. So the main loop of every app looked like this:
while (1) { doStuffA(); doStuffB(); doStuffC(); ... doStuffZ(); }
Each doStuff() represented one task (mostly handling input from some peripheral), and had to be carefully coded not to do too much work in one go (most often using some form of state machine to divide the work into stages). Nothing unusual so far. But there were also the so-called BACKGROUND TASKS. The name "background tasks" is misleading: in reality they were "realtime tasks" or "low-latency tasks", doing work that couldn't wait for the next turn of the main loop, so their execution had to be interleaved with that of the normal tasks somehow. Take the "network task": every time you called some network-related function, like send() or receive(), the hardware packet queues were checked and any newly arrived data was copied into memory buffers. If these functions were not called often enough, the hardware buffers overflowed; if they were called in the wrong proportion (too many send()s, too few receive()s), the memory buffer overflowed. In both cases packets were lost or (worse) corrupted.
So what would you do if you were asked to add some functionality to a system like this? Probably you'd need to create a new task. If your task is not a time-critical one, you're lucky: you just write a doNewStuff() function and call it from the main loop. But if it is, you have to graft your code onto some frequently called library function, like malloc, and pray it gets called often enough.
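Roughly, the shape of the whole thing (a sketch only; the names are made up and the real state machines were far hairier):

/* Each foreground task does one bounded slice of work per call. */
static void doStuffA(void)
{
    static int stage = 0;              /* state machine: resume where we left off */
    switch (stage) {
    case 0: /* ... stage one ... */ stage = 1; break;
    case 1: /* ... stage two ... */ stage = 0; break;
    }
}

/* "Background" (really: low-latency) task: drain the hardware
   packet queues into memory buffers before they overflow. */
static void networkTask(void)
{
    /* ... check queues, copy arrived packets ... */
}

/* The piggybacking trick: hide the low-latency task inside a hot
   library call and pray it gets called often enough. */
void *my_malloc(unsigned long size)
{
    networkTask();
    (void)size;
    return 0;                          /* real allocator elided */
}

int main(void)
{
    for (;;) {                         /* one slice of each task per turn */
        doStuffA();
        networkTask();
    }
}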
Admin
My government? That I pay for with my taxes? That's ... xevious. Also, insane.
What would I do? I would demand that they switch to using Erlang. They would refuse, and I would quit.
Admin
The real WTF here is the use of #undef. If they absolutely had to do it that way:
#define _GL_BEGIN glBegin
#define glBegin CoderSpecial
Then rather than doing #undef, just call _GL_BEGIN.
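One preprocessor caveat, though: since that glBegin is an object-like macro, _GL_BEGIN expands to glBegin and is then rescanned into CoderSpecial anyway. A function-like macro sidesteps the whole #define/#undef dance, because the name only expands when followed by a parenthesis (a sketch; CoderSpecial's signature is assumed):

#include <GL/gl.h>

#define glBegin(mode) CoderSpecial(mode)  /* function-like interception */

void CoderSpecial(GLenum mode)
{
    /* ... the wrapper's extra bookkeeping ... */
    (glBegin)(mode);   /* parenthesized name: no expansion, calls the real glBegin */
}

void draw(void)
{
    glBegin(GL_TRIANGLES);   /* expands to CoderSpecial(GL_TRIANGLES) */
    /* ... vertices ... */
    glEnd();                 /* and no #undef needed anywhere */
}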
captcha: stinky, yep it sure is.