Admin
Many if not most DOS oldies had timing loops, which meant they ran poorly on newer computers. In fact, this problem was so common that it was a noteworthy occasion when an old DOS game was properly coded, such as Alley Cat (which ran fine on faster computers; see http://www.mobygames.com/game/alley-cat ).
Admin
http://support.microsoft.com/kb/192841/en-us
(This irrelevant line is added to speed up posting because without this irrelevant line TDWTF said this was spam.)
Admin
Flippin eck. Can't even say it.
Admin
A little bit. You ought to be okay with the fact that most programmers are ignorant fuckers who just think they're smart.
Admin
Reminds me of probably my first experience with debugging. As a kid, I quite enjoyed playing Nibbles, which you may recall was distributed in source form (specifically, QBasic source). It actually did have code to check how fast your computer was and to adjust its internal speed based on the result of the check... too bad computers quickly got so fast that the check sometimes threw a divide-by-zero error. Just running too quickly would probably have been preferable to a division by zero. On the plus side, yay open source!
Admin
Was thinking the same thing... At risk of continuing ever more off-topic, Juliet is not asking Romeo where he is, but why he is who he is.
Admin
It happened once at my first job, but only during an installation routine (and it did eventually get eliminated).
Admin
Damn network engineers are worse.
Admin
Wow, it's nearly 2012 and all the world's still a VAX.
(Hint: The post mentions Windows 2.x).
Admin
Remarkable how the same big corporations that basically require Moore's Law for their ever-more-bloated frameworks to remain usable are ignorant of the effects of that same law on their own code just a few years hence.
Admin
Dear God, I had forgotten all about that error. Thanks for undoing months of expensive therapy.
Admin
Agreed, but the IBM PC and XT had no hardware interrupts for hsync or vsync. It's possible to sync the timer interrupt to vsync, but it affects the system clock (no RTC); games using this trick usually didn't return to DOS.
Talking about WTF speed-up loops: most old PC/XT software had to wait before writing text! The original IBM CGA card displayed "snow" across the entire display unless video RAM was written during the hsync or vsync periods.
Admin
Pretty damn badly, I'd imagine. Seeing as "Wherefore" means "Why", not "Where".
Yeah. I need to get out more.
Admin
I remember when I was a kid I had a 386DX2 and got a Simpsons arcade port. Twice in the game there were bonus stages that asked you to press two alternating keys as fast as you could. Unfortunately, The Computer Is A Cheating Bastard {url=http://tvtropes.org/pmwiki/pmwiki.php/Main/TheComputerIsACheatingBastard}™{/url}* and your enemies would finish the same task in under 2 seconds. It was humanly impossible to defeat them, but pressing Turbo slowed them to a crawl and I ended up winning without breaking a sweat.
Akismet's creator is guilty of crimes against humanity. How the hell do we put up with this monstrosity?
*Intentionally broken tag, or my comment wouldn't get through.
Admin
For GCC:
gcc -O0
or
#pragma GCC optimize ("O0")
should do the trick.
Admin
Never mind memory. That code will keep incrementing the int until it reaches 1000000.
Since a 16-bit int can never reach 1000000, you can see the problem.
Admin
Everything I can think of depends on the number of times the loop runs. So putting the wrong maximum in there would introduce bugs into the application.
Admin
Back when I worked on image processing software, there was more than one customer who believed we had put in this very sort of loop...
Admin
Is Alex still at his in-laws' place eating Christmas stuff?
Admin
Any updating of shared state should do it.
Admin
Windows 2.11? I don't remember that one.
Admin
From 1989. See http://en.wikipedia.org/wiki/Windows_2.1x#Windows_2.11
Admin
“The idea is,” Alex continued, “whenever we have those really slow weeks – you know, the kind where we don’t actually have any WTFs or any other stories – we just re-post an old WTF. And then we just tell the readers that we ran into some issues with the latest story, but after a whole lot of deep-juju searching, we were able to come up with a WTF that's still interesting, and that we should be able to find a few new ones the next week.”
Admin
Suggestion for tomorrow: null
Admin
(Another one I like is: 'fulsome' means 'full' just as much as 'noisome' means 'noise'. Etymologically, 'noisome' is related to 'annoy' and 'fulsome' is related to 'foul'.)
Admin
"Dang, Ro! How's come you hafta be a black hat?"
Admin
Not everything is Java ;) The purported loop is in C-like pseudocode, and that should give you a hint ;)
In C and C++ the size of an int is platform (hardware+OS) dependent. The C standard guarantees that a signed or unsigned int will be at least 16 bits long. On some platforms it could be 32 bits, on others 16. Heck, there is nothing preventing it from being, say, 24 or 28 bits.
So if you want an integer type that is guaranteed to be a certain size, you need to use one of the typedefs in stdint.h (or <cstdint> in C++). For example, if you want a 32-bit int, you use int32_t, or uint32_t for an unsigned 32-bit int, or int16_t/uint16_t, or uint8_t/int8_t...
That's a standard capability of most (if not all) languages that follow a C-like syntax. And a char is nothing more than an 8-bit (or 16- or 32-bit) numeric value. Take 'i', for instance. Its ASCII value is 105 decimal. If you increase it by one, you get 106 decimal, which is simply the successor of 'i', which is 'j'. So doing 'i'++ simply yields 'j'.
Asking questions is how we learn ;)
Admin
Or: "Romeo, Romeo, the real WTF is that you're a frigging Montague, FFS."
Admin
Indeed.
I'm thinking this deceased equine has been thoroughly flogged and the remains should be inhumed approximately 1.8288m below ground level. This being TDWTF, the remainder of the audience will go out and procure whips, chains, rods, Tiki torches (local story), and the like in order to continue the flagellation for the simple purpose of participation with no discernible value.
Admin
Fuckinell, yes we do. It's called "code reviews". If any of the monkeys tries to get something like that past me, then I'll need a hefty bribe and a cut of their bonuses, or they're out on their arses.
Admin
I'm naming my Silicon Valley Sno-cone shop that (or maybe my cryogenic storage facility (heck why not a combo?)). It'll be a wonderful inside joke.
Admin
Correct me if I'm wrong, but doesn't that actually make the line relevant? Or is it as useless as the speedup loop?
Admin
It's not that int used to be different; it's that in C++, the max values for int, char, long int, etc. are "implementation-defined." That means it's up to the compiler writer to define those things, possibly based on the system it's running on. There are some requirements – for example, a lowest possible max value for each type, and guarantees that some types are at least as large as others.
But the actual values may differ, and might be more likely to differ today than in the 80s. There are embedded systems with limited resources that use smaller data types than what you would find on a PC or server.
Admin
We tend to hire programmers instead of monkeys. You have to pay them - and it isn't cheap, but I've learned that working with monkeys slowly dissolves your soul. The same goes for being treated as a monkey (unless you are a monkey, of course).
I sneak in and review code behind my people's backs and then have them terminated if they force me to submit their code here. That's my process. It's Darwinian: they're the species, and I'm natural selection eliminating the unfit.
Addendum (2011-12-28 22:58): And yes, agile would be much better, but I don't have a team that works well together or sufficient proof of incompetence (yet) to have them terminated.
Admin
I actually have a real life case where I can determine their worth.
The WAN has two LANs linked by MPLS - a thousand miles between domains in the forest. They were joined a year ago as part of a merger.
My network has never gone down. My Exchange always works, my DC's are always up, my databases are always running.
The other network has a dedicated network engineer. His Exchange works for about two weeks between crashes. His SQL processes fail 50% of the time. Yes, we replaced both – hardware and software. There is a 15-second latency before even the fastest query returns. The OLAP cubes take up to 5 minutes to return results (and, as said before, half the time the cubes didn't get updated anyway). He has yet to get Kerberos to work.
I know what you're thinking: half of that crap shouldn't even be in his lap. But it isn't. This shit happens because (besides outsourced Indian quarter-million dollar OLAP boondoggle of a monstrosity and 1.5 megaloc COBOL fossil) the problem is caused by the network. It's so freaking slow, all technology that connects to it randomly goes on strike as if to protest abusive working conditions.
Is their job description really, "Show Up, The End"? Shouldn't they figure out what's causing the slowness and give us a proposal to fix it, then do it?
Network engineer really doesn't mean anything other than knowing what can be learned on the job through google or technet, yes?