• Keith Brawner (unregistered) in reply to Mason Wheeler
    Mason Wheeler:
    I think the main difference between "a code monkey" and "a real programmer" is the ability to understand what's really going on and use that understanding wisely to make design decisions. Even if you never have to write a single line of assembly code, if you can't understand what's going on at the assembly level, you're contributing to the problem.

    Are you guys even reading what Mason is writing? His statements indicate a preference for higher level languages that can manipulate below the abstraction layers. The attacks on him are along the lines of "why don't you make your own clothes?!"

    I mirror this preference - I love me some BASH and Python - but even in C++ you have to know what happens when you declare that char recvBuffer[MAX_BUFF_SIZE] (the memory demands of your program just shot up). I agree that if you can't understand what's going on at the assembly level, you're contributing to the problem. However, I am far from asking someone to actually write assembly, except as a learning exercise, in the same way that students perform derivations by hand. As with anything you wish to become good at, you have to understand what is happening and why it is happening that way.
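    To put a number on the recvBuffer point, here's a minimal C++ sketch (MAX_BUFF_SIZE is a made-up constant here, just for illustration):

    #include <cstddef>

    constexpr std::size_t MAX_BUFF_SIZE = 64 * 1024; // hypothetical: 64 KiB

    void handlePacket() {
        char recvBuffer[MAX_BUFF_SIZE]; // 64 KiB carved out of this thread's
                                        // stack on every call, used or not
        recvBuffer[0] = '\0';           // ... fill and parse as usual ...
    }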

    2nd submission...

  • lulzfish (unregistered)

    To be honest, I did this one time with Lua, to see if it would work. But it was more like an actual array, with just a few items, on a hobby site that never went public.

    Now I'm installing MySQL.

  • (cs) in reply to bethatway
    bethatway:
    The main difference between web programmers and so-called "real programmers" is that web programmers will still have a job in five years.

    WIN!

  • (cs) in reply to SoaperGEM
    SoaperGEM:
    To sum up half the comments so far:

    Blah blah blah No true Scotsman blah blah blah.

    That is true, there are too many fake Scotsmen these days. And I blame the blancmanges for that.

  • (cs) in reply to Keith Brawner
    Keith Brawner:
    Mason Wheeler:
    I think the main difference between "a code monkey" and "a real programmer" is the ability to understand what's really going on and use that understanding wisely to make design decisions. Even if you never have to write a single line of assembly code, if you can't understand what's going on at the assembly level, you're contributing to the problem.

    Are you guys even reading what Mason is writing? His statements indicate a preference for higher level languages that can manipulate below the abstraction layers. The attacks on him are along the lines of "why don't you make your own clothes?!"

    Thank you! That's exactly what I mean. At least one other person here gets it.

    I mirror this preference - I love me some BASH and Python - but even in C++ you have to know what happens when you declare that char recvBuffer[MAX_BUFF_SIZE] (the memory demands of your program just shot up). I agree that if you can't understand what's going on at the assembly level, you're contributing to the problem. However, I am far from asking someone to actually write assembly, except as a learning exercise, in the same way that students perform derivations by hand.

    Yep. I haven't actually written any assembly in a couple of years now, but I often find myself tracing through the CPU View when I'm debugging, if that's where the problem is or if I'm in a module with no source available. If I didn't know how to read it, I'd have a lot of trouble.

  • tracker1 (unregistered)

    Honestly, it was far from the worst I've seen... I would probably have kept the array in memory, and pushed it out to a serialized file that gets loaded when the application starts. The data set here is relatively small, so having it in memory isn't a huge deal. This doesn't say anything, though, about tracking changes along with order state, order history, or customer information, which is probably the bigger issue. I once dealt with an application where the variables were _m1, _m2, _3, ... _m42, and the usernames and passwords were in a database table called "Phone2" (true enough).
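    For what it's worth, a rough C++ sketch of that load-at-startup approach (file name and one-record-per-line format invented for illustration):

    #include <fstream>
    #include <string>
    #include <vector>

    // Hypothetical data file; one record per line, loaded once at startup.
    static const char* kDataFile = "products.txt";

    std::vector<std::string> loadProducts() {
        std::vector<std::string> products;
        std::ifstream in(kDataFile);
        for (std::string line; std::getline(in, line); )
            products.push_back(line);
        return products;
    }

    void saveProducts(const std::vector<std::string>& products) {
        std::ofstream out(kDataFile); // rewrite the whole file on change
        for (const auto& p : products)
            out << p << '\n';
    }

    Anything much fancier than that and you've re-invented half a database, of course.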

  • mike (unregistered)

    That's absurd.

    Reminds me of a consulting job I did converting an accounting system from FoxPro to an AS/400. The FoxPro system was created in someone's basement. Each day the last user would start the final step in the daily procedures and go home. Its job was to calculate a total, and it did some unspeakably stupid and unnecessary things along the way. I replaced it with one SQL statement that took about 20 seconds to execute.

    How do these people get these jobs?

    And what kind of consultant gets up and walks out when confronted with this all too frequent bad coding?

  • ben (unregistered) in reply to Sanity
    Sanity:
    Scott Saunders:
    Isn't this how CouchDB works?

    Not even close. I started writing a post describing exactly how, but the number of differences is quite large.

    Here's a small, simple one: Couch is designed to be able to operate on large clusters, with large on-disk databases on each one. Storray is limited by what can fit into RAM on a single machine.

    Yea.... I'm pretty sure Scott was making a joke

  • (cs) in reply to wee
    wee:
    Java and .NET are evil, got it. C/C++ aren't evil, I presume. And you would never bother with "abstractions" like the STL, right? Straight to asm for you. There's some real programming!

    Well, I don't use the STL, I'll admit, but not because it's an abstraction layer. I don't use it because it's built upon the C++ language and its template system, both of which are massive WTFs. (But that's a completely different argument.) I'm fairly competent in C and very good at Delphi, which is about as high-level as you can get and still be in native code, with an extensive, mature framework library that I love using.

    But the nice thing is that it doesn't force me to. Delphi's libraries are object-oriented and its visual controls wrap the Windows API well enough that you don't have to know anything about Win32 to get most things done. But if you don't want to use any given feature, you aren't locked into it. (Unlike, for example, Java, in which you can't choose not to use objects.)

  • Buddy (unregistered) in reply to Mason Wheeler
    Mason Wheeler:
    ...

    Why? Because of code monkeys who rely too heavily on abstractions without understanding what's really going on and how much it costs in system resources. And now they're starting to teach Java and .NET in "programming" classes in colleges. It makes me shudder when I think of the future.

    What gets my blood boiling in Windows is when a long-running background task doesn't set itself to a lower priority - e.g. Windows or Outlook search, Memeo backup, Norton scanning, etc., etc., etc. A fix can be as simple as one line, something like this in services:

    SetPriorityClass(GetCurrentProcess(), BELOW_NORMAL_PRIORITY_CLASS);

    or like this in multi-threaded apps:

    SetThreadPriority(GetCurrentThread(), THREAD_PRIORITY_BELOW_NORMAL);

    This kind of thing you'll never see in a .NET or Java application. Just not part of the mindset.

    By the way, when I wrote to Memeo about it, they "fixed" it by setting the UI to a lower priority but kept the backup engine as-is. Now in Memeo the UI takes forever to show but the background engine still runs at full priority, slowing your machine to a crawl. I decided not to talk to them any more.

  • (cs) in reply to Mason Wheeler
    Mason Wheeler:
    For example, my first computer was an Apple IIe. My first video game system was a NES. The NES would boot up and be ready to play instantly, and the Apple took a few seconds. Now I have a PS3 and a modern, high-end PC. Both of them are well more than a thousand times more powerful than their early-90s counterparts, by any metric you can think of for measuring computing power, but they take far longer to get up and running or load programs.

    Why? Because of code monkeys who rely too heavily on abstractions without understanding what's really going on and how much it costs in system resources. And now they're starting to teach Java and .NET in "programming" classes in colleges. It makes me shudder when I think of the future.

    If you ignore the fact that your modern machines do more.

    Look, your Apple ][ booted up quickly. Fine. Now let's see it sort a 40 million row CSV fil-- what's that? Can't do it? Awww. Well, your NES turns on instantly, let's see it play a Netflix-- oh can't do that either. Poor baby.

  • Grouchy (unregistered) in reply to Mason Wheeler

    Mason, I agree with your point about it being a good idea to know what's going on behind the scenes when you use a high-level language, but I have to object to the "code" vs "script" distinction: unless you are applying voltages to the CPU directly with wires from your brain, your so-called "code" is interpreted by other software. Even zeros and ones are abstractions of a sort. I don't get the attitude that because compiling C# happens in two steps and compiling C happens in one step, it's somehow an entirely different thing.

  • (cs)

    Holy mother of god you idiots are easy to troll.

  • AA (unregistered)

    @Zylon: So what's new?

    The primary audience commenting here is programmers who want to feel good about themselves because "at least I'm not that bad".

    Hence, a good amount of them will respond defensively if you challenge that notion.

    Incidentally, real men program with a magnetized needle and a steady hand.

  • taiki (unregistered) in reply to Engival

    [quote user="Engival"][quote user="Spivonious"]That said, if he absolutely wants to avoid SQL (maybe it's a db of 10 items), there's STILL better ways to do it. Dump your data to a text file as a csv or some kind of serialized array. You don't just go generating new source files on a whim, that's dumb.[/quote]

    I've read a lot of stories on TDWTF; this is the first time I've ever seen a story whose situation could be improved with XML.

  • taiki (unregistered) in reply to Mason Wheeler
    Mason Wheeler:
    For example, my first computer was an Apple IIe. My first video game system was a NES. The NES would boot up and be ready to play instantly, and the Apple took a few seconds. Now I have a PS3 and a modern, high-end PC. Both of them are well more than a thousand times more powerful than their early-90s counterparts, by any metric you can think of for measuring computing power, but they take far longer to get up and running or load programs.

    You're either a clever troll or a complete idiot.

    The irony here is that in the next paragraph you rail on programmers who rely too heavily on abstractions, when, in reality, your NES and Apple II never had a HDD, a TCP/IP stack (okay, this is possible but not likely), or a 3D accelerator. Besides, I'd happily compare the loading times of an Apple //e's Disk ][ drive against a PS3's.

  • St Mary's Repositorium for Tired Code (unregistered) in reply to Bored
    Bored:
    All these web programming stories are really depressing. Is there anyone left in the world that still does some real programming?

    Well, uhm... there's also the opposite.

    There is Fefe's Blog, a German blog that advocates online freedom and open source software, and argues against web censorship of any kind.

    AFAIK it is written in C++.

  • (cs)

    Come on, this is easy. Trivial, even. Take your current storray, iterate over the components, plop them into a database. Write a cute front-end to maintain them. Heck, write an adapter if you want which can translate the numbers into proper fields.
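    E.g., a throwaway C++ sketch (the field layout is invented) that walks the array and spits out one INSERT per row; pipe the output into your database's command-line client:

    #include <iostream>
    #include <string>
    #include <vector>

    // Invented stand-in for the storray: id, name, price per component.
    struct Item { int id; std::string name; std::string price; };

    int main() {
        std::vector<Item> storray = { {1, "Ryobi Drill", "49.99"},
                                      {2, "Ryobi Saw",   "89.99"} };
        // Iterate over the components and emit the SQL to load them.
        // (No quoting/escaping here - fine for a one-off migration script only.)
        for (const auto& it : storray)
            std::cout << "INSERT INTO products (id, name, price) VALUES ("
                      << it.id << ", '" << it.name << "', " << it.price << ");\n";
    }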

    The brainless programmer is another matter, I suppose.

  • Contra (unregistered) in reply to Mason Wheeler
    Mason Wheeler:
    wee:
    Java and .NET are evil, got it. C/C++ aren't evil, I presume. And you would never bother with "abstractions" like the STL, right? Straight to asm for you. There's some real programming!

    Well, I don't use the STL, I'll admit, but not because it's an abstraction layer. I don't use it because it's built upon the C++ language and its template system, both of which are massive WTFs.

    Enlighten me, please. What is the problem with the template system?

  • nobody (unregistered) in reply to Mason Wheeler
    Mason Wheeler:
    And now they're starting to teach Java and .NET in "programming" classes in colleges. It makes me shudder when I think of the future.
    Um. You've been where for the last 15 years? They're moving past Java already.
  • Fnord (unregistered) in reply to ih8u
    ih8u:
    Yeah, you sissies. If your keyboard has more than a 0 and a 1 button on it, you suck. If you're not writing for a specific CPU in machine code, you're a baby. In fact, you are the worst people ever.

    I spend all day doing what a high level language, a competent (non-real) programmer, and a decent compiler/interpreter could do in 10 minutes.

    Keyboard? You lightweight! Real programming doesn't bother with that kind of stupid crap. I bet you even use integrated circuits and a monitor, ya pansy! Real programmers do input with a toggle switch and a button, and read output as raw bits on four lightbulbs, with three of them burned out.

    Mind you this is only because management insists on using fancy buzzword technologies like "electricity." As soon as that fad dies, we'll be back to REAL real programming, with rocks and sticks and nothing else. Don't cut yourself on the sharp parts! Rock=0, stick=1, and that's the way it was and we liked it!

  • nobody (unregistered) in reply to Contra
    Contra:
    Mason Wheeler:
    wee:
    Java and .NET are evil, got it. C/C++ aren't evil, I presume. And you would never bother with "abstractions" like the STL, right? Straight to asm for you. There's some real programming!

    Well, I don't use the STL, I'll admit, but not because it's an abstraction layer. I don't use it because it's built upon the C++ language and its template system, both of which are massive WTFs.

    Enlighten me, please. What is the problem with the template system?

    That it's ugly as balls.

  • (cs) in reply to nobody
    nobody:
    Contra:
    Mason Wheeler:
    wee:
    Java and .NET are evil, got it. C/C++ aren't evil, I presume. And you would never bother with "abstractions" like the STL, right? Straight to asm for you. There's some real programming!

    Well, I don't use the STL, I'll admit, but not because it's an abstraction layer. I don't use it because it's built upon the C++ language and its template system, both of which are massive WTFs.

    Enlighten me, please. What is the problem with the template system?

    That it's ugly as balls.

    Don't "yuck" my "yum".

  • nasch (unregistered) in reply to Keith Brawner
    Keith Brawner:
    I think the main difference between "a code monkey" and "a real programmer" is the ability to understand what's really going on and use that understanding wisely to make design decisions. Even if you never have to write a single line of assembly code, if you can't understand what's going on at the assembly level, you're contributing to the problem.

    What is it you think happens when someone writes code (scripts?) without ever having programmed in assembler? Do you think the code is universally unreadable, or horribly inefficient, or buggy, or what?

  • (cs) in reply to web-programmer
    web-programmer:
    Think google, basecamp , facebook , hotmail etc would you say that the people who built these don't know how to do "real programming"?
    Facebook, yes. I think the people who built that were monkeys literally pounding on keyboards.
  • (cs) in reply to highphilosopher
    highphilosopher:
    Can one of you not so smart fellas query what the real WTF is so I'll have a chance to explain it?

    Oh wait, someone already did!!!

    I noticed that despite the fancy talk, you failed to actually explain the WTF. Which leads me to the conclusion that you are really one of those dumb fellows, but actually doing a pretty good job of looking smart. Would you happen to work in management?

  • Mr.'; Drop Database -- (unregistered) in reply to ben
    ben:
    Sanity:
    Scott Saunders:
    Isn't this how CouchDB works?
    Not even close. I started writing a post describing exactly how, but the number of differences is quite large.

    Here's a small, simple one: Couch is designed to be able to operate on large clusters, with large on-disk databases on each one. Storray is limited by what can fit into RAM on a single machine.

    Yea.... I'm pretty sure Scott was making a joke
    Scott wasn't making a joke. Scott was being a joke.

  • Glass mostly empty (unregistered) in reply to Fnord

    To be fair, this guy has met all of his design requirements.

    1. Don't learn how to use a database. Tick.
  • roflcopter (unregistered) in reply to you're-not-getting-it

    LMAO, "stupid baby languages", you just excluded most of the programmers on the planet who create real apps, solve real business problems, and make good $$ doing it - asshat, all you left out was a comment like "M$ sucks and the penguin rulzzzzz". When will you kids ever learn to stop generalizing and grow up.

    you're-not-getting-it:
    web-programmer:
    So the definition of "real programming" is something that doesn't use HTTP/HTML/JS as a front end?

    I thought one of the selling points of .net was that you could build a single set of libraries and expose them to various different front-ends, so does that mean if I build a library and create a windows forms or CLI front-end I'm doing "real programming" and if I create a web based front-end to the same thing then I'm not?

    I don't quite understand how building a front-end using web technologies (and dealing with cross browser problems , latency issues etc) is somehow magically easier than building a GUI for windows or Java Swing (both of which have graphical drag and drop editors).

    Surely the reason that there is more "web programming" is because in most cases having a web application is just more useful in this day and age, as it then becomes trivial to make the application accessible to large numbers of people across the globe using different computing platforms.

    Think google, basecamp , facebook , hotmail etc would you say that the people who built these don't know how to do "real programming"?

    "Real programming" doesn't involve stupidly bloated baby languages like VB, C#, and Java. "Real programming" involves not being an ass-tard code monkey who actually understands how to write optimized routines and can deal with the inner workings of the system without the necessity of 500000000 layers of abstraction so they don't hurt themselves on the sharp things (see what I did there).

    If you don't get the difference, you are not a real programmer.

  • trwtf (unregistered) in reply to lolwtf

    Based on what?

  • trwtf (unregistered) in reply to lolwtf
    lolwtf:
    web-programmer:
    Think google, basecamp , facebook , hotmail etc would you say that the people who built these don't know how to do "real programming"?
    Facebook, yes. I think the people who built that were monkeys literally pounding on keyboards.

    Based on?

  • (cs)

    I reckon it's this page.

    http://www.giftshopperhq.com/tools/ryobi-tools.php?p=0

  • next_ghost (unregistered)

    The biggest problem with high-level languages is evolution. If you think something is completely idiot-proof, nature will breed a bigger idiot and prove you wrong (as we can see every day on this site).

    Real programmers understand the advantages of high level programming concepts but they don't need a high level language to make use of them where appropriate. That's my definition of "real programmer".

  • 50% Opacity (unregistered) in reply to Mason Wheeler
    Mason Wheeler:
    Code that is not run by the computer, but instead is executed by some other program, is not a computer program, but a script.

    Said as if that's a bad thing. And, oh wait, the computer doesn't run the source code of your "real" programming language either, another program compiles it first. Shoot, an abstraction! Guess what, unless you throw the electric impulses at your CPU by hand you're always dealing with abstractions in programming.

    Mason Wheeler:
    But any abstraction that you can't get beneath is evil because any code can have serious bugs, including the code that comprises an abstraction layer.

    And half your job as a programmer (real or otherwise) is to understand the faults in your tools.

    Mason Wheeler:
    I think the main difference between "a code monkey" and "a real programmer" is the ability to understand what's really going on and use that understanding wisely to make design decisions. Even if you never have to write a single line of assembly code, if you can't understand what's going on at the assembly level, you're contributing to the problem.

    I agree with this point in general, but I wouldn't go as far as the assembly level. You don't need to be able to translate your code to assembly to be a good programmer, but having at least a general idea a) that there exist lower levels and b) that abstraction X works better than abstraction Y at the lower levels is important to a point.

    Mason Wheeler:
    Apple IIe … NES … boot up instantly … PS3 … high-end PC … thousand times slower...

    You're forgetting that the PS3 and any modern OS are doing a lot more stuff than the IIe or NES did, and that coordinating and writing all this stuff in a low-level language is just about impossible in any realistic time-frame. Abstractions, encapsulation, interfaces and all that stuff were invented just to be able to deal with ever more complex software. Yes, there's a performance drop while the computer does its song and dance to go through all the proper interfaces and protocols you have dreamt up instead of just flipping the bit at the other end of the memory directly, but the increase in stability and manageability for the human is immense and necessary.

    Having said that, modern computers are terribly slow! ;-)

  • Fred (unregistered) in reply to lulzfish
    lulzfish:
    Now I'm installing MySQL.

    I thought you were going to use a database?

  • gil (unregistered) in reply to Keith Brawner
    Keith Brawner:
    I agree that if you can't understand what's going on at the assembly level, you're contributing to the problem.
    Why do you stop at the assembly level? Why not require everyone to have a clear understanding of how the assembly instructions are executed by the logic gates within a CPU? Why not require everyone to understand how these logic gates operate, from the first principles of physics? (Of course, the latter would be impossible since last time I checked, physics was still an active research area.)

    I assume that since you stopped at the assembly level, you do understand that some abstractions are neither necessary nor worth exposing. So isn't it reasonable to assume that for some problems, the abstraction of assembly is just as unnecessary? As an example of such a problem, think about something like an online shopping cart application. How would knowing assembly be useful for implementing that?

  • The PHB (unregistered) in reply to Alekz
    Alekz:
    highphilosopher:
    Can one of you not so smart fellas query what the real WTF is so I'll have a chance to explain it?

    Oh wait, someone already did!!!

    I noticed that despite the fancy talk, you failed to actually explain the WTF. Which leads me to the conclusion that you are really one of those dumb fellows, but actually doing a pretty good job of looking smart. Would you happen to work in management?

    Look again...

    He didn't say he wanted to explain it, merely that he wanted the chance to explain it.

    Definitely in management.

  • Sylver (unregistered) in reply to Mason Wheeler
    Mason Wheeler:
    It's quite simple, really. A computer program is code that's run by a computer, and programming is writing such code.

    Code that is not run by the computer, but instead is executed by some other program, is not a computer program, but a script. The fact that these days we have scripting languages that can accomplish many of the tasks that were originally the sole purview of real programs (Java, .NET, etc) does not change the basic definitions....

    Program doesn't quite mean what you think it does. The word comes from the Greek pro- (before) and graphein (to write), so its core meaning is "written in advance". As regards computers, programming is simply the act of writing in advance the tasks that the computer must execute. How this is accomplished, whether through an interpreter or in asm, is irrelevant.

    And, I don't know about Java, but in .NET, you can easily fall back to unmanaged code if needed.

  • (cs) in reply to 50% Opacity
    50% Opacity:
    Mason Wheeler:
    Apple IIe … NES … boot up instantly … PS3 … high-end PC … thousand times slower...

    You're forgetting that the PS3 and any modern OS are doing a lot more stuff than the IIe or NES did, and that coordinating and writing all this stuff in a low-level language is just about impossible in any realistic time-frame.

    I don't know enough about what's going on in the NES or PS3 to comment on those.

    However, with old PCs versus new PCs, a big part of the bootup problem is that the bootup sequence code runs mostly in series, when much of the code has to wait for relatively slow device responses; running the initializations in parallel could make the process a lot faster.  Of course, the reason they don't do that is that messing up those initializations could cause significant issues.
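    As a toy illustration of the series-versus-parallel point (made-up device probes standing in for the real initializations):

    #include <chrono>
    #include <future>
    #include <thread>
    #include <vector>

    // Stand-ins for slow, independent device initializations.
    void probeDisk()    { std::this_thread::sleep_for(std::chrono::seconds(2)); }
    void probeNetwork() { std::this_thread::sleep_for(std::chrono::seconds(2)); }
    void probeUsb()     { std::this_thread::sleep_for(std::chrono::seconds(2)); }

    int main() {
        std::vector<std::future<void>> tasks;
        tasks.push_back(std::async(std::launch::async, probeDisk));
        tasks.push_back(std::async(std::launch::async, probeNetwork));
        tasks.push_back(std::async(std::launch::async, probeUsb));
        for (auto& t : tasks) t.get(); // ~2s total instead of ~6s in series
    }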

    Further, when you're talking about Intel-compatible systems, they not only run in series, but they also use the oldest, slowest mode available.

    And don't get me started on what's going on for MacOS X - OMFG. (Actually, that could be an abstraction issue - it could be I only think it isn't because I understand at a low level how it's messing up.)

    It's really not Mason's best argument - the biggest issue isn't the encapsulation at all.

    50% Opacity:
    Mason Wheeler:
    Code that is not run by the computer, but instead is executed by some other program, is not a computer program, but a script.

    Said as if that's a bad thing. And, oh wait, the computer doesn't run the source code of your "real" programming language either, another program compiles it first. Shoot, an abstraction! Guess what, unless you throw the electric impulses at your CPU by hand you're always dealing with abstractions in programming.

    I believe Mason's point was that the resulting compiled program runs directly, through an abstraction layer. If one is so inclined, one can muck with the binary directly to effect some small changes (I've done it on rare occasion - but it's generally far easier to make changes to the source and recompile. Failing that, it's usually easier to disassemble, tweak, and reassemble - but you'll point out that's using an abstraction layer again. And you'll also point out my hex editor's also an abstraction, as it shows the code in hexadecimal format, rather than electrical impulses.)

    Of course, that only really matters when the performance is critical - and most of the time, it isn't.

    50% Opacity:
    Mason Wheeler:
    But any abstraction that you can't get beneath is evil because any code can have serious bugs, including the code that comprises an abstraction layer.

    And half your job as a programmer (real or otherwise) is to understand the faults in your tools.

    Yes, but as any experienced programmer on ancient MacOS (pre X) knows, if your abstraction layer doesn't allow you to work around it, there are things you just cannot do. Understand the fault all you want - the application will still bomb when certain conditions arise, and the only thing the application developer can do is disable the functionality near that bomb. (Yes, many bombs were things which could be fixed. Not all of them were.)

    (Disclaimer: I fled the MacOS programming scene when I went to college, over this issue. I've heard it remained up until MacOS 9 (and has some tendrils in 10, but fewer people run into them), but my last real exposure was around MacOS 3 or 4. Which I've mostly repressed.)

    50% Opacity:
    Mason Wheeler:
    I think the main difference between "a code monkey" and "a real programmer" is the ability to understand what's really going on and use that understanding wisely to make design decisions. Even if you never have to write a single line of assembly code, if you can't understand what's going on at the assembly level, you're contributing to the problem.

    I agree with this point in general, but I wouldn't go as far as the assembly level. You don't need to be able to translate your code to assembly to be a good programmer, but having at least a general idea a) that there exist lower levels and b) that abstraction X works better than abstraction Y at the lower levels is important to a point.

    Agreed, to a point. One also needs some concept of how things are implemented. For example, knowing how associative arrays actually work - beyond the simple facts that they run a lot slower than standard arrays but handle non-integer indices and sparse data - can be useful in optimizing code that uses them. It also helps you decide when to use an associative array versus a traditional array versus a multi-level array assortment.
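    A quick C++ illustration of that trade-off:

    #include <map>
    #include <string>
    #include <vector>

    int main() {
        // Dense integer keys: a plain array is contiguous and cache-friendly.
        std::vector<int> dense(1000, 0);
        dense[42] = 7;                      // O(1), no hashing, no tree walk

        // Non-integer keys: an associative container earns its overhead.
        std::map<std::string, int> byName;  // balanced tree, O(log n) lookups
        byName["storray"] = 1;

        // Sparse keys: only the indices actually used are ever allocated.
        std::map<long, int> sparse;
        sparse[1000000000L] = 7;            // no billion-element array needed
    }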

    Further, knowing quite a bit about the way things work at a lower level is insanely useful if you want/need to do debugging with strace/truss/dtrace or similar programs. I've debugged about a dozen different programs that were running into problems with unchecked errors, and thus behaving erratically. Things like assuming that mkdir(2) always works, sleep(2) always waits exactly the requested number of seconds, and execve(2) never returns. (Yeah, I know - the last is a very common misconception, some compilers even have it. But the POSIX definition for execve(2) clearly lists a number of errors for it, and indicates it will return -1 if it does return.)
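    For the record, checking those three calls costs almost nothing. A C++/POSIX sketch (the path is made up):

    #include <cerrno>
    #include <cstdio>
    #include <sys/stat.h>
    #include <unistd.h>

    int main() {
        // mkdir() can fail (EEXIST, EACCES, ENOSPC, ...) - check it.
        if (mkdir("/tmp/example", 0755) == -1 && errno != EEXIST)
            std::perror("mkdir");

        // sleep() returns the unslept remainder if a signal interrupts it.
        for (unsigned left = 5; left != 0; )
            left = sleep(left);

        // execve() only comes back on failure, and then it returns -1.
        char* const argv[] = { const_cast<char*>("/bin/true"), nullptr };
        char* const envp[] = { nullptr };
        if (execve("/bin/true", argv, envp) == -1)
            std::perror("execve"); // e.g. ENOENT, EACCES, E2BIG
        return 1; // reached only if the exec failed
    }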

    Also, it can be phenomenally useful to be able to code inner loops in lower level languages when the situation calls for it.

  • (cs) in reply to trwtf
    trwtf:
    lolwtf:
    web-programmer:
    Think google, basecamp , facebook , hotmail etc would you say that the people who built these don't know how to do "real programming"?
    Facebook, yes. I think the people who built that were monkeys literally pounding on keyboards.

    Based on?

    I can't speak for lolwtf, but for me it's based on using that pile of crap.

  • 50% Opacity (unregistered) in reply to tgape
    tgape:
    I believe Mason's point was that the resulting compiled program runs directly, through an abstraction layer.

    Erm… Maybe you wanted to say one abstraction layer (instead of several)?

    tgape:
    Of course, that only really matters when the performance is critical - and most of the time, it isn't.

    Exactly. Programmer hours are often worth more than CPU time.

    tgape:
    50% Opacity:
    Mason Wheeler:
    But any abstraction that you can't get beneath is evil because any code can have serious bugs, including the code that comprises an abstraction layer.

    And half your job as a programmer (real or otherwise) is to understand the faults in your tools.

    <snip>...if your abstraction layer doesn't allow you to work around it, there are things you just cannot do. Understand the fault all you want - the application will still bomb when certain conditions arise, and the only thing the application developer can do is disable the functionality near that bomb.

    I completely agree. The thing is, these days it's mostly a non-issue, really. I can't recall the last time I have found a bug in a compiler or interpreter, if it was reasonably mature. If you're choosing an immature platform to do your coding on you're expecting bugs in the compiler/interpreter, but who really does that these days?

    tgape:
    Agreed, to a point. One also needs to have some concept for how things are implemented.

    Sure, the more you know, the better. Getting the general idea of how things work at the assembly level, even if you couldn't assemble your way out of a wet paper bag, is usually enough, though. As you said yourself, often you don't bother actually descending to that level, because it's a lot less hassle to just change something in the abstraction layer.

    If you're a real-time embedded systems guy, you absolutely need to know assembler. Heck, you should know your way around with an oscilloscope. If you're a GUI-app developer on any reasonably recent OS, you mostly don't, it's a lot more important to know your libraries in and out. If you're a Java or .NET or PHP or whatever guy who's mostly coding business logic with a bit of output, what makes you a good programmer is rigorous discipline and clean structuring ability (+ knowing your libraries), since an internally consistent and (business-)logically bug free app is much more important than a wasted CPU cycle or two.

  • Lee K-T (unregistered) in reply to bethatway
    bethatway:
    The main difference between web programmers and so-called "real programmers" is that web programmers will still have a job in five years.

    Yeah, I've heard this one already... ten years ago...

    The real main difference between web programmers, desktop programmers and real programmers is that real programmers don't waste time on religious wars and don't really give a .... whether they have to code for web or desktop; they just learn and code.

    P.S. By the way, don't your genuine full web apps use servers? Oh, maybe servers are written in HTML, how would I know...

  • Watership Downs (unregistered)

    When I was a boy, our Computer Science department told us that the school did not need a time-sharing system. 24-hour turn-around (cards and printers) was good enough for their students.

    I see that MS has decided that ASP.NET does not need to do dynamic scripts.

  • Thomas (unregistered)

    To add to what other people are saying about the original post, the biggest oversight is data integrity. There is nothing to stop a developer from putting a string where a date or number ought to go. How do you define allowed ranges on dates or numbers? This is to say nothing of enforcing uniqueness and referential integrity. To call this an epic fail is an understatement.
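    To make that concrete, a small C++ sketch (field names invented): when every field is just a string in an array, nothing stops the nonsense; even a typed record catches the grossest of it, and a real database adds the uniqueness and referential-integrity checks on top, declaratively.

    #include <stdexcept>
    #include <string>
    #include <vector>

    // The storray way: every field is a string, so anything "fits".
    std::vector<std::vector<std::string>> rows = {
        { "1001", "Widget", "19.99" },
        { "1002", "Gadget", "next Tuesday" }  // garbage, silently accepted
    };

    // A typed record (fields invented) rejects the worst of it up front.
    struct Product {
        int         id;
        std::string name;
        double      price;
        Product(int i, std::string n, double p)
            : id(i), name(std::move(n)), price(p) {
            if (price < 0.0)
                throw std::invalid_argument("price must be non-negative");
        }
    };
    // Product p(1002, "Gadget", "next Tuesday"); // does not even compile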

    ObiWayneKenobi:
    Classic ASP is a wretched, wretched beast that should die a thousand deaths and be condemned to rot in the ninth level of Hell for all eternity.

    FTW!

  • Lee K-T (unregistered) in reply to Watership Downs
    Watership Downs:
    I see that MS has decided that ASP.NET does not need to do dynamic scripts.

    All the world will be your enemy, ASP programmer. And when they catch you, they will kill you. But first they must catch you. Be cunning, and full of tricks, and your softwares will never be destroyed!

  • Nakke (unregistered)

    Now what the hell was he thinking? Rewriting that code with .NET could have provided him with hundreds of hours of work and thousands of dollars in cash.

    Standing up and walking away to the nearest bar was the real WTF.

  • Burpy (unregistered) in reply to Nakke
    Nakke:
    Now what the hell was he thinking? Rewriting that code with .NET could have provided him with hundreds of hours of work and thousands of dollars in cash.

    Standing up and walking away to the nearest bar was the real WTF.

    Except if it's a progress bar...

  • (cs) in reply to Buddy
    Buddy:
    SetPriorityClass(GetCurrentProcess(), BELOW_NORMAL_PRIORITY_CLASS);

    or like this in multi-threaded apps:

    SetThreadPriority(GetCurrentThread(), THREAD_PRIORITY_BELOW_NORMAL);

    This kind of thing you'll never see in a .NET or Java application. Just not part of the mindset.

    Since I have used the below code numerous times in .NET apps, I think you should shut the fuck up about things you obviously know nothing about.

    Process.GetCurrentProcess().PriorityClass = ProcessPriorityClass.BelowNormal;

    Thread.CurrentThread.Priority = ThreadPriority.BelowNormal;

    Addendum (2009-12-31 06:21): Also, you fail at trolling.

  • (cs) in reply to ObiWayneKenobi
    ObiWayneKenobi:
    Classic ASP is a wretched, wretched beast that should die a thousand deaths and be condemned to rot in the ninth level of Hell for all eternity. ...

    Even PHP is better, since while PHP can be turned into complete shit by novices, hacks and wannabes, it's at least really object oriented and you could at least use real software principles on PHP projects.

    The "brain" was obviously an intelligent person...

    Agreed, classic ASP makes me break out in cold sweats whenever I see it. VBscript... ugh.

    You cannot call PHP "object-oriented" when it doesn't even have a built-in "string" object. It can be made OO, but it really isn't out-of-the-box.

    No, the "brain" was lazy and stupid. It doesn't take much smarts to implement a storage system in that way, in fact it's far easier (and requires far less effort = lazy) to do so than to use a database. Also, you have to be stupid to not be able to recognize that such a system might be simple to write, but a nightmare to contain.

  • (cs) in reply to Fred
    Fred:
    lulzfish:
    Now I'm installing MySQL.

    I thought you were going to use a database?

    You win over nine thousand internets for that comment.
