• (cs)

    The REAL WTF here is how he prefixed all of the bitwise operators with a lowercase b. lol....
    until (my BAND makesit)
    {
        repeat
        {
              practice();
        } until (good(NOT bad))
    }

    so confusing lol...

  • (cs) in reply to graywh
    graywh:
    Rank Amateur:

    #define ADD(x,y) (x) + (y)
    #define SUBTRACT(x,y) (x) - (y)
    #define MULTIPLY(x,y) (x) * (y)
    #define DIVIDE(x,y) (x) / (y)
    --Rank



    Don't you mean

    #define (x) PLUS (y)   (x) + (y)
    #define (x) TIMES (y)   (x) * (y)
    #define (x) MINUS (y)   (x) - (y)
    #define (x) DIVIDEDBY (y)   (x) / (y)

    I might, but my language happens to have a LISP.

    --Rank

  • (cs) in reply to The Bears
    Anonymous:

    ... go wash your brain out with vodka.

    I have found my new sig. :D

  • (cs) in reply to GoatCheez
    GoatCheez:
    The REAL WTF here is how he prefixed all of the bitwise operators with a lowercase b. lol....
    until (my BAND makesit)
    {
        repeat
        {
              practice();
        } until (good(NOT bad))
    }

    so confusing lol...


    Ultra confusing that I'm using a bitwise operator in a boolean expression... just wanted to make it look like English ;-)
  • (cs) in reply to Chris

    Oh, I just love the do_nothing macro... How quaint!


    REPEAT
       BEGIN
          do_nothing
       END
    FOREVER

  • ChiefEngineer (unregistered) in reply to ZeoS
    ZeoS:
    The real WTF is that there are people still programming in C


    The real WTF is that there are still ¿people? who think that not programming in C makes them smart...
  • Fregas (unregistered)

    I especially like these:

    #define forever   while(1)
    #define untilbreak forever
    #define untilreturn forever

    Nothing like adding your own inconsistencies.

  • Scott (unregistered) in reply to Arancaytar
    Arancaytar:
    > # define do_nothing
    Priceless.


    Actually this is encouraged in the great book Code Complete by Steve McConnell.

    if (someTestPasses())
    {
        // do some stuff
    }
    else
    {
        do_nothing(); // this is cleaner than the alternative ";"
    }

    Steve encourages this style of code because it implies that the programmer knows that nothing is to be done if the test fails.

    I really don't think this post is a WTF. A little extreme, but definitely not a WTF.

  • (cs)

    #ifndef BRILLANT

    #define BRILLANT

  • ChiefEngineer (unregistered)

    #define IF      if (
    #define THEN    )
    #define BEGIN   {
    #define END     }
    #define ELSE    else

  • ismaelj (unregistered) in reply to ZeoS

    Correction:

    ZeoS:
    #define COMMENT //

    #define REM //

  • Mike K. (unregistered) in reply to Sam
    Anonymous:

    Oh, it's just personal preference.  He didn't like the syntax, so he changed it.  Big deal.


    Well, it's obvious you don't maintain code in the real world, or else you'd understand why this sort of thing is one of the ultimate WTFs.

    Let's assume that the original code was around for 22 years, and ALL of the original programmers have moved on and/or died. And there's no documentation of the system.

    Now, one day, you inherit this code. And your boss asks you to make some significant changes, and they need to be done yesterday. Not only do you have to learn all the programming idiosyncrasies of all the programmers who came before you and modified this code, but you have to LEARN A NEW LANGUAGE!

  • ben (unregistered) in reply to ZeoS

    hurrfffdurrrr

  • Suidae (unregistered) in reply to ChiefEngineer

    For a little flavor, how about:

    #define BREAKERONENINE     /*

    #define OVER        */


  • MatzeLoCal (unregistered) in reply to wintermyute

    At first glance it looked to me like this had been done by one of those old "who-needs-all-that-new-stuff-all-I-need-is-a-5250-terminal" RPG programmers.

  • (cs) in reply to Sam
    Anonymous:

    Oh, it's just personal preference.  He didn't like the syntax, so he changed it.  Big deal.

    This reminds me of a situation we had here at work.  We have this horrific web application that we have to use to deploy and launch applications.  It's so handy that we have this team of "script" developers that have to hand code something every time you add a new application.  We had them add access to a set of spreadsheets the business developed and then dumped on us to support.  We also asked them to add a link for quality.  The menu routine they wrote for quality was completely different than it was for production.  We asked them about this.  Their answer, "Different developer, different style."

    We still laugh about that.  You wouldn't want to let 1) saving time by reusing existing code, 2) being consistent between quality and production, or 3) providing the user with a consistent experience get in the way of "style."

  • PiV (unregistered) in reply to Mike K.
    Anonymous:
    Anonymous:

    Oh, it's just personal preference.  He didn't like the syntax, so he changed it.  Big deal.


    Well, it's obvious you don't maintain code in the real world, or else you'd understand why this sort of thing is one of the ultimate WTFs.

    Let's assume that the original code was around for 22 years, and ALL of the original programmers have moved on and/or died. And there's no documentation of the system.

    Now, one day, you inherit this code. And your boss asks you to make some significant changes, and they need to be done yesterday. Not only do you have to learn all the programming idiosyncrasies of all the programmers who came before you and modified this code, but you have to LEARN A NEW LANGUAGE!



    Don't be so melodramatic. I program in Python, Ruby, Java, C# and ObjC. Those are the languages I say I "know". However, I could sit down, right now, and fix a bug in a VB application if you gave me a couple of minutes to get comfortable.

    That being said, if someone who claims to know C sits down at this system, these syntax changes are nothing more than a nuisance. If you can't get beyond this, you don't deserve to be called a programmer.

    Sure, he didn't have to, but this is not a WTF...or at least not a major one.

  • (cs)

    I love how the page comment looks like an Apple ad.  Instead of "iPod: 1000 songs, in your pocket," it's:


    /*
     * BETTER_C.H   
     * Language refinements for C. 
     */


    It gives it that unmistakable air of "I'm so cool I don't have to explain very much."
  • (cs)

    Years ago (about 15), I tweaked the source for the 'C' preprocessor to leave #includes and comments alone and just translate #defines, so I could make sense of some code that abused the preprocessor. I never thought it might be useful again, and the source is gone. Adam might consider it as an option.

  • (cs) in reply to RandomEngy

    This is the easiest WTF to fix:

    /*
     * BETTER_C.H   
     * Language refinements for C. 
     */
    #ifndef BETTER_C_H
    #define BETTER_C_H
    

    /* logical/comparison operators */
    #define NE !=
    #define EQ ==
    #define GT >
    #define LT <
    #define GE >=
    #define LE <=
    #define AND &&
    #define OR ||
    #define NOT !

    FIX:

    /* original C */
    /* logical/comparison operators */
    #define != NE  
    #define == EQ  
    #define > GT  
    #define < LT  
    #define >= GE  
    #define <= LE  
    #define && AND 
    #define || OR  
    #define ! NOT 
    ... And so on
    
    BANG! The WTF is FIXED!
     
    with another wtf...
    but fixed 8-)

  • Randal L. Schwartz (unregistered)

    Oddly enough, the V7 Bourne Shell (the One True Shell) source was coded just like this. Steve Bourne was apparently more familiar with Algol than with C, so he wrote up a series of #defines to make C more Algol-like. Quite an amusing read.
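
    For anyone who hasn't seen it, the defines in the shell's mac.h went roughly like this (quoted from memory, so the exact list and spacing may differ; the real file has quite a few more):

    #define IF      if(
    #define THEN    ){
    #define ELSE    } else {
    #define ELIF    } else if (
    #define FI      ;}
    #define WHILE   while(
    #define DO      ){
    #define OD      ;}
    #define LOOP    for(;;){
    #define POOL    }

    So the shell source really does read IF ... THEN ... ELSE ... FI, just like ALGOL 68.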

  • Fast B (unregistered) in reply to PiV
    PiV:

    That being said, if someone who claims to know C sits down at this system, these syntax changes are nothing more than a nuisance. If you can't get beyond this, you don't deserve to be called a programmer.


    You have to extend that to the guy responsible for this mess. If he's a "real" programmer, why does he have to waste time concocting a set of wrappers for C constructs? If he has to work in C, what's wrong with just learning C? If this guy is for some reason incapable of using a language without resorting to these sorts of measures, I say he doesn't deserve to be called a programmer.
  • (cs)

    nice!!!

    I'm sure these ones are on the list too

    #define begin {

    #define end }

  • mcguire (unregistered) in reply to Scott
    Anonymous:

    Steve ... knows ... nothing....


    Too true, too true.

  • Richard C Haven (unregistered) in reply to bramster

    #define begin {
    #define end }

  • Steighton Haley (unregistered) in reply to Jeff S

    For those of you (Windoze users) who are utterly confused: this syntax is similar to Bourne shell scripting.

    Clearly some UNIX nerd was trying to make his C programs look like his shell scripts.

    Not uncommon in the early days of C.

  • (cs) in reply to Rick
    Rick:
    Years ago (about 15), I tweaked the source for the 'C' preprocessor to leave #includes and comments alone and just translate #defines, so I could make sense of some code that abused the preprocessor. I never thought it might be useful again, and the source is gone. Adam might consider it as an option.

    The folks over at the IOCCC can do this; perhaps they should be asked?

  • (cs)
    Alex Papadimoulis:
    Now that this well-kept "Better C" secret is in the public, I suspect there will be nothing short of a complete revolution in the world of C programming.

    A complete revolution?  360 degrees?

    I played around a bit with stuff like this myself when I was younger.  It can be amusing briefly, but it certainly does not belong in a production system.

    Sincerely,

    Gene Wirchenko

  • Loopy (unregistered)

    This is definitely a WTF, but when I look at some of the operator replacements I think of operator overloading.  Seriously, that's almost a WTF feature in itself.

    Oh, now I get the CAPTCHA everybody's talking about.

  • (cs) in reply to Maurits

    Maurits:

    excerpt of cruise_missile_guidance_system.c...

    pos p = starting_position(), p_old, target = acquire_target();

    do {
        p_old = p;
        p = gps_query();
        trajectory t = vector(p_old, p);
        correction c = calculate_course(target, p, t);
        adjust_course(c);
    } forever;

    But your cruise missile doesn't need to run forever. You *obviously* need a check at the end to see if the missile has exploded, so you can exit the subroutine cleanly.

    :D

  • Derek (unregistered) in reply to Scott
    Anonymous:

    Actually this is encouraged in the great book Code Complete by Steve McConnell.

    if (someTestPasses())
    {
        // do some stuff
    }
    else
    {
        do_nothing(); // this is cleaner than the alternative ";"
    }

    Steve encourages this style of code because it implies that the programmer knows that nothing is to be done if the test fails.

    I really don't think this post is a WTF. A little extreme, but definitely not a WTF.


    This:

    else
    {
        // do nothing
    }

    is just as clear, without adding an unnecessary function call (and definition).
  • (cs) in reply to PiV

    I program in Python, Ruby, Java, C# and ObjC. Those are the languages I say I "know". However, I could sit down, right now, and fix a bug in a VB application if you gave me a couple of minutes to get comfortable.

    I don't doubt that this is true, but there's more at stake here.  When you sit down to a Python app, you expect Python.  That means you start thinking in the Python syntax.  Same with Ruby, C#, ObjC, C++ and so on.  You don't want to sit down and get something completely different.

    Aside from that, you also have to work within the coding standards of the team - this clearly didn't do that.

  • jart (unregistered)

    Just thank your lucky stars nobody told him how to use trigraphs

    http://en.wikipedia.org/wiki/C_trigraph
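
    For anyone who hasn't had the pleasure, here's a small sketch of what that would look like. The ??-sequences below are the standard trigraph spellings for #, {, }, [ and ]; note that gcc ignores them unless you pass -trigraphs (or compile in a strict -std mode):

    ??=include <stdio.h>                /* ??= stands in for #     */

    int main(void)
    ??<                                 /* ??< and ??> are { and } */
        int a??(3??) = ??<1, 2, 3??>;   /* ??( and ??) are [ and ] */
        printf("%d\n", a??(0??));
        return 0;
    ??>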

  • PugMajere (unregistered) in reply to Arancaytar
    Arancaytar:
    > # define do_nothing
    Priceless.


    The best part of that one is that it can actively cause problems.

    It *should* be:

    #define do_nothing do {} while (0)
    (Note the lack of a trailing ;, so you can do:
    if (something)
         do_nothing;
    ...)
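
    For what it's worth, here's the idiom in context (my own sketch; sensor_ok() and raise_alarm() are made-up names). The do/while(0) wrapper is the usual trick for making a macro body behave like exactly one statement wherever a statement is expected:

    #define do_nothing do { } while (0)

    if (sensor_ok())
        do_nothing;      /* expands to a single, well-behaved statement */
    else
        raise_alarm();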

  • Michael Rutherfurd (unregistered) in reply to Manni

    No, it is worse than that. I think he should be sent driving with Teddy Kennedy instead...

  • Silex (unregistered) in reply to makomk

    Years ago (about 15), I tweaked the source for the 'C' preprocessor to leave #includes and comments alone and just translate #defines, so I could make sense of some code that abused the preprocessor. I never thought it might be useful again, and the source is gone. Adam might consider it as an option.

    Today, you just use the correct compiler option for that. Both gcc & vc++ have one for it.
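
    For the record, gcc spells that option -E (Visual C++ has /E and /P), and the effect on code written against BETTER_C.H looks roughly like this (tick, count and limit are made-up names):

    /* input, written with the BETTER_C.H macros */
    repeat {
        tick();
    } until (count GE limit);

    /* roughly what comes back out once the #defines are expanded */
    do {
        tick();
    } while(! (count >= limit));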

  • (cs) in reply to Otto
    Otto:

    Maurits:

    excerpt of cruise_missile_guidance_system.c...
    pos p = starting_position(), p_old, target = acquire_target();
    do {
        p_old = p;
        p = gps_query();
        trajectory t = vector(p_old, p);
        correction c = calculate_course(target, p, t);
        adjust_course(c);
    } forever;

    But your cruise missile doesn't need to run forever. You *obviously* need a check at the end to see if the missile has exploded, so you can exit the subroutine cleanly.


    At first I thought that we could do it by handling an exception.  Then I realized -- Naw.  Just leave it as an unhandled exception and let it blow up.
  • Dan (unregistered)

    Skinnable programming languages - it's the future.

    Don't like your current language? Don't want to go to the hassle of writing your own language and compiler? Just pick a language that's close enough and skin it.

  • pinguis (unregistered) in reply to Angstrom
    Angstrom:
    Reminds me of Bournegol.

    http://www.goof.com/pcg/marc/bournegol.html

    This explains SO MUCH about bash syntax...

  • (cs) in reply to Mike K.
    Anonymous:
    Anonymous:

    Oh, it's just personal preference.  He didn't like the syntax, so he changed it.  Big deal.


    Well, it's obvious you don't maintain code in the real world, or else you'd understand why this sort of thing is one of the ultimate WTFs.

    Let's assume that the original code was around for 22 years, and ALL of the original programmers have moved on and/or died. And there's no documentation of the system.

    Now, one day, you inherit this code. And your boss asks you to make some significant changes, and they need to be done yesterday. Not only do you have to learn all the programming idiosyncrasies of all the programmers who came before you and modified this code, but you have to LEARN A NEW LANGUAGE!



    Or just run the preprocessor on the code, expanding only the #defines?

  • (cs) in reply to ZeoS
    ZeoS:
    The real WTF is that there are people still programming in C


    Yeah, it's funny how new kernels keep getting built... ;)
  • (cs) in reply to marvin_rabbit
    marvin_rabbit:
    Otto:
    But your cruise missile doesn't need to run forever. You *obviously* need a check at the end to see if the missile has exploded, so you can exit the subroutine cleanly.


    At first I thought that we could do it by handling an exception.  Then I realized -- Naw.  Just leave it as an unhandled exception and let it blow up.


    That is fine as long as your code blows.  If it sucks, it might cause a missile malfunction.

    My code is hot, unless it is cool.

    Sincerely,

    Gene Wirchenko

  • (cs) in reply to bramster
    Anonymous:
    I never did like those pesky equal signs anyway.

    #define CURLY_OPEN_BRACE {




    ??(
    ??[

    Anyone remember these?
  • Sam LG (unregistered)

    Okay, it's mostly annoying, but there's a minor case to be made for the EQ operator -- namely, that it's easy to leave off an '=' by mistake.

    Of course, a better solution to this is, whenever possible, to put one's variables on the right side of the == operator, but still, I can almost forgive that one.
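
    A quick illustration of that constant-on-the-left habit (a hypothetical snippet, not from the original post):

    if (x = 5)  { /* ... */ }    /* typo: assigns 5 to x, always true      */
    if (x == 5) { /* ... */ }    /* what was meant                         */
    if (5 == x) { /* ... */ }    /* same meaning, reads a little backwards */
    if (5 = x)  { /* ... */ }    /* the same typo now refuses to compile   */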

  • (cs) in reply to RanmaChan

    Anonymous:
    Hey, I want to see the whole file!  That is only part of a Better C!  Surely there must be something better than switch statements and structures!

     

    Shhhhhhhhhhhhh that is SUPER secret. If Alex told you, he would have to kill you.

  • ev (unregistered)

    Hehehe, funny NOT NOT NOT
    No, really NOT NOT then NOT then NOT

    Some then people do_nothing then do_nothing seem do_nothing to really then want then then then to go back do_nothing to then do_nothing VB.. then Wonder do_nothing why NOT

  • (cs) in reply to Dan

    Those of us commenting here might well agree that "C" is perfect as it is (and would never argue vociferously about pre-increment, post-increment, or explicit add as being the only "right" way to go about something).

    That's why, I'm sure, a quick Google of the phrase "better c" turns up a mere 224,000,000 hits.

  • (cs) in reply to Arancaytar

    Arancaytar:
    > # define do_nothing
    Priceless.

     

    The concept isn't priceless, but the implementation surely is.

    I believe they are trying to achieve:

    if (a GT b) then

       do_nothing

    else

       blah blah blah

    endif // which I assume is in the snipped version

    Of course this won't compile because they didn't define do_nothing properly.

  • (cs)
    Alex Papadimoulis:

    /* control constructs */
    #define repeat    do
    #define until(p)  while(NOT (p))
    #define forever   while(1)
    #define untilbreak  forever
    #define untilreturn forever
    #define unless(p) if(NOT (p))
    #define ifnot(p)  if(NOT (p))
    #define do_nothing
    #define then

    I actually like the forever, untilbreak and untilreturn keywords. They're very descriptive about the purpose of the loop.

    do_nothing could be useful if you want to make explicit that nothing should be happening in a certain case... though a comment would usually be more appropriate.

    The rest is a bit weird, but I've recently done something pretty similar, extending the language with macros. Annoyed by the lack of try...finally in C++, I've defined:

    #define finally(code) catch(...) { code; throw; } code;

    So now I can do:

    Resource r = allocateResource();
    try {
      do_stuff(r);
    }
    finally(freeResource(r));

    Which works just fine as long as you make sure that you don't return in the guarded block.

  • (cs) in reply to PiV
    Anonymous:
    This is not a WTF, this is just someone using a syntactical-sugar type of approach.

    You mean syntactical arsenic.

    Anonymous:
    No big deal. Who really cares?

    Ask the person who has to maintain this.

    Why do you think languages have standardized syntax? Wait, you're right! It's boring seeing the same keywords, constructs and idioms in every program. Let's mix it up a little! Instead of merely stating that I know C on my résumé, I can have pages listing all the flavors of C I can code in.
