• ChiefEngineer (unregistered) in reply to marvin_rabbit
    marvin_rabbit:
    Otto:

    Maurits:

    excerpt of cruise_missile_guidance_system.c...
    pos p = starting_position(), p_old, target = acquire_target();
    do {
        p_old = p;
        p = gps_query();
        trajectory t = vector(p_old, p);
        correction c = calculate_course(target, p, t);
        adjust_course(c);
    } forever;

    But your cruise missile doesn't need to run forever. You *obviously* need a check at the end to see if the missile has exploded, so you can exit the subroutine cleanly.


    At first I thought that we could do it by handling an exception.  Then I realized -- Naw.  Just leave it as an unhandled exception and let it blow up.


    You are probably running that control logic on a secondary thread, and that thread forces the whole process to exit when it decides it's time to blow-up...

  • John Hensley (unregistered) in reply to akrotkov
    akrotkov:

    Or just run the preprocessor on the code only changing the #defines?

    Anyone who has been in the industry significantly long knows the power of code inertia. If code is fubar, the maintainers are more likely to endure it than take any radical steps to fix it, mainly because of QA and time to market concerns. Today I have no trouble reading misindented C code because for four years that's what I maintained.

    The professional thing to do when you need to use an unfamiliar language is to learn it. Someone who needs these macros to use C effectively doesn't belong in a C programming job. It's that simple.

  • (cs) in reply to uncool
    Anonymous:

    Alex Papadimoulis:
    #define forever   while(1)
    #define untilbreak forever
    #define untilreturn forever

     

    why didn't i think of that...

    why stop there?

    #define forever   while(1)

    #define untilbreak forever

    #define untilreturn untilbreak

  • ChiefEngineer (unregistered) in reply to Anonymous
    Anonymous:

    And perhaps the do_nothing also adds some readability?  Not sure?
    That is, if you defined it by { and } (which still does nothing)
    #define do_nothing { }

    if ( x == y ) do_nothing
    else {
      printf( "X and Y are not equal!\n" );
    }

    Yes, yes... I guess the version below says it clearly enough as it is.

    if ( x == y ) { }
    else {
        printf( "X and Y are not equal!\n" );
    }


    You can't be serious on this...
    If you were working for me I would have fired you at once.
    In such a case you should code:

    if ( x != y )
        printf( "X and Y are not equal!\n" );

    which actually relates to the condition you are looking for...
  • queus (unregistered) in reply to diaphanein
    Anonymous:

    That some people don't think this is that big of a deal frightens me.  As a developer that is solely responsible for approximately 500k lines of C++ code, most of which I did not originally write, allow me to state that encountering something like this would bring out murderous tendencies.



    Well, just suppose that this is not 500K lines of C++ code... and I guess it is not. From CLISP's CodingStyle:

    START_OF_QUOTE
    Most of the C code of CLISP is stored in files with extension '.d'. It _is_
    C code, but the file will be preprocessed a bit.

    A comment until end of line is introduced by '# '. Other '#' signs, not
    followed by a space, have the usual C meaning. Please avoid adding new
    comments in this style, use the normal /* C comments */.

    Preprocessor commands can start at any column, not necessarily at the first
    column of a line.

    These 'var' symbols that you encounter in every function introduce variable
    declarations. We use one variable per declaration; the C syntax rules make
    declarations of multiple variables on the same line hard to read and hard
    to edit. 'var' is a preprocessor symbol which expands to nothing. Writing
    'var' not only makes it easier for a human reader to understand the code; it also allows
    you to mix declarations and statements freely, without the need for
    additional braces.

    We have a few macros defined in lispbibl.d: 'elif' means 'else if',
    'until' means 'while not' (both are now deprecated).
    'loop' introduces a loop which can be exited only via 'break'; it's
    equivalent to 'for(;;)'. It is deprecated as well. Use 'for (;;)',
    not 'while (1)', instead.
    END_OF_QUOTE
  • Jonn (unregistered) in reply to RiX0R
    RiX0R:
    <FONT color=#008000>#define finally(code) catch(...) { code; throw; } code;</FONT>

    Resource r = allocateResource();
    try {
      do_stuff(r);
    }
    finally(freeResource(r));

    Or you could do it the easy way:

    ResourceWrapper r(allocateResource());
    do_stuff(r);

  • (cs) in reply to Peter Makholm

    On the one hand, a lot of beginning C programmers coming from a Pascal or FORTRAN background are tempted to do this, because it looks more familiar and seems less inscrutable than all those braces. I'll admit to having written a header like this at first myself. The difference is, most quickly realize that it just makes things more confusing, not less, and it rarely goes beyond the first program done that way.

    OTOH, to see this in production code is really, really quite sad. Apparently, someone took the aphorism about writing FORTRAN in any language a bit too literally.


    As for syntax hacking... there are some languages which can reasonably accommodate that. C isn't one of them.

  • (cs) in reply to Schol-R-LEA

    Once upon a time, there was B.  And it was good.  But nobody ever heard of B, so we'll skip to the next chapter.  After all, we've already skipped COBOL, Fortran, and APL.

    Once upon a time, there was C.  And programmers saw that it was very good.  Indeed, its devotees considered it perfect.

    And the Standard Template Library grew and was refined.  And people forgot what the C Language was as distinct from the STL, but that mattered not since C was perfection.

    And the Pascal compiler companies moved to California to get rich off of news services.  (Well, Alice moved there, anyway.)  And C grew in its influence.

    And lo, compiler groups said they could not improve on perfection, so they developed C++.  And Micro$oft released VisualBasic.  And lo, Tcl and Tk burst forth.  (No, Tcl and Tk didn't break "Forth".)  And Perl ventured out into the wild.  And Python followed suit.  Some were awakened with Java.  But PHP wasn't considered a language, so it was widely used.  Ruby on Rails followed Ruby.  Others were busy since the beforetimes with SmallTalk, some with a LISP.  And lo, C was perfect without any need to change.

    Yeah.  C is the first letter in Crap.

    Hey, if the guy had written this using REs and SQL queries, generating p-code, he'd be onto something.  He'd be on the wrong side of the compiler executable, but he'd be onto something.  As it is, he's just personalizing the language (call it "C-me"), as others have done before and as others will do again.

  • John Hensley (unregistered) in reply to Coughptcha
    Coughptcha:

    As it is, he's just personalizing the language (call it "C-me"), as others have done before and as others will do again.

    Save your personalized languages for your personal projects, please.
  • Silex (unregistered) in reply to Coughptcha
    Coughptcha:
    And the Standard Template Library grew and was refined.  And people forgot what the C Language was as distinct from the STL, but that mattered not since C was perfection.
    ...
    And lo, C was perfect without any need to change.
    ...
    Yeah.  C is the first letter in Crap.


    To begin with, the STL is part of the C++ standard, not C. Then you seem to blame C and C++ for never changing when in fact they do! The last C revision is from 1999 and the next C++ standard is expected around 2009. The thing is that C and C++ are ISO (and ANSI) standards, so changes happen the way any ISO process does: slowly and carefully (and probably with too much consideration for backward compatibility). Comparing how fast or how often they update against languages that are not in the same category is pointless too.

    p.s.: C++ was developed by "he", not "they"
  • Silex (unregistered) in reply to Silex

    I forgot to add C/C++ are far from perfection, like any languages.

  • Miles Archer (unregistered) in reply to Silex

    Hasn't anyone heard of Cobol?

  • (cs) in reply to Silex
    Anonymous:
    Coughptcha:
    And the Standard Template Library grew and was refined.  And people forgot what the C Language was as distinct from the STL, but that mattered not since C was perfection.
    ...
    And lo, C was perfect without any need to change.
    ...
    Yeah.  C is the first letter in Crap.


    To begin with, the STL is part of the C++ standard, not C. Then you seem to blame C and C++ for never changing when in fact they do! The last C revision is from 1999 and the next C++ standard is expected around 2009. The thing is that C and C++ are ISO (and ANSI) standards, so changes happen the way any ISO process does: slowly and carefully (and probably with too much consideration for backward compatibility). Comparing how fast or how often they update against languages that are not in the same category is pointless too.

    p.s.: C++ was developed by "he", not "they"
    Oooh.  So I boo-booed. 
    s/Standard Template Library/Standard C Library/g
    s/STL/SCL/g
    Because, apparently, my point was incomprehensible with that error.  And some still won't get it.
  • Hex (unregistered) in reply to PiV

    Anonymous:
    Anonymous:
    Anonymous:

    Oh, it's just personal preference.  He didn't like the syntax, so he changed it.  Big deal.

     

     

    Well, it's obvious you don't maintain code in the real world, or else you'd understand why this sort of thing is one of the ultimate WTF's. Let's assume that the original code was around for 22 years, and ALL of the original programmers have moved on and/or died. And there's no documentation of the system. Now, one day, you inherit this code. And your boss asks you to make some significant changes, and they need to be done yesterday. Not only do you have to learn all the programming idiosyncrasies of all the programmers who came before you and modified this code, but you have to LEARN A NEW LANGUAGE!


    Don't be so melodramatic. I program in Python, Ruby, Java, C# and ObjC. Those are the languages I say I "know". However, I could sit down, right now, and fix a bug in a VB application if you gave me a couple of minutes to get comfortable.

    That being said, if someone who claims to know C sits down at this system, these syntax changes are nothing more than a nuisance. If you can't get beyond this, you don't deserve to be called a programmer.

    Sure, he didn't have to, but this is not a WTF...or at least not a major one.

    yeh man whats the big deal

    what do we need standards for, anyway?

    i invented my own version of UML the other day, i call it NSUML (not so universal markup language) sure it'll be a small nuisance, but i think you can understand it if you take a few minutes to familiarize yourself with it:

    [*]----[st]----[co]
                 |
                 |
               [B]

  • Brittny (unregistered)

    Please say this is a dream...

  • (cs) in reply to Miles Archer

    Anonymous:
    Hasn't anyone heard of Cobol?

    I quit my first job because the non-Cobol project I was promised a position on was run without me; after 15 months of programming Cobol I knew I either had to quit, go crazy, or both.
    I quit, whether I went crazy I'll leave for others to decide [:P]

  • Corentin (unregistered)

    This kind of preprocessor-based, poor man's syntactic sugar is quite commonplace in embedded systems source code (e.g. see http://www.bytecraft.com/tidbits.html).
    I use a few header files containing this kind of stuff, both to avoid falling into traps and to ease code review:

    // reset
    #define BITS__RST(variable, bits)        ((variable) &= ~(bits))
    // set
    #define BITS__SET(variable, bits)        ((variable) |= (bits))

    or

    #define ITEMS_IN_ARRAY(array)    (sizeof(array) / sizeof(*(array)))

  • Dr MindHacker (unregistered) in reply to Sam

         I agree with that; perhaps he came from having started out with a different
    language or something. I know that when I started C I used EQ to represent '=='
    just so I wouldn't make the mistake of using '=' during comparisons.

    Dr MindHacker

  • Iago (unregistered) in reply to Corentin
    Anonymous:

    #define ITEMS_IN_ARRAY(array)    (sizeof(array) / sizeof(*(array)))


    What language is that?  Because it damn well isn't C, where sizeof(array) returns the size of the POINTER to the first element of the array, which will always be the same regardless of how many items the array holds.
  • Silex (unregistered) in reply to Iago
    Anonymous:

    What language is that?  Because it damn well isn't C, where sizeof(array) returns the size of the POINTER to the first element of the array, which will always be the same regardless of how many items the array holds.


    No, it's correct if you apply it to an actual array (and not to a pointer you received via some function).
  • Not me (unregistered) in reply to Iago
    Anonymous:
    Anonymous:

    #define ITEMS_IN_ARRAY(array)    (sizeof(array) / sizeof(*(array)))


    What language is that?  Because it damn well isn't C, where sizeof(array) returns the size of the POINTER to the first element of the array, which will always be the same regardless of how many items the array holds.

    Hmm, I think you're mistaken. This program

    #include <stdio.h>

    int main(void) {
        int array[10];

        printf("%zu %zu %zu\n", sizeof(array), sizeof(*array), sizeof(array) / sizeof(*array));
        return 0;
    }

    gives the following output: 40 4 10...

  • Corentin (unregistered) in reply to Iago

    > What language is that?  Because it damn well isn't C, where sizeof(array) returns the size of the POINTER to the first element of the array, which will always be the same regardless of how many items the array holds.

    POINTER what? We are talking about arrays here.

  • kalle (unregistered) in reply to jart
    Anonymous:
    Just thank your lucky stars nobody told him how to use trigraphs

    http://en.wikipedia.org/wiki/C_trigraph



    Before ISO 8859-1 got common, ÅÄÖåäöÜü was equivalent to []}{|^~ in the ASCII variant used in Sweden, and you could see code like this:

    f() ä
      aÅ0Ä="abcÖn";
    å

    and we were fluent in reading and writing plain text with brackets: g|kst|vlar g}sb{gare m|tesb}tar sn| b}t l|k

    depending on how the terminal emulator was configured

  • kalle (unregistered) in reply to ChiefEngineer
    Anonymous:
    marvin_rabbit:
    Otto:

    Maurits:

    excerpt of cruise_missile_guidance_system.c...
    pos p = starting_position(), p_old, target = acquire_target();
    do {
        p_old = p;
        p = gps_query();
        trajectory t = vector(p_old, p);
        correction c = calculate_course(target, p, t);
        adjust_course(c);
    } forever;

    But your cruise missile doesn't need to run forever. You *obviously* need a check at the end to see if the missile has exploded, so you can exit the subroutine cleanly.


    At first I thought that we could do it by handling an exception.  Then I realized -- Naw.  Just leave it as an unhandled exception and let it blow up.


    You are probably running that control logic on a secondary thread, and that thread forces the whole process to exit when it decides it's time to blow-up...



    Hey guys I found a great website which I think could be useful for your new missile project

    http://en.wikipedia.org/wiki/Halting_problem

  • Wil (unregistered)

    My favourite like this is to go

    #define the_cows_come_home 0

    so that you can go

    do { ... } until ( the_cows_come_home );


  • (cs) in reply to Corentin
    Anonymous:
    > What language is that?  Because it damn well isn't C, where sizeof(array) returns the size of the POINTER to the first element of the array, which will always be the same regardless of how many items the array holds.

    POINTER what? We are talking about arrays here.


    Are you being serious?
  • Silex (unregistered) in reply to nickelarse
    nickelarse:
    Anonymous:
    > What language is that?  Because it damn well isn't C, where sizeof(array) returns the size of the POINTER to the first element of the array, which will always be the same regardless of how many items the array holds.

    POINTER what? We are talking about arrays here.


    Are you being serious?


    I think he is; what's wrong with what he said?

    int arr[20]; sizeof(arr); // correct: yields sizeof(int[20]), i.e. 20 * sizeof(int)
    int arr[20]; int* p = arr; sizeof(p); // only yields sizeof(int*), regardless of the array's length
  • (cs) in reply to Iago

    Anonymous:
    Anonymous:

    #define ITEMS_IN_ARRAY(array)    (sizeof(array) / sizeof(*(array)))


    What language is that?  Because it damn well isn't C, where sizeof(array) returns the size of the POINTER to the first element of the array, which will always be the same regardless of how many items the array holds.

    sizeof is one of the few contexts where the type of the array identifier does not automatically decay to a pointer; sizeof will return the total number of bytes used by that array, so provided that the macro argument is really of an array type, the expression above will return the right value. 

     

  • Gary (unregistered) in reply to Arancaytar

    Creating a do_nothing macro or procedure is actually useful for debugging; it allows you to place a breakpoint inside an otherwise empty "else" clause.

  • Corentin (unregistered) in reply to nickelarse

    > Are you being serious?

    Of course.
    Arrays and pointers are two (very) different beasts; read this part of the C FAQ for more information: http://c-faq.com/aryptr/index.html (a very interesting part indeed, because many people do not actually understand this corner of the sometimes weird semantics of C).

  • (cs) in reply to BAReFOOt

    The difference is that when you sit down to work on a Haskell program, you expect that behavior. When you sit down to work on a C program, you don't, so when it happens, it's annoying.

    Also, I wonder why there aren't many enterprise type apps written in Haskell? hhmmm...

  • (cs) in reply to Corentin
    Anonymous:

    Arrays and pointers are two (very) different beasts; read this part of the C FAQ for more information : http://c-faq.com/aryptr/index.html (a very interesting part indeed, because many people do not actually understand this part of the sometimes weird semantics of C).


    People always get arrays and pointers confused, but most especially when you consider it in terms of the sizeof() operation. What people really don't understand is that while sizeof() looks like a function, and acts like a function, it's actually evaluated at compile time. So if you did something like this:

    #include <stdio.h>

    void myfunc(int *array)
    {
        printf("%zu ", sizeof(array));
    }

    int main(void)
    {
        int array[5];
        printf("%zu ", sizeof(array));
        myfunc(array);
        return 0;
    }

    What you get will be 20 followed by 4 (assuming 32-bit ints and pointers). In main, "array" is an array of 5 ints, which is 20 bytes long. In myfunc(), it's still the same array, but you're not printing the size of it, you're printing the size of a pointer to it. sizeof() is evaluated at compile time to 20 and 4, respectively.

  • (cs) in reply to Corentin

    sigh

    In C, array indexing is syntactic sugar for pointer arithmetic.

    a[3] is actually something like *(a + 3)

  • (cs) in reply to prehnRA

    I just noticed your other posts, so it seems that you are in fact, aware of this.

    There's no promise that the array in question will, in fact, be treated as an array in sizeof().
    Mostly, it will only be treated as an array if it is a local. Otherwise, it will decay to a pointer.

  • (cs) in reply to prehnRA
    prehnRA:
    The difference is that when you sit down to work on a Haskell program, you expect that behavior. When you sit down to work on a C program, you don't, so when it happens, it's annoying.

    Also, I wonder why there aren't many enterprise type apps written in Haskell? hhmmm...

    Because most people don't have the ability to understand functional programming, or find it too difficult to learn, or just don't want to learn FP.

    Oh, and FP does not have giant corporations behind it, FP doesn't "feel natural" unless you have a heavy math background, and when programmers have a hard time grokking FP there's no way in hell most managers could even begin to understand it...

  • Anonymous (unregistered) in reply to NoName

    There are just too many idiot programmers out there.


    Yes. They're easy to identify. They use C by choice.
  • Anonymous (unregistered) in reply to Maurits

    Except real cruise-missile software requires a level of reliability that is too expensive to achieve in C, so it's written in Ada.

  • (cs) in reply to ChiefEngineer
    Anonymous:
    marvin_rabbit:
    Otto:

    Maurits:

    excerpt of cruise_missile_guidance_system.c...
    pos p = starting_position(), p_old, target = acquire_target();
    do {
        p_old = p;
        p = gps_query();
        trajectory t = vector(p_old, p);
        correction c = calculate_course(target, p, t);
        adjust_course(c);
    } forever;

    But your cruise missile doesn't need to run forever. You *obviously* need a check at the end to see if the missile has exploded, so you can exit the subroutine cleanly.


    At first I thought that we could do it by handling an exception.  Then I realized -- Naw.  Just leave it as an unhandled exception and let it blow up.


    You are probably running that control logic on a secondary thread, and that thread forces the whole process to exit when it decides it's time to blow-up...

    Hmmm ... being paid to write code whose intent is to crash the system ... *into* other things ... cool ...

  • Szabi (unregistered)

    #define BEGIN {
    #define END }

    is not enough!!

    It should be:

    #define begin {
    #define begiN {
    #define begIn {
    #define begIN {
    #define beGin {
    #define beGiN {
    #define beGIn {
    #define beGIN {
    #define bEgin {
    // ... and so on...
    #define BEGIN {

  • Corentin (unregistered) in reply to prehnRA

    Also, I wonder why there aren't many enterprise type apps written in Haskell? hhmmm...


    Probably because it is kind of like the chicken/egg paradox: few people use Haskell (or Eiffel, or Scheme, or any other good-but-otherwise-obscure-in-the-industry language) because... few people use it, so few good tools and libraries are available.

    As harsh as it sounds, it is much less risky for a development team to have to deal with funky run-time errors / prehistoric programming languages than with potential platform obsolescence*.


    * not in the technical meaning of the term.
  • (cs) in reply to Javariel
    Javariel:
    Anonymous:
    Skinnable programming languages - it's the future. Don't like your current language? Don't want to go to the hassle of writing your own language and compiler? Just pick a language that's close enough and skin it.


    Come on, you aren't thinking hard enough.  The future is an XML programming language, with all the variable names, function names, and operators as their own tags.  The IDE would then be an XML parser, which when opening a file would convert the XML elements into your preselected viewing preference, and convert back to XML on a save.  The interpreter can JIT compile the XML into the byte code of your choice by means of a virtual byte code processor, which can then be interpreted by the VM of your choice.

    Minimum specs for running any program:  Athlon64x8 4000 with 10GB of RAM.


    Great idea, I've already started writing all my programs in this new XML based language, I like to call it 'Better XML'
    <?xml version="1.0" encoding="iso-8859-1"?>
    <Better_XML>
        <variable type="boolean" name="is_xml" value="true"/>
        <if>
            <expression>
                <call type="function">isTrue
                    <argument name="is_xml"/>
                </call>
            </expression>
            <statements>
                <print>WTF!</print>
                <exit/>
            </statements>
        </if>
       
        <function name="isTrue">
            <parameter type="boolean" name="value_to_check"/>
            <if>
                <expression>
                    <call type="function">isTrue
                        <argument name="value_to_check"/>
                    </call>
                </expression>
                <statements>
                    <return>true</return>
                </statements>
            </if>
            <else>
                <statements>
                    <return>false</return>
                </statements>
            </else>
        </function>
    </Better_XML>
  • (cs) in reply to prehnRA

    prehnRA:
    I just noticed your other posts, so it seems that you are in fact, aware of this.

    There's no promise that the array in question will, in fact, be treated as an array in sizeof().
    Mostly, it will only be treated as an array if it is a local. Otherwise, it will decay to a pointer.

    If the sizeof operand has an array type in the current scope, then it does not decay to a pointer.  You're thinking of passing an array as an argument to a function and then using that argument as the sizeof operand, in which case the decay happened as part of the function call, and the sizeof operand is already of pointer type. 

  • (cs) in reply to Silex

    Sorry, I think I misunderstood the entire post.  The way I read it, it looked like someone had posted something that implied

    int *p = new int[20];
    sizeof(p);   // returns sizeof(int)

    (which is correct)

    and that Anonymous was claiming that sizeof(p) would return sizeof(int) * 20, which is obviously wrong.

    It would appear I misread the whole thread.  Sorry about that.

  • (cs) in reply to nickelarse
    nickelarse:

    int *p = new int[20];
    sizeof(p);   // returns sizeof(int)


    No, sizeof(p) returns sizeof(int*), though on most x86 implementations a pointer happens to be the same size as an int.
  • Dorks (unregistered)

    Seems this guy was trying to fix some of the obvious flaws of C, everyone should take note.

  • Sam (unregistered)

    Wow, I'm impressed that my little comment garnered so much attention and stirred such controversy. 

    I was going to respond, but most of my points have already been made.  The only thing I'd add is that I have maintained <BUZZWORD>enterprise-level</BUZZWORD> code, and would agree this wouldn't be particularly advisable for it, though I still think a coder should be able to adapt to it fairly easily.  Also, we don't actually know whether this application is <BUZZWORD>enterprise-level</BUZZWORD>, as there's no mention made of it.

  • SuperKoko (unregistered) in reply to SeekerDarksteel
    SeekerDarksteel:

    Awww, I saw the GT, GE, and LTs and thought FORTRAN.  Unfortunately those control structures don't look like they're from FORTRAN.  I was hoping there would finally be a WTF I could share with my co-workers.  :(



    OMG!
    I wonder how old this guy is!
    The repeat, unless, until, combined with GT, GE, LT... it looks like BCPL.
    BCPL is an old untyped language.
    Untyped doesn't mean dynamically typed.
    It means that there is no type checking (a bit like assembly).
    The only type is a word. And, a word could be dereferenced, or used as a function pointer, or as a vector (pointer to array), or as an integer, or as a boolean, or as a label of a goto.

    I wonder if there were:
    #define rv *
    #define lv &
    #define let union {void* p; int i;}


  • (cs) in reply to SeekerDarksteel

    I was reminded of Ratfor, a "Rational Fortran" language devised by Kernighan and Plauger in about 1975.  This used a preprocessor to provide "modern control-flow statements" to FORTRAN (if-else, for, break, repeat-until).  FORTRAN was then the most widely available "standard" language.  What must FORTRAN programmers have thought?

    And then, not content with tinkering with FORTRAN, Kernighan went on to devise a completely new language...

  • (cs) in reply to Silex
    Silex:
    nickelarse:

    int *p = new int[20];
    sizeof(p);   // returns sizeof(int)


    No, sizeof(p) returns sizeof(int*). Now the size of any pointer has the same size than an int on most x86 implementations.


    That's what I meant.  Typo.  I'm making myself look totally incompetent now... ooops.
  • Merlin (unregistered) in reply to Trevor Raynsford
    Trevor Raynsford:

    And then, not content with tinkering with FORTRAN, Kernighan went on to devise a completely new language...



    There's a fine line between constructive criticism and plain trolling...

Leave a comment on “The Secret to Better C”
