• TheCPUWizard (unregistered)

    Just hope that the typedefs do not occur (in terms of order of source processing) after the #defines... then they are illegal. One of the reasons to always check header idempotence - it prevents things like that from sneaking up on you...
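    A minimal sketch of the ordering hazard being described here (identifiers are the ones from the article):

    // If the preprocessor sees the #define first...
    #define byte unsigned char
    // ...then this typedef is rewritten before the compiler proper sees it:
    typedef unsigned char byte;
    // i.e. "typedef unsigned char unsigned char;" -- which is not legal C or C++.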

  • (nodebb)

    I was genuinely curious what was going to be the catch, but this... this is just delightful!

  • asdf (unregistered)

    Someone was too lazy to do a global search and replace for "byte" and "word", so they did this instead. And I will bet you anything they first tried putting those #defines above the typedefs.

  • (nodebb)

    As always, it's worth noting that short has a minimum size, but no maximum size defined in the spec. So assuming that a word is two bytes is potentially a mistake.

    Indeed, because they might be the same size. Or short might be 72 bits while char might be just 9.
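    For code that really does depend on an 8-bit byte and a 16-bit word, the usual defence is a build-time check; a minimal sketch, assuming a C++11-or-later toolchain:

    #include <climits>
    // The standard only guarantees minimums (CHAR_BIT >= 8, unsigned short >= 16 bits),
    // so verify the assumptions the typedefs bake in:
    static_assert(CHAR_BIT == 8, "char is not exactly 8 bits");
    static_assert(sizeof(unsigned short) == 2, "short is not exactly 2 chars");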

    One header doing a typedef, and another header doing a #define would be bad, but that's just the life of a C programmer, as you pull together modules written by different people across different years. I could understand having that happen. But right next to each other?

    Actually, having the defines in one header and the typedefs in another could make the code not compile, because if the header with the defines comes first in compilation order, the typedefs become:

    typedef unsigned char unsigned char;
    typedef unsigned short unsigned short;
    

    and the compiler will barf up some lovely error messages as a result.

    But having the two definitions next to each other like that is ... weird.

  • Sauron (unregistered)
    typedef unsigned short wtf;
    #define me not_frist
    
  • Jajcus (unregistered)

    Maybe there were some preprocessor conditionals around those (for compatibility with weird compilers or libraries), but then the conditions were removed during some refactoring and both definitions got left behind, because no one caught the mistake, as the code compiled and worked correctly.
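    A sketch of what that hypothetical earlier version might have looked like (the condition name is invented here):

    // Hypothetical original, before the refactoring removed the condition:
    #if defined(WEIRD_LEGACY_COMPILER)   // invented macro name
    #define byte unsigned char
    #define word unsigned short
    #else
    typedef unsigned char  byte;
    typedef unsigned short word;
    #endif
    // Delete just the #if/#else/#endif lines during a careless cleanup,
    // and both the #defines and the typedefs survive side by side.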

  • Sauron (unregistered)

    but it still leaves me scratching my head, trying to understand how we got here. I could understand it better if these four lines weren't all together.

    Maybe that happened when rebasing/merging code, like in Happy Merge Day? https://thedailywtf.com/articles/Happy_Merge_Day!

    Like, there could have been several months' worth of code changes merged in a very short time, turning the codebase into the equivalent of a Chernobyl meltdown, and whoever did the merging had to hastily fix hundreds of code conflicts, leaving radioactive fallout littered everywhere in the codebase and filling the bugfix backlog for the entire century.

  • Dr, Pepper (unregistered) in reply to asdf

    This is an easy way to CYA. A global search/replace means you've changed a ton of source files, which you are now responsible for. Instead, make the change in a header file. Since header files don't contain actual code, you won't be considered responsible if that code breaks.

  • (nodebb)

    I think it's fair to point out that typedef is not an alias; it actually creates a new type. Auto-cast rules still apply, but there are edge cases where a real alias (aka a macro, aka #define) acts like the original type while a typedef acts differently. And because it's C, it also depended heavily on the compiler and platform in use. Overall, typedef as a real type definition acted "stronger" than a pure #define, which was only a text replacement and nothing more.

    Addendum 2024-06-03 09:28: BTW, that discrepancy explains the "revert" with the follow-up #defines. A classic quick-and-dirty "fix".

  • (nodebb) in reply to MaxiTB

    Can you give examples of these differences? The only thing I can think of is that there may be cases where the macro needs parentheses (just like macros that expand to expressions), whereas the typedef is treated as a unit.

  • (nodebb) in reply to Barry Margolin

    I remember a lot of compilers complaining with a warning when auto-casting.

    typedef unsigned char number_t;
    void foo(number_t value) { /* bar */ }

    void caller(void) {
        unsigned char number = 0;
        foo(number); // WARNING on that compiler
    }

    I remember an IBM C++ compiler (I think it was one for AIX) generating method signatures from the typedef name rather than the underlying type:

    typedef unsigned char number_t;
    class it_just_works
    {
      void method1(unsigned char x) { }
      void method2(number_t x) { }
    };
    

    Obviously you can't do that with a text-replacing macro (#define).

    Addendum 2024-06-03 11:25: Oh, and yeah, it has implications with nested #define statements: if you have two single keywords it can cause issues in some places, because while you can use () around expressions, you can't use them around types. But I can't remember exactly what problem I hit back then; that was decades ago :-)

  • (nodebb)

    I think it's fair to point out that typedef is not an alias; it actually creates a new type.

    cppreference.com disagrees.

    From https://en.cppreference.com/w/c/language/typedef (about C):

    The typedef declaration provides a way to declare an identifier as a type alias

    typedef declaration does not introduce a distinct type, it only establishes a synonym for an existing type

    From https://en.cppreference.com/w/cpp/language/typedef (about C++):

    The typedef names are aliases for existing types, and are not declarations of new types

    // simple typedef
    typedef unsigned long ulong;

    // the following two objects have the same type
    unsigned long l1;
    ulong l2;

  • (nodebb) in reply to MaxiTB

    I remember an IBM C++ compiler (I think it was one for AIX) generating method signatures from the typedef name rather than the underlying type

    Then it wasn't a C++ compiler (which I knew anyway), because C++ says that they are aliases, and therefore the same type. It was, in effect, a compiler for a different language whose syntax is identical to C++'s syntax.

    But I also remember another thing it did. It would work together with the linker to remove global (file-scope or extern-scope) variables that weren't explicitly referenced anywhere(1), which broke systems based on "my global object registers itself with this compile-time plugin system by calling from its constructor", which don't require anything to explicitly reference the global object.

    (1) That is, not counting the implicit references that enable things like collect2 to do their jobs.
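    For readers who haven't met that idiom, here is a minimal sketch (all names invented) of the kind of self-registering global that such dead-global stripping breaks:

    #include <string>
    #include <vector>

    // Registry of plugin names; a function-local static avoids
    // initialization-order problems.
    static std::vector<std::string>& plugin_registry() {
        static std::vector<std::string> names;
        return names;
    }

    struct Registrar {
        explicit Registrar(const char* name) { plugin_registry().push_back(name); }
    };

    // Nothing else in the program names this object; its constructor's
    // side effect is the whole point. A toolchain that discards
    // "unreferenced" globals silently drops the registration.
    static Registrar my_plugin_registration("my_plugin");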

  • (nodebb) in reply to Steve_The_Cynic

    Haha, keep in mind that in the old days there was a race between C/C++ compiler vendors; they often implemented features even before they got standardized, and sometimes, years later, when the standard finally included the feature, they added compatibility compiler switches to keep their legacy solutions working while also supporting the feature in the standard fashion. IBM was king of this, but Watcom, Borland, Microsoft and many more were not really that far off (especially when it comes to C++, because it took over a decade to get all the initial features standardized while still leaving tons of things undefined; name mangling was a major one).

    So nah, it was a C/C++ compiler; I just don't remember which one it was or which combination of legacy switches resulted in the behavior, because it was over a decade ago ;-)

  • Entropy (unregistered) in reply to Barry Margolin

    As has been noted, in standard C/C++, a typedef is simply an alias. I can think of two cases where the #define works differently from a typedef, though (see the sketch after the list):

    1. (C++ only) With a typedef, byte(0) is a valid expression, but with the #define it expands to unsigned char(0) which is not (the type would need to be parenthesized - but note that C++ also has some weird corner cases where adding extra parentheses changes semantics even though it doesn't change the grouping).
    2. If one had instead written typedef int foo; or #define foo int then unsigned foo x; is a valid declaration with the #define but not with the typedef.
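    A sketch of both cases side by side (the _t/_m suffixes are added here only so the typedef and the macro flavours can coexist in one snippet):

    typedef unsigned char byte_t;   // the typedef flavour
    #define byte_m unsigned char    // the #define flavour

    typedef int foo_t;
    #define foo_m int

    // Case 1 (C++ only): a functional-style cast works with the typedef name...
    unsigned char a = byte_t(0);    // OK
    // ...but the macro expands to "unsigned char(0)", which does not parse:
    // unsigned char b = byte_m(0); // error

    // Case 2: "unsigned" combines with the macro (it expands to "unsigned int"),
    // but not with a typedef name:
    unsigned foo_m x1;              // OK: "unsigned int x1;"
    // unsigned foo_t x2;           // error: cannot combine with a typedef name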
  • (nodebb)

    Seems pretty clear to me. Back in the 8- or 16-bit days somebody did the typedefs and wrote the whole app using those aliases, which failed badly once the bitness of the target processor(s) changed.

    The #defines were added to essentially render the typedefs invisible; regardless of the source's benighted use of byte and word, the compiler only saw unsigned char and unsigned short and did the right thing for whatever bitness they were then targeting.

    In my interpretation, putting the 4 lines together is far better than spreading them someplace else in the source tree. Of course, some comments in the code and/or in the commit to source control would have been nicer yet.
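    A sketch of the mechanism that interpretation relies on (the typedefs and #defines are the article's; the final use of word is invented for illustration):

    // Original era: the whole app was written in terms of these aliases.
    typedef unsigned char  byte;
    typedef unsigned short word;

    // Added later: from here on, every use of "byte"/"word" is textually
    // replaced before the compiler ever consults the typedefs, so it only
    // sees the literal types.
    #define byte unsigned char
    #define word unsigned short

    word checksum;   // the compiler sees: unsigned short checksum;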
