Admin
Just hope that the typedefs do not occur (in terms of source-processing order) after the #defines... then they are illegal. One of the reasons to always check header idempotence - it prevents things like that from sneaking up on you.
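For instance (header names invented purely for illustration), if things end up in this order, the preprocessor mangles the typedefs before the compiler proper ever sees them:

    // defs.h (hypothetical name)
    #define byte unsigned char
    #define word unsigned short

    // types.h (hypothetical name), included after defs.h
    typedef unsigned char byte;    // preprocessor output: typedef unsigned char unsigned char;
    typedef unsigned short word;   // preprocessor output: typedef unsigned short unsigned short;
    // both lines are now syntax errors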
Admin
I was genuinely curious what was going to be the catch, but this... this is just delightful!
Admin
Someone was too lazy to do a global search and replace for "byte" and "word", so they did this instead. And I will bet you anything they first tried putting those #defines above the typedefs.
Admin
Indeed, because they might be the same size. Or short might be 72 bits while char might be just 9.

Actually, having the defines in one header and the typedefs in another could make the code not compile, because if the header with the defines comes first in compilation order, the typedefs become:

    typedef unsigned char unsigned char;
    typedef unsigned short unsigned short;

and the compiler will barf up some lovely error messages as a result.

But having the two definitions next to each other like that is ... weird.
Admin
Maybe that happened when rebasing/merging code, like in Happy Merge Day? https://thedailywtf.com/articles/Happy_Merge_Day!
Like, there could have been several months' worth of code changes merged in a very short time, turning the codebase into the equivalent of a Chernobyl meltdown, and whoever did the merging had to hastily fix hundreds of code conflicts, leaving radioactive fallout littered everywhere in the codebase and filling the bugfix backlog for the entire century.
Admin
This is an easy way to CYA. A global search/replace means you've changed a ton of source files, which you are now responsible for. Instead, make the change in a header file. Since header files don't contain actual code, you won't be considered responsible if that code breaks.
Admin
I think it's fair to point out that typedef is not an alias - it actually creates a new type. Auto-cast rules still apply, but there are edge cases where a real alias (a.k.a. macro, a.k.a. #define) acts like the original type while a typedef acts differently. And because it's C, it also depended heavily on the compiler and platform in use. Overall, typedef as a real type definition acted "stronger" than a pure #define, which was only a text replacement and that's it.
Addendum 2024-06-03 09:28: BTW, that discrepancy explains the "revert" with the follow-up #defines. A classic quick-and-dirty "fix".
Admin
Can you give examples of these differences? The only thing I can think of is that there may be cases where the macro needs parentheses (just like macros that expand to expressions), whereas the typedef is treated as a unit.
Admin
I remember a lot of compilers complaining with a warning when auto-casting.
I remember an IBM C++ compiler (I think it was one for AIX) generating method signatures from the typedef name, not the underlying type:
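Something along these lines (the exact sample is long gone, and the names here are illustrative) - standard C++ treats the two declarations as one and the same function, but that compiler mangled them as two distinct overloads:

    typedef unsigned char byte;

    void write(unsigned char b);   // write(unsigned char)
    void write(byte b);            // standard C++: redeclares the same function,
                                   // since byte is only an alias; that compiler
                                   // instead mangled it as a separate overload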
You can't do this, obviously, with a text-replacing macro using #define.
Addendum 2024-06-03 11:25: Oh, and yeah, it has implications with nested #define statements, because with two single keywords you can run into trouble in some places - while you can use () around expressions, you can't use them around types. But I can't remember exactly the problem I had back then; that was decades ago :-)
Admin
cppreference.com disagrees. Both https://en.cppreference.com/w/c/language/typedef (about C) and https://en.cppreference.com/w/cpp/language/typedef (about C++) say that a typedef name is an alias for an existing type, not a distinct new type.
Admin
Then it wasn't a C++ compiler (which I knew anyway), because C++ says that they are aliases, and therefore the same type. It was, in effect, a compiler for a different language whose syntax is identical to C++'s.
But I also remember another thing it did. It would work together with the linker to remove global (file-scope or extern-scope) variables that weren't explicitly referenced anywhere(1), which broke systems based on "my global object registers itself with this compile-time plugin system by calling from its constructor", which don't require anything to explicitly reference the global object.
(1) That is, not counting the implicit references that enable things like collect2 to do their jobs.
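For anyone who hasn't seen that pattern, a minimal sketch (all names made up):

    #include <vector>

    using PluginFn = void (*)();

    std::vector<PluginFn>& registry() {
        static std::vector<PluginFn> r;   // function-local static dodges init-order issues
        return r;
    }

    struct Registrar {
        explicit Registrar(PluginFn fn) { registry().push_back(fn); }
    };

    static void myPlugin() { /* plugin work */ }

    // Nothing ever names 'reg' again; its whole purpose is the constructor's
    // side effect. A toolchain that throws away "unreferenced" globals
    // silently removes the registration along with it.
    static Registrar reg(&myPlugin);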
Admin
Indeed, C and C++ books have tended to be oddly emphatic that typedef does not create a new type.
Admin
I thought the code was C, not C++, so cppreference might not be the right site to cite.
Admin
These days, if you need a specific type size, use <inttypes.h>. There you have access to signed/unsigned ints of 8/16/32/64 bits. Even better, these sizes are guaranteed - if you ask for a uint8_t, you will get an 8-bit unsigned int. There are also printf and scanf format macros which are a little arcane to use, but guarantee that you'll use the right formatting codes for the type in question. And there's intptr_t (and uintptr_t), an integer type large enough to hold any pointer.
Remember, the only ordering guarantee on sizes that C gives is: sizeof(char) <= sizeof(short) <= sizeof(int) <= sizeof(long) <= sizeof(long long)
On some architectures, especially in the past, this could mean they were all the same size (e.g., 32 bits).
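For example (shown in C++ spelling; the C version is identical apart from the headers and the std:: prefixes):

    #include <cinttypes>   // fixed-width types and the PRI* format macros
    #include <cstdio>

    int main() {
        std::uint8_t   flags = 0xA5;   // exactly 8 bits, unsigned
        std::int32_t   delta = -42;    // exactly 32 bits, signed
        std::uintptr_t addr  = reinterpret_cast<std::uintptr_t>(&delta);

        // The PRI* macros expand to the right conversion specifier per type.
        std::printf("flags=%" PRIu8 " delta=%" PRId32 " addr=%" PRIxPTR "\n",
                    flags, delta, addr);
    }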
Admin
@Steve - just a side note that since C++11 (wow, well over a decade now) typedefs are no longer the generally preferred approach in C++...
https://en.cppreference.com/w/cpp/language/type_alias
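For example (Grid is just an invented name):

    #include <vector>

    using byte = unsigned char;    // same meaning as: typedef unsigned char byte;
    using word = unsigned short;

    template <typename T>
    using Grid = std::vector<std::vector<T>>;   // alias template: no typedef equivalent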
Admin
As has been noted, in standard C/C++, a typedef is simply an alias. I can think of two cases where the #define works differently to a typedef, though:
- byte(0) is a valid expression, but with the #define it expands to unsigned char(0), which is not (the type would need to be parenthesized - but note that C++ also has some weird corner cases where adding extra parentheses changes semantics even though it doesn't change the grouping).
- Given typedef int foo; or #define foo int, then unsigned foo x; is a valid declaration with the #define but not with the typedef.
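Both cases in compilable form (the BYTE/FOO spellings are mine, purely to keep the macro and typedef variants side by side):

    typedef unsigned char byte;    // alias: parsed as a single type name
    #define BYTE unsigned char     // macro: raw text substitution

    typedef int foo;
    #define FOO int

    int main() {
        byte a = byte(42);         // OK: functional cast of a one-word type name
        // BYTE b = BYTE(42);      // error: becomes "unsigned char(42)", and a
                                   // two-word type name can't be functionally cast
        BYTE c = (BYTE)42;         // OK: a C-style cast survives the expansion

        unsigned FOO x = 0;        // OK: becomes "unsigned int x"
        // unsigned foo y = 0;     // error: "unsigned" can't modify a typedef name
        return a + c + x;
    }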
Admin
Seems pretty clear to me. Back in the 8- or 16-bit days, somebody did the typedefs and wrote the whole app using those aliases. Which failed badly once the bitness of the target processor(s) changed.
The #defines were added to essentially render the typedefs invisible; regardless of the source's benighted use of byte and word, the compiler only saw unsigned char and unsigned short and did the right thing for whatever bitness they were then targeting.
In my interpretation, putting the 4 lines together is far better than spreading them someplace else in the source tree. Of course, some comments in the code and/or in the commit to source control would have been nicer yet.