• (disco)

    :wtf:

    <frist

  • (disco)
    ``` big_int_buf = buf_mem; big_double_buf = buf_mem; ```
    Why would this _ever_ be anything even approaching a reasonable idea?
    
  • (disco) in reply to Fox

    What? It deduplicates memory and you only have to call malloc once, right?

  • (disco) in reply to Fox

    I can't think of any. Though I have seen code like:

    struct header * header = mem;
    int * ints = (int *)(header + 1);                 /* ints packed right after the header */
    double * floats = (double *)(ints + numints);     /* doubles packed right after the ints */
    unsigned stroff = (char *)(floats + numfloats) - (char *)mem; /* byte offset of the string area */
    
  • (disco) in reply to PleegWat
    PleegWat:
    Though I have seen code like:

    You can get into bad trouble with such trickery. It's how you discover that the platform you're running on has non-trivial alignment requirements. :smile:
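    A hedged sketch of how that layout can be done without tripping over alignment: round each sub-array's offset up to its element type's alignment instead of assuming the previous array happens to end on a suitable boundary. The names (`align_up`, `plan`) are illustrative, not from the thread:

```cpp
#include <cassert>
#include <cstddef>

// Round an offset up to the next multiple of `align` (align must be a power of two).
static std::size_t align_up(std::size_t off, std::size_t align) {
    return (off + align - 1) & ~(align - 1);
}

struct header { std::size_t numints, numfloats; };

// Byte offsets of each sub-array within one big allocation.
struct layout { std::size_t ints_off, floats_off, str_off, total; };

// Compute each offset with explicit alignment, rather than trusting that the
// previous array's end is suitably aligned for the next element type.
static layout plan(std::size_t numints, std::size_t numfloats, std::size_t strbytes) {
    layout l;
    l.ints_off   = align_up(sizeof(header), alignof(int));
    l.floats_off = align_up(l.ints_off + numints * sizeof(int), alignof(double));
    l.str_off    = l.floats_off + numfloats * sizeof(double);  // char needs no padding
    l.total      = l.str_off + strbytes;
    return l;
}
```

    On platforms where `double` wants 8-byte alignment, the pointer-arithmetic version above can hand you a misaligned `double *`; this one can't.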

  • (disco)

    Hang about… That code isn't even type-correct, so the compiler should be shitting bricks at this point. Also, it will be reading from memory that hasn't been written to, because the […] operator in C does a dereference and the author didn't know about calloc(). Jesus Wept! This is awful.

    The only right thing to do with this code is to delete it. The only right thing to do with the code's author is to erase them.

  • (disco) in reply to dkf

    Yeah, the only thing that for() loop does is write undefined values into the various *_in and *_out buffers. Assuming those pointers are previously initialized, of course.

  • (disco)
  • (disco) in reply to dkf
    dkf:
    Hang about… That code isn't even type-correct, so the compiler should be shitting bricks at this point. Also, it will be reading from memory that hasn't been written to, because the […] operator in C does a dereference and the author didn't know about calloc(). Jesus Wept! This is awful.

    The only right thing to do with this code is to delete it. The only right thing to do with the code's author is to erase them.

    "yeah ... the compiler gives lots of warnings when I compile this project, they're just annoying so I disabled them. I mean, it compiles okay and produces an object, we don't have time to waste chasing warnings around, we've got bugs to fix."

  • (disco) in reply to Fox
    Fox:
    ``` big_int_buf = buf_mem; big_double_buf = buf_mem;``` Why would this _ever_ be anything even approaching a reasonable idea?

    Sounds perfectly rasinable to me.

  • (disco) in reply to Fox
    Fox:
    ``` big_int_buf = buf_mem; big_double_buf = buf_mem; ```

    *cough* strict aliasing *cough*
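    For anyone wondering what the cough is about: once `big_int_buf` and `big_double_buf` alias the same storage, the optimizer is entitled to assume a write through one is invisible through the other. A minimal sketch of the well-defined alternative for reinterpreting bytes, `std::memcpy` (the helper name `bits_of` is made up for illustration):

```cpp
#include <cassert>
#include <cstring>

// Writing through a double* and reading the same bytes back through an int*
// violates strict aliasing. Copying the object representation with std::memcpy
// is the defined way to look at a double's bits.
static long long bits_of(double d) {
    long long out;
    static_assert(sizeof(out) == sizeof(d), "size mismatch");
    std::memcpy(&out, &d, sizeof(d));  // defined: copies the representation
    return out;
}
```

    Modern compilers turn the `memcpy` into a single register move, so there is no performance excuse for the aliasing version.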

  • (disco) in reply to Quite

    That was exactly my thought when I read that....

    If you are insane enough to make something like that, it makes perfect sense to you to just ignore those pesky warnings.

    (This is why I always write code with the maximum warning verbosity the compiler/interpreter allows. That makes it less likely that any typos / wtfs on my part make it past my testing and into production. Note that I say less likely, not impossible... :-) )

  • (disco) in reply to asdf
    asdf:
    Fox:
    ``` big_int_buf = buf_mem; big_double_buf = buf_mem; ```

    *cough* strict aliasing *cough*

    This isn't Fortran, you know. Fortran has strict, strict rules about aliasing, and it's a fundamental rule that aliasing will cause problems, but that compilation will proceed as if there isn't any aliasing going on. But of course it also has the ```EQUIVALENCE``` keyword for overlaying one variable over another, which also allows you to give a specific name to an array element...

    This, I think, led some fine colleagues (at a previous workplace) to nominate themselves as GAU-8 targets by overlaying C++ objects (carefully crafted to permit a maximum of UB) over C-language declarations approximately equivalent (pun most definitely intended) to some grubby-looking Fortran COMMON blocks heavily dosed with EQUIVALENCE.

    FunkyStupidClass &funky_stupid_variable = *(reinterpret_cast<FunkyStupidClass *>(&c_name_for_fortran_variable));
    

    All in the name of creating a syntactic sugar class...

    EDIT: left off a closing parenthesis...

  • (disco) in reply to Steve_The_Cynic

    Anyone who uses reinterpret_cast for anything else than casting from/to char* in cases where that's absolutely necessary deserves to be bludgeoned to death with a clue bat.

  • (disco) in reply to asdf
    asdf:
    Anyone who uses reinterpret_cast for anything else than casting from/to char* in cases where that's absolutely necessary ~~deserves to be bludgeoned to death with a clue bat~~ hasn't read the standard.
    FTFY
  • (disco) in reply to asdf
    asdf:
    Anyone who uses `reinterpret_cast` for anything else than casting from/to `char*` in cases where that's absolutely necessary deserves to be bludgeoned to death with a clue bat.
    FooClass foo;
    char* ptr = reinterpret_cast<char*>(&foo);
    BarClass& bar = *reinterpret_cast<BarClass*>(ptr);
    

    :trollface:

  • (disco) in reply to cvi
    cvi:
    asdf:
    Anyone who uses `reinterpret_cast` for anything else than casting from/to `char*` in cases where that's absolutely necessary deserves to be bludgeoned to death with a clue bat.
    FooClass foo;
    char* ptr = reinterpret_cast<char*>(&foo);
    BarClass& bar = *reinterpret_cast<BarClass*>(ptr);
    

    :trollface:

    Following the letter of what he said but exploding the spirit of it like a blood sausage(1). Well done.

    (1) When I was much younger than I am now, I played Wasteland. I have carried some of its wording with me ever since.

  • (disco) in reply to asdf
    asdf:
    Anyone who uses `reinterpret_cast` for anything else than casting from/to `char*` in cases where that's absolutely necessary deserves to be bludgeoned to death with a clue bat.
    The cast that always amused me was ```dynamic_cast<void *>(pointer)```. I even used it once in some deeply unportable and probably UB debugging aids.
  • (disco) in reply to Steve_The_Cynic
    Steve_The_Cynic:
    blood sausage

    http://www.thehaggis.com/EZ/sh/sh/imagelibrary/black-gold-stick-with-slices_350.jpg

  • (disco) in reply to Steve_The_Cynic
    Steve_The_Cynic:
    Following the letter of what he said but exploding the spirit of it like a blood sausage(1). Well done.

    To be fair, the standard doesn't have this "loophole"; it states (IIRC) that you're not allowed to access one object through a pointer of a different/incompatible type (exception: char*). So casting is OK, using the resulting pointer isn't.
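    A small sketch of the direction that is allowed: any object's bytes may be inspected through an `unsigned char` pointer; it's treating a raw byte buffer as some unrelated type that the standard forbids. The `byte_sum` helper is illustrative:

```cpp
#include <cassert>
#include <cstddef>

// The aliasing rules carve out char/unsigned char: the storage of any object
// may be examined byte-by-byte through a character pointer. Going the other
// way (reading a char buffer as an unrelated type) is what's disallowed.
static unsigned byte_sum(const void* obj, std::size_t n) {
    const unsigned char* p = static_cast<const unsigned char*>(obj);
    unsigned sum = 0;
    for (std::size_t i = 0; i < n; ++i) sum += p[i];
    return sum;
}
```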

  • (disco) in reply to Steve_The_Cynic
    Steve_The_Cynic:
    This, I think, led some fine colleagues (at a previous workplace) to nominate themselves as GAU-8 targets by overlaying C++ objects (carefully crafted to permit a maximum of UB) over C-language declarations approximately equivalent (pun most definitely intended) to some grubby-looking Fortran COMMON blocks heavily dosed with EQUIVALENCE.

    FunkyStupidClass &funky_stupid_variable = *(reinterpret_cast<FunkyStupidClass *>(&c_name_for_fortran_variable));

    All in the name of creating a syntactic sugar class...

    EDIT: left off a closing parenthesis...

    This was a technique we used to good effect once to knit a C program together with a FORTRAN program, when we were under constraints we weren't particularly happy about being constrained by. While we appreciated the danger in what we were doing, we took great care to hide the business end in as hidden-away an INCLUDE library module as we could muster, and set severe constraints on the edit privileges. Damn thing worked like a dream.

  • (disco) in reply to Steve_The_Cynic
    Steve_The_Cynic:
    Following the letter of what he said but exploding the spirit of it like a blood sausage(1). Well done.

    (1) When I was much younger than I am now, I played Wasteland. I have carried some of its wording with me ever since.

    Blood sausages a.k.a. black puddings, delightfully tasty and nourishing as they are, don't explode anywhere near as flamboyantly as haggises.

    But then you have to catch them when they're young. The traditional technique of haggis-hunting espouses the use of a net, but recently the unscrupulous method of using vacuum-cleaners has started to become popular. Until Scotland becomes independent, unfortunately, the Scottish Parliament is powerless to enact any laws making it illegal to hunt haggis by vacuum.

  • (disco) in reply to Quite

    Those of us who hunt the haggis with the use of a large electromagnet suspended from a drone are OK.

  • (disco) in reply to dkf

    Please don't make me weep with despair.

  • (disco) in reply to Steve_The_Cynic
    Steve_The_Cynic:
    The cast that always amused me was dynamic_cast<void *>(pointer).

    That should do the same as reinterpret_cast<void*>, right? So no UB there…

  • (disco) in reply to asdf
    asdf:
    Steve_The_Cynic:
    The cast that always amused me was dynamic_cast<void *>(pointer).

    That should do the same as reinterpret_cast<void*>, right? So no UB there…

    Not even close to the same, but the good news is that there isn't directly any UB.

    If the original pointer points to an element in a complicated inheritance tree, a dynamic_cast<void *> will give you a pointer to the outermost (most-derived) object without you having to know where you are in the tree and what the most-derived class actually is. It is more or less equivalent to static_cast<void *>(dynamic_cast<MostDerived *>(pointer)).

    (Caveat: that was what it did on C++98. YMMV on C++11.)
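    A compilable sketch of that behaviour (which, for what it's worth, is unchanged in later standards): with multiple polymorphic bases, `dynamic_cast<void *>` recovers the address of the complete object even from a base subobject sitting at a non-zero offset. The class names here are made up:

```cpp
#include <cassert>

struct A { virtual ~A() {} int a = 1; };
struct B { virtual ~B() {} int b = 2; };
struct Derived : A, B { int d = 3; };

// dynamic_cast<void*> on a pointer to a polymorphic object yields the address
// of the most-derived object, wherever the base subobject lives inside it.
static void* outermost(B* p) { return dynamic_cast<void*>(p); }
```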

  • (disco) in reply to Quite
    Quite:
    Steve_The_Cynic:
    Following the letter of what he said but exploding the spirit of it like a blood sausage(1). Well done.

    (1) When I was much younger than I am now, I played Wasteland. I have carried some of its wording with me ever since.

    Blood sausages a.k.a. black puddings, delightfully tasty and nourishing as they are, don't explode anywhere near as flamboyantly as haggises.

    I know what blood sausages are (and not all of them are black puddings, and presumably therefore not usable as weapons by practitioners of Ecky Thump). I'm not convinced the writers of *Wasteland* knew what they are, though, in particular that while they are made from blood (in part), they no longer contain actual blood once they are fully cooked.
  • (disco) in reply to Steve_The_Cynic
    Steve_The_Cynic:
    If the original pointer points to an element in a complicated inheritance tree, a dynamic_cast<void *> will give you a pointer to the outermost (most-derived) object without you having to know where you are in the tree and what the most-derived class actually is.

    Wow, TIL. Never needed that before…

  • (disco)

    Now, malloc is slow. I worked on a project that did lots of work with linked lists of equal-sized objects, and it was much faster to allocate by 10000*sizeof(list_elem_t) instead of doing malloc and free on each little list_elem_t.

    But walking is slow too. I much prefer walking slow to being dragged behind a buggy buggy bugging speedily towards the edge of a precipice to plummet into a sea of buggy bugs.
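    A minimal sketch of the block-allocation idea Lawrence describes, assuming fixed-size elements: one allocation up front, then a free list hands elements out and takes them back for the cost of a couple of pointer moves. All names (`ElemPool`, `list_elem_t`) are illustrative, not from the article:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

struct list_elem_t { int value; list_elem_t* next; };

// One big allocation for the whole pool; alloc/release just push and pop an
// intrusive free list, so the per-element cost is tiny compared with a
// malloc/free round trip.
class ElemPool {
public:
    explicit ElemPool(std::size_t n) : storage_(n), free_(nullptr) {
        for (std::size_t i = n; i-- > 0; ) {  // thread the free list back to front
            storage_[i].next = free_;
            free_ = &storage_[i];
        }
    }
    list_elem_t* alloc() {                    // pop the free list (nullptr when empty)
        list_elem_t* e = free_;
        if (e) free_ = e->next;
        return e;
    }
    void release(list_elem_t* e) {            // push back onto the free list
        e->next = free_;
        free_ = e;
    }
private:
    std::vector<list_elem_t> storage_;        // the single big allocation
    list_elem_t* free_;
};
```

    The same trick also improves locality, since neighbouring list elements tend to sit in the same cache lines.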

  • (disco) in reply to asdf

    Yeah, it's fairly obscure. I see two main uses for it:

    • Do two instances of AbstractBase refer to the same overall object? Sure, you can ask them, if they provide a method to do so, but it's a stricter test than an operator == would provide. Just do the magic cast, and compare the results.

    • Does this small object lie within that larger object accessed through a tiny base object? You need a class-specific operator new for the base class that enlarges the allocation by at least enough to be able to store the size of the object allocated, and then you can play stupid and marginally(1) portable pointer comparison games to see if the small thing lies in that range. I actually did this once, so that a refcounted object could try to provide a list of other refcounted objects that had member variables that were smart pointers to itself.

    (1) It's portable only if you use std::less to do the pointer comparisons, because that is guaranteed to compare pointers in whatever funky architecture-specific way is necessary.
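    The footnote's point in compilable form: the built-in relational operators on pointers into unrelated objects are unspecified, but `std::less` must impose a total order over all pointers, so it can back a "does this address fall inside that allocation?" test. A hedged sketch (`within` is a made-up helper):

```cpp
#include <cassert>
#include <functional>

// std::less<const void*> is required to yield a strict total order consistent
// with the built-in partial order, so a half-open range test on addresses is
// portable where raw <= is not.
static bool within(const void* p, const void* begin, const void* end) {
    std::less<const void*> lt;
    return !lt(p, begin) && lt(p, end);  // begin <= p < end in the total order
}
```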

  • (disco) in reply to Quite
    Quite:
    Damn thing worked like a dream.
    One of those dreams where you wake up suddenly in a cold sweat, right?
  • (disco) in reply to Lawrence
    Lawrence:
    it was much faster to allocate by 10000*sizeof(list_elem_t) instead of doing malloc and free on each little list_elem_t

    Well, obviously. If you need performance, you should make sure you don't (de-)allocate memory all the time. You also need to be able to control the location of your objects in memory.

    Steve_The_Cynic:
    (1) It's portable only if you use std::less to do the pointer comparisons, because that is guaranteed to compare pointers in whatever funky architecture-specific way is necessary.

    Yeah, I remember that. Don't ask me why I needed to sort objects by their location in memory some time ago…

  • (disco) in reply to Lawrence
    Lawrence:
    Now, malloc is slow.

    That's true, and the cost comes from three things:

    1. The memory allocator doesn't sit well with the CPU's data cache.
    2. The memory allocator needs to be thread-aware, so it needs locks and so on. Slow.
    3. The memory allocator may need to make a system call to get more memory pages. Which may in turn do all sorts of things internally. Very slow!

    Per-thread arenas are much cheaper if you can construct them and avoid having to go to the locking stage. But you're still going to have locality of reference issues; you're getting a new space to write into, so it is bound to have to evict something else (since the likelihood that the cache is empty is vanishingly small on any real system running real programs).

    None of which should stop you from using malloc() (or new in C++, which is a fancy wrapper round malloc()) if you need it. Sometimes people lose sight of that.

  • (disco) in reply to dkf

    malloc (or new) is only really "bad" when people get lazy. Don't run a hot loop creating a new int inside each iteration. As much as I enjoy coding in .NET and C#, I think .NET has done too good a job, for people just getting into software development, of hiding what's really going on in the background. I've met too many junior devs with zero concept of memory allocation/management. Even with a memory-managed framework, you should understand heaps and pointers and "help" the GC know when to dispose of stuff.

    Just because your gun has the safety on, doesn't mean you should run through the mall pulling the trigger.

    Managed frameworks are supposed to help prevent accidental memory leaks, not be a crutch for never learning how memory management works.

  • (disco) in reply to tenshino
    tenshino:
    Even with a memory managed framework, you should understand heaps and pointers and "help" the GC know when to dispose stuff.

    From my experience trying to "help" the GC 99.9% of the time is a huge WTF.

    ... I do agree with your general point, which is that it never hurts to be aware of how much memory you're actually using at any given moment.

  • (disco) in reply to blakeyrat
    blakeyrat:
    From my experience trying to "help" the GC 99.9% of the time is a huge WTF.

    I will agree with you, but only to the extent that (in my experience) it's because developers don't understand memory allocation and try to "help" the GC the wrong way. Usually by "forcing" garbage collection, thinking that it will just magically keep the application's memory footprint small.

    It's shocking to me how many developers I've worked with (usually fresh out of school, sometimes even with a "Masters" degree) who have no clue what the difference is between "heap" and "stack" or "class" and "struct". I get that they don't generally teach specific languages in school, but you would expect them to at least touch on basic concepts like memory allocation and error handling.

  • (disco) in reply to tenshino
    tenshino:
    no clue what the difference is between "heap" and "stack"

    :wtf::question: I learned that, and I wasn't even a CS major.

  • (disco) in reply to HardwareGeek

    Many colleges have bad CS programs. I've experienced it first hand and have seen the effects of it on various forums.

  • (disco) in reply to blakeyrat
    blakeyrat:
    From my experience trying to "help" the GC 99.9% of the time is a huge WTF.

    That's because as soon as you use GC and a virtual machine, you pretty much give up control over memory (de-)allocations and controlling the memory location of your data structures. Which is a good thing if you just want to quickly get stuff done, but will be a huge pain in the ass if you need to optimize for performance.

    There's a reason sun.misc.Unsafe is so popular.

  • (disco) in reply to tenshino

    IME that can be a bad curriculum, a bad school, or a bad student who managed to muddle through a school good enough that the student should have learnt things but not good enough to keep the bad students from getting the diploma. Or the guy may not have the degree at all, of course.

  • (disco) in reply to LB_
    LB_:
    Many colleges have bad CS programs.

    And some are not bad, but focus very much on the theoretical to the severe detriment of the practical. I don't know if it's true, but I remember being told once that it was possible to get a CS degree from Berkeley without ever writing an actual program.

  • (disco) in reply to Lawrence
    Lawrence:
    IME that can be a bad curriculum, a bad school, or a bad student who managed to muddle through a school good enough that the student should have learnt things but not good enough to keep the bad students from getting the diploma. Or the guy may not have the degree at all, of course.

    Don't rely on the diploma to short-cut the interviewing process. The school's criteria for passing the student are not your criteria for hiring them. And I say this despite working at a university; some graduates are just horribly wet behind the ears and need a few years of life to knock some sense in before they're worth putting on the payroll.

  • (disco) in reply to dkf

    Since interview coding questions are apparently frowned upon by HR, my solution for hiring fresh-out-of-school talent is to take not-yet-graduated students as interns (in France, most or all five-year engineering degrees mandate several months of internship). If the guy is good, hire him (or her, obviously). (Who could have been certain, on the basis of a mere job interview, that the gal with a master's in biology who - not finding work - did a one-year class in computing would be a natural at Pig queries?) This is even more important in France than in the US, because after hiring someone you get three months to say "sorry, you don't fit after all"; after that you're basically stuck with the guy unless he does something really wrong. On my side-line as TA for master's students, I can only wonder how some students got there and what they are going to be able to do once they get their diploma...

  • (disco) in reply to Lawrence
    Lawrence:
    Since interview coding questions are apparently frowned upon by HR, my solution for hiring fresh-out-of-school talent is to take not-yet-graduated students as interns (in France, most or all five-year engineering degrees mandate several months of internship). If the guy is good, hire him (or her, obviously). (Who could have been certain, on the basis of a mere job interview, that the gal with a master's in biology who - not finding work - did a one-year class in computing would be a natural at Pig queries?) This is even more important in France than in the US, because after hiring someone you get three months to say "sorry, you don't fit after all"; after that you're basically stuck with the guy unless he does something really wrong. On my side-line as TA for master's students, I can only wonder how some students got there and what they are going to be able to do once they get their diploma...

    Most places I've worked have a general policy that the initial period of employment (usually the first three months or so) is probationary. In fact scrub that, make that all the places I've worked. Then you both get the opportunity to try each other out to see whether you are as good a fit as you anticipated from the interview process. This proby period has proved extremely useful to both parties on two specific occasions where I started a period of employment that both I and the employer had second thoughts about.

  • (disco) in reply to Fox
    Fox:
    Why would this ever be anything even approaching a reasonable idea?

    You ever heard of a C union?
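    For the record, a minimal sketch of the sanctioned version of pointing `big_int_buf` and `big_double_buf` at the same allocation: a union gives two names to one region of storage. Reading back the member you last wrote is well-defined in both C and C++; writing one member and reading the other (classic type punning) is only blessed by C. The name `Buf` is illustrative:

```cpp
#include <cassert>

// One region of storage, two typed views of it. The union is as big as its
// largest member, and at any moment only the last-written member is "live".
union Buf {
    int    as_int[4];
    double as_double[2];
};
```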

  • (disco) in reply to FrostCat
    FrostCat:
    You ever heard of a C union?

    Does it involve a C civil ceremony?

  • (disco) in reply to dkf

    And is it followed by a C reception?

  • (disco) in reply to RaceProUK

    Will the reception have C food?

  • (disco) in reply to dkf
    dkf:
    Does it involve a C civil ceremony?

    No, that's UB.

  • (disco) in reply to HardwareGeek
    HardwareGeek:
    Will the reception have C food?
    Go fish ;)

Leave a comment on “High Performance Memory Allocation”
