• Luiz Felipe (unregistered) in reply to nah
    nah:
    MarkJ:
    I once saw a legitimate use for preallocated objects in C#. The garbage collector was too slow for the near real time application, so the developer created "pools" of objects and maintained free lists for them. Of course, he actually used these objects over and over rather than having them sit there in the heap, moldering...

    I'd argue that if you're coding realtime or near-realtime applications, then C# might not be the best language choice.

    Nor any machine that has more than one core. Even if the OS is Linux, which has some kind of real-time scheduler.

  • Luiz Felipe (unregistered) in reply to Anonymous Cow-Herd
    Anonymous Cow-Herd:
    Niels Esge Rasmussen:
    C-Octothorpe:
    I like how his cloning logic doesn't actually, well, you know, CLONE!!!

    Here, let me fix the code sample:

    public static IClonable SuperDuperSafeCloneIClonable(IClonable a)
    { return a; }

    FTFY

    You forgot to serialize the reference to XML, write it out to a configuration file, print out the file, put the printout on a wooden table, take a photograph, print the photograph, scan the printed photograph, email the scan to the operator, OCR the XML data from it and deserialize it before returning it. Call yourselves "enterprise"?

    Brilliant. Whenever the enterprises think that we are a cost center, we need to do this kind of thing.
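Underneath the wooden-table pileup there is a real technique being parodied: deep-cloning by serializing and deserializing, which (unlike the SuperDuperSafeCloneIClonable joke above) actually produces an independent copy. A minimal Java sketch; the thread's code is C#, and SerialCloner is a hypothetical helper, not anything from the article:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper: a real deep clone via a serialization round-trip,
// minus the XML, the printer, and the camera.
public class SerialCloner {
    @SuppressWarnings("unchecked")
    public static <T extends Serializable> T deepClone(T obj) {
        try {
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
                out.writeObject(obj);            // serialize the whole object graph
            }
            try (ObjectInputStream in = new ObjectInputStream(
                    new ByteArrayInputStream(bytes.toByteArray()))) {
                return (T) in.readObject();      // deserialize an independent copy
            }
        } catch (IOException | ClassNotFoundException e) {
            throw new IllegalStateException("clone via serialization failed", e);
        }
    }

    public static void main(String[] args) {
        ArrayList<String> original = new ArrayList<>(List.of("a", "b"));
        ArrayList<String> copy = deepClone(original);
        original.add("c");                       // mutating the original...
        System.out.println(copy);                // ...prints [a, b]: the copy is untouched
    }
}
```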

  • (cs) in reply to C-Octothorpe
    C-Octothorpe:
    aptent:
    TheCPUWizard:
    1) NEVER call GC.Collect from "real" code. For any company/project/team I am involved with, that is an immediately firable offense.
    2) Do not think you really know what the GC is doing (unless you really are an expert). In fact, from a purist perspective there is no Garbage Collector.

    Hint: Temporary objects do NOT cause issues; they prevent them. The overhead of GC is based much more on the complexity of the LIVE object graph. The frequency of GC is dependent on the total allocation rate...

    So, would you rather have 1 GC that runs for 200 ms, or 100 GCs that each run for 25 µs?

    How much do you pay for rent in your ivory tower?
    The blood of one junior developer... at the 1st of every month.

    1. No "ivory tower" here; I am probably in the bloodiest trenches more than most.

    2. "Junior Developers" do not appease the powers that be... something "more" is needed.

    Seriously, I stand by all that was in the original (quoted) posting. Specifically, there have been three major engagements that my firm completed in the past 2 years where the root cause of the client's performance problems was that they had indeed reduced the number of GC cycles, but had made them very expensive. With a complete re-design, keeping object lifetimes as short as possible (and increasing the number of temporary objects per second by over three orders of magnitude, with a corresponding increase in the number of collections), performance went up significantly, and the % time spent in GC went down by over a factor of 10.
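TheCPUWizard's claim rests on generational collection: objects that die young are reclaimed in bulk, so a flood of short-lived temporaries is cheap, while a large live object graph is what actually costs. A hedged illustration in Java (the thread is about .NET, whose collector is also generational; the collection count reported will vary by VM and is shown only for curiosity):

```java
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;

// Illustration: millions of throwaway objects, each dead by the next
// iteration, are exactly the load a generational GC handles cheaply.
public class TempObjects {
    static long churn(int n) {
        long sum = 0;
        for (int i = 0; i < n; i++) {
            // This array is garbage immediately after use; young-generation
            // collections reclaim such objects almost for free.
            int[] temp = new int[]{i, i + 1};
            sum += temp[0] + temp[1];
        }
        return sum;
    }

    public static void main(String[] args) {
        long sum = churn(1_000_000);
        long collections = 0;
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            collections += Math.max(0, gc.getCollectionCount());
        }
        // The collection count is VM-dependent; the point is that the program
        // finishes quickly despite the allocation flood.
        System.out.println("sum=" + sum + " collections=" + collections);
    }
}
```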

  • (cs) in reply to Luiz Felipe
    Luiz Felipe:
    nah:
    MarkJ:
    I once saw a legitimate use for preallocated objects in C#. The garbage collector was too slow for the near real time application, so the developer created "pools" of objects and maintained free lists for them. Of course, he actually used these objects over and over rather than having them sit there in the heap, moldering...

    I'd argue that if you're coding realtime or near-realtime applications, then C# might not be the best language choice.

    Nor any machine that has more than one core. Even if the OS is Linux, which has some kind of real-time scheduler.

    I do near real time applications in C# all of the time. There has not been a single case where the performance of .NET made the difference between being able to achieve the goals (or not) compared to native C++. In every case, it has been the underlying operating system, which is not designed for near-real-time applications (unless you start getting into kernel-mode drivers, but then technically they are not applications).

  • (cs) in reply to Lone Marauder
    Lone Marauder:
    z00n3$!$:
    Just thought of a name for a song/story/book/movie/whathaveyou:

    "Cum-sealed envelope"

    Had to share.

    You are a strange, sad little man.

    You never know, it might be a woman, or a bothie, or a neither.

  • madarchod (unregistered) in reply to F
    F:
    TheCPUWizard:
    ...
    1. Do not think you relly know what the GC is doing (unless you really are an expert). In fact from a purist perspective there is no Grbage Collector. ...
    Very true. In fact, from *any* perspective there is no such thing as a Grbage Collector.
    Then who is picking up my Grbage each week?

    Anyway, what I usually do is I create all the windows, dialog boxes, and such ahead of time, and just move the one I want to use to the top and make it modal. It may look a little messy to a beginner, but if you try it, I'm sure you'll see the benefits.

    Also, I love you, Nagesh.
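madarchod's pre-created-dialogs scheme can be sketched without any real UI toolkit. Dialog here is a stand-in class (not java.awt.Dialog), and "show" just raises one of the objects built at startup, so nothing is allocated at display time:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the approach described above: build every dialog once, up
// front, then bring the one you want to the top instead of constructing
// dialogs on demand. Dialog is an illustrative stand-in class.
public class DialogPool {
    static class Dialog {
        final String name;
        boolean visible = false;
        Dialog(String name) { this.name = name; }
    }

    private final Map<String, Dialog> prebuilt = new HashMap<>();

    DialogPool(String... names) {
        for (String n : names) {
            prebuilt.put(n, new Dialog(n));   // allocate everything at startup
        }
    }

    Dialog show(String name) {
        for (Dialog other : prebuilt.values()) {
            other.visible = false;            // hide whatever was on top
        }
        Dialog d = prebuilt.get(name);        // no allocation at show time
        d.visible = true;                     // "move it to the top, make it modal"
        return d;
    }

    public static void main(String[] args) {
        DialogPool pool = new DialogPool("error", "about", "settings");
        System.out.println(pool.show("error").name);   // prints error
    }
}
```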

  • (cs) in reply to TheCPUWizard
    TheCPUWizard:
    Luiz Felipe:
    nah:
    MarkJ:
    I once saw a legitimate use for preallocated objects in C#. The garbage collector was too slow for the near real time application, so the developer created "pools" of objects and maintained free lists for them. Of course, he actually used these objects over and over rather than having them sit there in the heap, moldering...

    I'd argue that if you're coding realtime or near-realtime applications, then C# might not be the best language choice.

    The underlying OS was Windoze XP. And, no, the developer didn't choose it - it's what the customer wanted.

    Nor any machine that has more than one core. Even if the OS is Linux, which has some kind of real-time scheduler.

    I do near real time applications in C# all of the time. There has not been a single case where the performance of .NET made the difference between being able to achieve the goals (or not) compared to native C++. In every case, it has been the underlying operating system, which is not designed for near-real-time applications (unless you start getting into kernel-mode drivers, but then technically they are not applications).

    The underlying OS was Windows XP. No, the developer didn't choose it; the customer wanted it. As ugly as this approach appears, it does work and seems to be reliable; at least the customer hasn't complained.
  • Luiz Felipe (unregistered) in reply to MarkJ
    MarkJ:
    TheCPUWizard:
    Luiz Felipe:
    nah:
    MarkJ:
    I once saw a legitimate use for preallocated objects in C#. The garbage collector was too slow for the near real time application, so the developer created "pools" of objects and maintained free lists for them. Of course, he actually used these objects over and over rather than having them sit there in the heap, moldering...

    I'd argue that if you're coding realtime or near-realtime applications, then C# might not be the best language choice.

    The underlying OS was Windoze XP. And, no, the developer didn't choose it - it's what the customer wanted.

    Nor any machine that has more than one core. Even if the OS is Linux, which has some kind of real-time scheduler.

    I do near real time applications in C# all of the time. There has not been a single case where the performance of .NET made the difference between being able to achieve the goals (or not) compared to native C++. In every case, it has been the underlying operating system, which is not designed for near-real-time applications (unless you start getting into kernel-mode drivers, but then technically they are not applications).

    The underlying OS was Windows XP. No, the developer didn't choose it; the customer wanted it. As ugly as this approach appears, it does work and seems to be reliable; at least the customer hasn't complained.

    I was talking about the fact that today's computers aren't predictable machines anymore. True real time is impossible; only near real time is, and with an error margin, which can be very small and for some things doesn't matter.

    Very cool that it is possible to make things in C# with near real-time performance. I will give it a try.

    Sorry, I am a very poor English writer, and in my primary language also (I think I am poor at it). Whatever; I am a coder, not a writer.

  • AnotherAnon (unregistered) in reply to MarkJ

    I'm going to have to second this (no +1, sorry), but will ignore modern pools, caches, etc... "Back-in-the-day", when programming games to strict system requirements, I pre-allocated so nothing could screw you after loading.

    But then I was young and may have suffered from the ignorance that comes with it.

  • AnotherAnon (unregistered) in reply to AnotherAnon
    AnotherAnon:
    I'm going to have to second this (no +1, sorry), but will ignore modern pools, caches, etc... "Back-in-the-day", when programming games to strict system requirements, I pre-allocated so nothing could screw you after loading.

    But then I was young and may have suffered from the ignorance that comes with it.

    Okay, the "reply-no-quote" semantics weren't expected... Just responding to someone defending certain cases of pre-allocation.
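The pre-allocation being defended here, and the "pools of objects with free lists" from the quote that started the thread, reduce to the same pattern. A minimal sketch (ParticlePool and its names are illustrative, not the developer's actual code):

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Minimal object pool with a free list: allocate a fixed number of objects
// up front, hand them out, and reuse them rather than churning the GC.
public class ParticlePool {
    static class Particle { double x, y; }

    private final Deque<Particle> free = new ArrayDeque<>();

    ParticlePool(int capacity) {
        for (int i = 0; i < capacity; i++) {
            free.push(new Particle());      // pre-allocate at load time
        }
    }

    Particle acquire() {
        Particle p = free.poll();           // reuse from the free list
        if (p == null) {
            throw new IllegalStateException("pool exhausted");
        }
        return p;
    }

    void release(Particle p) {
        p.x = 0;
        p.y = 0;                            // reset state before reuse
        free.push(p);
    }

    int available() { return free.size(); }

    public static void main(String[] args) {
        ParticlePool pool = new ParticlePool(2);
        Particle a = pool.acquire();
        pool.release(a);
        Particle b = pool.acquire();
        System.out.println(a == b);          // prints true: the same instance comes back
    }
}
```

The appeal for games on strict memory budgets is that after loading, allocation can never fail: every acquire is a pointer pop, not a heap request.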

  • (cs) in reply to Roman
    Roman:
    Heh.

    P.S. No, my comment is not spam!

    yes it is

  • (cs) in reply to Art
    Art:
    It really annoys me to see code where the programmer obviously thought "OK, something might go horribly wrong here, but if it does, I'll just shrug it off and muddle on."

    Instead of looping 10 times and then failing, it should have been an infinite loop with sleep calls so that the program will simply pause until the required memory is available and then proceed to complete its assigned task.

    hey topcoder long time...

  • (cs) in reply to atk
    atk:
    Art:
    It really annoys me to see code where the programmer obviously thought "OK, something might go horribly wrong here, but if it does, I'll just shrug it off and muddle on."

    Instead of looping 10 times and then failing, it should have been an infinite loop with sleep calls so that the program will simply pause until the required memory is available and then proceed to complete its assigned task.

    I agree that sleeps are better, but I do not agree about an infinite loop. If the system is unrecoverably out of memory, an infinite loop would look like a hang rather than providing opportunity to gracefully handle the failure. Graceful recovery would then require a finite number of tries. Perhaps a configurable timeout would have been useful ("after M minutes, we're really out of memory, so just give up").

    And 10 may simply have been good enough to recover the application in the environments where the problem exhibited itself. Not all problems are fully understood when they're patched - sometimes a tweak here or there hides a deeper, extremely difficult to diagnose problem. And if the product does what it's supposed to do, under the conditions in which it is used, is that not good enough until the problem is better understood? Or would you prefer a production-down situation to an ugly, temporary workaround?

    In such cases, it would be good to insert a comment, "Regarding the magic number 10: it just seems to work, and I don't have any better idea how to do this, right now." If the code reviewers don't have a better suggestion, and the component is not critical enough for an architect to look at, then it may well be good enough.

    die!!!!!!
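atk's suggestion above, a finite number of tries with sleeps between them and a loud failure at the end, can be sketched like this (Retry and its parameter names are illustrative, not from the article's code):

```java
import java.util.function.Supplier;

// Bounded retry with a pause between attempts: after maxAttempts, fail
// loudly instead of looping forever and looking like a hang.
public class Retry {
    static <T> T withRetries(Supplier<T> action, int maxAttempts, long sleepMillis) {
        RuntimeException last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return action.get();
            } catch (RuntimeException e) {
                last = e;                               // remember the latest failure
                if (attempt < maxAttempts) {
                    try {
                        Thread.sleep(sleepMillis);      // wait before the next try
                    } catch (InterruptedException ie) {
                        Thread.currentThread().interrupt();
                        break;                          // stop retrying if interrupted
                    }
                }
            }
        }
        throw new IllegalStateException("gave up after " + maxAttempts + " attempts", last);
    }

    public static void main(String[] args) {
        int[] calls = {0};
        String result = withRetries(() -> {
            if (++calls[0] < 3) throw new RuntimeException("not yet");
            return "ok";                                // succeeds on the third try
        }, 10, 1);
        System.out.println(result + " after " + calls[0] + " attempts");
    }
}
```

A configurable maxAttempts and sleep give exactly the "after M minutes, we're really out of memory, so just give up" timeout atk asks for.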

  • Dirle (unregistered)

    ...and if it would still fail, instead of throwing an exception about not being able to clone the object, it would return the same object that was passed to it. The caller might have the surprise of his/her life.

  • (cs) in reply to TheCPUWizard

    [quote user="TheCPUWizard"]Hint: Temporary objects do NOT cause issues; they prevent them. The overhead of GC is based much more on the complexity of the LIVE object graph.[/quote]This.

  • Nagesh (unregistered) in reply to Luiz Felipe
    Luiz Felipe:
    Sorry, i am very poor english writer and in my primary lang also (i think i am poor at it). whatever, i am a coder, not an writer.
    Be doing what I persist to going to Google translation.
  • Disgruntled Former Irishman (unregistered)

    Why no update after this p***-poor article in 3 days? The least Alex could do is release some obviously-photoshopped Error'd images!

  • causa (unregistered) in reply to TheCPUWizard
    TheCPUWizard:
    C-Octothorpe:
    aptent:
    TheCPUWizard:
    1) NEVER call GC.Collect from "real" code. For any company/project/team I am involved with, that is an immediately firable offense.
    2) Do not think you really know what the GC is doing (unless you really are an expert). In fact, from a purist perspective there is no Garbage Collector.

    Hint: Temporary objects do NOT cause issues; they prevent them. The overhead of GC is based much more on the complexity of the LIVE object graph. The frequency of GC is dependent on the total allocation rate...

    So, would you rather have 1 GC that runs for 200 ms, or 100 GCs that each run for 25 µs?

    How much do you pay for rent in your ivory tower?
    The blood of one junior developer... at the 1st of every month.

    1. No "ivory tower" here; I am probably in the bloodiest trenches more than most.

    2. "Junior Developers" do not appease the powers that be... something "more" is needed.

    Seriously, I stand by all that was in the original (quoted) posting. Specifically, there have been three major engagements that my firm completed in the past 2 years where the root cause of the client's performance problems was that they had indeed reduced the number of GC cycles, but had made them very expensive. With a complete re-design, keeping object lifetimes as short as possible (and increasing the number of temporary objects per second by over three orders of magnitude, with a corresponding increase in the number of collections), performance went up significantly, and the % time spent in GC went down by over a factor of 10.

    In terms of point 1, please defend your stance of "never garbage collect and pretend the garbage collector doesn't exist". Assuming you are not a college student, in which case your comment makes perfect sense.

  • (cs) in reply to Nagesh
    Nagesh - faker:
    Luiz Felipe:
    Sorry, i am very poor english writer and in my primary lang also (i think i am poor at it). whatever, i am a coder, not an writer.
    Be doing what I persist to going to Google translation.

    Post using diff name, idiot!

  • radarbob (unregistered) in reply to STAR WARS FAN!!!!!!!!!!!!

    Hmm... the 11th time I watched that movie Chewbacca didn't escape.

    I know that's lame, but I had to make some comment on the best WTF comment I've read in a very, very long time.

  • z00n3$!$ (unregistered) in reply to radarbob
    radarbob:
    Hmm... the 11th time I watched that movie Chewbacca didn't escape.

    I know that's lame, but I had to make some comment on the best WTF comment I've read in a very, very long time.

    So he got out of the trash compactor the first 10 times, but the eleventh time, he was crushed? Did you see it? What colour were his guts? Did he squeeze out one last Number 3 before he died?

  • kmeisthax (unregistered) in reply to BadCode

    So, he created a really, really shitty memory pool allocator implementation. For UI controls.

  • may be not :) (unregistered) in reply to andytech

    I was curious why one would do that (what is the connection between System.gc and f.delete); then, with some Google searching, I came across this thread:

    http://stackoverflow.com/questions/2128537/how-to-diagnose-file-delete-returning-false-find-unclosed-streams

    Maybe the developer saw System.gc as a fallback in case the cleanup of file handles was missed in a place or two.
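That Stack Overflow thread is about unclosed streams keeping a file handle alive, so File.delete() fails until a finalizer runs, which is why System.gc() appears to "help". The deterministic fix is to close the stream, e.g. with try-with-resources (a sketch, not the code from the article):

```java
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import java.io.UncheckedIOException;

// Write to a temp file and delete it. Because try-with-resources closes the
// stream deterministically, no System.gc() is needed to release the handle
// before delete() is called.
public class DeleteDemo {
    static boolean demo() {
        try {
            File f = File.createTempFile("demo", ".txt");
            try (FileWriter w = new FileWriter(f)) {
                w.write("temporary data");
            }                                   // handle released right here
            return f.delete();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println("deleted: " + demo());   // prints "deleted: true"
    }
}
```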

  • (cs) in reply to TheCPUWizard
    TheCPUWizard:
    Seriously, I stand by all that was in the original (quoted) posting.
    Including all of your needless sloppiness at the keyboard? (See below for examples and corrections.)
    2) Do not think you really know what the GC is doing (unless you really are an expert). In fact, from a purist perspective there is no Garbage Collector.

    Hint: Temporary Objects do NOT cause issues, they prevent them. The overhead of GC is based much more on the complexity of the LIVE object graph. The frequency of GC is dependent on the total allocation rate... ... 2) "Junior Developers" do not appease the powers that be...something "more" is needed. ... Specifically, there have been three major engagements that my firm completed in the past 2 years, where the root cause of the client's performance problems was that they had indeed reduced the number of GC cycles, but had made them very expensive. With a complete re-design, keeping object lifetimes as short as possible (and increasing the nuimber-->number of temporary objects per second by over three orders of magnitude, with a corresponding increase in the number of collections, performance went up singlificantly-->significantly, and the % time spent in GC when-->went down by over a factor of 10.

    FTFY

  • percular (unregistered) in reply to Jibble
    Jibble:
    Um, doesn't the garbage collector call itself when there's no memory left?

    I mean ... isn't that what they're for?

    I wish it did that. It doesn't, even with recent .NET implementations. OutOfMemoryException gets thrown even if there is tons of allocated memory able to be garbage collected.

  • (cs) in reply to Luiz Felipe
    Luiz Felipe:
    l.ForEach((p) => p.Kill());
    
    What language is this supposed to be?
  • (cs) in reply to frits
    frits:
    Also, can you not find Uncle Remus books anymore?
    You can find this though - have you seen us, Uncle Remus?
  • (cs) in reply to method1
    method1:
    frits:
    Also, can you not find Uncle Remus books anymore?
    You can find this though - have you seen us, Uncle Remus?
    +1
  • Never too late (unregistered) in reply to method1
    method1:
    Luiz Felipe:
    l.ForEach((p) => p.Kill());
    
    What language is this supposed to be?

    C#

    [This is not a spam]

  • Fil (unregistered) in reply to snoofle

    Doesn't make any sense. Might as well create the dialog at startup, since you are using that amount of memory anyway.

  • Fil (unregistered) in reply to snoofle
    snoofle:
    I did something vaguely like that a very long time ago, but not in a loop. Our system needed to slurp up pretty much all the memory our (then) modest Unix box could muster. Depending upon what you were doing, the right combination of inputs could cause enough stuff to be loaded into cache that we'd get an out of memory error.

    Bigger boxes were ordered, but because of budgets, weren't going to arrive for a while.

    The requirement was that when it happened, we were to pop a dialog saying it was out of memory and that the system would exit, so the user wouldn't just sit there with a hung system. But how do you get the memory for a dialog if you're out of memory?

    Simple, pre-allocate an array to reserve enough bytes that, when freed, would allow the dialog to be created.

    Doesn't make any sense, since you are allocating that amount of memory anyway. You may as well create the dialog at startup.
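For the record, snoofle's trick is a known pattern, sometimes called a memory parachute: the reserve is not wasted, because it is released at the moment of failure, making room to build the out-of-memory dialog. A safe-to-run Java sketch; the doomed allocation asks for an absurdly large array, so it fails immediately without actually exhausting memory:

```java
// Sketch of the pre-allocated rescue buffer ("memory parachute"): keep a
// reserve block alive, and when allocation fails, drop it so the exit
// dialog itself has room to be created.
public class Parachute {
    private static byte[] reserve = new byte[1 << 20]; // 1 MB held in reserve

    static String tryBigAllocation() {
        try {
            long[] huge = new long[Integer.MAX_VALUE];  // request is guaranteed to fail
            return "allocated " + huge.length;
        } catch (OutOfMemoryError e) {
            reserve = null;        // release the parachute: memory for the dialog
            return "out of memory: showing exit dialog";
        }
    }

    public static void main(String[] args) {
        System.out.println(tryBigAllocation());
    }
}
```

This is why creating the dialog at startup is not quite equivalent: a dialog built up front still needs memory later for event handling and painting, whereas a raw byte array can be dropped all at once exactly when that memory is needed.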

  • Ditto (unregistered) in reply to Coyne
    Coyne:
    Dr Evil: (evil laugh) Clones!

    Dr Evil: (even more evil laugh) Ve must haf more clones!

    Dr Evil: Arrrrrrrrgggggghhh!!!!!! Out of memory??!! Vhy must my evil plans always be foiled???!!

    His plans weren't foiled .. he just realized that since he doesn't have enough memory to make a full sized clone, he had to reduce the size of the clone .. ;)

  • What do you expect from a Agile company (unregistered)

    The real WTF is running OO Memory in the first place!

  • (cs) in reply to Luiz Felipe
    Luiz Felipe:
    Sorry, i am very poor english writer and in my primary lang also (i think i am poor at it). whatever, i am a coder, not an writer.
    We've never had a problem understanding you - you're pretty much perfect. I could never tell you weren't native.
  • Mikey (unregistered) in reply to atk

    Who knows, it might have been a magic number... doubled... with padding... maybe 4 worked and he upped it to 10 to make really sure.

    At any rate, looping at all for this kind of thing is a performance issue, the correct behavior would be to let the UI (or logging mechanism, etc...) catch the OOM exception, inform the user and gracefully close down the app.

  • percular (unregistered) in reply to Mikey

    The implementation (and placement) is screwy, but there are cases where this sort of thing is warranted. It's far easier to run out of memory in C# than in C or C++, since objects persist for a while after you're done with them and you can't allocate instances of classes on the stack, so it's possible to outpace the garbage collector by simply performing some calculations in a loop, even if you don't allocate any objects that aren't immediately discarded. In such cases, if you realize you're outpacing the GC, then forcing it to collect at certain points in the code can transform an always-runs-out-of-memory program into a never-runs-out-of-memory program, and depending on the complexity of the program, that can be the easiest (I won't claim it's the best) way to get it working.
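percular's "collect at certain points" idea looks roughly like this (Java is used for consistency with the other sketches; in .NET the call would be GC.Collect(), while System.gc() here is only a hint to the VM, and the thread's own consensus is that this is a last resort):

```java
// Sketch of explicit collection points inside an allocation-heavy loop:
// every N iterations, ask for a collection so temporaries cannot pile up
// faster than the collector retires them.
public class CollectPoints {
    static long process(int iterations) {
        long checksum = 0;
        for (int i = 0; i < iterations; i++) {
            String scratch = "row-" + i;        // temporary garbage each pass
            checksum += scratch.length();
            if (i % 10_000 == 9_999) {
                System.gc();                     // explicit collection point (a hint in Java)
            }
        }
        return checksum;
    }

    public static void main(String[] args) {
        System.out.println(process(20_000));
    }
}
```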

  • Beat (unregistered)

    Uh, I'm currently suffering from a LOGIC DOES NOT COMPUTE exception; my head hurts :(

  • Poqwe (unregistered) in reply to Jon E.

    LOL :D

Leave a comment on “Clone Collector”
