• (nodebb)

    Somebody thinks everything is passed by reference, including function calls?

  • Michael R (unregistered)

    ""

  • some guy (unregistered)

    They looked into the abyss (i.e. empty string) and decided: I better sanitize that.

  • Bogolese (unregistered)

    Talk about insanity . . .

  • (nodebb)

    The only rational way that happens is a series of global (or at least large span) search & replaces that caught a lot more fish than the dev intended. Which changes were then never examined. And which were then checked into their test-free production environment.

  • (nodebb)

    LOC

  • (nodebb)

    Better to sanitize it and not need it than need it and not sanitize it, amirite?

  • (nodebb)

    I regret that I cannot upvote your comment, Rick.

    My father was a classic program manager from the '60s and '70s. When we discussed my career, he would invariably ask how many lines of code my application had, because his mind could not grasp any other metric for measuring a programming task. And since it was the only metric, it must be valid.

  • no (unregistered) in reply to n9ds

    defensive programming!

  • (nodebb) in reply to Michael R

    ""

    Well played, Sir. Very well played!

  • (nodebb) in reply to Michael R

    How dare you. You clean that up right now.

  • (nodebb)

    This one is more complicated than it seems to be.

    So up until .NET Framework 2.0, pretty much all strings were interned, but that caused issues because the number of developers who never read the manual was high even back then, and devs did funny things like concatenations inside loops.

    However, .NET Framework 2.0 was different in that only constant member literals got interned. So basically two empty string literals in a local scope were different instances, which made things obviously worse, especially if you had to write an efficient parser, and so MS changed the behavior multiple times.

    Now with .NET Core everything changed on both the runtime and the compiler level; while the rule of thumb is that literals get interned, it's not 100% true (I think UTF-8 literals don't count, at least they did not when they were introduced).

    But let's be frank: if you rely on distinct references for read-only objects containing the same value, then there is something seriously wrong with your thought process. That is true for every language that ever existed, and I think it will be for the rest of the time humanity exists. So about a decade or so, if we don't find a solution to get rid of those idiots in power in some countries :-).
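
    To make the interning point above concrete, here is a minimal C# sketch (illustrative only - the class and variable names are made up, and the commented results reflect typical behavior on current compilers and runtimes rather than a language guarantee):

        using System;

        class InterningDemo
        {
            static void Main()
            {
                string a = "abc";                               // literal, normally interned
                string b = "abc";                               // same literal, same interned instance
                string c = new string(new[] { 'a', 'b', 'c' }); // built at runtime, not interned

                Console.WriteLine(ReferenceEquals(a, b));       // True on current compilers/runtimes
                Console.WriteLine(ReferenceEquals(a, c));       // False: different instance, same value
                Console.WriteLine(a == c);                      // True: value equality is what to rely on
                Console.WriteLine(ReferenceEquals(a, string.Intern(c))); // True: explicit interning
            }
        }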

  • (nodebb) in reply to kilroo

    How dare you. You clean that up right now.

  • Loren Pechtel (unregistered)

    Evolution. Sanitized, then later eliminated without proper cleanup.

    What's the problem, the result clearly is sanitized!

  • (nodebb)

    They may have been believers in looped string theory...

  • (nodebb)

    This feels like an old-school C programmer desperately trying to make sure the "text" variable's resources were being released, who had no idea what "sanitizing" means in the context of strings but had seen it done numerous times elsewhere in the codebase.

  • (nodebb) in reply to Michael R

    @MaxiTB: Most .NET devs need exactly zero knowledge of these detailed levels of compilers and runtimes. Write code that is correct in terms of the business needs and move on to the next business thing.

  • (nodebb) in reply to WTFGuy

    What kind of testing is going to pick up the addition of pointless code that has no visible effect? Sufficiently advanced static analysis might pick up something (variable assigned a value that is never used) though...

  • (nodebb) in reply to WTFGuy

    Totally agree.

    Another example would be how string literals are actually stored in the assembly. While const members will only be stored once per value, local literals will always get their own entry in the string table. That's the reason why the old rule of always using string.Empty over "" was so important: not only because of the additional space requirement, but also because those literals always get interned, which is also a noticeable performance hit, especially in large applications with large string tables. However, with .NET 5 this changed at the compiler level and now it doesn't matter - at least for empty strings; it still matters for all other strings.

    So, long story short: if you don't know what you are doing, optimizations don't matter. Following best practices, which also change constantly, is way more important. 99.99% of app developers will never need to know details like "oh, I need to const this shared non-empty literal or I potentially waste 10 ns on a hash lookup". If you write a source generator, sure; if you write a real-time app, sure; if you write a low-level SDK, sure - but most devs write code that is not that performance critical.
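
    For what the "" versus string.Empty debate boils down to on a modern runtime, a quick illustrative check (the commented results are what recent .NET versions show; older Framework builds could behave differently):

        using System;

        class EmptyStringCheck
        {
            static void Main()
            {
                // On recent .NET versions both expressions resolve to the same
                // interned empty-string instance, so the old "" vs. string.Empty
                // rule no longer buys you anything.
                Console.WriteLine(ReferenceEquals("", string.Empty)); // True on current runtimes

                // Either way, test for emptiness by value, not by reference:
                string text = "";
                Console.WriteLine(text.Length == 0);           // True
                Console.WriteLine(string.IsNullOrEmpty(text)); // True
            }
        }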
