• Irony Lost (unregistered) in reply to chubertdev

    It's like you don't even realize that he said exactly the same thing about your original post.

  • (cs) in reply to Irony Lost
    Irony Lost:
    It's like you don't even realize that he said exactly the same thing about your original post.

    "chubertdev" realized it a long time ago, but he still like to troll for heck of it.

  • andyf (unregistered) in reply to np
    np:
    Beware of people that repeatedly refuse help but are obviously stuck. They likely just don't want to be caught in the mess they put themselves in.

    Best to explain to them that they are costing the company money even if they work best alone. If they aren't getting work done, then they are just sucking money from the company.

    I love this idea.

    Captcha: tristique, for when a bistique just isn't enough.

  • Thomas (unregistered) in reply to Subversion weenie
    Subversion weenie:
    Can someone explain how adding a file to a zip causes a delta "nearly as large as the archive itself"? Adding more files doesn't change how pre-existing files are compressed, and any half-decent binary diff algorithm should be able to take advantage of that.
    He wasn't just adding a file, but also renaming his old versions. If he renamed old foo.c to foo.1.c, old foo.1.c to foo.2.c, and so on, then when he zipped it up the whole archive would probably be different. I'm thinking that might increase the size of the diffs.

    And if that's not enough, what if the story munged things and it was actually a tarball not a zip, allowing the new file to affect the compression of the old ones?
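    For what it's worth, here's a rough way to see the plain-append case the quote asks about (a sketch only -- Python's zipfile standing in for zip's append mode, with made-up file names and sizes). Appending a member leaves the existing entries byte-for-byte intact as a prefix; it's the renaming and recompressing that would make the whole archive differ.

        import os
        import zipfile

        # Stand-in for the existing archive of source files.
        with zipfile.ZipFile("backup.zip", "w", zipfile.ZIP_DEFLATED) as z:
            z.writestr("foo.c", os.urandom(200_000))

        before = open("backup.zip", "rb").read()

        # Append the newly "retired" version, much like zip -a would.
        with zipfile.ZipFile("backup.zip", "a", zipfile.ZIP_DEFLATED) as z:
            z.writestr("foo.1.c", os.urandom(50_000))

        after = open("backup.zip", "rb").read()

        # Almost the entire original archive survives as an identical prefix,
        # which any half-decent binary delta can reuse. Rename every member
        # first and that stops being true.
        shared = len(os.path.commonprefix([before, after]))
        print(f"{shared} of {len(before)} original bytes unchanged")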

  • Meep (unregistered) in reply to foo AKA fooo
    foo AKA fooo:
    So TRWTF wasn't that nobody bothered to check on the new guy's changes? The repo size exploding by some orders of magnitude didn't ring any bells before it was too late?

    It wouldn't have been exploding by orders of magnitude. It would never have doubled in a single commit.

    If he had been adding files using zip -a, they might have been able to compress it just by running git gc. The reason is that zip -a removes the trailing zip directory, appends the new entry, and rewrites the directory. Thus a chunked binary comparison like git's delta compression would find both versions largely the same.

    But git doesn't do this until it runs garbage collection to clean up loose objects.

    And many shitty zip utilities probably recompress the zip file, thus throwing off a simple binary delta.
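    A rough way to see that effect (a sketch only -- throwaway repo, made-up file name and sizes, nothing from the actual story): commit two revisions of a big file where the second merely appends data, then compare the object store before and after git gc.

        import os
        import pathlib
        import subprocess
        import tempfile

        def git(*args, cwd):
            subprocess.run(["git", *args], cwd=cwd, check=True,
                           stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)

        def objects_size(repo):
            objects = pathlib.Path(repo, ".git", "objects")
            return sum(p.stat().st_size for p in objects.rglob("*") if p.is_file())

        with tempfile.TemporaryDirectory() as repo:
            git("init", cwd=repo)
            git("config", "user.email", "newguy@example.com", cwd=repo)
            git("config", "user.name", "New Guy", cwd=repo)

            payload = os.urandom(5_000_000)              # stand-in for the zipped sources
            for i, extra in enumerate((b"", os.urandom(100_000))):
                pathlib.Path(repo, "backup.zip").write_bytes(payload + extra)
                git("add", "backup.zip", cwd=repo)
                git("commit", "-m", f"revision {i}", cwd=repo)

            print("loose objects:", objects_size(repo))  # roughly two full copies
            git("gc", "--aggressive", cwd=repo)
            print("after git gc: ", objects_size(repo))  # second copy packs down to a small delta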

  • Norman Diamond (unregistered) in reply to Nagesh
    Nagesh:
    The first thing when backing up source code is not to make complaints about redundancy. That is one place where redundancy is good.
    "Sure, we used to employ a backup system, but it was costing too much so we made it redundant."
  • Norman Diamond (unregistered)
    Remy Porter:
    in-house developed standard library
    Oxymoron alert.
    Remy Porter:
    <code>derpecated.zip</code><!-- intentional mispelling --> [...] <code>derprecated.zip</code>
    Does this mean derprecated was the correct spelling?

    Oops wait. Was mispelling the intentional misspelling of misspelling? Then what was defecated? Hey wait, I know!

    git, but .zip instead of .tgz? Oxymoron alert.

  • (cs) in reply to EatenByAGrue
    EatenByAGrue:
    Software is truly done when it's: 1. Checked into the code repo. 2. Reviewed by a peer or two. 3. Through a round of QA in a test environment. 4. Pushed live / shipped to customers. 5. QA'd again in whatever qualifies as the live environment. 6. Verified a week later to make sure that it's still working as it should.

    Even in a 1-person shop like I usually am in, I go through all those steps except step 2.

    You can actually do step 2 in a 1-person shop.

    Just think of your "future self", once you no longer have the code fresh in your head, as peer no. 2, no. 3, or as many as you need. It still takes the same number of man-hours as working with an actual peer, just spread over more time. Of course that means scheduling time for the same pieces of code again, and sometimes QA effort is wasted when you find some nasties at review time, but at least you do get to find them.

  • Binaries 4 eva (unregistered) in reply to Tux "Tuxedo" Penguin
    Tux "Tuxedo" Penguin:
    ubersoldat:
    Ah! One of my favourite flame wars: You'll add binaries to our repo over my dead body.

    What if those binaries are actually required for the application to work? Like data files containing game's levels, etc. or images used as icons within the application?

    Then just add them. I've never had any trouble with git repositories containing large incompressible binary files that slowly change over time - e.g. PNG streams. If you don't gc often then they get huge, and then gc takes ages when you finally do it. In one case it went from >2GB down to 108MB when it was finally done...

    The real problem with binaries is when they can't be delta'd - if the whole file changes, then the whole file has to be stored! If it's also incompressible then this is going to be huge on any VCS whatsoever. So a ZIP file or similar pre-compressed package is the absolute worst case - each time a file is added to the ZIP, the delta can be much larger than the (zipped) file that was added, and since it's already compressed the repo can't make it much smaller.
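    A quick illustration of the "already compressed" half of that (just a sketch using zlib, which is what git runs loose objects through): deflating something that has already been deflated buys you nothing.

        import os
        import zlib

        # Stand-in for a ZIP: random bytes already run through deflate once.
        already_compressed = zlib.compress(os.urandom(1_000_000))

        # Compressing it again claws nothing back -- if anything it grows a little.
        print(len(already_compressed), len(zlib.compress(already_compressed)))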

  • Binaries 4 eva (unregistered) in reply to Norman Diamond
    Norman Diamond:
    git, but .zip instead of .tgz? Oxymoron alert.
    Git for Windows. Windows Explorer has truly terrible ZIP handling built-in.
    • I really don't understand how it takes Windows Explorer longer to show the directory contents of large zip files than it takes my application to decompress and process every file in the same zip.
  • QJo (unregistered) in reply to Norman Diamond
    Norman Diamond:
    Nagesh:
    The first thing when backing up source code is not to make complaints about redundancy. That is one place where redundancy is good.
    "Sure, we used to employ a backup system, but it was costing too much so we made it redundant."

    TRWTF is the use of "to make redundant" for "to dismiss someone from their position because it was decided that their position within the company is redundant".

    "Redundant" just means "superfluous". In the context of engineering it means not necessary for functional requirements, but included to cover for failure of another component.

    Unfortunately, the mealy-mouthed and dishonest started to use it as a euphemism for "fired because of lack of work to do", in other words, "surplus to requirements".

  • Rebel_X (unregistered)

    LoL, this reminds me of a YouTube video I watched last week called "Hitler Uses Git", a parody of the Downfall movie. Rebase HEAD and cherry-pick xD

  • np (unregistered) in reply to andyf
    andyf:
    np:
    Beware of people that repeatedly refuse help but are obviously stuck. They likely just don't want to be caught in the mess they put themselves in.

    Best to explain to them that they are costing the company money even if they work best alone. If they aren't getting work done, then they are just sucking money from the company.

    I love this idea.

    Captcha: tristique, for when a bistique just isn't enough.

    I had a manager at my work who preferred to keep meetings short. The more people in the meeting, the less time he would try to spend (no side chat or other time-wasting things). He said he didn't want to waste tens of thousands of dollars just having people sit idle in a meeting.

    Unfortunately he didn't get along with the Super-Duper VP of Engineering, so he retired early (he had still been at the company ~10 years).

  • RichieAdler (unregistered) in reply to ¯\(°_o)/¯ I DUNNO LOL

    Sorry to disappoint, but I've developed for the platform for nearly 20 years and I do like it. I recognize its limitations and that it's getting behind the times, but for the applications I've created it's a very nice platform.

    The bloat is undeniable and the Eclipse-based versions are increasingly fragile, though. But nonetheless I've grown to like it.

    So there. One counter-example suffices to destroy a generalization. Have a nice day :)

  • Dale (unregistered) in reply to EatenByAGrue
    EatenByAGrue:
    Also, never believe a developer who says something is 90% complete.

    I knew one dev who said a program was "done" when he hadn't even attempted to compile it yet.

    "The first 90% of the schedule is required to make the project 90% complete and the second 90% of the schedule is required to complete the other 90%."

    But this article is fun because it uses the horror-foreshadowing phrase "I work best alone." If this was TV, the background music would strike a minor chord right there!

  • Evan (unregistered) in reply to Subversion weenie
    Subversion weenie:
    Can someone explain how adding a file to a zip causes a delta "nearly as large as the archive itself"? Adding more files doesn't change how pre-existing files are compressed, and any half-decent binary diff algorithm should be able to take advantage of that.
    Git doesn't store revisions as diffs.

    Somehow the repos still seem to be smaller than Subversion's (at least in non-pathological cases). I only kind of understand why Git does this (the bottom layer of Git is basically nothing more than a dumb storage-and-retrieval system: it knows nothing of revisions or versions or any of that) and how they get away with it, but it does work for them...
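    If anyone's curious what that dumb bottom layer looks like, here's a toy sketch (plain Python, not git's code): an object is just a short header plus the raw content, and its name is the SHA-1 of that whole thing.

        import hashlib

        def git_blob_id(content: bytes) -> str:
            # A blob is stored as "blob <size>\0<content>"; its id is the SHA-1 of that.
            header = b"blob %d\x00" % len(content)
            return hashlib.sha1(header + content).hexdigest()

        # Matches what `git hash-object` prints for the same bytes.
        print(git_blob_id(b"hello world\n"))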

  • Donald Knuth (unregistered) in reply to np
    np:
    Beware of people that repeatedly refuse help but are obviously stuck. They likely just don't want to be caught in the mess they put themselves in.

    Best to explain to them that they are costing the company money even if they work best alone. If they aren't getting work done, then they are just sucking money from the company.

    I always liked to figure things out. That's what we do, right? Many years ago one of my early bosses and one of the best programmers I've ever met said: "I understand that you want to beat your head against the wall when figuring something out, but when you start tasting blood please come and see me."

  • the way these may mays are HALARIOUS!!111 (unregistered)

    SERIOUSLY OMG I LOVE MAY MAYS :DDDD

  • Worf (unregistered) in reply to Meep
    Meep:
    foo AKA fooo:
    So TRWTF wasn't that nobody bothered to check on the new guy's changes? The repo size exploding by some orders of magnitude didn't ring any bells before it was too late?

    It wouldn't have been exploding by orders of magnitude. It would never have doubled in a single commit.

    If he had been adding files using zip -a, they might have been able to compress it just by running git gc. The reason is that zip -a removes the trailing zip directory, appends the new entry, and rewrites the directory. Thus a chunked binary comparison like git's delta compression would find both versions largely the same.

    But git doesn't do this until it runs garbage collection to clean up loose objects.

    And many shitty zip utilities probably recompress the zip file, thus throwing off a simple binary delta.

    Except git doesn't store files as diffs. It stores files as-is, by SHA-1 hash.

    So each time you re-commit the zip file, Git just stores the entire binary in the repository again. Each time you do this, a new copy is stored.

    Now, the objects in git are referenced by SHA-1 hash value (the hash of the object). Each commit records the SHA-1 of its parent commit, as well as a tree giving the SHA-1 of every file included in that commit.

    Yes, in git you can also have lost objects that have been hashed but that no ref points to - usually because you were in a detached HEAD state, made a commit, and so your last commit is dangling; the only reference to it is your current instance of the repo. If you switch HEADs, you can lose that reference, and Git can't get it back unless you remember the hash so you can merge it into another tree.

    A garbage collection basically has git walking through all the HEADs and tracking what files are referenced. Once all the HEADs have been traversed, it goes and removes all the objects that aren't referenced.

    Incidentally, a git clone does the same thing - git only sends you the objects that are fully linked together and none of the objects that are dangling.
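    A toy model of that walk, nothing like git's actual code: start from every ref, follow the links each object records, and whatever you never reach is dangling and fair game for pruning.

        from collections import deque

        def reachable(refs, links):
            """refs: starting object ids; links: id -> ids it points at
            (commit -> parents and tree, tree -> blobs and subtrees)."""
            seen = set()
            queue = deque(refs)
            while queue:
                obj = queue.popleft()
                if obj in seen:
                    continue
                seen.add(obj)
                queue.extend(links.get(obj, ()))
            return seen

        def dangling(all_objects, refs, links):
            # What a gc (or a clone) is allowed to leave behind: unreachable objects.
            return set(all_objects) - reachable(refs, links)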

  • (cs) in reply to np
    np:
    Beware of people that repeatedly refuse help but are obviously stuck. They likely just don't want to be caught in the mess they put themselves in.

    Yes. Now that I'm teaching university students programming, I'm finding interesting patterns. It's an interesting challenge, having a room of 30+ beginners trying to code in JavaScript - all trying to do roughly the same thing, but all at different stages, with slightly different approaches. A handful just charge ahead and do it; a lot get stuck and ask for help - and some get stuck, but are too proud (or afraid, or whatever) to ask. Those can be the hardest to deal with in some ways.

    "Why are you putting a DOCTYPE at the top of your Javascript file?" sigh

  • Boris Lanirov (unregistered)

    Forever Alone? Derprecated? Has this turned into The Daily I Can Haz WTF?

  • Islamic State (unregistered) in reply to Worf
    Worf:
    Yes, in git you can also have lost objects that have been hashed but that no ref points to - usually because you were in a detached HEAD state, made a commit, and so your last commit is dangling; the only reference to it is your current instance of the repo. If you switch HEADs, you can lose that reference, and Git can't get it back unless you remember the hash so you can merge it into another tree.
    When we commit you to a detached HEAD, there is nothing left dangling.
  • QJo (unregistered) in reply to Dale
    Dale:
    EatenByAGrue:
    Also, never believe a developer who says something is 90% complete.

    I knew one dev who said a program was "done" when he hadn't even attempted to compile it yet.

    "The first 90% of the schedule is required to make the project 90% complete and the second 90% of the schedule is required to complete the other 90%."

    But this article is fun because it uses the horror-foreshadowing phrase "I work best alone." If this was TV, the background music would strike a minor chord right there!

    "I work best alone."

    "I'm sorry, sir, we don't have an appropriate opening for you in this company -- we need team players here."

  • (cs) in reply to EatenByAGrue
    EatenByAGrue:
    Also, never believe a developer who says something is 90% complete. A typical project status goes from 0% to 50% to 90% to 95% to 96% to 97% to 98% to 98% to 98% ... only asymptotically approaching 100%. That's because the dev believes they only have to fix the last bug, only to discover there's another last bug.

    I had this happen to me in an extreme way. I was taking a course for my degree, and one assignment was to write a simple program in PHP to create a phone list from Web input.

    I wrote and tested and wrote and tested. Finally, I was on the final test. If this test passed, then the code was done. Instead of typing a full phone number, I entered "0" -- hey, it's a numeric string! -- and was boggled by the error message.

    That was when I found out about PHP liking to convert between types in odd ways. A couple of hours later, I had found out about the === operator, had modified my code, and had completed my testing. What a waste of time!

    Sincerely,

    Gene Wirchenko

  • little bear (unregistered) in reply to EatenByAGrue

    so true (about the percentages)

  • Gunslinger (unregistered)

    The real WTF is git. And consultants.

  • (cs) in reply to Steve The Cynic
    Steve The Cynic:
    (Yes, I like Perforce. I store all manner of (Finnish words) in it at home, and my repository backs up in a big ol' tgz that currently tips the scales at around 8GB. And I've occasionally obliterated stuff that I didn't want in the repository anymore. It seems to just work.)

    Have they fixed all the lovely security holes in Perforce? (A quick Google search for "perforce bugtraq" suggests the past few years have been blissfully free of Perforce exploits. Back around '07 it was hilariously dangerous.) Then it might be worth another look, though I'd want some grounds to believe they'd actually committed to designing for security.
