Admin
It's like you don't even realize that he said exactly the same thing about your original post.
Admin
"chubertdev" realized it a long time ago, but he still like to troll for heck of it.
Admin
I love this idea.
Captcha: tristique, for when a bistique just isn't enough.
Admin
And if that's not enough, what if the story munged things and it was actually a tarball not a zip, allowing the new file to affect the compression of the old ones?
Admin
It wouldn't have been exploding by orders of magnitude. It would never have doubled in a single commit.
If he had been adding files using zip -a, he might have been able to compress it just by running git gc. The reason is that zip -a removes the trailing zip directory, adds a new entry, and rewrites the directory. Thus a chunked binary comparison like git's delta compression would find both versions of the file largely the same.
But git doesn't do this until it runs garbage collection to clean up loose objects.
And many shitty zip utilities probably recompress the zip file, thus throwing off a simple binary delta.
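For what it's worth, here's a rough sketch of what that cleanup looks like (hypothetical repo, illustrative sizes): two commits of an appended-to zip sit as whole loose objects until git gc repacks them and tries delta compression between the versions.

    $ du -sh .git/objects
    96M     .git/objects     # two near-identical zips, each stored whole as a loose object
    $ git gc                 # repacks loose objects and attempts delta compression
    $ du -sh .git/objects
    51M     .git/objects     # only shrinks if the two versions really share long byte runs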
Admin
Oops wait. Was mispelling the intentional misspelling of misspelling? Then what was defecated? Hey wait, I know!
git, but .zip instead of .tgz? Oxymoron alert.
Admin
Just think of your "future self", once you no longer have the code fresh in your head, as peer no. 2, no. 3, or as many as you need. It still takes the same amount of man-hours as working with an actual peer, just spread over more time. Of course, that means you have to schedule time for the same pieces of code again, and sometimes QA effort gets wasted when you find some nasties at review time, but at least you do get to find them.
Admin
The real problem with binaries is when they can't be delta'd - if the whole file changes, then the whole file has to be stored! If it's also incompressible, then it's going to be huge in any VCS whatsoever. So a ZIP file or similar pre-compressed package is the absolute worst case - each time a file is added to the ZIP, the delta can be much larger than the (zipped) file that was added, and since it's already compressed, the repo can't make it much smaller.
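A quick way to watch that worst case in action (hypothetical repo and file name): commit a zip that gets fully recompressed on every update and see how much the object store grows per commit.

    $ ls -lh weekly-docs.zip                   # say the archive is ~50M
    $ git add weekly-docs.zip
    $ git commit -m "this week's docs"
    $ git count-objects -v                     # size grows by roughly the full
                                               # archive size after every such commit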
Admin
TRWTF is the use of "to make redundant" for "to dismiss someone from their position because it was decided that their position within the company is redundant".
"Redundant" just means "superfluous". In the context of engineering it means not necessary for functional requirements, but included to cover for failure of another component.
Unfortunately, the mealy-mouthed and dishonest started to use it as a euphemism for "fired because of lack of work to do", in other words, "surplus to requirements".
Admin
LoL, this reminds me of a YouTube video I watched last week called "Hitler uses Git", a parody of the Downfall movie. Rebase head and cherry pick xD
Admin
I had a manager at work who preferred to keep meetings short. The more people in the meeting, the less time he would try to spend (no side chat or other time-wasting things). He said he didn't want to waste tens of thousands of dollars just to sit idle in a meeting.
Unfortunately he didn't get along with the Super-Duper VP of Engineering, so he retired early (he had still been at the company ~10 years).
Admin
Sorry to disappoint, but I've developed for the platform for nearly 20 years and I do like it. I recognize its limitations and that it's getting behind the times, but for the applications I've created it's a very nice platform.
The bloat is undeniable, and the Eclipse-based versions are increasingly fragile, though. But nonetheless I've grown to like it.
So there. One counter-example suffices to destroy a generalization. Have a nice day :)
Admin
I knew one dev who said a program was "done" when he hadn't even attempted to compile it yet.
"The first 90% of the schedule is required to make the project 90% complete and the second 90% of the schedule is required to complete the other 90%."
But this article is fun because it uses the horror-foreshadowing phrase "I work best alone." If this was TV, the background music would strike a minor chord right there!
Admin
Somehow the repos still seem to be smaller than Subversion's (at least in non-pathological cases), so while I only kind of understand why Git does this (the bottom layer of Git is basically nothing more than a dumb storage & retrieval system: it knows nothing of revisions or versions or whatever that is) and how they get away with it, it does work for them...
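That "dumb storage & retrieval" layer is easy to poke at with the plumbing commands; a minimal sketch (the hash is left as a placeholder):

    $ echo 'hello wtf' | git hash-object -w --stdin   # store the bytes, get back their SHA-1
    <sha1-of-that-content>
    $ git cat-file -p <sha1-of-that-content>          # retrieval is purely "give me this hash"
    hello wtf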
Admin
I always liked to figure things out. That's what we do, right? Many years ago one of my early bosses and one of the best programmers I've ever met said: "I understand that you want to beat your head against the wall when figuring something out, but when you start tasting blood please come and see me."
Admin
SERIOUSLY OMG I LOVE MAY MAYS :DDDD
Admin
Except git doesn't store files as diffs. It stores files as-is, addressed by SHA-1 hash.
So each time you re-committed the zip file, Git would just re-commit the entire binary into the repository. Each time you do this, a new copy is stored.
Now, the objects in git are referenced by SHA1 hash value (the hash of the object). Each commit records the SHA1 of the previous head commit, as well as the new SHA1 of every file included in this commit.
Yes, in git you can also have lost objects - objects that have been hashed, but that no HEAD refers to - usually because you're in a detached HEAD state, make a commit, and thus your last commit is dangling: the only reference to it is in your current instance of the repo. If you switch HEADs, you can lose that reference, and Git can't get it back unless you remember the hash so you can merge it into another tree.
A garbage collection basically has git walking through all the HEADs and tracking what files are referenced. Once all the HEADs have been traversed, it goes and removes all the objects that aren't referenced.
Incidentally, a git clone does the same thing - git only sends you the objects that are fully linked together and none of the objects that are dangling.
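For the curious, a minimal sketch of chasing those dangling objects in a hypothetical repo: fsck walks the refs and reports anything unreachable, gc prunes it, and a clone only ever receives the reachable, fully linked objects.

    $ git fsck --dangling
    dangling commit <sha1>
    $ git gc --prune=now      # drop unreachable objects immediately
    $ git clone . ../copy     # the copy gets only objects reachable from some ref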
Admin
Yes. Now that I'm teaching university students programming, I'm finding interesting patterns. It's an interesting challenge, having a room of 30+ beginners trying to code in Javascript - all trying to do roughly the same thing, but all at different stages, with slightly different approaches. A handful just charge ahead and do it; a lot get stuck and ask for help - and some get stuck but are too proud (or afraid, or whatever) to ask. Those can be the hardest to deal with in some ways.
"Why are you putting a DOCTYPE at the top of your Javascript file?" sigh
Admin
Forever Alone? Derprecated? Has this turned into The Daily I Can Haz WTF?
Admin
"I'm sorry, sir, we don't have an appropriate opening for you in this company -- we need team players here."
Admin
I had this happen to me in an extreme way. I was taking a course for my degree, and one assignment was to write a simple program in PHP to create a phone list from Web input.
I wrote and tested and wrote and tested. At last, I was on the final test. If it passed, the code was done. Instead of typing a full phone number, I entered "0" -- hey, it's a numeric string! -- and was boggled by the error message.
That was when I found out about PHP liking to convert between types in odd ways. A couple hours later, I had found out about the === operator, had modified my code, and had completed my testing. What a waste of time!
Sincerely,
Gene Wirchenko
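A minimal sketch of the loose-vs-strict comparison that bites people here (assuming the validation treated the field as empty when it compared loosely equal to false; the actual check in Gene's code isn't shown):

    <?php
    // Loose comparison: PHP treats the string "0" as falsy.
    var_dump("0" == false);   // bool(true)  -- a phone field of "0" looks "missing"
    var_dump(empty("0"));     // bool(true)

    // Strict comparison checks type as well as value, so "0" survives.
    var_dump("0" === false);  // bool(false)
    var_dump("0" === "");     // bool(false)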
Admin
so true (about the percentages)
Admin
The real WTF is git. And consultants.
Admin
Have they fixed all the lovely security holes in Perforce? (A quick Google search for "perforce bugtraq" suggests the past few years have been blissfully free of Perforce exploits. Back around '07 it was hilariously dangerous.) Then it might be worth another look, though I'd want some grounds to believe they'd actually committed to designing for security.