ray10k (unregistered) 2018-05-09
This story sounds very familiar, except that last time it was some gun-for-hire who pulled this stunt. Seeing it as an actual institutional policy, though... shudders

dpm (unregistered) 2018-05-09
I really wish I could believe this wasn't true. Sadly, I've been around too long and seen too much...

my name is missing (unregistered) 2018-05-09
It's subversions all the way down.

Nutster (nodebb) 2018-05-09
The subversion of Subversion. So how well did the suggestion to move all the backups out of the SVN path go over with management?

Ollie Jones (unregistered) 2018-05-09
Is it time to add software engineers as a type of Registered Professional Engineer? Such a person has to sign off on designs for bridges and the like. Maybe "midsize companies" should have their business insurance underwriters ask, "Did a Registered Professional Software Engineer look at your setup?"

Admin
Maybe someone at the company had shares in an SSD manufacturer.
Admin
My group has a 300 GB repo at work.
It has everything in it; they're basically using it the way you would a network drive: all the projects, documentation, and even software installers they found handy.
Needless to say, if you try to check the whole thing out, it takes two days, even if you let it run overnight.
They did this long before I joined, so unfortunately it was already a reality by that point. I'm now trying to convince them to create a new repo for each new project and to keep unnecessary files out of it; a sketch of the pitch is below.
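For anyone making the same argument, per-project repos in Subversion are cheap to set up. Something like this (the server paths here are hypothetical):

svnadmin create /srv/svn/new-project
svn import ./new-project file:///srv/svn/new-project/trunk -m "Initial import"

Small repos keep checkouts fast, and an ignore policy keeps installers and other binaries from creeping back in.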
Admin
At least their stuff was very well backed up. Every developer's machine had copies. Lots of copies of parts of it :)
Admin
Hey, what a coincidence, I'm trying to convince my teenage son to take his future more seriously. I figure we have equal chances of success.
Admin
Must say I was somewhat surprised that 300 GB downloaded during a coffee break.
Admin
This happened to me once, purely by accident, and I learned from it QUICKLY. In school, I was supposed to set up a backup of a server I was running, but I was only allowed to save 2 GB of files to the remote server, so I had to be super selective about what I backed up, the goal being to restore functionality as fast as possible on the day the instructor went in and hosed our machines (usually by deleting /etc, among other things).
My original backup script tarred up all of the files it needed into my home directory, then transferred the tarball to the server. No problem. Except that one of the things it tarred was the home directory itself (can't lose user data, after all), and it didn't DELETE the tarball after transferring it, or before running the next backup.
Two days later I had "out of space" errors coming off the server. I learned from my mistake: a backup always has to be careful not to include itself. Looks like some other people could have used a run through that class.
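The fix was roughly this (paths are from memory, so treat them as illustrative):

#!/bin/sh
# Write the tarball OUTSIDE the tree being archived, and exclude the
# output name as a belt-and-braces measure.
BACKUP=/tmp/backup.tar.gz
tar --exclude="$BACKUP" -czf "$BACKUP" /etc /home
scp "$BACKUP" backup@remote:/backups/
rm -f "$BACKUP"  # never leave the tarball where the next run can re-archive it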
Admin
You need to learn how to take longer coffee breaks.
Admin
300 GB would download in about 45 minutes on a standard gigabit Ethernet adapter from a good server. A bit long for a coffee break, but not unheard of ;)
Admin
300 GB / 20 minutes = 250 MB/s.
It's hard to believe that anyone who implements recursive backups would also be able to wire up a high-performance network, but it's within the realm of plausibility.
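Working that out in link terms: 250 MB/s × 8 bits/byte = 2 Gbit/s, double what a single gigabit line can carry (about 125 MB/s, which matches the 45-minute figure above: 300 GB / 0.125 GB/s = 2400 s ≈ 40 minutes). So the 20-minute version implies link aggregation, 10GbE, or a rather generous definition of "coffee break."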
Admin
Hey dawg, I heard you like backups, so I made some backups of your backups of your backups of your backups of your backups.
Admin
Two different things, really.
You can get a third-party network team in to wire up a 1 Gbit (or higher) network, so that higher management can watch, er, instructional videos on how to program in Haskell, complete with fake gasping and close-ups ... er ...
... or you could assume that nobody in their right mind would commit zip-file backups into an SVN repository. Which is probably a faulty assumption. Companies hire idiots all the time.
The wonderful thing about this case is the recursive backups. The fact that recursive backups were actually built into the design of the source control setup is so incredibly unbelievable that I would have assumed it could never happen.
We all know better now. A recursive WTF: what could be more enjoyable?
Admin
A possible explanation:
Due to the recursive backup monstrosity, someone notices it takes an unacceptably long time to check out the repo. Rather than attempt to make the repo more efficient, some manager decides the problem must be the network, and orders IT to upgrade it.
Band-aids are very popular fixes in this sort of environment....
Admin
I get that it's a pun on "exponential backoff", but this strategy is quadratic, not exponential.
Admin
No, it is exponential. You forgot that the TOTAL at each step includes all the previous steps. Consider: dasums[1] = 1; for (jj in 2:10) dasums[jj] = dasums[jj-1]*2 + 1
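Unrolled, with the live data normalized to size 1 and each run archiving everything already on disk: s(1) = 1 and s(n) = 2*s(n-1) + 1, which gives 1, 3, 7, 15, ..., i.e. s(n) = 2^n - 1. The quadratic reading would only hold if backup n contained just the n-1 earlier original-sized copies (total n(n+1)/2); because every earlier backup itself contains backups, it compounds instead.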
Admin
You start by backing up then you back up your backups then you back up your backups of your backups then you back up your backups of your backups of your backups .... then the universe runs out of space for your backups of your backups of ...
That might take a while because on the one hand the universe is big and your first backup is small, but then exponential growth is really fast. However, the bigger the backups the longer it takes to back them up, so you have an upper bound on how fast you can copy the bits.
Before you start scheduling the next backup to start after the previous one finishes, you might want to rethink your backup strategy.
Admin
That's a great idea! If the backup is hosed, you can restore it from backup.
Admin
I did that once.
Backup was on a USB drive.
Managed to back up /, including /media.
Admin
The universe is big, but we're limited by the holographic principle. Though, if it all collapses into a black hole, the information is nicely preserved at the event horizon.
Admin
It can also happen with git. $coworker made a tar-gz of a non-trivial portion of the work tree, then added that tar-gz in the next commit (probably using the almost-always-wrong "git add ."). The person caught their mistake in the next commit, but git has the memory of an elephant, so that tar-gz is still consuming space in every clone: it's gone from the working tree, but the blob stays reachable from the older commits.
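You can confirm the ghost is still haunting the object database with something like this (the archive name is made up):

git rev-list --objects --all | grep 'tar.gz'

If the blob's hash and path show up, every clone is still paying for it.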
Admin
Yo dawg, I heard you like backups, so we put a backup in your backup so you can backup while you backup.
Admin
Backup like what a toilet does.
Admin
The question is: if something fails due to negligence or inability, will that person be criminally liable for the loss (even of life)? Imagine how that would push up the cost of development. But I agree, it could be a good thing.
Admin
That sounds... very likely.
I'm vaguely surprised that this detail wasn't part of the story, whether or not it was part of the original submission!
Admin
It's backups all the way down!
Admin
I remember a utility where the default location for saving the backup was the path where the data was stored. So if you didn't change it, you'd be recursively saving all the previous backups.
Admin
Recursion is divine!
I love it when repos have personal directories (johnny1, johnny2) or timestamps as directory names.
Admin
Registered Professional Engineers are a good way to ensure that we get more great WTFs, this time about stupid professional engineers.
Admin
You can force it to squash the history and garbage-collect the offending blob. Doing so (unless you do it immediately) requires major brain surgery, though, and the more work you've done afterwards, the harder it is to fix. You probably want to avoid doing that if you can.
Add-all is a nasty anti-pattern in VCSs, because it doesn't just almost work well; it works too well, and picks up everything it shouldn't along with everything it should. A sketch of the surgery is below.
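If you do have to operate, the usual instrument these days is the third-party git-filter-repo tool; a rough sketch, assuming the offending file is big.tar.gz:

git filter-repo --invert-paths --path big.tar.gz

After that you re-add your remote and force-push, and every downstream clone has to be re-cloned, which is exactly why "avoid it if you can" is the real advice.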
Admin
TRWTF: Nobody checks out a whole Subversion repository. You just browse it, then decide what to check out.
The structure is similar to copy-on-write, so creating a branch costs basically nothing, while checking it out alongside trunk will cost you twice the hard-disk space.
Once more: nobody checks out a whole Subversion repository.
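Subversion even has sparse checkouts for exactly this (the URL here is hypothetical):

svn checkout --depth immediates https://svn.example.com/repo my-wc
svn update --set-depth infinity my-wc/the-one-project-you-need

You get the top-level layout, then inflate only the subtree you actually work on.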
Admin
I am such an idiot that I have this problem as a person (not a company). I'm a writer, not a computer programmer. I don't have the time or wherewithal to figure out the computer, but I have always done backups. "Save Early, Save Often! Always have a Backup!" is what I learned when I learned Pascal after college. Well, I back up. And when a disk gets full, I don't know how to sort it, and when I move machines or get a new hard drive, I just back up everything. Now I can't find anything.
I landed here because I knew I had a problem with an exponential number of backups, which is what I googled. Any idiot can get into this state, and it is just as bad as having no backups at all. In fact, if I knew what to do, I'd throw out the 20 hard disks I have Time Machine backups on and just do the new thing, whatever the new thing is.
Can anyone point me to a web page? I am not really an idiot (you can see I found my way here), but I don't know what to do with my personal exponential number of backups. Is there a protocol, a method, something? Even worse, the metadata gets screwed up every time you get a new computer. And I am a writer, so you guessed it: I work on a poem and there are 30 versions, because Word says "you can only save a copy of this," or because I wanted to keep just the raw version and the latest one, but somehow all these copies Microsoft made me make, and the ones I've mailed away, are taking up space.
God, tell me there is someone who has solved this problem already, and where I can find their protocol!
Thank you.