Admin
Manual tape switching in such a large-scale operation? Tape library, anyone?
Admin
Nothing says loving like a reminder of how insignificant you are.
Admin
I worked one summer (1985) in the computer room of the Boston(*) office of a minicomputer maker, and I had to deal with this kind of tape. And no, they didn't have libraries where you could just stick half a dozen reels. Do you realise how big that would have been?
(*) Well, actually Westborough, but it wasn't any of the usual suspects for that part of the world.
Kids these days, honestly! ;)
EDIT: Oops, my bad, "a decade ago". Bah, that'll teach me to read the article...
Admin
being unimportant sucks
Admin
Nice random video player over the "Township" text...LOL
Admin
Fictitious chat log...
Customer Care: Thank you for contacting customer support, Mr. Darby. How may I help you today?
Mr. Darby: Hello Ms. Care. I think one of our merchant APIs was hijacked and poisoned our data with bad transactions. We need to restore the original data and manually enter the last 24 hours.
Customer Care: I am sorry Mr. Darby, but you aren't important enough for us to keep your backups available. I'm sure your customers will understand that we only support our important clients in this way.
Admin
So TRWTF is that they have a system so stable that they didn't need to resort to the backup tapes for over a decade? Go tell that to Twitter.
Admin
I know what it is. An EC5017-02 9-track 5" tape drive. Gets between 16 and 64 kilobytes per second. It was a peripheral option for a KFKI TPA 1140 mini-computer. The National Institute of Oceanography used these to record seismic reflection data.
Kids these days use Google.
Admin
wait... the mid 80s isn't "a decade ago" any more? When did that happen?!
Admin
Addendum (2012-08-21 09:19): EDIT: of course, I meant, "nearly two decades ago"...
Admin
Yeah, and you think it doesn't happen now. Just see what happens when you tell your cloud storage provider they just lost 6 hours of your data and you're not a top-tier client.
The only difference is that these days they'll confidently tell you that you must be mistaken. It's obviously impossible for their geo-replicated mega-cluster with 8-way redundancy and automated failover to have lost your data. The only explanation is that you're hallucinating; you never created that data, sir.
Admin
Would be nice to get some of these stories in here.
Admin
Anybody who doesn't at least keep a local ad-hoc backup on one of those little terabyte drives you can buy cheaply at any consumer outlet can probably be considered a little bit naive nowadays. And if you have more than a terabyte of ephemeral business-critical data, then you're probably big enough and rich enough to have invested in a seriously heavy-duty disaster recovery process.
Admin
The central site used a hardware tape library, and tapes were extracted from the system automatically for transmission to the offsite storage.
But the library management software was configured to send, well, tapes where "CLIENT='IMPORTANT'" to the offsite recovery store, rather than "CLIENT='ALL'".
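To make that misconfiguration concrete, here is a minimal Python sketch of the kind of selection rule being described; the tape records, the CLIENT values, and select_for_offsite are all hypothetical illustrations, not the actual library management software:

# Hypothetical sketch of an offsite-rotation rule keyed on a CLIENT attribute.
tapes = [
    {"volser": "A00120", "client": "IMPORTANT"},
    {"volser": "B00415", "client": "SMALL_CO"},
    {"volser": "C00988", "client": "IMPORTANT"},
]

def select_for_offsite(tapes, client_filter):
    """Return the tapes the library would eject for the offsite courier."""
    if client_filter == "ALL":
        return list(tapes)
    return [t for t in tapes if t["client"] == client_filter]

# Misconfigured rule: only the big account's tapes ever leave the building.
print([t["volser"] for t in select_for_offsite(tapes, "IMPORTANT")])  # ['A00120', 'C00988']

# Intended rule: everything goes offsite.
print([t["volser"] for t in select_for_offsite(tapes, "ALL")])        # all three volsers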
Admin
Unfortunately I work for a company which is a sub^3 contractor to an organization with lots of lawyers, a senior management with only a tenuous grasp of the English language, and a sales force which marches in goose step. Welcome to the Asia Pacific. So unfortunately, details won't be forthcoming. Sorry bud. Expat life is awesome.
Admin
That's why our bosses cloud-sourced it, dude. Those drives are capital expenditure, and we don't have any budget for that. Cloud storage services are operational expenditure, and we've got plenty of budget for that. Your way of thinking is so Enterprize 1.0.
Get with the times man.
Admin
"I am sorry Mr. Darby, but our computer centre just got flooded/burnt down/bombed by terrorists [please pick one only] and you aren't important enough for us to have sent your backups off-site."
As to tapes ... I remember when our computer centre got the new, expensive high-density tape drives. They could store a whole 20 MBytes on a single tape!
Admin
Firstly, I fully agree with you. But the business thought they were outsourcing that task (which is stupid on so many levels), and they did have an expectation (and contractual agreement) that they would receive said service. To me, outsourcing business-critical data storage for a cheaper price is akin to outsourcing the fu*king CIO to the cheapest responder on e-lance. Unfortunately, in our case doing so probably wouldn't be a bad thing.
Admin
Alex would have featured your comment...if you were more important.
Admin
Isn't it less about being unimportant and more about a "we can fit more than one daily backup onto a tape for you, so we never pulled the tape out of the drive" kind of thing?
In other words, the smaller clients didn't fill up a tape in a day, or maybe even in a few days, so their tapes never got pulled from the drives and never got shipped off-site? (A rough sketch of that idea follows below.)
Or did they not pull the smaller clients' tapes for some other reason?
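For what it's worth, here is a rough Python sketch of the policy being guessed at above (a tape is only ejected for the courier once it fills up); the capacity and daily backup sizes are invented for illustration:

# Invented numbers: a 20 GB tape, ejected for offsite shipment only when full.
TAPE_CAPACITY_MB = 20_000

def days_until_offsite(daily_backup_mb):
    """Days of backups before a client's tape fills and is finally ejected."""
    full_days = TAPE_CAPACITY_MB // daily_backup_mb
    return full_days + (TAPE_CAPACITY_MB % daily_backup_mb > 0)

print(days_until_offsite(20_000))  # big client: fills a tape every day, ships daily
print(days_until_offsite(150))     # small client: ~134 days before a tape ever leaves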
Admin
So in the best-case scenario, the recovery plan required the offsite backup tapes to be transported across 6 states before they could be mounted and restored from?
How many hours of downtime would that translate to?
Admin
Days, and that would be fast by tape-era standards, I suppose. People are spoiled these days. Twitter goes down for 20 minutes and it's like the end of the world. Honestly, Twitter can be down for days and in reality it doesn't actually matter.
Admin
Has anyone noticed that this drama happened during a test of the disaster recovery process? That is, a disaster has not actually happened, and the unimportant customers' data has not actually been compromised, and there is now ample motivation and opportunity to rectify the matter.
The fact that the DRP test has not actually been done properly for, er, ahem, some considerable time is a bit of a foul-up, I will grant you - but now it's being put right, and so there is no drama outside of a fun anecdote to be related in the pub next Friday lunchtime.
Admin
"wrinkles that needed ironed out" This phrase is not grammatically correct and makes no sense. Think about the meanings of the words. 'wrinkles that needed TO BE ironed out' or 'wrinkles that needed IRONING out' would both have been perfectly valid.
Admin
I'm not convinced.
'wrinkles that sucked, ironed out' or 'wrinkles that groaned ironed out' or 'wrinkles that kneaded ironed out' or 'wrinkles that needed ironed out'.
The article doesn't say what the "wrinkles that needed" needed (perhaps it's a euphemism for old men who need some coffee); and it only says that a lot of these "wrinkles that needed" were ironed out; but that doesn't make it grammatically incorrect.
Admin
This, Friends, is WHY we test. This is actually an example of the system working as it is supposed to.
Admin
Now, when you go to calculate the value of a company, it works like this:
1. How much profit are they making? Does it look like they can keep bringing in profit year after year?
2. How much stuff do they own? Buildings, land, trucks... crap we could at least sell if #1 goes south?
So, salaries and other operational expenses reduce #1.
Capital purchases increase #2.
Thus, it seems, a company would rather buy equipment than pay operational expenses. Why is this not so?
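As a toy illustration of those two buckets (every figure below is invented, and this is the naive book view the comment describes, not how a company is actually valued):

revenue = 1_000_000
starting_assets = 250_000

def naive_book_view(opex, new_equipment):
    profit = revenue - opex                   # bucket #1: what they earn
    assets = starting_assets + new_equipment  # bucket #2: what they own
    return profit, assets

# Renting capacity as an operating expense vs. buying the same capacity outright.
print(naive_book_view(opex=100_000, new_equipment=0))        # (900000, 250000)
print(naive_book_view(opex=0,       new_equipment=100_000))  # (1000000, 350000)
# In this naive view capex looks strictly better; depreciation, cash flow, and
# financing costs are the usual answers to the "Why is this not so?" question.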
Admin
... and this, ladies and gentlemen, is why we test.
OK, who am I kidding?
... and this, gentlemen, is why we test.
Admin
Something is missing from the story. How did the "backup system" (automated or manual) distinguish important clients from the rest, in order to use procedure A on one group and procedure B (do nothing) on the other group? And moreover, how did such a distinction get created by accident?
Maybe it is as a previous post speculated... the tapes never got full for the smaller clients, so they never went offsite. OK, somewhat credible... but why leave it to the readers to guess? The story is lacking.
Admin
We cannot unplug Twitter. We need it to keep younglings indoors so they don't disrupt the traffic. Playgrounds are dilapidated and cannot handle the load. Twitter needs to stay online and keep the young ones busy.
Admin
That's how it used to work.
Admin
Agreed. However, with cars getting safer and laws protecting the dumbest, it's getting harder and harder to weed them out.
Admin
Your least important client is the one who is about to leave your service but has to get through to the disconnections team first.
So what you do is put them on hold forever, or close to it, and hope they choose to give up and not leave your service.
When my wife wanted to leave the 3 network about a month ago and needed a PAC code, we were put on hold for 1 hour and 12 minutes. The consequence is that when I leave Orange at the end of the year I will not choose 3 as my new network, even though they currently offer the best package for what I want.
Admin
Your least important clients are the ones who constantly complain about not being featured :p
Admin
And the fact that having "unimportant clients" is supposed to be a WTF, which is the point of this article, shows exactly why this site is just that...
Admin
Our off-site storage was 100 miles away, and the recovery site about 3 states away. We had a commitment to restoring full service within 48 hours of the initial downtime event.
Getting a truck of backup tapes from off-site storage to the recovery center does take hours... but that entire paradigm assumes that 48 hours of downtime is expensive, but not disastrous.
There are still many sectors of the industry where this is acceptable.
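A back-of-the-envelope check in Python; everything except the 48-hour commitment and the rough distances above is an invented assumption:

HOURS_RTO = 48  # contractual recovery commitment from the comment above

drive_to_recovery_site = 900 / 50  # assume ~900 miles ("about 3 states") at 50 mph: 18 h
load_and_mount = 4                 # assumed: staging, mounting, cataloging tapes
restore_from_tape = 12             # assumed: streaming the data back at tape speeds
bring_up_services = 6              # assumed: rebuild, verify, switch over

total = drive_to_recovery_site + load_and_mount + restore_from_tape + bring_up_services
print(f"estimated recovery: {total:.0f} h, budget: {HOURS_RTO} h")  # ~40 h, inside 48 h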
Admin
What the fuck am I not following here?! The backups are made (irrelevant how and on what media) and then shipped 2000 miles away from where the related apps are - ok. So, in case of disaster, the data is restored how? By calling some operator and he does something, then the data gets shipped back how? By being transferred over the wire, or what? What the fuck did they plan for? I understand keeping a copy that far away in case of a fucking Katrina, but keep one at home too, for fuck's sake, so that you can recover quicker.
Admin
As it happened, they were the only company automatically billing a certain credit card of mine. So I called the credit card company and had them change my card number. Figured the radioheads would go ahead and unsubscribe me once they realized there was no more money to grab.
Surprise! My credit card company told the assholes my new card number! Because we certainly want those robocharges to keep flowing no matter what, right?
I called my card company and told them in no uncertain terms that these charges were vehemently not authorized. Finally the bleeding stopped.
I still get emails from the satellite radio company, years later, begging me to come back. I don't care if they have the most entertaining jock in the world. I'll cut off my own foot before they get another penny of mine.
Admin
:p
Just poking fun at you. You never hesitate to poke fun at me after all.
And I might add... it doesn't count unless it is something that you PAY for.