• Frist (unregistered)

    Frist

  • Quite (unregistered)

    The good news is that the vitally important equipment is still functioning (it still gurgles) so that one was plugged into the red socket, and all is well.

  • Oliver Jones (google)

    was zum Teufel? ("what the devil?")

  • bjolling (nodebb)

    "In a good organization, each of these tasks is designed, reviewed, built, tested and verified, and approved. These things need to be right. Not sort-of right, but right!"

    That doesn't sound very agile to me

  • Hanzito (unregistered) in reply to bjolling

    Agile in construction? Sounds good. So we have a story: when a user wants to enter, they open the door with a key. After two weeks this feature is sort-of right. Too bad the door never closes.

  • Martin (unregistered)

    Ah yes, that sure reminds me of something that happened in the data center of an ISP I used to work for at the end of the '90s in Austria:

    The data center had been slowly growing for years, and the quality of the grid meant that there basically hadn't been any real failures in that time. While testing did happen, it was done in sections. Along came a real outage. Cut to batteries: OK. Generator startup: OK. Cut to generator: OK. A couple of seconds later: generator failure, cut back to batteries. Repeat a couple of times until the batteries are dead and the whole data center goes dark.

    So, what happened? Turns out that capacity had been added to the data center, always making sure to stay under the load limit of the generator(s). The nasty fact that after cutting over to the generator, the batteries would try to recharge had been overlooked.

    The load of the full data center plus the load of the UPS unit(s) recharging turned out to be too much for the generators. It took days till everything was back up and running again.
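
    A quick back-of-the-envelope check shows how easily that gets missed. The sketch below is purely illustrative, with invented numbers rather than figures from that data center: the IT load alone fits under the generator rating, but IT load plus the UPS pulling recharge power does not.

        # Hypothetical sizing check for the failure mode described above.
        # All numbers are invented for illustration, not taken from the actual site.

        GENERATOR_CAPACITY_KW = 500.0   # assumed generator rating
        IT_LOAD_KW = 450.0              # assumed steady-state data center load
        UPS_RECHARGE_KW = 90.0          # assumed extra draw while the UPS recharges

        def generator_can_carry(it_load_kw, recharge_kw, capacity_kw):
            """Return True if the generator can carry the combined load."""
            return it_load_kw + recharge_kw <= capacity_kw

        # Looks fine if you only count the IT load...
        print(generator_can_carry(IT_LOAD_KW, 0.0, GENERATOR_CAPACITY_KW))              # True
        # ...but not once the UPS starts recharging after the cutover.
        print(generator_can_carry(IT_LOAD_KW, UPS_RECHARGE_KW, GENERATOR_CAPACITY_KW))  # False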

  • Ex-lurker (unregistered) in reply to Quite

    No, it was plugged into a gray socket where it was supposed to be. You must have missed the part where their floor had been wired incorrectly.

    But I agree that that was the most important equipment in that place and thus should stay powered no matter what. This way they can still chug caffeine while they wait for the power to be restored.

  • (nodebb) in reply to Oliver Jones

    Just ask him: https://en.wikipedia.org/wiki/Erwin_Teufel

    (maybe fun facts: "Teufel" translates to "devil", and "CDU" stands for "Christlich Demokratische Union Deutschlands", which translates to "Christian Democratic Union of Germany")

    Addendum 2016-09-12 08:28: AFAIK, people in Baden-Württemberg still say "zum Teufel" and not "zum Kretschmann", though, for whatever reason

  • LH (unregistered)

    Ah, nothing like a German WTF, where the aftermath is a nice cup of coffee.

  • your name (unregistered)

    From my experience with Germans there is one thing missing: the immediate start of the blame game: "That was your responsibility" - "No, it was the fault of x"

  • Russell Judge (google)

    Hey, at least they had coffee. And as everyone knows, that is what's important.

  • suomynona (unregistered)

    I was working at an independent telco as a sysadmin. They did power-fail tests every year or two. The first one I participated in revealed an oversight: While all of the magnetic door locks and their associated readers were on battery/generator power, the PC that had the authentication database for the door locks was not. Unfortunately, this was not discovered until after the last person left the switch room... which was where the PC in question was located. Since the doors were held shut by strong electromagnets, it didn't matter much that the staff had keys for the doorknobs. I forget how they ultimately got back in to fix the issue, but I recall it took a while.

  • Damien (unregistered) in reply to suomynona

    I'd have thought they'd do the obvious - since the power couldn't be cut locally, stage a terrorist incident and wait for the LAPD to cut the power remotely.

  • Quite (unregistered) in reply to Ex-lurker

    " ... the coffee maker was happily gurgling along in the silence that had suddenly fallen."

    Aha, gotcha. The grey sockets were on the UPS and the red sockets were on the boring old mains.

  • RichP (unregistered) in reply to Damien

    After all that posturing, and you're just a common crook!

  • Joe (unregistered) in reply to suomynona

    Don't most magnetic door lock systems have it set up so that a key overrides them?

  • George Gonzalez (unregistered) in reply to Frist

    The real WTF is that they never did the real test, flipping off the main utility power. Using some soft switch down the line is less than a full and adequate test.

  • (Untitled) (unregistered)

    So here's a gem that happened to me. Setting: Indoor office building, no windows. I, the lowly intern.

    Middle of the afternoon, I'm typing away and suddenly everything's black. No sound, no lights, no nothing. Felt like I blinked and forgot to open my eyes again. Main power's gone, and on top of that, so are all our 'red socket' devices. After a minute the red socket items come back on, and my monitor illuminates the room. My PC, however? Not on a red socket. Ditto our staging server.

    Lost a day of work AND had to help debug the slightly-corrupted server. Dev lead walked around asking everyone who had pulled the code most recently :).

    But at least my dev machine got moved to a 'red socket'. (Due to limited space, this meant my monitor was kicked off.)

  • operagost (unregistered)

    It takes a really impressive painting job to completely clog a receptacle like that. I thought it was lazy when I noticed at a friend's newly purchased house that the previous resident hadn't bothered to remove or mask the switchplates before painting and just got a little around the edges.

  • Strif (unregistered) in reply to (Untitled)

    I don't know what the norm is in other places, but at my workplace the policy is that computers go in red sockets and monitors go in the normal white ones. I'd know, considering my team has moved offices a couple times, moving our own workstations over…

  • Benito van der Zander (google)

    We have a full test on Wednesday.

    All power will be killed for two hours.

  • Joseph Osako (google) in reply to Russell Judge

    Sort of like chicken, then?

  • Joseph Osako (google)

    Hah, these guys are wimps. Try dealing with rolling blackouts in the Financial District every other day when the Wildly Famous Brobdingnagian finance company you are working for doesn't think they need to provide something as expensive as UPCs for mere developers (or web servers, I mean really, we've lasted 150 years without that silly 'Interweb' thing, why should it matter now?).

    Addendum 2016-09-12 11:44: Er, I meant UPSes. Admittedly, the servers would have to be off their connection for the half-hour to hour duration, and if PG&E had given the notice in a timely fashion it would have been fairly minor to deal with, but it still shows a real lack of interest in making their web presence work. Which it didn't, really, not when I worked for them at least.

  • DontLookWeKnowWhatWeAreDoing (unregistered)

    Back in the late '80s or early '90s a large tech manufacturer in the Chicago area had a similar setup: UPS, diesel generators, wired properly and tested properly at regular intervals. When the diesels on the roof kicked in, the whole building shook, and it would continue to shake for a half hour as the current supply of diesel was sucked down. (Fun fact: if you let diesel fuel sit too long, the long-chain hydrocarbons cross-link on their own and you get jelly in your tank.) The contract with the diesel supply firm specified a half-hour response, and they could supply enough fuel to keep things running for days. After a few Midwest thunderstorms tested the system (along with a failing test of the local electric company), everyone felt that things were well under control.

    Then one January a once-in-a-decade snow storm blew in, dropping a couple feet of snow and causing travel restrictions to be imposed while the snowplows tried to make travel safe, or even possible. When the electric grid collapsed under the combined assault of ice, lightning, snow and wind, the call went out to top off the diesel tanks. The fuel supply company replied that the act-of-God exclusion applied, and anyway their trucks were legally barred from moving because of the travel ban. The skeleton admin crew that had been able to get in by cross-country ski or otherwise had 15 minutes to run around and shut down as many servers as they could before the whole datacenter went dark.

  • YellowOnline (nodebb) in reply to DontLookWeKnowWhatWeAreDoing

    Well, your story really seems to be a force majeure situation. I don't think there's anyone to blame in such a situation.

  • jmm (unregistered) in reply to DontLookWeKnowWhatWeAreDoing

    And here I was, waiting to hear how the extra snow load on the roof caused the generators to shake themselves through the roof into the floors below...

  • (Untitled) (unregistered)

    So here's a gem that happened to me. Setting: Indoor office building, no windows. I, the lowly intern.

    Middle of the afternoon, I'm typing away and suddenly everything's black. No sound, no lights, no nothing. Felt like I blinked and forgot to open my eyes again. Main power's gone, and on top of that, so are all our 'red socket' devices. After a minute the red socket items come back on, and my monitor illuminates the room. My PC, however? Not on a red socket. Ditto our staging server.

    Lost a day of work AND had to help debug the slightly-corrupted server. Dev lead walked around asking everyone who had pulled the code most recently :).

    But at least my dev machine got moved to a 'red socket'. (Due to limited space, this meant my monitor was kicked off.)

  • Carl Witthoft (google) in reply to (Untitled)

    So the power outage led to you becoming duplicated?

  • (Untitled) (unregistered)

    So here's a gem that happened to me. Setting: Indoor office building, no windows. I, the lowly intern.

    Middle of the afternoon, I'm typing away and suddenly everything's black. No sound, no lights, no nothing. Felt like I blinked and forgot to open my eyes again. Main power's gone, and on top of that, so are all our 'red socket' devices. After a minute the red socket items come back on, and my monitor illuminates the room. My PC, however? Not on a red socket. Ditto our staging server.

    Lost a day of work AND had to help debug the slightly-corrupted server. Dev lead walked around asking everyone who had pulled the code most recently :).

    But at least my dev machine got moved to a 'red socket'. (Due to limited space, this meant my monitor was kicked off.)

  • Franz (unregistered) in reply to Carl Witthoft

    Triplicated ;)

  • AlexMedia (nodebb)

    The coffee maker was still working despite the blackout, I fail to see the issue here :D

  • DontLookWeKnowWhatWeAreDoing (unregistered) in reply to YellowOnline

    The current 'best practice' is to use natural gas for your generators. 25 years ago I think going with diesel was more cost effective and what got them into the position where the weather blind-sided them.

  • Ron Fox (google) in reply to Joseph Osako

    I liked better the idea that none of the devs had UPC's tattooed on their foreheads... until you went and spoiled that image.

  • Ron Fox (google) in reply to DontLookWeKnowWhatWeAreDoing

    ...and I was waiting for the Hanukah miracle ending where 15 minutes of diesel lasted 8 days.

  • me (unregistered) in reply to operagost

    If you leave some latex paint open for a day or two, most of the water should dry up. Easy enough to goop that on.

  • Developer Dude (unregistered)

    And Curling.

    Close counts in Curling too

  • Herby (unregistered) in reply to AlexMedia

    "The coffee maker was still working despite the blackout, I fail to see the issue here :D"

    This reminds me of the picture of a guy using a Sun Workstation during the height of the "Morris Worm" fiasco. The guy has a bunch of Coke cans littering his desk. My dad saw that and immediately thought of me, as that is my drink of choice, and I consume lots of it (although nowadays it is Coke Zero).

    Of course caffeine is the fuel of programmers, so whether it's coffee or soft drinks, anything will do.

    Note to self: Is the soda machine plugged into the red outlets??

  • Developer Dude (unregistered) in reply to DontLookWeKnowWhatWeAreDoing

    Speaking as a former diesel mech (marine, industrial, gensets), diesel doesn't degrade that way. Generally it can take quite a few years for plain diesel to degrade, and diesel destined to be stored in tanks for gensets usually has additives to prevent any degradation.

    Diesel can come out of a supply tank as a mess that seems congealed, but that is an issue caused either by water in the fuel or by ambient temps; bacteria grow at the water/fuel interface. Biocide is regularly added to large diesel tanks where the fuel is not rotated very often. Diesel can "gel" at temps somewhere below 0°F, depending on its formulation and additives. These issues are possibly what you saw or heard about: not degradation of the fuel itself, but either contamination or severe cold weather.

  • Norman Diamond (unregistered)

    And Curling.

    And file handles. Close counts in file handles.

  • Joseph Osako (google) in reply to Ron Fox

    Wouldn't they be on the back of the neck, like they do with assassins?

  • fred (unregistered)

    Fortunately it was caught just before handover, but I know of a new hospital where they mixed up the grey and the red gas outlets... The contractors had to check every O2, Air, CO2, anaesthetic & whatever outlet in the whole hospital. Then building services had to check every outlet again (in case the contractors had got it wrong again). Then medical had to check every outlet again (to be sure no patients were killed). And how do you check gas outlets? You have to hook up a gas checker and wait for a readout on each and every one....

  • Olivier (unregistered) in reply to suomynona

    When I designed our magnetic door lock system, I made sure the lock was NOT on a backed-up plug, so if the power goes off, the door will open. It may be unsafe for the equipment, but it's certainly safer for the people inside.

    Plus I added a key to cut off the power to the lock (and a panic button, but that's inside).

  • Olivier (unregistered) in reply to Martin

    I had something similar recently. During a weekend, my system went offline. I was hundreds of kilometers away, so I did not go back, but on my way back on Sunday evening, I decided to give it a look.

    Main power was up, but one of the UPSes was down: no more battery, and it wouldn't start up.

    It took me a while to trace back the problem to the main power panel: one circuit breaker had tripped.

    Next morning, when I examined the electric circuit in the server room, I noticed that all the plugs were connected to the same circuit breaker. All had been working fine for years, but I recently added a portable air-conditioner because the room was getting a tad bit on the hot side.

    That was not a problem either, it had worked for weeks.

    But then came a longer power outage; when the mains came back, the UPS started to recharge its batteries, and that was the extra load that tripped the breaker.

    I made sure to hook the UPS to a different circuit.

  • Norman Diamond (unregistered)

    And Curling.

    And fatal attractions. Close counts in fatal attractions.

  • Hans the Great (unregistered)

    Worked once at a hospital where we indeed had three levels of power: Normal, Prioritized and Continuous. We were installing a new computer system for the chemistry laboratory. Management demanded continuous power, as computers were very important. People and technicians from the laboratory stated: if we have a power failure in our hospital, we will have other concerns for running the important parts of the laboratory than a computer. We ended up using normal power for our computers.

    Moral of the story: views on what is important might differ.

  • Paul (unregistered)

    Where I work we have backup generators as well. They regularly test the system via a "total shutdown": some guy from the electrical company comes on site and flips the biggest switch we have, the one at our very own substation. Power goes out and the backup generator starts up and takes over. This takes about 45 seconds. You can imagine this is quite disruptive: half the equipment is powerless for an hour or so, the other half is powerless for 45 seconds, and only the most critical stuff is backed by a UPS. It's important, but it's disruptive. Hence, "regularly" is approximately once every 3 years.

    Non-disruptive tests however are more frequent, particularly, testing to see if the backup generator still works. They do this about every month: Press the ignition button, see if it starts, stabilizes etc, do some measurements, turn it back off. Fancy, right?

    When suddenly: A power outage. It had been ~2.5 years since the last "total shutdown", but the generators had been tested 2 weeks ago, so everything should be fine. Right? Except, every time they test the generator, its charger is still plugged in. Semi-dead battery + active charger == successful startup of the generator. Without the charger, not so much...

    Ever since, the above procedure has been altered slightly: Disconnect charger, press the ignition button, see if it starts, etc...

  • (nodebb) in reply to Hans the Great

    People and technicians from the laboratory stated

    I love how you word those to be exclusive categories! Seems right in my experience too…

  • (nodebb) in reply to YellowOnline

    Nobody except the cheap bastard who decided that half an hour's worth of fuel against a supplier's half-hour response time was fine. Unless the supplier was literally a neighbor (seems it wasn't), it's easy to see how just a couple of red lights can screw up that plan. And that it's the supplier's insurance that has to pay for damages later doesn't help you all that much in that situation. A 2000 l tank, something many people have in their basement, is enough to run a 2.5 MW engine for hours.
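
    As a rough sanity check of that last figure (assuming a typical full-load consumption of roughly 0.25 l per kWh for a diesel genset, which is an assumption rather than a number from this thread):

        # Rough runtime estimate for the "2000 l tank, 2.5 MW engine" claim above.
        # The specific fuel consumption is an assumed typical value.

        TANK_LITRES = 2000.0
        ENGINE_KW = 2500.0        # 2.5 MW
        LITRES_PER_KWH = 0.25     # assumed full-load consumption of a diesel genset

        litres_per_hour = ENGINE_KW * LITRES_PER_KWH    # ~625 l/h
        runtime_hours = TANK_LITRES / litres_per_hour   # ~3.2 h

        print(f"~{litres_per_hour:.0f} l/h -> ~{runtime_hours:.1f} hours of runtime")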

  • Norman Diamond (unregistered)

    Unless the supplier was literally a neighbor

    Close counts in suppliers too.

  • Mark (unregistered) in reply to fred

    Bankstown-Lidcombe Hospital in NSW, Australia had a similar problem a couple of months ago, except it wasn't detected until 1 newborn baby died and another was left with brain damage.

    Nitrous oxide and oxygen lines mixed up.
