• (cs) in reply to snoofle
    snoofle:
    On Windows (at least in XP-sp2): start | Programs |Accessories | Accessibility | Magnifier, then set magnification level to 9 - really easy to count pixels w/o squinting!

    Windows-key U. The result isn't as convenient since it brings up another dialog, but the increase in speed makes up for it.

  • Top Cod3r (unregistered)

    Maybe if you are some low-level guru you would do your timing with a profiling tool, but when you work on enterprise software meant for corporate users, it's often better to use a stopwatch because it measures real-world performance, not some theoretical performance benchmark that a profiling tool gives you.

    I don't really see the WTF here, unless they are trying to measure CPU cycles using a stopwatch, but they are not.

  • (cs) in reply to Sunstorm
    Sunstorm:
    This reminds me of the time when I used to put my face against the monitor and squint really hard to count the number of pixels of something drawn on the screen. Then I discovered printscreen...

    Didn't that kind of hurt your eyes?

  • Zygo (unregistered) in reply to sirhegel
    sirhegel:
    This reminds me of the time when I used to put my face against the monitor and squint really hard to count the number of pixels of something drawn on the screen. Then I discovered printscreen...

    ... And this reminds me of a colleague of mine, who had a dead pixel in his laptop's LCD screen. He wanted to complain, so he hit printscreen, verified that the dead pixel was indeed visible in the screenshot, and sent it via email to the laptop manufacturer.

    That's either a dead memory bit in the frame buffer RAM, or some kind of <insert ethnic group here> joke...

  • Zygo (unregistered) in reply to BS (those are my real initials!)
    BS (those are my real initials!):
    I once applied to work for MITRE. They invited me onsite, flew me in, set me up in an excellent hotel, and gave me a tour of the place.

    I never heard from them again. They spent over $1000 on my plane tickets (I printed a receipt at the gate), but they couldn't spend 39 cents on a rejection form letter.

    Obviously you were a total loss, and they were hoping you wouldn't cost them any more money if you just went away...

    BS (those are my real initials!):
    Never mind that Travelocity/Expedia/Orbitz/etc listed identical itineraries at a quarter the cost.

    ...or they were looking for someone to fix their internets.

  • Jon (unregistered) in reply to rbowes
    rbowes:
    Not to mention the plural "youse", as in "hey youse guys!"
    I figure that distinguishing singular and plural "you" will never work out—the last time English had such a distinction, it went out of fashion. It was a very long time ago; I doubt thou rememberest it.
  • Shakespeare (unregistered) in reply to Jon
    Jon:
    rbowes:
    Not to mention the plural "youse", as in "hey youse guys!"
    I figure that distinguishing singular and plural "you" will never work out—the last time English had such a distinction, it went out of fashion. It was a very long time ago; I doubt thou rememberest it.

    Thou was as much a formal you as a plural you. Romeo wherefore art thou? I'm pretty sure Romeo was only one person...

  • Carl T (unregistered) in reply to Shakespeare
    Shakespeare:
    Jon:
    rbowes:
    Not to mention the plural "youse", as in "hey youse guys!"
    I figure that distinguishing singular and plural "you" will never work out—the last time English had such a distinction, it went out of fashion. It was a very long time ago; I doubt thou rememberest it.

    Thou was as much a formal you as a plural you. Romeo wherefore art thou? I'm pretty sure Romeo was only one person...

    And that was the biggest load of hyena offal so far today. Not only do you seem to have no real clue as to what the difference between thou and you is, or how, when and why the use of thou/thee/thy/thine has receded, but you also misquote poor Bill Shakespeare. While claiming to be him, no less.

  • Pol (unregistered)

    there are no words...

    CAPTCHA: darwin...evolutionary!

  • (cs)

    I've seen stopwatches used to time stuff in my previous workplace. This only worked effectively because it took almost 2 minutes to load and render a data input form.

    We eventually got it down to 50 seconds, and the management were pleased.

    buh.

  • (cs)

    I once had to time how long an SMS took to get from a remote device into our database to the nearest second. I said it'd take 7 seconds. This wasn't "proven" enough though. So I wrote software to automate the sending of requests, and time how long it took. It was GPS timestamped at the device end, so I also wrote some software to constantly sync my PC's clock with a GPS receiver.

    I ran the tests for a week and then gave them the result.

    It was 7 seconds.
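
    A minimal sketch of that kind of GPS-anchored latency check, assuming both the device timestamp and the database arrival time come from GPS-synced clocks (the timestamps below are purely illustrative):

        from datetime import datetime, timezone

        def one_way_latency(sent_at_gps: datetime, received_at: datetime) -> float:
            """Latency in seconds; only meaningful if both clocks track GPS time."""
            return (received_at - sent_at_gps).total_seconds()

        # Hypothetical sample: timestamp applied at the device vs. arrival time in the DB.
        sent = datetime(2007, 1, 1, 12, 0, 0, tzinfo=timezone.utc)
        received = datetime(2007, 1, 1, 12, 0, 7, tzinfo=timezone.utc)
        print(one_way_latency(sent, received))  # 7.0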

  • (cs) in reply to Carl T

    I'm sure William Shakespeare would be delighted to have someone refer to him as "Bill". </sarcasm>

  • nobody (unregistered)

    Reminds me of when I worked at a different government site, and there was a particularly incompetent government EE. He didn't know, for example, that you cannot pass the same signal to the data and clock inputs on 7400 TTL chips and expect consistent results; the data needed time to settle before the clock could be signaled, for those of you who know EE of that era (somewhat after clay tablets and before the PC.) He was laid off in a RIF - reduction in force - and went to MITRE. My opinion of MITRE dropped drastically when I heard that.

  • Anonymous (unregistered) in reply to BS (those are my real initials!)
    BS (those are my real initials!):
    I once applied to work for MITRE. They invited me onsite, flew me in, set me up in an excellent hotel, and gave me a tour of the place.

    I never heard from them again. They spent over $1000 on my plane tickets (I printed a receipt at the gate), but they couldn't spend 39 cents on a rejection form letter.

    Never mind that Travelocity/Expedia/Orbitz/etc listed identical itineraries at a quarter the cost.

    Trust me, you lucked out. I'd say more, but, well, I'm not quite sure how much I could say without getting into trouble. The problem with government contracts is that they tend to classify seemingly random things...

    Let's put it this way. In all my dealings with MITRE people, I've never run into anyone competent. Ever. If I'm ever in a position to affect hiring and I run into anyone with MITRE on their resumé, it's heading straight to the circular file.

    I honestly wish I could give specific examples, but I'll have to stick with generalities since specific examples could get me sued or, worse, arrested. I'll just sum it up with: Java and XML and webservices for everything - up to, and including, things that require real-time responses and large amounts of binary data.

    It wouldn't be so bad, except they don't actually implement these systems, just help the government create contract requirements.

    So I fully believe that a MITRE employee could come up with a plan to use stopwatches to "accurately" measure low-level I/O hardware times. Not user times, but actual hardware performance times. In fact, I think I've seen something similar - but can't give out details, sorry.

    Posted anonymously for obvious reasons.

  • Joe (unregistered) in reply to The alimentary canal
    The alimentary canal:
    Were there seriously no corpses laying around?

    I hate to be the one telling you this, but... intestines does not do a terribly good job at moving electrons.

    Technically intestines, like other internal body parts, are filled with water. So that'd make them a conductor. Hmmm...

    Next time I'm out of cable, I'm squeezing the meat out of some sausages and using the pig intestines they're made in.

  • (cs) in reply to Joe
    Joe:
    The alimentary canal:
    Were there seriously no corpses laying around?

    I hate to be the one telling you this, but... intestines does not do a terribly good job at moving electrons.

    Technically intestines, like other internal body parts, are filled with water. So that'd make them a conductor.

    It's also worth noting that neurons are excellent conductors of electricity, since that's exactly what they're for.

  • (cs) in reply to Someone You Know

    Neurons normally slow the signal down a lot, though - down to some hundreds of meters per second. Which is a feature, not a bug, but the reason escapes me at the moment.

  • (cs) in reply to valerion
    valerion:
    I once had to time how long an SMS took to get from a remote device into our database to the nearest second. I said it'd take 7 seconds. This wasn't "proven" enough though. So I wrote software to automate the sending of requests, and time how long it took. It was GPS timestamped at the device end, so I also wrote some software to constantly sync my PC's clock with a GPS receiver.

    I ran the tests for a week and then gave them the result.

    It was 7 seconds.

    I once worked on an application that scanned a fingerprint and printed it on an ID card, and I was asked (by the VP of Sales) to verify that the printed image was the same size as the actual fingerprint. The scanner produced images that were 400 x 600 @ 500 PPI, which is 0.8" x 1.2". I reasoned that if the printed area on the card was also 0.8" x 1.2", it should be fine.

    This wasn't "proven" enough. I suggested that I would create a 400 x 600 @ 500 PPI image in Photoshop, draw some tick marks in the right places, print it out on the card, and measure it with a ruler. This also wasn't enough, since it didn't prove that the scanner actually scanned at 500 PPI (never mind that fingerprint scanners have rigorous standards and test procedures to verify this kind of thing).

    Then we decided to scan a ruler with the fingerprint scanner, print the image on the card, then measure it with the same ruler. Now, most fingerprint scanners use optical technology, but this one happened to use ultrasound, so when we scanned the ruler, the image was blank. The VP said "Try it again, and I'll push down harder on the ruler". Tried again, still nothing. "Try again, and I'll push down even harder". He wasn't kidding, because the platen snapped, and the ultrasonic goo inside the scanner splattered all over us and my desk.

    We doubled over in laughter for a while, then decided to go with my earlier Photoshop idea.
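
    A quick sketch of the PPI arithmetic above - image dimensions divided by the resolution give the physical print size (the function name is just for illustration):

        def printed_size_inches(width_px, height_px, ppi):
            # Physical size of an image printed at its native resolution.
            return width_px / ppi, height_px / ppi

        # The scanner in the story produced 400 x 600 images at 500 PPI:
        w, h = printed_size_inches(400, 600, 500)
        print(f"{w}\" x {h}\"")  # 0.8" x 1.2"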

  • anonymous (unregistered)

    Your tax dollars at work.

  • (cs)

    This isn't that much of a WTF, seeing as it was 1986, and in a SCIF, so external timing software (not part of the AIS) would be a no-no.

    Stopwatches it is. I mean, if it's the kind of thing that would make you look at your watch in the first place while it ran in operation, then I imagine the accuracy and precision of that testing method are appropriate to the scale.

    A video camera is definitely more practical. BUT WAIT... recording devices like cameras make the DoD nervous, so unless they had one lying around... no go there as well.

    Another anonymous poster mentioned MITRE, Java, and XML. Let me just say you were probably working with the intelligence or civilian-interfacing parts of MITRE?

    Yeah, I know. They mean well but since they don't put the stuff into practice it's all lovey-dovey pie-in-the-sky crap when it comes to enterprise systems. Ugh. I have to work with these people. (disclaimer)

    The security people are where all the real talent is. Ever heard of SELinux? ...

  • Disbeliever (unregistered) in reply to kirchhoff
    kirchhoff:
    This isn't that much of a WTF, seeing as it was 1986, and in a SCIF, so external timing software (not part of the AIS) would be a no no.
    Where the bloody hell did you pull that information from? The article makes no mention of it. Let me guess, you're a MITRE senior engineer?
    kirchhoff:
    The security people are where all the real talent is. Ever heard of SELinux? ...
    Yes - it's an NSA-sponsored set of kernel patches (now part of the mainline kernel) that improve Linux security. According to the SELinux page, the extent of MITRE's involvement with the SELinux project covered Apache, Sendmail, and cron. Which, barring the type of stupidity that uses stopwatches to measure times on computers that have real-time clocks, are not part of the kernel.
  • Arantius (unregistered)

    Hilarious! I once wrote, for much less important use, a little JavaScript-based stopwatch: http://tools.arantius.com/stopwatch

    It got somewhat popular. Then, I got an email. From a government contractor. Who wanted permission to use that tool in part of their project.

    They weren't allowed to add other software to the system. This was the best he could do. Ouch.

  • ha ha ha (unregistered)

    that's what you get for letting a woman be in management

  • Joce (unregistered)

    It's called "eliminating points of failure".

    The only points of failure in this test are the stopwatch and the operator. If you start writing software, well, this is WTF so we know what could happen...

  • (cs) in reply to Zygo
    Zygo:
    sirhegel:
    This reminds me of the time when I used to put my face against the monitor and squint really hard to count the number of pixels of something drawn on the screen. Then I discovered printscreen...

    ... And this reminds me of a colleague of mine, who had a dead pixel in his laptop's LCD screen. He wanted to complain, so he hit printscreen, verified that the dead pixel was indeed visible in the screenshot, and sent it via email to the laptop manufacturer.

    That's either a dead memory bit in the frame buffer RAM, or some kind of <insert ethnic group here> joke...

    It's got nothing to do with anything other than the screen - the image of the faulty screen was viewed on the faulty screen.

  • JustSomeone (unregistered) in reply to nobody
    nobody:
    He didn't know, for example, that you cannot pass the same signal to the data and clock inputs on 7400 TTL chips and expect consistent results; the data needed time to settle before the clock could be signaled, for those of you who know EE of that era (somewhat after clay tablets and before the PC.)

    PCs had TTL components too, and regardless of the technology involved, it should be common sense to anyone who understands digital circuit design.

  • ThingGuy McGuyThing (unregistered) in reply to The alimentary canal
    The alimentary canal:
    Were there seriously no corpses laying around?

    I hate to be the one telling you this, but... intestines does not do a terribly good job at moving electrons.

    Are you kidding? Billions move through mine every day!

  • Mr Steve (unregistered) in reply to sirhegel

    LOL, that's truly cabbage

    That reminds me of the time we had a user email us saying that nobody could read her emails. She had the background color + text color both set to black. wtf?

  • (cs)

    Wouldn't it be better to hotwire the read/write LED onto some custom breadboard and connect that via the serial or parallel port to a separate benchmarking machine?

    Assuming, of course, that all models had read/write LEDs.

  • JTK (unregistered)

    I was working on a project for which we were getting a lot of complaints about the lag time between the scan of a barcode and the printing of a label. We insisted it was within the spec they gave us, but the users still insisted it was "too slow".

    We proved the point by using a laptop microphone to record from the beep of the scanner to the clatter of the printer. We then displayed the wave files in a sound editor to measure the elapsed time between the two sounds.

    They couldn't argue with that, so they changed the spec.
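
    A rough sketch of that measurement, assuming a 16-bit PCM recording and a simple amplitude threshold to pick out the two sounds (the filename and threshold are made up):

        import wave

        import numpy as np

        def event_times(path, threshold=0.5, min_gap_s=0.5):
            # Times (in seconds) at which the recording first crosses `threshold`,
            # ignoring re-triggers within `min_gap_s` of the previous event.
            # Assumes 16-bit PCM, as most sound recorders produce.
            w = wave.open(path, "rb")
            rate = w.getframerate()
            samples = np.frombuffer(w.readframes(w.getnframes()), dtype=np.int16)
            w.close()
            level = np.abs(samples.astype(np.float32)) / 32768.0
            events, last = [], -min_gap_s
            for i in np.flatnonzero(level > threshold):
                t = i / rate
                if t - last >= min_gap_s:
                    events.append(float(t))
                    last = t
            return events

        # times = event_times("scan_to_print.wav")   # hypothetical recording
        # print(times[1] - times[0])                 # seconds from beep to printer clatter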

  • martinus (unregistered)

    I had a similar problem in school. We used a microcontroller and had to create a busy-wait loop that waited exactly 2 seconds. That's simple, but our teacher was too dumb to measure it correctly. She used a stopwatch: she counted to 3, and at exactly 3 she pressed the stopwatch button and started our timer at the same time. Then she waited until the light went out after the 2 seconds and pressed the stop button. Of course, with this method she measured 2 seconds plus her reaction time, which was about half a second. She repeated that mistake every time and then claimed that our timer was not working.
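
    A minimal sketch of such a busy-wait delay, with Python's monotonic clock standing in for the cycle-counting loop you would actually tune on a microcontroller:

        import time

        def busy_wait(seconds):
            # Spin until the requested interval has elapsed. On a microcontroller
            # this would be a counted loop calibrated to the clock frequency.
            end = time.monotonic() + seconds
            while time.monotonic() < end:
                pass

        start = time.monotonic()
        busy_wait(2.0)
        print(f"elapsed: {time.monotonic() - start:.3f} s")  # ~2.000 s
        # A human stopping a stopwatch by hand adds their reaction time on top,
        # which is the ~0.5 second error described above.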

  • Anthony (unregistered) in reply to Sunstorm
    Sunstorm:
    This reminds me of the time when I used to put my face against the monitor and squint really hard to count the number of pixels of something drawn on the screen. Then I discovered printscreen...

    That reminds me of watching a 40 GB HD defrag on Windows 98 SE in that mode that shows you the little squares. Some quick counting and math let me determine that each line showed 25 squares. I never did count how many lines there were... but it was definitely moving 25 squares at a time.

    Captcha: cognac

  • Paolo G (unregistered) in reply to Shakespeare
    Shakespeare:
    Romeo wherefore art thou?

    Not in any copy of Romeo and Juliet I've ever seen.

    Try "O Romeo, Romeo! wherefore art thou Romeo?". Didst thou, mayhap, fall into the trap of thinking that "wherefore" is just a fancy way of saying "where"?

  • Anonymoose (unregistered) in reply to rbowes

    Not all ribbon cables are made the same :D If it was for some sort of instrumentation or factory automation equipment, I'm sure the cables weren't readily available.

    I have stuff that runs on a standard DB15 connector, but the pinout inside is all custom depending on the application.

  • ELIZA (unregistered) in reply to Paolo G
    Paolo G:
    Shakespeare:
    Romeo wherefore art thou?

    Not in any copy of Romeo and Juliet I've ever seen.

    Try "O Romeo, Romeo! wherefore art thou Romeo?". Didst thou, mayhap, fall into the trap of thinking that "wherefore" is just a fancy way of saying "where"?

    It means, roughly, "why are you Romeo?". In my dictionary it is essentially the reverse of the construction "therefore": given "X, therefore Y", asking "X, therefore what?" asks what X implies, while "Wherefore Y?" asks what implies Y. It could have been a contraction of either "what" or "where" coupled with "therefore", but the dictionary states it was formed directly from "where" and "fore".

    Speaking of old language, "man" used to mean both male and female humans, who were respectively "were" (a root of werewolf) and "wyf" (now "wife"; "woman" is a corruption of "wyf of a man"). Also, "nigh" is not simply Tudor/Stuart English for "near"; IIRC it only meant temporal nearness.

Leave a comment on “Sir, Seriously, Sir?”
