Admin
This was a nice read. I've spent the past year explaining that our bottleneck was disk I/O only to have people suggest extra disk I/O to reduce the computational costs of the project. Of course, the computation must be causing the issue. It's not like 90% of the time is spent in I/O with the CPU at 15%...
Admin
Many moons ago I was a lowly junior operator at an ICL site. Despite being the only trained programmer in the building most of the night, I wasn't allowed to touch the hallowed console as I might just know what to do with it (stupid auditors). One fateful night the shift manager came in and typed (IIRC) CHCP jobnumber 100 (I forget the proper VME syntax), which boosted one particular low-priority job to 100% CPU because the programmer wanted to go home. Quietly I said 'I really think you should put that back'. Before he could say anything, the system coredumped and set every single bloody printer in the room spitting out core. At least we got a good chunk of the 14 hours of overtime it took to do a restore.
Admin
A millibit isn't used as a direct unit. It's used in "millibits per second", the most convenient unit for measuring disk I/O on PDP-11s. Lack of millibits-per-second caused the original problem.
Admin
You gather up 1000 microbits of course. Now to get a microbit you will need to start collecting nanobits, but that is another story.
Admin
Good god, I've had to put up with idiots that don't understand priority.
Recently it was a DBA, only app on the multi-processor UNIX machine. He wanted me to use the nice command on his process because he needed it done sooner. His thing was only using 80% of one CPU.
Or the FEA users who, for some stupid reason, restricted their NASTRAN jobs on a VAX to use very little memory. So the program was constantly doing disk IO to load and ship out parts of the big matrix it was working on. "Hey, my job ran for 8 hours, you need to increase its priority." Then I showed them charts of CPU usage while their job was running; it peaked at 25%. "But if you doubled my priority it would peak at 50%."
Admin
I've upped my priority, now up yours!
Admin
I'm glad there are some new authors, but it is kind of annoying when every sentence is put on a new line.
Admin
What if the author's dictating for a quadriplegic and wanted to accurately portray the pauses in speech? How insensitive of you...
Admin
Actually, this is a very good representation of the argument against kibi-, mebi-, gibi-, etc.
You can't have divisions of a bit. A millibit can't exist. A bit is a Binary digIT. A digit is an indivisible unit. The concept it represents may or may not be, but it itself is indivisible. Thus, half of the SI prefixes for bits don't apply. Following that, NONE of the SI prefixes can apply, since they're a set - all or nothing. So the prefixes used by bits are NOT SI prefixes. They simply mimic SI prefixes in context.
This proves that we don't need kibi-, mebi-, or gibi- or any of their braindead ilk. Kilo-, mega-, and giga- will do just fine in the context of bits, even with their non-base-10 multipliers.
And bytes follow this by extension, since they're defined in terms of bits.
In conclusion, those that continue to push for the use of kibi-, mebi-, gibi-, etc. can die in a fire.
Admin
I don't get it, why would it be different for memory and storage?
Admin
“a control panel straight out of straight out of Star Trek” should be “a control panel straight out of Star Trek”
http://www.emendapp.com/sites/thedailywtf.com/edits/0
Admin
No. Technically, bits are units of data that exist in one of exactly two possible states. The concept of low/high voltages is related to one method of representing bits. It's an implementation detail, unrelated to the general concept of a bit.
Admin
I can completely relate. In one company I worked at, I had a boss who was put in charge of my department (and I was a department of one) simply because he was friends with the CEO. He was one of those guys who thought that any problem was solvable by just working harder, and that if he spent long enough at it, he could do anything, which, based on his track record, was not even close to reality.
This demotivator:
"When you earnestly believe you can compensate for a lack of skill by doubling your efforts, there's no end to what you can't do"
described him to a T, and even worse, I had it on my wall in my office and he didn't understand what I was getting at by putting it up. He thought it had a typo.
Admin
Get off my lawn. Better yet, get off my planet, and take your bloatware with you.
Admin
Well, the bit started life as a binary digit, but fractions of bits are certainly useful these days. Try looking at http://en.wikipedia.org/wiki/Entropy_(information_theory) for more info.
Admin
That damn well better have been posted via Lynx. Me, I hand write all my HTTP requests and send them via cURL.
Admin
Stupidity and ignorance abound everywhere; they're not restricted to just a few companies. What's the point of jumping ship to another pass-the-duct-tape-and-paper-clip outfit, especially if it means giving up decent benefits? In some cases it's just easier to shut up, play the same game, and forget about work on the weekends instead of trying to edumacate the masses & cretins.
Admin
Did someone forget to adjust the speedup loop? http://thedailywtf.com/Articles/The-Speedup-Loop.aspx
Admin
Of course, that's also not really a problem...
Admin
You mean telnet, surely? Why else would the line terminator in the standard be CRLF instead of just LF?
Admin
Disk drive vendors can suck eggs and die. They're the reason for this push to define "MiB" etc. since they stole the original meaning of "MB".
Admin
Yeah, I worked with a guy who had a boss like this. The boss wanted "high uptime" on the UNIX boxes: rebooting was never an option, and he saw no reason not to have uptimes of a year or more. Eventually, the guy I worked with replaced the "uptime" command with a script that showed "uptime + 365 days" (or something like that), and pointed an alias in the boss's login at the script. Of course, this didn't affect /proc/uptime or the uptime reported by other tools, like top, but the boss was all about "uptime."
Admin
Admin
I once had a boss that liked lights and motion. He thought those things meant the equipment was functioning properly. Well, after I had successfully pleaded for a RAID-5 NAS unit with the ability to store months of nightly backups, I had it installed and operational a few days later. Enter the boss.
"So what does this thing do again?"
"It stores our nightly backups. We still back up mission-critical data to tape for redundancy. But with this little box I can keep backups much longer and restore them quicker if there's a failure. It's a lot easier to log in to the NAS and restore the missing data than it is to retrieve the tapes from the fire safe and go through the whole backup looking for the missing data."
"Oh. So what's it doing right now?"
Seeing as how I had worked with this man for 4 years, I knew what was coming. Thinking quickly, I turned to a terminal and did the first thing that came to mind: I created a batch file with a never-ending loop that copied a small JPG file from the web server to a share on the RAID, deleted it, and copied it again once every minute. The network and drive activity lights immediately responded.
"OK I see it now. Well, keep me posted. Good work."
I changed jobs soon after, but as far as I know, that batch file runs 24/7 to this day.
Admin
Whoever thought of using kilo and mega for those values, just because they were close to what those prefixes really represent, was an idiot. And probably American. Using commonly accepted and understood terminology to represent something completely different is a much better example of you-need-to-die-in-a-fire. Changing the names of the values that don't conform is the only way to fix this mistake.
And I'm sure many people will appreciate being able to double something like allocated memory without having to pull out a calculator and hit "=" over and over until the number reaches something that is just over what common sense tells them it should be.
Also, just because the other end of the spectrum (the fractions) doesn't apply under normal circumstances, doesn't mean the entire system doesn't work. Suggesting it does means you're either a troll, or too stupid to figure out that you just don't use them. It's like someone gives you a thousand matches and asks you to light a candle. You don't use all of them, just one.
In conclusion, those that continue to push for the use of the values indicated by kibi-, mebi-, gibi-, etc. can do so at their own inconvenience.
Admin
This is all too familiar. I had an engineer on a PDP-11/70 running DEC RSX-11M who raised the priority of his big batch job to higher than the system console. It was already getting all the CPU, and now we lost access to the system until the job finished a couple hours later. Rebooting wasn't an option.
Admin
Can somebody just frickin' explain to me what a frickin' "ftfy" is?
Admin
Amen to that.
Megabyte = 2^20. Gigabyte = 2^30.
See how simple that is? Not my fault some people are all butthurt over us CS guys coopting SI units.
Admin
http://www.internetslang.com/FTFY.asp
Admin
It isn't. Some HD marketing guys decided to start lying about it so they could claim a larger number on the box and everyone else went along so they could do so too.
Admin
It amazes me how many people don't just do that when they don't understand something...
Admin
Didn't you know a gigabyte was actually a bit powered by a gigavolt?
Admin
Maybe bits are like atoms, and we just have yet to split them.
Admin
ftfy
Admin
But once we got past the demo and the install and started using it, we discovered that when two people entered a command at about the same time, the first one would run to completion before the second one even said "Processing". So, naturally, with all 8 users doing their jobs as fast as they could, the system was "down" 95% of the time. Because, of course, if it didn't say "Processing" within 1/3 of a second, they'd enter the command again. Five times, just in case.
The supervisor insisted the system was "down", and nothing the vendor reps could explain would convince her otherwise. Luckily one of the refrigerators had 8 LEDs, you know, the ones you use when you are entering the bootstrap code one byte at a time, with toggle switches? You don't? Well it doesn't really matter to the story, infant.
Anyway, one of the vendor's tech guys came up with a tight assembler loop that would make those LEDs light up from left to right and then right to left, like a Cylon scanning. What? Cylon? Oh cheese!
Now, as long as Mrs. Overpaid Technophobic Supervisor could see the blinky-lights, she knew the system was not "down" no matter how slow it got.
The really nice part about this was the assembly routine was so low in the system that even when the application did crash, the lights kept blinking! I'm afraid we permanently bent Mrs. Supervisor's mind shortly after that...
Admin
Ease of conversion between submultiples of units does not outweigh the arbitrariness of scale of those units. Imperial units are based on day-to-day activities on a human scale. Why would you want to fix something that's not broken?
Admin
Well, since a binary digit is discrete and cannot actually be less than one, a fraction of a binary digit actually represents the uncertainty of the value. If you have 99/100ths of a bit, that means that you are only 99% certain of the value of that bit.
Admin
[quote user="DudeWaitWhat"][quote user="RandomUser423682"]
Actually, this is a very good representation of the argument against kibi-, mebi-, gibi-, etc.
You can't have divisions of a bit. A millibit can't exist. A bit is a Binary digIT. A digit is an indivisible unit. The concept it represents may or may not be, but it itself is indivisible.[/quote]
That comment is the real WTF. It is perfectly possible to have divisions of a bit. Take a look at binary decision tree construction (or information theory) for examples of the use of fractions of a bit, or even millibits.
Admin
Beyond that, it's actually standard practice in embedded systems, because there's no file system to store truthful information.
Admin
Process priority (n) pra'w-sess pry-ohr-it'tee : A tool for causing process starvation on an otherwise smoothly-running operating system.
Admin
The picture reminds me of the good old days when computer had cats rather than mice.
Admin
A bit is the amount of information gained by observing the outcome of a uniformly distributed binary random variable. If the variable is NOT uniformly distributed, then the information gained by observing its value is, on average, less than one bit per observation.
Information is defined as the negative logarithm of likelihood. When the base of this logarithm happens to be 2, the resulting unit is called a "bit." If it's base e, it's called a "nat."
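The fractional-bit idea above is easy to see numerically. Here's a minimal sketch of the Shannon entropy of a biased coin (the function name `entropy_bits` is mine, not from the thread):

```python
import math

def entropy_bits(p):
    """Shannon entropy of a coin with P(heads) = p, in bits per flip."""
    if p in (0.0, 1.0):
        return 0.0  # outcome is certain: zero information gained
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# A fair coin carries exactly 1 bit per flip...
print(entropy_bits(0.5))   # 1.0
# ...but a heavily biased coin carries a fraction of a bit:
print(entropy_bits(0.99))  # ~0.0808, i.e. about 81 millibits per flip
```

So millibits do show up as soon as the "binary random variable" is far from uniform, even though no single observation ever yields a partial symbol.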
Admin
Memory is manufactured on silicon, where it almost always makes sense to manufacture in power-of-two units. Further, CPUs logically address memory using bits to count, which makes a power-of-two range more sensible than an arbitrary one based on the number of fingers humans have. Hence 1024.
Storage (and bandwidth for that matter) has no such manufacturing nor addressing constraints, so 1000 is used because it's easier for us 10-fingered people to understand.
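For concreteness, here's a quick sketch of the decimal-vs-binary gap the thread keeps arguing about (the variable names are my own):

```python
# Decimal (SI) vs binary interpretations of "giga"
GB = 10**9     # what the drive box means by a gigabyte
GiB = 2**30    # what most operating systems report

marketed = 500 * GB           # a "500 GB" drive as sold
reported = marketed / GiB     # the same capacity in binary units
print(f"{reported:.2f} GiB")  # roughly 465.66 -- the famous "missing" space
```

The gap compounds with each prefix: about 2.4% at kilo (1024 vs 1000), about 4.9% at mega, and about 7.4% at giga, which is why the discrepancy on a big drive feels so large.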