Admin
How long ago was it again that someone claimed 640 KB would be more than enough?
And how long ago was it that people considered a gigabyte of hard disk space enormous?
Let's see... Ten years ago, a megabyte was big. Five years ago, a gigabyte was big. Nowadays, a terabyte is big. Following this pattern, a petabyte will be big in 5 years, an exabyte in 10 years, a zettabyte in 15 years, and a yottabyte in 20 years. But in 25 years, no one will be surprised by hard disks of 1000 or more yottabytes... [:D]
So in 25 years, when all other applications are crashing because they didn't keep these huge sizes in mind, this application will still be running perfectly well... [;)]
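A toy Python sketch of that extrapolation (purely for fun; the 1000x-every-5-years growth rate is this post's assumption, not a real forecast):

    # Toy projection: assume "big" grows by a factor of 1000 every 5 years.
    units = ['terabyte', 'petabyte', 'exabyte', 'zettabyte', 'yottabyte']
    for step, unit in enumerate(units):
        print('In %2d years, a %s is big.' % (step * 5, unit))
    # One more step gives 1000 yottabytes in 25 years.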
Admin
Actually, the 100th thing happened - someone made a binary joke, someone else made another binary joke in reference to the original joke, and then someone else missed that joke and tried to make another binary joke about missing a binary joke.
And finally, the second person made a binary joke about the whole thing. :)
Admin
A mole is defined as the number of atoms in 12 grams of carbon-12. So a mole of hydrogen molecules has a mass of about two grams. At 1 bit per molecule, 1000 yottabytes (taking a yottabyte as 2^80 bytes) corresponds to about 16,060 moles. At 2 grams per mole, that's slightly over 32 kg of hydrogen.
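A minimal sanity check of that arithmetic in Python (assuming binary yottabytes and Avogadro's number, about 6.022e23):

    AVOGADRO = 6.022e23           # molecules per mole
    bits = 1000 * 2**80 * 8       # 1000 yottabytes, 8 bits per byte
    moles = bits / AVOGADRO       # one hydrogen molecule stores one bit
    print('%.0f moles' % moles)                   # ~16060 moles
    print('%.1f kg of H2' % (moles * 2 / 1000))   # ~32.1 kg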
Admin
It's not necessarily the case that the previous trend will continue. It reminds me of a Simpsons episode where Disco Stu has a chart of disco record sales up to 1979. Many of the biggest errors are made by preparing for the past, that is, looking at what happened before and assuming it will happen again.
The reality is that 5 megabytes was never that much data; there were huge banks of tape drives long ago. In the past we were memory-poor. We are reaching a point where graphics resolution is as good as the human eye can appreciate. I bought a 100 GB hard drive a while back, and I don't expect to fill it up any time soon. When I had a 500 MB hard drive, I had to remove applications in order to install a new one.
Admin
I agree.
However, the original code always produces the same output for
1024 yottabytes
1048576 yottabytes
1073741824 yottabytes
...
1024**n yottabytes
And that's a WTF, even if it actually is an easter egg.
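The article's snippet isn't quoted in this thread, but a hypothetical Python reconstruction of a bug with exactly that symptom (this is my guess at the shape of it, not the original code) might be:

    # Hypothetical reconstruction -- NOT the original article's code.
    # The loop keeps dividing past the last symbol and only clamps the
    # index afterwards, so 1 YB, 1024 YB, 1024**2 YB, ... all print alike.
    def buggy_fmt(val):
        symbols = ('B', 'kB', 'MB', 'GB', 'TB', 'PB', 'EB', 'ZB', 'YB')
        i = 0
        while val >= 1024:
            val /= 1024.0
            i += 1
        i = min(i, len(symbols) - 1)   # the clamp that hides the overflow
        return '%.2f %s' % (val, symbols[i])

    print(buggy_fmt(2**80))           # 1.00 YB
    print(buggy_fmt(1024 * 2**80))    # 1.00 YB -- same output, the WTF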
cu
Yup, not denying it, never have. However, you were the first person to actually address that issue; everyone else was talking about 999 instead of 1024, and asking why len(symbol)-1 isn't a constant. Neither of those things is a bug.
Admin
I'm not sure why everyone is assuming that we need at least one atom per bit. A qubit (quantum bit) can hold 2 values at the same time. A qubyte can hold 256 values at once. It doesn't take that many qubits to hold 1000 YB worth of data.
Admin
Programs expand to fill the available memory. Period.
Admin
Wow, that was so convincing. I'll have to change my view. 'Period' really was a great way to back up your assertion.
Do you think that Pac-Man on the Atari home system had the low-grade graphics it did because they had no better ideas? Or do you think that they limited the graphics because of the constraints of the system? The desire for more memory has increased with capacity. The desire was already there, the capacity wasn't. This is why we still have a lot of analog systems that outperform digital systems. Digital systems have a lot of advantages over analog, but they've been unable to match what we want. This is starting to change. Digital cameras are now able to meet the needs of what we want, so 35mm cameras are starting to disappear. It's not that the capacity for high resolution came about and then we decided we wanted higher-resolution cameras. The desire was already there, and the technology allowed digital cameras to fulfill it.
The point I'm making is not that there's some knowable limit to the need for memory. It's that form follows function. People want more memory, and that drives the technology. It's not the other way around. The extra capacity rarely (there are exceptions, such as in software development) drives us to want to use it. If storage capacity outpaces our need for more storage, I don't believe that software developers will purposely bloat their programs to fill it (excepting Microsoft, perhaps).
Admin
Lest there be any confusion, the first sentence above should be:
"The desire for more memory has not increased with capacity"
Admin
I don't think Microsoft intentionally bloats their apps; it's just that the Office team throws in everything and the kitchen sink for marketing, differentiation, or just "well, someone asked" reasons. :p The other teams are better about cutting features to make do on the hardware.
Does anyone have a count of how many times every AOL CD and floppy ever made, stacked up, would go to the moon and back? =D
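A back-of-envelope answer in Python (assuming the oft-quoted figure of roughly a billion AOL discs shipped, the standard 1.2 mm disc thickness, and an average Earth-Moon distance of 384,400 km; all three are rough assumptions):

    DISCS = 1e9              # oft-quoted estimate of AOL discs shipped
    THICKNESS_M = 1.2e-3     # standard CD thickness in meters
    MOON_M = 384400e3        # average Earth-Moon distance in meters
    stack = DISCS * THICKNESS_M
    print('Stack height: %.0f km' % (stack / 1000))       # ~1200 km
    print('Round trips: %.4f' % (stack / (2 * MOON_M)))   # ~0.0016

So, surprisingly, the stack would cover well under one percent of even a single one-way trip.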
Admin
For comparison, what do you think is the size in bytes of all existing and past recorded digital data (that means ANYTHING that has EVER been saved in binary on an electronic storage medium)? That includes all possible computers, web servers, drives, floppies, CDs, DVDs, all sent e-mail (including all spam), all private network files, all TV signals that have ever been digitally broadcast, EVERYTHING.
Admin
It's choice number 10: you missed a binary joke about a binary joke.
Anonymous wrote the bases in binary and the exponents in decimal, hence writing 2^3 as [0b]10^3, which is why it's 8 and not 1000.
Admin
Shouldn't he then have written it as 10^100?
Admin
Of course, I mean 10^11
Admin
Nice Seinfeld reference... :)
Admin
You need to look no further than this forum software to see the typical quality of free open source software. I get quite a lot of submissions from FOSS; heck, I'm sure the source code to this software would provide a wealth of goodies.
Nonetheless, I generally don't post submissions from FOSS. I have once or twice, but it's a monumental effort to obfuscate the code so it's un-googleable. I don't like code that can be publicly traced to the original author.
Admin
In case someone is interested in the code, here's my little function for that matter:

    def fquant(val, prec=2):
        """format data quantities in human readable form"""
        # symbols for powers of 1024, from bytes up to yottabytes
        v = ('', 'k', 'M', 'G', 'T', 'P', 'E', 'Z', 'Y')
        d, i = 1024.0, 0
        # keep dividing by 1024 until the value fits or we run out of symbols
        while val > d and i < len(v) - 1:
            val /= d
            i += 1
        return '%.*f %sB' % (prec, val, v[i])

Stopping at 'T', like his code does, is just lame, isn't it?
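A quick demonstration (my own test calls, not part of the original post); note that unlike the article's code, huge inputs stay distinguishable:

    print(fquant(123456789))          # '117.74 MB'
    print(fquant(2**80))              # '1024.00 ZB' -- the strict '>' keeps
                                      # exact powers of 1024 one unit low
    print(fquant(1024 * 2**80))       # '1024.00 YB'
    print(fquant(1024**2 * 2**80))    # '1048576.00 YB', not a repeat of 1 YB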
Admin
These lines need to be indented...
Admin
That's why we make the stack *at night*.
Geez.
Admin
I get the Windows Update "Updates are available" notice on my laptop when it hasn't had the wireless card inserted for a week - there's a timer in there!
"Whoops, it's been a week, the *must* be updates by now!"
Admin
Dual-layer DVDs "only" hold 8.5 GB of data. You need double-sided, single-layer discs to get 2 × 4.7 GB.
Admin
But is it more than a googol?
Admin
This thread is 2 years old; I'm posting for the random person who, like me, decided to read to the end of this just for the hell of it. My 2 cents, 2 years late.
I agree: memory expands to fit demand, not vice versa. The issue is that demand keeps rising as we realize what can almost be done with the current hardware. 2D games were great when they first came out; they were a novelty. Then the novelty wore off, and we wanted more. Better graphics came next, more detail. Then came 3D. Then came realistic physics. Then came near photorealism. Then came destructible environments (some of these are a little out of order, since some of the ideas were developed in parallel with the others). We keep pushing for more "realism," that is, the desire for our dreams and fantasies to more accurately reflect what we are able to experience: the real world. As photos and pictures of the real world approach resolutions that surpass our ability to perceive them (even at magnified zoom levels), the desire to have storage space for them will fall. The same is true for holographic renderings. Ditto for 3D polygon recreations in a fully simulated environment that is 100% destructible and rebuildable. So as we reach the capacity to handle that kind of info, the need for storage for that kind of info will taper off. We don't need atom-by-atom coordinates for everything; we just need a resolution that makes us believe it's real at the scale we're looking at/feeling it. Ergo, storage demand is not infinite. It's probably just really big.
But what about fantasy worlds created with that kind of resolution? What if I want to recreate the Star Wars universe for a game (yes, the whole damned universe, with the plaster on the walls modeled in polygons (not textured) so that when I use the inch-high cheat the world seems every bit as believable at that scale as it did at the larger one; and it's all free-roaming with no load times; and it has to be volumetrically modeled, so that when I reach through the wall to strangle an evil guard I can feel the coarseness of the brick all the way through; and it's 100% destructible, mineable/exploitable/reclaimable/rebuildable, with a physics engine that can be modified to fit my whim)? Now you need that kind of data set for two universes (the real one and this one). And what about those other game developers who want to develop their own games? How many universes will we need to have storage space for? I hereby, and for the sole benefit of that poor lost soul who actually read to the end of this, propose Lucas's Law: human imagination is infinite, therefore storage needs are (dun dun dunnnnnn) infinite. Period.
Admin
And now we jump ahead to mid-2010.
The FIFA World Cup is being hosted here in South Africa (gasp...!!), and we all have lotsa yottas on the cellphones in our pockets...
No, wait... it's only 32 GB on a good day... the yottas were all burnt up during the trip to the sun...
Admin
I know this is almost a decade old... but it's kind of relevant today...
Consider: with an excellent sale, you can get 1 TB for under $50 [USD]. My first hard drive [that I used, not owned] was $20K [USD] for a pair of 2.5 MB removables... and that was back in 1971, when $20K could easily buy a (pretty darn nice) house.
Petabyte systems are not uncommon, and data centers regularly hit multi-exabyte levels of data.
Is it actually possible that the yottabyte will become relevant in the real world before I pass on?
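For the curious, the price drop in that comparison works out like this (a rough calculation in Python, ignoring inflation):

    # 1971: $20,000 for a pair of 2.5 MB removable drives (5 MB total)
    # Today: $50 for 1 TB (~1,000,000 MB) on a good sale
    old_per_mb = 20000.0 / 5    # $4,000 per MB
    new_per_mb = 50.0 / 1e6     # $0.00005 per MB
    print('1971: $%.2f/MB' % old_per_mb)
    print('Now:  $%.5f/MB' % new_per_mb)
    print('Improvement: %.0fx' % (old_per_mb / new_per_mb))  # ~80,000,000x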