• (cs)

    How long ago was it again that someone claimed that 640 KB would have been more than enough?

    And how long ago was it that people considered a gigabyte of hard disk space enormous?

    Let's see... Ten years ago, a megabyte was big. 5 years ago, a gigabyte was big. Nowadays, a terabyte is big. So, considering this pattern, a petabyte will be big in 5 years. An exabyte in ten years. A zettabyte in 15 years. A yottabyte in 20 years. But in 25 years, no one will be surprised by hard disks of 1000 or more yottabytes... [:D]

    So when in 25 years all other applications are crashing because they didn't keep these huge sizes in mind, this application will just be running perfectly well... [;)]

  • Chris Brien (unregistered) in reply to Magic Duck
    Magic Duck:

    Anonymous:
    Anonymous:
    I believe that a bit does have 10 possible values, namely 0 and 1.

    Fun,

    Sten
    Of course, by that logic, one kilobyte should be 10^3 bytes - i.e. 8 bytes.

    kilobyte: 10^3 = 8 bytes
    megabyte: 10^6 = 64 bytes
    .
    .
    .
    vendekabyte: 10^33 = 8,589,934,592 bytes. As you can see, current (32-bit) computer systems can address 0.5 vendekabytes. In only a few Moore-times, when we have 128-bit systems, we will be able to address substantially more than a googolbyte.

    Either one of 10 cases just happened:

    [*] Someone missed a binary joke or
    [*] I missed a joke about a binary joke

    Actually, the 100th thing happened - someone made a binary joke, someone else made another binary joke in reference to the original joke, and then someone else missed that joke and tried to make another binary joke about missing a binary joke.

    And finally, the second person made a binary joke about the whole thing. :)

  • Suomynona (unregistered) in reply to EvanED
    Anonymous:
    It's something very small. One molecule of hydrogen per bit would yield about 16 grams.


    A mole is defined as the number of atoms in 12 grams of carbon-12. So a mole of hydrogen molecules has a mass of ca. two grams. At 1 bit per molecule, 1000 yottabytes corresponds to ca. 16,060 moles. At 2 grams per mole, that's slightly over 32kg of hydrogen.
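
    For anyone who wants to redo that back-of-the-envelope math, a quick Python sketch (this assumes the binary yottabyte, 2**80 bytes, which is what makes the 16,060 figure come out):

        AVOGADRO = 6.022e23          # molecules per mole
        bits = 1000 * 2**80 * 8      # 1000 binary yottabytes, in bits
        moles = bits / AVOGADRO      # one H2 molecule per bit: ~16,060 moles
        grams = moles * 2.0          # H2 is roughly 2 g/mol
        print(round(moles), grams / 1000.0)   # -> 16060, ~32.1 kg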

  • (cs) in reply to Katja

    Katja:
    How long ago was it again that someone claimed that 640 KB would have been more than enough?

    And how long ago was it that people considered a gigabyte of hard disk space enormous?

    Let's see... Ten years ago, a megabyte was big. 5 years ago, a gigabyte was big. Nowadays, a terabyte is big. So, considering this pattern, a petabyte will be big in 5 years. An exabyte in ten years. A zettabyte in 15 years. A yottabyte in 20 years. But in 25 years, no one will be surprised by hard disks of 1000 or more yottabytes... [:D]

    So when in 25 years all other applications are crashing because they didn't keep these huge sizes in mind, this application will just be running perfectly well... [;)]

    It's not necessarily the case that the previous trend will continue.  It reminds me of a Simpsons episode where Disco Stu has a chart of disco record sales up to 1979.  Many of the biggest errors are made by preparing for the past.  That is, looking at what happened before and assuming it will happen again.

    The reality is that 5 megabytes was never that much data.  There were huge banks of tape drives long ago.  In the past we were memory poor.  We are reaching a point where graphic resolution is as good as the human eye can appreciate.  I bought a 100 GB hard drive a while back.  I don't expect to fill it up any time soon.  When I had a 500 MB hard drive, I had to remove applications in order to install a new one.

  • (cs) in reply to eagle
    eagle:
    JThelen:


    And if you'd read the whole post that I was referring to, I wasn't addressing that issue.



    I did, and here is what you wrote:



    I think you, as well as a few others, missed the reasoning behind using the 'thresh' variable as opposed to hardcoding 1024.  YUM isn't the only place you'll find that; I'm sure that if you looked at apt-get, you'd find it there too, as well as in the Windows updater.  The reasoning is to force it into the next higher measure earlier, so that you'll get .90 GB as opposed to 1000 MB, to use the earlier example.  You have noticed that, right?  Well, doing it your way, as was mentioned in an earlier reply (which you and about 3 others appear to be overlooking), will keep that from happening, and won't roll the measure over until it's exactly 1.  So, in short, it's not a bug, and it's working as intended.



    So you talk about rolling over at .9x, and that's the reason for using thresh=999. And yes, you are right.

    Yes, it is OK to compare with thresh in the loop's condition. But in


    This snippet looks buggy to me. It is true that it is hard to hit the bugs during our lifetime, even taking Moore's law into account. I do not know the language it was written in, but:
    WTF: depth=depth-diff -> why obfuscate the code in such a manner instead of simply saying: depth=len(symbols)-1,
    where, of course, len(symbols)-1 could be a constant.
    Bug: number=number*thresh*depth is wrong. Correct would be: number=number*1024**diff,
    assuming that ** in this language means integer exponentiation, is implemented fast and accurately (unlike FP exponentiation), and has a higher precedence than multiplication.
    It is amazing, though, that the language this was written in supports integers this big. I think that even 64-bit systems stop at 16 exa.
     


    there is no reference to the loop condition, but to the expression used to convert from wtfbytes back to yottabytes.

    And in this expression it is

    a) wrong to use thresh=999 instead of 1024

    and

    b) wrong to use depth instead of diff.

    That's the bug pointed out by the other poster, but in your mission to educate everybody about why thresh=999 is correct (which nobody doubted) you missed that point.



    Now, with regard to the 'apparent WTF' that this individual, among others, is presenting: it's not one.

    With regard to what you're talking about, yes, it's likely there's a bug in what's there.  The real question, IMO, is how much does it matter?  The developer obviously put this in as an easter egg of sorts, and one accessible only to someone looking at the source.

    Cheers.



    I agree.

    However, the original code always produces the same output for

    1024 yottabytes
    1048576 yottabytes
    1073741824 yottabytes
    ...
    1024**n yottabytes

    And that's a WTF, even if it actually is an easter egg.
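
    To make that concrete, here's a minimal Python sketch of the branch under discussion - reconstructed from this thread, not copied from the yum source - showing the collision:

        symbols = ['', 'k', 'M', 'G', 'T', 'P', 'E', 'Z', 'Y']
        step, thresh = 1024.0, 999

        def humanize(number):
            depth = 0
            while number > thresh:
                depth = depth + 1
                number = number / step
            if depth > len(symbols) - 1:
                diff = depth - (len(symbols) - 1)
                depth = depth - diff                # i.e. depth = len(symbols) - 1
                number = number * thresh * depth    # bug: diff is dropped entirely
                # fix: number = number * step ** diff
            return '%.2f %sB' % (number, symbols[depth])

        for n in (1, 2, 3):
            print(humanize(1024 ** (8 + n)))        # prints "7992.00 YB" all three times

    With the fix, those calls print 1024.00 YB, 1048576.00 YB and 1073741824.00 YB instead, matching the inputs above.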

    cu


    Yup, not denying it, never have.  However, you were the first person to actually address that issue; everyone else was talking about 999 instead of 1024, and why len(symbols)-1 isn't a constant.  Neither of those things is a bug.
  • (cs) in reply to Suomynona

    Anonymous:
    Anonymous:
    It's something very small. One molecule of hydrogen per bit would yield about 16 grams.


    A mole is defined as the number of atoms in 12 grams of carbon-12. So a mole of hydrogen molecules has a mass of ca. two grams. At 1 bit per molecule, 1000 yottabytes corresponds to ca. 16,060 moles. At 2 grams per mole, that's slightly over 32kg of hydrogen.

    I'm not sure why everyone is assuming that we need at least one atom per bit.  A qubit (quantum bit) can hold 2 values at the same time.  A qubyte can hold 256 values at once.  It doesn't take that many qubits to hold 1000 YB worth of data.

  • w-ber (unregistered)

    Programs expand to fill the available memory. Period.

  • (cs) in reply to w-ber

    Anonymous:
    Programs expand to fill the available memory. Period.

    Wow, that was so convincing. I'll have to change my view.  'Period' really was a great way to back up your assertion.

    Do you think that Pac-Man on the Atari home system had the low-grade graphics it did because they had no better ideas?  Or do you think that they limited the graphics because of the constraints of the system?  The desire for more memory has increased with capacity.   The desire was already there, the capacity wasn't.  This is why we still have a lot of analog systems that outperform digital systems.  Digital systems have a lot of advantages over analog, but they've been unable to match what we want.  This is starting to change.  Digital cameras are now able to meet the needs of what we want, so 35mm cameras are starting to disappear.  It's not that the capacity for high resolution came about and then we decided we wanted higher-resolution cameras.  The desire was already there and the technology allowed digital cameras to fulfill it.

    The point I'm making is not that there's some knowable limit to the need for memory.  It's that form follows function.  People want more memory and that drives the technology.  It's not the other way around.  The extra capacity rarely (there are exceptions such as in software development) drives us to want to use it.  If storage capacity outpaces our need for more storage, I don't believe that software developers will purposely bloat their programs to fill it (excepting Microsoft, perhaps.)

  • (cs) in reply to dubwai

    dubwai:
    The desire for more memory has increased with capacity.   The desire was already there, the capacity wasn't.

    Lest there be any confusion, the first sentence above should be:

    "The desire for more memory has not increased with capacity"

  • (cs) in reply to dubwai
    dubwai:

    Anonymous:
    Programs expand to fill the available memory. Period.

    The point I'm making is not that there's some knowable limit to the need for memory.  It's that form follows function.  People want more memory and that drives the technology.  It's not the other way around.  The extra capacity rarely (there are exceptions such as in software development) drives us to want to use it.  If storage capacity outpaces our need for more storage, I don't believe that software developers will purposely bloat their programs to fill it (excepting Microsoft, perhaps.)


    I don't think Microsoft intentionally bloats their apps; it's just that the Office team throws in everything and the kitchen sink for marketing, differentiation, or just "well, someone asked" reasons. :p The other teams are better about cutting features to make do on hardware.

    Does anyone have a count of how many times every AOL CD and floppy ever made, stacked up, would go to the moon and back? =D
  • (cs) in reply to Anonymous

    For comparison, what do you think is the size in bytes of all existing and past recorded digital data (that means ANYTHING that has EVER been saved in binary on an electronic storage medium)? That includes all possible computers, webservers, drives, floppies, CDs, DVDs, all sent e-mail (including all spam), all TV signals that have ever been digitally broadcast, all private network files, EVERYTHING.

  • Masklinn (unregistered) in reply to Magic Duck
    Magic Duck:

    Anonymous:
    Anonymous:
    I believe that a bit does have 10 possible values, namely 0 and 1.

    Fun,

    Sten
    Of course, by that logic, one kilobyte should be 10^3 bytes - i.e. 8 bytes.

    kilobyte: 10^3 = 8 bytes
    megabyte: 10^6 = 64 bytes
    .
    .
    .
    vendekabyte: 10^33 = 8,589,934,592 bytes. As you can see, current (32-bit) computer systems can address 0.5 vendekabytes. In only a few Moore-times, when we have 128-bit systems, we will be able to address substantially more than a googolbyte.

    Either one of 10 cases just happened:

    [*] Someone missed a binary joke or
    [*] I missed a joke about a binary joke

    It's choice number 10: you missed a binary joke about a binary joke.

    Anonymous wrote the exponents in decimal and the bases in binary, hence writing 2^3 as [0b]10^3, which is why it's 8 and not 1000.
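
    Or, spelled out in Python for anyone still keeping score at home:

        >>> int('10', 2) ** 3    # binary 10 is decimal 2, so "10^3" is 8
        8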

  • (cs) in reply to Masklinn
    Anonymous:
    Magic Duck:

    Anonymous:
    Anonymous:
    I believe that a bit does have 10 possible values, namely 0 and 1.

    Fun,

    Sten
    Of course, by that logic, one kilobyte should be 10^3 bytes - i.e. 8 bytes.

    kilobyte: 10^3 = 8 bytes
    megabyte: 10^6 = 64 bytes
    .
    .
    .
    vendekabyte: 10^33 = 8,589,934,592 bytes. As you can see, current (32-bit) computer systems can address 0.5 vendekabytes. In only a few Moore-times, when we have 128-bit systems, we will be able to address substantially more than a googolbyte.

    Either one of 10 cases just happened:

    [*] Someone missed a binary joke or
    [*] I missed a joke about a binary joke

    It's choice number 10: you missed a binary joke about a binary joke.

    Anonymous wrote the exponents in decimal and the bases in binary, hence writing 2^3 as [0b]10^3, which is why it's 8 and not 1000.

    Shouldn't he then have written it as 10^100?

  • (cs) in reply to Elvarg
    Elvarg:
    Anonymous:
    Magic Duck:

    Anonymous:
    Anonymous:
    I believe that a bit does have 10 possible values, namely 0 and 1.

    Fun,

    Sten
    Of course, by that logic, one kilobyte should be 10^3 bytes - i.e. 8 bytes.

    kilobyte: 10^3 = 8 bytes
    megabyte: 10^6 = 64 bytes
    .
    .
    .
    vendekabyte: 10^33 = 8,589,934,592 bytes. As you can see, current (32-bit) computer systems can address 0.5 vendekabytes. In only a few Moore-times, when we have 128-bit systems, we will be able to address substantially more than a googolbyte.

    Either one of 10 cases just happened:

    [*] Someone missed a binary joke or
    [*] I missed a joke about a binary joke

    It's choice number 10: you missed a binary joke about a binary joke.

    Anonymous wrote the exponents in decimal and the bases in binary, hence writing 2^3 as [0b]10^3, which is why it's 8 and not 1000.

    Shouldn't he then have written it as 10^100?

    Of course, I mean 10^11

  • tim (unregistered) in reply to richleick
    richleick:
    I saw this WTF, had a great stack of CDs and yotta yotta yotta, I landed on the moon.

    Nice Seinfeld reference... :)
  • (cs) in reply to Boaz

    Anonymous:
    I just started reading TheDailyWTF, and have been pretty impressed by the quality (or lack thereof) of the submissions. When I saw the first line of this, though, I went "WTF? yum has a WTF?" The rest was reassuring -- it is nice to know that the open-source software that I run does not come close to the scariness of the databases, web applications, and proprietary in-house apps out there. Maybe it's the OPEN SOURCE part of Open Source?

    You need look no further than this forum software to see the typical quality of free open source software. I get quite a lot of submissions from FOSS; heck, I'm sure the source code to this software would provide a wealth of goodies.

    Nonetheless, I generally don't post submissions from FOSS. I have once or twice, but it's a monumental effort to obfuscate it to be un-googleable. I don't like code that can be publicly traced to the original author.

  • frispete (unregistered)

    In case someone is interested in the code, here's my little function for that matter:

    def fquant(val, prec=2):
        """ format data quantities in human readable form """
        v = ('', 'k', 'M', 'G', 'T', 'P', 'E', 'Z', 'Y')
        d, i = 1024.0, 0
        while val > d and i < len(v) - 1:
            val /= d
            i += 1
        return '%.*f %sB' % (prec, val, v[i])
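
    For the record, a quick check (the function behaves the same under Python 2 and 3, since d is a float):

        >>> fquant(42)
        '42.00 B'
        >>> fquant(1000 * 1024**8)   # the easter egg's 1000 binary yottabytes
        '1000.00 YB'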

    Stopping his at 'T' is just lame, isn't it?

  • frispete (unregistered) in reply to frispete
    Anonymous:
    val /= d
    i += 1

    These lines need to be indented...

  • (cs) in reply to DZ-Jay
    DZ-Jay:
    Alex Papadimoulis:

    A stack of 3.5" floppy discs with 1,000 yottabytes would be tall enough to make it to the sun. 14 Million times. But still ... just in case ...



    Dude, wouldn't the floppies melt when they get close to the Sun?

        dZ.

    That's why we make the stack *at night*.

    Geez.
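
    For anyone who wants to check Alex's figure before stacking begins, a rough cut in Python (assuming 1.44 MB capacity and roughly 3.3 mm of thickness per disk - both assumptions, not gospel):

        bytes_total = 1000 * 10**24           # 1000 decimal yottabytes
        disks = bytes_total / 1.44e6          # ~6.9e20 floppies
        stack_m = disks * 3.3e-3              # ~2.3e18 metres of stack
        print(stack_m / 1.496e11)             # Earth-Sun trips: ~1.5e7

    Call it 15 million trips; the same ballpark as the quoted 14 million, depending on how thick you think a floppy is.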

  • (cs) in reply to TheMuuj

    TheMuuj:
    Furthermore, I wish I could compare the YUM source code to the Windows Update code.

    What about the "feature" on Windows Update that causes it to bug me to install the same Outlook Express patch every 30 minutes, even though I've installed it 10 times already, and I don't even use Outlook Express for email anymore!!!

    Or how about the code that bugs you until you reboot, interrupting your workflow until you're forced to reboot and take an early break?

    Windows Update is a big WTF.  My minimal experience with YUM has been quite pleasant, and if I saw an update that was 1000 Yottabytes, I would think twice about downloading it, and then laugh because there's probably an error on the server side.  At least the program won't crash.

    I get the Windows Update "Updates are available" notice on my laptop when it hasn't had the wireless card inserted for a week - there's a timer in there!

    "Whoops, it's been a week, the *must* be updates by now!"

     

  • Trond M. (unregistered) in reply to Mic
    Anonymous:

    let's assume a double layer DVD, capable of holding 2* 4.7 GB (2*4.38GiB).


    Dual-layer DVDs "only" hold 8.5 GB of data. You need double-sided, single-layer discs to get 2*4.7 GB.


  • (cs)

    But is it more than a googol?

  • LucusLoC (unregistered) in reply to dubwai
    dubwai:
    Anonymous:
    Programs expand to fill the available memory. Period.
    Wow, that was so convincing. I'll have to change my view.  'Period' really was a great way to back up your assertion.

    Do you think that Pac-Man on the Atari home system had the low-grade graphics it did because they had no better ideas?  Or do you think that they limited the graphics because of the constraints of the system?  The desire for more memory has increased with capacity.   The desire was already there, the capacity wasn't.  This is why we still have a lot of analog systems that outperform digital systems.  Digital systems have a lot of advantages over analog, but they've been unable to match what we want.  This is starting to change.  Digital cameras are now able to meet the needs of what we want, so 35mm cameras are starting to disappear.  It's not that the capacity for high resolution came about and then we decided we wanted higher-resolution cameras.  The desire was already there and the technology allowed digital cameras to fulfill it.

    The point I'm making is not that there's some knowable limit to the need for memory.  It's that form follows function.  People want more memory and that drives the technology.  It's not the other way around.  The extra capacity rarely (there are exceptions such as in software development) drives us to want to use it.  If storage capacity outpaces our need for more storage, I don't believe that software developers will purposely bloat their programs to fill it (excepting Microsoft, perhaps.)

    This thread is 2 years old; I'm posting for the random person who, like me, decided to read to the end of this, just for the hell of it. My 2 cents, 2 years late.

    I agree: memory expands to fit demand, not vice versa. The issue is that demand keeps rising as we realize what can almost be done with the current hardware. 2D games were great when they first came out; they were a novelty. Then the novelty wore off, and we wanted more. Better graphics came next, with more detail. Then came 3D. Then came realistic physics. Then came near photo-realism. Then came destructible environments (some of these are a little out of order, since some of the ideas were developed in parallel with the others). We keep pushing for more "realism," that is, the desire for our dreams and fantasies to more accurately reflect what we are able to experience: the real world. As photos and pictures of the real world approach resolutions that surpass our ability to perceive them (even at magnified zoom levels), the desire to have storage space for them will fall. The same is true for holographic renderings. Ditto for 3D polygon recreations in a fully simulated environment that is 100% destructible. And rebuildable. So as we reach the capacity to handle that kind of info, the need for storage for that kind of info will taper off. We don't need atom-by-atom coordinates for everything; we just need a resolution that makes us believe it's real at the scale we're looking at or feeling it. Ergo, storage demand is not infinite. It's probably just really big.

    But what about fantasy worlds created at that kind of resolution? What if I want to recreate the Star Wars universe for a game (yes, the whole damned universe, with the plaster on the wall modeled in polygons (not textured), so that when I use the inch-high cheat the world seems every bit as believable on that scale as it did on the larger one; and it's all free-roaming with no load times; and it has to be volumetrically modeled, so that when I reach through the wall to strangle an evil guard I can feel the coarseness of the brick all the way through; and it is 100% destructible, mineable/exploitable/reclaimable/rebuildable, with a physics engine that can be modified to fit my whim)? Now you need that kind of data set for two universes (the real one and this one). And what about those other game developers who want to develop their own games? How many universes will we need to have storage space for? I hereby, and for the sole benefit of that poor lost soul who actually read to the end of this, propose Lucus' Law: human imagination is infinite, therefore storage needs are (dun dun dunnnnnn) infinite. Period.

  • DonaldK (unregistered)

    And now we jump ahead to mid-2010.

    The FIFA World Cup is being hosted here in South Africa (gasp...!!), and we all have lotsa yottas on the cellphones in our pockets ...

    No, wait... it's only 32 GB on a good day ... the yottas were all burnt up during the trip to the sun ...

  • (cs)

    I know this is almost a decade old...but kind of relevant today....

    Consider: with an excellent sale, you can get 1 TB for under $50 [USD]... My first hard drive [that I used, not owned] was $20K [USD] for a pair of 2.5 MB removables... and that was back in 1971, when $20K could easily buy a (pretty darn nice) house.

    Petabyte systems are not uncommon, and datacenters regularly hit multi-exabyte levels of data....

    Is it actually possible that the yottabyte will become relevant in the real world before I pass on?????
