• Joe (unregistered) in reply to snoofle
    snoofle:
    Philippe:
    It wasn't a recursion problem. As others have noted, it was more of a scaling problem - the amount of disk space required to store the log data was quickly exhausted.

    This is a WTF along the lines of "no one will ever need more than 640K of RAM" and "14MB of disk space is such a huge amount of disk that no conceivable process could ever fill it up."

    You youngsters and your terabyte filesystems and gigahertz processors...

    I have twin 114GB drives on a 3-year-old system (not too shabby for its day). Even with a few thousand photos of the kids, hundreds of songs and a handful of movies, I've only used about 30GB, and most of that is Windows updates.

    Just curious: what do people put on these huge disks to fill them up (I mean besides porn)? And how much of it is actually useful?

    I can answer that. I've had my 250GB hard drive for one month and it's got... 20GB of space left. What's taking up the space? Who knows!? Why's it there? Because South Korean internet is fast and I have too much free time. As to the question of how much is useful... hmmm... about 2GB.

  • (cs) in reply to Joe

    It's amazing how much space various virtual machines can take up for cross-platform testing of software before releasing it.

    Then take into account that the software being tested is video processing software - for broadcast and digital cinema files, not home videos. Multiple operating systems, each with swap-files, sources, applications and huge data files.

  • Watson (unregistered)

    "I poured over my code,"

    Ewwwwww.....

  • Jim Steichen (StychoKiller) (unregistered) in reply to Carnildo

    Does paper tape count as writable (certainly NOT readable!) permanent storage? Let's not forget the venerable Hollerith card!

  • Tom (unregistered) in reply to PSWorx
    PSWorx:
    Reminds me of a depth search based "maze solver" I wrote as a hobby project when I was still in school... or at least attempted to write...

    The program, like every good maze solver accepted a two dimensional array of "cells" as an input (where each cell was either accessible or an obstacle) and was to return the shortest possible route from the start to the destination cell.

    The program with its really cool recursive depth search algorithm I had proudly coded in VB at that time worked fine as long as it just had to find a path to the target. However, when I tried to find the shortest possible one, it kept mysteriously freezing on some mazes, especially on those with few obstacles...

    I tried and tried but no matter what, I could not find the error... until I realized that the program actually ran as expected, just that iterating through every single possible path on an almost blank 100x100-cell grid might just maybe not be the best idea...

    I recently encountered a similar "problem"... I decided to write a script to run a basic breadth first search on Wikipedia to find a path between any two Wikipedia articles (why not!?). Well, it worked fine for the first few test cases I made up, and it managed to find paths of about 3-4 links, which required downloading a few hundred articles.

    So I decided to try two truly random articles using WP's random article feature. It was taking a while (at a depth of 5, having downloaded a few thousand articles) and it was getting late, so I decided to let it do its thing while I went to sleep, figuring it would be done in the morning.

    I checked on it the next morning, and it was still searching at a depth of 6 and had checked over 100,000 articles. It turns out the average number of links for Wikipedia articles is around 100. You don't always have to search every link, but that's still pretty big.

    I'm working on a new version that caches the links in a database... that should cut down on eating up WP's bandwidth (average article is about 40KB. 100,000 * 40KB = 4GB ... sorry Wikipedia).
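
    Roughly, the new version looks like the sketch below (Python; get_links() is a stand-in for however the article links actually get fetched and parsed, and the cache file name is made up):

        from collections import deque
        import shelve

        def find_path(start, goal, get_links, cache_file="links.db"):
            """Breadth-first search from start to goal, caching each article's links on disk."""
            with shelve.open(cache_file) as cache:
                parent = {start: None}
                queue = deque([start])
                while queue:
                    article = queue.popleft()
                    if article == goal:
                        path = []                      # walk the parent chain back to the start
                        while article is not None:
                            path.append(article)
                            article = parent[article]
                        return path[::-1]
                    links = cache.get(article)
                    if links is None:
                        links = get_links(article)     # one network hit per uncached article
                        cache[article] = links
                    for link in links:
                        if link not in parent:
                            parent[link] = article
                            queue.append(link)
            return None                                # no path found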

  • A jaded user (unregistered) in reply to joe.edwards

    This story sucked. What business does this have on WTF?

    All the quality left when the site changed its name.

    I've been a daily visitor for almost a year; now I'm down to once a week. I think it's time to drop to once a month. The content has no quality.

  • Anthony DeRobertis (unregistered) in reply to Tom
    Tom:
    I'm working on a new version that caches the links in a database... that should cut down on eating up WP's bandwidth (average article is about 40KB. 100,000 * 40KB = 4GB ... sorry Wikipedia).

    You know, for a mere 570MB, you could have had a complete, local copy of every page-to-page link in the English Wikipedia. Your 4GB could have gotten you a complete local mirror of the current version of every article, too.

    See http://meta.wikimedia.org/wiki/Data_dumps for links to data dumps.

  • thequux (unregistered)

    Ahhh... historic computing.

    It's a good question why he bothered, given that SIMH can boot CP/M just fine <sarcasm>

    Anyways... I tried this using Bochs on my Blade 150 (15GB of disk space free), trying to boot my hobby OS. It loaded GRUB after a couple of minutes, then loaded my OS, and it actually managed to switch the video mode. Then, as it was printing "Welcome to Xenon", I ran out of disk space and ended up with "Welcome to Xen". Took a while to do, though.

  • w00t (unregistered) in reply to Francis
    Francis:
    So, what was wrong? Seriously, a recursive log function is funny, but not that interesting. Come on, add a little bit more to this wtf, make it interesting!

    Do people submit these?

    I'm quite sure the programmer said something along the lines of "d'oh! that's a WTF!" when the realization dawned on him that he could have guesstimated beforehand that logging every cycle of a CPU running 4 million cycles per second would quickly eat up the "almost infinite" 14 megabytes on the hard drive within a few seconds of simulated time.

    OK, maybe the writing could've been better, but remember, this is the guy himself admitting to it, in public, on The Daily WTF. Takes balls.

  • Evo (unregistered) in reply to A jaded user
    A jaded user:
    This story sucked. What business does this have on WTF?

    All the quality left when the site changed its name.

    I've been a daily visitor for almost a year; now I'm down to once a week. I think it's time to drop to once a month. The content has no quality.

    That's bullshit. No offence, but the stories are pretty much the same, and they added CodeSOD and Error'd, so there's actually more stuff. Obviously, this story sucked. Halfway through reading it I wondered when the actual story would start. It didn't.

    But actually it's pretty smart:

    • The real WTF is that this story is on this site
  • Raw (unregistered)
    So, what was wrong?

    Do the math. 4 MHz. If I remember correctly, the 8086 ran most instructions in 4 or 8 cycles. That means somewhere between 0.5 and 1 million instructions per second; let's say 0.75 million. Each instruction is logged with something like 50 bytes. That's 0.75 million * 50 = 37.5 MB per second.

    The curious reader can amuse themselves by doing the same math for a modern system.

    Granted, most of the instructions will be NOPs, but still...
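
    As a quick sanity check of that math (a Python sketch; the 0.75 million instructions/second and the 50 bytes per line are the assumptions above, not measurements):

        instr_per_sec = 750_000         # the estimate above (4 MHz, 4-8 cycles per instruction)
        bytes_per_line = 50             # assumed size of one hex-dump log line
        disk_bytes = 14 * 1024 * 1024   # the "almost infinite" 14 MB drive

        log_rate = instr_per_sec * bytes_per_line               # 37,500,000 bytes/s
        print(f"log rate: {log_rate / 1e6:.1f} MB/s")           # 37.5 MB/s
        print(f"disk full in: {disk_bytes / log_rate:.2f} s")   # ~0.4 s of logged execution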

  • deadtime (unregistered) in reply to Carnildo
    Carnildo:
    Alex:
    Boy, talk about bringing back some memories of my old Kaypro II... ;)

    Terabyte filesystems?!? How about - "You youngsters with any hard drives at all!!" If 2 5-1/4 floppies were good enough for me, they're good enough for anyone. One floppy for WordStar, and one for your data - what more could you want?

    Writable permanent storage? What luxury! Back when I started programming, the only "permanent storage" I had was a spiral-bound notebook sitting next to the computer.

    A spiral-bound notebook!? When I started programming I had to inscribe every instruction on a stone plate, get electrocuted if there was an error in the program somewhere, and run everything on a broken abacus!

  • Baston (unregistered) in reply to snoofle
    snoofle:
    Just curious: what do people put on these huge disks to fill them up (I mean besides porn)? And how much of it is actually useful?

    I didn't know there was anything besides porn!!??

    And, yes, it's useful 8-)

  • coz (unregistered)

    So...how OLD is this guy anyway? It's a miracle he has the power to use the computer to submit emails...forget the fact that he is actually making sense :)

  • Hans (unregistered) in reply to clintp
    clintp:
    As someone who's written a few emulators, lemme tell you it's no fun. Congratulations on your tenacity.

    However, your sense of purpose was a bit misguided. Glad you realize that now. :)

    I worked on an emulator for 6 or 7 years (not just the CPU, but an entire system) and let me tell you, it was the best hobby I ever had... The reason I stopped was that the machine I was working on (an Amiga) had become hopelessly obsolete. And since this thing was written in carefully crafted 680x0 assembly, I have never yet felt the urge to port it to the PC or any other platform.

  • stickman (unregistered) in reply to Carnildo
    Carnildo:
    Alex:
    Boy, talk about bringing back some memories of my old Kaypro II... ;)

    Terabyte filesystems?!? How about - "You youngsters with any hard drives at all!!" If 2 5-1/4 floppies were good enough for me, they're good enough for anyone. One floppy for WordStar, and one for your data - what more could you want?

    Writable permanent storage? What luxury! Back when I started programming, the only "permanent storage" I had was a spiral-bound notebook sitting next to the computer.

    A notebook?

    We used sticks to write in the sand and compiled the code in our heads!

    (To think I top quite a few guys and gals here with Simons' BASIC and a Datasette is kind of sad o_O)

    Oh, and I fill my disk mostly with images of games I bought* and ripped music.

    (I tend to lose the discs somewhere in my room, so it's a bit better to use the images.)

  • Bitter Like Quinine (unregistered)

    Not exactly emulation, but I once had to write a program that would take an existing codebase - written in a proprietary language, for an obsolete platform - and convert it to clean and shiny FORTRAN code to run on our blisteringly fast new DEC kit.

    I offered to write an actual interpreter, so they could just run their old programs until such time as they could be rewritten properly but, no, automatic translation was the way they wanted to go.

    It took weeks, but at last I was ready to test it. In with the old code, out with the new. It looked the part: the old lines were present as comments, the equivalent FORTRAN was nicely formatted underneath, the old direct accesses to remote hardware were mapped to routines that interrogated hi-tech remote controller cards, and it even compiled the first time. So I ran it.

    The fatal bugcheck was a little surprising, but not quite as surprising as the fact that all the other workstations talking to the same remote controller bugchecked at the same time. I never used the translator again.

  • Ornedan (unregistered) in reply to PSWorx
    PSWorx:
    Reminds me of a depth search based "maze solver" I wrote as a hobby project when I was still in school... or at least attempted to write...

    The program, like every good maze solver accepted a two dimensional array of "cells" as an input (where each cell was either accessible or an obstacle) and was to return the shortest possible route from the start to the destination cell.

    The program with its really cool recursive depth search algorithm I had proudly coded in VB at that time worked fine as long as it just had to find a path to the target. However, when I tried to find the shortest possible one, it kept mysteriously freezing on some mazes, especially on those with few obstacles...

    I tried and tried but no matter what, I could not find the error... until I realized that the program actually ran as expected, just that iterating through every single possible path on an almost blank 100x100-cell grid might just maybe not be the best idea...

    A* works for this, as has been suggested. Though I'd prefer Dijkstra's algorithm in this case - consistent good performance, whereas with A* your remaining distance estimation heuristic is likely to be wrong for some inputs.
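
    On a uniform-cost grid, plain breadth-first search already gives a shortest path (it's just Dijkstra with all edge weights equal); a minimal sketch (Python, with illustrative names):

        from collections import deque

        def shortest_path(grid, start, goal):
            """grid[y][x] is True for open cells; start and goal are (x, y) tuples."""
            h, w = len(grid), len(grid[0])
            parent = {start: None}
            queue = deque([start])
            while queue:
                x, y = queue.popleft()
                if (x, y) == goal:
                    path, cell = [], goal              # walk back along the parent links
                    while cell is not None:
                        path.append(cell)
                        cell = parent[cell]
                    return path[::-1]
                for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
                    if 0 <= nx < w and 0 <= ny < h and grid[ny][nx] and (nx, ny) not in parent:
                        parent[(nx, ny)] = (x, y)
                        queue.append((nx, ny))
            return None                                # no route exists

    Each cell is enqueued at most once, so the work stays linear in the number of cells even on a nearly empty maze.
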
  • James (unregistered) in reply to snoofle

    I have 150GB just of compressed music. If I dumped all the films I have, I'd need over 3TB of storage!

  • Hans (unregistered) in reply to James
    James:
    I have 150GB just of compressed music. If I dumped all the films I have, I'd need over 3TB of storage!

    Do you actually ever watch any of those films? I mean, I download the odd film now and then but I always delete them after watching...

  • DrYak (unregistered) in reply to James

    The real WTF is using /emulation/ to try to run 8080 code on an 8086.

    Even though they aren't binary compatible, the 8080 instruction set maps easily onto the 8086. The x86 family is basically "assembler backward compatible" with the 8080.

    In fact, this characteristic was openly exploited by MS-DOS, whose API features a lot of similarities with CP/M, so the source code of 8080 software could be ported simply by running it through translation software that converts 8080/CP/M code to 8086 DOS code (see the Wikipedia article on the 8086).

    Instead of writing an emulator, a much more efficient solution would have been to write a "binary recompilation" emulator that automatically translates 8080 code into the corresponding 8086 code. With this you could already port trivial applications in a completely unattended manner. If your disassembler/converter keeps track of jump points, you can convert and run most applications. If the converter either keeps track of every instruction of the original 8080 code (which isn't that difficult, given that most 8-bit programs are a couple of dozen KB at worst) or is able to dynamically re-convert code from arbitrary jump points, it will even be able to cover special cases like "jump to pointer", etc.
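
    To give the flavour of such a source-level translation, here is a toy sketch (Python, not Intel's actual CONV86 tool; it only handles a handful of opcodes and uses the conventional 8080-to-8086 register mapping A->AL, B->CH, C->CL, D->DH, E->DL, H->BH, L->BL, M->[BX]):

        # Toy 8080 -> 8086 source translator: illustrative only, covers a few opcodes.
        REG = {"A": "AL", "B": "CH", "C": "CL", "D": "DH",
               "E": "DL", "H": "BH", "L": "BL", "M": "[BX]"}

        def translate(line):
            op, _, rest = line.strip().partition(" ")
            args = [a.strip() for a in rest.split(",")] if rest else []
            if op == "MOV":                    # MOV dst,src -> MOV mapped dst, mapped src
                return f"MOV {REG[args[0]]}, {REG[args[1]]}"
            if op == "MVI":                    # MVI r,imm   -> MOV mapped r, imm
                return f"MOV {REG[args[0]]}, {args[1]}"
            if op == "ADD":                    # ADD r       -> ADD AL, mapped r
                return f"ADD AL, {REG[args[0]]}"
            if op == "JMP":                    # direct jumps keep their target label
                return f"JMP {args[0]}"
            raise NotImplementedError(op)      # a real converter covers the whole opcode set

        print(translate("MVI A, 42h"))   # MOV AL, 42h
        print(translate("MOV A, M"))     # MOV AL, [BX]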

  • speaking of wtf's (unregistered)

    man this forum software blows

  • Rick (unregistered) in reply to XIU

    Got 40GB for music, 1TB of anime and other TV series, 200GB of Xvid movies, and then too many backups I'll never use :)

    1TB of movies? Just how many hours/days/months is that? And do you think you'll ever watch all of that? Besides, more is supposedly trickling in...

    Hunter/gatherer of today?

  • Lachy Junior (unregistered) in reply to snoofle
    snoofle:
    Just curious: what do people put on these huge disks to fill them up (I mean besides porn)? And how much of it is actually useful?
    You obviously don't have a DV camcorder, then. A 60-minute tape takes 12GB, so editing together a holiday eats into your disk space very quickly.
  • pistole (unregistered)

    ahh CP/M ..

    I learned BASIC (and assembly) programming on a Z80 running CP/M (mind you, an Exidy Sorcerer, for those old and grey enough to remember) back in the '80s. Brings back good memories :)

  • Dark Shikari (unregistered) in reply to snoofle
    snoofle:
    Philippe:
    It wasn't a recursion problem. As others have noted, it was more of a scaling problem - the amount of disk space required to store the log data was quickly exhausted.

    This is a WTF along the lines of "no one will ever need more than 640K of RAM" and "14MB of disk space is such a huge amount of disk that no conceivable process could ever fill it up."

    You youngsters and your terabyte filesystems and gigahertz processors...

    I have twin 114GB drives on a 3-year-old system (not too shabby for its day). Even with a few thousand photos of the kids, hundreds of songs and a handful of movies, I've only used about 30GB, and most of that is Windows updates.

    Just curious: what do people put on these huge disks to fill them up (I mean besides porn)? And how much of it is actually useful?

    Well, ISOs take up a whole ton of space, plus the space to install the programs on them. Videos use a ton also.

    I personally have about 100GB of anime, 50GB of other TV shows/movies, 100GB of ISOs and the like, plus a few gigs of music and maybe 50-100GB of video projects with Adobe Premiere.

    I know someone who has 2.6GB of anime, and I also know someone who has over 1TB of trance music (professional DJ). Now that's a lot.

  • (cs) in reply to A jaded user
    A jaded user:
    This story sucked. What business does this have on WTF?

    All the quality left when the site changed its name.

    I've been a daily visitor for almost a year; now I'm down to once a week. I think it's time to drop to once a month. The content has no quality.

    Why don't you drop your visits to once a decade then? Be sure and say hi when you stop by.

  • (cs) in reply to Ornedan
    Ornedan:
    PSWorx:
    Reminds me of a depth search based "maze solver" I wrote as a hobby project when I was still in school... or at least attempted to write...

    The program, like every good maze solver accepted a two dimensional array of "cells" as an input (where each cell was either accessible or an obstacle) and was to return the shortest possible route from the start to the destination cell.

    The program with its really cool recursive depth search algorithm I had proudly coded in VB at that time worked fine as long as it just had to find a path to the target. However, when I tried to find the shortest possible one, it kept mysteriously freezing on some mazes, especially on those with few obstacles...

    I tried and tried but no matter what, I could not find the error... until I realized that the program actually ran as expected, just that iterating through every single possible path on an almost blank 100x100-cell grid might just maybe not be the best idea...

    A* works for this, as has been suggested. Though I'd prefer Dijkstra's algorithm in this case - consistent good performance, whereas with A* your remaining distance estimation heuristic is likely to be wrong for some inputs.

    Nah, you want to write a genetic algorithm which searches for the best program (written in Visual Basic) to solve the problem. This works as follows:

    1. Start off with some random sequences of characters.
    2. Print each one out, take a digital photo on a wooden table, print that out, fax it to the processing office.
    3. Processing office scans in the fax, then runs OCR to recover the characters (this is the "mutation" step in the algorithm). The results are then printed out and entered by hand into a text file.
    4. An intern tries to run the contents of the file on a few mazes, and gives it an integer score out of 3.
    5. The genetic algorithm is emailed the results, and the mutated programs are dictated back to the mainframe by the secretary at the processing office.
    6. Genetic algorithm combines the results of the programs that got the highest scores, by randomly picking some characters from the first program and some characters from the second. This produces a new set of programs; return to step 2.

    To get marginally faster convergence, start off with solutions posted on thedailywtf in step 1.

  • (cs) in reply to speaking of wtf's
    speaking of wtf's:
    man this forum software blows

    So don't use it. Start your own site with the forum software of your choice, and visit there instead.

  • puzzled (unregistered) in reply to Carnildo

    Why would you want to download Wikipedia? That just sounds crazy...

  • eMGeee (unregistered)

    "....was only wonderful in comparison to writing on the sidewalk with the end of a burnt stick."

    Alex, you both cracked me up and gave me the courage to finally pitch out my Osborne.

  • Dark Shikari (unregistered) in reply to puzzled
    puzzled:
    Why would you want to download Wikipedia? That just sounds crazy...

    Well, it's only a few gigabytes 7zipped. Heck, you can get the entire full-page history, probably over a terabyte uncompressed, in just an 8.5GB 7zip file. They're available for download on the database dump page.

    They're in XML format, so you have to parse it first, though. And yes, it's one giant XML file. 0.o
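
    If you do want to chew through that one giant file, a streaming parser is enough; a rough sketch (Python standard library; the dump file name is made up, and the page/title element names follow the MediaWiki export schema, so check them against the real dump):

        import xml.etree.ElementTree as ET

        def iter_titles(dump_path):
            """Stream article titles out of a MediaWiki XML dump without loading it all."""
            for _, elem in ET.iterparse(dump_path, events=("end",)):
                tag = elem.tag.rsplit("}", 1)[-1]   # drop the xmlns prefix
                if tag == "title":
                    yield elem.text
                elif tag == "page":
                    elem.clear()                    # free finished pages as we go

        for i, title in enumerate(iter_titles("enwiki-latest-pages-articles.xml")):
            print(title)
            if i == 4:
                break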

  • (cs)

    A real WTFesque ending to the story would be if the emulated instruction logger was running on emulated instructions.

  • CraigL (unregistered)

    Given that this website is presumably read by actual geeks who enjoy laughing at boners made by their peers, not one of you geniuses actually understood the problem that G.R.G. was describing. To refresh your memory:

    "There was no way such a logger could ever possibly work. Can you figure out why? As it turns out, I didn't have nearly enough CPU power...Formatting a hexdump of a single 8080 instruction requires at least a few hundred instructions...Had I let it run..., I would not have lived long enough to watch CP/M boot."

    It was not a recursion problem, nor was it a disk space problem. The problem was that he (without thinking it through) increased the time required to execute each emulated 8080 instruction to the point that he would be waiting a long time before he saw CP/M boot. If we assume that the 8086 ran at an effective rate of a million instructions a second on a 4MHz clock (generous), and each 8080 instruction required 10 native instructions to emulate, the effective 8080 clock would be 100 kHz. Now, if you suddenly add 400 additional 8086 instructions for each 8080 instruction, you get 1M / 410, or a rate of about 2,439 emulated instructions per second, which is really slooooooow (and results in a disk fill rate of about 146K/sec, by the way).

    I suspect he could have easily solved the disk space problem by using a pseudo-breakpoint type scheme that relied on unused 8080 instructions to start and stop the logging, and could have been easily set up in the emulation. That way he could have quickly isolated the trouble area of the boot sequence (which I bet was in the CP/M BIOS disk I/O code).
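
    Something along these lines, roughly (a Python-flavoured sketch of the fetch-execute loop; 0x08 and 0x10 are two of the 8080's undefined opcodes, picked here purely for illustration, and cpu/memory/log are hypothetical objects):

        TRACE_ON, TRACE_OFF = 0x08, 0x10       # undefined 8080 opcodes, free for our own use

        def run(cpu, memory, log):
            tracing = False
            while cpu.running:
                opcode = memory[cpu.pc]
                if opcode == TRACE_ON:          # pseudo-breakpoint: start logging here
                    tracing, cpu.pc = True, cpu.pc + 1
                    continue
                if opcode == TRACE_OFF:         # ...and stop logging here
                    tracing, cpu.pc = False, cpu.pc + 1
                    continue
                if tracing:
                    log.write(f"{cpu.pc:04X}: {opcode:02X} {cpu.registers()}\n")
                cpu.step(opcode)                # normal emulation of one instruction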

    You kiddies have absolutely no concept of what was involved in writing code down on the bare metal. Multi-core, multi-Ghz processor speeds, terabyte hard drives, and high-level languages have spoiled you all.

  • totolamoto (unregistered) in reply to DrYak

    The NEC V20/V30 clones could even be switched into an emulation mode where they ran 8080 binaries directly. c't even had an article where they set up a CP/M-80 emulator for CP/M-86 machines.

  • Jungleman (unregistered) in reply to Dark Shikari
    Dark Shikari:
    snoofle:
    Philippe:
    It wasn't a recursion problem. As others have noted, it was more of a scaling problem - the amount of disk space required to store the log data was quickly exhausted.

    This is a WTF along the lines of "no one will ever need more than 640K of RAM" and "14MB of disk space is such a huge amount of disk that no conceivable process could ever fill it up."

    You youngsters and your terabyte filesystems and gigahertz processors...

    I have twin 114GB drives on a 3-year-old system (not too shabby for its day). Even with a few thousand photos of the kids, hundreds of songs and a handful of movies, I've only used about 30GB, and most of that is Windows updates.

    Just curious: what do people put on these huge disks to fill them up (I mean besides porn)? And how much of it is actually useful?

    Well, ISOs take up a whole ton of space, plus the space to install the programs on them. Videos use a ton also.

    I personally have about 100GB of anime, 50GB of other TV shows/movies, 100GB of ISOs and the like, plus a few gigs of music and maybe 50-100GB of video projects with Adobe Premiere.

    I know someone who has 2.6GB of anime, and I also know someone who has over 1TB of trance music (professional DJ). Now that's a lot.

    ... 2.6TB? Anime? TWO POINT SIX TB? I already find your 100GB of anime a lot. I mean, anime. ANIME. You remind me of this fellow: http://www.youtube.com/watch?v=s3Fzt7EPrI0

  • bramster (unregistered)

    CP/M AND a hard drive?

    As I recall, CP/M used 8" floppy drives...

    And there was BASIC, dBase II...

  • Anonymous (unregistered) in reply to Dark Shikari
    Dark Shikari:
    puzzled:
    Why would you want to download Wikipedia? That just sounds crazy...

    Well, it's only a few gigabytes 7zipped. Heck, you can get the entire full-page history, probably over a terabyte uncompressed, in just an 8.5GB 7zip file. They're available for download on the database dump page.

    They're in XML format, so you have to parse it first, though. And yes, it's one giant XML file. 0.o

    Can anyone explain why they would only offer their database dump in XML? It seems like that's one of the least efficient ways of doing it.

  • bramster (unregistered) in reply to Anonymous
    Anonymous:
    Dark Shikari:
    puzzled:
    Why would you want to download Wikipedia? That just sounds crazy...

    Well, it's only a few gigabytes 7zipped. Heck, you can get the entire full-page history, probably over a terabyte uncompressed, in just an 8.5GB 7zip file. They're available for download on the database dump page.

    They're in XML format, so you have to parse it first, though. And yes, it's one giant XML file. 0.o

    Can anyone explain why they would only offer their database dump in XML? It seems like that's one of the least efficient ways of doing it.

    That's the WTF. Disk drives are getting huge in order to handle very well-compressed images, and XML is pushing up the size requirements with its verbose methodology.

    Seagate will be taken over by Parkinson.

  • Loren Pechtel (unregistered) in reply to PSWorx
    PSWorx:
    Reminds me of a depth search based "maze solver" I wrote as a hobby project when I was still in school... or at least attempted to write...

    The program, like every good maze solver accepted a two dimensional array of "cells" as an input (where each cell was either accessible or an obstacle) and was to return the shortest possible route from the start to the destination cell.

    The program with its really cool recursive depth search algorithm I had proudly coded in VB at that time worked fine as long as it just had to find a path to the target. However, when I tried to find the shortest possible one, it kept mysteriously freezing on some mazes, especially on those with few obstacles...

    I tried and tried but no matter what, I could not find the error... until I realized that the program actually ran as expected, just that iterating through every single possible path on an almost blank 100x100-cell grid might just maybe not be the best idea...

    Yeah--maze solvers need to work breadth-first and log what they've done. The time is linear with regard to the number of cells no matter what.

  • whicker (unregistered) in reply to CraigL
    CraigL:
    You kiddies have absolutely no concept of what was involved in writing code down on the bare metal. Multi-core, multi-Ghz processor speeds, terabyte hard drives, and high-level languages have spoiled you all.

    Uhm.

    Writing code for simple microprocessors in assembly is so easy it's almost fun. There's like 30 actual instructions to remember... no memory controller, no cache to worry about, full control over peripheral chips like serial ports, simple JSR instructions (or interrupts) into the BIOS, etc.

    And yes, I have used an EPROM eraser and programmer. Quite quaint, yet effective if one keeps a rotating supply of them.

    But then again, if somebody told me to set up a database server without assistance, I'd be mostly clueless.

    I guess what I'm getting at is don't group us "kiddies" together.

  • MoTH (unregistered) in reply to snoofle
    snoofle:
    I have twin 114GB drives on a 3-year-old system (not too shabby for its day). Even with a few thousand photos of the kids, hundreds of songs and a handful of movies, I've only used about 30GB, and most of that is Windows updates.

    Just curious: what do people put on these huge disks to fill them up (I mean besides porn)? And how much of it is actually useful?

    Music and movies. I can still fit my photo collection on a single-layer DVD, no problem.

    When I rent DVDs I rent 7 at a time (better deal), rip them to the HD, and return them immediately (no late charges). Then I watch them when I damn well please. At any time I could easily have 80GB of ripped DVDs.

    Music is about 30GB or so right now.

    Music editing files (I do some home recording)... at multiple tracks per song in 24-bit 48kHz, well, it takes up a LOT of space.

    Software, various OSs (I usually run at least a dual boot between some Linux distro and one version of Windows, although I have run multiple versions of Windows).

    Currently, of my 330GB of space I have less than 100GB free (more likely around 50GB). And even though I am young, I do remember thinking 40MB was huge and impossible to fill. Now if I had 40MB left on a partition I'd freak out because it is full.

  • Loren Pechtel (unregistered) in reply to RON
    RON:
    Should have used a heuristic breadth-first search to weight cells based on if they are moving closer to your destination or not. It's not a perfect solution, but it returns the shortest route for 99% of all test cases.

    That's still not the right answer.

    distance : array [xsize, ysize] of integer;
    for all cells: if empty[cell] then distance[cell] := 0 else distance[cell] := maxint
    distance[start] := 1; put start in queue; done := false

    repeat
        get test from queue
        current := distance[test]; inc(current)
        check(test.x - 1, test.y); check(test.x, test.y - 1)
        check(test.x + 1, test.y); check(test.x, test.y + 1)
    until queue empty or done

    if done then
        current := finish; push finish
        repeat
            test := distance[current]
            if distance[current.x - 1, current.y] = test - 1 then current.x := current.x - 1
            else // I'm not going to bother with the others
            push current
        until current = start
        // The shortest path is now on the stack

    procedure check(xy)
        if distance[xy] = 0 then begin
            distance[xy] := current; put xy in queue
            if xy = finish then done := true
        end

    Note the performance situation:

    The runtime is inherently limited to 4 calls to check per cell on the board no matter how empty the maze. The shortest path is always found.

  • (cs)

    I got a Commodore 128 around 1986 and it featured triple booting: C64, C128, and CP/M. C64 mode was essential for running video games and C128 mode had a much nicer dialect of BASIC. I thought that CP/M mode must be useful for something, but all it seemed to do was "store files, load code into memory, and run code in memory". Considering that there were no programs included and that I never found any software online, I really wonder what the designers thought was so great about including CP/M.

  • Speng (unregistered)

    You 'pore' over code, not 'pour' over it. Sheesh.

  • speaking of wtf's (unregistered) in reply to KenW
    KenW:
    A jaded user:
    This story sucked. What business does this have on WTF?

    All the quality left when the site changed its name.

    I've been a daily visitor for almost a year; now I'm down to once a week. I think it's time to drop to once a month. The content has no quality.

    Why don't you drop your visits to once a decade then? Be sure and say hi when you stop by.

    I have to agree, the stories suck, the forum software blows, and even this idiot user I'm replying to has a brick for a brain. I say shut the site down before it SUCKS THE FUN OUT OF THE ENTIRE INTERNET.

  • PB (unregistered) in reply to Alex

    I guess the "digital cassette" storage I had beat you both. I could actually get my BASIC program restored with only a little corruption.

  • Nils (unregistered)

    The real WTF is that it took more than two days to write a fully functional 8080 emulator.

    That's such an easy cpu.

  • TheRealBill (unregistered) in reply to snoofle
    snoofle:
    I have twin 114GB drives on a 3-year-old system (not too shabby for its day). Even with a few thousand photos of the kids, hundreds of songs and a handful of movies, I've only used about 30GB, and most of that is Windows updates.

    Just curious: what do people put on these huge disks to fill them up (I mean besides porn)? And how much of it is actually useful?

    Video and games. Videos of the family (birthday parties, holidays, etc.), making them into DVDs, and so on take up quite a bit of storage space. Ever see how much an hour of video capture takes up? Let's just say it isn't small.

    Then, for those with MS Office, there are several GB of data. Many games these days install several gigs as well. Heh, I was installing a bunch of my old games from the Win95 era of PC games.

    It was quite amusing to see games have a "normal install" and a "full install", with the full install confirming I wanted to install 30MB of game data. Oh, the horror! Did I have room? Where oh where could I squeeze it in?

  • TheRealBill (unregistered) in reply to pistole
    pistole:
    ahh CP/M ..

    I learned BASIC (and assembly) programming on a Z80 running CP/M (mind you, an Exidy Sorcerer, for those old and grey enough to remember) back in the '80s. Brings back good memories :)

    "grey enough" implies we still have hair.
