We've all heard the stories of spectacular failure from the dot-com bubble a few years back. Spectacular failure, however, wasn't limited to just the big names like Kozmo.com, WebVan, eToys, Pets.com, etc. Rob "Scruffy" Rescorla had the pleasure of working at a smaller dot-com that managed to secure only a few million dollars in start-up capital ...

Many years ago, I landed a job as a programmer with a search engine/directory company. It seemed like a great idea at the time. Soon, everyone would have free access to the Internet (thanks to the people at FreeInternet.com, NetZero, etc.), and they'd want a reliable and trusted way to find things on it. That's where my employer came in.

Even though this search engine only covered the UK, the web servers were hosted somewhere in Canada. But that was OK, so said management, because soon all ISPs would be able to get great bandwidth across the Atlantic.

Unlike some of the other dot-coms at the time, my employer didn't pamper its employees with pool tables or really trust them with distractions like, oh, say, the Internet. But that was OK; we could fairly easily develop and update the code offline, then go to the office's lone Internet-connected Windows 95 PC and upload the new code to the servers.

We also didn't have any sort of web spider, crawler, or any other fancy gizmo to get sites into our index. If you wanted your site indexed, you had to fill out a web form that would, in turn, save a file on the web server. It was the job of one of the data entry clerks to go to the Internet PC and download the files to disk to be printed. From there, the data entry team would key the data into our local system through a Microsoft Access database.

As part of being the "reliable and trusted" search engine, we wanted to make sure that the sites submitted actually existed. So before uploading the sites, another data entry clerk would visit each and every one of the newly entered sites on the Internet PC. If the site was legit, the clerk would mark it as valid and fire off an email to the address on the submission form. Sites that wouldn't load were rejected, and the checker would send an appropriate email.

Once the index of new sites was built, it was time to upload it and merge it into the database. The developers didn't believe that databases could possibly be queried in real time, so they had Perl scripts that would regenerate the category and summary pages whenever a category record was updated. But there were some problems with this -- and that's where I was brought in to help.

With all the recent records being added, we had about 200,000 generated files on the remote server. Whenever a record was added in the middle of the list, every file after the insertion point had to be moved and renumbered. And since records were inserted one at a time, a move script covering just fifteen new items would end up invoking over a million individual move commands. As you might imagine, there was little or nothing that I could do to help.
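To put that in perspective, here's a minimal sketch of the renumbering scheme as I understood it. The filenames, the helper function, and the use of Python in place of the original Perl are all illustrative assumptions on my part:

    import os

    PAGES = 200_000  # roughly the size of the generated index, per the story

    def insert_page(position, total=PAGES):
        # Make room for one new record at `position` by renumbering every
        # generated file that comes after it. Walk backwards so that no
        # file is overwritten before it has been moved.
        for i in range(total, position - 1, -1):
            os.rename(f"page{i:06}.html", f"page{i + 1:06}.html")
        # ...the newly generated page for `position` would be written here...

Each insertion near the front of the list costs on the order of 200,000 renames, so fifteen one-at-a-time insertions really do work out to well over a million move commands.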

I didn't last too long there, but I do remember the last conversation I had before walking out ...

Boss: We're taking an inventory -- how much RAM does your new workstation have?
(referring to my workstation, a new Windows 3.1 machine)
Me: It has 13 megabytes.
Boss: No, that's not possible.
Me: Well, that's what it has ...
(goes and talks to another developer)
Boss: Jerry says that you can only have 8 or 16. Are you lying to me?
Me: Umm ... look ...
(flicks the computer off, turns it back on, and shows the 13MB figure in the BIOS)
Boss: Oh.
Me: ...
Boss: Don't shut your computer down like that.
