• (cs) in reply to Anon
    Anon:
    It's a Feature:
    How simple! On the web server, just put a job in the task scheduler that runs once a day with this in the command line:

    http://localhost/dispatchqueued.asp

    And the WTF will live on!

    Don't laugh. That's how Drupal, a popular open source CMS, works.

    (It's even the CMS that Tim Berners-Lee uses for his blog. Yes, that Tim Berners-Lee.)

    And if the part that sounds WTF-y is the bit where the cron job opens an http connection to start the script, keep in mind that

    php public_html/drupal/cron.php

    should work just fine.
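
    For what it's worth, a rough sketch of both variants as crontab entries, assuming a Unix-y host (the schedule and paths here are only placeholders):

    # Variant 1: have cron fetch the page over HTTP
    0 3 * * * wget -q -O /dev/null http://example.com/drupal/cron.php
    # Variant 2: run the script directly with the PHP CLI, no web server involved
    0 3 * * * php /home/user/public_html/drupal/cron.php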

  • Rewt (unregistered)

    I worked at a company that had a process much like this one. Theirs was end-of-month, though: 3 days before the last day of the month, they would go to a web page (unsecured, I might add) and click submit buttons to run processes. After several failed attempts by myself and others to convince management that this wasn't a safe way to do things, the page somehow got found by spambots, and EOM reports ended up being run repeatedly on any given day.

    They finally wised up.

  • Hah (unregistered)

    To get Rails to work on a shared host, I set up a cron job on my local machine that polled the site every 5 minutes. That took care of the 25-second load time on the first hit. That's another WTF, I suppose.

  • Diego (unregistered) in reply to phaedrus
    phaedrus:
    Andrew:
    Use CGI to spawn your own cron daemon that reads a flat-file list of scripts. This is easy to do using Perl fork/exec. I don't know how to make PHP spawn a child process.

    pcntl_fork()/pcntl_exec() are how PHP exposes those system calls. It only took a quick search for 'fork' in the online PHP manual to find them.

    I'd use PHP 5 as my main language, but it is too easy to make a real mess, because even the simplest errors are detected only at runtime. And a silly typo is enough to spawn nasal demons, since we don't declare variables :(

    It is a great language with a great variety of tools, but it needs to be more robust to be really useful.
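
    A minimal sketch of that fork/exec pattern in PHP (CLI only, and it assumes the pcntl extension is available; /usr/bin/php and jobs/cleanup.php are hypothetical paths):

    <?php
    // Fork a child process and have it exec a maintenance script,
    // roughly what pcntl_fork()/pcntl_exec() expose to PHP code.
    $pid = pcntl_fork();

    if ($pid === -1) {
        die("fork failed\n");
    } elseif ($pid === 0) {
        // Child: replace this process image with the job script.
        pcntl_exec('/usr/bin/php', ['jobs/cleanup.php']);
        exit(1); // only reached if exec fails
    } else {
        // Parent: wait for the child so it doesn't become a zombie.
        pcntl_waitpid($pid, $status);
        echo "job exited with status " . pcntl_wexitstatus($status) . "\n";
    }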

  • aaa (unregistered) in reply to pitchingchris

    "Not to mention it sent everything down to one point of failure, so if Frank goes into a coma for awhile, we just can't get cards then"

    Then I hope Frank goes into a coma. I hate receiving e-cards.

  • (cs) in reply to aaa
    aaa:
    "Not to mention it sent everything down to one point of failure, so if Frank goes into a coma for awhile, we just can't get cards then"

    Then I hope Frank goes into a coma. I hate receiving e-cards.

    That comment wins. XD

    -- Seejay

  • Brad (unregistered)

    TI in IT? Oh god, that sounds awful. Like "case of the Mondays" or "Michael Bolton" awful.

  • Chance the Gardener (unregistered) in reply to zip
    zip:
    so, just check to see if it's time for your "job" on every page load...

    If you mean something like 'there is a 0.0001% chance that any given page load will kick off garbage collection', that seems reasonable to me; I have used that trick to clean up session variables. Long tasks I would avoid doing this way, though, as the user who 'gets lucky' doesn't want a huge wait.
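
    In PHP that trick can be as small as the sketch below; cleanup_expired_sessions() is a hypothetical stand-in for whatever the actual cleanup work is:

    <?php
    // Roughly a 1-in-a-million chance that this request also pays
    // the cost of cleaning up stale session data.
    if (mt_rand(1, 1000000) === 1) {
        cleanup_expired_sessions();
    }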

  • ForcedSterilizationsForAll (unregistered) in reply to pitchingchris
    pitchingchris:
    Think of all the time he'll never get back because he has to go send all the jobs manually.... Not to mention it sent everything down to one point of failure, so if Frank goes into a coma for awhile, we just can't get cards then. You'll get him once he comes out of it.

    He'll get his "Get Well Soon" e-card then as well.

  • nobody (unregistered) in reply to zip
    zip:
    Does anyone know of a better way to do this? I guess a cron job on my local machine invoking a page load once a day would be better, in case I wasn't getting any traffic :)

    Get a free site uptime monitor and have it ping a particular script a few times a day. :-P

  • (cs)

    Now he can add buying several pizzas on Christmas Day to his scheduled tasks.

  • LKM (unregistered)

    I'm also guilty of having had locally-run cron jobs that launched wget to fetch pages that, in turn, kicked off scheduled tasks on a server, because the provider wouldn't allow any better solution.

  • (cs)

    So Frank didn't know (or didn't care to know) how to schedule jobs. Big deal. But he didn't care at all to make the process safe and foolproof. That's a big deal - and it's the typical devil-may-care attitude I have observed often enough in fellow developers when it comes to system management issues. And it is the very reason that developers are not to be trusted with direct access to production applications and servers unless supervised by a responsible adult .... I am not kidding.

  • (cs) in reply to Someone
    Someone:
    I've seen this sort of thing on websites that are hosted cheaply in a multihosting environment, where you have PHP and MySQL but no cron (or at least no access to it). What they did was write a short snippet of code vaguely emulating cron's behavior and include it at the bottom of every PHP page of the site.

    This is sort of how the session cleanup code in the Common Lisp web server Hunchentoot works (or at least used to work): every 50th request that uses sessions triggers the session garbage collector, which deletes sessions that have outlived their permitted age. Not actually a horrible solution in this particular case.
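
    A rough PHP equivalent of that "every Nth request" idea might look like the following sketch; the counter file and purge_old_sessions() are hypothetical, and the counter isn't atomic, which is good enough for best-effort cleanup:

    <?php
    // Bump a shared request counter and run session cleanup on every 50th hit.
    $counterFile = sys_get_temp_dir() . '/request_counter';

    $count = (int) @file_get_contents($counterFile);
    $count++;
    file_put_contents($counterFile, (string) $count, LOCK_EX);

    if ($count % 50 === 0) {
        // This request pays the price of purging expired sessions.
        purge_old_sessions();
    }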

  • D (unregistered)

    I have actually done this myself. At a previous job, there was a directory of temporary files that needed to be cleaned out periodically. I created a simple web page that deleted everything in that directory and forwarded me to our intranet site. I knew it was kludgy, but it took 2 minutes to create and worked until I was let go from that company.

    The bonus is that I knew that after I was fired, there was at least one process that would screw up and cause someone to have to research and fix it. Bwah-ha-ha!
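
    Something along these lines captures the idea (the directory path and redirect target here are made up, and a real version would at least want some access control):

    <?php
    // Wipe everything in the temp directory, then bounce back to the intranet.
    $tmpDir = '/var/www/app/tmp';

    foreach (glob($tmpDir . '/*') ?: [] as $file) {
        if (is_file($file)) {
            unlink($file);
        }
    }

    header('Location: http://intranet.example.com/');
    exit;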
