• justanewbie (unregistered)

    No.

  • A cat with belt and suspenders (unregistered)

    Title

  • And of course... (unregistered)

    Unless someone's monkeyed with the permissions, you can still do the normal one, which is what anyone "not in the know" would do anyway.

  • Cornify (unregistered)

    ${GREP} all the things

  • AB (unregistered)

    This script makes more sense than you think.

    cat, when called with no arguments, sits there waiting for input from stdin. Not a behavior you want if all you wanted to do was show a file.

    Called with a non-existent file, cat prints an error on stderr - something the alternative does not do.

    And finally, there are lots of reasons to distrust the PATH variable and aliases on a system that the script writer does not control.

    This is good defensive programming for scripts that try to be production quality rather than ad hoc hacks.
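
    For illustration, a minimal sketch of a wrapper along those lines (the name cat_file and the exact checks are my guesses, not the article's actual code):

    CAT=/bin/cat

    cat_file() {
        # Require exactly one argument, so a missing operand can't make cat hang on stdin.
        if [ $# -ne 1 ]; then
            echo "usage: cat_file FILE" >&2
            return 1
        fi
        # Fail loudly on a missing file instead of relying on downstream behavior.
        if [ ! -f "$1" ]; then
            echo "cat_file: $1: no such file" >&2
            return 1
        fi
        "$CAT" -- "$1"
    }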

  • Bert (unregistered)

    Personally, I'd find /bin/grep much easier to type than the finger-gymnastics of ${GREP}, probably even $GREP.

    Also, PSA: use $() instead of backticks if your shell supports it (it usually will)!
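
    For example, $() nests without the escaping gymnastics that backticks need (a made-up example):

    parent=$(basename $(dirname /a/b/c))     # nests cleanly
    parent=`basename \`dirname /a/b/c\``     # the same thing with backticks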

  • useless use of cat (unregistered)

    It's not all that bad. The difference shows clearly between 'cat | grep x' and 'cat_file | grep x': cat_file is actually a wrapper that handles the fact that cat reads stdin by default. In a script, that would result in waiting for input indefinitely.

  • gleemonk (unregistered)

    This wins the useless use of cat_file award!

  • gleemonk (unregistered)

    I'd do

    dev=$(grep eth < /proc/net/dev)

    The entire thing just demonstrates what a shit-show shell programming really is. Don't do Bourne if your idea of "production quality" includes things like robustness against a borkened PATH.

  • Foo AKA Fooo (unregistered)

    If you don't trust the PATH, just reset it to a good value at the start of your script. As a bonus, this is also inherited by all programs your script calls.
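
    In practice that's just two lines at the top (the directory list is system-dependent, of course):

    PATH=/usr/bin:/bin:/usr/sbin:/sbin
    export PATH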

  • p. dexter (unregistered)

    is grep's -f switch cursed or something?

  • Bert (unregistered) in reply to p. dexter

    -f FILE retrieves patterns from FILE rather than using FILE as input. You can just give it a list of files as arguments though!
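
    In other words (hypothetical file names):

    grep -f patterns.txt input.log     # patterns come FROM patterns.txt, one per line
    grep eth input1.log input2.log     # plain form: pattern on the command line, files as input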

  • (nodebb)

    TRWTF is loading the whole file into memory before sending it to stdout. Must be very efficient with huge files…

    And as a bonus, this wrapper does not even behave exactly as cat would:

    • When reading the file content into the "temp" variable, all trailing newlines are stripped. "Fortunately", they add one newline back in the output by calling echo, so in most cases they won't notice the difference (demonstrated below)
    • They check whether the file exists (and return 1), but other than that they will joyfully ignore any cat failures and return a zero exit status in all cases
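
    A quick way to see the newline behaviour for yourself (shell session, hypothetical file):

    $ printf 'hello\n\n\n' > f     # file ends in three newlines
    $ temp=$(cat f)                # command substitution strips all of them
    $ echo "$temp" | od -c         # echo puts exactly one back
    0000000   h   e   l   l   o  \n
    0000006
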
  • Eric (unregistered)

    The checks in cat_file make sure cat is never used with multiple files or with any switches that change the behavior, in addition to preventing it from just waiting on stdin. If the arguments are generated by variable/command/etc. expansion or might have spaces, those could be issues. As for $GREP and similar, one use I can see for those is debugging during development. The commands could be replaced with a function or script (say, one that logs the arguments to a file, then uses tee to copy the output into that same file) everywhere, just by setting the variable once.
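
    A sketch of that debugging trick (the function name and log path are invented):

    # Log each invocation, then tee the command's output into the same log.
    grep_logged() {
        echo "grep $*" >> /tmp/script-debug.log
        /bin/grep "$@" | tee -a /tmp/script-debug.log
    }
    GREP=grep_logged     # every existing ${GREP} call site now logs itself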

  • my name is missing (unregistered)

    On the internet no one knows you are a cat.

  • snoofle (unregistered)

    Another reason to put utility paths into variables (e.g. GREP=/bin/grep) is to prevent PATH searches in deeply nested loops - sometimes it can make a difference, especially if the PATH is huge.
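
    For instance (a hypothetical loop):

    GREP=/bin/grep
    for f in /var/log/*.log; do
        "$GREP" -c ERROR "$f"     # absolute path: no PATH search on any iteration
    done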

  • JimTonic (unregistered) in reply to Cornify

    You're a ${GREP}ist!!!

  • Stephen (unregistered)

    Patterns such as "GREP=/bin/grep" used to be common when people wrote scripts that might run across many different environments, especially when the OS could have a mixture of BSD and SysV utilities; I might want /usr/ucb/ps but /bin/ls, so I'd hard-code the paths to the utilities. Porting then becomes just modifying the constants at the top.

    There's not much justification in an embedded system running a dedicated OS, but maybe the scripts have a historical legacy and just haven't been refactored.
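
    The pattern looked roughly like this, with porting reduced to editing these constants (the paths are illustrative):

    PS=/usr/ucb/ps      # BSD-flavoured ps on this box
    LS=/bin/ls
    GREP=/bin/grep

    $PS aux | $GREP httpd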

  • Herby (unregistered)

    This is the dreaded "Inner Platform Effect" in spades. Wrappers just to show what you REALLY mean, even if you don't know it yourself.

    Yes, one can write shell scripts, and when doing so you find out that the shell actually has lots of "helps" that take away the need for just about any helpers at all. With the current project I'm working on, there are a multitude of shell scripts that, once I look at them, make me say "what were they thinking", and then I remind myself that they have accreted, piece by piece, over a span of more than 20 years. These accretions, very well intentioned, sometimes look like a hack that someone wrote just to make it to the next step, and after many years the code is FULL of them. There is no sorting it out, as the whole thing is MANY thousands of lines, and I'm sure that no ONE person actually knows all the ins and outs that lie therein.

    Rewrite for clarity? I have doubts. This is a test suite that is both working, AND comprehensive, and a rewrite would take years, and accomplish little. If it works, don't fix it. Life goes on.

    Oh yes, you can usually tell a noob when you see:

    $ cat file | grep pattern

    In any form.
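
    For the record, the idiomatic forms are:

    $ grep pattern file       # grep takes filename arguments directly
    $ grep pattern < file     # or redirect, if stdin semantics are wanted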

  • Fernando (unregistered)

    Damn straight, no one trusts the PATH variable. Your script will be run by a user whose $HOME/bin/rm is a script that does "/bin/rm -i".
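
    That is, a two-line shadow like this (hypothetical) sits earlier in PATH than the real binary:

    #!/bin/sh
    # ~/bin/rm: force interactive deletion, then hand off to the real rm
    exec /bin/rm -i "$@"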

  • thosrtanner (unregistered)

    Re "those heavy duty paper cutters that are basically guillotines" - when I worked in a (book) printers, they were called guillotines.

  • Guest (unregistered) in reply to Stephen

    cron jobs are a good example of the PATH variable being unreliable.
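
    A common fix is to pin PATH in the crontab itself (the directories and script name are illustrative):

    PATH=/usr/bin:/bin
    */5 * * * * /opt/scripts/check_link.sh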

    Whoever wrote this script obviously knows more about shell scripting than the person who wrote the article.

  • Jérôme Grimbert (google)

    cat_file is useful:

    • it allows only one file at a time
    • it checks that the file exists

    But it forgets a few things (sketched below):

    • checking that the file is readable
    • checking that the file is not empty
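
    In test syntax, those two missing checks would look something like this (a sketch):

    [ -r "$1" ] || { echo "cat_file: $1: not readable" >&2; return 1; }
    [ -s "$1" ] || { echo "cat_file: $1: empty" >&2; return 1; }
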
  • Head-Banger (unregistered) in reply to Guest

    Absolutely correct. The only real bug in the presented code is that the file data is stored in a buffer and echoed out again rather than being piped to the next command as plain old cat would do. This becomes an issue for large files.

  • AWK (unregistered)

    I maintain a large codebase that originally ran on Minix, later OpenServer, and now Linux. If I had a dollar for each hour we've spent writing defensive wrappers that mimic the (mis)behaviour of shell commands, stdlib, and the compiler on OpenServer, I'd be able to take half a year of unpaid leave. I would not be surprised in the least if the software controlling this printing press is ported from something really obscure, with its own set of quirks.

  • Dan Mercer (unregistered) in reply to gleemonk

    I haven't seen a UUOC award in 15 years. You're obviously an old Usenetter like me.

  • (nodebb)

    This is indeed a correct implementation in pure shell, but you really, really do not want to use it in production if there are any big files to process, because the performance of bash I/O with read is abysmal (the shell reads the input byte by byte...).
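
    For reference, a pure-shell copy loop of the kind being described looks like this (a sketch; the article's exact code isn't shown here):

    # Far slower than cat's block-sized reads, and it also drops a final
    # line that lacks a trailing newline.
    while IFS= read -r line; do
        printf '%s\n' "$line"
    done < "$file"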
