Admin
Unless someone's monkeyed with the permissions, you can still do the normal one, which is what anyone "not in the know" would do anyway.
Admin
${GREP} all the things
Admin
This script makes more sense than you think.
cat, when called with no arguments, sits there waiting for input from stdin. Not a behavior you want if all you wanted to do was show a file.
Called with a non-existent file, cat prints an error on stderr - something the alternative does not.
And finally, there are lots of reasons to distrust the PATH variable and aliases on a system that the script writer does not control.
This is good defensive programming for scripts that try to be production quality rather than ad hoc hacks.
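For anyone who hasn't seen the article's code, a defensive wrapper along these lines is the sort of thing being described - just a sketch, not the actual script:

cat_file() {
    # expect exactly one plain file argument - no switches, no stdin fallback
    if [ "$#" -ne 1 ]; then
        echo "usage: cat_file FILE" >&2
        return 1
    fi
    if [ ! -f "$1" ]; then
        echo "cat_file: $1: no such file" >&2
        return 1
    fi
    /bin/cat "$1"
}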
Admin
Personally, I'd find /bin/grep much easier to type than the finger-gymnastics of ${GREP}, probably even $GREP.
Also, PSA: use $() instead of backticks if your shell supports it (it usually will)!
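For example, $() nests cleanly where backticks need escaping:

# backticks: the inner pair has to be escaped
outer=`basename \`pwd\``
# $(): nests without any escaping
outer=$(basename "$(pwd)")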
Admin
It's not all that bad. The difference clearly shows between 'cat | grep x' and 'cat_file | grep x': cat_file is actually a wrapper that handles the fact that cat will read stdin by default. In a script, that would result in waiting for input indefinitely.
Admin
This wins the useless use of cat_file award!
Admin
I'd do
dev=$(grep eth < /proc/net/dev)
The entire thing just demonstrates what a shit-show shell programming really is. Don't do Bourne if your idea of "production quality" includes things like robustness against a borked PATH.
Admin
If you don't trust the PATH, just reset it to a good value at the start of your script. As a bonus, it is then also inherited by all programs your script calls.
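Something like this near the top of the script (adjust the directories to taste):

# replace whatever PATH the caller had with a known-good one
PATH=/usr/bin:/bin:/usr/sbin:/sbin
export PATH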
Admin
is grep's -f switch cursed or something?
Admin
-f FILE retrieves patterns from FILE rather than using FILE as input. You can just give it a list of files as arguments though!
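i.e. (the file names here are made up):

# -f reads the patterns from a file...
grep -f patterns.txt input.log
# ...while plain file arguments are the files to search
grep eth /proc/net/dev /var/log/messages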
Admin
TRWTF is loading the whole file into memory before sending it to stdout. Must be very efficient with huge files…
And as a bonus, this wrapper does not even behave exactly as cat would:
Admin
The checks in cat_file make sure cat is never used with multiple files or with any switches that change the behavior, in addition to preventing it from just waiting on stdin. If the arguments are generated by variable/command/etc. expansion or might have spaces, those could be issues.

As for $GREP and similar, one use I can see for those is debugging during development. The commands could be replaced with a function or script (say, one that logs the arguments to a file, then uses tee to copy the output into that same file) everywhere, just by setting the variable once.
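A rough sketch of what I mean - not anything from the article, and the log path is made up:

debug_grep() {
    echo "grep called with: $*" >> /tmp/grep-debug.log
    /bin/grep "$@" | tee -a /tmp/grep-debug.log
}
GREP=debug_grep
# every existing ${GREP} call now logs its arguments and output
${GREP} eth /proc/net/dev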
Admin
On the internet no one knows you are a cat.
Admin
Another reason to put utility paths into variables (e.g. GREP=/bin/grep) is to avoid PATH searches in deeply nested loops - sometimes it can make a difference, especially if the PATH is huge.
Admin
You're a ${GREP}ist!!!
Admin
Patterns such as "GREP=/bin/grep" used to be common when people wrote scripts that might run across many different environments, especially when the OS had a mixture of BSD and SysV utilities; I might want /usr/ucb/ps but /bin/ls, and so I'd hard-code the path to each utility. Porting then becomes just a matter of modifying the constants at the top.
There's not much justification in an embedded system running a dedicated OS, but maybe the scripts have a historical legacy and just haven't been refactored.
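The classic header looked something like this (paths and commands from memory, not from the article):

# per-platform utility paths, adjusted once when porting
PS=/usr/ucb/ps      # BSD-flavoured ps on this particular box
LS=/bin/ls
GREP=/bin/grep
$PS aux | $GREP httpd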
Admin
This is the dreaded "Inner Platform Effect" in spades. Wrappers just to show what you REALLY mean, even if you don't know it yourself.
Yes, one can write shell scripts, and when doing so you find out that the shell actually has lots of built-in "helps" that take away the need for just about any helpers at all. With the current project I'm working on, there are a multitude of shell scripts where I look at them and say "what were they thinking?", and then remind myself that they have accreted over a span of more than 20 years, piece by piece. These accretions, however well intentioned, often look like a hack someone wrote just to make it to the next step, and after many years the code is FULL of them. There is no sorting it out: the whole thing is many thousands of lines, and I'm sure no ONE person actually knows all the ins and outs that lie therein.
Rewrite for clarity? I have doubts. This is a test suite that is both working AND comprehensive; a rewrite would take years and accomplish little. If it works, don't fix it. Life goes on.
Oh yes, you can usually tell a noob when you see:
$ cat file | grep pattern
In any form.
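The idiomatic forms being, of course:

grep pattern file
# or, if you really want the redirection:
grep pattern < file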
Admin
Damn straight, no one trusts the PATH variable. Your script will be run by a user whose $HOME/bin/rm is a script that does "/bin/rm -i".
Admin
Re "those heavy duty paper cutters that are basically guillotines" - when I worked in a (book) printers, they were called guillotines.
Admin
cron jobs are a good example of the PATH variable being unreliable.
Whoever wrote this script obviously knows more about shell scripting than the person who wrote the article.
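Most crons start jobs with a bare-bones PATH (often just /usr/bin:/bin), which is why crontabs routinely set it explicitly - e.g. (backup.sh is a made-up example):

PATH=/usr/local/bin:/usr/bin:/bin
15 3 * * * backup.sh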
Admin
cat_file is useful:
But it forgets a few things:
Admin
Absolutely correct. The only real bug in the presented code is that the file data is being stored in a buffer and echoed out again rather than being piped to the next command as plain-old cat would do. This becomes an issue for large files.
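i.e. roughly the difference between these two (a sketch - the original code isn't quoted here):

# buffer the whole file into a shell variable, then echo it back out
data=$(/bin/cat "$1")
printf '%s\n' "$data"
# (and $(...) strips trailing newlines, another way it differs from cat)

# versus simply streaming it
/bin/cat "$1"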
Admin
I maintain a large codebase that originally ran on Minix, later OpenServer, and now Linux. If I had a dollar for each hour we've spent writing defensive wrappers that mimic the (mis)behaviour of the shell commands, stdlib and compiler on OpenServer, I'd be able to take half a year of unpaid leave. I would not be surprised in the least if the software controlling this printing press was ported from something really obscure, with its own set of quirks.
Admin
I haven't seen a UUOC award in 15 years. You're obviously an old Usenetter like me.
Admin
This is indeed a correct implementation in pure shell, but you really, really do not want to use it in production if there are any big files to process, because the performance of bash I/O with read is abysmal (the shell will read the input byte by byte...).
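Presumably the loop in question looks something like this (a guess - the full script isn't shown):

# a pure-shell "cat": fine for small text files, painfully slow for big ones
while IFS= read -r line || [ -n "$line" ]; do
    printf '%s\n' "$line"
done < "$1"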