Admin
Well... that's what I was saying. The spider will not see the link at all. I guess you need to be a registered user, probably with high privileges, in order to delete stuff. A spider definitely does not fall into this category, whether it declares itself or not... so just don't show it the link (just as you don't show it to guest users or to normal users who can't delete stuff)
Admin
It's still a good idea to use POST requests for this. I once used links to delete content and was promptly bitten by an authorized user who ran a web mirror tool on the site. Done right, the data will at least survive well-behaved tools in the hands of your average user.
Admin
FWIW, I always use a Location header, followed by exit; and/or die();
Admin
I think somebody put something in my morning coffee; I'm starting to see things twice, thrice, nay... I could swear I saw the die()/POST/header()/if($auth) comments at least a hundred million times in the last five minutes.
I'm going back to bed...
Admin
Yeah, something's definitely wrong with this coffee...
Admin
input type="button"
No, no, no, no, no!!! It is:
input type="submit"
Whoever came up with the "button" that doesn't do anything unless I give the entire internet permission to totally fsck my browser (also called enabling scripts) should be shot. In the leg, so as to live long enough for the torture.
The form submit capability is there for a reason. Use it!
And oh yeah, GETs should be idempotent. If you didn't know that already, kindly remove your stinking excrement from my web.
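The point above can be sketched with plain HTML: a destructive action goes in a POST form with a submit button, not a GET link. The URL and field names here are invented for illustration; a crawler or prefetcher following links will never fire this, and no scripting is needed for the button to work.

```html
<!-- Hypothetical sketch: the delete action as a POST form rather than a
     GET link. Link-following tools only issue GETs, so this is safe from
     spiders and prefetchers, and works with scripts disabled. -->
<form action="delete.php" method="post">
  <input type="hidden" name="id" value="42">
  <input type="submit" value="Delete item">
</form>
```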
Admin
Using a POST (input button) instead of a GET (link) is not a proper solution to this threat. It would have fixed the case for the crawler, but a hacker could have done the same using the POST method.
Use GET or POST whenever you want to. Just remember the difference and the guidance from the W3C: GET is for navigation, POST is for data.
my 2 cents
Admin
But deleting the whole website is idempotent, unless it fails when there is nothing left to delete...
Admin
Why oh why can't grammar people just stay at their own linguistics forums and keep away from any technical debates on the web? The signal-to-noise ratio isn't exactly increased by their continuous super-relevant insights.
[sorry, had to blow off some steam - am happy again now]
Admin
Admin
Until, that is, a perfectly legitimate and fully authorized user installs a browser plugin to preview links or speed up browsing by prefetching them.
Admin
I got hacked by Google once. It turned out to be code roughly equivalent to this one:
Please do not ask why I thought this would be a good idea; I did not know either when I finally figured it out.
Needless to say, this page doesn't have confirmation forms for deletion, because I didn't want to go through that trouble.
Admin
In my much younger (and much more naive) days, I was involved in developing a custom CMS for a small online shop. One day we ran Xenu link checker on it, which obligingly followed all the unguarded "delete" links and wiped the whole stock database.
Doh!
Admin
This is what I will be telling everyone today...
Admin
Does everybody here know what a header is, and what the client can do with it, if its true meaning is an attack?
Admin
Well, that's what the validation code was for, wasn't it? If you're not authorised to view the page you get redirected back to the index....
...Unless you're one of those clients that ignore Location: headers.
Spider hits this code, it's not authorised, so it gets a Location header (which it ignores) followed by a big long list of delete links, which it blindly starts following. It's a good bet that the validation mechanism on subsequent pages is the same.
Even if the Location: header is respected by the client, there is nothing in the call to header() that terminates the script's processing. The server still goes to the effort of generating that big long list of delete links: I guess it's only blind luck (to put it charitably) that prevented anyone from seeing what would happen if an unauthorised user tried to access the page that actually performs the deletion. Prior, that is, to it happening in the wild. (In other words: no, it's not "all well and good if the client respects the Location header").
The only difference caused by the client ignoring "Location:" is that it didn't ignore the content that followed (which according to spec is supposed to offer alternative URIs for the requested resource anyway).
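The failure mode described in this thread can be sketched in a few lines of PHP. The function and variable names below are invented for illustration; the point is that header() merely queues a response header and does not stop the script, so the privileged markup after it is still generated and sent unless the code also stops.

```php
<?php
// Hypothetical sketch of the bug discussed above. renderDeletePage() stands
// in for the vulnerable page; the early return plays the role of the
// missing exit after header().
function renderDeletePage(bool $isAuthorised): string
{
    if (!$isAuthorised) {
        header('Location: index.php'); // queues a redirect header only...
        return '';                     // ...the script must ALSO stop here
    }
    // Without the return above, this privileged content would still be
    // generated and sent to every client, including clients that ignore
    // the Location header entirely.
    return '<a href="delete.php?id=1">delete</a>';
}
```

In a real top-of-page guard, exit; takes the place of the early return shown here: with it, an unauthorised client (spider or otherwise) receives only the redirect and an empty body.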
Admin
"..differentiate the difference..."????
wtf?
Admin
If you consider producing a parse error to be a quick fix....
DoH!
Admin
This is not just a PHP WTF - it's an HTTP WTF.
There are too many web developers who don't understand the basic tenets of HTTP, including simple things such as the fact that Location headers don't force the client to do anything.
I've tried to explain this before and been met with blank stares (or at least I imagine the people on the other end of the interwebs are staring blankly) when pointing out that following headers is technically entirely optional for browsers.
Of course, the PHP should have included an exit() or die() after the header() line - but you don't understand this until you understand HTTP.
Admin
A similar thing happened to a friend of mine. He had to (for some reason) remove the security on his phpMyAdmin for a few minutes. Of course, just then, Google decided to index his page. Poof, no data.
Extreme timing, I must say.
Admin
Is this a WTF-classic, or just a re-run of "Google ate my website" http://thedailywtf.com/Articles/The_Spider_of_Doom.aspx ?
... and to rfsmit : Folk in this decade who feel the need to capitalize "The Web" or "The Internet" are the same folk who still use phrases like "The Cancer", "The Aids" or "The Iraq". Once someone understands how it works, they stop calling devices the "Tele-Phone" or "Auto-Mobile". You'll get there eventually.
Admin
"Tele-ma-phone". "Auto-ma-mobile". Aww, that was easy!
Admin
Nothing, because you also have the check in the page that actually issues the DELETE.
I was just saying that if you start by putting the check in the frontend page, where you have the link to the DELETE page you're even less prone to this kind of disaster, and you don't expose the page address to everyone.
Admin
Additionally, there is robots.txt, which one can use...
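For what it's worth, a robots.txt that keeps well-behaved crawlers away from the admin area is only a few lines (the paths here are invented for illustration) — though, as noted elsewhere in the thread, it is purely advisory and some archivers ignore it:

```
User-agent: *
Disallow: /admin/
Disallow: /delete.php
```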
Admin
I never use the location header for just this reason
Captcha explains it all: validus
Admin
Wow - 1998 called. It wants its WTF back.
Admin
Reminds me of the webmail application I worked on in the late 90s.
I discovered if I "cleverly" crafted an HTML email with a couple of img tags in it, like: [image] [image]
I could delete most of another user's Inbox (not necessarily all of it, since the emptyTrash might have kicked off before the delete was finished).
After a bit more investigation I found I could do pretty much everything via img tags. For instance, something like this: [image]
would add a forwarding address to the account. Since desc and name were empty, it wouldn't show up in the forwarding address section of the user's profile.
Or I could fill an email with a couple thousand: [image] [image]
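The trick described above is a classic cross-site request forgery via GET: the mail client fetches every img src automatically, so any state-changing GET endpoint fires as a side effect of merely viewing the message, authenticated with the victim's own session. The URL below is invented for illustration.

```html
<!-- Hypothetical illustration: rendering this email makes the victim's
     browser issue the GET on the victim's behalf, invisibly. -->
<img src="http://webmail.example.com/deleteMessage?folder=INBOX&amp;msg=1"
     width="1" height="1" alt="">
```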
Admin
QFT
Admin
This is exactly why 'web accelerators' never took off (for example, the one from [a large search company with a funny name]): some online forums/webhosting/etc. had/have "Delete Account" as a plain hyperlink (goodbye, account!). It could also be one of the lesser reasons why Captchas are present on web pages today...
Admin
"Location" is only meaningful if the response code is 3XX or 201.
Admin
never mind... IA is probably doing the right thing; the Location isn't being ignored; it's just too late.
Admin
Also, thanks for showing me the BBCode "code" tag. (Yeah, all I had to do was click BBCode Okay, but I didn't.)
-- Furry cows moo and decompress.
Admin
So you are saying brazzy shouldn't tell himself?
Admin
Maybe use better laid out code?
Or, even better, try using functions
As far as I can tell things like
are nothing more than rebranded gotos, with all of the problems that they bring. For a simple one-line test at the top of the page, it may not be too much of an issue, but if you start scattering them throughout your code as soon as you've reached a certain point, then trying to check whether a particular bit of code is going to be executed can be a nightmare.
Admin
Yup, that should have worked. I also sometimes use a JavaScript redirect (document.location), and for the noscript-ers among us I would've made a clickable hyperlink to index.php.
Admin
header("Location: someurl"); exit; // fixed.
Admin
Exactly! If you put exit; or exit() there, you will always STOP execution of the script immediately. PHP will not continue to execute, and even robots will only see a blank, nonfunctional page.
Admin
The main problem with goto is that it isn't easy to tell exactly where it is going. Also, it's flexible enough that one can get truly horrendous flow control.
for and while are rebranded goto. They limit the flexibility enough that one has a great deal of difficulty making the sort of Gordian knot that was common back in the days of goto.
exit is a rebranded longjmp. It has the saving grace that there's only one place that it will longjmp to.
While having multiple return or exit points in your subroutines or web page may make things difficult, it's nowhere near what goto (and gosub) once did.
Admin
Actually, those signs are much more about allowing the store or mall security types to treat anyone with detectable weapons as a gun-toting maniac - thus giving them a chance to deal with said maniac before he starts hurting people.
Now, thinking such a sign will keep out the maniacs, allowing one to not have in-store security is a serious WTF. However, to my knowledge, I've only encountered one store manager that delusional.
Just because you don't see the store security types doesn't mean they aren't there. Some savvy stores have one unit on duty watching the store cameras, a uniformed unit or three on call, and a few plain-clothes guards roaming the store, looking like customers. Really savvy stores will also have a uniformed guard near each exit, in addition to the hidden security.
Admin
I only saw one before this post. Most of the prior posts indicated using die or exit in the initial if block. That only equates to wrapping the entire file in an if block if you're one of those purists who feel that each section of code should have only one return or exit point.
For what it's worth, while I do agree that one should limit the number of exit points for any subroutine, it usually makes code less legible to carry this to the extreme. I find it tends to work better to try to keep early returns at the very beginning. If one really needs to have a large block of code, a potential exit point, and then another large block of code, that probably should be refactored into two subroutines, which are invoked by a simple subroutine which calls the first, and then conditionally calls the second.
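The refactoring suggested above can be sketched like this (function names are invented for illustration): two focused subroutines, invoked by a simple wrapper that calls the first and then conditionally calls the second, so no exit point is buried in the middle of a long block.

```php
<?php
// Hypothetical sketch of the structure described above.
function loadItems(array $db): array
{
    return $db['items'] ?? [];
}

function renderItems(array $items): string
{
    return implode(', ', $items);
}

// The wrapper keeps its single early return at the very top; each large
// block of code lives in its own subroutine.
function showPage(array $db): string
{
    $items = loadItems($db);
    if ($items === []) {
        return 'nothing to show';
    }
    return renderItems($items);
}
```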
Admin
This is a fundamental misunderstanding of header redirects. As other posts have mentioned, specifying that there should be a redirect header does not stop processing. If there is a die immediately after a header redirect, the code will always hit it, and your error logs will fill up with useless "errors" which are, in fact, statements of the web page actually working as designed.
Debugging your web pages will go much more easily if you keep stuff like this out of your logs. That way, when you have problems, you can go to your logs and actually find real problems.
(I'm submitting this response, because nobody else has indicated why one should use exit instead of die - people have only stated that one or the other should be used. It's not just a meaningless preference thing. Using one is poor practice, using the other good practice. Either is far, far better than having completely broken code like, well, the "Well-Intentioned Destruction" above.)
Admin
I work for a national internet archiving consortium. We are legally obliged to ignore robots.txt.