• anonym (unregistered)

    this comment will auto destruct

  • (cs)

    AOL in 1996: "Data Retrieval Failure."

    Sounds about right. :^)

  • Captain Oblivious (unregistered)

    If you rely on a form button to delete content, also remember to use the POST method, not GET!

  • Jim (unregistered)

    if(!$Auth) { header('Location: index.php'); exit; }

  • (cs)

    Until the Wayback Machine fixes itself, check out aol.com from 1997 here and read all about the beta release of some crazy piece of software called "AOL Instant Messenger"

  • (cs)

    Yeah, that will teach them to not make their site RESTful!

  • Edgeman (unregistered)

    And that's why header redirects like this need a die(); afterward... even if you never expect the code to hit it, better to be safe than sorry.
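
    The pattern Edgeman describes is tiny; a minimal sketch (the `$isAuthorized` flag and `deletePage()` are illustrative names, not from the original story):

```php
<?php
// Guard pattern: redirect AND stop. Without the exit, PHP keeps
// executing the rest of the script even though a Location header
// has been queued, so a crawler's request still runs the delete.
$isAuthorized = false; // stand-in for the app's real auth check

if (!$isAuthorized) {
    header('Location: index.php');
    exit; // nothing below this line runs for unauthenticated requests
}

// Only reached by authenticated users.
deletePage($_GET['page']); // hypothetical delete routine
```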

  • PerfectlyNormal (unregistered) in reply to Havstein
    Havstein:
    Yeah, that will teach them to not make their site RESTful!

    Doesn't sound all that RESTful to me. If it was, they'd have used DELETE to delete pages. And I really doubt the Wayback Machine's bot sends off DELETE-requests to pages. It should only do GET.

    So if they had made it RESTful, this wouldn't have happened.

    A good idea is to never let a GET-request do anything that changes things.

    "Some methods (for example, HEAD, GET, OPTIONS and TRACE) are defined as safe, which means they are intended only for information retrieval and should not change the state of the server. In other words, they should not have side effects, beyond relatively harmless effects such as logging, caching, the serving of banner advertisements or incrementing a web counter. Making arbitrary GET requests without regard to the context of the application's state should therefore be considered safe." http://en.wikipedia.org/wiki/HTTP#Safe_methods
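
    One way to enforce that rule server-side is to refuse safe methods for unsafe actions; a sketch, with `deletePage()` as a hypothetical handler:

```php
<?php
// Crawlers only issue GET/HEAD, so rejecting everything but POST
// for a state-changing endpoint stops them cold, and 405 is the
// semantically correct status for "wrong method".
if ($_SERVER['REQUEST_METHOD'] !== 'POST') {
    header('Allow: POST');
    http_response_code(405); // Method Not Allowed
    exit;
}

deletePage($_POST['page']); // hypothetical delete routine
```

    Note this only keeps well-behaved bots out; a human attacker can send POST requests just as easily, so authorization checks are still required.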

  • (cs)

    I don't believe you can call yourself a web developer until you've built an app that uses hyperlinks for deletion and have all your data deleted by a search bot.

    I managed to do that on a system not even connected to the internet - it was an experimental app for my own use. And then I decided to experiment with an indexing engine as well...

  • (cs)

    If this site doesn't even validate credentials to DELETE a page, what about adding or modifying content?

    This sounds like every website defacer's wet dream.

  • Madball (unregistered)

    This brings back memories. Almost the exact same thing happened to me several years ago. We had a backdoor admin page for a supplier "portal" for creating announcements. It had convenient little edit/delete links next to each pre-existing posting. We even had some nice internal documentation which spelled out where and how to create announcements. All was well with the world for a couple of years (security was not really a concern).

    Until the infrastructure guys bought a Google Search Appliance. It dutifully found the documentation -> backdoor -> followed each delete link. It took some effort to track it down. After some chuckling, we started blaming Google for all future mysterious occurrences.

  • (cs)

    Actually it has nothing to do with the header. Some responses are not supposed to have a page body, and Apache will actually go to great lengths to remove the body from such a reply before sending it to the client. Therefore the bot can ignore all the headers it wants.

    The actual WTF is sending a header but not aborting the control flow so the header gets cached for output while PHP is still happily executing the rest of the file (presumably deleting stuff).

    So I say there are two WTFs: the actual one pointed above and the explanation at the end of the original story.

  • Foo (unregistered) in reply to Jim

    Exactly.

  • ckoppelman (unregistered)

    or just add rel="nofollow"

  • (cs) in reply to brazzy
    brazzy:
    ...I managed to do that on a system not even connected to the internet - it was an experimental app for my own use. And then I decided to experiment with an indexing engine as well...

    You, sir, win the internet today.

  • redwall_hp (unregistered)

    Sigh

    This is why you put an exit; statement after the header call...

    If you add that one line, you wouldn't have the problem. The script would just stop running after the header is sent. (Some servers won't execute redirects properly unless you do this anyway...)

  • tulcod (unregistered) in reply to Patryk Zawadzki
    Patryk Zawadzki:
    Actually it has nothing to do with the header. Some headers are not supposed to be followed by a page body and Apache will actually go to great lengths to remove the body if it encounters such a reply before sending it to client. Therefore the bot can ignore all the headers it wants.

    The actual WTF is sending a header but not aborting the control flow so the header gets cached for output while PHP is still happily executing the rest of the file (presumably deleting stuff).

    So I say there are two WTFs: the actual one pointed above and the explanation at the end of the original story.

    Exactly. Even if browsers honored the Location header, the full page still gets executed, regardless of whether the browser has already started requesting another one, so the header is not the real leak.

    So indeed, TRWTF is that even after so much trouble, Dan still doesn't understand the issue.

  • (cs)

    The article is a bit misleading about headers and PHP, but not off by much.

    $authorized = authorize(); // let's assume this is false
    if(!$authorized) {
      // this, by default, sets a 302 status header
      header("Location: index.php");
      // because I'm stupid, this was omitted
      // die(); 
    }
    
    // this works. PHP happily continues after the header() call
    deleteSomething(); 
    
    // this also "works".  The output is simply sent as the body of the 302 response.
    echo "Hello World!";
    
    // and this still runs.  PHP executes the script to the end.
    deleteSomethingElse(); 
    

    So, setting a Location header on a failed auth is OK. Just make sure you die() before doing anything else.

    Most robots/spiders handle 30x redirects just fine. It's just that PHP doesn't stop executing when you set that header - you have to die() yourself.

    Yes, using a <form> would keep the spider away, but that won't keep wrongdoers at bay. Yes, deletes should be done with a POST and not a GET, but that's less important than just making sure the damned script dies when auth fails.

  • wrong! (unregistered) in reply to ckoppelman

    nofollow is a hint to search engines that you do not endorse the contents of the link because it may have been added by users. It's very badly named: it does not mean "do not follow this link", it means "I do not vouch for the value of this page".

  • Kuroki Kaze (unregistered)

    Man, that was an Epic Fail :) Not that I'm immune to those myself, but it was hell of a funny :)

  • Neil (unregistered)

    Wow, 20 comments about a PHP blunder and we've yet to see someone say the usual mantra, "TRWTF is PHP".

  • Ryan (unregistered)

    Or.. stop using if (!auth) and instead put if(auth) at each page.

    A crawler won't auth, so they get nothing.

  • chris (unregistered)

    The WTF is that the check NEVER works unless you add exit or die after the Location header. It doesn't matter whether the Location header is interpreted by the client - after sending the header, the script will still happily continue to actually delete the page - so EVEN IF the client interprets the Location header, the page will still be gone.

    So the problem is not exactly that the programmer chose to use a safe method (GET) for an unsafe action (deleting a page) - that is in itself not a security issue. It's bad design, agreed, but unproblematic in a security sense. The problem is that the programmer either simply forgot the "exit" (which is a stupid but very human mistake, could happen to anyone) or had no clue what actually happens in client/server communication.

    To repeat myself: the Location header is not the problem (well, actually it's wrong, since the HTTP spec requires absolute URIs, but let's ignore that for the moment) - the problem is what happens after sending the header. The script should terminate instead of continuing.

    The other WTF here is that the article author also draws the wrong conclusion... Also, using a form and POST by itself will only solve the problems of crawlers - NOT the security issue when trying to protect against motivated attackers. FAIL I dare say. ;-)

    By the way: Another WTF is that - apparently - there's no CSRF protection in that script...
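
    For the CSRF point: one common defense is a per-session random token embedded in the form and compared on submit. A sketch with illustrative names (`$session` stands in for `$_SESSION`):

```php
<?php
// Keep a random token server-side, embed it in the form, and
// compare on submit so a forged cross-site request (which cannot
// read the token) fails the check.

function csrfToken(array &$session): string {
    if (empty($session['csrf_token'])) {
        $session['csrf_token'] = bin2hex(random_bytes(32)); // 64 hex chars
    }
    return $session['csrf_token'];
}

function csrfTokenIsValid(array $session, ?string $submitted): bool {
    // hash_equals() compares in constant time
    return is_string($submitted)
        && isset($session['csrf_token'])
        && hash_equals($session['csrf_token'], $submitted);
}

// In the form:  <input type="hidden" name="token" value="<?= csrfToken($_SESSION) ?>">
// On submit:
//   if (!csrfTokenIsValid($_SESSION, $_POST['token'] ?? null)) { http_response_code(403); exit; }
```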

  • (cs) in reply to ckoppelman
    ckoppelman:
    or just add rel="nofollow"
    Yeah, that'll teach those perky script kiddies to stay away from your website!
  • Palad1 (unregistered)

    If only there was a way to execute a different piece of code when the if condition failed!

  • Eevee (unregistered)

    I'm a little disturbed that several people now have condoned wrapping almost an entire file's worth of code in an if() block. Flow control is your friend.

  • (cs) in reply to Eevee
    Eevee:
    I'm a little disturbed that several people now have condoned wrapping almost an entire file's worth of code in an if() block. Flow control is your friend.

    If you're doing authentication on a per-page basis, what's the alternative? HTTP isn't stateful, you can't control when people are going to hit which pages, so you can't really maintain flow. If you could, all you'd have to do is authenticate them on the first page, and then you could be sure that every page thereafter was fine.

    Mind you, I hate crap like the example. If you do "if($auth){...}" on nothing more than a cookie flag, you're going to get screwed by someone who understands cookies. You need to run an actual sanity check on the cookie data.

  • (cs) in reply to Neil
    Neil:
    Wow, 20 comments about a PHP blunder and we've yet to see someone say the usual mantra, "TRWTF is PHP".

    I think everyone is off today... I always like to visit these articles about an hour after I see them in my RSS feed, so I can bet with myself on what will be getting nitpicked and torn apart in the comments... I find that I'm right 9 out of 10 times.

    Today though...I gave myself 10 to 1 that "differentiate the difference between..." would be the major topic of discussion, with camps from one side defending its usage and the other, saying it is the most epic failure on the interwebz today.

    And I lost...you cut me deep, TDWTF...you cut me real deep.

  • (cs)

    Look on the bright side! You can at least recover your data through archive.org...

  • (cs)

    This happened at [a large search company with a funny name] a few years ago when I was there. Some guy was furious, screaming about lawsuits, cold-calling every corporate-looking phone number he could come up with, etc.

    He claimed that we were hacking his site to the benefit of his competitors because he wasn't buying enough keywords. Of course, that's completely ridiculous. I think at one point he restored his site from the pages in the cache even.

    It happened a few times. Finally someone took a look and explained to him that [funny-name]bot was there to index his site, not delete things. So having a link right there in the footer of every page that would let anyone delete the page was probably a bad idea.

    I don't recall the outcome, but I think he finally went the robots.txt route to prevent his site from being crawled --by us. Odd that he'd trust his nephew the neophyte coder over people who are largely regarded as experts in their field and more than willing to help him out, but for some people, paranoia is a motivating force.

  • (cs)

    Let this be a lesson to all: Validating credentials before actually making any changes is better than only validating credentials at the entry point. Better to validate multiple times, if necessary.

    A guy I knew once called me out for doing "unnecessary" authentication checks in every function that, well, required authentication. His reasoning was that the function couldn't be reached unless you were authenticated already. While that was true at the time, it was an MVC application, and somebody could easily create a view that didn't check auth credentials. Authentication and verification go at the lowest possible level, not the highest.

    Anyway, it came back to bite him later, when somebody did just that, and since he was the one who pulled my authentication out of the source repository, he was the one who lost his job over it.

    Authentication: It's good job security.

    Also, while POST is your friend for state-changing events, don't rely on it as a safety mechanism. Search engines and other bots can do POSTs too, you know.
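
    Otto's "lowest possible level" point can be sketched as a re-check inside the routine that does the destructive work (the names here, `deletePage()` and the `$userCanDelete` callback, are illustrative, not from the story):

```php
<?php
// Re-check authorization inside the function that performs the
// deletion, so a caller that forgot its own auth check (a new view,
// a crawler-reachable page) still cannot cause damage.
function deletePage(int $pageId, callable $userCanDelete): bool
{
    if (!$userCanDelete($pageId)) {
        return false; // refuse, regardless of what the caller checked
    }
    // ... perform the actual deletion here ...
    return true;
}
```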

  • diaphanein (unregistered) in reply to brazzy
    brazzy:
    I don't believe you can call yourself a web developer when you've built an app that uses hyperlinks for deletion and have all your data deleted by a search bot.
    FTFY
  • Dominic Pettifer (unregistered)

    WTF No. 1: Using GETs to do deletes instead of POST.

    WTF No. 2: Not properly authenticating the user in the server-side code that performs the delete/update.

    WTF No. 3: Hiring numpties (who couldn't web-develop their way out of a paper bag) to build you a website.

  • Nathaniel (unregistered)

    While using POST is a lot better, and yes, you do want to make sure that the script exits after sending a location header, neither of these alone can prevent an XSRF attack. Don't think your work is done if that's all you've done to secure your web apps.

    http://en.wikipedia.org/wiki/Cross-site_request_forgery

  • Vlad Patryshev (unregistered)

    I can't believe it; it is 2009, and it is still happening. Probably the whole world knows about this.

    Would be nice to track the developer that added delete command to GET request. Track those who hire such developers. Track those who hire those who hire such developers. (To make sure we never never never get into such an environment.)

  • Harold (unregistered)

    if(!isset($_SESSION['usr_id']) || !isset($_SESSION['usr_name'])) { header('Location: index.php'); exit; }

    would have been the quick fix.

  • (cs) in reply to Vlad Patryshev
    Vlad Patryshev:
    Would be nice to track the developer that added delete command to GET request. Track those who hire such developers. Track those who hire those who hire such developers. (To make sure we never never never get into such an environment.)
    Pfff, the security of POST is grossly overestimated. I know several pieces of web software that carefully check and validate everything sent with GET, but that blindly accept POSTed values as if nothing ever can be wrong with those. Hacking those sites is the easiest thing there is.
  • (cs) in reply to Vlad Patryshev
    Vlad Patryshev:
    I can't believe it; it is 2009, and it is still happening. Probably the whole world knows about this.

    Would be nice to track the developer that added delete command to GET request. Track those who hire such developers. Track those who hire those who hire such developers. (To make sure we never never never get into such an environment.)

    It'll be interesting to see how many more people fire the silver bullet of "POST," or even "DELETE," at this thread. (More interesting than the umpteenth numpty who fails to read the thread first and adds "exit()" to the code, anyway.)

    Otto:
    Let this be a lesson to all: Validating credentials before actually making any changes is better than only validating credentials at the entry point <snip/>... Authentication and verification goes at the lowest possible level, not the highest.

    Also, while POST is your friend for state-changing events, don't rely on it as a safety mechanism. Search engines and other bots can do POSTs too, you know.

    Indeed -- a comment that should be blued, at least for educational purposes.

    While PHP certainly isn't TRWTF here, and I can already hear people muttering "you can write ... in any language," there are certain language communities that encourage programmers to ignore the fact that they are doing two entirely separate things (three, if you count the C in MVC): working on the client side and working on the server side. Maybe it's just the teaching of PHP that's at fault.

    On the server side you can do far, far more horrible things than just allow systematic bot trawling to delete individual records or files. Anything to do with financial transactions comes to mind. Anything like "rm -rf /" would be a bag of fun.

    Web applications should never do anything significant on the server side without proper authentication. (And yes, I've bitten myself on the bum with this one, too.)

  • Jay (unregistered)

    There's something fundamentally wrong with any security scheme that relies on the CLIENT for enforcement. Like, umm, what if the user is deliberately trying to vandalize your site? Even if off-the-shelf browsers will respect your security, surely there are plenty of hackers out there smart enough to write their own socket programs.

    This is like those stores that put up "No weapons allowed" signs with the big red slash through a picture of a handgun. Like, yeah, some maniac is going to load up with guns and bombs and head out to the mall with the intention of killing as many people as he can before the police finally arrive to stop him, he's prepared to sacrifice his own life for the pleasure or glory or whatever he gets out of killing innocent people ... and then he's going to see a "No weapons allowed" sign on the door and say, "Oh, darn, I guess I can't do that here."

    I'm not naive or arrogant enough to suppose that any security measures I build into a system are absolutely 100% secure. A determined, resourceful hacker could probably figure out a way to beat any security I've ever built. But I at least try to give him a run for his money. I want to make him go to more work than, say, having to "view source" on the page to see the password, or having to put a quote mark in his input string to break my SQL.

  • Top Cod3r (unregistered) in reply to Captain Oblivious
    Captain Oblivious:
    If you rely on a form button to delete content, also remember to use the POST method, not GET!

    Exactly. TRWTF is that the article writer didn't say this.

    Also, this isn't a problem of obeying headers.

    If I send form data to "controlPage.php?action=delete&page=112", and the page responds with a redirect header but STILL PERFORMS THE ACTION, that is just a matter of the security being ALL FUCKED UP.

  • Dan (unregistered)

    Hmmm, so Alexa joins the tiger team...

  • Franz Kafka (unregistered) in reply to Jay
    Jay:
    There's something fundamentally wrong with any security scheme that relies on the CLIENT for enforcement. Like, umm, what if the user is deliberately trying to vandalize your site? Even if off-the-shelf browsers will respect your security, surely there are plenty of hackers out there smart enough to write their own socket programs.

    You could probably just write a simple FF plugin - ignoring Location: headers and spidering all the delete links on a site sounds like a simple task.

  • Concerned...and shocked (unregistered)

    I wonder how on earth the system deleted pages without wrapping the delete-performing block in a check that the user (identified by user ID, stored however you like) is allowed to do so. Overtrusting and damaging... sheesh

  • Jim (unregistered)

    Screw whether it's a Get or Post request.

    Just make sure any page that is normally accessed by authorization actually checks that the user is logged in and authorized to perform that function.

    Otherwise, any user could just type in the link.

    Yes, POSTS could be hacked too. Plenty of code around to submit post data to a page as well. This is why the post target needs to check.

  • GrandmasterB (unregistered) in reply to Jim
    Jim:
    Screw whether it's a Get or Post request.

    Just make sure any page that is normally accessed by authorization actually checks that the user is logged in and authorized to perform that function.

    You just beat me to this. If the fact that it's a GET vs a POST matters to your app, you're doing something very wrong. Properly check the user credentials before doing the deletion, and you will not have this problem, period.

  • nico (unregistered)

    Hmmmmm... so why not, first of all, show the delete link ONLY if you are actually allowed to delete stuff? Why would you show a delete link to a spider? It does not really need to index delete pages anyway, so you should prevent it from going there completely with something as simple as:

    if ($canDelete) echo 'Click me!';

  • (cs) in reply to brazzy
    brazzy:
    I don't believe you can call yourself a web developer until you've built an app that uses hyperlinks for deletion and have all your data deleted by a search bot.
    That explains a lot. Just don't tell the Web developers who know how to capitalize properly.
  • susciphere (unregistered) in reply to Jay
    Jay:
    This is like those stores that put up "No weapons allowed" signs with the big red slash through a picture of a handgun. Like, yeah, some maniac is going to load up with guns and bombs and head out to the mall with the intention of killing as many people as he can before the police finally arrive to stop him, he's prepared to sacrifice his own life for the pleasure or glory or whatever he gets out of killing innocent people ... and then he's going to see a "No weapons allowed" sign on the door and say, "Oh, darn, I guess I can't do that here."
    That would make a nice xkcd strip.
  • 8D (unregistered) in reply to RHuckster

    you have single-handedly saved these horrendous comments. thank you

  • Franz Kafka (unregistered) in reply to nico
    nico:
    Hmmmmm... so why not, first of all, show the delete link ONLY if you are actually allowed to delete stuff? Why would you show a delete link to a spider? It does not really need to index delete pages anyway, so you should prevent it from going there completely with something as simple as:

    if ($canDelete) echo 'Click me!';

    Not all spiders declare themselves, and anyway, how would you tell them apart? Best thing to do is actually protect delete links.

Leave a comment on “Well-Intentioned Destruction”
