Admin
OMFG! See? Always use proper authentication! FIRST!
Admin
Do tell, which website is this? I'd like to give them a signature of security authenticity from myself in the form of a link to the goatse man.
Admin
Once again, ladies and gentlemen, I give you: Security through incompetence!
Please, a round of applause for our wonderful content management system!!!!
(golfclap)
Admin
fist!
Security through client-side reliance doesn't pay off. And they were just unfortunate really, and, uhm, had a pretty bad design too ... nevertheless.
Sincerely (one of) Gene Wirchenko('s fans)
Admin
Good thing they didn't have a 'delete company' link...
Admin
I'm surprised the workaround wasn't just to add robots.txt to the site.
If you're going to have 2 WTFs with javascript and cookies, why not put a 3rd one in there and get a hat trick?
Admin
That is awesome. I don't know what else to say.
Admin
Heh, we have a similar cookie issue, though it's just a (what we feel is a minor) security violation; nothing gets deleted. Our "security" involves JavaScript in some of our Apache/Linux/PhpWiki pages that looks for a login cookie and, if it's missing, redirects to an NT IIS server for automatic NTLM, which then builds a URL back to the wiki for the page login. The PHP page checks whether you're in the group that should be getting the page. For browsers this works fine, since all our browsers use cookies.
Our Google appliance, of course, doesn't, and indexes all our "secure" pages. It wouldn't be too hard to fix this in Apache, but nobody has complained so far, and we tell all our wiki owners about the security hole.
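If we ever do fix it, a minimal sketch of the Apache side would look something like this (the paths and realm name here are invented, not our actual setup):

# enforce the login on the server instead of trusting client-side JavaScript
<Location /wiki>
    AuthType Basic
    AuthName "Internal Wiki"
    AuthUserFile /etc/httpd/wiki.passwd
    Require valid-user
</Location>

The appliance would then get a 401 like everything else that can't authenticate.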
Admin
That's a superb story, and an excellent lesson which I wish management had learned...
Admin
Well, at least he can now use the Google cache as a backup.
Admin
The true muppetry of this is that the idiots who implemented it used GET requests (and get is meant to be side-effect free and idempotent) to implement a destructive actions.
Whatever stupidity might be involved in the authentication method, it's this and this alone that makes it suck ass. The great pity is that there are so many sites out there that do just the same thing. :@
Admin
This one qualifies as a "WTF" from a different point of view. I'd be grumbling "wtf?!?!?" over and over if my production system suddenly disappeared like this. Kudos to Josh for knowing how and where to investigate the cause and restore the site.
Admin
I seem to recall a similar issue with another Google product - it was one of those things that's supposed to speed up internet browsing by simulating the user clicking on everything. If this is the case, even with a good authentication scheme, you're in danger of deleting everything, because this is a direct user simulation (allowing access to all user cookies and such).
I could be wrong about this, but I thought I read something on Slashdot about it.
Admin
Argh! that should be 'destructive actions' rather than 'a destructive actions'.
Admin
So the logic is something like:
if (getCookie(isLoggedOn) != "false")
CongratulationsYouAreLoggedIn()
Why would anyone do that?? (and yes I realize the point of this site is to make you wonder things like that).
Someone needs to replace it with this clearly superior code:
if (getCookie(isLoggedOn) != false && getCookie(isLoggedOn) != FILE_NOT_FOUND)
CongratulationsYouAreLoggedIn()
:)
Admin
This sounds like even more of a WTF than the subject of the thread.
Admin
Google Web Accelerator. As I said, that's because people were using GET requests to implement destructive actions, which is a big no-no. All GWA did was expect that the sites you used conformed to the HTTP 1.0 and 1.1 specs. To be frank, the people who produce sites and apps that would let any spider, proxy, or prefetcher do that deserve a Darwin Award for stupidity.
Or a Daily WTF.
Admin
http://webaccelerator.google.com
Admin
This is actually a fairly common occurrence on misconfigured wikis.
Admin
If this were Digg, or Slashdot, I'd totally mod that comment up.
Admin
Some years ago, a spider got caught in one of our websites, which had a sophisticated tree-view CGI script that used long strings of 0s and 1s to indicate which branches of the tree were open (displaying their subbranches) and which were closed. To make it worse, this parameter was implemented as a virtual path, so e.g. http://www.somedomain.at/cgi-bin/treeview/000000000 showed the tree with all branches closed and http://www.somedomain.at/cgi-bin/treeview/111111111 showed the tree with all branches open. The spider crawling every combination of branches amounted to the equivalent of a DoS attack. That's when we learned what robots.txt is all about.
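In our case, a two-line robots.txt would have done it (the path is taken from the example URLs above):

User-agent: *
Disallow: /cgi-bin/treeview/

Any well-behaved spider fetches that file first and leaves the whole tree view alone.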
Admin
Yes, it was the Google web accelerator.
It does not simulate the user clicking everything, only the links, and not the buttons that submit forms. This should not be a problem on a sane web site, because GET requests should not have any effect other than to return a page. POST requests should be the only way to do things like delete files or purchase something on a credit card, and the web accelerator would never submit a POST request before the user clicked on a button.
Unfortunately, not all web sites are sane, and those sites treated these GET requests from the Google web accelerator as if the user had actually clicked on something. Google really didn't do anything wrong here, but they got blamed for the bad habits of dumb programmers anyway.
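To make the difference concrete, here's a minimal sketch (the page and field names are invented): any prefetcher or spider will happily follow the first, but nothing submits the second except a real user clicking the button.

<!-- GET: fair game for the web accelerator -->
<a href="DeleteFile.asp?id=123">Delete</a>

<!-- POST: only fires on an actual click -->
<form method="post" action="DeleteFile.asp">
  <input type="hidden" name="id" value="123">
  <input type="submit" value="Delete">
</form>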
Admin
OMG... Google ate my home page!
Admin
Total agreement. And it's not just "destructive" actions, but *any* action which causes a stateful change to the data is supposed to be performed by POSTs. That's sorta the whole point of having two separate types of requests in the first place.
GWA caused problems, but like you said, any caching and/or "look ahead" type of proxy would have caused those same problems. The fault is with the bad web developers out there. If more people would, oh, read the friggin' relevant RFCs before they go and write their code, then they would avoid these sorts of issues.
Admin
Meanwhile, there are also plenty of robots (used by spammers to look for email addresses, mostly) that don't follow robots.txt at all, so you'd still get the same problem.
Many times they'll also send a fake User-Agent to look like a browser, so the only way you can identify them is by behavior (for example, list honeypot pages as disallowed in robots.txt and either don't link to them, or use "invisible" links.... if anyone hits those pages, block their IP).
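A honeypot entry is as simple as this (the path is invented); nothing on the site links to it, so only a robot that reads robots.txt and deliberately ignores it will ever request it, and that request is your cue to ban the IP:

User-agent: *
# no legitimate browser or well-behaved bot should ever fetch this
Disallow: /email-directory/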
Admin
Wow, I just learned something. I realize that I am opening myself up to much criticism by posting this, but I myself have created a few sites which contain pages such as DeleteClient.asp?id=123, where 123 is the client to delete. I always check a session variable to make sure they are logged in before doing the action, but it sounds like that was a Very Bad Idea. The problem is: how do I get the action to spawn via hyperlink, where I can only pass values in the QueryString?
Admin
Not quite anything. For instance, every time your webserver writes to its log file, you're causing a state change. However, such changes are OK because they don't interfere with GET's idempotency: you can do it time after time and it's the same as doing it once, since the extra log data doesn't affect the running of the app.
What these idiots were doing, now that is another matter entirely.
Personally, I'm surprised nobody's brought up the "but form buttons are so ugly" chestnut. I've a way to smack that one down too.
Admin
First, don't use query strings, use forms. That's what they're there for.
And it's possible to style buttons so that they look decent or the same as a link. I've done it many times myself.
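Something along these lines, for example (untested, and the inline styles are just to show the idea; a stylesheet class would be cleaner):

<form method="post" action="DeleteClient.asp" style="display: inline">
  <input type="hidden" name="id" value="123">
  <input type="submit" value="Delete client"
    style="border: none; background: none; padding: 0; color: blue; text-decoration: underline; cursor: pointer">
</form>

Looks like a link, acts like a POST.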
Admin
Thanks Keith. You just made me less dangerous. -Bill
Admin
hack-o-matic: Use a button instead. Not only will it do the right thing, it's a clue to the user that clicking it will perform some action rather than just fetching a page.
Admin
<a href="#" onclick="myform.submit();return false;"> ...
Admin
That's a fundamentally bad design.
The quick fix is to have DeleteClient.asp?id=123 be a confirmation box with a button that does an HTTP POST.
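In other words, something like this (the confirmation target name is made up):

<!-- DeleteClient.asp?id=123 now only renders a confirmation form: -->
<form method="post" action="DeleteClientConfirmed.asp">
  <input type="hidden" name="id" value="123">
  <input type="submit" value="Yes, really delete client 123">
</form>

The GET is now harmless: all it does is show the page.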
Admin
what about links such as this? <a href="javascriptmyform.Submit()">
Admin
the forum software ate my colon, but you know what i mean. wow. that didn't come out right. wow, that didn't either. i'm gonna stop now.
Admin
Sweet! Can I insert a joke about the Spider Mastermind? ^o)
Admin
You can't. HTML "a" tags always generate GET requests. You need to use a "form" tag with method="post" in order to generate a POST request.
Admin
Not to mention that ANYONE who does anything but READ data in response to an HTTP GET deserves to have all their data deleted. Any mutation should only be in response to PUT/POST/DELETE verbs. You'll get nailed by pre-caching tools too, and that's not going to be prevented by login checks of any ilk!
Admin
Speaking of which, is anyone else annoyed that when a post in this forum gets 50 replies, the site uses POST rather than GET to go to replies 50-99? So whenever I'm on page 2 and hit refresh I get prompted as to whether I want to resubmit form data.
Admin
That is not really a valid URL, so when the GWA sees it in the HREF attribute, it will just do nothing. It does not run Javascript at all.
Admin
Whoops
Admin
If I could slap you over the internet, I would. If you're going to play stupid javascript games instead of doing it right (input type="submit" with appropriate stylesheets to make it look acceptable), at least use <a href="http://some.meaningful.url/" onclick="myform.submit();return false;">.
Admin
LOL! Everyone within earshot of my cube is now wondering just WTF I'm reading.... Colon jokes, gotta love 'em....
More pertinent to the thread: while the GET/POST thing is a problem, I tend to see the security issue as the more serious and WTF-ish of the two....
Repeat after Me:
1. Security cannot be implemented client-side.
2. Assume denial, prove authentication. In other words, if the security is cookie based, you need to be able to retrieve the cookie, and identify information within the cookie that proves the user is authenticated. Any failure at any step along the way results in denial of authentication.
Solve the security problem first, then convert your simple links to POST requests... something like the sketch below.
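A sketch of point 2, in the same sort of pseudocode as above (the names are illustrative):

// server side, on every request: assume denial until proven otherwise
session = lookupSession(getCookie("sessionId"))   // opaque token, validated on the server
if (session == null || !session.isAuthenticated)
    redirectToLogin()                             // any failure along the way = denied
else
    servePage()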
But that's just Me
-Me
Admin
Uh ... anybody ever hear of 'GET' actions always being benign and 'POST' actions changing things?
Even if you're gonna use a login cookie, wouldn't you make the user not be logged in unless the cookie was present?
Do web developers not realize that you can telnet to port 80 and type anything you like?
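For instance (host and page invented; the cookie name is borrowed from the pseudocode earlier in the thread):

$ telnet www.example.com 80
GET /DeleteClient.asp?id=123 HTTP/1.0
Host: www.example.com
Cookie: isLoggedOn=true

No browser, no javascript, and the server is none the wiser.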
Admin
The only logical course of action is to sue Google for hacking the site.
Sincerely,
Bob Lablaw
Admin
WTF?!
And here I thought this forum software was so well designed!
_sigh_ - I wasn't annoyed because I thought it was smarter than me...
Admin
Cool, I want in on this thedailyWTF.com class action versus Google!!!
Maybe then the owner could get some better forum software!!
Admin
I always thought, "These really seem familiar" as I (re)read the first 50 posts <grin>
Yes. That has annoyed me too!
Admin
<a href="http://www.centos.org/modules/news/article.php?storyid=127">Jerry Taylor</a> on it
Admin
A lawsuit against Google. There's a unique and novel idea!!
Admin
You know... I'll bet Alex would appreciate a fundraiser to upgrade this forum to, say, vBulletin......