Admin
Fair and reasonable. I'd like to have tested it on Konq and Safari, but there's nobody I know who runs either, and the best I'd be able to do is an old version on a Knoppix CD I've got lying around here somewhere.
The former is regrettable, but unavoidable. However, if the browser supports inlining of block elements and styling, this shouldn't be a problem, and it could be argued that the copy & paste problem is more a browser issue than an issue with buttons. That, and it degrades more gracefully than a JS/hidden form based solution: at least it always works.
The latter isn't a problem. Clicking it triggers an HTTP POST, which no search engine should issue anyway.
Yup, but I personally like to make that kind of thing an enhancement rather than a requirement. With trees, JavaScript is unavoidable, but can be made unobtrusive. With menus, on the other hand, you only need JS if you need to hack around IE's poor support for :hover.
K.
Admin
Usually it's neither designer nor programmer who has the last say in this, but a manager, salesperson or, ultimately, the customer. Unfortunately, all of these tend to have little understanding of security and the benefits of standards, and instead get hung up on GUI details.
Admin
HTTP Authentication would have helped, but no more than a cookie based method. The real problem here is the abuse of HTTP GET.
Validate on the client first, then revalidate on the server. Client-based validation makes your forms more usable, but server-based validation is essential for security. How you do it isn't important, just so long as you don't abuse HTTP.
K.
Admin
Throw up a small demo somewhere so I can be absolutely clear, and I'll see what I can do.
K.
Admin
lol
Admin
This WTF has made my day.
Admin
Well, if you thought about the problem a bit, the issue is that you're throwing away the correct solution as "not acceptable". And no, I'm not suggesting that you get wizzy with the photoshoppery, what I'm suggesting is that you use the stuff that you've already done.
I assume that, since you have to deal with multiple languages already, you are using rgettext, and thus have messages.po and friends set up. So, what you do is this:
1: Install RMagick
2: Generate and cache your various messages server side using RMagick's handy compositing tools
Something like this:
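(The actual snippet was eaten by the forum software, as noted in the follow-up post. A minimal sketch of the idea, assuming the RMagick gem and gettext's _() helper; the blank-button filename and cache path are made up for illustration:)

```ruby
require 'rmagick'  # assumes the RMagick gem is installed

# Hypothetical helper: render a localised label onto a blank button
# image once, cache the result on disk, and serve the cached file after.
def localised_button(key, lang, cache_dir = 'public/buttons')
  path = File.join(cache_dir, "#{key}-#{lang}.png")
  return path if File.exist?(path)  # first hit renders; later hits are free

  img  = Magick::Image.read('blank_button.png').first
  text = Magick::Draw.new
  text.annotate(img, 0, 0, 0, 0, _(key)) do
    self.gravity   = Magick::CenterGravity
    self.pointsize = 14
  end
  img.write(path)
  path
end
```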
Of course, if you're not happy with a single hit to the rendering engine on the first time you hit an action / language combination, you could use the same code (near enough) to do this for a selection of keywords as part of your localisation script.
FFS. Ajax. What will these kids think of next?
Simon
Admin
Fucking forum software. Obviously it all looked good in my browser. Grrrr.
Simon
<snip ajaxified="" hackaround=""></snip>
Admin
Yes, sure. But tell me how to put a mouseover effect on that button in Internet Explorer (any version up to and including 6.x) without using JavaScript. Oh, wait, IE doesn't get :hover right on anything but link elements. Hmm, problem - if the customer really wants it, I'll explain the problems to him, but place a javascript-whatever link in there so they get the mouseover for their toy - in the end, it's them who pay my wages.
However, I'd rather use JavaScript just for the mouseover effect ;)
Admin
You gotta love the "Just don't do that" solution from management.
They probably even blamed the client, making them think it was their fault.
Admin
The customer hires a web designer to design for him. If you can't manage to convince him that you're right, then you end up with your pride in your ass. It's you who's supposed to know what's right. Imagine a driving instructor steering to the wrong side of the road just because the customer wants to. Show your customers the W3C or some other source that says, in large, friendly letters, "Don't do javascript links". If you still can't convince them to do the Right Thing, quit the customer or the job. It's not worth damaging your nerves.
Admin
Well, I don't see a javascript onclick() link as 'damaging my nerves'. I try to be standards compliant wherever possible and make my pages as accessible as possible, but if the customer decides that they prefer 'the wrong way' of using a javascript link and is aware that, in turn, they'll lose some visitors, I'll place the javascript link because, apart from the above-mentioned side effects, no harm is done. They pay me to design and develop an application the way _they_ want it to be. I make proposals and tell them what's right, but the last call is theirs.
If the change request could jeopardize the safety of the application, that would be a different story.
captcha: doom and enterprise. is this forum trying to tell me something?
Admin
Same thing happened to me! I created a simple PHP program with a MySQL backend to keep notes for various things. I didn't really care if it wasn't secure so I didn't implement any authentication. I used an edit and delete link next to each entry for quick administration. That's when Google's spider came through and visited all my links. I'll never make that mistake again. Also, I've never relied on JavaScript for security.
Admin
You, NancyBoy, are rather bigoted to say such a thing, as well as ignorant. I'd suggest you shut up while you're ahead.
Admin
I do not have Asperger's Syndrome.
We do care. We just do not usually make a fuss. It just will not show up in your log that we can not be bothered to fight your Website. Of course we are not missed. It is very difficult to spot an absence.
I am not under any obligation to use any particular Website. I do have a sense of what I like, what I will put up with, and what I will not tolerate. If a Website is difficult/awkward/intrusive/bothersome in any way, it is much easier to simply go to another. If the other Website has what I want, I have no reason to ever come back to yours.
Sincerely,
Gene Wirchenko
Admin
You, sir, are my hero.
Not that I thought that he deserved such a put-down. I just liked the turn of phrase.
Admin
<form action="delete.foo?id=bar" method="post"><button type="submit"><img src="bigredx.png" alt="" /> Delete</button></form>
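(For browsers that support restyling form controls, a button like that can be made to read as an inline link with a few CSS rules; the class name here is illustrative:)

```css
/* Strip the native button chrome so the control reads as a text link. */
button.linklike {
  display: inline;        /* sit in the text flow like an anchor */
  padding: 0;
  border: none;
  background: none;
  color: blue;
  text-decoration: underline;
  cursor: pointer;
  font: inherit;          /* match the surrounding copy */
}
```

How faithfully this renders varies by browser, which is the degrades-gracefully trade-off rather than a guarantee.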
Admin
Sweet. I'll shut up. Er... does this work in Safari as well?
Admin
By doing exactly the same as you are right now. I assume that the default behaviour of your application is to do nothing (or error) if there is no session data available, or no cookies set? If so, no problem: it's just not the "Right Way", but it's still secure.
Admin
Yeah, I thought about that, but my skills with rmagick are not all that good.
What this debate is coming down to is a purist vs. pragmatic approach. According to the old standards, a submit input or button is the only way to go for POST operations. Well, technology advances, and I think the Ajax way, despite the silly name, is the better way to solve this problem. For those few of you who turn off JavaScript... deal with it.
Admin
Next time I hope that guy includes a "Delete Retarded Programmer" link as well.
With that kind of skill, it sounds like it's not out of the scope of reality.
Admin
Good, because there aren't enough of you to matter. Good luck with your noscripting!
Admin
I am so happy to hear that you are so rich you can afford to ignore potential customers. Other people are unfortunate enough to have to care about all customers, not just the easy ones.
Sincerely,
Gene Wirchenko
Admin
Not a clue. I've no way to test it.
K.
Admin
You can't because IE's :hover support is broken before IE7. If you really need to compensate for this brokenness, attaching handlers to the button's mouseover and mouseout events is ok. But you see, that's a bit of a strawman that has nothing whatsoever to do with my point. There's still no need to use links.
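A sketch of that workaround, attaching the handlers unobtrusively after load so the page still works with scripting disabled (the class name swapped in and out is an assumption):

```javascript
// Compensate for pre-IE7's missing :hover on non-anchor elements.
window.onload = function () {
  var buttons = document.getElementsByTagName("button");
  for (var i = 0; i < buttons.length; i++) {
    buttons[i].onmouseover = function () { this.className = "hover"; };
    buttons[i].onmouseout  = function () { this.className = ""; };
  }
};
```

Since this only adds the effect, browsers without JavaScript simply see an un-highlighted but fully functional button.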
If you want to argue with me, come up with a cogent one.
K.
Admin
This is really a case of "anal-retentive do-it-by-the-book-goddamit" vs. "let's give the user the best experience while keeping the code clean". It's obvious that when people use tools such as web accelerators one should prefer POST; however, it is equally apparent that if you wish to have a certain look on your web page, you might very well put yourself in a maintenance nightmare trying to make your buttons look like ordinary links in all common browsers. I'd rather spend that time on testing the application or implementing some other functionality, giving my customers a good application on time.
It is also slightly amusing that the original idea behind having buttons for POSTs was apparently to alert the user visually that something might actually happen when you click that button, yet this thread is mostly about how to keep the user from noticing it.
Admin
No, this isn't as you put it 'a case of "anal-retentive do-it-by-the-book-goddamit" vs. "let's give the user the best experience while keeping the code clean"'. It's a case of "give the user the best experience without breaking the cornerstone of the web's scalability" vs. "who gives a damn, this kinda works and it looks right".
Do you think HTTP would have GET and POST if it wasn't necessary? No, that'd be bad protocol design. Each satisfies differing constraints and implies different things about the response. HTTP GET is for getting things. That's why it's called GET and not CHANGE_STUFF or, um..., POST. This implies that GET is idempotent, so it's possible for its responses to be momentarily cached to spread load. That matters because getting stuff is by far the most common thing people do on the web, so caching, even for a very short amount of time, is a Good Thing. It's also why it doesn't cause awful problems if somebody accidentally double-clicks a link, and why some pretty small servers that have been tuned properly can appear remotely to take loads that more powerful but untuned servers can't.
POST implies non-idempotence; they always go to your server. For what POST does, this is also a Good Thing as it's far less common to modify stuff, but you want the network to do its best to move that request from you to the target host.
It's no more possible to replace GET with POST than it is to replace POST with GET. That's like saying you can do without protein and fat in your diet completely and just live off carbohydrates. Of course, any dietician will tell you that while you can do that for a while, eventually such a diet will kill you. Using GET when you should be using POST, and vice versa, might not lead to the same extremes, but the former will lead to silent data loss of all kinds, and the latter will kill your server's scalability.
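The asymmetry can be shown with a toy in-memory handler pair; the names are illustrative, not any real framework's API:

```javascript
// Toy "server": GET reads state, POST mutates it.
const store = { items: ["a", "b"] };

function handleGet() {
  // Idempotent: repeating the request never changes server state,
  // so an intermediary may safely cache, prefetch, or retry it.
  return JSON.stringify(store.items);
}

function handlePost(item) {
  // Non-idempotent: every request mutates state, so it must reach
  // the origin exactly as many times as the user actually sent it.
  store.items.push(item);
}

const first = handleGet();
handleGet();                         // a prefetcher repeating a GET: harmless
console.log(handleGet() === first);  // true

handlePost("c");
handlePost("c");                     // a repeated POST is NOT harmless
console.log(store.items.length);     // 4: "c" was added twice
```

This is exactly why a spider that follows GET links assumes it is safe to do so, and why it only takes one delete-via-GET link to turn that assumption into a disaster.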
If you do it properly, there's absolutely no reason why it should be a maintenance nightmare. That is, after all, the whole point of CSS: to keep a site's styling in one canonical place to ease maintenance. Apart from that, I agree with you: I prefer to focus on providing a good application on time, but part of that is about ensuring that it doesn't abuse fundamental properties of HTTP and, therefore, the web.
For me, it's been about two things: convincing people that non-idempotent state change on HTTP GET is a dumb idea--which is directly related to the post's topic--and disabusing people of the notion that using a javascript: link or the onclick event on a link to trigger a form post is a good idea, when there are alternatives that work, and work cleanly, without sacrificing aesthetics.
K.
Admin
That's funny, it's almost as bad as the original. So if I want to hack your website all I have to do is add a cookie called "isLoggedOn" to my cookies? Wow, easy!
Admin
Please see the Most Popular Downloads list on the Firefox extensions page. Then think of all the corporate Explorers with JavaScript turned off for external websites for security reasons. Then calculate how many visitors you might have lost.
I strongly suggest that your site require JavaScript only for additional functionality. If your website becomes unusable, people just go away, as Gene Wirchenko says, leaving you under the illusion that there are only a few of them. Really, are you so crappy a designer as to be incapable of adding a simple href to your a tags?
Admin
An a element, or the opening tag of an a element.
Admin
Another reference:
http://www.37signals.com/svn/archives2/the_google_web_accelerator_is_back_with_a_vengeance.php
Admin
Keith:
Is the tag line at the top of your site cached?
Or does my browser reload the whole freaky page just for the tag line?
Admin
We're talking about replacing POST with GET, not the other way around. GET requests that work like POSTs should of course have the appropriate cache-control set; thus, pages that can be cached will be cached, making your servers run with good performance, and requests that can't be cached won't be, just as before.
Since you mention double-clicking, I'm inclined to agree that a button might be a slightly better choice, when it looks like a button, but how many sites have you seen that say "please only click once"? I've seen more than I can remember. To a user there is no difference between a link and a button that looks like a link, except that the button refuses to open in a new tab, won't let you copy its text, and has some other small annoyances that might ruin the user experience.
No, of course it isn't... You haven't had to make lots of modifications to get it actually working for everyone after you thought you were finished? Will your "hacks" work in the next version of browser foo? You don't know, and will constantly have to check your page against browsers to ensure it works, possibly tweaking your CSS some more, since doing this kind of stuff with CSS isn't at the top of the list of things that must work in a browser, unlike anchors.
Admin
Ohhhhh. you nearly got me there :)
Admin
They have the money.
Admin
Happened to a colleague of mine. He had written a CVS web interface for a project, and one day he woke up and all files had been removed... by the GoogleBot :-)
Admin
That reminds me of CUPS's "print test page" button.
Which is a GET link.
I once found a CUPS system somewhere where the only things ever printed were test pages. Most of them were printed by googlebot.
Admin
Why did this reply kill the discussion?
Admin
Too late, it's already a legal issue now. Cases have been brought (in the US) against many web companies for their lack of support for disabled and handicapped people. The ADA is a powerful piece of legislation. The short of it is that so-called "web designers" who fail to design to the proper standards are being criminally negligent in doing so, and leaving themselves open to lawsuits. It won't be too long before companies start to realize that they can be liable for not designing to proper standards... At least then they would have a legitimate defense to this sort of thing, in that designing to standards is the only real way to allow all people to have equal access to the content they're making available. That's the whole point of having standards.
Remember kiddies, implementing to standards isn't just for fun, it's the law.
Admin
Assuming that your cache parameters are honored by the intervening servers, of course. POSTs simply cannot be cached by anything anywhere, while GETs can be; they can just ask not to be. But asking and having it actually done are two different things.
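Concretely, the "asking" happens through response headers on the GET side; the values below are just examples:

```
Cache-Control: public, max-age=60    # cacheable GET: intermediaries may reuse it briefly
Cache-Control: no-store              # dynamic GET: please don't cache this at all
```

Whether a given intermediary honors no-store is exactly that "two different things" problem; POST responses don't need to ask, because caches won't store them in the first place.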
Yes, I've seen these a lot. Many now use javascript to disable the button after the first click, and I consider that usage perfectly acceptable since it is an addition, not actual necessary functionality. The server component still has to detect stupid users doing stupid things, like double clicking buttons.
But that's sorta the whole point. Double-clicking links had damn well better be okay, *by definition*. That's what "idempotent" actually means. Double clicking buttons doesn't have to be okay, although it would be a good idea to detect it anyway.
Interesting point, since having "javascript:form.submit()" links in an anchor tag doesn't open in a new tab particularly well either. It usually gives you a JavaScript error on pages that are broken in this specific way.
If you have to use any "hacks" that are anything but trivial little bug-fix issues (most of those being for IE), then you horked up your initial design in the first place. Go back to the drawing board and do it properly next time.
Admin
Ah, what joy this thread has been. In fact, I think a number of people who've posted to this thread deserve a huge WTF themselves.
I've never seen so much misunderstanding associated with idempotency since I started programming twenty years ago. Anyone who's actually developed a proper distributed system will understand that idempotency has absolutely nothing to do with caching.
An action is said to be idempotent if performing it multiple times has the same effect as performing it once. Thus, a form that is posted to, say, transfer money from one account to another can be idempotent if implemented properly, and non-idempotent if implemented incorrectly. How many poorly developed systems do you come across which say things like "please don't click this button more than once, even if it seems to be taking a long time, otherwise your credit card might be billed multiple times"?
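One common way to implement that properly is an idempotency token: the form carries a unique request ID, and the server replays the stored result for a duplicate instead of moving the money twice. A minimal sketch, with all names purely illustrative:

```javascript
const processed = new Map();               // requestId -> first attempt's result
const accounts  = { alice: 100, bob: 0 };

function transfer(requestId, from, to, amount) {
  // Duplicate submission: replay the original outcome, change nothing.
  if (processed.has(requestId)) return processed.get(requestId);

  accounts[from] -= amount;
  accounts[to]   += amount;
  const result = { ok: true, newBalance: accounts[from] };
  processed.set(requestId, result);
  return result;
}

// The user double-clicks "Transfer": the same request arrives twice.
transfer("req-42", "alice", "bob", 25);
transfer("req-42", "alice", "bob", 25);
console.log(accounts.alice);  // 75, not 50: the retry was absorbed
```

With something like this in place, the "please don't click more than once" warning becomes unnecessary.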
Similarly, killing a cat is idempotent (the cat can only die once). The on-going torture and punishment of arbitrary cat killers, however, should definitely be non-idempotent (IMHO the pain they receive should be repeated time after time after time).
POST and GET operations can therefore both be idempotent, or both be non-idempotent. I would surely agree that GET operations should be idempotent, but please don't confuse that with their ability to be cached.
But, of course, in the days of dynamic web applications, very few GET operations are idempotent (in terms of the results generated), because so many organisations feel the need to keep their site "alive" by changing the content of their pages continuously.
Admin
The phrase "user experience" is something that I find causes tremendous frustration. I believe strongly in good, intuitive GUI design. The user experience is important to me. However, I've come to realize that very few people who use the phrase "user experience" care about anything else.
Your user experience goes far beyond the GUI. Your links, your buttons, your general site design are the most visible parts of the GUI (literally so) but they are not all of it. Googlebot accidentally deleting your entire website will lead to an extremely bad user experience, so if you really and truly care about "user experience" and not just pretty GUIs, you will care about proper implementation of standards. Remember, a pretty GUI will not save you if your site is pathetically slow, dangerously insecure, or produces behavior which an end user could not reasonably expect.
The standards are not arbitrary lunacy dreamt up by sadistic software engineers or (god forbid) requirements engineers. They exist for a reason, were created by experts, agreed upon by industry representatives, and have endured considerable real-world testing. Furthermore, they define an essential contract between you and your clients. You implement a standard so that your clients can have confidence in what the end product will do because that's what the standard says it will do. Violating that standard should only be done when absolutely necessary and with the deviation explicitly documented in your product's requirements spec, approved by your client. (And odds are, if your client knows you're consciously violating a standard, they will not easily be made comfortable with that. But it's better than them finding out later on.) Even then, it should be avoided. If you find an accidental violation of standards, you should document it as a bug and determine whether or not it is appropriate to fix it given the strictures of the contract (time, budget, other things to change/fix, etc) and with the understanding of your customer. That is, if you want to have more customers later on.
As far as maintainability goes, it'll be easier if you follow standards and avoid hacks. Anytime you deviate from what you'd normally expect to do, make sure you document exactly where and why you did so, lest you forget about it before you have to make changes to the product months or even years down the road. You can also take pity on any third party who might be stuck maintaining your code later on and explain why you felt it necessary to do it this non-standard way, because you can't expect somebody else to understand your design decisions if they aren't standard.
Admin
Um, yeah. The story is frightening enough, but what's doubly frightening is that the writer didn't zero in on the true problem. This ain't google's fault, and it's got nothing to do with cookies.
Admin
Thank you so much for actually using the word idempotent. I don't know how many people use this word (correctly or otherwise), but here it is exactly what I've been trying to convince people of for years. For example: "make foo" and then "make foo" again should _not_ build foo twice -- the second make should do nothing :-)
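In make's terms, that idempotence falls out of declaring a real file target with its prerequisites; the names here are placeholders:

```makefile
# 'make foo' compiles only if foo is missing or older than foo.c;
# running 'make foo' again immediately finds nothing to do.
foo: foo.c
	cc -o foo foo.c
```

A target that isn't a real file (or has no prerequisites) loses this property, which is what .PHONY is for when re-running every time is actually the intent.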
Admin
Unfortunately, the script is banned; it's been made illegal!
Admin
Write the page properly in the first instance, and if the dumb client wants javascripty links, then use JavaScript on DOM load to rewrite all your nice stuff into their stupid crap ONLY if you can't persuade them otherwise (probably because they've seen too many .aspx pages that do exactly that, this forum being a case in point). The page remains accessible, so you don't get your arse handed to you by compliance lawyers, which is exactly why I never use the native System.Web.UI.Page but something that actually works properly.
Token extra WTF: currently at over 1300 errors in Firebug whilst writing this post, and 2 captchas.
Admin
It looks exactly unlike the links on each side. Opera shows a grey background box, you can't select the text, you can't drag the link, and the text moves as if it were a button. Safari shows a button, period. Camino shows a button, period.
Only Firefox renders it visually like the links near it. And even there it behaves nothing like the links: you don't get the link context menu if you right-/control-click it to open it in a new window or tab, and you can't middle-click to open it in a new tab. And you can't drag the link in FF either, or select the text.
So, 1 out of 4 browsers on Mac OS X shows it with any resemblance to links, and that's the least Mac-like of the bunch, even.
Here's a screenshot
Admin
This reminds me of when the Texas DMV had their database (their WHOLE database) online, searchable, and protected by ... wait for it ... javascript authentication. Turn off js in your browser, and suddenly you had access to home addresses and other personal info for anyone who had the audacity to drive in Texas. I found a friend from cub scouts who had moved to Abilene. A friend of mine still has contact info for many a Bush family member lying around somewhere... Sigh.
Admin
I work in SEO myself, but until today I didn't know a spider could be such a notorious thing. Frequent crawling has some bad effects, but this is unimaginable damage to a CMS site.
Admin
Please read RFC 2119 (SHOULD and MUST).
It might violate the spirit of the RFC, but it doesn't violate the letter of it. It's just an instance of WTF and failure to RTFM/RT(RFC).
Also note that even this quotation has a caveat about headers under which it could be OK to have side effects. No one has mentioned whether the instances mentioned had such things.
That said, having worked on a spider that got trapped in a calendar maze, I hate calendars. Oh, and I hate wikis too, since in development it would get stuck there as well. In the end, good spiders (and their developers) develop an algorithm for deciding that they aren't getting anything from following links (down whatever maze you provide) and move on with life.
Note that unfortunately the spider in question was required to operate buttons too, not just links -- It had to be able to submit posts and run JavaScript. It would also carry out authentication, but it was generally configured to be given tokens by a real person and was designed not to wander too far afield. I don't remember what warnings were included in the documentation/user guide, but the product was marketed for testing web applications.
All in all, I hate web applications, and I hope people have sympathy for me (not that I care). Spiders and anything from Google, OTOH, I like and sympathize with.