Admin
It didn't [:(]
The sarcasm tag got cut (Nice isn't (sarcasm)). Also, a misplaced word: it should read "that the first time you are entering the page".
----------
CAPTCHA: ORANGE
orange
CAPTCHA: wrong
This thing is case-sensitive?
Admin
Still broken. Looks like a big ugly button. Running IE7 beta 2.
(looks like a link in FF)
Admin
I am not very familiar with Dreamweaver-speak; care to enlighten me as to which terms are in its exclusive lexicon?
I understand HTML quite well, actually; I've been writing web applications for about 12 years. The "old hyperlinked document model" that I refer to is the inane system whereby we take a system designed to link together related articles and build entire network clients out of it, which end up so convoluted that 90% of the data transmitted with each request just rebuilds the identical UI from the last request, with a tiny change to the data in the middle of the application. Could you imagine if, when typing in MS Word, every time you wanted to change the selected font it had to rebuild the entire application layout just to update which font name appears as selected? The most important elements of any commercial system are UI fluidity and aesthetic appeal. Any programmer worth their salt will be up to that challenge, and given that designers will already have to make some compromises, it's on the programmers to limit those infractions as much as possible. Whining about a link that calls a POST request seems like a really trivial reason to make a designer's job that much harder.
Admin
re: Prefetching,
I Google'd for "attrs" using Firefox not long ago, and while still looking at the Google site, I was asked to accept or decline a security certificate from a .mil site.
Turns out Firefox was prefetching the link to atrrs.mil, which offered the certificate.
Admin
Kinda. There are two reasons why it might not work. Firstly, the stencil font, even though it appears to be uppercase, is in fact lowercase. Secondly, caching sometimes interferes with the captcha's contents.
The best you can do is either sign up or repeatedly try to satisfy the beast.
Admin
Hmmm... looks fine in IE6. Any chance of linking to a screenshot? Looks like a job for IE's preprocessor comment yokes.
Admin
Paddy, you're forgetting that the web is primarily distributed hypertext. That we attempt to build applications on top of it that HTML was never designed for is not HTML's fault. If you want to build thick client apps, go do that.
And you still haven't given me an example where you need tremendous hacks to satisfy 'UI fluidity and aesthetic appeal'. Put your money where your mouth is rather than giving out.
Now, all this whining about HTTP POST is happening for good reason. The web was designed to scale and scale extremely well. A big part of that scaling is caching. HTTP GET is the most common request type, and for the web to scale it must be idempotent (very important distributed computing word there: look it up). Without idempotency, caching proxies couldn't exist and your servers would be continually pounded by requests.
Protocols are contracts of expected behaviour, and part of the HTTP contract is that GET is idempotent so the load can be distributed over the whole system through caching. Once you break this and have HTTP GETs causing significant side-effects, you're buggering with the contract.
And javascript links are bad because they break the user's expectations of what a link does.
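To make the distinction concrete, here is a minimal sketch; the URLs are hypothetical:

    <!-- An idempotent view is a plain link (GET); a destructive
         action belongs in a form (POST). -->
    <a href="/pages/2456">View page</a>

    <form method="post" action="/pages/2456/delete">
      <input type="submit" value="Delete page">
    </form>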
K.
Admin
IE bug or not (actually it happens in Firefox too), it's something that designers have to deal with, since so many consumers use IE and they are the ones that need to be made happy. Any design plan that relies on MS fixing a bug in IE is doomed to failure.
Admin
There are a lot of bots on the web that will POST spam to anything they see that looks like a form as well, so if you've wrapped your delete operation as a form submit button, you're not much safer.
Still makes sense to use GET and POST properly though.
And authenticate!
Admin
I have no problem with hypertext documents being used to display hypertext content. I never said it was HTML's fault that we use it to build application UIs; I am just saying it's used in a convoluted way in order to get the results we want. It is still better than writing thick clients of course, but that doesn't make HTML any more ideal just because the alternatives are less so.
I already mentioned that form components (any native widget) always draw over dynamic HTML in at least a good portion of modern web browsers, isn't that a good reason to avoid using form components in those situations?
Caching proxies are great for the static content the web was built for, but how many GET requests actually show static content these days in business applications? Every request shows a different banner ad, may or may not say "hello [your name here]" and "you have [x] messages" depending on if you logged out via another link or not, or if someone sent you a message since the last time you pulled the homepage.
I absolutely agree that grievous misuse of GET requests is a Very Bad Thing (such as in the OP), and as a general rule you should not modify server state via a GET request. But consider Yahoo Mail: you have to click a hyperlink, which calls a GET request to view your mail message, and changes the server's data regarding that message from "unread" to "read" right then.
If you used a POST for that, you could not open the message in a new tab; you would bloat the HTML by a large amount by loading it up with tons of form tag sets; and any cache of the message list page would be bad anyway, as it would show old data regarding which messages were read or not, meaning you'd need to get that page by POST request only too. So much for clicking a link to your inbox - that would have to be a form button then too.
In the Real World, people tend to use GET requests a whole lot, and the pages expire immediately instead of caching, because the content does or can constantly change even for identical GET requests.
And secondly: using a hyperlink to submit a form does not break any client/server protocol contract, as the server does not care how the request is generated as long as it's valid.
Admin
Just a followup about this: how would you recommend interacting with javascript content?
Personally, I would contend that the design should give the user a good expectation of what the link does. If a "next" link goes to another page, or shows the next image in an image tag on the existing page via javascript, what matters is that the system is intuitive to the target audience, even if some engineers find it baffling that you'd use a hyperlink for anything but hyperlinking to another URL, despite a design that clearly indicates the intended functionality.
Admin
Wow, such a bias to what the web page looks like, visually, to a human... which would be great for a real user interface on a real client/server application. But the web is designed for shuffling text, with pointers to other text, and I actually want to use it that way. By all means avail yourself of all the flashy features you want when you know they are supported - but for God's sake test that the browser you're sending stuff to supports it first, and - much more importantly - degrade gracefully when it doesn't! Not doing that work is just negligent, however you dress it up.
Remember, it's not just me and my twisted preference for browsers that don't support JavaScript. What about users dependent on screen readers? Should blind people not be allowed to submit forms? (In a day and age where anti-discrimination legislation is springing up all over the place, this could turn into a legal issue too, before too long.)
Oh, and whilst you're focusing on client/server contracts, don't forget the service provider/customer contract - or, cynically put, "if you tell me I'm too primitive to use your services, how many people can I dissuade from being your customers?"
Admin
If you want an example of how this can be done well, look at GMail, whose "basic HTML" view is good enough that I sometimes even prefer it to the standard view. That should be the benchmark.
Admin
lol
Admin
I don't know which rock I've been hiding under, but I've simply never heard that.
Perhaps it's so obvious that nobody thought of mentioning it or writing it in a book?
Admin
That's the problem with technology: it always falls short of how we'd like it to be.
And I've already said that you're wrong. There's one, and only one, marginally popular browser that does that, and that's IE, and even then it only occurs with the SELECT element. And that's a problem that MS set out to fix with IE7. Browsers rarely use native widgets for that very same reason.
Caching proxies are great, full stop. That's why HTTP provides mechanisms for the cases where you really, really don't want content cached. In HTTP/1.0, there was Pragma, and then in HTTP/1.1 we got Cache-Control, which gave us fine-grained control over caching. There's also the Expires header for when we want to limit the amount of time information is cached, conditional GETs using the ETag and Last-Modified headers so that we only end up using bandwidth to transmit fresh information, and a whole host of other facilities.
Me, I'm very careful with caching. Wherever it's at all possible to serve information that's slightly out of date, I do. It's only in places where the information must be up-to-date that I prevent caching. Also, when I'm serving a user that's logged into an app, if I'm sending information to them that I want cached but don't want any intermediate servers to hold onto, I use Cache-Control: private to ensure it's only cached in their local cache and not on any intermediate server.
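For instance, a response meant to live only in the user's own browser cache for a minute might carry headers along these lines (values purely illustrative):

    HTTP/1.1 200 OK
    Cache-Control: private, max-age=60
    Last-Modified: Tue, 14 Mar 2006 10:00:00 GMT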
I care about ensuring that the apps I write can cope with scaling up and down smoothly. Do you?
Something like that is an edge case, I'll admit that. Similarly, logging requests is another edge case that gets through. But you seem to be misunderstanding what idempotence is.
The vast majority cache. Really, they do. It might not seem like it sometimes, but they do. Caching doesn't have to last all that long. If a cached copy lasts for a second, and that's a cache of a page on a server that's being slashdotted, that's an awful lot of scalability.
And I work in the Real World. The systems I deal with might not have to scale like the likes of Amazon or Google, but they still have to cope with a lot of pounding, and the more processing they need to do, the more it costs and the slower things get. I take advantage of caching wherever it's necessary.
No it doesn't, nor did I ever say it did, but it's a nasty and unnecessary hack and has been for quite some time.
K.
Admin
I am not biased towards making a site useful to humans, but most of my clients are, and I have to make them happy. They want attractive sites that get their potential customers interested. If they want to add some functionality for the majority that makes it hard for blind people, it's their call. They can always offer a thin version of the site if they feel it is economical. Usually, it is cheaper to offer the thin version as well instead of slimming down the features of the main site. You have to offer the majority of your customers the best site you can right off the bat, or they won't remain your customers.
I agree you should gracefully tell someone they need to turn on cookies or enable javascript to use a site, as with all error conditions. Most web commerce isn't about accessibility though, it's about commerce, so if it works for a company's business plan to alienate 5% of web users then, for good or ill, it really is their call. I use Windows, but if I can't run some company's catalog CD on a Mac, it's because they targeted Windows users, not Mac users, and again it's their call to target consumers that possess a specific type or level of technology.
Admin
Exactly, and we have to work with what it is.
I tested this out in Firefox; it appears that a submit button paints under layers as you said, though Flash still paints over. I agree that when MS fixes it, and it becomes a rare issue, this will get easier. Still, you can't tell users to wait until a new browser is released to resolve an issue; you have to work with the technology you have at the moment, and a lot of people use IE. I am happy to know they are working on it, thanks for pointing it out.
I agree with caching where possible, and of course scalability is very important. Ironically, it would be really nice if web applications were designed more like client-side applications, adding elements as children of other elements, where the higher-level design elements could be left intact while you flipped next/prev through data in whatever field area displays it. That would make things a lot easier on the server, and it would make it easier to partition out what data is cacheable, what is not, and what is already on the client. That is, without resorting to frames, of course.
I think most web applications are edge cases. Whether you check your email, bid on auctions, process reports, etc., it comes up. Many news and shopping sites can and do utilize effective caching of course, since they have high traffic on rather static pages. I absolutely admit it is important to use it where possible.
If the site is a web application that requires the user to register and/or log in, all that traffic will still require that each user gets the page generated with a "hello [x]" at the top, and caching will not help.
And of course I agree. The caching issue came up in the context of complex user interfaces, which generally belong to complex web applications such as ones that help you check your email, bid on products, or otherwise provide lots of unique data. In those cases, the user often gets different data off the same URL even a second later. The pages are also generally personalized to the user that is logged in, and cannot be reused for others logged in on other accounts.
As far as I am concerned, if a designer finds it useful, and does it gracefully, the experience is transparent to the user who only knows they like how the site works. We can of course disagree.
Admin
mod_rewrite. It makes URLs so much nicer!
http://foo.gov/PageDelete/2456
It would be interesting to see some poor employee getting blamed for deleting the whole site.
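For the record, a rewrite rule along these lines would produce that sort of URL; the pattern and target script here are made up for illustration:

    RewriteEngine On
    # Map the "pretty" URL onto a hypothetical real handler
    RewriteRule ^PageDelete/([0-9]+)$ /admin/delete.php?page=$1 [L]

(Which, of course, makes it even easier for a spider to walk straight into the delete handler if it's reachable by GET.)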
Admin
Or perhaps Arachnotron :)
Got a WTF myself, can't actually play Doom3 'cause I get too scared... WTF?
Admin
Doesn't work in Konqueror, though I wasn't surprised. (I get the same thing on LycosMail, which uses a similar technique to change the appearance of the login button).
Admin
No. Your default view should be text-only friendly, and the button at the (bottom|top) should say "view rich content site". Or you could just use CSS and not bother about the rich content at all. There are very, very few sites where rich content of any sort makes sense.
Admin
(Re: http://talideon.com/wiki/?doc=WikiEtiquette)
It's a form button, huh? Very clever, Lord Smartington Cleverboots of Brainingworth Hall.
That must be why it looks different from the links (it has a dark rectangle background for me) and acts differently from the links (caption moves when clicked, dotted border) and generally creates a jarring 'is this a button? why is it not like the other links?' feeling.
Smooth. Real smooth.
Seriously, if you're going to do something, do it right and test in more than one browser. If you must test in one browser, make it a standards-compliant one.
Admin
No, that's a bug in IE. No other web browser does this. IE should be fixed to respect the HTML standard.
There is an evil hack for this: just put an IFRAME with the same width and height as the div/layer under the div/layer and it will work... (the IFRAME will overlay SELECTs, which is the only widget that gets painted above objects with a higher z-index).
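A rough sketch of that hack, with sizes and positions made up for illustration:

    <!-- The IFRAME shim: an empty iframe sized and positioned to match
         the layer, sitting just below it in z-order. In IE6 the iframe
         (a windowed element) blocks the SELECT from painting through,
         while the layer paints normally on top of the iframe. -->
    <iframe src="about:blank" frameborder="0"
            style="position:absolute; top:100px; left:50px;
                   width:200px; height:150px; z-index:9;"></iframe>
    <div style="position:absolute; top:100px; left:50px;
                width:200px; height:150px; z-index:10;">
      Menu content that must appear above any SELECT
    </div>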
Admin
I'm not defending using GET in this manner, but for the record, there's no violation of the spec here. "SHOULD NOT" is different from "MUST NOT", and is defined in RFC 2119 thusly:
"This phrase, or the phrase "NOT RECOMMENDED" mean that there may exist valid reasons in particular circumstances when the particular behavior is acceptable or even useful, but the full implications should be understood and the case carefully weighed before implementing any behavior described with this label."
I'll be the first to admit that this particular behaviour is not acceptable or even useful, but you get my point, I think.
Admin
That's what looked like the best ending to me :)
Sue google for "non-respect of the conventional browsing behaviour".
People should not be allowed to go on the internet without JS and cookies! These guys are terrorists! Hope IE7 won't allow disabling such things.
"... and then... we .. will... have... peace." (Palpatine)
Admin
LOL!
Admin
I bow to you, sir; I have the utmost respect for people writing CGI in C (and such people still exist; I know of a forum written in C).
That's where CSS comes into play.
ul means "make an unordered list"; using roman numerals on an unordered list is proof that you didn't grasp how HTML and CSS work.
He's not.
Thanks, you rock.
Admin
You're not missed.
Seriously, if I were designing a site so that only malcontents with Asperger's were happy, I'd keep your comment in mind. Since no one else cares...
Admin
The sadness is that IE does not allow you full control over buttons. It adds about 10px of padding left and right even when you set padding:0, and you can't remove it.
This highly annoying behaviour has not been fixed in IE7 Beta 2.
So when I got to this issue, instead of enforcing my stylistic ways, I opted to adjust the entire design to allow for a wider button, because reinventing already robust functionality ("submit") using javascript (which can be turned off) is a Bad Practice. It falls under Square Wheel Reinvention.
Is too!
*hugs my DOM and fuzzy JS library*
Ok.
Is not.
Admin
http://www.fhwa.dot.gov/infrastructure/hawaii.htm
Admin
I am breaking a "personal rule" to never respond to anyone who uses profanity on this forum.
However, referring to layers does NOT make one a "dumbshit". That happens to be a W3C standard. The effect gwenhwyfaer is referring to is quite common. Anyone who has spent more than (say) 3 hours writing HTML code understands how browsers render form elements with a precedence that overrides any attempt to use properly applied (standards-based) CSS techniques.
This comes from "the voice of experience": one who has been writing complex HTML/JavaScript/CSS systems in Notepad since HTML was at version 0.1.
Try it sometime. It is good for the soul.
Admin
<Scratches head>
But we can do that, and have been able to for a long time. People haven't been making all that noise about XMLHttpRequest for no reason, you know. And even before that became popular, there were still plenty of ways of doing client-server communication, from hidden IFRAMEs to a JavaScript mechanism for fetching data that I've seen independently invented several times by others too: http://talideon.com/projects/javascript/JSRPC.js
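For anyone who hasn't seen it, a minimal sketch of the XMLHttpRequest approach; the URL and element ID are hypothetical:

    // Works in Mozilla/Opera/Safari, with the ActiveX fallback for IE6.
    var req = window.XMLHttpRequest
        ? new XMLHttpRequest()
        : new ActiveXObject("Microsoft.XMLHTTP");
    req.open("GET", "/messages/count", true);
    req.onreadystatechange = function () {
        if (req.readyState == 4 && req.status == 200) {
            // Update just the bit of the page that changed.
            document.getElementById("msgCount").innerHTML = req.responseText;
        }
    };
    req.send(null);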
Where have you been for the past few years? :-)
No, by definition they can't be.
That's why it's possible for the server to prevent caching of GET where the information sent is one-shot. However, even with progress reports, and checking your email, they still benefit from idempotency by using Cache-Control: private and other mechanisms.
Now, only an idiot would use an idempotent request to submit a bid on something. Why? Caching. You could click a link to place a bid and get back a cached page from somebody else. That's not what you want happening, and that's one of the reasons HTTP POST exists. It's for submitting non-idempotent, mutating requests. Its responses cannot be cached, so you always get back a fresh page.
You haven't read that article I pointed to on idempotency, have you?
Now, let me think... when you log in, what kind of request is that... oh, a HTTP POST! That, and must I point out again that there are differing levels of caching. You can specify public caching where the response will be cached on a public caching proxy, you can specify private caching, where it will be held in your browser cache alone, and you can specify no-cache where it's absolutely positively impossible for the response to be cached anywhere. You can also specify expiration dates on responses, which might range from anything from a few milliseconds to a month and upwards, depending on how fresh the response needs to be.
How are you not getting this?
You do know the difference between HTTP GET and HTTP POST, right?
The problem here is that it doesn't do it gracefully; that's why it's a nasty hack. If you've two methods of allowing the user to trigger some non-idempotent action, one of which (styling a button in a form performing a HTTP POST to look the way the designer intended) isn't a hack, and the other (using GET for a non-idempotent request, or using a JS link to trigger a form submit) is, and they both have the same outward appearance, then there's no disagreement here: the former is the correct way to do it and the latter is for somebody who's too lazy to do their job right.
K.
Admin
You really didn't need to say 'whoops' so many times. It made your post worthless.
Admin
You said 'whoops' way too many times. It was damn annoying.
Admin
You'll think otherwise when your proxy changes your input language to Mandarin.
BTW: HTTP POST can be submitted through a link.
Admin
Not sure if it was mentioned previously, but this story and TheDailyWTF were "digged" recently:
http://www.digg.com/technology/Googlebot_destroys_incompetent_company_s_website
Admin
Wrong, there are no "layers" in the W3C specification.
Wrong again, only Internet Explorer overrides any attempt to set z-index to put other HTML elements on top of form elements, and then again it only does it for a select number of form elements (mainly selects, maybe also textareas). No other browser follows that kind of stupid behavior.
And it's supposedly been fixed in IE7, too.
Admin
Don't be a smart ass. If you're going to complain, at least be a bit constructive. Tell me what browser you're using and what platform it's on, give me a screenshot, and I'll do my best to make it work right. You also seem to be missing the bigger point.
Now, I have three browsers running on my machine: Firefox 1.5.0.1--my primary browser--Opera 8.53--which I find useful for debugging JavaScript--and IE 6.0. Last time I looked the former two were considered standards-compliant, no? And guess what: the page looks just fine in all three. So, smooth, real smooth. You've just made an allegation against me that was unfounded.
Now, the only browser I know of that acts exactly the way you described is IE. All three put a dotted line around both links and buttons when styled not to have a border. Check. IE is the only one that makes the text move when the button's clicked and there's nothing I can do about it.
The whole point I was attempting to make was that there's no need to use a HTTP GET to modify application state. That page is made to use a HTTP POST to accomplish the job properly.
K.
Admin
I actually think you've got this the wrong way round. I don't think that a programmer should jeopardise security (or standards for that matter) for the sake of the designer's 'freedom'. The case in point is not merely a question of programming style, but a programming error resulting in not only a breach of security, but the loss of large amounts of data. Are you willing to accept these risks for the designer's 'freedom'?
Should a car be built with aluminium panels in place of the windscreen and windows because the designer thinks it looks better? All elements of work need to be undertaken within a certain level of constraints. Automotive engineers work within constraints, car designers work within constraints, programmers work within constraints and web designers should also be confined by constraints.
Any designer worth paying should be able to design an attractive, engaging and intuitive design for a web application within the constraints laid out by web standards and the requirements of both the customer and the programming team they are working with.
Admin
And even then, it's only the SELECT element that's affected.
K.
Admin
Keep on trying, maybe you'll get it right on the third attempt even :P
Admin
Of course it does accomplish the job of posting instead of getting, but that isn't the question, is it? If it were, you wouldn't need to style it like a link.
The question we should ask ourselves is: why use POST? I can find three reasons in this thread not to use GET. "Make caching work correctly": but since we can set Cache-Control and turn off caching anyway, that is no longer an issue. "Make programs such as Googlebot not visit some pages": that can be done through proper authentication. And "because the RFC says so": the only valid reason, but as it is only a recommendation, we can ignore it when we absolutely have to.
The user experience should always come in first place and sometimes buttons or javascript just don't fit.
As a side note: how do buttons and :hover work in IE 5.5/6?
Admin
One question: How does "not using cookies" allow the spider to bypass the cookie check? Wouldn't this cause the cookie check to always return false, and thus keeping the spider logged out?
Or is this security implemented in Javascript? Please no...
---
As a side note, it is advisable to turn all forms that make significant changes/execute major actions into POST forms, thus putting them out of reach of any spider, bookmarks, or accidentally pressed links. If this causes "delete page" to require two clicks instead of one - once on the link, once on the submit button of the post form - that is a small price to pay.
This is because GET requests are considered to be "safe": per the HTTP spec, they should not have the significance of taking any action other than retrieval.
In other words, if a rogue GET request can wipe your site, it's your own fault.
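A sketch of that two-click pattern, with hypothetical URLs:

    <!-- On the listing page: a harmless GET to a confirmation page. -->
    <a href="/pages/2456/confirm-delete">Delete this page</a>

    <!-- On /pages/2456/confirm-delete: the real POST form. -->
    <form method="post" action="/pages/2456/delete">
      <p>Really delete this page?</p>
      <input type="submit" value="Yes, delete it">
    </form>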
Admin
Okay. I can agree with you up to a point. However:
Your button doesn't get styled as intended in the Konqueror browser (which is my default).
The button text can't be copied+pasted, and breaks the flow of your text, so if you need to include something like this within your page's body text, it won't look right, and will probably choke search engines.
If you're writing an Ajax app, you're going to be using Javascript anyway, so I don't see anything wrong with an onclick event in that case. (not all onclicks result in a call to the server; eg treeviews and menus just make other elements (dis)appear).
Admin
That's easy. I need a picture and text together in one solid link. The text needs to be available in different languages, so making a solid image for every language is not acceptable. For example, a big red X next to the word delete.
How do I solve it? Using Ruby on Rails, it's pretty easy. I use link_to_remote to do a javascript POST action. If the user has ajax capability, it updates the page dynamically; otherwise it redirects to a new page load.
If the user doesn't have javascript, it triggers the same URL via GET, but instead gets a confirmation page with the data in a form and a regular submit button. (Which submits back to the same action as before)
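Outside Rails, the same degradable pattern looks roughly like this in plain HTML and JavaScript; the URLs and function name are made up:

    <script type="text/javascript">
    // With JS enabled: POST asynchronously and stay on the page.
    // Without JS: the href falls back to a GET confirmation page.
    function postDelete(url) {
      var req = window.XMLHttpRequest
          ? new XMLHttpRequest()
          : new ActiveXObject("Microsoft.XMLHTTP");
      req.open("POST", url, true);
      req.onreadystatechange = function () {
        if (req.readyState == 4 && req.status == 200) {
          // e.g. remove the deleted item's row from the page
        }
      };
      req.send("");
    }
    </script>
    <a href="/items/42/confirm-delete"
       onclick="postDelete('/items/42/delete'); return false;">
      <img src="delete.png" alt=""> Delete
    </a>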
Admin
When I read things like this I seriously consider abandoning my job and taking up swine herding instead. It's hard work, but you get much more intelligent clients.
Admin
I'm new here, came via Digg or some other RSS feed.
Anyhow, I have a few questions:
1) Would this all have been avoided if they used HTTP Authentication? I know it's not entirely secure, but correct me if I'm wrong: robots can't view anything in a password-protected dir since they're not authenticated?
2) Does anyone have any suggestions on validating form input? I am currently using an onClick="checkform();" type of javascript, and I realize this isn't ideal. Should I send all information via POST to a script that checks it and rewrites header data based on whether it's cool or not?
Thanks.
Admin
No, it is the question. If it makes a non-idempotent state change on the server, you ought to use a HTTP POST. Really: http://www.dehora.net/journal/2006/03/oh_well.html, and follow the links.
If it were just getting and the request was idempotent, a link is just fine. That's what it's for. This is Distributed Systems 101 here.
Pragma's been around since HTTP/1.0, and Cache-Control was introduced in HTTP/1.1, so there hasn't been a problem with this since '96. That's not the issue though.
That's a valid reason to use proper authorisation and authentication, not a reason not to use HTTP GET.
Um, no. It's not a recommendation, it's an actual standard. HTML 4.0 is a recommendation because it comes from the W3C, but you'll find that HTTP and HTML 3.2 were ratified by an actual standards body, the IETF.
That, and if you ignore the spec, you end up incompatible with everything else out there. A protocol is a shared agreement between you and your peers implementing that protocol. If your app breaks because you decide to ignore the protocol, it's your fault, not theirs.
The user experience should always come in first place and sometimes buttons or javascript just don't fit.
Heck, TCP and IP don't work quite the way I'd like sometimes. Maybe I can ignore their specs and do what I like. No, wait, then I wouldn't be able to communicate effectively with the rest of the world.
It's broken. If you really, really, really need to emulate :hover on buttons in IE, JS is fine for that. But do it cleanly with something like Dean Edwards' IE7 patch. That's a separate matter though; its lack doesn't break anything, and it is completely tangential to the thread.
K.
Admin
"He brought up the root cause -- that security could be beaten by disabiling cookies and javascript -- but management didn't quite see what was wrong with that. Instead, they told the client to NEVER copy paste content from other pages."
Granted, the authentication bug was a big deal, but the real problem was management's non-solution to the problem.