Admin
My bank's site required JavaScript for their BS useless click-your-PIN-number interface. I sent them screenshots of how you could change the code in Firebug so that verify_form() just returned true, and a month or so later they changed the interface.
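A minimal sketch of that bypass (verify_form is the name from the comment; the body is purely illustrative), showing why client-side-only validation is no security at all:

```javascript
// Hypothetical client-side "security" check, as described in the comment.
function verify_form() {
  // Imagine the PIN-clicking validation logic here.
  return false; // normally rejects the submission
}

// From the browser console (Firebug then, any dev tools now),
// a user simply redefines the function in place:
verify_form = function () { return true; };

// Every subsequent call to the form handler's check now passes.
console.log(verify_form()); // true
```

Anything the server needs to trust has to be re-checked on the server; the browser's copy of the code is the user's to edit.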
Admin
The first dot-com I worked at ran the Netscape web server, which used server-side JavaScript. It's been around forever.
Admin
How does ASP.NET save you from JS? You mean, save you from writing it yourself?
Admin
JSON is great to debug with, since you can just go to your site, hit F12, and look at your posts and responses. You're not using Notepad to debug the browser; you're using Firebug to debug the browser, since that's what it's for. Also, JSON is not XML text; not sure why that isn't clear. JSON gets rid of the 90% of XML syntax that is useless bloat and leaves just a tiny bit beyond the actual data. Getting rid of the 90% of cruft is the win, not the 5% of cruft.
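To make the wrapper-to-data ratio concrete, here's the same trivial record in hand-written XML and in JSON (the field names are made up for the example):

```javascript
// The same record in XML and JSON, to compare wrapper overhead.
const xml  = '<user><name>Alice</name><age>30</age></user>';
const json = JSON.stringify({ name: 'Alice', age: 30 });

console.log(xml.length);  // 45 characters? no: 44
console.log(json.length); // 25
```

Even at toy scale the XML tags nearly double the payload; with nesting and namespaces the ratio only gets worse.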
Admin
From Acronym Finder:
- Point Information Jeunesse (French: Youth Information Point)
- Palestine-Israel Journal of Politics, Economics and Culture (quarterly publication)
- Points d'Information Jeunesse (French: Youth Information Office)
- Proximal Interphalangeal Joint (veterinary science)
- Project Investment and Justification (Arizona)
- Melanesian/Pidgin (language)
- Pride in the Job (home builders' award)
- Palestinian Islamic Jihad
Take your pick.
Admin
JSON is actually not that bad. For efficient processing it's not in the same league as X12/EDIFACT or ASN.1, which were specifically designed for efficient data transfer (at this point I'd also like to point out X12 et al. as examples of "human-readable" encodings that aren't at all human-readable, to the extent that one bank once claimed they were encrypted until someone read their 8583 transactions out loud to them :-), but it's still light-years ahead of XML. Mind you, that's because pretty much everything is light-years ahead of XML; it forms the end-of-the-scale mark that everything else is more efficient than.
(Geek quiz: Does anyone know of any at least moderately-used on-the-wire format that's even less efficient or more complex than XML? I've dealt with an awful lot of them, and I can't think of anything that comes close).
Admin
"Are you the Judaean People's Front?" "Fuck off! We're the People's Front of Judaea!"
Admin
Interesting question. I was about to cheat and say XHTML, but actually I think XHTML wins on both (and is a cheat in this context anyway).
The whole argument ignores cultural history, anyway. XML didn't just spring unannounced from nowhere: it's a travesty of SGML, because SGML is what the authors were used to. In other words, it wasn't carefully designed from the ground up to solve the problem at hand; it was a useful hack with little thought given to parsing and none whatsoever to scalability.
And I'm sorry, but any representational format that uses in-band control mechanisms with all those stupid ampersands and, God help us, CDATA, is clearly not well designed. "We use a textual format ... except when we don't."
The argument that you can more easily debug a textual messaging format is specious. This may work for an XML message of a thousand characters or so (I personally get angle-bracket blindness, but that's just me), but once again it doesn't scale. And it ignores the fact that you have a broken message. The thing to do with a broken message, whether binary or textual, is to reject it and ask the sender to correct it. You're not doing the sender any favours by second-guessing them.
Well, call me an old ASN.1 or Edifact fuddy-duddy. I do rather like Json, though. It's a representation that understands its own limitations, and within those limitations it works brilliantly.
JSON reminds me somewhat of LISP. XML, on the other hand ... Well, XML reminds me of that style of literary German popular about a hundred years ago where you actually had to turn the page to find the verb you were looking for.
Admin
Hopefully it's not moderately used, but in some dealings I had long ago with .NET webservices I came across base64-encoded binary serialization of XML DOM wrapped in XML. As in: binary serialize a homegrown equivalent to the .NET XMLDocument class and everything below it, then return the resulting byte array as the webservice's response and let the XML serializer base64-encode it.
(You could almost make a case for binary serialized XML strings to handle bundling of multiple documents with varying character encoding in one response, but straight up serializing the DOM object graph is nuts.)
Admin
It's more similar than that. You can easily convert a Java function to JavaScript by removing the type qualifiers, test it (it works as expected), and then get bitten by int/int not being a float in Java the way it is in JavaScript.
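The division pitfall above in two lines (the function name is just for illustration):

```javascript
// In Java, int / int is integer division: 7 / 2 == 3.
// Strip the type qualifiers and run the "same" code in JavaScript,
// where every number is a double, and the result silently changes.
function halve(n) {
  return n / 2;
}

console.log(halve(7)); // 3.5 in JavaScript; the Java original gives 3
```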
Admin
JSON is better than XML for most web applications, and there are many other popular ones.
Applause! You've justified the use of a binary format in certain situations. The key thing is that writing, maintaining, bug-fixing, and supporting applications built around binary formats is more expensive than around textual ones.
Sometimes that cost is justified by savings in CPU, bandwidth, storage or whatever requirements. Sometimes it is not, and just consumes developer and support resources for no good reason.
Of course, HTTP is a textual protocol anyway, so you may be just base64 or otherwise re-encoding binary data that would be smaller 'on the wire' if it were in a raw textual format in the first place...
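That re-encoding cost is easy to quantify: base64 turns every 3 input bytes into 4 output characters, roughly a 33% expansion. A quick Node sketch:

```javascript
// Base64-encoding binary data for a textual channel adds about one third:
// every 3 input bytes become 4 output characters (plus any padding).
const raw = Buffer.alloc(300);       // 300 "binary" bytes (zero-filled here)
const encoded = raw.toString('base64');

console.log(raw.length);     // 300
console.log(encoded.length); // 400
```

So a binary blob smuggled through a textual protocol can end up larger on the wire than a sane textual representation of the same data would have been.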
Admin
Let's say that you are responsible for the program that sends the message.
It's broken somehow - your data customer has called up to say that something is ****ed up with the data file that program sends them.
How are you going to fix it? First of all you need to find the bit that's broken. Regardless of format, the first thing you'll probably do is run your verbose parser on the thing. If that finds the issue, great. Nobody wants to manually trawl through a big file of any kind.
But what if it doesn't?
Figuring out where the data is screwed is going to be easier in a message you can open in your favourite text editor when compared to same information in a hex editor.
Admin
By the way, the guy who said that XML is an SGML application is right too. There are so many things possible with SGML, so why did they pick the most bloated one? Even a single change to the XML spec that allowed <tagname/content> instead of <tagname>content</tagname> would have reduced the bloat by half. Oh, and that's not nearly as big a change as some of you might think.
Admin
Same for JSON as well, you usually just get a single endless string of wire-format data that looks like a COBOL programmer's version of EDIFACT. So you feed it through a pretty-printer to get it laid out in a (human-)readable manner... and now you've just done the same thing that you'd need to do for a full-on binary protocol.
(This could lead to an interesting argument about how you want to define "human-readable", which I suspect is in practice about as easy to nail down as "Web 2.0" and "the cloud". What makes it particularly pernicious is that unless you're incredibly careful about how you slice it, you'll end up with either JSON being found to be non-defined-to-be-human-readable or EDIFACT being found to be defined-to-be-human-readable).
Admin
Damn, right after I hit "Submit" I remembered this link, which argues that XML was never meant to be human-readable (and that's by the co-founder of jGuru, hardly an XML-hating radical).
Admin
Posten's rule is nothing of the sort, and in fact there are several authorities (far more knowledgeable than I) who consider it something of a mistake.
At the basic level of hardware (somewhere down at the physical layer, for example) I would be extremely surprised to find a driver that is "liberal in what it will accept." Sure, there are protocols that do error correction on the fly, but that depends on checksums and so on. It isn't remotely the same as pulling up a byte-stream (presentational layer format: your choice) into Notepad or Emacs and editing it, is it?
However, to your scenario. (It's worth pointing out that we are now at the proper point for fixing the message. Not the destination; the originator.)
Frankly, I could be sending the message in Klingon for all it matters. Presumably I have a Klingon parser. Presumably I have a set of software tools that understand Klingon. Presumably, out of the hundreds of thousands of bytes, whether binary or textual (and the latter includes things like Unicode and UTF-8 and UTF-16 and even Base-64, so even this is not quite trivial), I have a visualisation tool that gets to the broken part of the message and turns it bright red or something. If not, my tools are broken and my message is garbled beyond hope of redemption.
I've worked with enough XML to recognise that this unhappy state of affairs is pretty much an everyday occurrence.
Notepad (you may choose not to call it a tool in this context, and I would agree, except for the fact that all the testers I know use it) is hopeless. IE whatever is hopeless. I've even downloaded one of the popular XML viewers, which is very very good when the "formatting" bit (ie the tags) is correct and allows you to edit the data bit at will. Unfortunately, it's crap in the common case where the "formatting" bit is wrong.
That's what happens with in-band communications, basically. Once you're screwed, you're screwed.
I can conceive of a purely textual format that handles field length, field type, repeaters, etc in a printable format that doesn't need binary information at any stage. Incidentally, that's pretty much what Edifact is.
What I cannot conceive of is a general-purpose textual messaging regime that doesn't require a specialised toolset (ie a parser) to analyse it. If I am correct in my assumption, that rather blows the text-vs-binary argument out of the water.
Admin
One other interesting consequence (inspired by your comment) is that people just won't stop there.
In a weird way, which I am afraid I will not defend or explain, you get fixed into a binary protocol. Changes take aeons.
With (say, and I'm not picking on it, except that it's a good example) XML?
You have "namespaces." I've never been quite sure how those work. Or more precisely, I've never been quite sure how to download them and analyse them. Or more precisely, I'm not quite sure how any given DOM will use them. They're a massive and as far as I can see random abstraction.
Or take XAML. Microsoft (my employer) is very fond of XAML.
XAML is a disgrace and should be put to death right now.
But it isn't just XAML. SOAP is an excellent concept that IMHO is wrecked by its dependence on XML. It's just plain butt-ugly, and I wouldn't care, but I've had occasion to recommend a "best of breed" SOAP parser, and you know what? I don't think there is one.
Or then there are things like design/architecture frameworks, like the one I was forced to use about four years ago. Pure XML. Pure text. And still, what with the silly notational qualifiers, the immense verbosity, and the fact that unless you actually bought The Official Parser from the Only Company That Used This Stuff, you were basically screwed.
It doesn't matter whether it's binary or textual in the end. Either way, you are going to need tools to parse and understand it.
The problem with "textual" formats like XML is that an architecture astronaut is going to be able to persuade your PHB that "it's easy to extend! Just load it up in Notepad!"
See, I've been in the data-comms field for 25 years or so, and believe me: this is not how it works.
Admin
That's why I'm such a big fan of TLV encoding, you've got your control data (the 'TL') and your payload (the 'V') and there's no way you can confuse one for the other. Well, barring pathological cases where the 'TL' portion gets corrupted, but you can't, for example, have something in the 'V' that screws up the processing of the surrounding 'TL's (or, in the case of security-related stuff, attack your control plane using your data plane as you can with XML). In addition since the 'V' is separated from the 'TL', the en/decoder doesn't have to worry about what's in there, unlike all of the in-band formats. There's one fairly definitive text on a particular aspect of XML that spends a couple of hundred pages discussing nothing more than how you deal with this problem, and never really resolves it beyond "well fscked if we know what to do here". Replace the encoding with TLV and all that gets reduced to a single line of "make sure you encode your T and L correctly".
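A minimal TLV sketch in Node (tag and length sizes are arbitrary choices for the example) showing the point above: the decoder reads the length and then treats the value as opaque bytes, so nothing in the payload can be mistaken for control data.

```javascript
// Minimal TLV (type-length-value) encoder/decoder sketch.
// The decoder never scans the payload for delimiters or escapes,
// so payload bytes can never corrupt the surrounding control data.
function tlvEncode(type, value) {
  const v = Buffer.from(value, 'utf8');
  const header = Buffer.alloc(3);
  header.writeUInt8(type, 0);        // T: one-byte tag
  header.writeUInt16BE(v.length, 1); // L: two-byte big-endian length
  return Buffer.concat([header, v]); // V: raw payload, never inspected
}

function tlvDecode(buf) {
  const type = buf.readUInt8(0);
  const length = buf.readUInt16BE(1);
  return { type, value: buf.subarray(3, 3 + length).toString('utf8') };
}

const msg = tlvEncode(1, '<evil>&data;</evil>'); // markup is just bytes here
console.log(tlvDecode(msg)); // { type: 1, value: '<evil>&data;</evil>' }
```

Contrast with XML, where the same payload would need entity escaping or CDATA wrapping before it could travel safely inside the document.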
Admin
I think you mean "Postel's rule". Posten's rule is "if it's big and fragile, drop it; if it's small and fragile, put heavy things on top of it".
Admin
ANNOUNCEMENT: This site has been rebranded TOWTF (The Occasional Worse Than Failure). Christmas (sorry, "Holidays") is no excuse.
That is all.
Admin
And I don't know what this "resist change" schtick is all about. For about half of that time I've been surrounded by XML, with or (shudder) without schemas, so at this point I would regard JSON -- which is nicely designed -- as a change for the better.
Sometimes us old people are just in the unhappy position of seeing the same old crap being built up once again with a shiny new silver bullet factory attached.
(For a poster-child example of where binary encoding goes wrong, btw, see IBM's SNA architecture. It's truly vile. I'm convinced that not even anybody at IBM actually understood it.)
Admin
JavaScript was invented for extending HTML. It's the most widely supported way to extend HTML. If I want to extend HTML, JavaScript is the obvious choice. And since I don't need to extend HTML all that often, I don't want to have to become a JavaScript expert just to do simple things.
I'm finding it hard to care if JavaScript is good at other things, because I have other choices for other things. I'd be willing to try JavaScript for server-side code, given how wonderful you've apparently found it, but if it didn't work well for server-side code and my choices were then limited to Python, Ruby, Perl, PHP, and every other programming language in existence, I wouldn't be upset.
That said, the next time I need to mess with JavaScript, I will definitely take a longer look at the resources you mentioned. Thank you for the pointers.
Addendum (2011-12-29 13:44): I screwed up. The quote here is actually from this comment by Marnen Laibow-Koser. Sorry, L.!
Admin
That doesn't mean he is wrong. If you think he's wrong, tell us why; don't just be an asshole about it, though.
Admin
Piss off. If he's not releasing content on his website to your schedule, then go make your own. (Why am I in such a shitty mood today?)
Admin
Why am I an evangelist just because I like something? The commenter said JSON was bad because who needs to look at the XML, and complained about data transmission. JSON does not use XML, so that comment didn't make sense. JSON also has a low wrapper-to-data ratio, so the data-transmission comment didn't make sense either. He actually said he wanted to transmit data in a binary format, which I really don't get: isn't his website in HTML?
I mostly do back end programming, mainly when I need to deal with JSON it's tracing webservice calls in Firebug.
Admin
Newsflash: there are better XML editors than Notepad. I usually open XML in Visual Studio, since I already have it installed and it does the XML beautification automatically. You can also use Internet Explorer for this purpose if you don't have any development tools other than Notepad.
Admin
No, you don't open it in some other development tool and pretty print it. You just look at it in Firebug as you're viewing the site that generated it. Or in Chrome's equivalent. Maybe some of you aren't comfortable working with Firebug or browser developer tools. You can view the Post to the web service, view the Response as text, or view the JSON evaluated object where you can drill into the object exposing named members.
Admin
But by requiring a specialised parsing tool you've now made XML just as "human-readable" as a binary format like ASN.1. I can open any binary ASN.1 file in an ASN.1 editor and get a nice pretty-printed human-readable, freely editable text presentation of the data, and I don't need to use XML (with its bloat and overhead) to do it. This gets back to my previous point that unless you're very, very careful with your definition, either everything is "human-readable" or everything isn't "human-readable". In this case requiring a custom XML parser to make it human-readable is no different from requiring a custom ASN.1 parser to make it human-readable.
Admin
Whether it's text or not, though, it's still a binary format. Text just spares us needing to make a customized viewer in order to inspect the file contents.
Admin
Yep. IE is a "specialised parsing tool" indeed. It just happens to be one that is not only installed on literally every single Windows machine, but is embedded into the Windows operating system itself. And has been for 10 years+. Tell me, do you use Notepad to browse HTML sites? No? You accept that a browser can usefully convert SGML dialects found on the web into human readable content for HTML 4.01, but for some reason reject it for XML? Why?
Admin
I have never used ASN.1. Does it have minification and gzip compression tools available if the issue is data bandwidth? Those two are easy to implement with text based web communications, and I have to wonder what extra efficiency there would be in a binary protocol vs minified gzipped text. I also wonder how you would ever get a browser to communicate over ASN.1 in the first place, since browsers only go over HTTP. We were talking about using it as opposed to JSON or XML in a browser talking to a site right?
Admin
Eh... so Microsoft fed the trolls...
Never thought I'd hear anyone say they used Silverlight... Congrats, Hoodaticus, best abstract defense move ever --
Admin
Pure epic win... You, sir, get my letatio (captcha)
Admin
You squishy human not smart like us orks... We use JSON, send it gzipped, and tadaam... as much network impact as gzipped binary... and eat my shorts, fool. And it's still readable once unzipped (all in all an automated native conversion from text to binary, unless you didn't realize --)
Admin
Well, the archetype of the behavior you are discussing has been around since the dawn of time... we call it stupidity.
Glad for you that you got to work during the bubble with some of the less tech-savvy fools who tried to surf the wave of doom --
Admin
Obviously you fail... Dutch it was, which is why he said (in Dutch) "yeah right, you can laugh". My Dutch-to-English sucks, dKb, but it might still beat Google.
What "pij" means in Dutch: http://translate.google.com/#nl|en|pij
CAPTCHA: conventio
"Welcome to Conventia, the Convention Hall Planet !"
Admin
As I pointed out, text-vs-binary is a retarded argument when speaking web, as the only thing that really matters for data transfer is the amount transferred. Just using the default gzip will bring any solution to roughly the same size, although JSON will probably be best as it's quite simple to start with.
Not requiring a parser is impossible: either you or the application requires one to understand the message anyway, and even if the app doesn't use a parser it'll still be using a deserializer or something of that kind.
Admin
Nice fake quote... A post by me saying JavaScript is not actually a piece of crap? Impossibru!
Admin
LoseThos anyone ?
Admin
ping zunesis