• (cs) in reply to PKman
    Anonymous:
    Yeah, a giant step off the edge of the f*ckin world

    You're lacking a few XML-related references. I was quoting Erik Naggum, who once stated:

    XML is a giant step in no direction at all.

    ;-)

  • Adam (unregistered)

    This reminds me of that Seinfeld episode where Kramer gets a vasectomy.  Whose brainchild was this?

  • John Hensley (unregistered)

    XML is an interchange format, people!   IN-TER-CHANGE!!!!!

    The real WTF will come when this company decides to make XML dumps of its database that contains XML text, at which point the developer responsible may or may not remember to escape the angle brackets. It'll look rather WTF either way.

  • (cs) in reply to johnl
    johnl:

    Someone mentioned JSON.  I've never heard of JSON before now, and I doubt many other people have either.  So why should I spend a lot of time implementing JSON into my application as, say, an export format, when no-one can import it?


    Because you can just visit http://www.json.org/ and grab the library for one of the 22 officially supported languages? By the way, you mentioned AJAX. JSON is a strict subset of Javascript. Need I say more?

  • (cs) in reply to johnl
    johnl:
    Name some superior alternatives to XML that were around before XML came on the scene, and we can tell you why we're using XML rather than those alternatives.  Even if that's just because no-one knew about the alternative (see my comment on JSON)


    Lua. Yes, it's a full-fledged programming language, but data description is one of its main uses. You didn't know about it? Your loss.

  • suresk (unregistered) in reply to sammybaby
    sammybaby:

    I swear to god, I made a joke about doing just this very thing on Slashdot today.

    My joke was arguably worse, though, as the "data" took the form of serialized objects instead of xml fragments.



    Ummm... Did we work for the same company? The very first "real" job I had, I ran into that exact same issue. It saved the original programmer about an hour of programming, but any mass edit to the database took forever. Grab row -> Unserialize -> Check Conditions -> Update Object -> Reserialize -> Update. WTF???
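
    A hypothetical sketch of that round trip (the db helper and column names are invented for illustration; JSON.parse/stringify stand in for whatever serializer the original code used):

        // One full fetch/unserialize/reserialize cycle per row -- the
        // opaque blob column forces all the work out of the database.
        const rows = db.query('SELECT id, blob FROM accounts');        // grab row
        for (const row of rows) {
          const obj = JSON.parse(row.blob);                            // unserialize
          if (obj.status === 'trial') {                                // check conditions
            obj.status = 'expired';                                    // update object
            db.execute('UPDATE accounts SET blob = ? WHERE id = ?',    // reserialize + update
                       [JSON.stringify(obj), row.id]);
          }
        }
        // With real columns, the whole loop is one statement:
        //   UPDATE accounts SET status = 'expired' WHERE status = 'trial';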

    I've also run into string parsing in MySQL, and then joining on the results.  A PITA to use or debug.  Operating a DB should require a license.
  • (cs) in reply to PKman

    CSV was a good idea before PKZip came along.

    Er... I'm not sure if we're talking at cross-purposes here, but isn't PKZip a compression utility, and one that you have to pay for, judging by www.pkware.com?  If I want to transmit data in plain text, why do I need a binary compression format?

    XML has served to create a need for humungous disk drives... lots of memory for parsers to "work".  For providing a standardizable way of presenting data for printing, i.e., Markup repeat Markup... I haven't seen the use for it.  I've been trying to climb the mountain to get the message... but sorry, I'll just keep going to a different church.

    Fair enough.  I'm not a pro-XML fanatic, by the way, but the fact that I'm not an anti-XML fanatic seems to make you think I am.  I've already described some very good uses for it, and there are some more that I didn't mention.  For example, InstallShield now allows you to store your installation project as an XML file.  This is pig-slow, but it has benefits - you can do text comparisons to see what's changed (great for source control), source control systems store the file as deltas, and you are able to use XML-compatible languages to edit the project programmatically (for example, I use this to set the installation version number and file paths in my NAnt builds).

  • (cs) in reply to johnl

    Why use PKZip? Well, you wouldn't, per se. You'd use a compression library. Personally, zlib works well for us - we have both C++ and Java implementations of that.

    Why use compression? Because that XML data is so bloated, and running it through compression will reduce its size by a factor of at least 10, quite possibly 100 on large chunks.

    Why reduce it? I don't know about you, but we have customers who have some relatively slow network links (e.g. out to Guam), and they really don't appreciate having those links saturated with XML bloat. It's also actively quicker to take the XML, compress it, UUEncode it to be safe from non-8-bit-clean links, transmit it and reconstitute it at the far end than to send it uncompressed.
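
    A minimal sketch of that pipeline, assuming Node's built-in zlib module (the posters are using C++/Java zlib bindings, and Base64 stands in for UUEncode here):

        const zlib = require('zlib');

        // Bloated-but-repetitive XML compresses extremely well.
        const xml = '<orders>' +
            '<order id="1"><item>widget</item></order>'.repeat(1000) +
            '</orders>';

        // Sender: compress, then armour the bytes for non-8-bit-clean links.
        const wire = zlib.deflateSync(Buffer.from(xml, 'utf8')).toString('base64');
        console.log(xml.length, wire.length);   // tens of kilobytes down to a few hundred bytes

        // Receiver: reverse both steps to reconstitute the XML.
        const back = zlib.inflateSync(Buffer.from(wire, 'base64')).toString('utf8');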

    It also takes less space inside a database (and the XML is not intended to be indexed).

    You also seem to be confused about compression in general. A good binary compressor works by spotting patterns in its input - the more patterns, and the longer, the better. Being text is a major pattern, which is why PKZip is so good at compressing textual data.

  • (cs) in reply to Bellinghman

    @felix:  Clearly you do need to say more, because I don't get what you're driving at.  I'm not going to visit the site if I don't know about it, am I?  And AJAX uses XML, so the fact that JSON is a subset of Javascript doesn't make any difference because AJAX already has a data representation system.  Maybe you're saying that JSON would have been a better alternative to AJAX, in which case you might be right.  But as I pointed out earlier, acceptance is an important part of why XML is popular - everyone uses it because the people who they'll exchange data with are using it.

    @Bellinghman:

    Why use compression? Because that XML data is so bloated, and running it through compression will reduce its size by a factor of at least 10, quite possibly 100 on large chunks.


    Ok, you've just killed the data interchange thing for a lot of people because you're now using a binary format, which is not guaranteed to be compatible across all systems.  ASCII and Unicode are compatible across almost all computer systems, though Unicode may have a lower rate of success in this regard.

    It also hurts most source control systems because they don't store binaries as deltas, so compressing your data isn't a good idea there either (it's usually better to leave it uncompressed so the system can store it as a delta, which, believe it or not, is smaller than a full revision).

    And you've also just decided that, once you've exported your data, no-one should be able to do anything with it other than import it into the destination system.

    Say, for example, you want to transfer data between two systems with different schemas.  You can export your data as XML, run a transform on it to convert it to the schema the destination system uses, then import it into the destination system.

    You also seem to be confused about compression in general. A good binary compressor works by spotting patterns in its input - the more patterns, and the longer, the better. Being text is a major pattern, which is why PKZip is so good at compressing textual data.

    I never said anything that disagrees with this.  The fact is that the compression results in a binary stream.  Key word:  Binary.  Not text.  Binary.  Sometimes people need to work on their files as text.

    The only thing I can think of that would make your argument make sense is if you were saying that you should compress the data during the transmission, then uncompress on the other side.  That's fine, as long as the destination has access to the same compression algorithm, and yes it probably would help if you're transferring over a slow network.

  • (cs) in reply to johnl

    Oh, and you've also said that no-one should be able to compare the data with standard comparison utilities like WinMerge.

  • (cs) in reply to johnl
    johnl:
    @felix:  Clearly you do need to say more, because I don't get what you're driving at.  I'm not going to visit the site if I don't know about it, am I?  And AJAX uses XML, so the fact that JSON is a subset of Javascript doesn't make any difference because AJAX already has a data representation system.  Maybe you're saying that JSON would have been a better alternative to AJAX, in which case you might be right.  But as I pointed out earlier, acceptance is an important part of why XML is popular - everyone uses it because the people who they'll exchange data with are using it.

    Ajax doesn't "use" shit; ajax is already an abused and borderline retarded buzzword, so please don't start spawning even more retarded buzzwords just 'cause you don't want people to use JSON in AJAX.

    Fact is, you're too late anyway: people are using ajaxy methodologies with JSON, raw HTML or even raw text as a data transport layer already.

    Oh, and JSON is not "an alternative to AJAX" btw, it's an alternative to XML as a data transport layer in scripted asynchronous communications.

    Oh, BTW, most Yahoo ajaxy services use JSON already, not XML...

    Now if you can be bothered with at least somewhat thought-out opinions on the subject, PPK from QuirksMode posted two interesting pieces with even more interesting comments: The AJAX response: XML, HTML or JSON and The AJAX Response - Part 2

  • Sooper Doody (unregistered) in reply to Frank Cassata

    Unfortunately, jamming XML everywhere in the database is a trend where I work.  It's a total nightmare to deal with this data.  Please, someone send me the toe attachment for my shotgun.  People who think XML should be "jammed" everywhere should be shot.

    - Sooper Doody!

  • (cs) in reply to masklinn

    Ajax doesn't "use" shit; ajax is already an abused and borderline retarded buzzword, so please don't start spawning even more retarded buzzwords just 'cause you don't want people to use JSON in AJAX.

    Do you have trouble reading or something?  I don't know anything about JSON.  That's the friggin' point.  If people don't know about an interchange format, then it's friggin' useless as an interchange format.  I haven't decided that I don't like JSON, just that it's only an alternative to XML (at least, what XML was intended to be) if people know about it.  So far, not many people know about it.

    Oh, and JSON is not "an alternative to AJAX" btw, it's an alternative to XML as a data transport layer in scripted asynchronous communications.

    Explain that to felix, since he seems to think JSON is an alternative to AJAX.  Anyways, see above.

    Now if you can be bothered with at least somewhat thought-out opinions on the subject, PPK from QuirksMode posted two interesting pieces with even more interesting comments: The AJAX response: XML, HTML or JSON and The AJAX Response - Part 2

    It's off-topic anyway, I'll look at it if and when I decide that I'd need to do something like this.

  • (cs)

    all I can say is... wow

  • (cs)

    I had a client once who demanded a database be scalable, capable of handling any number of user field additions, and run smoothly for their nationwide user base on a machine with a tiny processor and less than a gig of RAM.   I left; maybe this guy came in :)

  • (cs) in reply to masklinn
    masklinn:
    johnl:
    @felix:  Clearly you do need to say more, because I don't get what you're driving at.  I'm not going to visit the site if I don't know about it, am I?  And AJAX uses XML, so the fact that JSON is a subset of Javascript doesn't make any difference because AJAX already has a data representation system.  Maybe you're saying that JSON would have been a better alternative to AJAX, in which case you might be right.  But as I pointed out earlier, acceptance is an important part of why XML is popular - everyone uses it because the people who they'll exchange data with are using it.

    Ajax doesn't "use" shit; ajax is already an abused and borderline retarded buzzword, so please don't start spawning even more retarded buzzwords just 'cause you don't want people to use JSON in AJAX.

    Fact is, you're too late anyway: people are using ajaxy methodologies with JSON, raw HTML or even raw text as a data transport layer already.

    Oh, and JSON is not "an alternative to AJAX" btw, it's an alternative to XML as a data transport layer in scripted asynchronous communications.

    Oh, BTW, most Yahoo ajaxy services use JSON already, not XML...

    Now if you can be bothered with at least somewhat thought-out opinions on the subject, PPK from QuirksMode posted two interesting pieces with even more interesting comments: The AJAX response: XML, HTML or JSON and The AJAX Response - Part 2



    Bad day at the office pal?

    sincerely,
    Richard Nixon
  • (cs) in reply to Richard Nixon

    Sorry, missed this bit:

    Oh, BTW, most Yahoo ajaxy services use JSON already, not XML...

    Then it's not really AJAX, is it?  Asynchronous Javascript And XML.  Note XML.  Essentially, they've used Javascript (including JSON) to reproduce what AJAX does.  Actually, thinking about it, Gmail also uses Javascript to transfer data, so it might be JSON too.  Still doesn't in any way invalidate what I said, though.

  • (cs) in reply to johnl

    "The only thing I can think of that would make your argument make sense is if you were saying that you should compress the data during the transmission, then uncompress on the other side. That's fine, as long as the destination has access to the same compression algorithm, and yes it probably would help if you're transferring over a slow network."

    By George, I think you've got it!

    Yes, of course you need the same algorithm at compression and decompression points. Why do you think I mentioned that we went for zlib? As I said, we've got it in both C and Java implementations.

    For cases where you're moving XML data between endpoints under your own control, compression/decompression can be an enormous optimisation. That includes going over non-local network links and storage for later use.

    This is why XML bloat isn't as much of a problem as it could be.

    Note - some network links, databases and file systems may do automatic compression. In such cases, you will not want to bother doing a second level of compression. But I'll assume that you're experienced enough to know the usual caveats about optimisation.

  • (cs) in reply to Bellinghman

    Yeah, I can see that's why you went for zlib.  Though is that algorithm available to other languages?  I seem to remember it being available for C#/.NET, but what about, say, Ruby, Python, and so on?  (Just asking since I'm not sure)  Even if it is available for other languages, you can still get into situations where they can't/won't use it because they'd have to uncompress the data.  You can do that on the command line, but if this is meant to be an automated system, then they'd have to find a way of getting that system to call the uncompression command.  If they receive data from other sources, some compressed, some not, some compressed with different algorithms...  That could be painful.  And if the system is off-the-shelf, then they could well be screwed.

    Besides, you still have to create that textual data.  It could be a JSON packet or an XML document or whatever - you want to zip it.  But that's not what's under discussion here, hence my original confusion.

  • (cs) in reply to johnl

    For the particular case of AJAX, where you're sending XML to a browser, you can generally reduce bandwidth dramatically by simply using mod_gzip (or gzipping the XML in your program if that makes more sense... just make sure to check the Accept-Encoding: header to see if the browser can take gzipped data).
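
    A sketch of the gzip-it-in-your-program variant, assuming a Node-style HTTP handler (mod_gzip does the equivalent inside Apache):

        const zlib = require('zlib');

        // Gzip the XML response only when the browser advertised support for it.
        function sendXml(req, res, xml) {
          res.setHeader('Content-Type', 'text/xml');
          const accepts = req.headers['accept-encoding'] || '';
          if (/\bgzip\b/.test(accepts)) {
            res.setHeader('Content-Encoding', 'gzip');
            res.end(zlib.gzipSync(xml));
          } else {
            res.end(xml);   // client can't take gzipped data, so send it plain
          }
        }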

  • (cs) in reply to johnl
    johnl:
    Yeah, I can see that's why you went for zlib.  Though is that algorithm available to other languages?  I seem to remember it being available for C#/.NET, but what about, say, Ruby, Python, and so on?  (Just asking since I'm not sure)  Even if it is available for other languages, you can still get into situations where they can't/won't use it because they'd have to uncompress the data.  You can do that on the command line, but if this is meant to be an automated system, then they'd have to find a way of getting that system to call the uncompression command.  If they receive data from other sources, some compressed, some not, some compressed with different algorithms...  That could be painful.  And if the system is off-the-shelf, then they could well be screwed.

    What cave have you been living in for the last 15 years? There are no major (large) computers that do not have some form of zip available. Even when not installed by default, odds are the user installed it himself because it is so common.

    Okay, the computer in your microwave is unlikely to have zip (though I wouldn't be surprised if it was there for some development purpose and never removed), but those computers don't do network communication. Likewise you might find a PDP-8 still in use somewhere without zlib. Nothing reasonably current is without it.

    Python has zlib. Ruby has it. Included in the default install (of current versions) because it is so common and useful. If you Googled your favorite language and zlib, I'm sure you would find it.

    We are not talking about something new or controversial; we are talking about zlib. Easy to use by any competent programmer - from their program. If your protocol is well designed at all, it will be easy to tell the other end to uncompress the data first.

  • (cs) in reply to felix
    felix:

    Because you can just visit http://www.json.org/ and grab the library for one of the 22 officially supported languages? By the way, you mentioned AJAX. JSON is a strict subset of Javascript. Need I say more?

    I think the real problem here is that thus far, nobody has really been able to satisfactorily justify a dislike of XML in favor of JSON. So let's run the issues, eh?

    Size: XML could be said to be larger due to a larger set of descriptors and such. However, I think that in practice this would rarely be the case. The point of XML is to provide a description of data, and JSON contains, essentially, those same descriptive elements. The only real difference in size terms is that JSON uses slightly smaller sets of symbols (like brackets and parentheses instead of <xxx> and </xxx> and such). However, the difference in any reasonable set of data is going to be small, and with compression such as gzip encoding to browsers, the difference in size is going to be virtually non-existent.
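
    For concreteness, a made-up record both ways; JSON saves mostly the closing-tag overhead, and gzip flattens much of the remaining difference:

        <user><name>Leen</name><location>TDWTF</location></user>

        {"user": {"name": "Leen", "location": "TDWTF"}}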

    Ease of use: JSON is significantly easier to use in Javascript, as all you have to do is eval the thing and you get a Javascript object back, which you can then reference directly. This is a massively huge security hole, however, so you need to use a bit more Javascript with some regular expressions to *correctly* parse the thing. For non-JS languages, you can get libraries and such to do that parsing for you. However, XML is much the same in this respect, since all modern browsers have fairly easy-to-use XML parsers built right in, so really it's not all that complicated, using getNodeValue's and such. All modern languages have XML libraries available too, so there's also not a lot of benefit there either.
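
    A side-by-side sketch of that paragraph (jsonText and xmlText are hypothetical payloads; DOMParser is the Mozilla-style parser, and IE of this era needs its ActiveX equivalent instead):

        // JSON: eval and reference the result directly.
        // (See the note further down about why raw eval is unsafe.)
        var data = eval('(' + jsonText + ')');
        var userName = data.user.name;

        // XML: parse, then walk the DOM with accessor calls.
        var doc = new DOMParser().parseFromString(xmlText, 'text/xml');
        var userName2 = doc.getElementsByTagName('name')[0].firstChild.nodeValue;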

    Readability: Both JSON and XML are more or less human-readable, although it can be argued that JSON "looks cleaner" because of the lack of all those greater-than and less-than brackets around everything. It also more directly represents the underlying data structure. This is really a matter of taste more than anything else, and considering that modern browsers render XML into a hierarchical display with expand/collapse links by default, XML really wins on this score. Not to mention that you can use an XSL transformation to render the XML in a pretty format for those cases where you're actually loading it into a browser, while leaving it unchanged for those cases where you're not loading it into a browser.

    Speed: It could be argued that JSON is faster to parse in Javascript, which is to an extent true. The format itself fits the language guidelines directly, so it's a simple matter of an eval. However, what with the XML parsers built into browsers being usually compiled into native code, and what with the regular expressions needed to make eval'ing a JSON file safe, I'd say that the difference is not as wide as you'd think. I'd want to see actual profiling before making a determination there, and would not consider this to be universally true.

    In the end, it's a matter of preference. While it's true that Yahoo now offers most of their webservices using JSON, this is relatively new (>3 months old, I think), and it's offered as an alternative to their more traditional XML webservices. Basically you can get the same data either way. They have switched to using JSON on some (not all) of their own apps for speed reasons, and that makes sense for those apps. But that's not going to hold true for all applications.

  • Integration Nation (unregistered) in reply to Sooper Doody

    People who put XML in databases are usually the same people who have NEVER, EVER had to work with OLAP tools or reporting engines.

    The day you can click-click-click inside Crystal to extract, aggregate and summarize XML-embedded data within a database, AND it is also reasonable and easy to use by any novice-level analyst, will be the day my penis turns blue and falls off.

  • (cs) in reply to hank miller

    But not just any old thing will do.  Ok, so zip is pretty damn common, but that doesn't mean your destination system can use it.

    Python has Zlib. Ruby has it. Included in the default install (of current versions) because it is so common and useful. If you used google on your favorite language and zlib, I'm sure you would find it.

    I thought it might.  I just wasn't sure.

    We are not talking about something new or controversial, we are talking about zlib. Easy to use by any competent programmer - from their program. If your protocol is well designed at all, it will be easy to tell the other end to uncompress the data first.

    Ok, smart-arse, I'm running your destination system.  You send me data compressed with zlib.  Can you think of the possible scenarios?  Tell you what, I'll list a few for you, save you the trouble:

    1. I say "it's a closed source 3rd party system, and it doesn't accept compressed data".
    2. I don't have time to adapt my system, even if it is a trivial change.  Likewise, I don't have time to take a critical system offline to update it just because your network is slow.
    3. Some of the people sending us data don't compress it (they have super-fast magical networks that don't die horribly at the first sight of a few lines of text, the lucky devils).  We now need to keep track of who compresses their data and who doesn't.  Which we don't have the time for.
    4. Someone else gets in on the act, only they don't like the ZIP format - they want to use LHA (or some other compression algorithm) instead.  Maybe we could support multiple formats, but that just compounds points 2 and 3.
    5. My system isn't transferring over a slow network - maybe it's purely local (say, from one database to another, or one application dumping some XML for another to pick up).  So why do I need this extra step to compress/decompress data?
    6. Maybe I'm not transferring the data anywhere - maybe I'm converting it to XML to work with in the same system simply because that's easier to work with in my situation than leaving it a binary format.  So, again, why is the compression/decompression step needed?

    I can't understand what the problem with this concept is.  I'm not even arguing against the idea that zlib is useful in this context - I'm sure it is.  I'm telling you (something that you should already know) that it isn't always necessary or feasible to implement that.  I'm also trying to point out that, whether you zip it or not, XML is still a viable choice for working with data as text.  It may not be the fastest, or the most readable (though I don't have any issues with it), or the smallest, but it's there, it works and it's well-supported by many applications.

  • (cs) in reply to Integration Nation

    People who put XML in databases are usually the same people who have NEVER, EVER had to work with OLAP tools or reporting engines.

    Or maybe they just don't have to on that project, or even that part of the project.  Truth be told, the system we're working on (close to release) uses both Crystal Reports and SQL Notification Services.  The server-side application will dump a piece of XML in the notifications database to be picked up by the client application.  No OLAP tool or reporting engine ever goes near it.  The message only exists in the database until the client picks it up.  It's not something you'd want to report on.

    The advantage of using XML in this case is that we can use whatever schema we want.  Some of these messages have images embedded in them (specifically, a fragment of a map around a specific point), some don't.  Since Notification Services is an off-the-shelf system, it would be impossible for the developers of it to know what schema should be used (should it have a binary field for an image or not?), so a dynamic schema is the way to go, and XML is a viable way to do that in this case.

    However, think carefully before embedding XML in databases - it's usually not a good thing to do.

  • (cs) in reply to johnl
    johnl:
    Then it's not really AJAX, is it?  Asynchronous Javascript And XML.  Note XML.  Essentially, they've used Javascript (including JSON) to reproduce what AJAX does.

    God... I don't even want to bother with that, so I'll just quote PPK and be done with it:

    On the XML side of the debate I noticed one fallacy: the fact that the name AJAX has "XML" in it. Although some say this means that XML is the "best" output format, I fully side with their opponents. The name AJAX has been badly chosen, and although it's far too late to turn back the clock and pick a better name, please remember that the phrase was coined by a non-technical person who wanted to point out a useful trend in JavaScript, and not by someone who wanted to lay solid technical foundations for this trend.

    Therefore, the fact that "AJAX" has an X for XML doesn't mean anything. In fact, this whole discussion is meant to see if the X is useful or not.

    (emphasis mine)

  • (cs) in reply to masklinn

    Well, that explains that one then.  Care to look at any of the other points, or is it all just too much trouble for you?

  • (cs) in reply to johnl
    johnl:
    Well, that explains that one then.  Care to look at any of the other points, or is it all just too much trouble for you?

    Since I don't know what "the other points" are, I'll just go through every post of yours since my previous one.

    johnl:
    Ajax doesn't "use" shit; ajax is already an abused and borderline retarded buzzword, so please don't start spawning even more retarded buzzwords just 'cause you don't want people to use JSON in AJAX.

    Do you have trouble reading or something?  I don't know anything about JSON.  That's the friggin' point.  If people don't know about an interchange format, then it's friggin' useless as an interchange format.  I haven't decided that I don't like JSON, just that it's only an alternative to XML (at least, what XML was intended to be) if people know about it.  So far, not many people know about it.

    JSON is a lightweight data serialisation and interchange format. Its name is the acronym of JavaScript Object Notation, because it's merely the literal notation of object definition in Javascript. It should also be noted that JSON is at the same time a notation for dictionaries/maps in some other languages (e.g. Python). It can also use Javascript's Array literal notation for series of data. Here is an example of JSON data:

    {"user": {
            "surname":"Mask",
            "name":"Leen",
            "location":"TDWTF",
            "characteristics": [
                "Information Technology Pervert",
                "Troll"
            ]
        }
    }

    One of the great advantages of JSON (on top of being extremely easy to generate) is the fact that Javascript's eval is theoretically all that's required to parse it, making it extremely efficient in parsing time and in using the data itself (since it's used in a regular JS-object access way, and not through a "third-party" interface). In practice, using eval on raw JSON is unsafe, and using the parsing function provided by JSON.org is much safer (it merely uses REs to scan the syntax of the JSON string and returns false if an error is detected. The package also provides functions to serialize Javascript objects in order to send them to the server).
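
    A simplified sketch of that validate-then-eval idea - the character class is borrowed from the json.org parser, but this is only the shape of it, not the actual library code:

        // Strip string literals, then refuse to eval if anything other than
        // plain literal syntax remains in the skeleton.
        function parseJson(text) {
            var skeleton = text.replace(/"(\\.|[^"\\])*"/g, '');
            if (/[^,:{}\[\]0-9.\-+Eaeflnr-u \n\r\t]/.test(skeleton)) {
                return false;   // suspicious character: don't eval this input
            }
            return eval('(' + text + ')');
        }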

    It should also be noted that JSON is a subset of the more complex YAML (and since YAML is the de facto data serialization format of Ruby, Ruby has no trouble parsing JSON data even though the JS object literal doesn't map to any Ruby structure).

    johnl:
    Oh, and JSON is not "an alternative to AJAX" btw, it's an alternative to XML as a data transport layer in scripted asynchronous communications.

    Explain that to felix, since he seems to think JSON is an alternative to AJAX.

    I couldn't conclude that from what I saw; what I read in Felix's posts was that AJAX heavily uses Javascript, and that JSON is native Javascript, therefore a logical data interchange format.

    That's the only post of yours in which I found things I could/should answer to. Anything else needed?

    Otto:
    Ease of use: JSON is significantly easier to use in Javascript, as all you have to do is eval the thing and you get a Javascript object back, which you can then reference directly. This is a massively huge security hole, however, so you need to use a bit more Javascript with some regular expressions to *correctly* parse the thing.

    Yes, that regular expression to validate the string is available at JSON.org (the final parsing function takes something like 3 lines; they provide a much more complex Javascript-to-JSON serializer, though)

    Otto:
    However, XML is much the same in this respect, since all modern browsers have fairly easy-to-use XML parsers built right in, so really it's not all that complicated, using getNodeValue's and such.

    Using raw JS object notation still is much easier (and usually more readable) than using the DOM interface to extract data from the XML document. Or so I think.

    Nothing to add on your other points.

  • (cs) in reply to johnl
    johnl:

    Do you have trouble reading or something?  I don't know anything about JSON.  That's the friggin' point.  If people don't know about an interchange format, then it's friggin' useless as an interchange format.


    Erm... if you don't know about it, shouldn't you go take a look?

    johnl:

    Explain that to felix, since he seems to think JSON is an alternative to AJAX.  Anyways, see above.


    Read my post again, will you? And remember that XmlHttpRequest can handle data in any format. Even binary. AJAX itself is just a buzzword. It doesn't have to use XML. And if you have Javascript on the receiving end, JSON makes a better format, 'cause it is JS. You can deserialize a chunk with a simple call to eval(). There. Did I really need to explain that?
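
    felix's point, sketched: one XMLHttpRequest, and the response body can be read as a DOM or as raw text ('/data' is a hypothetical endpoint; synchronous mode just keeps the sketch short):

        var xhr = new XMLHttpRequest();
        xhr.open('GET', '/data', false);   // synchronous, for brevity only
        xhr.send(null);

        var asXml  = xhr.responseXML;      // a parsed document, if the server sent XML
        var asText = xhr.responseText;     // raw text: JSON, HTML, CSV, anything
        var asObj  = eval('(' + asText + ')');   // ...if that text happened to be JSON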

  • (cs) in reply to masklinn

    Since I don't know what "the other points" are, I'll just go through every post of yours since my previous one.

    I meant "read my post", so I guess that works.

    JSON is a lightweight data serialisation and interchange format. Its... <snip>

    My point was that JSON isn't as well-known as XML.  Telling me about JSON doesn't change that - though I do appreciate the information, I could find out for myself now that it's been mentioned, if I was really interested in the specifics of the implementation.

    But since I'm looking into Ruby at the moment, the Ruby link is interesting.


    I couldn't conclude that from what I saw; what I read in Felix's posts was that AJAX heavily uses Javascript, and that JSON is native Javascript, therefore a logical data interchange format.

    I read it as "don't use AJAX - use JSON instead!"  Maybe I misread it though.

    That's the only post of yours in which I found things I could/should answer to. Anything else needed?

    Well, I think that's all the ones in direct response to you...

  • (cs) in reply to felix

    Erm... if you don't know about it, shouldn't you go take a look?

    You really don't get it, do you?  Who cares whether I knew?  It's whether lots of other people know.  Lots of other people don't know.  Saying they can go and look is useless because why would they if they don't know about it?

    If you still don't get it after that, then I give up.

    Read my post again, will you? And remember that XmlHttpRequest can handle data in any format. Even binary. AJAX itself is just a buzzword. It doesn't have to use XML. And if you have Javascript on the receiving end, JSON makes a better format, 'cause it is JS. You can deserialize a chunk with a simple call to eval(). There. Did I really need to explain that?

    A later post cleared that up; it's just poorly named.  Though I'm surprised that XmlHttpRequest can handle other formats.  I can imagine JsonHttpRequest and so on, but I would expect XmlHttpRequest to handle, well, XML.  Sorry about that.

    I move that we submit AJAX as a WTF, not because it's bad in terms of functionality, but because of its everything-must-be-prefixed-with-XML-even-if-it-doesn't-use-it naming standard.

  • mcguire (unregistered) in reply to SurfMan
    SurfMan:
    "If you know how to handle a hammer, don't think of everything as a nail...".



    You know what they say, "if all you have is a hammer and a screwdriver, everything looks like a threaded nail."

  • (cs) in reply to johnl
    johnl:
    Erm... if you don't know about it, shouldn't you go take a look?

    You really don't get it, do you?  Who cares whether I knew?  It's whether lots of other people know.  Lots of other people don't know.  Saying they can go and look is useless because why would they if they don't know about it?


    Why would they go and look if they don't know about it? So they learn something new and potentially better, that's why. How can you keep up in this business if you don't learn new things all the time?

    And if you mean that you're passing data along to people you don't know, and it has to be in a widely-known format... well... I understand, but then it can't be application-specific, can it? Because you'd have to give them documentation as well. Then you might as well choose any format.

    No, I don't get it... just give me an example.

    johnl:

    A later post cleared that up and it's just poorly named.

    I noticed that after I posted, and I owed you a reply anyway. You're right about the name, but see above.

    johnl:

    I move that we submit AJAX as a WTF, not because it's bad in terms of functionality, but because of its everything-must-be-prefixed-with-XML-even-if-it-doesn't-use-it naming standard.


    Good idea :))

  • (cs) in reply to felix

    Looks at today's WTF, cries

    As for the XML thing, I still don't see why reinventing the s-expression in an awkward and verbose manner is such a big deal...
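
    For anyone who hasn't seen the comparison spelled out, the same made-up record both ways:

        ; an s-expression...
        (user (name "Leen") (location "TDWTF"))

        <!-- ...reinvented with angle brackets -->
        <user><name>Leen</name><location>TDWTF</location></user>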

  • (cs) in reply to felix

    Ok, not quite ready to give up on that point just yet, felix :P

    Take me for example.  Before it was mentioned here I didn't go to the JSON site because I didn't know it existed.  I didn't read any books about it because I haven't seen any.  I didn't read any blogs about it because I didn't see any (I don't subscribe to all the RSS feeds out there or visit all the blogs, and I don't get time to read everything on the ones I subscribe to or visit regularly).  I didn't search google for it because, again, I didn't know it existed and I don't usually search for things that I don't know exist.

    Most likely, you just don't like XML (wait! I'm not trying to flame you - read to the end) and wanted an alternative to it.  Unsurprisingly, a Google search for 'xml alternative' returns JSON as the 4th match.  However, before today I didn't do this search because I didn't see the need to - I find XML quite workable, and I didn't even think, "Isn't there anything else I can use?"

    The problem with JSON isn't so much a problem with the system itself - it's a lack of publicity.  If you want, think of it as VHS vs Betamax.  Not enough people bought Betamax, even though it's widely accepted as the superior system.  But if nobody buys it, then they don't make tapes for it, so anybody buying one becomes even more unlikely.

    I'm not going to debate whether XML is better than JSON or vice versa, but XML has the advantage that everybody knows about it.  If I want to exchange data with someone, they are more likely to be able to do something with XML than JSON because their tools are more likely to support XML already.

    For this to change, JSON's backers need to encourage people to support it as an import/export format; it needs to get itself known.  XML is practically a household name, which is vital for an interchange format.

    Hope that helps

  • (cs) in reply to johnl
    johnl:

    Take me for example.  Before it was mentioned here I didn't go to the JSON site because I didn't know it existed. [...]  I didn't search google for it because, again, I didn't know it existed and I don't usually search for things that I don't know exist.

    I don't even see how you could search for something you don't know exists. I happened upon it myself while looking for something unrelated. I didn't need it back then. But I learned about it because I could see it being potentially useful someday. Don't you do that kind of thing?

    johnl:

    Most likely, you just don't like XML (wait! I'm not trying to flame you - read to the end) and wanted an alternative to it.

    I did read to the end. It's not that I dislike XML. What I dislike is the one-size-fits-all approach.

    johnl:

    I'm not going to debate whether XML is better than JSON or vice versa, but XML has the advantage that everybody knows about it.  If I want to exchange data with someone, they are more likely to be able to do something with XML than JSON because their tools are more likely to support XML already.

    I'm not sure one is better than the other. I wouldn't use JSON for documents with much text and little structure, just as I wouldn't use XML for object serialization. But to choose one over the other based on popularity? It's like buying a Ford (or whatever brand is popular in your country) because all your neighbours did, not because it's the best car for your needs. As for the Betamax, it was popular. Among the professionals ;-)

    johnl:

    Hope that helps

    Yes it does. Let's move on to the next WTF.

  • (cs) in reply to felix

    I don't even see how you could search for something you don't know exists. I happened upon it myself while looking for something unrelated. I didn't need it back then. But I learned about it because I could see it being potentially useful someday. Don't you do that kind of thing?

    Of course I do, I just haven't happened on JSON as of yet.  The chances that unrelated material will link to JSON (and that I'll pay attention to the link - ads just kind of get ignored, for example) seem quite small to me.  I suppose if I was talking about representing data as text, as we are here, or I cast myself as someone particularly interested in the field, someone might mention it.  But conversations such as "How's the weather, oh, BTW, try JSON" aren't all that common.

    I'd be surprised if I was the only one who hadn't randomly come across JSON.

    I did read to the end. It's not that I dislike XML. What I dislike is the one-size-fits-all approach.

    Sorry, I didn't want you to think I was just calling you an XML-hater.  I was trying to say "you hate XML, so you looked for something better", rather than "you hate XML so anything else will do". I was worried that you might just read the first part of the sentence, get pissed off because you thought I was saying the latter, and flame me.

    I'm not sure one is better than the other. I wouldn't use JSON for documents with much text and little structure, just as I wouldn't use XML for object serialization. But to choose one over the other based on popularity? It's like buying a Ford (or whatever brand is popular in your country) because all your neighbours did, not because it's the best car for your needs.

    No it's not.  Buying a car is different because your choice of car doesn't affect anyone else.  If you really want to take the car analogy, it's like driving on one particular side of the road, just because that's what everyone else does.  ;)  (I invite you to drive on the wrong side - whichever is the wrong side in your country - and see what happens.)

    As for the Betamax, it was popular. Among the professionals ;-)

    Professionals aren't the majority.  If the pros use Betamax and everyone else uses VHS, then VHS is more popular.

  • (cs) in reply to masklinn
    masklinn:

    Using raw JS object notation still is much easier (and usually more readable) than using the DOM interface to extract data from the XML document. Or so I think.


    "More readable" I would agree with, but "easier" I would argue against, since the code is usually pretty identical both ways, you're just referencing elements directly with JSON and using functions to reference them with the XML DOM. It's really not all that different in the actual code, although the direct approach is more intuitive and readable.

    But for the case where you're pulling down data and then stuffing bits from it onto a webpage (not an uncommon case), XML is significantly easier and more readable if you are using XSLT to do it, because in that case it's reduced to a few functions. Read in the XML and XSL, run them through transformXML, shove the output to a div somewhere. It takes a bit more thought in the design, but it's also significantly faster in IE (thanks to the transformation running as native code), though it admittedly can be slightly slower in Firefox. For a lot of what AJAX-type stuff tends to do, this fits the model well. Not always, I grant you, and XSL itself is a WTF and a half, but it works once you get used to it.
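
    Hedging slightly, since "transformXML" isn't a standard call: a sketch of that round trip using the Mozilla XSLTProcessor API, with IE's MSXML transformNode as the other code path:

        // Run xmlDoc through xslDoc and shove the output into a div.
        function renderInto(div, xmlDoc, xslDoc) {
            if (typeof XSLTProcessor !== 'undefined') {   // Mozilla path
                var proc = new XSLTProcessor();
                proc.importStylesheet(xslDoc);
                div.innerHTML = '';
                div.appendChild(proc.transformToFragment(xmlDoc, document));
            } else {                                      // IE path (MSXML)
                div.innerHTML = xmlDoc.transformNode(xslDoc);
            }
        }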

    They both work well in different cases, and you really can't say that one of them is better in every case.
  • (cs) in reply to Otto
    Otto:
    But for the case where you're pulling down data and then stuffing bits from it onto a webpage (not an uncommon case), XML is significantly easier and more readable if you are using XSLT to do it

    But you can't. Because only two browsers support client-side XSLT. So you're basically left with 3 choices: not giving a flying fuck about other browsers (duh), implementing everything twice, once with and once without XSLT (with two code paths to maintain), or implementing everything without XSLT.

    I don't quite see the point of the second choice, and the first one would only be acceptable in a controlled environment (LAN/WAN).

  • (cs) in reply to masklinn

    Clarification: by client-side XSLT I mean, in this case, a JS API to the browser's XSLT processor. Safari can apply XSLT files to XML files but doesn't provide any JS API, and Opera 8.5 doesn't provide said API yet, even though I think the soon-to-be-released Opera 9 does.

  • (cs) in reply to masklinn
    masklinn:
    But you can't. Because only two browsers support client-side XSLT. So you're basically left with 3 choices: not giving a flying fuck about other browsers (duh), implementing everything twice, once with and once without XSLT (with two code paths to maintain), or implementing everything without XSLT.

    I don't quite see the point of the second choice, and the first one would only be acceptable in a controlled environment (LAN/WAN).

    True as far as it goes, although there do exist javascript implementations of XSLT (Google's AJAXSLT, for example). I wouldn't recommend them for prime-time use, because they are slow, but they work.

    However, client-side XSLT is in use on a number of different live websites. They simply don't support Safari (or Opera until Opera gets around to 9). So I disagree with the controlled environment notion. For all intents and purposes, the web that can do AJAX and such other functionality consists of two browsers: IE and Firefox. Safari is not supported by lots of websites and it likely won't be until they add that sort of thing. Lots of Mac users just install Firefox anyway because of the lack of Safari support on all the newer websites. Or rather, if Safari works it's because of luck as opposed to intent.

    In any case, Safari is not on my own list of test cases, nor will it probably ever be. Looking through various logs, Safari users comprise less than 1% even on the pure HTML sites I can see logs for. It's simply not a big enough market to worry about, IMO.

  • (cs) in reply to masklinn
    masklinn:
    Joost_:
    On a related note, when is Oracle going to implement XML DOM Level 2? REAL ULTIMATE POWER! select XMLDocument.loadXML(table.xml).selectSingleNode('/root/elem['+$id+'][@attr1]').value from mywtftable as table

    Actually, they'll probably use that XQuery bullshit; the wonderful Tamino database already does it (native XML database for the win, XQuery or XQL requests (with heaps of XPath goodness... the parts that've been implemented in the DB engine, I mean), runs slower than molasses and overall defeats the very purpose of a database).

    But at least you can run XSLs on the raw data you retrieve from the DB without having to do any transformation!

    Anonymous:
    Someone needs Tamino - http://www2.softwareag.com/Corporate/products/tamino/default.asp

    It's an XML database. Yes I've had the misfortune to use it...

    CAPTCHA : bozo - Hmmm....

    Oooh, another guy who's had the (dubious) privilege of "using" Tamino - I'm not alone!

    A European-level project with European news agencies involved...

    ... using an XML database (Firebird?), data processing, server-client communication...

    ... when the client and server code are mostly finished and functioning, they attempt to place the distributed application in production...

    ... when the XML database is holding more than 300 news items, it starts going sllllooooooooooooooooowwwww...

    ... just when they have discovered the problem, they get their financing cut because of the project director's legal troubles...

    ... so they never get to rewrite the whole thing to use a relational database and relegate XML to inter-application communication :(

  • (cs) in reply to masklinn

    Masklinn, you realise that you can apply an XSL on the server, right?

  • (cs) in reply to johnl
    johnl:
    Masklinn, you realise that you can apply an XSL on the server, right?

    Yes, and I realise that this is absolutely not what Otto was talking about, too.

  • (cs) in reply to Dreddlox
    Anonymous:
    That's... not really that bad...
    Seriously, I've done far worse things in SQL, and I've known the language less than 3 months.
    Try UNIONing 3 statements, each 6 lines long, reading from tables with different column names for the same data, and each statement requiring 2 SELECTs in their FROM list because the unindexed lookup table otherwise gets overly linked and sends the execution time to hell... That statement only reached 3 SELECTs deep (the third was so that I could order the whole lot) and I found it relatively easy to maintain weeks after I had written it.

    If you think SQL is bad, try dealing with VB... The only reason my statement was so long was because it'd be agonizing to bring the data back and do all the transformation on the local computer in Excel for Applications... -.-'

    Wow. Are you a masochist or something? :)

    I now refuse to work with anything even barely resembling VB, Windows or Office, because they require that sort of kludge to work around their limitations. Especially Access. Gimme MySQL or Postgres, even if I can only use the command-line client.

  • Norman H (unregistered) in reply to Satanicpuppy

    Amen!

  • KennyBoy (unregistered) in reply to mpswaim

    Sitara?

  • (cs)

    I worked with an e-store package that did this for addresses. It was all fine until we needed to modify the code to search addresses by zip, state, name, etc.

    When we got that requirement and realized the implication, much "WTF" was uttered.

    I work someplace else now :-)

  • theshowmecanuck (unregistered) in reply to GalacticCowboy

    Maybe they were concerned that their database might become over-normalized.  Everyone knows the danger of an over-normalized database, I'm sure.

    I know you were being sarcastic, but in truth, sometimes it can be over-normalized (depending on the application).  That is why people sometimes need to de-normalize a schema... usually when you end up having to do a large number of joins to retrieve your data for a high-demand application (even with the chance of data redundancy).  Data warehousing can be thought of from one perspective (sometimes) as denormalized data.  Yeah I know, it is really 'better optimized for retrieval'... but that is why you would denormalize.

  • Zetetic (unregistered) in reply to osp70

    I am certain that the guy who came up with this plan was immediately promoted to management.

    Good. That way he will do less damage. 
