• David B. Goode (unregistered) in reply to anon

    It's getting harsh in here.

    One thing I've noticed during my programming career is that people have a hard time admitting that they're wrong. Case in point: SliceX

    It makes it easier on everybody involved if you just say, "Yes I used wikipedia to gather my information" and "No I don't know what I'm talking about, I just want to appear enterprisey to the others in the forum".

    Just admit that you were wrong and accept that you're not as knowledgeable as others. I know it's hard, but try.

    Admission and acceptance are both steps on the road to recovery.

  • SliceX (unregistered) in reply to David B. Goode

    Anonymous:
    It's getting harsh in here.

    One thing I've noticed during my programming career is that people have a hard time admitting that they're wrong. Case in point: SliceX

    It makes it easier on everybody involved if you just say, "Yes I used wikipedia to gather my information" and "No I don't know what I'm talking about, I just want to appear enterprisey to the others in the forum".

    Just admit that you were wrong and accept that you're not as knowledgeable as others. I know it's hard, but try.

    Admission and acceptance are both steps on the road to recovery.

    I will accept that I've only been familiar with the phrase for about 6 months. I am currently working on my first project using it, and thus have not used it extensively, although I can honestly tell you that I did not get my information from Wikipedia.

    I read TDWTF--trust me when I say that I believe there are cases of misuse where AJAX makes things worse instead of better. My point has been that, according to the commonly accepted definitions of AJAX (a term still in its infancy), it is not wrong to use the phrase to discuss the original post, and that those who attempted to correct previous comments were doing so based more on semantics than on an understanding of the underlying meaning of the phrase and the discussion at hand.

    For instance the following quote was in reply to a post about compressing XML:

    Just fyi, Javascript is not synonymous with AJAX. So many people think they're the same thing, but AJAX is a little more involved. Shows how much you know dude.

    While the statement is true, the context seems to indicate that the poster is merely reciting something they read on an elitist forum somewhere where everyone just bickers about whether or not it's acceptable to call a directory a 'folder'. Of course AJAX is more involved than just Javascript. Usually it involves XML and the Asynchronous transfer thereof, hence the current discussion about compressing XML.

  • (cs) in reply to too_many_usernames

    too_many_usernames:
    I just had an interesting thought, mostly brought on by the discussions about compressing lots of repeated text due to communicating with structures that are essentially just meta-information (XML tags in this case):

    I wonder if anyone has performed any work in the field of looking not only at bandwidth of various communications mechanisms or data formats, but also the efficiency of various communications and data formats. By efficiency I mean it in the sense of answering the question "how much information do I need to send to obtain the desired result?" If my desired result is to show the string "Hello World" then I would want a communications mechanism that does that with the least communications possible. If I don't care how that message is displayed, my requirements are low. However, if I want to make sure that message is displayed in a certain typeface, with a certain letter size, in a certain position on the imaging device, more information is required - but the question is am I using the minimum communications possible to convey that extra information?

    It is my opinion that, from that standpoint, things like XML are extremely inefficient because the amount of meta-information that is desired could be conveyed with far less communication than is typical. Hence the ability to use compression algorithms effectively on those types of communication. I would posit that a necessary (but not likely sufficient) condition on "most efficient communication" is that it will not be able to be compressed.

    Welcome to 1985.  This is a great example of monolithic thinking.  You don't have to solve every problem on earth simultaneously.  If you are building a data representation system, make it the best data representation system you can.  Don't worry about transmission issues, it's easy enough to deal with that at a lower layer.  The processing power of today's computers makes it unnecessary to worry about 90% of those issues and the need for integration of today's systems makes it undesirable to have one solution try to solve every problem.

    XML isn't very space efficient.  However, it solves a billion data issues that other technologies don't even address.  Things like namespace based extensibility, repetition, and containment are dealt with without tripping over issues like multi-byte character sets, byte order, and compression.  And XML compresses to nearly the same size as a well-optimized alternative.  So all a hand-optimized solution gets you is limitations and a little less processor time.  The only thing XML is bad at is large amounts of binary data.

    By the same reasoning, assembly language is the best programming language, Object Oriented Programming is a bad idea, and HTTP should be replaced by a more efficient binary protocol.
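    The compression claim is easy to check empirically. A minimal Node.js sketch (sample records and field names are invented for illustration) comparing gzipped XML against a gzipped compact delimited encoding of the same data:

    ```javascript
    // Compare raw and gzipped sizes of XML vs a compact pipe-delimited
    // encoding of the same records. Sample data is invented for illustration.
    const zlib = require("zlib");

    const records = Array.from({ length: 200 }, (_, i) => ({
      id: i,
      name: "user" + i,
      email: "user" + i + "@example.com",
    }));

    // Verbose XML representation
    const xml =
      "<users>" +
      records
        .map(
          (r) =>
            "<user><id>" + r.id + "</id><name>" + r.name +
            "</name><email>" + r.email + "</email></user>"
        )
        .join("") +
      "</users>";

    // Compact representation of the same records
    const compact = records
      .map((r) => [r.id, r.name, r.email].join("|"))
      .join("\n");

    const gzXml = zlib.gzipSync(Buffer.from(xml));
    const gzCompact = zlib.gzipSync(Buffer.from(compact));

    console.log("raw:     xml=" + xml.length + " compact=" + compact.length);
    console.log("gzipped: xml=" + gzXml.length + " compact=" + gzCompact.length);
    ```

    On repetitive data like this the raw XML is roughly two to three times larger, but after gzip the two sizes end up much closer; the exact ratio depends on the data.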

  • Kent Boogaart (unregistered) in reply to Joe Blow Ahole

    Moral of story: users don't care how enterprisey the application is, only how usable it is.

  • Anonymous (unregistered) in reply to jsmith

    too_many_usernames:
    By the same reasoning, assembly language is the best programming language

    Actually, I find assembly to be a bit too abstracted. I prefer to design my own single purpose microprocessors or at least write raw binary ;) .

  • (cs) in reply to Roger

    Anonymous:
    merreborn:
    I worked on several projects in an AJAX framework in 2001, even though the term wasn't coined til '03. We used hidden iframes instead of xmlhttprequest, and JSON instead of XML, but it was the same basic idea.

    Point being, "5 years development experience with AJAX" is entirely possible, more or less.
    Completely incorrect. The term was coined in 2005! Check your sources. (And I don't mean wikipedia.) And I wouldn't go claiming on your resume that you have 5 years of AJAX experience, even if you worked on something similar. That just sounds retarded.

     

    Wrong...  AJAX was first coined as the name of a few cities:

    Ajax Louisiana
    Ajax Virginia
    Ajax Ontario
    Ajax Pennsylvania
    Ajax South Dakota
    Ajax West Virginia

     

    all those poor people living in those cities must be very upset!

  • (cs) in reply to HitScan
    HitScan:

    Except that little thing we like to call "absolutely lacking namespaces in every way."


    I'm sick and tired of hearing that. C doesn't have namespaces either, and guess what: it's the single most successful programming language in the world.

    HitScan:

    Most people that succeed with PHP succeed in spite of it, not because of it.


    I beg to disagree. On what basis? You see, there's a little thing called personal experience. Ever heard of it?
  • (cs) in reply to felix

    felix:
    HitScan:

    Except that little thing we like to call "absolutely lacking namespaces in every way."


    I'm sick and tired of hearing that. C doesn't have namespaces either, and guess what: it's the single most successful programming language in the world.

    HitScan:

    Most people that succeed with PHP succeed in spite of it, not because of it.


    I beg to disagree. On what basis? You see, there's a little thing called personal experience. Ever heard of it?

    I agree with you.   These .NET nuts are just out to lunch.    I like classic asp more than asp.net.  And that is because I know HTML, Javascript, and how to f-ucking program.

    PHP is just as good as ASP, and in some cases you can do more with it.   Both still require you to know your html and javascript.

    A good classic asp or PHP developer can produce a cleaner and faster website than an asp.net developer of the same skill.

    "You can take that Viewstate and stick it up your ass!"

     

     

     

     

  • Hikaru (unregistered)

    GREAT JOB!!

    like an old chinese saying..

    If you can't move the mountain.. Move the road.

  • anon (unregistered) in reply to Kev777
    Kev777:
    I like classic asp more than asp.net.  And that is because I know HTML, Javascript, and how to f-ucking program.

    PHP is just as good as ASP, and in some cases you can do more with it.   Both still require you to know your html and javascript.

    A good classic asp or PHP developer can produce a cleaner and faster website than an asp.net developer of the same skill.

    "You can take that Viewstate and stick it up your ass!"

    Please Please Please tell me you're joking. I almost peed my pants laughing at this.

  • Gavin (unregistered) in reply to SliceX
    SliceX:
    I will accept that I've only been familiar with the phrase for about 6 months. I am currently working on my first project using it, and thus have not used it extensively, although I can honestly tell you that I did not get my information from Wikipedia.
    Where did you get your info from then? A random Google search? Did you just "Google it"?
    SliceX:
    I read TDWTF--trust me when I say that I believe there are cases of misuse where AJAX makes things worse instead of better. My point has been that, according to the commonly accepted definitions of AJAX (a term still in its infancy), it is not wrong to use the phrase to discuss the original post, and that those who attempted to correct previous comments were doing so based more on semantics than on an understanding of the underlying meaning of the phrase and the discussion at hand.
    This sounds like you're trying to claim some sort of moral high ground...however it doesn't quite work. Sorry bud...try again.
  • Nathan Strong (unregistered) in reply to Kev777
    Kev777:

    Anonymous:
    merreborn:
    I worked on several projects in an AJAX framework in 2001, even though the term wasn't coined til '03. We used hidden iframes instead of xmlhttprequest, and JSON instead of XML, but it was the same basic idea.

    Point being, "5 years development experience with AJAX" is entirely possible, more or less.
    Completely incorrect. The term was coined in 2005! Check your sources. (And I don't mean wikipedia.) And I wouldn't go claiming on your resume that you have 5 years of AJAX experience, even if you worked on something similar. That just sounds retarded.

     

    Wrong...  AJAX was first coined as the name of a few cities:

    Ajax Louisiana
    Ajax Virginia
    Ajax Ontario
    Ajax Pennsylvania
    Ajax South Dakota
    Ajax West Virginia

     

    all those poor people living in those cities must be very upset!



    Not to mention the millions of people who use AJAX to clean their kitchen sinks, bathroom sinks, toilets, and bathtubs.

    Nathan
  • i am alive! (unregistered) in reply to Kev777
    Kev777:

    I agree with you.   These .NET nuts are just out to lunch.    I like classic asp more than asp.net.  And that is because I know HTML, Javascript, and how to f-ucking program.

    PHP is just as good as ASP, and in some cases you can do more with it.   Both still require you to know your html and javascript.

    A good classic asp or PHP developer can produce a cleaner and faster website than an asp.net developer of the same skill.

    "You can take that Viewstate and stick it up your ass!"


    ASP vs ASP.NET

    Can your ASP reuse code with OOP? ASP.NET seems to do that very well. This is a nice feature for good devs, who can reuse lots of code and produce elegant designs.

    Assembler doesn't support OOP; assembler is good because it's faster and simpler (but not easier).


  • (cs) in reply to HitScan
    HitScan:
    Anonymous:
    absolutely nothing wrong with PHP - other than the occasional bug in the interpreter (all interpreted languages have bugs like that)


    Except that little thing we like to call "absolutely lacking namespaces in every way." That's a little annoying. Most people that succeed with PHP succeed in spite of it, not because of it.

    I'm right now doing some updates for a PHP+MySQL program which I originally wrote some 3 years ago (back then I had studied in university only for one year and PHP was the language that I was most familiar with). I hadn't looked at the code for 6 months and now that I needed to make some changes to the central database structures, my first thought was that it would be nice if this was written in Java or some other language with good tools for refactoring, development and testing. Luckily I'm getting paid by the hour.
  • C (unregistered) in reply to i am alive!

    > Assembler doesn't support OOP; assembler is good because it's faster and simpler (but not easier).

    Actually, you're wrong: there are several assemblers which support OOP, and a number of others where OOP features may be added with a suitable macro library.

  • (cs)

    I guess the real problem with this Web2.0 CMS and Elbonia is not bandwidth, it's latency. If every action requires "javascript loads javascript loads XML loads javascript", just count the number of roundtrips between the server and the browser till it's done. Now multiply that with the ping between these two.
    No wonder it's slow as hell, and even a real broadband connection would not make it much faster.
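    A back-of-the-envelope sketch of that arithmetic (the ping, request count, and sizes below are illustrative assumptions, not measurements):

    ```javascript
    // Total wall-clock time for a chain of dependent requests is dominated
    // by latency, not bandwidth: each step must wait for the previous one.
    function chainTime(roundTrips, pingMs, bytesPerTrip, bytesPerMs) {
      return roundTrips * (pingMs + bytesPerTrip / bytesPerMs);
    }

    // Hypothetical numbers: 6 dependent round trips per user action,
    // 2 KB per response, 300 ms ping, on a slow pipe (~5 KB/s)
    // vs a pipe 100x faster (~500 KB/s).
    const slowPipe = chainTime(6, 300, 2000, 5);
    const fastPipe = chainTime(6, 300, 2000, 500);

    console.log("slow pipe: " + slowPipe + " ms");
    console.log("fast pipe: " + fastPipe + " ms");
    ```

    Even with 100x the bandwidth, the chained requests still take well over a second, because the six 300 ms round trips don't get any shorter.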

  • (cs) in reply to Jimmie Jay
    Anonymous:
    Furthermore, what is the development community's obsession with AJAX?

    They listened once too many to a marketroid who'd read in a pointy-haired-boss-magazine that Ajax was the best thing since hot water.

    Anonymous:
    richleick:
    I feel the same about perl, php, and linux.

    • Perl - is an antique language. Incredibly complex to learn and manage. There's really no point working with it anymore when you can use .NET...and of course AJAX
    .

    Perl is still one of the best things in existence for one-off scripts (data scraping and all) or system admin scripts.

    Anonymous:
    absolutely nothing wrong with PHP - other than the occasional bug in the interpreter (all interpreted languages have bugs like that)

    More like everything is wrong with that pile of dung.

    • The $ prefix is frigging wrong; it comes from Perl, but what the fucktards who hacked PHP together failed to realize is that it actually has a meaning in Perl
    • Functions being global everywhere is hella wrong
    • No notion of namespace (don't tell me about PHP6, it's about time, and PHP5 still isn't that common), Common Lisp has namespaces in its hyperspec god damn it!
    • The "POO" blows badly
    • No damn prepared statements before PHP5, and a half dozen different functions just to escape strings, all of them being unsafe but one (more or less)... But of course they couldn't actually... like... deprecate or remove them, or alias the old unsafe functions to new safe ones
    • And dozens of other security holes built into the language itself.
    • magic_quotes and register_globals
    • Zend. Zend is fucking wrong
    • 3000 functions (no I'm not kidding) in the frigging global namespace
    • Dozens of functions to do the same thing slightly differently (want to search text? use function A. Want to search text case-insensitively? Well, function B then)
    • Oh, and hellishly inconsistent naming, too

    I'm going to stop here, but oh well you probably get the point.

    Bus Raker:
    They used to actually type some of their code, BY HAND!

    4GLs and stuff have tried to remove the coding phase.

    Needless to say they miserably failed.

    We'll still be coding by hand in 10 years.

    Anonymous:

    Anonymous:
    The real WTF here is that someone worked AJAX into the discussion. If you re-read the original post, there's no mention of it. Stay on topic!

    P.S. If working with AJAX makes you feel hardcore, then I feel sorry for you. That's just sad.

    Anonymous:
    Just fyi, Javascript is not synonymous with AJAX. So many people think they're the same thing, but AJAX is a little more involved. Shows how much you know dude.

    The real WTF here is the number of TDWTF readers who are so bold as to offer corrections on the meaning of AJAX when they themselves don't know the definition.

    AJAX stands for Asynchronous Javascript And XML.

    Wrong: that's what it stood for. Its current meaning has only a very loose relation to its original meaning.

    Among various semantic shifts is the fact that you can use whatever format you want to communicate with the server (plain text, a custom textual format, JSON, HTML) and are not restricted to XML, and the fact that you don't even have to use remote calls to create "ajaxy" applications, while this was the very basis of Jesse James Garrett's original article.
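    A minimal sketch of that format flexibility: one response handler can accept JSON, plain text, or raw markup, since nothing in the technique mandates XML (the parseResponse helper below is invented for illustration; the content types are standard MIME types):

    ```javascript
    // Dispatch a server response by content type; nothing forces XML.
    function parseResponse(contentType, body) {
      if (contentType.includes("application/json")) {
        return JSON.parse(body); // JSON payload
      }
      if (contentType.includes("text/plain")) {
        return body; // raw text payload
      }
      // Fall back to handing the caller the raw markup (XML/HTML);
      // a browser would feed this to responseXML or innerHTML instead.
      return { raw: body };
    }

    const fromJson = parseResponse("application/json", '{"count": 3}');
    const fromText = parseResponse("text/plain", "hello");
    const fromXml = parseResponse("text/xml", "<count>3</count>");
    ```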

    Oh, and you can use frames/iframes to replicate xmlHttpRequest (I think that Dojo uses that as a fallback), and some turds even created a Flash (programmatic) interface called by JS that interacts with the server

    jsmith:

    too_many_usernames:
    I just had an interesting thought, mostly brought on by the discussions about compressing lots of repeated text due to communicating with structures that are essentially just meta-information (XML tags in this case):

    I wonder if anyone has performed any work in the field of looking not only at bandwidth of various communications mechanisms or data formats, but also the efficiency of various communications and data formats. By efficiency I mean it in the sense of answering the question "how much information do I need to send to obtain the desired result?" If my desired result is to show the string "Hello World" then I would want a communications mechanism that does that with the least communications possible. If I don't care how that message is displayed, my requirements are low. However, if I want to make sure that message is displayed in a certain typeface, with a certain letter size, in a certain position on the imaging device, more information is required - but the question is am I using the minimum communications possible to convey that extra information?

    It is my opinion that, from that standpoint, things like XML are extremely inefficient because the amount of meta-information that is desired could be conveyed with far less communication than is typical. Hence the ability to use compression algorithms effectively on those types of communication. I would posit that a necessary (but not likely sufficient) condition on "most efficient communication" is that it will not be able to be compressed.

    Welcome to 1985.  This is a great example of monolithic thinking.  You don't have to solve every problem on earth simultaneously.  If you are building a data representation system, make it the best data representation system you can.

    And the best data representation system sure isn't XML; if anything, S-expressions have all the power of XML with twice the readability and half the verbosity.

    The only thing XML is good at is interoperability, which is why users of non-overly-static-and-bloated languages (Python or Ruby, for instance) see XML as a liability and an annoyance that should only ever be used when required by the client or when you have to interoperate with XML-based tools using XML-based protocols. Java users, meanwhile, see XML as the best thing since sliced bread, something they seemingly have to cram into every hole of their applications.

    felix:
    HitScan:

    Except that little thing we like to call "absolutely lacking namespaces in every way."


    I'm sick and tired of hearing that. C doesn't have namespaces either, and guess what: it's the single most successful programming language in the world.
    C was built as a beefed-up and cross-platform assembly; C is as close to the metal as you can be without actually being assembly (or Fortran). ASM doesn't have namespaces, so C has no logical reason to have namespaces. You know, C has this little thing called consistency. PHP utterly lacks it.

  • (cs) in reply to YoAdrian
    Anonymous:
    Not exactly "WTF" material, is it?  There are plenty of new applications that are "web-enabled" that assume since you're using them internally you must have all the bandwidth in the world.  Just because it works in a browser, don't assume that your parents with the ol' dial-up connection would be able to use it!


    You don't think this
    "Bryan found pages would downloaded several megabytes each time they rendered even the most basic text."
    is a WTF? WTF!

    All that because it has to be so nifty and up to date, that you get a solution that doesn't work.

  • i am alive! (unregistered) in reply to ammoQ
    ammoQ:
    I guess the real problem with this Web2.0 CMS and Elbonia is not bandwidth, it's latency. If every action requires "javascript loads javascript loads XML loads javascript", just count the number of roundtrips between the server and the browser till it's done. Now multiply that with the ping between these two.
    No wonder it's slow as hell, and even a real broadband connection would not make it much faster.


    Well... not always.

    If you download tons of data, you can avoid connections to the server for a while. And you can even cache the connection to the server, so the local javascript retrieves the data from local memory and doesn't need to open a connection.

    A fat client means some interactions can be local, cached, etc. Think about reordering a grid: you can do that locally with Web 2.0 stuff, but Web 1.0 needs to ask the server.

    Also, you can use a lightweight server to serve web2.0 content, or even an http server built into your language. So you don't need a big server like Apache. I guess Apache is really optimized, but it is, anyway, very big and multipurpose, so I guess a simple server can beat it.

    Anyway, dial-up connections are deprecated because they feel like hell.
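    The grid-reordering example is the clearest win for the fat-client approach: once the rows are already in the browser, re-sorting them is a purely local operation with zero round trips. A minimal sketch (the row data and field names are invented for illustration):

    ```javascript
    // Re-sort rows already held client-side; no server round trip needed.
    const rows = [
      { name: "Carol", age: 41 },
      { name: "Alice", age: 29 },
      { name: "Bob", age: 35 },
    ];

    function sortRows(rows, key) {
      // Copy before sorting so the original order is preserved.
      return [...rows].sort((a, b) =>
        a[key] < b[key] ? -1 : a[key] > b[key] ? 1 : 0
      );
    }

    const byName = sortRows(rows, "name");
    const byAge = sortRows(rows, "age");
    ```

    A Web 1.0 page would have to round-trip to the server and re-render everything to get the same reordering.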
  • (cs)

    Hmm, proven again: there are a serious bunch of idiots here who think they all know best. I haven't replied on this site for a while but damn, the comments here are better than the original post :D

    For starters, what's all the stuff about dragging AJAX into it? Let me ask you a few questions. First - why the hell not? Second - what the hell do you think is referred to when people use the term "AJAX" in a non-marketing/hype context?

    AJAX bashers - I'll explain: what is described in the original post is EXACTLY what the abbreviation "AJAX" stands for. Like it or not, but that's what it is. When people talk about "web 2.0" applications, most of them use technologies which are in general referred to as "AJAX", which is in fact not such a big deal anyway. If requesting some data and processing it in the background is such a "bad" thing, I definitely should quit this job (and no, I'm not using AJAX :P)

    Let me make another thing clear: I am by no means an AJAX fan. Hell, I avoid developing web applications as much as I can, but AJAX can be nice for some solutions. For public content stuff - avoid at all costs, but for applications like gmail or a CRM that in general don't need perma-links? I don't see the problem. I also think it's a very good option for a lot of applications: it's cross-platform (if written well), no software distribution/updates, single point of maintenance (and failure - but that's a detail ;) :D). Okay - the software maintenance becomes harder, if not a nightmare, if you don't watch out, but if designed well this should not be a problem, and it's just a client/server design. Not really rocket science - or is it?

    That said, this sounds like they have a very sub-optimal application there; the client-side Javascript should do some smart caching itself to avoid these kinds of things. A decent proxy server might help a bit too. I doubt compression will do much good, since it's mostly short requests with short answers (a GET request and a 304 response), which will NOT compress that well... Sure, the XML will compress nicely, and it will help a bit, but if the majority of the data is stupid requests - don't expect any miracles...
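    That smart caching can be as simple as memoizing requests by URL, so repeated identical requests never leave the browser. A sketch with the transport passed in as a parameter (the fetchFn callback is an assumption for illustration, standing in for whatever XMLHttpRequest wrapper the app uses):

    ```javascript
    // Memoize requests by URL: the first call hits the transport,
    // repeats are served from the local cache with zero round trips.
    function makeCachingClient(fetchFn) {
      const cache = new Map();
      return function get(url) {
        if (!cache.has(url)) {
          cache.set(url, fetchFn(url));
        }
        return cache.get(url);
      };
    }

    // Fake transport that counts how many requests actually go out.
    let hits = 0;
    const fakeFetch = (url) => {
      hits += 1;
      return "payload for " + url;
    };

    const get = makeCachingClient(fakeFetch);
    get("/menu.xml");
    get("/menu.xml");
    get("/menu.xml");
    ```

    Three calls, one actual request; a real client would also want cache invalidation, which is where the nightmare-maintenance warning above comes in.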

  • (cs) in reply to jsmith
    jsmith:

    too_many_usernames:
    I just had an interesting thought...

    Welcome to 1985.  This is a great example of monolithic thinking.  You don't have to solve every problem on earth simultaneously.  If you are building a data representation system, make it the best data representation system you can.  Don't worry about transmission issues, it's easy enough to deal with that at a lower layer.  The processing power of today's computers makes it unnecessary to worry about 90% of those issues and the need for integration of today's systems makes it undesirable to have one solution try to solve every problem.


    XML isn't very space efficient.  However, it solves a billion data issues that other technologies don't even address.  Things like namespace based extensibility, repetition, and containment are dealt with without tripping over issues like multi-byte character sets, byte order, and compression.  And XML compresses to nearly the same size as a well-optimized alternative.  So all a hand-optimized solution gets you is limitations and a little less processor time.  The only thing XML is bad at is large amounts of binary data.


    I agree with your statements here - there are lots of things for which XML is very useful. However, most people do not use the tool in an appropriate manner. For instance, just about everyone sends uncompressed XML. This is not a problem with XML but a problem with people who use the technology (just like every other field of technology). The philosophy that bothers me most, though, is the "I don't have to think about optimization (read "waste") because some chip designer or network guy will figure out how to make bigger, faster pipes." I'd rather have the philosophy of "How can I use the existing pipes to get more throughput instead of requiring more infrastructure?" - I think people need to use all three "R's" in software as well as in their daily lives: Reduce, Reuse, Recycle. I think software folks use the Reuse, but rarely the Recycle, and just about never the Reduce.

    jsmith:

    By the same reasoning, assembly language is the best programming language, Object Oriented Programming is a bad idea, and HTTP should be replaced by a more efficient binary protocol.



    I do not agree with your reasoning - note that I didn't say "best" I said "efficient" and I gave a very specific objective definition of what I meant by "efficient". I would actually argue that there are several ways to measure computer language efficiency: one is "get the computer to do what you want with the least amount of instructions" and another is "Get the computer to do what I want with as little effort to give it instructions as possible" (execution efficiency versus development efficiency). That's not an exhaustive list either - there is "keep it running with the least required resources", "require the least amount of post-release support", and the like.
  • anonymous (unregistered) in reply to too_many_usernames
    too_many_usernames:

    The philosophy that bothers me most, though is the "I don't have to think about optimization (read "waste") because some chip designer or network guy will figure out how to make bigger, faster pipes." I'd rather have the philosophy of "How can I use the existing pipes to get more throughput instead of requiring more infrastructure?"



    Yes. There are 3 options to optimize code:
     a) better code
     b) better design
     c) wait for fast & fat pipes

    C is very cheap: you do nothing and it works. B often needs a rewrite, so it's expensive. And A is hacky and doesn't speed things up much. I think A is a bad idea. A means the algorithm doesn't change, but you use clever tricks to make the code faster.

    Example:
    you change "c = c * 2" to "c = c << 1";

    Ignore A. So there's:

     a) better design: expensive and hard.
     c) wait: cheap and easy

    If there's no budget, I guess everyone does C.
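    A quick sanity check of that strength-reduction trick in plain JavaScript (values invented): shifting left doubles, shifting right halves.

    ```javascript
    // Strength reduction: for small non-negative integers, multiplying
    // by 2 is the same as shifting left by 1; shifting right by 1 instead
    // performs integer division by 2.
    const c = 21;
    const doubled = c << 1; // equivalent to c * 2
    const halved = c >> 1;  // equivalent to Math.floor(c / 2)

    console.log(doubled, halved);
    ```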

  • (cs) in reply to anonymous

    Well, I fully admit I have NFI what AJAX is all about... If it's web-development related, I definitely know nothing about it (I'm one of those old-school, weird people who find a lot of things about the web to be very annoying)... But I have seen it mentioned on the résumés of people I interview to work at my job.  I honestly just skimmed over it since I had no idea what it was anyway, but what makes this a really funny WTF is, as mentioned here, plenty of the applicants noted many years of experience in working with the "AJAX Framework", and seeing here that it was coined in 2005, that just makes it all the more amusing.

  • anonymous guy (unregistered) in reply to Raider
    Raider:
    but what makes this a really funny WTF is, as mentioned here, plenty of the applicants noted many years of experience in working with the "AJAX Framework", and seeing here that it was coined in 2005, that just makes it all the more amusing.
    That is too funny. By the way, someone in the forum (koffie) mentioned that AJAX was synonymous with "web 2.0 applications". That is so far off the mark. Client-side is not the same as server-side. You know how you can tell? Try running your "enterprisey" AJAX application with Javascript disabled in your browser. Kinda sucks, doesn't it? I just read an online article where 72% of web users surf the net with Javascript disabled... how are you going to handle that, koffie?
  • (cs)

    Ah, yes...

    Further proof that human stupidity is becoming more common in the Universe than hydrogen...

  • (cs) in reply to jo42

    this arguing about "AJAX" is silly.

    AJAX is just shorthand for an idea that one person coined and others are using.  I might call my implementation of this idea JEFF.   Or WD40.  Or R2-D2.  Does it really matter???

    The idea, of course, is fetching data from the server from the browser to update parts of the document dynamically instead of reloading the whole thing.  That's all.  I suspect that we all know this and I am not generating a newsflash here.  This idea itself has been around for a long time as has been noted; people have been doing it for years with IFRAMEs and the like.

    Two things have happened recently to facilitate the whole "AJAX is the newest, hottest thing out there": 

    1) lots of browsers are now implementing the XMLHTTP object.  This makes implementing this AJAX idea easier across platforms.

    2) Once people become aware of a concept or idea they hadn't considered before (i.e., that the whole page doesn't have to reload), and once it starts becoming easier to implement on different browser platforms, they start doing it!

    Actually, there's a third:

    3) it has a cool name -- AJAX -- and many people don't realize that much of the concept and the tools commonly used to implement AJAX were developed by Microsoft, therefore making it cool and hip.  

    Is it more complicated than that?  Can we all settle down and stop arguing about specific implementations or definitions?

    What I find funny is -- everyone complains that IE invented new kinds of syntax and methods and DOM techniques that are not part of standards, but they fail to realize that a) the standards barely existed when most of these were invented and b) many of their ideas are good ones and ended up becoming part of the standards (or should be).  XMLHTTP and many DHTML techniques (i.e., the innerHTML property) are some of those.

  • (cs) in reply to too_many_usernames
    too_many_usernames:
    jsmith:

    too_many_usernames:
    I just had an interesting thought...

    Welcome to 1985.  This is a great example of monolithic thinking.  You don't have to solve every problem on earth simultaneously.  If you are building a data representation system, make it the best data representation system you can.  Don't worry about transmission issues; it's easy enough to deal with that at a lower layer.  The processing power of today's computers makes it unnecessary to worry about 90% of those issues, and the need for integration of today's systems makes it undesirable to have one solution try to solve every problem.


    XML isn't very space efficient.  However, it solves a billion data issues that other technologies don't even address.  Things like namespace based extensibility, repetition, and containment are dealt with without tripping over issues like multi-byte character sets, byte order, and compression.  And XML compresses to nearly the same size as a well-optimized alternative.  So all a hand-optimized solution gets you is limitations and a little less processor time.  The only thing XML is bad at is large amounts of binary data.


    I agree with your statements here - there are lots of things for which XML is very useful. However, most people do not use the tool in an appropriate manner. For instance, just about everyone sends uncompressed XML. This is not a problem with XML but a problem with people who use the technology (just like every other field of technology). The philosophy that bothers me most, though, is the "I don't have to think about optimization (read "waste") because some chip designer or network guy will figure out how to make bigger, faster pipes." I'd rather have the philosophy of "How can I use the existing pipes to get more throughput instead of requiring more infrastructure?" - I think people need to use all three "R's" in software as well as in their daily lives: Reduce, Reuse, Recycle. I think software folks use the Reuse, but rarely the Recycle, and just about never the Reduce.

    jsmith:

    By the same reasoning, assembly language is the best programming language, Object Oriented Programming is a bad idea, and HTTP should be replaced by a more efficient binary protocol.



    I do not agree with your reasoning - note that I didn't say "best" I said "efficient" and I gave a very specific objective definition of what I meant by "efficient". I would actually argue that there are several ways to measure computer language efficiency: one is "get the computer to do what you want with the least amount of instructions" and another is "Get the computer to do what I want with as little effort to give it instructions as possible" (execution efficiency versus development efficiency). That's not an exhaustive list either - there is "keep it running with the least required resources", "require the least amount of post-release support", and the like.

    You didn't give a bunch of alternatives for "efficient" earlier.  Your question was why people didn't spend time optimizing for smaller data structures.  They don't spend time doing that because it is nearly at the bottom of the priority list.  In 1985, it was at the top of the priority list because programs had to fit on floppy disks and everyone used a 300 baud modem to connect to other computers.

    For 99% of us, optimizing for "require the least amount of post-release support" is our primary goal.  That means that we only spend our time making ultra-efficient data structures if there is a really good reason for it.  If we can add a compression layer at the web server, that is a far better option than optimizing the data structures and risking introducing bugs in the serialization process. 

    My three examples at the end were illustrations of mis-optimizing.  Most people don't write assembler because it's too hard to maintain.  If you need to write assembler for some reason, then do it.  But otherwise it's a bad idea.  Object Oriented Programming is an inefficient way of programming from a lot of perspectives.  It's harder to learn, requires more code to get the same job done, and often compiles to a less efficient program.  But it is the best way to make a maintainable application.  HTTP is a fairly fat protocol.  It has the exact problems you were complaining about earlier.  For example, to submit data to a web server, the HTTP packet starts with "POST".  It would be far more efficient to use "P", or "2".  Yet HTTP is one of the most popular protocols ever and the maintainers have no plans to reduce the overhead of the packets.
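    To put the "POST vs. P" point in perspective, here is a rough byte count of a minimal HTTP request. The host, path, and body are invented for illustration:

    ```javascript
    // The four-letter method name is a rounding error next to the rest of
    // the request, which is why shaving it down to "P" would buy nothing.
    const body = "name=Ajax";
    const request = [
      "POST /submit HTTP/1.1",
      "Host: example.com",
      "Content-Type: application/x-www-form-urlencoded",
      "Content-Length: " + body.length,
      "",
      body,
    ].join("\r\n");

    const totalBytes = Buffer.byteLength(request);
    const methodBytes = "POST".length;
    console.log(totalBytes, methodBytes); // the method is ~3% of even this tiny request
    ```

    Real requests carry far more header baggage (cookies, user-agent strings), so the method name's share only shrinks from here.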

    As for people misusing XML, it's not normally the developer's job to compress XML.  Just do it at the transport layer.  A common use of XML is to get data from web servers.  Just have the administrator turn on HTTP compression and be done with it.  If you store XML in a file, turn on compression in the file system and you are done.  If the developer is writing a transport system, then they should compress the data as it is being transported.  Use a freely available zip or gzip library and it takes 5 minutes.
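    The "gzip library, 5 minutes" claim is easy to demonstrate. A rough sketch using Node's zlib module (the sample document is invented, and real ratios will vary with the data):

    ```javascript
    // XML's verbosity is mostly repeated tag names, which is exactly
    // what dictionary-based compression like gzip eats for breakfast.
    const zlib = require("zlib");

    let xml = "<orders>\n";
    for (let i = 0; i < 500; i++) {
      xml += `  <order id="${i}"><customer>C${i}</customer><total>9.99</total></order>\n`;
    }
    xml += "</orders>\n";

    const compressed = zlib.gzipSync(Buffer.from(xml));
    console.log(xml.length, "->", compressed.length, "bytes after gzip");
    ```

    On a server you would not even write this much: turning on HTTP compression gets the same effect for every response.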

    BTW, I'm not saying XML is the best way to represent data, but it does make a good example for the above points.  If I used ASN.1 as an example, I'd lose half the audience.

  • (cs) in reply to anon
    Anonymous:
    Kev777:
    I like classic asp more than asp.net.  And that is because I know HTML, Javascript, and how to f-ucking program

    Php is just as good as ASP and in some cases you can do more with it.   Both still require you to know your html and javascript.   

    A good classic asp or php developer can produce a cleaner and faster website than an asp.net developer of the same skill.   

    "You can take that Viewstate and stick it up your ass!"

    Please Please Please tell me you're joking. I almost peed my pants laughing at this.

    No I am not joking.  ASP.NET is great for creating web applications but it is trash for public access websites.  In fact, when I make public websites using ASP.net I avoid using the viewstate and the event model.  I wrote a template engine that turns out pages faster than the event model in .NET and allows my web designers (who don't know .NET) to work on the design without looking at code.  Web designers don't want to learn .NET. They work with photoshop, html, css, and javascript.  You will also have a hard time creating a web site with SEO in ASP.NET.  In fact all the top websites on the internet do NOT use ASP.NET.

    Classic ASP and PHP are simple, and if you actually know how to write clean html, css, and javascript they work just fine.  They are also very easy to modify on the server (using notepad) if you have to fix a critical issue. 

    Anyway, I honestly think that .NET has greatly contributed to the development of poor web applications (as is the case with this CMS system).  People use .NET to create web forms just like they would an application, and they have little concern for what is going on under the hood.  It is most likely that these designers stuffed a bunch of nice looking controls on their forms just for the sales value.  They had little concern for how much bandwidth would be used.


  • (cs) in reply to Jeff S
    Jeff S:
    This arguing about "AJAX" is silly.

    AJAX is just shorthand for an idea that one person coined and others are using.  I might call my implementation of this idea JEFF.   Or WD40.  Or R2-D2.  Does it really matter???

    The idea, of course, is fetching data from the server from the browser to update parts of the document dynamically instead of reloading the whole thing.  That's all.  I suspect that we all know this and I am not generating a newsflash here.  As has been noted, the idea itself has been around for a long time; people have been doing it for years with IFRAMEs and the like. 

    Two things have happened recently to facilitate the whole "AJAX is the newest, hottest thing out there": 

    1) lots of browsers are now implementing the XMLHTTP object.  This makes implementing this AJAX *idea* easier across platforms.  

    2) Once people become aware of a concept or idea they hadn't considered before (i.e., that the whole page doesn't have to reload), and once it starts becoming easier to implement on different browser platforms, they start doing it!

    Actually, there's a third:

    3) it has a cool name -- AJAX -- and many people don't realize that much of the concept and the tools commonly used to implement AJAX were developed by Microsoft, therefore making it cool and hip.  

    Is it more complicated than that?  Can we all settle down and stop arguing about specific implementations or definitions?

    What I find funny is -- everyone complains that IE invented new kinds of syntax and methods and DOM techniques that are not part of standards, but they fail to realize that a) the standards barely existed when most of these were invented and b) many of their ideas are good ones and ended up becoming part of the standards (or should be).  XMLHTTP and many DHTML techniques (e.g., the innerHTML property) are some of those.


    It's about time someone laid it all out. I don't think it could have been explained better.

    A note on the IE thing though:
    <rant>
    People complain because the ideas were already being tossed around, and implementations were being discussed, when Microsoft decided that "Hey, screw the standards, we'll just use our own implementation, and when they finally finalize anything, maybe we'll implement it." Doing stuff like that pisses off the rest of the industry. Especially when lots of web developers start using the proprietary tags and methods and don't bother to implement the true standards. Even more so when Microsoft takes their sweet time implementing the actual standards. It's the self-centered attitude that Microsoft takes that pisses off the rest of the community.
    </rant>
  • (cs) in reply to GoatCheez

    This seems to be a problem that plagues every language.  Take a look at C++.  For those of us who are really anal about writing standards-compliant code, we get kicked in the nuts in situations like Win32 development where, if we choose to use either the MFC or Borland's libs, we are stuck with the fact that neither has implicit support for anything in the C++ Standard Library.  Yeah, once in a while a revision comes out that may have an operator or two to support std::string or something, but I hardly call that supporting the standards.

    My father owns a web development business, and one of the things that has always plagued him was trying to be standards compliant while keeping his many IE-using customers happy: making certain things he wanted to be reusable proprietary instead, and maintaining a second version for those who use Mozilla.  I'm far from a web guru in every aspect of the word, but I do know that anything that doesn't support AT LEAST the standards is going to be a huge hindrance to all developers in that field.

  • (cs) in reply to GoatCheez

    GoatCheez:
    Jeff S:
    This arguing about "AJAX" is silly.

    AJAX is just shorthand for an idea that one person coined and others are using.  I might call my implementation of this idea JEFF.   Or WD40.  Or R2-D2.  Does it really matter???

    The idea, of course, is fetching data from the server from the browser to update parts of the document dynamically instead of reloading the whole thing.  That's all.  I suspect that we all know this and I am not generating a newsflash here.  As has been noted, the idea itself has been around for a long time; people have been doing it for years with IFRAMEs and the like. 

    Two things have happened recently to facilitate the whole "AJAX is the newest, hottest thing out there": 

    1) lots of browsers are now implementing the XMLHTTP object.  This makes implementing this AJAX *idea* easier across platforms.  

    2) Once people become aware of a concept or idea they hadn't considered before (i.e., that the whole page doesn't have to reload), and once it starts becoming easier to implement on different browser platforms, they start doing it!

    Actually, there's a third:

    3) it has a cool name -- AJAX -- and many people don't realize that much of the concept and the tools commonly used to implement AJAX were developed by Microsoft, therefore making it cool and hip.  

    Is it more complicated than that?  Can we all settle down and stop arguing about specific implementations or definitions?

    What I find funny is -- everyone complains that IE invented new kinds of syntax and methods and DOM techniques that are not part of standards, but they fail to realize that a) the standards barely existed when most of these were invented and b) many of their ideas are good ones and ended up becoming part of the standards (or should be).  XMLHTTP and many DHTML techniques (e.g., the innerHTML property) are some of those.


    It's about time someone laid it all out. I don't think it could have been explained better.

    A note on the IE thing though:
    <rant>
    People complain because the ideas were already being tossed around, and implementations were being discussed, when Microsoft decided that "Hey, screw the standards, we'll just use our own implementation, and when they finally finalize anything, maybe we'll implement it." Doing stuff like that pisses off the rest of the industry. Especially when lots of web developers start using the proprietary tags and methods and don't bother to implement the true standards. Even more so when Microsoft takes their sweet time implementing the actual standards. It's the self-centered attitude that Microsoft takes that pisses off the rest of the community.
    </rant>

    This is true.  Since when has Microsoft ever worried about creating code that is fast and efficient?  We are just lucky that the speed of the internet isn't increasing like the power of the CPU is.  If it did, we would be downloading their OS every time we requested a page.  Microsoft doesn't give a damn about how fast their code is or how clean it is.  All they care about is making money on their OS and making you upgrade your PC every year for it.  The CMS system described here is doing exactly what Microsoft does best: forcing you to upgrade.  Security, bandwidth, speed, standards be damned when Microsoft is around.


  • Andrey (unregistered) in reply to apparition
    apparition:
    Low tech outshines high tech -gotta love it!


    Props to the submitter for using the least complicated solution to solve the problem.  The CMS system's designers need a good slap upside the head, though.
  • Steve (unregistered) in reply to Kev777

    It's not that we're bashing AJAX, it's that there are too many coders out there yelling and screaming for attention: "Hey look at me everybody! I'm coding with AJAX! I'm cutting edge! I can put buzzwords on my resume..."

    It's just so ridiculous. They're trying to elevate AJAX and equate it to the .NET framework. What sense does that make?

    But I guess if it makes you feel warm and fuzzy by discussing AJAX implementations all day, by all means go right ahead. I'm going to find myself a more mature forum, where they discuss real programming techniques.

  • (cs) in reply to Steve
    Anonymous:
    I'm going to find myself a more mature forum, where they discuss real programming techniques.


    I believe this is known as "Take Your Ball and Go Home 2.0"
  • (cs) in reply to anonymous guy

    I just read an online article where 72% of web users surf the net with Javascript disabled...how are you going to handle that koffie?
    You shouldn't believe everything you read.

  • anonymous guy (unregistered) in reply to nullbyte
    nullbyte:
    You shouldn't believe everything you read.
    You're absolutely right. I totally made that up.

    Suckers.

  • A nony mous(e) (unregistered) in reply to anonymous

    Do you, by any chance, develop CMS systems which deliver upwards of 600kb per page over the internet?

  • (cs) in reply to Kev777
    Kev777:

    This is true.  Since when has Microsoft ever worried about creating code that is fast and efficient?  We are just lucky that the speed of the internet isn't increasing like the power of the CPU is.  If it did, we would be downloading their OS every time we requested a page.  Microsoft doesn't give a damn about how fast their code is or how clean it is.  All they care about is making money on their OS and making you upgrade your PC every year for it.  The CMS system described here is doing exactly what Microsoft does best: forcing you to upgrade.  Security, bandwidth, speed, standards be damned when Microsoft is around.

     

    That's a pretty dumb statement to make.  MS's main goal is to make things fast and efficient!  Sometimes, unfortunately, at the expense of security!  That's why they added all of these DHTML features to IE -- so that developers could create faster, more robust applications.  That's why they invented XMLHTTP.  That's why they made manipulating the DOM much easier.  To make things faster, to decrease bandwidth, to make it easier for developers to create apps. 

    Again, you can argue about how they went about things, or how they have since implemented or worked with standards, or how well they were able to implement security in their ideas.  Adding features that allow programmer X to do good things also adds those same features for programmer Y to do bad things.  But statements like the ones you are making are completely ignorant.  Especially in light of their recent efforts to improve standards support and security.

    Try to look at things objectively and form your own intelligent conclusions and don't just repeat what you've heard people write at Slashdot.
  • (cs) in reply to Jeff S
    Jeff S:
    MS's main goal is to make things fast and efficient! 


    C'mon now Jeff, we all know that MS's MAIN goal is to have everyone in the world relying on their software. Sure, efficiency and speed are probably on their list somewhere, but I assure you that it is not at the top of that list. If it were at the top of the list, then Windows would have a GUI-less mode. Also, do you think that Aero will make the GUI faster and more efficient? Despite the fact that it's supposed to be accelerated by some video cards, I find it hard to believe that it will be faster than the good ol' (non-shiny button, gray) interface.
  • Evan (unregistered) in reply to Nathan Strong

    There's also Ajax, the legendary ancient Greek hero...

  • (cs) in reply to Jeff S
    Jeff S:
    MS's main goal is to make things fast and efficient!  Sometimes, unfortunately, at the expense of security!


    First of all, things that are not secure cannot be used at all anymore, so speed and efficiency are nothing at all without security. Then, MS' goal is to make lots of money. Providing fast and efficient things is one possible way to reach this goal, but on the other hand, there is MS Office.
    My take is that MS makes fast and efficient programs if that is the way to displace OS/2 Warp and Netscape Navigator. Otherwise, MS makes slow and inefficient programs to help its long-time comrade Intel sell new PCs, which in turn come with Windows preinstalled.
  • CubicleDrone (unregistered) in reply to jsmith

    >This is a great example of monolithic thinking. 

    Hey, monolithic designs are great!  When they break, all you have to do is replace the lith.

  • anonny (unregistered) in reply to ammoQ
    ammoQ:
    My take is that MS makes fast and efficient programs if that is the way to displace OS/2 Warp and Netscape Navigator.
    Which explains the blazing speed and rock-solid stability Windows NT and Internet Explorer are world-famous for.

    Oh wait...
  • (cs) in reply to anonny

    Anonymous:
    ammoQ:
    My take is that MS makes fast and efficient programs if that is the way to displace OS/2 Warp and Netscape Navigator.
    Which explains the blazing speed and rock-solid stability Windows NT and Internet Explorer are world-famous for.

    Oh wait...

     

    Ha, for a second there I thought you were serious.  Though I must admit, I run Win2k Pro on one of my machines at home and it never crashes; at one point I had over a year of uptime on it, and that's with playing Neverwinter Nights a lot, doing a lot of programming, compiling huge projects, etc.  As far as Windows goes I'd say 2k Pro is as stable as they get.

  • John (unregistered) in reply to Jeff S
    Jeff S:
    Try to look at things objectively and form your own intelligent conclusions and don't just repeat what you've heard people write at Slashdot.
    Ouch! That's gotta hurt!
  • Caddilac (unregistered) in reply to John
    Anonymous:
    Jeff S:
    Try to look at things objectively and form your own intelligent conclusions and don't just repeat what you've heard people write at Slashdot.
    Ouch! That's gotta hurt!
    I'm sorry everybody. But I just have to say it. I have to be honest, right?
    I am so tired of people using sites like Wikipedia, Slashdot, etc. and quoting them as ultimate truth. Not only do they come off sounding ignorant, but it's obvious their source was Wikipedia! That is a 10 on the lameness scale.
    If you are a junior programmer, please listen to me. Wikipedia is a collection of people's opinions. It's not truth. Don't fall into this trap.
    Use your brain for something else other than summarizing the latest wiki entry for the group.
  • (cs) in reply to anonny
    Anonymous:
    ammoQ:
    My take is that MS makes fast and efficient programs if that is the way to displace OS/2 Warp and Netscape Navigator.
    Which explains the blazing speed and rock-solid stability Windows NT and Internet Explorer are world-famous for.

    Oh wait...


    IE4 was, at that time, much faster than NN, especially at start-up. (We know it's because it was preloaded, but nevertheless it started _fast_.)
    WinNT was slow, but Win95 - which was definitely fast - was what pushed OS/2 Warp out of the home users' market, and I saw it on a lot of corporate desktops too.
  • (cs)

    I have said it before, and I will say it again - just like how not everything should be converted into a COM object, not everything should be converted to XML. :)

    This sounds like a prime example where someone takes a decent technology (XML, like Java, has its place) and runs with it without bounds.  I mean, come on, a web page that downloads anywhere near 100KB or more (not counting images)?!? 

    The real WTF here is that someone had to have noticed that and said: "Sure, that kind of data transfer will be OK...".  That person should be rounded up as quickly as possible!

  • (cs) in reply to Jeff S
    Jeff S:
    Kev777:

    This is true.  Since when has Microsoft ever worried about creating code that is fast and efficient?  We are just lucky that the speed of the internet isn't increasing like the power of the CPU is.  If it did, we would be downloading their OS every time we requested a page.  Microsoft doesn't give a damn about how fast their code is or how clean it is.  All they care about is making money on their OS and making you upgrade your PC every year for it.  The CMS system described here is doing exactly what Microsoft does best: forcing you to upgrade.  Security, bandwidth, speed, standards be damned when Microsoft is around.

     

    That's a pretty dumb statement to make.  MS's main goal is to make things fast and efficient!  Sometimes, unfortunately, at the expense of security!  That's why they added all of these DHTML features to IE -- so that developers could create faster, more robust applications.  That's why they invented XMLHTTP.  That's why they made manipulating the DOM much easier.  To make things faster, to decrease bandwidth, to make it easier for developers to create apps. 

    Again, you can argue about how they went about things, or how they have since implemented or worked with standards, or how well they were able to implement security in their ideas.  Adding features that allow programmer X to do good things also adds those same features for programmer Y to do bad things.  But statements like the ones you are making are completely ignorant.  Especially in light of their recent efforts to improve standards support and security.

    Try to look at things objectively and form your own intelligent conclusions and don't just repeat what you've heard people write at Slashdot.

    Well..  these are examples of technologies that solve problem X only to create problem Y. 

    Since microsoft is in the business of selling its OS, it doesn't want you to create web applications that will work on other operating systems and do all the processing on the server.  They also don't want you to write scripts without the need for their bulky developer tools.  That is why they create all these totally useless client-side features and chunky asp.net controls.  ActiveX and MS J++ are also good examples of MS trying to force their OS up everyone's ass.

    In this regard classic asp was wonderful because you could code it in notepad (never requiring a license) and then host it on an external server without ever giving microsoft a dime of your own money.  In fact that server didn't even have to run IIS.  :)  I'll bet MS sat back before they developed .NET and thought about ways that they could make web applications require their OS and force people to pay their license fees.  With scripting you can't force that on people, but with client side code and dlls you can.

    The statements I'm making are from hard lessons learned and experience developing microsoft applications using their technologies.  The best web applications require the client to do as little processing as possible (if any) and return the least amount of code to the browser as possible.  I also find that it takes longer to develop (more lines of code and more layers) using MS technologies.  Things are not getting easier; they are just more complicated.  Most of the new features that microsoft creates don't help programmers; they help people who don't know how to program.  They are great for MS geek conventions and make great sales, but they don't help me.  I have to write more lines of code now than I ever did.   


  • (cs) in reply to Kev777

    Oh, and don't tell me that inline declarations count for fewer lines of code.  They just make ugly programs and encourage bad code. 

Leave a comment on “Incompatible with Web 2.0”
