• (cs)

    But dynamic = good?!

  • YoAdrian (unregistered)

    Not exactly "WTF" material, is it?  There are plenty of new applications that are "web-enabled" that assume since you're using them internally you must have all the bandwidth in the world.  Just because it works in a browser, don't assume that your parents with the ol' dial-up connection would be able to use it!

  • (cs)

    Low tech outshines high tech - gotta love it!

  • dimitry z. (unregistered)

    Now this is how you make Web 2.0 "Enterprisey".

    Best part:
    "Being a Web 2.0 system, the CMS used JavaScript that dynamically loaded JavaScript that dynamically loaded XML that was dynamically transformed into proprietary commands that were parsed to dynamically execute JavaScript to dynamically load content."

    Dimitry

  • (cs) in reply to dimitry z.

    Anonymous:
    Now this is how you make Web 2.0 "Enterprisey".

    Best part:
    "Being a Web 2.0 system, the CMS used JavaScript that dynamically loaded JavaScript that dynamically loaded XML that was dynamically transformed into proprietary commands that were parsed to dynamically execute JavaScript to dynamically load content."

    Dimitry

    And you just know that some yahoo is going to print out all the dynamically generated stuff onto a very static piece of paper, put it on a very static wooden table, take a very static picture, print out a very static photo, scan in a very static bitmap, ...

  • Dude (unregistered) in reply to apparition

    The real WTF is that they're using the internet....

  • (cs)

    And he found a good solution! The users probably thought they got upgraded computers.

    On an ironic note, I hit 'Quick Reply' - the main page faded, this dialog box came up (not a pop-up). Very "web-twennie." Of course, I am used to the forum software by now, but this entry made me think about it.

  • Magic Mike (unregistered) in reply to R.Flowers

    Javascript will solve everything/anything.

  • (cs)

    The terminal server sounds like a good idea, but the fact that a remote desktop (even using compressed and optimized protocols) outperforms the native browser doing all the work on the local machine boggles the mind.

    Alternative solution: use the site without JavaScript enabled in the browser.  If the CMS is worth anything, it will degrade gracefully for those users and do mostly static rendering.  I have a feeling, based on 'the CMS used JavaScript that dynamically loaded JavaScript that dynamically loaded XML that was dynamically transformed into proprietary commands that were parsed to dynamically execute JavaScript to dynamically load content,' that accessibility and older browser environments were never considered.
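
    For what it's worth, the usual way to get that kind of graceful degradation is to keep the plain links and forms working on their own and only layer the script on top when it's available. A minimal sketch (the element ids here are made up for illustration, not taken from the CMS in the article):

        // Assumes the server renders an ordinary link with id="news-link" and a
        // container with id="content"; both ids are hypothetical.
        var link = document.getElementById('news-link');
        if (link && window.XMLHttpRequest) {            // no XHR support? the link still works as a normal page load
            link.onclick = function () {
                var xhr = new XMLHttpRequest();         // IE6 would need an ActiveXObject fallback, omitted here
                xhr.onreadystatechange = function () {
                    if (xhr.readyState === 4 && xhr.status === 200) {
                        document.getElementById('content').innerHTML = xhr.responseText;
                    }
                };
                xhr.open('GET', link.href, true);
                xhr.send(null);
                return false;                           // cancel normal navigation only when XHR has taken over
            };
        }

    With JavaScript disabled (or on an ancient browser), users simply get ordinary page loads instead of the dynamic behavior.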

  • (cs) in reply to R.Flowers

    Alex must be scraping the bottom of the barrel. Time to get submitting!

  • Anonymous (unregistered) in reply to sinistral

    Has anyone considered using Javascript?

  • (cs)

    Gawd, what a horrible solution. Everyone knows what they SHOULD have done:

    Print out the entire site, and mail the printouts to the remote location.
    Staff at that location is to fill out the appropriate forms from the print-outs when needed, and to send them back via mail for entry.
    When the forms reach the main office, they are to be placed on a wooden table, where a digital camera will take a picture of the form. The digital image is to be printed out, then scanned into the system and entered into a database. Finally, depending on the workload, one or more people will read the information from the digital images stored in the database, and enter the proper data into the local website.

    Simplicity at its finest!

  • Anonymous (unregistered) in reply to GoatCheez

    Someone should contact Paula Bean.

    Maybe she has a solution.

  • (cs) in reply to sinistral

    sinistral:
    The terminal server sounds like a good idea, but the fact that a remote desktop (even using compressed and optimized protocols) outperforms the native browser doing all the work on the local machine boggles the mind.

    Alternative solution: use the site without JavaScript enabled in the browser.  If the CMS is worth anything, it will degrade gracefully for those users and do mostly static rendering.  I have a feeling, based on 'the CMS used JavaScript that dynamically loaded JavaScript that dynamically loaded XML that was dynamically transformed into proprietary commands that were parsed to dynamically execute JavaScript to dynamically load content,' that accessibility and older browser environments were never considered.

    Why would any self-respecting developer support backward compatibility? Simply mandate that everyone everywhere simultaneously upgrade their environment to support dedicated T3 speeds globally.

    Problem solved!

  • (cs)

    OK, am I the only one who wonders what software is the basis for today's WTF?  I know Alex anonymizes, but it's not like Bryan's company wrote it; they're just trying to use it.

  • richdiggins (unregistered)

    WTF!

    why not fix the problem on the client side...

    "with the cache disabled, Bryan found pages would downloaded several megabytes each time they rendered even the most basic text."

    sounds like they needed to update their browser or enable page caching...

    just my 2 bits...



  • (cs) in reply to richdiggins
    Anonymous:
    WTF!

    why not fix the problem on the client side...

    "with the cache disabled, Bryan found pages would download several megabytes each time they rendered even the most basic text."

    sounds like they needed to update their browser or enable page caching...

    just my 2 bits...

    No, you misunderstood the original post.  Bryan discovered that it was 300k - 600k, mostly of 304s (Not Modified, i.e. use the cache), when the cache was enabled.  When the cache was disabled, a single page would cause a download of several megabytes.  The point is that even with a cache on the client side it was bad; without the cache enabled on the client side, it was a massive WTF.
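
    For anyone unfamiliar with 304s: with the cache enabled the browser re-requests each cached file conditionally and the server answers "not modified," so no body is sent, but every one of those round trips still costs a request and a response. Roughly like this (illustrative headers and a made-up path, not from Bryan's actual trace):

        GET /cms/scripts/loader.js HTTP/1.1
        If-Modified-Since: Tue, 17 Oct 2006 08:00:00 GMT

        HTTP/1.1 304 Not Modified

    Multiply that by the hundreds of scripts and XML fragments the CMS pulls in and you can see how it adds up to 300k - 600k even when the cache is doing its job.
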
  • l1fel1ne (unregistered)

    Sounds like they are devout believers in Rube Goldberg design.

  • (cs) in reply to richdiggins

    Anonymous:
    WTF!

    why not fix the problem on the client side...

    "with the cache disabled, Bryan found pages would download several megabytes each time they rendered even the most basic text."

    sounds like they needed to update their browser or enable page caching...

    just my 2 bits...

    Good idea in most cases. However, some clients have (possibly) weird configurations because of other applications that they use. Change some config for your app and you break the other one. Gotta play nice in the sandbox :(

  • Anonymous (unregistered) in reply to l1fel1ne
    Anonymous:

    Sounds like they are devout believers in Rube Goldberg design.

    Whoopi Goldberg design? Never heard of it. Does that involve a Ted Danson interface?
  • richdiggins (unregistered) in reply to sinistral
    sinistral:
    Anonymous:
    WTF!

    why not fix the problem on the client side...

    "with the cache disabled, Bryan found pages would download several megabytes each time they rendered even the most basic text."

    sounds like they needed to update their browser or enable page caching...

    just my 2 bits...

    No, you misunderstood the original post.  Bryan discovered that it was 300k - 600k, mostly of 304s (Not Modified, i.e. use the cache), when the cache was enabled.  When the cache was disabled, a single page would cause a download of several megabytes.  The point is that even with a cache on the client side it was bad; without the cache enabled on the client side, it was a massive WTF.


    hmm... then, HTF did terminal services get through on such a small pipe, with multiple clients?... I don't believe it...

  • Anonymouse (unregistered)

    Why is this a WTF? It just seems like a webapp that's more complicated than it needs to be, where the designers never considered the possibility of dialup access. Shitty design to be sure, but WTF worthy?

  • ohng (unregistered) in reply to richdiggins
    Anonymous:


    hmm... then, HTF did terminal services get through on such a small pipe, with multiple clients?... I don't believe it...



    Terminal Services is obviously smarter than Web 2.0.
  • Benzaholic (unregistered) in reply to richdiggins

    >hmm... then, HTF did terminal services get through on such a small pipe, with multiple clients?...
    >I don't believe it...

    Could it be because MS didn't create Terminal Services but licensed the technology from Citrix, who long ago came up with decent optimizations for using bandwidth?

  • Evan M. (unregistered) in reply to richdiggins
    richdiggins:


    hmm... then, HTF did terminal services get through on such a small pipe, with multiple clients?... I don't believe it...



    That one is quite easy. With all the dynamically generated content, you're moving several hundred KB of content per page (assuming the cache is off here), a lot of which never gets rendered but instead sits behind the scenes as JavaScript and the like rather than being put directly on screen. By using the program under Terminal Services, only the final rendered screen gets sent to the user over their slow connection, saving much of their bandwidth (since it's all nicely compressed over the TS connection, rather than being plain-text markup and script).
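
    To put rough numbers on it (back-of-the-envelope only, assuming a 56k modem delivers about 5 KB/s of real throughput; the Terminal Services figure is a guess for illustration):

        600 KB of scripts and XML per page          ->  600 / 5  =  ~120 seconds per page
        ~25-50 KB of compressed TS screen updates   ->  5-10 seconds

    Even if those numbers are off by a factor of two, the remote-desktop route wins easily on a dial-up line.
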
  • joe (unregistered)

    Who evaluated the tool before the purchase?  Why didn't they test it from the remote site before shelling out the money to buy the system?  I bet some VP made the decision based on a sales demo.

  • Anonymutt (unregistered) in reply to Benzaholic

    There's a way to get around this by using AJAX and multiplexed asynchronous socket programming.

  • eloj (unregistered)

    Did he remember to apt-get install libapache-mod-gzip?

  • (cs) in reply to eloj

    the real wtf is why the F is there an office in the Republic of Elbonia?

  • (cs) in reply to Kev777

    Now that's what I call server side inclusion!

  • Ben (unregistered)

    I want to know what the name of the CMS is.  I've gotta see that!

  • (cs) in reply to joe
    Anonymous:

    Who evaluated the tool before the purchase?  Why didn't they test it from the remote site before shelling out the money to buy the system?  I bet some VP made the decision based on a sales demo.

    It's a known trick NOT to include technical people in sales meetings, especially when you are trying to sell technology.

    I recall working for a very large company that purchased a CMS web portal for 2.5 million!  It had all kinds of funky 'Gadgets'.  They sold it rather quickly and the big Mr M was happy that he could publish his news articles over the intranet.  The web developers in house didn't even know about the purchase.  They just had to support it.  Basically the CMS was a pile of junk that one of the developers could have made in 2 months.  Everyone on the dev team was like 'Pay me 2 million and see what kind of CMS you get!'

    This is just another example of how companies rip themselves off because they know nothing about technology or how to use their own people. 

    I'll bet Bryan got a big FAT raise for wasting the company's money on something they could have developed in house.

  • Pyromancer (unregistered) in reply to Kev777

    Kev777:
    the real wtf is why the F is there an office in the Republic of Elbonia?

    Otherwise Elbonians would have to use carrier pigeons to access this firm's site and it takes quite a while because you typically need 3-4 pigeons for every KB :)

  • SnapShot (unregistered) in reply to apparition
    apparition:

    sinistral:
    The terminal server sounds like a good idea, but the fact that a remote desktop (even using compressed and optimized protocols) outperforms the native browser doing all the work on the local machine boggles the mind.

    Alternative solution: use the site without JavaScript enabled in the browser.  If the CMS is worth anything, it will degrade gracefully for those users and do mostly static rendering.  I have a feeling, based on 'the CMS used JavaScript that dynamically loaded JavaScript that dynamically loaded XML that was dynamically transformed into proprietary commands that were parsed to dynamically execute JavaScript to dynamically load content,' that accessibility and older browser environments were never considered.

    Why would any self-respecting developer support backward compatibility? Simply mandate that everyone everywhere simultaneously upgrade their environment to support dedicated T3 speeds globally.

    Problem solved!



    Exactly!  I'm not sure why this is here.  This WTF sounds like a hardware problem, not a software problem.  Somebody should give those sysadmins a good talking-to...
  • Anonymous (unregistered) in reply to SnapShot

    The real WTF is...The real WTF is...The real WTF is...(sounds of record skipping)

  • (cs) in reply to Kev777
    Kev777:
    Anonymous:

    Who evaluated the tool before the purchase?  Why didn't they test it from the remote site before shelling out the money to buy the system?  I bet some VP made the decision based on a sales demo.

    This is just another example of how companies rip themselves off because they know nothing about technology or how to use their own people. 

    It's pretty obvious the company screwed up by not including in the requirements that it must be accessible over dial-up.

    No big deal though; you simply create another, smaller system with only the core features that these little offices need and that integrates with the database, on a cost/benefit basis.  I mean, why sacrifice the whole system just for a couple of weak links?  Just because someone's car on the highway can only do 45 mph is no reason to lower the speed limit ... they should either get a new car or take the 'back' streets.

  • mc (unregistered) in reply to SnapShot

    The only WTF here is that the system administrator couldn't figure out how to add gzip compression to the web server.

    Yes, AJAX is pretty damn verbose (the same problem you'll encounter with SOAP) but there are things you can do to help remedy the situation. It doesn't take a genius to realize the amount of repeating data in XML lends itself very well to compression. Every major browser supports it with zero configuration.

    We achieve about 95-98% compression ratio with XML from our AJAX and SOAP calls. That means that 600k is only about 30k transferred between the client and the server.
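
    For example, on Apache 2.x with mod_deflate loaded, a single directive is enough to compress the text-heavy responses on the fly (shown for illustration; the exact MIME types would depend on what the CMS actually serves):

        # compress HTML, XML and script responses before they hit the wire
        AddOutputFilterByType DEFLATE text/html text/plain text/xml application/xml application/x-javascript

    IIS has an equivalent setting, and since the browsers already advertise Accept-Encoding: gzip, the clients need no changes at all.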

  • (cs) in reply to SnapShot
    Anonymous:
    apparition:

    sinistral:
    The terminal server sounds like a good idea, but the fact that a remote desktop (even using compressed and optimized protocols) outperforms the native browser doing all the work on the local machine boggles the mind.

    Alternative solution: use the site without JavaScript enabled in the browser.  If the CMS is worth anything, it will degrade gracefully for those users and do mostly static rendering.  I have a feeling, based on 'the CMS used JavaScript that dynamically loaded JavaScript that dynamically loaded XML that was dynamically transformed into proprietary commands that were parsed to dynamically execute JavaScript to dynamically load content,' that accessibility and older browser environments were never considered.

    Why would any self-respecting developer support backward compatibility? Simply mandate that everyone everywhere simultaneously upgrade their environment to support dedicated T3 speeds globally.

    Problem solved!



    Exactly!  I'm not sure why this is here.  This WTF sounds like a hardware problem, not a software problem.  Somebody should give those sysadmins a good talking-to...

    I think this is a good WTF in that it outlines how stupid companies are when they purchase technology.  This is a WTF at the executive level for sure.

    The second WTF sits on the shoulders of the external company that made the CMS.  They are acting just like Microsoft, creating layer over layer over layer... until all the CPU power, memory, and bandwidth is used up.  Not only does .NET 2.0 already have a shitload of overhead, they added even more!

  • anon (unregistered) in reply to Kev777
    Kev777:
    Anonymous:

    Who evaluated the tool before the purchase?  Why didn't they test it from the remote site before shelling out the money to buy the system?  I bet some VP made the decision based on a sales demo.

    It's a known trick NOT to include technical people in sales meetings, especially when you are trying to sell technology.

    I recall working for a very large company that purchased a CMS web portal for 2.5 million!  It had all kinds of funky 'Gadgets'.  They sold it rather quickly and the big Mr M was happy that he could publish his news articles over the intranet.  The web developers in house didn't even know about the purchase.  They just had to support it.  Basically the CMS was a pile of junk that one of the developers could have made in 2 months.  Everyone on the dev team was like 'Pay me 2 million and see what kind of CMS you get!'

    This is just another example of how companies rip themselves off because they know nothing about technology or how to use their own people. 

    I'll bet Bryan got a big FAT raise for wasting the company's money on something they could have developed in house.

    Just because you could have developed something in-house does not mean you should have.  There are 1000 CMS systems out there that come with support, patches, upgrades, etc etc...  It is the responsibility of the company to decide whether the price beats out doing all of the above in house or not.

    When would the in-house employees have time to whip up a full-featured CMS?  Do they not have enough 'core business' tasks to do?

  • (cs) in reply to mc

    Anonymous:
    The only WTF here is that the system administrator couldn't figure out how to add gzip compression to the web server.

    Yes, AJAX is pretty damn verbose (the same problem you'll encounter with SOAP) but there are things you can do to help remedy the situation. It doesn't take a genius to realize the amount of repeating data in XML lends itself very well to compression. Every major browser supports it with zero configuration.

    We achieve about 95-98% compression ratio with XML from our AJAX and SOAP calls. That means that 600k is only about 30k transferred between the client and the server.

    I love how technology solves one problem and then creates another so that another solution can be implemented.  It is an endless cycle of idiocy.

  • Anonymous (unregistered) in reply to mc
    Anonymous:
    The only WTF here is that the system administrator couldn't figure out how to add gzip compression to the web server.

    Yes, AJAX is pretty damn verbose (the same problem you'll encounter with SOAP) but there are things you can do to help remedy the situation. It doesn't take a genius to realize the amount of repeating data in XML lends itself very well to compression. Every major browser supports it with zero configuration.

    We achieve about 95-98% compression ratio with XML from our AJAX and SOAP calls. That means that 600k is only about 30k transferred between the client and the server.
    Just FYI, JavaScript is not synonymous with AJAX. So many people think they're the same thing, but AJAX is a little more involved. Shows how much you know, dude.
  • (cs) in reply to Anonymous
    Anonymous:
    Anonymous:

    Sounds like they are devout believers in Rube Goldberg design.

    Whoopi Goldberg design? Never heard of it. Does that involve a Ted Danson interface?

    Ted Danson!  I've been trying to remember his name for days now.  Thank you.

  • (cs) in reply to Anonymouse
    Anonymous:
    Why is this a WTF? It just seems like a webapp that's more complicated than it needs to be, where the designers never considered the possibility of dialup access. Shitty design to be sure, but WTF worthy?


    Any CMS that has to send out a megabyte to display the 11-byte string "Hello World" is a WTF.

    Sure, it might work acceptably for a single user on a LAN, but such high volumes of data transfer are a waste of bandwidth (meaning a handful of users could saturate the LAN) and dramatically reduce the number of pages per second the CMS can serve.  Apache alone could serve a basic "Hello World" page thousands of times a second.  Layer this CMS on top, and all of a sudden you're lucky to serve 10 pages a second.

    Yeah, that's a WTF.
  • Anonymous (unregistered) in reply to Oscar L
    Oscar L:
    Anonymous:
    Anonymous:

    Sounds like they are devout believers in Rube Goldberg design.

    Whoopi Goldberg design? Never heard of it. Does that involve a Ted Danson interface?

    Ted Danson!  I've been trying to remember his name for days now.  Thank you.

    Yeah, that show Becker is actually pretty funny.
  • anon (unregistered) in reply to richdiggins

    I don't believe it either.  300KB-600KB is a lot, but it was almost certainly text (thus easily compressible).  Enable compression and you should have been able to get much better performance than using a terminal server, even over dial-up.

    Perhaps they were really just using crappy machines, and it was therefore browser sluggishness that was the problem?  (Completely plausible with that much JavaScript.)

  • (cs) in reply to anon
    Anonymous:
    Kev777:
    Anonymous:

    Who evaluated the tool before the purchase?  Why didn't they test it from the remote site before shelling out the money to buy the system?  I bet some VP made the decision based on a sales demo.

    It's a known trick NOT to include technical people in sales meetings, especially when you are trying to sell technology.

    I recall working for a very large company that purchased a CMS web portal for 2.5 million!  It had all kinds of funky 'Gadgets'.  They sold it rather quickly and the big Mr M was happy that he could publish his news articles over the intranet.  The web developers in house didn't even know about the purchase.  They just had to support it.  Basically the CMS was a pile of junk that one of the developers could have made in 2 months.  Everyone on the dev team was like 'Pay me 2 million and see what kind of CMS you get!'

    This is just another example of how companies rip themselves off because they know nothing about technology or how to use their own people. 

    I'll bet Bryan got a big FAT raise for wasting the company's money on something they could have developed in house.

    Just because you could have developed something in-house does not mean you should have.  There are 1000 CMS systems out there that come with support, patches, upgrades, etc etc...  It is the responsibility of the company to decide whether the price beats out doing all of the above in house or not.

    When would the in-house employees have time to whip up a full-featured CMS?  Do they not have enough 'core business' tasks to do?

    Pay your in-house employees 2.5 million and see what kind of CMS they give you!  I mean, a CMS is a common web app that has been done over and over and over again.

    I mean, 2.5 million is enough to feed all the starving children in Elbonia. :)  You don't go and waste that much money just so your CEO can post his emails on the corporate intranet.

    I'll bet that new CMS that Bryan put in place won't even be used that often.

  • mc (unregistered) in reply to Anonymous
    Anonymous:
    Anonymous:
    The only WTF here is that the system administrator couldn't figure out how to add gzip compression to the web server.

    Yes, AJAX is pretty damn verbose (the same problem you'll encounter with SOAP) but there are things you can do to help remedy the situation. It doesn't take a genius to realize the amount of repeating data in XML lends itself very well to compression. Every major browser supports it with zero configuration.

    We achieve about 95-98% compression ratio with XML from our AJAX and SOAP calls. That means that 600k is only about 30k transferred between the client and the server.
    Just FYI, JavaScript is not synonymous with AJAX. So many people think they're the same thing, but AJAX is a little more involved. Shows how much you know, dude.



    It's true that nobody came right out and said AJAX, but with the words "Web 2.0" and the following quote:

    JavaScript that dynamically loaded XML that was dynamically transformed into proprietary commands that were parsed to dynamically execute JavaScript to dynamically load content.

    I think the read-between-the-lines rule applies. I don't know the details of the application. It may have been unnecessarily complicated. I have no idea because I don't have the code. All I know is that the core problem seems to be the verbosity of XML. That's a pretty damn easy problem to solve.
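
    To make the "verbosity" point concrete, a typical AJAX payload looks something like this (made-up data, just to show the shape):

        <rows>
          <row><id>1</id><name>Foo</name><status>active</status></row>
          <row><id>2</id><name>Bar</name><status>active</status></row>
          <!-- ...hundreds more rows, identical except for a few characters... -->
        </rows>

    Nearly every byte is a repeated tag name, which is exactly the kind of input gzip eats for breakfast; the actual data underneath is tiny.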

  • Jimmie Jay (unregistered) in reply to Kev777

    The real WTF here is that someone worked AJAX into the discussion. If you re-read the original post, there's no mention of it. Stay on topic!

    P.S. If working with AJAX makes you feel hardcore, then I feel sorry for you. That's just sad.

  • (cs) in reply to Jimmie Jay
    Anonymous:


    P.S. If working with AJAX makes you feel hardcore, then I feel sorry for you. That's just sad.


    I don't feel hardcore because I just completed my first hand-rolled AJAX implementation.

    I feel hardcore because I have appeared in numerous porno movies.
  • richdiggins (unregistered) in reply to mc
    Anonymous:
    The only WTF here is that the system administrator couldn't figure out how to add gzip compression to the web server.

    Yes, AJAX is pretty damn verbose (the same problem you'll encounter with SOAP) but there are things you can do to help remedy the situation. It doesn't take a genius to realize the amount of repeating data in XML lends itself very well to compression. Every major browser supports it with zero configuration.

    We achieve about 95-98% compression ratio with XML from our AJAX and SOAP calls. That means that 600k is only about 30k transferred between the client and the server.


    Exactly, this is what I was getting at in my previous post...

    compression

    'cause we all know that text compresses down very small... especially text with patterns...
