Programming IT Technology

Ajax Performance Analysis 36

IBM Developerworks' latest was submitted to us by an anonymous reader who writes "Using Firebug and YSlow, you can thoroughly analyze your Web applications to make educated changes to improve performance. This article reviews the latest tools and techniques for managing the performance of Ajax applications along the life cycle of your application, from inception through production."
This discussion has been archived. No new comments can be posted.

Ajax Performance Analysis

Comments Filter:
  • by davecb ( 6526 ) * <davecb@spamcop.net> on Saturday May 03, 2008 @11:54AM (#23284550) Homepage Journal

    A good review and counter-argument is available at the codinghorror blog [codinghorror.com], where Jeff Atwood points out that Yahoo's Problems Are Not Your Problems.

    --dave

    • by davecb ( 6526 ) *

      Bother, I meant Yahoo's Problems Are Not Your Problems [codinghorror.com]

    • by Bogtha ( 906264 ) on Saturday May 03, 2008 @12:18PM (#23284708)

      Hmm, I'm not so sure about some of that. For instance:

      Yahoo recommends turning ETags off because they cause problems on server farms due to the way they are generated with machine-specific markers. So unless you run a server farm, you should ignore this guidance. It'll only make your site perform worse because the client will have a more difficult time determining if its cache is stale or fresh. It is possible for the client to use the existing last-modified date fields to determine whether the cache is stale, but last-modified is a weak validator, whereas Entity Tag (ETag) is a strong validator. Why trade strength for weakness?

      Because "strength" isn't anything particularly important here. The difference between strong and weak validators is that a strong validator is supposed to change even if only minor alterations take place (e.g. spelling mistakes), while a weak validator can remain the same if minor changes take place.

      In practice, I've never seen anybody make a distinction like this for websites/web applications. If anybody did bother, then weak validators would be more efficient, as they would have a better cache hit ratio. For all intents and purposes, there is no difference between a strong and a weak validator. But what you are doing is computing and transmitting useless ETag headers with every single request you serve, so it is beneficial to turn them off, even if you don't have a server farm. Last-Modified is good enough for practically everybody. If Last-Modified isn't good enough for your purposes, then you don't need to be told to switch ETags back on, you know what you are doing.

      All you're really saving here is the cost of the client pinging the server for a new version and getting a 304 not modified header back in the common case that the resource hasn't changed. That's not much overhead.. unless you're Yahoo.

      Well, how much overhead it is really depends on what you are doing. If it's expensive to figure out whether the resource has changed, you don't want to incur that expense often; naïve dynamic stylesheet implementations, for example, can noticeably slow down a site. And remember, even if you have a fast server, that doesn't mean your users have a fast connection, and going back to ask the server whether things have changed for a dozen resources on the page, when you know for a fact you don't change them for months at a time, is ridiculous.

      His basic point, that you shouldn't take Yahoo's advice as gospel, is a good one, but you shouldn't automatically assume that efficiency only benefits the giants.
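
      Bogtha's revalidation point, as a minimal sketch. The server stack is an assumption (Node's built-in http module, which nobody in the thread mentions); the idea is simply Last-Modified plus a 304 answer, with no ETag header at all:

      // Sketch: one resource served with Last-Modified only, no ETag.
      var http = require('http');

      var body = '/* a stylesheet that changes every few months */';
      var lastModified = new Date(2008, 3, 1).toUTCString();  // 1 Apr 2008

      http.createServer(function (req, res) {
        var since = req.headers['if-modified-since'];

        if (since && new Date(since) >= new Date(lastModified)) {
          res.writeHead(304);   // client's cached copy is still fresh
          res.end();
          return;
        }

        res.writeHead(200, {
          'Content-Type': 'text/css',
          'Last-Modified': lastModified,
          // A far-future Expires header avoids even the 304 ping for
          // resources you know you won't touch for months.
          'Expires': new Date(Date.now() + 30 * 86400 * 1000).toUTCString()
        });
        res.end(body);
      }).listen(8080);

      Whether dropping the ETag actually matters is exactly the dispute above; the 304 revalidation itself still costs one round trip per resource either way.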

      • by DavidTC ( 10147 )

        You're right about ETags. ETags are useful when you have cached, dynamically-generated content and can't use Last-Modified. Well, you could use it, but faking it properly is more complicated than just running the end result through crc32() (roughly the idea sketched below).

        Anything else and ETags are just dumb. OTOH, they're not actually slowing anything down for the end user.

        OTOH, YSlow's still stupid here, because it makes some crazy comment about server farms. Look, Yahoo, I don't know who you think is using this software, but
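
        Roughly the thing described above, as a sketch: hash the generated markup and use that as the validator, so the value doesn't depend on which machine in a cluster served the response. MD5 via Node's crypto module stands in for the crc32() mentioned above, and respond() is a made-up helper, not part of any framework:

        // Sketch: an ETag derived from the content itself is identical on
        // every server in the farm, unlike Apache's inode-based default.
        var crypto = require('crypto');

        function contentETag(html) {
          return '"' + crypto.createHash('md5').update(html).digest('hex') + '"';
        }

        function respond(req, res, html) {
          var etag = contentETag(html);
          if (req.headers['if-none-match'] === etag) {
            res.writeHead(304);   // nothing changed since the client cached it
            res.end();
            return;
          }
          res.writeHead(200, { 'Content-Type': 'text/html', 'ETag': etag });
          res.end(html);
        }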

        • by Rob Kaper ( 5960 )

          Look, Yahoo, I don't know who you think is using this software, but 99.999% of the people using this outside of you guys do not have 'server farms'.

          Actually, 99.999% of websites probably do not run on a dedicated server but on a shared host - most of them using some sort of cluster solution. Which is precisely why ETags won't work unless you turn off the server defaults so you can generate your own using server-independent data.

          They think normal websites should be using CDNs? Are they on drugs?

          Stop dissing

  • by davecb ( 6526 ) * <davecb@spamcop.net> on Saturday May 03, 2008 @12:02PM (#23284604) Homepage Journal

    It's actually useful to break the response time out into three parts (a rough client-side way to measure two of them is sketched after this comment):

    1. Round-trip time, the latency from the network.

    2. Transfer time, the time from receiving the first byte of the page until the last byte arrives. This varies greatly with page size, and is the time you use for KB/s calculations as well.

    3. Latency proper, the time between sending the request and receiving the first byte of the page. This is the time that grows during an overload, and the one capacity planners feed into queuing models to see how much the server will slow down under load.

    --dave (a capacity planner) c-b
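
    A rough way to see two of those pieces from the browser side (a sketch only; a plain XMLHttpRequest can't isolate the pure network round-trip, and readyState 2 is only an approximation of "first byte"):

    // Sketch: approximate latency (time to first byte) and transfer time
    // for a single resource, logged to the Firebug console if present.
    function timeResource(url) {
      var xhr = new XMLHttpRequest();
      var sent, firstByte;

      xhr.open('GET', url, true);
      xhr.onreadystatechange = function () {
        if (xhr.readyState === 2 && !firstByte) {
          firstByte = new Date().getTime();          // headers have arrived
        } else if (xhr.readyState === 4) {
          var done = new Date().getTime();
          if (!firstByte) { firstByte = done; }      // readyState 2 never fired
          if (window.console) {
            console.log(url + ': latency ~' + (firstByte - sent) + 'ms, ' +
                        'transfer ~' + (done - firstByte) + 'ms');
          }
        }
      };
      sent = new Date().getTime();
      xhr.send(null);
    }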

  • The author appears to endorse the use of innerHTML as opposed to DOM manipulation.

    I suppose it has its place in a performance analysis, but mention might be made that it's just not worth trading off standards compliance and future-proofing. When I see innerHTML being manipulated, I assume the designer didn't know what he was doing.


    • <html>

      <body>

      <script type="text/javascript">
      // The DOM-only way to say "Hello World!" in big bold Verdana.
      var theDiv = document.createElement("div");

      var theFont = document.createElement("font");
      theFont.style.fontFamily = "verdana";
      theFont.style.fontSize = "24px";
      theFont.style.fontWeight = "700";

      theFont.innerHTML = "Hello World!";

      theDiv.appendChild(theFont);
      document.body.appendChild(theDiv);
      </script>

      </body>
      </html>

      • Re: (Score:3, Informative)

        theFont.appendChild(document.createTextNode('Hello World'));

        If it is supposed to be text, createTextNode will properly handle &, <, etc, whereas innerHTML won't.
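
        A small illustration of the difference (not from the article): the same string ends up as literal text in one case and parsed markup in the other:

        var s = 'Fish & Chips <b>cheap</b>';

        var asText = document.createElement('div');
        asText.appendChild(document.createTextNode(s));
        // asText shows the characters literally: Fish & Chips <b>cheap</b>

        var asMarkup = document.createElement('div');
        asMarkup.innerHTML = s;
        // asMarkup parses the string: "cheap" comes out bold, and the bare
        // & really ought to have been &amp;

        document.body.appendChild(asText);
        document.body.appendChild(asMarkup);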

    • The purpose of the article was improving performance and response times. That means trading away practices one would otherwise follow in order to optimize for what works best. While DOM manipulation may be the standards-compliant way of doing it, it is dramatically slower than innerHTML [quirksmode.org]. It is only recently that Safari and Opera have been able to increase the speed of DOM manipulation [hedges.name], but sadly IE and Firefox (the two most used browsers) still show better results for innerHTML. In my own testing Firefox 3 was closer to the Ope
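
      The usual shape of that trade-off, as a generic sketch (this is not the quirksmode benchmark code): build one string and assign it once, versus touching the document for every node:

      // innerHTML route: one concatenation pass, one assignment.
      function fillTableFast(container, rows) {
        var html = ['<table><tbody>'];
        for (var i = 0; i < rows.length; i++) {
          html.push('<tr><td>' + rows[i] + '</td></tr>');
        }
        html.push('</tbody></table>');
        container.innerHTML = html.join('');
      }

      // Standards-compliant route: one createElement/appendChild per node,
      // which is what the linked benchmarks show to be slower in IE and Firefox.
      function fillTableDom(container, rows) {
        var table = document.createElement('table');
        var tbody = document.createElement('tbody');
        for (var i = 0; i < rows.length; i++) {
          var tr = document.createElement('tr');
          var td = document.createElement('td');
          td.appendChild(document.createTextNode(rows[i]));
          tr.appendChild(td);
          tbody.appendChild(tr);
        }
        table.appendChild(tbody);
        container.appendChild(table);
      }

      Note that the innerHTML version trusts rows[i] to be already-escaped markup, which is exactly the caveat raised in the createTextNode reply above.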

  • by truthsearch ( 249536 ) on Saturday May 03, 2008 @01:06PM (#23284966) Homepage Journal
    I recently did a performance analysis of a complex site I've worked on for over a year. My primary tool was Firebug. While the size of the HTML + CSS + JS per page is pretty large, it turns out that compressing them and setting cache headers yielded only a small performance improvement. The execution of JS like Scriptaculous actually accounts for more than 50% of the time it takes to render the pages. Since we only use Scriptaculous for drag-and-drop, we're considering alternatives like MooTools or custom code. Loading as much JS as possible from the bottom of the page instead of the head can help (one way to do that is sketched below), but isn't always an option. Focusing on CSS and JS performance has now made a huge improvement in perceived site performance.

    During the initial development we never considered that simply loading one JS library (even when it's not used for inserting HTML) could slow down page rendering that much. On JS or CSS heavy sites, client-side loading, rendering, and runtime execution can easily account for 50% to 90% of the time it takes to see a final page. So while I've usually focused entirely on server side performance, I now know to pay more attention to the speed of client side rendering. Lesson learned.
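
    One common way to do the "load it late" part (a sketch; the filename is a placeholder, and this is not necessarily how the parent's site does it): create the script element after the page has rendered, so the library's download doesn't block parsing:

    // Sketch: pull in a heavy library after the initial render instead of
    // referencing it in <head>.
    function loadScriptLate(src, callback) {
      var s = document.createElement('script');
      var done = false;
      s.type = 'text/javascript';
      s.src = src;
      s.onload = s.onreadystatechange = function () {
        if (done) { return; }
        if (!this.readyState || this.readyState === 'loaded' ||
            this.readyState === 'complete') {
          done = true;                       // guard against double firing
          if (callback) { callback(); }
        }
      };
      document.getElementsByTagName('head')[0].appendChild(s);
    }

    window.onload = function () {
      loadScriptLate('/js/scriptaculous.js', function () {
        // wire up drag-and-drop only once the library has arrived
      });
    };
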
  • by hobo sapiens ( 893427 ) on Saturday May 03, 2008 @01:19PM (#23285078) Journal
    I use AJAX all the time. Firebug is an essential tool for me. It is probably one of my most important web dev tools. You can see all server requests. Even if you don't do AJAX, that is useful. Also, the inspect option on Firebug allows you to make CSS changes without committing to the server. Without Firebug, I'd never be able to have the same insight into my pages.

    YSlow is worthless for me. Where I work, I do web development on the intranet. I do not configure the servers and don't really even have the ability to do so. On the other end, any stuff I might do on the internet will most likely be hosted by some hosting company. Many of the things that YSlow flags are server config items, not *code* items. Sure, that has its place, but if you are a web developer (not an SA), then YSlow gives you a bunch of useless info and a low grade if your servers are configured a certain way. Ironically, it comes from Yahoo, the masters of the bloated web page. Come to think of it, I should probably get around to uninstalling it instead of just leaving it disabled.
  • NoScript (Score:3, Insightful)

    by slashgrim ( 1247284 ) on Saturday May 03, 2008 @02:04PM (#23285334) Journal
    All websites should have an option to run without fancy JavaScript (and still be fully functional!). It makes no sense that a web site should bog down my 400MHz PDA.
    • I'll second that. And add a no-Flash option too. God, I hate it, mostly because Flash designers want to use it on everything.
    • Re: (Score:1, Interesting)

      by Anonymous Coward
      I concur. This is all the more reason to use unobtrusive JavaScript. Same great functionality with more compatibility. I just wish more sites would learn how this works.
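
      A tiny example of the unobtrusive approach (an illustration; the filenames and ids are made up): the link works with scripting disabled, and the Ajax behaviour is layered on only if the script actually runs:

      <!-- With JavaScript off, the link simply navigates to comments.html -->
      <a id="show-comments" href="comments.html">Show comments</a>
      <div id="comments"></div>

      <script type="text/javascript">
      window.onload = function () {
        var link = document.getElementById('show-comments');
        if (!link || !window.XMLHttpRequest) { return; }  // graceful fallback
        link.onclick = function () {
          var xhr = new XMLHttpRequest();
          xhr.open('GET', link.href, true);
          xhr.onreadystatechange = function () {
            if (xhr.readyState === 4 && xhr.status === 200) {
              document.getElementById('comments').innerHTML = xhr.responseText;
            }
          };
          xhr.send(null);
          return false;                                   // suppress navigation
        };
      };
      </script>
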
  • I'm well aware of the thankless, unforgiving nature of the open web, so thanks for the link to the article :-)
  • Depends ... (Score:3, Funny)

    by ScrewMaster ( 602015 ) on Saturday May 03, 2008 @03:59PM (#23286046)
    Ajax Performance Analysis

    All I know is, my floors have a nice shine.
  • Talk about making your browser crawl like a turtle: the scripts on that site time out constantly. It's even worse than Yahoo; at least on Yahoo the scripts never time out. I have the same issue with Reddit and other such sites. Not everyone has a 10+ Mbit connection running on a quad-core machine with 4 GB of RAM. Web devs seem to forget this more than any other class of programmer. Just because it runs fine on your 100+ Mbit connection and Intel Mac at work doesn't mean it runs that way anywhere else.
  • by TLLOTS ( 827806 ) on Saturday May 03, 2008 @06:36PM (#23286858)
    While I love Firebug and use it daily for web development, you shouldn't trust its net tab too much. I've found numerous instances where it grossly understated load times: for example, a very PHP-heavy page that Firebug claimed was loading in 13ms, while server-side performance testing showed the load time was actually over 1 second. I've found Safari's new Web Inspector tools to be much more accurate in this regard, so I'd recommend using them for performance testing if you suspect Firebug isn't telling you everything.
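
    A cheap cross-check of Firebug's numbers from inside the same browser (a sketch; it measures the whole request as JavaScript sees it, so it won't match server-side timings exactly either):

    // Sketch: time one request independently of the net tab and compare.
    function checkLoadTime(url) {
      var start = new Date().getTime();
      var xhr = new XMLHttpRequest();
      xhr.open('GET', url, true);
      xhr.onreadystatechange = function () {
        if (xhr.readyState === 4) {
          var elapsed = new Date().getTime() - start;
          if (window.console) { console.log(url + ' took ' + elapsed + 'ms'); }
        }
      };
      xhr.send(null);
    }
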
  • I'm a bit of a noob to AJAX, but/and getting good performance on data-heavy AJAX pages has been a real challenge. I'm specifically working on a data grid with 1000+ rows using Ext. I've gotten it to be acceptable via techniques like behind-scenes paging, direct innerHTML modification instead of relying on library grid redraws, etc etc. I haven't tried simply rendering the grid with XML->XSLT, though (the grid widget is awful nice ...)

    It's probably unrealistic to want a web page to present a christmas-tre
    • by achacha ( 139424 )
      I feel your pain. We are using Ext, and while it's a wonderful toolkit, the JSON handling is quite time-consuming on both server and client, time that could be spent doing something more useful like rendering the actual content. In some places I've resorted to sending actual HTML snippets and using innerHTML to "insert" them directly without relying on Ext/JSON. For non-performance-critical tasks it's a pretty nice library.
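
      The shape of that approach, as a sketch (the function name and URL are made up, and real code would likely go through Ext's own Ajax helpers): the server returns finished markup and the client just drops it in:

      // Sketch: skip JSON decoding and grid re-rendering; request a ready-made
      // HTML fragment and insert it as-is.
      function loadPanelHtml(panelId, url) {
        var xhr = new XMLHttpRequest();
        xhr.open('GET', url, true);
        xhr.onreadystatechange = function () {
          if (xhr.readyState === 4 && xhr.status === 200) {
            document.getElementById(panelId).innerHTML = xhr.responseText;
          }
        };
        xhr.send(null);
      }

      // e.g. loadPanelHtml('order-grid', '/orders/page?start=0&limit=50');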
