I am trying to quantify "site slowness". In the olden days you just made sure that your HTML was lightweight, your images were optimized, and your servers weren't overloaded. On high-end sites built on top of modern content management systems there are a lot more variables: third-party advertising, trackers and various other callouts, CDN performance (interestingly enough, content delivery networks sometimes make things worse), JavaScript execution, CSS overload, as well as all kinds of server-side issues like long-running queries.
The obvious answer is for every developer to clear the cache and continuously look at the Net panel of the Firebug plugin. What other ways of measuring "site dragging ass" have you used?
-
deadprogrammer : right! totally forgot about that one
kohlerm : only works for firefox
ScottD : I love the ZEN of this answer.
From johnstok
-
If it's ASP.NET you can use Trace.axd.
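As a minimal sketch (the attribute values here are just illustrative), tracing is switched on in web.config, after which the collected requests are viewable at /Trace.axd:

<configuration>
  <system.web>
    <!-- enabled turns tracing on; localOnly restricts Trace.axd to the local machine -->
    <trace enabled="true" requestLimit="40" localOnly="true" />
  </system.web>
</configuration>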
Yahoo provides YSlow, which can be great for checking JavaScript.
From dove -
Well, call me old-fashioned, but...
time curl -L http://www.example.com/path
in Linux :) Other than that, I'm a big fan of YSlow, as previously mentioned.
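And if you want to see where that curl time actually goes, curl can break the total down itself. A minimal sketch using curl's built-in -w timing variables (the URL is a placeholder):

# -s hides the progress meter, -o /dev/null discards the response body
curl -L -s -o /dev/null \
  -w 'dns: %{time_namelookup}s  connect: %{time_connect}s  first byte: %{time_starttransfer}s  total: %{time_total}s\n' \
  http://www.example.com/path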
Hugo : This adds the time of DNS look ups to your measurement.
kohlerm : but does not take rendering time into account, nor simulates parallel requests
From f4nt -
"Page Load time" is really not easy to define in general. It depends on the browser you use, because different browsers may do more requests in parallel, because javascript has differents speeds in different browsers and because rendering time is different.
Therefore you can only really measure your true page load time using the browser you are interested in. The end of the page load can also be difficult to define because there might be an Ajax request after everything is visible on the page. Does that count the the page load or not?
And last but not least the real page load time might not matter that much because the "perceived performance" is what matters. For the user what matters is when sHe has enough information to proceed
I'm not aware of any way (at least no I could tell you :] ) that would automatically measure your pages perceived load time.
Use AOL Pagetest for IE and YSlow for firefox (link see above) to get a "feeling" for you load time.
From kohlerm -
Has anybody tried analyzing the server's logs to get page load times? Is that even possible? I have been thinking about it lately... but I am not sure. I would appreciate any advice (pro or con).
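The logs can at least give you the server-side portion. A sketch, assuming an Apache access log whose LogFormat ends with %D (time taken to serve the request, in microseconds); note this says nothing about rendering or third-party callouts:

# Last field assumed to be %D (request service time in microseconds)
awk '{ total += $NF; n++; if ($NF > max) max = $NF }
     END { printf "requests: %d  avg: %.1f ms  worst: %.1f ms\n", n, total/n/1000, max/1000 }' access.log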
-
Get yourself a proper debugging proxy installed (I thoroughly recommend Charles).
Not only will you be able to see a full breakdown of response times / sizes, you can also save the data for later analysis / comparison, as well as fiddle with the requests / responses, etc.
(Edit: Charles' support for debugging SOAP requests is worth the pittance of its shareware fee - it's saved me a good half a day of hair-loss this week alone!)
From iAn -
Apache Benchmark (ab). Use
ab -c <number of CPUs on server> -n 1000 url
to get a good approximation of how fast your page is.
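As a concrete invocation (the URL is a placeholder; ab wants an explicit path, even if it's just /):

# 1000 requests, 4 at a time; -k reuses connections via HTTP keep-alive
ab -k -c 4 -n 1000 http://www.example.com/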
From bh213 -
In Safari, the Network Timeline (available under the Develop menu, which you have to specifically enable) gives useful information about loading time of individual page components, as well as showing when each component started loading.
From TimB -
Firebug, the must-have Firefox extension for web developers, can measure the loading time of different elements on your webpage. At least you can rule out CSS, JavaScript, and other elements taking too much time to load.
If you do need to shrink JavaScript and CSS loading times, there are various JavaScript and CSS compressors out there on the web that simply strip unnecessary text out of them, like newline characters and comments. Of course, keep an ordinary version on the side for development's sake.
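For example, with the YUI Compressor (the jar name depends on the version you download; .css files work the same way):

# Strips comments and extra whitespace from the script
java -jar yuicompressor.jar site.js -o site.min.js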
If you use PNGs, I recently came across OptiPNG, a PNG optimizer that can shrink PNG sizes.
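Usage is a one-liner; note that it rewrites the file in place, so keep your originals around:

# -o5 is a fairly thorough optimization level (0-7); higher levels are slower
optipng -o5 logo.png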
Liam : The 'Net' panel in Firebug is excellent for this.
From Ido Schacham -
YSlow as mentioned above.
And combine this with Fiddler. It is good if you want to see which page objects are taking the most bandwidth, which are being compressed at the server, where there are unexpected round-trips, and what is being cached. It can also give you a general idea of processing time in the client web browser as compared to the time taken between server and client.
From iamdudley -
YSlow is good, and HttpWatch for IE is great as well. However, both miss the metric that matters most to a user: "When is the page, above the fold, ready for the user to use?" I don't think that one has been solved yet...
From Jilles -
Last time I worked on a high-volume website, we did several things, including:
- We used YSlow to get an analysis of the individual factors affecting page load: https://addons.mozilla.org/en-US/firefox/addon/5369
- We monitored performance using an external commercial tool called Gomez: http://www.gomez.com/instant-test-pro/
- We stress tested from a continuous integration build using Apache JMeter (http://jakarta.apache.org/jmeter/); a sketch of a non-GUI run follows this list.
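For the JMeter runs, non-GUI mode is what you want from a build script. A sketch, assuming a test plan saved as loadtest.jmx (the file names are placeholders):

# -n = non-GUI mode, -t = test plan to run, -l = file to log sample results to
jmeter -n -t loadtest.jmx -l results.jtl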
If you want a quick look, say a first approximation, I'd go with YSlow and see what the major factors affecting page load time in your app are.
From cartoonfox