Speed optimisation

testing and improving your bitweaver install

Created by: dspt, Last modification: 31 May 2010 (12:05 UTC) by Tochinet

Introduction

Bitweaver has the same issue as any big modular CMS: its pages are heavy, because it is very powerful and has many features. Back in April 2006, when this page was created, the statistics for bitweaver's main wiki page, analyzed by the online speed report service of websiteoptimization.com, were:
Total HTTP Requests:   31
Total Size:            103767 bytes
HTML:                  7347 bytes
HTML Images:           24920 bytes
CSS Images:            13273 bytes
Total Images:          38193 bytes
Javascript:            58227 bytes
Download @ 56K:        20.68 seconds
Download @ ISDN 128K:  6.33 seconds


By 31 May 2010, this had become:
Object type    Size (bytes)   Download @ 56K (seconds)   Download @ T1 (seconds)
HTML           32249          6.63                       0.37
HTML Images    22854          7.35                       2.92
CSS Images     38629          8.50                       1.00
Total Images   61483          15.85                      3.92
Javascript     71186          15.19                      1.38
CSS            1096           0.42                       0.21


with some further qualitative analysis: there are green points, such as the use of Gzip and having only one HTML and one CSS file, but there are too many images (18) and too much Javascript (71k). Of course, nobody still uses 56k links today, but even at T1 speed the calculated load time is about 6 seconds (yes, I too get more than 9 by simply adding the columns, go figure). So one recommendation would be to serve the Javascript packages gzipped as well (a sketch follows below). Adding width and height attributes to all images should also make rendering faster.
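One way to act on that recommendation, assuming Apache 2 with mod_deflate available and .htaccess overrides allowed (both are assumptions, not part of the original report):

    # compress Javascript and CSS responses on the fly; bitweaver's own
    # gzip option already covers the generated HTML
    <IfModule mod_deflate.c>
        AddOutputFilterByType DEFLATE application/javascript text/javascript text/css
    </IfModule>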


What could be done?

eliminate extra size

  • Jakob Nielsen is positive that we should achieve a sub-8-second load time for users to enjoy navigating our site. Another reason for speed optimisation is reducing the load on sites with heavy traffic. Looking at the report above, it is obviously a rather impossible task to scale 120 kilobytes (the analysis omits a further 20 kilobytes of CSS) down to the 30 kilobytes needed for a sub-8-second load, but we might want to get closer to that usability ideal.

eliminate extra HTTP requests

  • Reading the analyzer's fine print: "Modem connections (56Kbps or less) are corrected by a packet loss factor of 0.7. All download times include delays due to round-trip latency with an average of 0.2 seconds per object. With 31 total objects for this page, that computes to a total lag time due to latency of 6.2 seconds." In the example above, we are losing more than 6 seconds just on HTTP request latency, so there's no chance of a sub-8-second response :(
  • "The less requests the better", so before optimizing file size, think how to reduce quantity of files per page.
  • Think twice before you start "slicing" your images and backgrounds into small pieces in some Adobe ImageReady. That technique is an offspring of table-based layouts, whereas bitweaver currently has a div-based layout. IMHO, with a div-based layout you might get better results from 3 big background images instead of 30 small ones.
  • Also, some of the images can be eliminated without any damage to the style. There's a nice CSS trick for rollover images (if using graphic rollovers at all): the rollover states are stacked vertically in a single background file, and on :hover the vertical position of the background is changed in CSS (see the sketch after this list). The same can be done with very small images, like icons, external-link markers, etc. The catch is that shrinking a file below 1160 bytes gives no boost in load time, since it still occupies a single TCP/IP packet. So we can combine two small graphic files if their total size is still less than 1k: this way we cut off one packet and one HTTP request. (At this point, reading about how much effort is needed to save 0.2 seconds, all readers are confident I'm crazy and need psychiatric attention %/ )
  • Combine Javascript and CSS files: put all external CSS into one file. Fewer files means fewer HTTP requests.
  • Eliminate redundant markup and pack (minify) your custom CSS and Javascript files before uploading to the server, but keep a readable version locally for editing.
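As promised above, a minimal sketch of the rollover trick; the file name nav-sprite.png and the 20-pixel button height are illustrative, with the normal state drawn in the top half of the image and the hover state in the bottom half:

    a.navbutton {
        display: block;
        width: 100px;
        height: 20px;
        /* both states live in one file: one HTTP request, and the hover
           state is already cached when the user mouses over */
        background: url(nav-sprite.png) no-repeat 0 0;
    }
    a.navbutton:hover {
        /* shift the background up by one button height */
        background-position: 0 -20px;
    }

Because no new image is fetched on :hover, the rollover also appears instantly.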

Image optimisation


Other compression

  • Bitweaver's gzip function
  • Smarty's {strip}
  • Always set the height, width and alt attributes of images, so the browser can lay out the page before the images are loaded. The same goes for tables. (See the sketch after this list.)
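A small Smarty sketch combining the last two points; the file name and pixel sizes are made up, not taken from bitweaver's actual templates:

    {* {strip} removes inter-tag whitespace from the generated HTML; the
       explicit width/height/alt lets the browser reserve space for the
       image before logo.png has arrived *}
    {strip}
        <div class="logo">
            <img src="logo.png" width="200" height="50" alt="Site logo" />
        </div>
    {/strip}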

Comparisons

Web Page Analyzer (04/2006)

To compare bitweaver with other content management systems, pass their main pages through websiteoptimization.com's analyzer:
Link to page analyzed   Total HTTP Requests   Total Size     Load time on 56k modem   Remarks
bitweaver               31                    59368 bytes    11.83 seconds            20k CSS file is not calculated
Drupal                  51                    85049 bytes    17.95 seconds            0 (ZERO!) Javascript
mediawiki               29                    84136 bytes    18.37 seconds            30kb CSS in 8 files - seems unnecessary for such a basic layout
Xoops                   30                    131660 bytes   26.24 seconds            some broken links
TikiWiki                21                    189316 bytes   38.13 seconds            quite a lot of everything (http://tikiwiki.org/tiki-index.php?page=TikiWiki)
Joomla                  60                    200149 bytes   40.29 seconds            over 100k HTTP images!
Xaraya                  49                    191770 bytes   40.82 seconds            60k CSS in 14 files!
e107                    73                    238522 bytes   47.94 seconds            170k images! (which also means the actual frame is rather slick)


WebWait (11/2007)

webwait.com: Use WebWait to benchmark your website or test the speed of your web connection. Timing is accurate because WebWait pulls down the entire website into your browser, so it takes into account Ajax/Javascript processing and image loading which other tools ignore.
Link to page analyzed                              Average load time after 5 runs (seconds, call interval: 60 seconds)
Xaraya                                             0.94
Drupal                                             1.05
mediawiki                                          1.26
e107                                               2.03
Joomla                                             2.68
bitweaver.org/blogs                                2.80
bitweaver.org/wiki                                 3.26
bitweaver.org/articles                             3.61
http://tikiwiki.org/tiki-index.php?page=TikiWiki   3.95
Xoops                                              7.72

Comments

How about mysql queries and db

by greg, 17 Oct 2007 (09:05 UTC)
You focus a lot on download time; however, I'm bothered by the long time spent processing a page: each page takes more than 1s on an idle server (Apache2).
The extensive database with more than 100 tables reminds me of the phpNuke nightmare ...
You have 30 to 50 SQL queries for each page impression!
I think a lot of reserve for speed optimisation lies here ...

Re: How about mysql queries and db

by dspt, 24 Nov 2007 (10:21 UTC)
Webwait offers statistics for processing + download if caching is off. If caching is on, it becomes hard to measure the influence of processing time on the end-user experience.
For low-speed lines (and dialup especially), processing time is small enough in comparison to download times to ignore it.
For broadband lines, the overall processing + download time is below the attention threshold anyway, but DB optimisation is always good, of course.

Re: How about mysql queries and db

by laetzer, 18 May 2008 (11:13 UTC)
DB queries probably depend on the packages and side modules a site uses. In that case, enabling caching for side modules (admin > control layout...) should reduce queries. Maybe also using an opcode cache for PHP, or enabling Smarty's template cache, not sure.
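For reference, plain Smarty template caching looks roughly like the sketch below (Smarty 2 API; bitweaver wraps Smarty in its own objects, so where these flags would be set inside bitweaver is an assumption, and index.tpl is an illustrative name):

    <?php
    require_once 'Smarty.class.php';

    $smarty = new Smarty();
    $smarty->caching = 1;            // serve cached output when a valid cache exists
    $smarty->cache_lifetime = 3600;  // seconds before the cache is rebuilt

    // while the cache is valid, display() returns the stored output and
    // the page's DB queries are skipped
    $smarty->display('index.tpl');
    ?>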