Speed optimisation
Testing and improving your bitweaver install
Introduction
Bitweaver has the same issues as any big modular CMS: its pages are heavy, because it is very powerful and has many features. Back in April 2006, when this page was created, the statistics for bitweaver's main wiki page, analyzed by the online speed report service of websiteoptimization.com, were:
| Total HTTP Requests | 31 |
| Total Size | 103767 bytes |
| HTML | 7347 bytes |
| HTML Images | 24920 bytes |
| CSS Images | 13273 bytes |
| Total Images | 38193 bytes |
| Javascript | 58227 bytes |
| Download @ 56K modem | 20.68 seconds |
| Download @ 128K ISDN | 6.33 seconds |
By May 31, 2010, this had become:
| Object type | Size (bytes) | Download @ 56K (seconds) | Download @ T1 (seconds) |
| HTML | 32249 | 6.63 | 0.37 |
| HTML Images | 22854 | 7.35 | 2.92 |
| CSS Images | 38629 | 8.50 | 1.00 |
| Total Images | 61483 | 15.85 | 3.92 |
| Javascript | 71186 | 15.19 | 1.38 |
| CSS | 1096 | 0.42 | 0.21 |
along with some further qualitative analysis: there are green points, such as the use of Gzip and having only one HTML and one CSS file, but there are too many images (18) and too much Javascript (71k). Of course, nobody today still uses 56K links, but even at T1 speed the calculated load time is about 6 seconds (yes, I too get more than 9 by simply adding up the column, go figure). So one recommendation would be to Gzip the Javascript packages as well. Adding width and height attributes to all images should also make rendering faster.
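As a minimal sketch of the Gzip recommendation, assuming the site is served by Apache with mod_deflate available (an assumption about your hosting; bitweaver's own gzip option is mentioned under "Other compression" below), a .htaccess rule could compress Javascript and CSS responses too:

```apache
# Sketch only: requires Apache with mod_deflate enabled (check with your host).
<IfModule mod_deflate.c>
    # Compress text-based responses, including Javascript and CSS
    AddOutputFilterByType DEFLATE text/html text/css text/javascript application/javascript application/x-javascript
</IfModule>
```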
What could be done?
Eliminate extra size
- Jakob Nielsen is positive that we should achieve sub-8-second load times if users are to enjoy navigating our site. Another reason for speed optimisation is reducing the load on sites with heavy traffic. Given the report above, scaling 120 kilobytes (the analysis omits a 20-kilobyte CSS file) down to the roughly 30 kilobytes needed for a sub-8-second load is obviously not realistic, but we might want to get closer to that usability ideal.
Eliminate extra HTTP requests
- From the report: "Modem connections (56Kbps or less) are corrected by a packet loss factor of 0.7. All download times include delays due to round-trip latency with an average of 0.2 seconds per object. With 31 total objects for this page, that computes to a total lag time due to latency of 6.2 seconds." In the example above, we are losing more than 6 seconds just on HTTP requests, so there is no chance of a sub-8-second response :(
- "The less requests the better", so before optimizing file size, think how to reduce quantity of files per page.
- Think twice before you start "slicing" your images and backgrounds into small pieces in Adobe ImageReady. That technique is an offspring of table-based layouts, whereas bitweaver currently uses a div-based layout. IMHO, with a div-based layout you might get better results from 3 big background images than from 30 small ones.
- Also, some of the images can be eliminated without any damage to the style. There's a nice CSS trick for rollover images (if graphic rollovers are used at all): the rollover states are stacked vertically in a single background file, and on :hover the vertical position of the background is changed in CSS (see the sketch after this list). The same can be done with very small images, like icons and external-link markers. The point is that shrinking a file below 1160 bytes gives no further boost in load time, because it still occupies a single TCP/IP packet. So we can combine two small graphic files if their total size stays under about 1k; that cuts off one packet and one HTTP request. (At this point, reading about how much effort is needed to save 0.2 seconds, all readers are confident I'm crazy and need psychiatric attention %/ )
- Combine Javascript and CSS files: put all external CSS into one file; fewer files mean fewer HTTP requests.
- Eliminate redundant markup and pack (minify) your custom CSS and Javascript files before uploading them to the server, but keep a reader-friendly version locally for editing.
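A minimal sketch of the rollover trick mentioned above: both button states live in one image (stacked vertically), and :hover only shifts the visible part, so there is no second file and no extra HTTP request. The class name, file name and sizes here are made up for the example.

```css
/* buttons.png is assumed to be 120x60px: normal state in the top half,
   hover state in the bottom half. */
a.nav-button {
    display: block;
    width: 120px;
    height: 30px;
    background: url(buttons.png) 0 0 no-repeat;   /* normal state: top half */
}
a.nav-button:hover {
    background-position: 0 -30px;                 /* hover state: bottom half */
}
```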
Image optimisation
Other compression
- Bitweaver's gzip function
- Smarty's {strip} function, which removes the whitespace between template tags in the generated output
- Always set the height, width and alt attributes of images, so the browser can lay out the page before the images have loaded; the same applies to tables (see the template sketch after this list).
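A minimal sketch of both points in one Smarty template fragment, assuming a bitweaver-style template; the image path and sizes are illustrative only:

```smarty
{* {strip} removes the whitespace between the tags in the generated HTML,
   and explicit width/height/alt attributes let the browser reserve space
   before the image arrives. *}
{strip}
    <img src="img/icons/wiki.png" width="16" height="16" alt="wiki icon" />
{/strip}
```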
Comparisons
Web Page Analyzer (04/2006)
To compare bitweaver with other content management systems, pass their main pages through websiteoptimization.com's analyzer:

| Link to page analyzed | Total HTTP Requests | Total Size | Load time on 56K modem | Remarks |
| bitweaver | 31 | 59368 bytes | 11.83 seconds | 20k CSS file is not calculated |
| Drupal | 51 | 85049 bytes | 17.95 seconds | 0 (ZERO!) Javascript |
| mediawiki | 29 | 84136 bytes | 18.37 seconds | 30kb CSS in 8 files - seems unnecessary for such a basic layout |
| Xoops | 30 | 131660 bytes | 26.24 seconds | some broken links |
| http://tikiwiki.org/tiki-index.php?page=TikiWiki | 21 | 189316 bytes | 38.13 seconds | quite a lot of everything |
| Joomla | 60 | 200149 bytes | 40.29 seconds | over 100k HTTP images! |
| Xaraya | 49 | 191770 bytes | 40.82 seconds | 60k CSS in 14 files! |
| e107 | 73 | 238522 bytes | 47.94 seconds | 170k images! (which also means the actual frame is rather slick) |
WebWait (11/2007)
webwait.com: "Use WebWait to benchmark your website or test the speed of your web connection. Timing is accurate because WebWait pulls down the entire website into your browser, so it takes into account Ajax/Javascript processing and image loading which other tools ignore."

| Link to page analyzed | Average load time after 5 runs (seconds, call interval: 60 seconds) |
| Xaraya | 0.94 |
| Drupal | 1.05 |
| mediawiki | 1.26 |
| e107 | 2.03 |
| Joomla | 2.68 |
| bitweaver.org/blogs | 2.80 |
| bitweaver.org/wiki | 3.26 |
| bitweaver.org/articles | 3.61 |
| http://tikiwiki.org/tiki-index.php?page=TikiWiki | 3.95 |
| Xoops | 7.72 |
Comments
How about MySQL queries and the DB?
The extensive database with more than 100 tables reminds me of the phpNuke nightmare ...
You have 30 to 50 SQL queries for each page impression!
I think there is a lot of room here for speed optimisation ...
Re: How about MySQL queries and the DB?
For low-speed lines (and for dialup especially), processing time is small enough compared to download times to ignore it.
For broadband lines the overall processing + download time is below the attention threshold anyway, but DB optimisation is always good, of course.
Re: How about MySQL queries and the DB?