I’m sure that my performance tests will create some controversy, so let me say up front that I realize internet conditions and local connections greatly affect performance. Testing hosts concurrently was nearly impossible because I had to change my nameservers and then wait for the change to propagate. Here is a rundown of my testing methods.

For FTP testing, I uploaded and downloaded a folder containing 4MB of files. The files were a mix of binary and ASCII files with the following sizes: 1x 2MB, 3x 500KB, 3x 100KB, 3x 50KB, 3x 15KB, and 3x 2KB. I transferred the files with FileZilla and timed the transfers with a watch with a second hand. Results are reported as bandwidth (higher is better).

For HTTP performance, I used a Firefox plugin to measure page load times in milliseconds. I tested my 7 websites: 2 static HTML sites, 1 PHP file manager, 3 WordPress blogs, and 1 Zenphoto gallery. I cleared the cache and restarted the browser before each run, and I ran each test three times, at least 10 minutes apart. Reproducibility with this method was good: variability ranged from 1% to 10% for the same host and site.

For server performance, I used Zenphoto. I purged the cache and then pre-cached an album, which forces a PHP script to resize 16 photos twice each and then download the resized photos. This takes about 6 seconds if the photos are already cached on the server, and 15-30 seconds if the cache is cleared, so I assume the discrepancy lies in the server-side work of resizing the photos.
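To make the arithmetic behind these numbers concrete, here is a small Python sketch of how I turn a stopwatch reading into a bandwidth figure, and how I express run-to-run variability for the page-load tests. The function names and the example timings are my own illustrations, not part of FileZilla or any plugin I used:

```python
# File sizes from the 4MB test folder described above (in KB):
# 1x 2MB, 3x 500KB, 3x 100KB, 3x 50KB, 3x 15KB, 3x 2KB
FILE_SIZES_KB = [2048] + [500] * 3 + [100] * 3 + [50] * 3 + [15] * 3 + [2] * 3


def bandwidth_kbps(elapsed_seconds):
    """Throughput in KB/s for transferring the whole test folder."""
    return sum(FILE_SIZES_KB) / elapsed_seconds


def run_variability_pct(load_times_ms):
    """Spread of repeated page-load timings as a percent of their mean."""
    mean = sum(load_times_ms) / len(load_times_ms)
    return 100 * (max(load_times_ms) - min(load_times_ms)) / mean


# Hypothetical example: a 40-second upload of the ~4MB folder,
# and three page-load runs for one site on one host.
print(bandwidth_kbps(40))
print(run_variability_pct([1000, 1050, 1010]))
```

A spread of 50ms around a 1020ms mean works out to roughly 5% variability, which is the middle of the 1%-10% range I observed.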