I decided to test this first on one image per page and used some simple multiplication to extrapolate the results to multiple images. I should note:
- I used PNG for this test, but PNG behaviour here seems fairly representative of image formats in general.
- I used a specific function that adaptively resizes the image so that the width and height always keep the original aspect ratio of the image being resized (the function is shown further down).
I took a roughly 450x450 image and resized it, on the client side, down to 200x200 (all measurements in this answer are in pixels). I found very little CPU jump, despite the resize cutting away more than half of the image's total size.
The quality was also good in all modern browsers: Chrome, Opera, Firefox and IE showed the image as clearly as if the resize had been done in Photoshop or GD.
In IE7 I did not notice much CPU overhead either, and the quality was good provided I used percentage-based resizing derived from the image's size constraints.
Overall, the extra storage, computation and coding required to cache even this one size on the server side seemed to be at a disadvantage against the processing power that could simply be pushed onto the user's end.
That said, if I were to start doing many resizes of this kind (say 20, as in my question), I would probably start to run into problems.
After some tweaking, I found that anything below 1/3 of the original image's size, provided the image was under 1000 pixels in width or height, seemed to have a negligible effect on the processor and the user's machine.
With the helper function I used, I got the same good quality from client-side resizing as from server-side resizing. The specific function I used (for interested parties) was:
function pseudoResize($maxWidth = 0, $maxHeight = 0){

    $width = $this->org_width;
    $height = $this->org_height;

    $maxWidth = intval($maxWidth);
    $maxHeight = intval($maxHeight);

    $newWidth = $width;
    $newHeight = $height;

    // Ripped from the phpthumb library in GdThumb.php under the resize() function
    if ($maxWidth > 0) {

        $newWidthPercentage = (100 * $maxWidth) / $width;
        $newHeight = ($height * $newWidthPercentage) / 100;

        $newWidth = intval($maxWidth);
        $newHeight = intval($newHeight);

        if ($maxHeight > 0 && $newHeight > $maxHeight) {
            $newHeightPercentage = (100 * $maxHeight) / $newHeight;
            $newWidth = intval(($newWidth * $newHeightPercentage) / 100);
            $newHeight = ceil($maxHeight);
        }
    }

    if ($maxHeight > 0) {

        $newHeightPercentage = (100 * $maxHeight) / $height;
        $newWidth = ($width * $newHeightPercentage) / 100;

        $newWidth = ceil($newWidth);
        $newHeight = ceil($maxHeight);

        if ($maxWidth > 0 && $newWidth > $maxWidth) {
            $newWidthPercentage = (100 * $maxWidth) / $newWidth;
            $newHeight = intval(($newHeight * $newWidthPercentage) / 100);
            $newWidth = intval($maxWidth);
        }
    }

    return array(
        'width' => $newWidth,
        'height' => $newHeight
    );
}
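For context, here is a minimal sketch of how the returned dimensions end up in the page. It assumes the function lives on some image wrapper object ($image below) that also knows the URL of the original file; the wrapper and its url() method are illustrative, not part of the code above. The browser does the actual scaling via the width/height attributes, so only the original file is ever stored or sent:

    // Constrain to a 200x200 box, keeping the original aspect ratio
    $size = $image->pseudoResize(200, 200);

    printf(
        '<img src="%s" width="%d" height="%d" alt="">',
        htmlspecialchars($image->url(), ENT_QUOTES), // url() is a hypothetical accessor
        $size['width'],
        $size['height']
    );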
So, from my own testing, it seems that housing every size of every image, i.e. what I asked in my question:

If so, is there ever a time when server-side processing and caching may become redundant (i.e. housing 8 images at 110x220, 120x230, 150x190, etc.)?

does indeed seem to be overkill in modern computing, and you should instead aim for close-enough measurements if you intend to use many different sizes of many images.
However, I did find that if you have a standard set of sizes and they are smallish, the advantage actually lies with storing resized copies on the server at all of those sizes (see the sketch below), since forcing the client to resize will always slow its computer down a little, whereas scaling down to 1/3 of the original size does not seem to make much difference.
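For comparison, this is roughly the server-side alternative that paragraph is weighing up: pre-compute and cache a small, fixed set of sizes. The sketch below uses GD; the function name, cache path and the simplified aspect-ratio maths (it assumes both bounds are given) are my own illustration, not code from my test:

    $standardSizes = array(array(200, 200), array(100, 100), array(50, 50));

    function cachedThumbPath($srcPath, $maxWidth, $maxHeight, $cacheDir = '/tmp/thumbs')
    {
        if (!is_dir($cacheDir)) {
            mkdir($cacheDir, 0755, true);
        }

        // One cache file per source image per size
        $cachePath = $cacheDir . '/' . md5($srcPath) . "_{$maxWidth}x{$maxHeight}.png";
        if (file_exists($cachePath)) {
            return $cachePath; // already cached, nothing to compute
        }

        $src    = imagecreatefrompng($srcPath);
        $width  = imagesx($src);
        $height = imagesy($src);

        // Simplified aspect-ratio fit (both bounds assumed > 0)
        $ratio     = min($maxWidth / $width, $maxHeight / $height);
        $newWidth  = (int) round($width * $ratio);
        $newHeight = (int) round($height * $ratio);

        $dst = imagecreatetruecolor($newWidth, $newHeight);
        imagecopyresampled($dst, $src, 0, 0, 0, 0, $newWidth, $newHeight, $width, $height);
        imagepng($dst, $cachePath);

        imagedestroy($src);
        imagedestroy($dst);

        return $cachePath;
    }

    // e.g. generate (or fetch) every standard size for one upload:
    foreach ($standardSizes as $size) {
        cachedThumbPath('/uploads/photo.png', $size[0], $size[1]);
    }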
Therefore, I believe the reason sites such as FB, Google and YouTube are not too worried about storing exact measurements of all their images is that "close to the measurement" scaling performs well enough.