Server-side image resizing

I notice more and more that large sites (Google, Facebook, YouTube and co.) serve images that are only "close" to the size they actually display and let the client do the final resizing, and I wonder whether this marks a shift in thinking or is just laziness.

Take the scenario of adding a new image size to a standard set of fixed-size images for an e-commerce catalogue of 100 thousand, maybe millions, of products.

Imagine I keep a copy of each original image at, say, 300x350, and on the client side scale it down to 200x250. I would do this for every product, with 20 products per page.
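
To make the scenario concrete, here is a minimal sketch of the client-side approach in PHP; the $products array and the product_image_url() helper are hypothetical placeholders, not anything from a real codebase:

    <?php
    // Serve the stored 300x350 original for each of the 20 products on the
    // page and let the browser scale it down to 200x250 via width/height.
    foreach (array_slice($products, 0, 20) as $product) {
        printf(
            '<img src="%s" width="200" height="250" alt="%s">',
            htmlspecialchars(product_image_url($product)),
            htmlspecialchars($product['name'])
        );
    }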

Is the server-side work and hassle of generating and hosting this new size really of any benefit to the client?

If not, what is a good way to judge when a particular size should be pre-processed?

If so, is there ever a point where server-side processing and caching becomes redundant (i.e. storing 8 variants per image: 110x220, 120x230, 150x190, etc.)?

+7
2 answers

I decided to test this, first with a single image per page, then multiplied the numbers to predict the effect of several images. I should note:

  • I used PNGs for this test, but PNG is a fairly standard format.
  • I used a specific function that adaptively resizes the image so that the width and height always preserve the original aspect ratio of the image being resized.

I took a roughly 450x450 image and resized it, on the client side, to 200x200 (all measurements in this answer are in pixels). I found very little CPU jump, despite scaling the image down by more than half.

The quality was also good in all modern browsers: Chrome, Opera, Firefox and IE rendered the image as cleanly as if it had been resized in Photoshop or GD.

Even in IE7 I did not notice much CPU overhead, and the quality was good provided I used percentage-based resizing derived from the image's size constraints.

Overall, the extra storage, computation, and coding required for the server to generate and cache even this one extra size seemed a poor trade against the small amount of processing power it would have cost the user.

That said, if I started doing this kind of resizing at many sizes (say 20, as in my question), I would probably start running into problems.

After some tweaking, I found that scaling down to as little as about 1/3 of the original image size, provided the image was under 1000 pixels in width or height, seemed to have a negligible effect on CPU and overall machine performance.
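
As a rough sketch of that rule of thumb (the numbers come from my tests above, not a general law):

    <?php
    // Pre-generate a size on the server only when the client would have to
    // scale below ~1/3 of the original, or the original exceeds 1000px.
    function shouldPreprocess($orgWidth, $orgHeight, $targetWidth, $targetHeight)
    {
        if ($orgWidth > 1000 || $orgHeight > 1000) {
            return true;  // too large to hand to the client comfortably
        }
        $ratio = min($targetWidth / $orgWidth, $targetHeight / $orgHeight);
        return $ratio < 1 / 3;  // heavy downscale: do it once, server side
    }

    var_dump(shouldPreprocess(450, 450, 200, 200)); // bool(false): client handles it
    var_dump(shouldPreprocess(450, 450, 100, 100)); // bool(true): pre-generate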

Using the adaptive function, I got the same good quality from client-side resizing as from server-side resizing. The specific function I used (for interested parties) was:

    function pseudoResize($maxWidth = 0, $maxHeight = 0)
    {
        $width  = $this->org_width;   // original dimensions of the image
        $height = $this->org_height;

        $maxWidth  = intval($maxWidth);
        $maxHeight = intval($maxHeight);

        $newWidth  = $width;
        $newHeight = $height;

        // Ripped from the phpthumb library in GdThumb.php under the resize() function
        if ($maxWidth > 0) {
            // Scale to the requested width, keeping the aspect ratio
            $newWidthPercentage = (100 * $maxWidth) / $width;
            $newHeight          = ($height * $newWidthPercentage) / 100;

            $newWidth  = intval($maxWidth);
            $newHeight = intval($newHeight);

            // If that overshoots the height limit, scale down again to fit it
            if ($maxHeight > 0 && $newHeight > $maxHeight) {
                $newHeightPercentage = (100 * $maxHeight) / $newHeight;
                $newWidth            = intval(($newWidth * $newHeightPercentage) / 100);
                $newHeight           = ceil($maxHeight);
            }
        }

        if ($maxHeight > 0) {
            // Scale to the requested height, keeping the aspect ratio
            $newHeightPercentage = (100 * $maxHeight) / $height;
            $newWidth            = ($width * $newHeightPercentage) / 100;

            $newWidth  = ceil($newWidth);
            $newHeight = ceil($maxHeight);

            // If that overshoots the width limit, scale down again to fit it
            if ($maxWidth > 0 && $newWidth > $maxWidth) {
                $newWidthPercentage = (100 * $maxWidth) / $newWidth;
                $newHeight          = intval(($newHeight * $newWidthPercentage) / 100);
                $newWidth           = intval($maxWidth);
            }
        }

        return array(
            'width'  => $newWidth,
            'height' => $newHeight
        );
    }
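
A hypothetical usage might look like the following, assuming a wrapper object that loads the image and sets org_width/org_height (the ImageInfo class name is my own placeholder):

    // Fit a 450x450 original inside a 200x200 box, then let the browser
    // perform the actual scaling using the computed attributes.
    $img  = new ImageInfo('product.png');   // assumed to set org_width/org_height
    $size = $img->pseudoResize(200, 200);   // e.g. array('width' => 200, 'height' => 200)

    printf('<img src="product.png" width="%d" height="%d">',
           $size['width'], $size['height']);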

So, from my own testing, it would seem that housing every size of every image, i.e. what I asked in my question:

If so, is there ever a point where server-side processing and caching becomes redundant (i.e. storing 8 variants per image: 110x220, 120x230, 150x190, etc.)?

is overkill on modern hardware, and you should instead go for "close" sizes if you intend to house many different sizes of many images.

However, I found that if you have a standard set of sizes, and they are small, the advantage actually lies in housing all the sizes server-side, since forcing the client to resize will always slow its machine down a little, although scaling to 1/3 of the original size does not seem to make much difference.

So I believe the reason sites such as FB, Google and YouTube are not too worried about housing exact sizes of all their images is that "close to size" scaling can perform better overall.

0

Consider the following: resizing an image is a heavy process for a server. First, it is CPU-expensive. Second, it involves disk I/O operations, which are rather slow. So it all depends on how busy your server is.

For the client, there are several implications:

1) Thumbnails have a smaller file size and therefore download much faster, so they appear sooner. But this depends on the speed of the Internet connection, which is increasing every day. Have you seen how large images load? They are not displayed all at once, but rather line by line.

2) If you display a large image at a small size, the quality will be much lower. This is down to how browsers handle it: they do not have Photoshop's capabilities and cannot perform proper resampling (see the GD sketch after this list).

3) Many large images on one page increase the page's memory usage. On less powerful computers this can cause terrible lag when scrolling or opening the page.
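
Regarding point 2, here is a minimal sketch of proper server-side resampling with GD (the file names are hypothetical), giving the kind of smooth interpolation browsers did not reliably offer:

    <?php
    // Downscale large.png to 200px wide, preserving the aspect ratio,
    // using GD's resampled (interpolated) copy rather than a raw rescale.
    list($ow, $oh) = getimagesize('large.png');
    $dw = 200;
    $dh = intval($oh * $dw / $ow);

    $src = imagecreatefrompng('large.png');
    $dst = imagecreatetruecolor($dw, $dh);
    imagecopyresampled($dst, $src, 0, 0, 0, 0, $dw, $dh, $ow, $oh);
    imagepng($dst, 'small.png');

    imagedestroy($src);
    imagedestroy($dst);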

As a solution, I prefer to do what I saw in one of the Drupal modules ( imagecache , if I remember correctly).

It does not create thumbnails when images are uploaded. Instead, it creates them on demand using .htaccess and mod_rewrite: if the requested file does not exist, the request is rewritten to a small, lightweight PHP script that creates the thumbnail, writes it to the file system, and then outputs it to the client.

The next visitor will then be served the previously created thumbnail directly.

This way you implement lazy/deferred resizing of images, which makes the load smoother by spreading it over time.
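
A minimal sketch of that on-demand pattern, assuming Apache with mod_rewrite; the URL scheme, directory layout and script name are my own placeholders, not imagecache's actual ones:

    <?php
    // thumb.php -- create a thumbnail on first request, serve the cached
    // file afterwards. Assumed .htaccess in the web root:
    //   RewriteEngine On
    //   RewriteCond %{REQUEST_FILENAME} !-f
    //   RewriteRule ^thumbs/(\d+)x(\d+)/(.+\.png)$ thumb.php?w=$1&h=$2&src=$3 [L]

    $w   = max(1, intval($_GET['w']));
    $h   = max(1, intval($_GET['h']));
    $src = basename($_GET['src']);  // strip any path to block traversal

    $original = __DIR__ . '/images/' . $src;
    $cached   = __DIR__ . "/thumbs/{$w}x{$h}/" . $src;

    if (!is_file($original)) {
        http_response_code(404);
        exit;
    }

    // Only the first request pays for the resize; once the file exists,
    // Apache serves it directly and this script is never hit again.
    if (!is_file($cached)) {
        list($ow, $oh) = getimagesize($original);
        $srcImg = imagecreatefrompng($original);
        $dstImg = imagecreatetruecolor($w, $h);
        imagecopyresampled($dstImg, $srcImg, 0, 0, 0, 0, $w, $h, $ow, $oh);

        if (!is_dir(dirname($cached))) {
            mkdir(dirname($cached), 0755, true);
        }
        imagepng($dstImg, $cached);
        imagedestroy($srcImg);
        imagedestroy($dstImg);
    }

    header('Content-Type: image/png');
    readfile($cached);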

+1
