How to compare the "quality" of two image scaling algorithms?

Suppose I want to include an image scaling algorithm in my program. Execution time is not important; what matters is the quality of the result. How do you define "quality" in this case, and how do you then choose the best algorithm from the available options?


As a side note, in case the above is too general: the concrete problem I am trying to solve is this. Suppose I have a lot of images (video, actually) that will need to be scaled up at runtime. I can pre-process them with a slow, high-quality algorithm, but I don't know the final resolution in advance (people have different monitors), so I can't resize them to the target size right away. Would it help to upscale them somewhat with my high-quality algorithm and then let the player scale them up to the required resolution at runtime (with a fast, low-quality algorithm)? Or should I leave the video as it is and do all the scaling in a single pass at runtime?
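
One rough way to probe this before committing to either pipeline is to compare both against a single high-quality pass using a simple objective metric. The sketch below uses Pillow and PSNR purely as an illustration; the file name, the 2x pre-upscale factor, and the choice of resampling filters are assumptions, not something specified in the question.

```python
# Hypothetical sketch: compare "pre-upscale offline, then fast scale at runtime"
# against "single fast scale at runtime", using a slow high-quality single-pass
# result as the reference. PSNR is only a crude proxy for perceived quality.
from PIL import Image
import numpy as np

def psnr(a, b):
    """Peak signal-to-noise ratio between two same-sized RGB images."""
    a = np.asarray(a, dtype=np.float64)
    b = np.asarray(b, dtype=np.float64)
    mse = np.mean((a - b) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

src = Image.open("frame.png").convert("RGB")   # hypothetical source frame
target = (1920, 1080)                          # resolution known only at runtime

# Reference: one slow, high-quality pass straight to the target size.
reference = src.resize(target, Image.LANCZOS)

# Option A: a single fast pass at runtime.
fast_only = src.resize(target, Image.BILINEAR)

# Option B: slow 2x pre-upscale offline, then a fast pass at runtime.
pre = src.resize((src.width * 2, src.height * 2), Image.LANCZOS)
two_pass = pre.resize(target, Image.BILINEAR)

print("fast only:", psnr(reference, fast_only))
print("two pass :", psnr(reference, two_pass))
```

PSNR only measures closeness to the chosen reference, not perceived quality as such, which is exactly what the answer below addresses.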

1 answer

The only way to evaluate quality objectively is to run a (semi-)scientific study. Recruit several participants. Show them the upscaled images in random order and have them rate the subjective quality of each (bonus points for making it double-blind). Then average the scores and pick the algorithm with the highest mean score (and perhaps test for statistical significance).
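
If ratings are collected per participant, the analysis described above boils down to a mean opinion score per algorithm plus a paired significance test. A minimal sketch, assuming a 1-5 rating scale and made-up scores (the algorithm names and data are illustrative only):

```python
# Minimal sketch of analysing such a study: each participant rates every
# upscaled image, scores are averaged per algorithm, and a paired test
# checks whether the difference is statistically significant.
import numpy as np
from scipy import stats

# ratings[algorithm] -> one score per participant (same participant order).
ratings = {
    "algo_a": np.array([4, 5, 3, 4, 4, 5, 3, 4]),
    "algo_b": np.array([3, 4, 3, 3, 4, 4, 2, 3]),
}

for name, scores in ratings.items():
    print(f"{name}: mean opinion score = {scores.mean():.2f}")

# Paired test, since the same participants rated both algorithms.
t, p = stats.ttest_rel(ratings["algo_a"], ratings["algo_b"])
print(f"paired t-test: t = {t:.2f}, p = {p:.3f}")
```

A paired test is used here because the same participants rate both algorithms; for ordinal ratings, scipy's non-parametric `wilcoxon` is a reasonable alternative to `ttest_rel`.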

