Suppose I want to include an image scaling algorithm in my program. Speed is not important; all that matters is quality. How do you define "quality" in this case, and how do you then choose the best algorithm from the available options?
As a side note, in case that is too general, here is the concrete problem I am trying to solve: I have a lot of images (video frames, actually) that will need to be scaled at run time. I could pre-process them with a slow, high-quality algorithm, but I don't know the final resolution in advance (people have different monitors), so I can't resize them to the target size ahead of time. Would it help to upscale them somewhat with my high-quality algorithm and then let the player upscale them the rest of the way to the required resolution at run time (with a fast, low-quality algorithm)? Or should I leave the video as it is and do all the scaling in a single pass at run time?
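For context on what I mean by measuring "quality": one common way to make it concrete is to treat a known high-resolution image as ground truth, downscale it, upscale it back with each candidate algorithm, and score the result against the original with a fidelity metric such as PSNR (SSIM is another popular choice). Below is a minimal sketch of that idea using only NumPy; the function names, the synthetic test image, and the choice of nearest-neighbour vs. bilinear as the two candidates are all my own illustration, not a specific library's API.

```python
import numpy as np

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means closer to the reference."""
    mse = np.mean((a - b) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

def upscale_nearest(img, factor):
    """Nearest-neighbour upscaling: repeat each pixel (fast, blocky)."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

def upscale_bilinear(img, factor):
    """Bilinear upscaling: weighted average of the 4 surrounding source pixels."""
    h, w = img.shape
    # Map each destination pixel centre back to source coordinates.
    ys = (np.arange(h * factor) + 0.5) / factor - 0.5
    xs = (np.arange(w * factor) + 0.5) / factor - 0.5
    y0 = np.clip(np.floor(ys).astype(int), 0, h - 1)
    x0 = np.clip(np.floor(xs).astype(int), 0, w - 1)
    y1 = np.minimum(y0 + 1, h - 1)
    x1 = np.minimum(x0 + 1, w - 1)
    wy = np.clip(ys - y0, 0.0, 1.0)[:, None]
    wx = np.clip(xs - x0, 0.0, 1.0)[None, :]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

# Synthetic smooth reference image (stands in for a real frame).
yy, xx = np.mgrid[0:64, 0:64]
ref = 127.5 + 100.0 * np.sin(xx / 7.0) * np.cos(yy / 9.0)

# Downscale 2x by block averaging, then upscale back with each candidate.
small = ref.reshape(32, 2, 32, 2).mean(axis=(1, 3))
psnr_nearest = psnr(ref, upscale_nearest(small, 2))
psnr_bilinear = psnr(ref, upscale_bilinear(small, 2))
print(f"nearest : {psnr_nearest:.1f} dB")
print(f"bilinear: {psnr_bilinear:.1f} dB")
```

The same harness could, in principle, also answer my second question empirically: simulate both pipelines (high-quality pre-upscale followed by a fast run-time upscale, versus one fast upscale from the original) and compare their PSNR against a full-resolution reference. Note that PSNR does not always agree with what looks best to a human eye, which is part of why I am asking how others define quality here.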