Quote:
Originally Posted by retiredguy123
Question: If a 4K or 8K TV does a good job at upscaling, why do you need to watch a 4K signal? Last year, they admitted that the Super Bowl was actually broadcast in an upscaled version of 4K, not a native 4K signal. Was that any different from watching it on an upscaling TV? I certainly couldn't tell the difference.
|
Very good question.
So, when a signal is "upscaled" to a higher resolution, the software has to "make up" information that isn't in the original signal in order to fill in the additional pixels.
The lower-resolution content is upscaled through a process called "interpolation," which enlarges the image while maintaining (or potentially improving) its visual quality. More specifically, interpolation creates a grid of "blank" pixels on top of the original image and then colors those blanks based on their surrounding pixels. The enlarged picture is then refined by sharpening or softening parts of the image when necessary, as well as applying filters to adjust its colors further. The result is an estimate that closely matches the original picture but now fits the pixel count of a 4K screen. (Quoted from
https://www.howtogeek.com/4k-upscali...ich-is-better/)
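To make the "coloring blanks from surrounding pixels" idea concrete, here is a minimal sketch of one common interpolation method, bilinear interpolation, in Python with NumPy. This is an illustration of the general technique, not the actual algorithm any particular TV uses (real sets use far more sophisticated, often AI-assisted, methods). The function name and the tiny 2x2 "image" are made up for the example.

```python
import numpy as np

def bilinear_upscale(img, new_h, new_w):
    """Upscale a 2-D grayscale image with bilinear interpolation:
    each new pixel is a weighted average of the four nearest
    pixels in the original image."""
    h, w = img.shape
    # Map each output coordinate back onto the source grid.
    ys = np.linspace(0, h - 1, new_h)
    xs = np.linspace(0, w - 1, new_w)
    out = np.empty((new_h, new_w), dtype=float)
    for i, y in enumerate(ys):
        y0 = int(np.floor(y))
        y1 = min(y0 + 1, h - 1)
        fy = y - y0  # vertical distance from the top neighbor
        for j, x in enumerate(xs):
            x0 = int(np.floor(x))
            x1 = min(x0 + 1, w - 1)
            fx = x - x0  # horizontal distance from the left neighbor
            # Blend horizontally along the top and bottom rows,
            # then blend those two results vertically.
            top = img[y0, x0] * (1 - fx) + img[y0, x1] * fx
            bot = img[y1, x0] * (1 - fx) + img[y1, x1] * fx
            out[i, j] = top * (1 - fy) + bot * fy
    return out

# A 2x2 "image" upscaled to 4x4: the original corner pixels are
# preserved, and the in-between pixels are estimated averages.
small = np.array([[0.0,  90.0],
                  [90.0, 180.0]])
big = bilinear_upscale(small, 4, 4)
```

The estimated pixels are plausible but invented, which is exactly why even a very good upscale can't add detail that was never captured in the first place.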
The algorithms used these days are very sophisticated, and the results are very good. With native 4K content, no interpolation is needed, so the result will be the best.