02-03-2025, 04:01 PM
retiredguy123

Quote:
Originally Posted by jrref
Very good question.

So, when a signal is "upscaled" to a higher resolution, the software has to "make up" information that isn't there in order to fill in the additional pixels.

The lower-resolution content is upscaled through a process called "interpolation," which enlarges the image while maintaining (or potentially improving) its visual quality. More specifically, interpolation creates a grid of "blank" pixels on top of the original image and then colors those blanks based on their surrounding pixels. The enlarged picture is then refined by sharpening or softening parts of the image when necessary, as well as applying filters to adjust its colors further. The result is an estimate that closely matches the original picture but now fits the pixel count of a 4K screen. (Quoted from https://www.howtogeek.com/4k-upscali...ich-is-better/)

The algorithms used these days are very sophisticated, and the results are very good. With native 4K content there is no interpolation, so nothing has to be estimated and the picture will look the best.
I understand interpolation. But if they cannot even afford to broadcast the Super Bowl in native 4K, what is the point of broadcasting an upscaled feed when almost everyone already has a 4K TV that does its own upscaling?
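
For anyone who wants to see the "color the blank pixels from their neighbors" idea in the quoted article made concrete, below is a minimal bilinear-interpolation sketch in Python with NumPy. It is only an illustration: the 2x2 toy image, the integer scale factor, and the function name are made up for the example, and real TV upscalers layer on the sharpening, filtering, and AI refinements described above.

Code:
import numpy as np

def bilinear_upscale(img: np.ndarray, scale: int) -> np.ndarray:
    """Upscale a 2-D grayscale image by an integer factor using
    bilinear interpolation: each new pixel is a weighted average
    of the four original pixels that surround it."""
    h, w = img.shape
    new_h, new_w = h * scale, w * scale

    # Map every output pixel back onto the original pixel grid.
    ys = np.linspace(0, h - 1, new_h)
    xs = np.linspace(0, w - 1, new_w)

    # The four surrounding source pixels for each output position.
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, h - 1)
    x1 = np.minimum(x0 + 1, w - 1)

    # Fractional distances become the interpolation weights.
    wy = (ys - y0)[:, None]
    wx = (xs - x0)[None, :]

    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bottom = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bottom * wy

# Example: "upscale" a 2x2 image to 4x4. The in-between pixels are
# estimates blended from their neighbors; no new detail is created.
small = np.array([[0.0, 1.0],
                  [1.0, 0.0]])
print(bilinear_upscale(small, 2))

The printed 4x4 array shows the point of the whole discussion: the extra pixels are smooth guesses averaged from the originals, which is why upscaled content can never carry more real detail than the native source.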