
Originally Posted by herkimerjerkimer:
c and p
As data-intensive as HD is, 4K is even worse. While most of us were just getting used to the idea of H.264's advantages over MPEG-2 on Blu-ray, the Moving Picture Experts Group (MPEG) and the International Telecommunication Union's Telecommunication Standardization Sector (ITU-T) were already starting work on the next generation of video compression, with an eye on the future.
New compression standards aren't introduced for small, incremental improvements; each one has to be a sizable change. With each jump, the general rule is half the bit rate for the same quality (or greater quality at the same bit rate).
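
As a rough back-of-the-envelope illustration of that rule of thumb (the 24 Mbps starting figure below is just an assumed example, not a spec number), here's a quick Python sketch:

# Illustrative only: the rough "half the bit rate per generation" rule of thumb.
# The 24 Mbps starting figure is an assumed example, not a spec value.
generations = ["MPEG-2", "H.264/AVC", "H.265/HEVC"]
bitrate_mbps = 24.0  # assumed MPEG-2 bit rate for a given picture quality

for codec in generations:
    print(f"{codec:12s} ~{bitrate_mbps:.1f} Mbps for roughly comparable quality")
    bitrate_mbps /= 2  # each new standard targets about half the bit rate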
How does it do this? Largely by expanding on how AVC (and other compression techniques before it) works.
First, it looks at multiple frames to see what doesn't change. In most scenes in a TV show or movie, the vast majority of the frame doesn't change much. Think of a scene with someone talking. The shot is mostly their head. The background isn't going to change much for many frames. For that matter, most of the pixels representing their face probably won't change much (other than their lips, of course). So instead of encoding every pixel from every frame, an initial frame is encoded, and then after that only what changes is encoded (basically).
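
Here's a toy Python sketch of that "encode only what changes" idea, using plain per-pixel differences. Real codecs do far more than this (motion compensation, transforms, entropy coding), so treat it as an illustration, not how AVC or HEVC actually work:

import numpy as np

# Toy sketch of temporal prediction: store the first frame in full, then
# store only the per-pixel differences (residuals) for later frames.

def encode_sequence(frames):
    """frames: list of 2-D numpy arrays (grayscale). Returns keyframe + residuals."""
    keyframe = frames[0].copy()
    residuals = [frames[i] - frames[i - 1] for i in range(1, len(frames))]
    return keyframe, residuals

def decode_sequence(keyframe, residuals):
    """Rebuild the frames by adding each residual back onto the previous frame."""
    frames = [keyframe.copy()]
    for r in residuals:
        frames.append(frames[-1] + r)
    return frames

# A mostly static "talking head" scene: only a small region changes per frame.
rng = np.random.default_rng(0)
base = rng.integers(0, 256, size=(64, 64)).astype(np.int16)
frames = []
for i in range(5):
    f = base.copy()
    f[40:48, 28:36] += i  # the "lips" region changes a little each frame
    frames.append(f)

key, residuals = encode_sequence(frames)
decoded = decode_sequence(key, residuals)
print("lossless round trip:", all(np.array_equal(a, b) for a, b in zip(frames, decoded)))
print("nonzero residual pixels per frame:", [int(np.count_nonzero(r)) for r in residuals])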
HEVC then expands the size of the area that's examined for these changes: larger and smaller "blocks," essentially, which offers additional efficiency. Ever seen blocks in your image when the picture goes bad? With HEVC those can be bigger, smaller, and differently shaped than with previous compression methods; the largest coding blocks go up to 64x64 pixels, versus AVC's fixed 16x16 macroblocks. Larger blocks, for example, were found to be more efficient, especially at higher resolutions.
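
A rough sketch of the splitting idea: start with a big block and keep subdividing wherever there's too much detail. The variance test and threshold below are made-up stand-ins; real HEVC picks its splits by rate-distortion cost, not anything this simple:

import numpy as np

# Toy quadtree partitioning: start from a 64x64 block and recursively split
# into four quadrants wherever the area is too "busy" (high variance).

def partition(frame, top, left, size, min_size=8, threshold=200.0, out=None):
    if out is None:
        out = []
    area = frame[top:top + size, left:left + size]
    if size > min_size and np.var(area) > threshold:
        half = size // 2
        for dy in (0, half):
            for dx in (0, half):
                partition(frame, top + dy, left + dx, half, min_size, threshold, out)
    else:
        out.append((top, left, size))  # keep this block whole
    return out

rng = np.random.default_rng(1)
frame = np.zeros((64, 64))
frame[:, 32:] = rng.normal(0, 30, size=(64, 32))  # left half flat, right half detailed

blocks = partition(frame, 0, 0, 64)
print(f"{len(blocks)} blocks; sizes used:", sorted({s for _, _, s in blocks}, reverse=True))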
Then other things were improved, like motion compensation, spatial prediction, and so on. All of these things could have been done with AVC or even earlier, but they required more processing power than was economically feasible at the time.
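
For the motion-compensation part, the basic idea is block matching: figure out where a block from the current frame came from in the previous frame. This toy full-search version (the block size, search window, and SAD cost are just illustrative choices) shows the principle:

import numpy as np

# Toy block-matching motion estimation: for one block in the current frame,
# search a small window in the previous frame for the best match (lowest sum
# of absolute differences). Real encoders use far smarter searches and
# sub-pixel precision; this only shows the basic idea.

def best_motion_vector(prev, cur, top, left, size=16, search=8):
    block = cur[top:top + size, left:left + size].astype(np.int32)
    best, best_cost = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + size > prev.shape[0] or x + size > prev.shape[1]:
                continue
            cost = np.abs(prev[y:y + size, x:x + size].astype(np.int32) - block).sum()
            if cost < best_cost:
                best_cost, best = cost, (dy, dx)
    return best, best_cost

rng = np.random.default_rng(2)
prev = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
cur = np.roll(prev, shift=(3, -2), axis=(0, 1))  # scene "moves" 3 px down, 2 px left

mv, cost = best_motion_vector(prev, cur, top=24, left=24)
print("estimated motion vector (dy, dx):", mv, "SAD:", int(cost))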
During the development phase, the compression algorithm is tested objectively, for raw numerical efficiency, but also subjectively, by video professionals comparing different compression methods and amounts in a "blind" test, where they don't know which method is which. The human element is crucial: just because a computer says one level of compression is better than another doesn't mean it actually looks better.
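
One of the standard objective measures is PSNR (peak signal-to-noise ratio). Here's a quick sketch of how it's computed, on made-up data; a higher number doesn't automatically mean the picture looks better, which is exactly why the subjective viewing tests matter:

import numpy as np

# PSNR between an original frame and its compressed-and-decoded version.
# The frames here are random data purely for illustration.

def psnr(original, decoded, max_value=255.0):
    mse = np.mean((original.astype(np.float64) - decoded.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical frames
    return 10.0 * np.log10(max_value ** 2 / mse)

rng = np.random.default_rng(3)
original = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
noisy = np.clip(original.astype(np.int16) + rng.integers(-5, 6, size=(64, 64)), 0, 255)

print(f"PSNR: {psnr(original, noisy):.2f} dB")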
Because H.265 is so much more processor intensive, don't expect a simple firmware upgrade to get your gear to decode it. In fact, that's part of the issue. You need a hardware decoder somewhere. If your new media streamer, cable box, or BD player has it, then you'll be all set (presuming you also have HDMI 2.0 so you can get 2160p/60 and not just 2160p/30). Could a high-end PC decode it via software? Maybe. Could the Xbox One or PS4? Not likely. Everyone loves their favorite console, but remember, this generation's hardware is equivalent to a pretty average PC.