Friday, October 16th 2020

NVIDIA Updates Video Encode and Decode Matrix with Reference to Ampere GPUs

NVIDIA has today updated its video encode and decode matrix with references to the latest Ampere GPU family. The matrix is a table of the video encoding and decoding standards supported by different NVIDIA GPUs, reaching back to the Maxwell generation and showing which video codecs each generation supports. It is a useful reference, as customers can check whether an existing or upcoming GPU supports a specific codec standard they need for video playback. The latest update adds the Ampere GPUs to the table.

For example, the table shows that, in addition to supporting all of the previous generations' encoding standards, the Ampere-based GPUs add support for HEVC B-frames. On the decoding side, the Ampere lineup now includes support for AV1 in 8-bit and 10-bit formats, while also supporting all of the previous generations' formats. For a more detailed look at the table, please visit NVIDIA's website.
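The matrix maps onto what FFmpeg exposes through NVENC (encode) and NVDEC/CUVID (decode). As a quick sanity check, on a machine with an NVIDIA GPU and an FFmpeg build compiled with NVIDIA hardware support (an assumption; not every distribution build includes it), you can probe which hardware codecs your build and driver actually expose:

```shell
# List the NVENC hardware encoders this FFmpeg build exposes
# (e.g. h264_nvenc, hevc_nvenc)
ffmpeg -hide_banner -encoders | grep nvenc

# List the NVDEC/CUVID hardware decoders; AV1 decode should only
# appear with an Ampere-or-newer GPU and a recent enough build
ffmpeg -hide_banner -decoders | grep cuvid

# Inspect hevc_nvenc's options; the -b_ref_mode option is how
# HEVC B-frame support surfaces on GPUs whose encoder allows it
ffmpeg -hide_banner -h encoder=hevc_nvenc
```

Note that an encoder or decoder showing up in these lists only means the FFmpeg build was compiled with it; whether it actually works still depends on the GPU's video engine and driver, which is exactly what NVIDIA's matrix documents.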
NVIDIA Encoding and Decoding Standards
Source: NVIDIA Video Encode and Decode Matrix

30 Comments on NVIDIA Updates Video Encode and Decode Matrix with Reference to Ampere GPUs

#26
mtcn77
Mouth of Sauron: Basically, neither decode nor encode absolutely *needs* hardware support -
You have to understand, client-side software encoding practically kills your mouse trigger reflex. NVIDIA has been making a point of this for a while now: its hardware encoder takes the load off the CPU, saving encoding latency for the user. Otherwise you would have to run a separate server-side PC just to offload the encoding latency, which is neither practical nor sane.
#27
Mouth of Sauron
mtcn77: You have to understand, client-side software encoding practically kills your mouse trigger reflex. NVIDIA has been making a point of this for a while now: its hardware encoder takes the load off the CPU, saving encoding latency for the user. Otherwise you would have to run a separate server-side PC just to offload the encoding latency, which is neither practical nor sane.
Yeah, but which CPU? We have advances on both the Intel side (hardware encoding) and the AMD side (raw horsepower, many cores), plus NVIDIA GPU encoding. I know that most encoding is currently done on a separate PC, but the question of 'how fast' still matters.

At some point, a single-PC configuration for video encoding will become reality.

Besides, many CPU-heavy tasks (rendering, for example; I could name others) are not exactly multitasking-friendly...
#28
mtcn77
Mouth of Sauron: At some point, a single-PC configuration for video encoding will become reality.
Well, you did miss the point of hardware encoding - it works like a separate PC. It is quite handy: your latency doesn't rise in step with the encoding latency.
#29
Mouth of Sauron
mtcn77: Well, you did miss the point of hardware encoding - it works like a separate PC. It is quite handy: your latency doesn't rise in step with the encoding latency.
Nope. You don't know what you're talking about, sorry to say. Only a very small portion of either the CPU or the GPU is dedicated to hardware encoding, and that affects the results accordingly.

Because I don't want to be misunderstood again, I'll stop this pointless discussion now - I suggested a thing, it's upon the staff to decide is it worthy or representative, I'm fine either way.
#30
mtcn77
Mouth of Sauron: Nope. You don't know what you're talking about, sorry to say.
I'm not making such a case. I just read what hardware.fr found when it looked into the subject. Hardware encoding avoids video-compression lag.
