Tuesday, August 17th 2021

DisplayPort 2.0 Could Land in Next-Generation AMD Radeon RDNA3 GPUs

AMD is slowly preparing to launch its next generation of graphics cards based on the RDNA3 architecture, and it could bring some new connectivity options as well. The graphics cards we use today rely on the DisplayPort 1.4 connector for their DP output. However, the more advanced DisplayPort 2.0 could land in RDNA3 GPUs, bringing much-needed improvements to the video output system. What DP 2.0 brings to the table is an upgrade to Ultra High Bit Rate signaling at 20 Gbit/s per lane, for a total of 80 Gbit/s across four lanes. A DP 2.0-capable system would be able to output an uncompressed 10K resolution at 60 Hz, or drive two 4K 144 Hz monitors at the same time. With compression, those limits extend much further. We have to wait and see what AMD does and whether next-generation RDNA3 brings this new DisplayPort standard to the masses.
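For readers who want to check the arithmetic, here is a minimal sketch (Python, not from the article or the linked patch) that compares the raw, uncompressed data rate of the quoted examples against the DP 2.0 link rate; the usable-payload estimate after 128b/132b encoding is an approximation that ignores blanking intervals and protocol overhead.

# Rough DP 2.0 (UHBR20) bandwidth check -- approximate figures only:
# ignores blanking intervals, FEC, audio and metadata overhead.

LINK_RATE_GBPS = 4 * 20                   # four lanes at 20 Gbit/s each = 80 Gbit/s
PAYLOAD_GBPS = LINK_RATE_GBPS * 128 / 132 # ~77.6 Gbit/s after 128b/132b encoding

def raw_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed data rate in Gbit/s for the active pixels only."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

examples = {
    "10K (10240x4320) @ 60 Hz, 8 bpc": raw_gbps(10240, 4320, 60),
    "2x 4K (3840x2160) @ 144 Hz, 8 bpc": 2 * raw_gbps(3840, 2160, 144),
}

for name, gbps in examples.items():
    verdict = "fits uncompressed" if gbps <= PAYLOAD_GBPS else "needs DSC"
    print(f"{name}: ~{gbps:.1f} Gbit/s -> {verdict} (payload ~{PAYLOAD_GBPS:.1f} Gbit/s)")

Both examples land at roughly 64 Gbit/s and 57 Gbit/s respectively, comfortably inside the link's usable payload; anything beyond that relies on Display Stream Compression.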
Sources: Freedesktop Patch, via VideoCardz

25 Comments on DisplayPort 2.0 Could Land in Next-Generation AMD Radeon RDNA3 GPUs

#1
ZoneDymo
The end is near, in a good way: if DP 2.0 can do 10K at 60 Hz without compression, it should be able to do 8K at 100 Hz without compression.

DP 2.1 or 2.2 will bring that up to 8K 240 Hz or so, and we will be done on that front for displays; more will be possible, but it would be pointless for consumers.
Posted on Reply
#2
delshay
I'm pretty sure Nvidia would not be lagging behind either; they too will have DisplayPort 2.0 in the pipeline, along with Intel.
Posted on Reply
#3
Ferrum Master
I am more concerned about what the cables will be like: length, thickness.

There are known problems with those. Not only the older dreaded power-pin problem, but also real certification issues, and even with certification some panels are really capricious.
Posted on Reply
#4
Bomby569
Like PCIe Gen 4 for GPUs, it's nice but also useless for now and for next year's GPUs.
Posted on Reply
#5
TumbleGeorge
DP 2.0 is capable of up to 16K 60 Hz HDR with compression on one monitor. Maybe that is an argument for 16K TVs, and maybe PC monitors, in 3 to 5 years?
Posted on Reply
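A quick way to sanity-check the 16K-with-compression figure in the comment above is to divide the raw data rate of a 16K, 60 Hz, 30 bpp (10-bit HDR) stream by the approximate DP 2.0 payload; the result is the Display Stream Compression ratio that would be required. This is a rough sketch with the same approximations as the snippet in the article above (blanking and protocol overhead ignored).

# Approximate DSC ratio needed for 16K 60 Hz HDR over DP 2.0.
PAYLOAD_GBPS = 80 * 128 / 132                # ~77.6 Gbit/s usable after 128b/132b encoding
raw = 15360 * 8640 * 60 * 30 / 1e9           # 16K, 60 Hz, 30 bpp -> ~238.9 Gbit/s
print(f"required compression ~ {raw / PAYLOAD_GBPS:.1f}:1")   # ~3.1:1

A ratio of roughly 3:1 is within the range DSC is designed to provide, which is why the single-monitor 16K figure only works with compression.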
#6
ZoneDymo
TumbleGeorgeDP 2.0 is capable of up to 16K 60 Hz HDR with compression on one monitor. Maybe that is an argument for 16K TVs, and maybe PC monitors, in 3 to 5 years?
Yeah, but it would be pointless. You won't be able to see a difference unless you are looking at a REALLY large screen, and I don't think rooms are going to change dimensions enough to accommodate that in the future; nor, again, would there be any point.
Posted on Reply
#7
TumbleGeorge
ZoneDymoYeah, but it would be pointless. You won't be able to see a difference unless you are looking at a REALLY large screen, and I don't think rooms are going to change dimensions enough to accommodate that in the future; nor, again, would there be any point.
This will not stop corporations on their way to our wallets. :) Is there any chance that DP 2.0, or even HDMI 2.1+, will be enough to carry the signal for high-resolution holographic video to a next-generation holographic projector?
Posted on Reply
#8
Legacy-ZA
delshayI'm pretty sure Nvidia would not be lagging behind either; they too will have DisplayPort 2.0 in the pipeline, along with Intel.
As far as I know, the Ti Founders Edition models have DP 2.0; it's just the AIBs that cheaped out. One thing I wish reviewers had picked up on and pointed out.
Posted on Reply
#10
TumbleGeorge
Legacy-ZAAs far as I know, the Ti Founders Edition models have DP 2.0; it's just the AIBs that cheaped out. One thing I wish reviewers had picked up on and pointed out.
They have 1.2 and, after that, 1.4a/b. DP 2.0 is too young and, to date, isn't used in graphics cards. So far, only an adapter between DP 2.0 and maybe USB4 or HDMI 2.1(?) has been announced.
Posted on Reply
#11
Asni
It's 80 Gb/s; 80 GB/s would be 640 Gb/s.
mechtechHope it's true... been a long time coming. However, most monitors still use 1.2a. Monitor makers should have implemented 2.0 a while ago. It did come out over 2 years ago now.
vesa.org/press/vesa-publishes-displayport-2-0-video-standard-enabling-support-for-beyond-8k-resolutions-higher-refresh-rates-for-4k-hdr-and-virtual-reality-applications/
Most monitors use 1.2a because they don't need more bandwidth, and any new features can be implemented without additional bandwidth.
The DisplayPort 2.0 specification was defined two years ago; it usually takes 18-24 months to produce a controller, and it took a little longer because of COVID.
Posted on Reply
#12
jeremyshaw
Does DP/VESA allow for DP 2.0 implementations that don't have the full bit rate, like the trend of HDMI 2.1 devices that have 36 or 40 Gbps instead of the full 48?
Posted on Reply
#13
mechtech
AsniIt's 80 Gb/s; 80 GB/s would be 640 Gb/s.

Most monitors use 1.2a because they don't need more bandwidth, and any new features can be implemented without additional bandwidth.
The DisplayPort 2.0 specification was defined two years ago; it usually takes 18-24 months to produce a controller, and it took a little longer because of COVID.
Of course. However, for the sake of argument, DP 1.4 has been out for a very long time and it's extremely rare even to find a 1.4 monitor. And oddly enough, it seems monitors are quick to adopt new versions of HDMI.
Posted on Reply
#15
Rithsom
RichardsRDNA3 8K 120 native?
I know you're joking, but just imagine having a card that could run CP2077 at 8K 120 FPS, with maxed-out settings (including ray tracing), and no upscaling. It would be around five times faster than the RTX 3090, which is the first so-called "8K Gaming Card".

I don't expect such a card to come out until at least 2028, and even then it will cost a fortune. Reasonable 8K gaming performance is just not possible yet, and I doubt it will be with RDNA3 or RTX 4000, either.
Posted on Reply
#16
Turmania
When is RDNA3's expected release date?
Posted on Reply
#17
Minus Infinity
TumbleGeorgeDP 2.0 is capable of up to 16K 60 Hz HDR with compression on one monitor. Maybe that is an argument for 16K TVs, and maybe PC monitors, in 3 to 5 years?
16K TVs, at least, would draw up to 7-8 kW depending on size. 75-inch 8K TVs are already up around 2 kW peak. Apparently plasma was discontinued because power usage was too high at 400-600 W, but now apparently it doesn't matter.
Posted on Reply
#18
Rithsom
Minus InfinityApparently plasma was discontinued because power usage was too high at 400-600 W, but now apparently it doesn't matter.
There was also a time when people cared about GPU power consumption. Back in the day, the GTX 480 was criticized for consuming 250W of power. Now people are fine with 350+W cards, with 250W considered modest. :kookoo:

I don't understand people sometimes...
Posted on Reply
#19
Richards
RithsomI know you're joking, but just imagine having a card that could run CP2077 at 8K 120 FPS, with maxed-out settings (including ray tracing), and no upscaling. It would be around five times faster than the RTX 3090, which is the first so-called "8K Gaming Card".

I don't expect such a card to come out until at least 2028, and even then it will cost a fortune. Reasonable 8K gaming performance is just not possible yet, and I doubt it will be with RDNA3 or RTX 4000, either.
The RX 6900 XT averages 105 FPS at 4K across 15 games. RDNA3 is rumoured to be 2.8x faster with chiplets, so that's 294 FPS at 4K; in some games it could get 8K 120.
Posted on Reply
#20
Rithsom
RichardsThe RX 6900 XT averages 105 FPS at 4K across 15 games. RDNA3 is rumoured to be 2.8x faster with chiplets, so that's 294 FPS at 4K; in some games it could get 8K 120.
A 2.8x performance gain from RDNA2 to RDNA3? Maybe if AMD were to release RDNA3 five years from now. However, with a planned release in 2023, I just don't see that happening. We haven't seen silicon advance that rapidly since the 1990s.
Posted on Reply
#21
wahdangun
ZoneDymoYeah, but it would be pointless. You won't be able to see a difference unless you are looking at a REALLY large screen, and I don't think rooms are going to change dimensions enough to accommodate that in the future; nor, again, would there be any point.
But you could also skip AA, and the jaggies would not be visible.
Posted on Reply
#22
ARF
ZoneDymoYeah, but it would be pointless. You won't be able to see a difference unless you are looking at a REALLY large screen, and I don't think rooms are going to change dimensions enough to accommodate that in the future; nor, again, would there be any point.
So it would have been much better to stay forever with 16-color, 640 × 350 monitors, because people can't see the differences?!

The difference between 1080p and 2160p at any size is DAY and NIGHT, and the difference between 2160p and 4320p AT ANY SIZE will be day and night.
Posted on Reply
#23
ZoneDymo
ARFSo it would have been much better to stay forever with 16-color, 640 × 350 monitors, because people can't see the differences?!

The difference between 1080p and 2160p at any size is DAY and NIGHT, and the difference between 2160p and 4320p AT ANY SIZE will be day and night.
The point is that past 8K you need a magnifying glass to see differences, so yeah, that would be rather silly, wouldn't it?
Magnifying glasses exist to overcome the limits of your sight to discern tiny details/differences. Once we are at 8K, unless you get a REALLY big monitor and sit really close (which... who would?), it's pointless to go for 16K, 32K, 1024K... you won't be able to see a difference.
Posted on Reply
#24
ARF
ZoneDymoThe point is that past 8K you need a magnifying glass to see differences, so yeah, that would be rather silly, wouldn't it?
Magnifying glasses exist to overcome the limits of your sight to discern tiny details/differences. Once we are at 8K, unless you get a REALLY big monitor and sit really close (which... who would?), it's pointless to go for 16K, 32K, 1024K... you won't be able to see a difference.
How do you know without having actually seen any of these?
Just your guesses.

And no, my vision has no limits.
Posted on Reply
#25
Jermelescu
ARFSo it would have been much better to stay forever with 16-color, 640 × 350 monitors, because people can't see the differences?!

The difference between 1080p and 2160p at any size is DAY and NIGHT, and the difference between 2160p and 4320p AT ANY SIZE will be day and night.
2160p -> 4320p on anything smaller than 28" won't make as big a difference as 1080p -> 2160p did.
It's just like how a 144 Hz refresh rate is a billion times better than 60 Hz, but 240-360 Hz compared to 144 Hz, even though it's better, isn't that monumental.
Posted on Reply