Tuesday, August 11th 2020
AMD RDNA 2 "Big Navi" to Feature 12 GB and 16 GB VRAM Configurations
As we get closer to the launch of RDNA 2 based GPUs, which are supposedly coming in September this year, the number of rumors is starting to pick up. Today, a new rumor comes our way from the Chinese forum Chiphell. A user called "wjm47196", known for providing rumors and all kinds of pieces of information, claims that AMD's RDNA 2 based "Big Navi" GPU will come in two configurations: 12 GB and 16 GB VRAM variants. Given that this is the Navi 21 chip, which represents the top-end GPU, it is logical that AMD would equip it with a larger amount of VRAM such as 12 GB or 16 GB. It is possible that AMD could separate the two variants the way NVIDIA has done with the GeForce RTX 2080 Ti and Titan RTX, with the 16 GB variant being a bit faster, possibly featuring a higher number of streaming processors.
Sources:
TweakTown, via Chiphell
104 Comments on AMD RDNA 2 "Big Navi" to Feature 12 GB and 16 GB VRAM Configurations
If I had to guess, AMD is playing the branding game here. There were some wins in getting FreeSync TVs out on the market, and Nvidia, while branding it as G-Sync Compatible, is using a standards-based approach behind the marketing this time around. HDMI 2.1 VRR support is not that large of a win before the cards actually have HDMI 2.1 outputs, because technically it is a bit of a mess. With HDMI 2.0 you are limited to a 2160p 40-60 Hz range with no LFC, or 1440p 40-120 Hz. For a proper 2160p 40-120 Hz range, you need a card with an HDMI 2.1 output.
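To illustrate the LFC point: a panel generally needs its maximum refresh to be at least roughly twice its minimum so that low frame rates can be folded back into the VRR window by repeating frames. Here is a toy sketch of that rule (my own illustration, not any vendor's actual driver logic; the ranges are the ones mentioned above):

```cpp
#include <cstdio>

// Rough sketch (not any vendor's actual driver logic): LFC generally needs the
// panel's maximum refresh to be at least ~2x its minimum, so low frame rates can
// be kept inside the VRR window by repeating each frame an integer number of times.
struct VrrRange { double min_hz; double max_hz; };

bool SupportsLfc(const VrrRange& r) {
    return r.max_hz >= 2.0 * r.min_hz;
}

// Pick how many times to scan out each frame so the effective refresh stays in range.
int FrameRepeatCount(const VrrRange& r, double fps) {
    if (fps >= r.min_hz) return 1;   // already inside the window
    if (!SupportsLfc(r)) return 1;   // nothing we can do, expect judder or tearing
    int n = 1;
    while (fps * (n + 1) <= r.max_hz && fps * n < r.min_hz) ++n;
    return n;
}

int main() {
    VrrRange hdmi20_4k {40.0, 60.0};    // the 2160p 40-60 Hz case mentioned above
    VrrRange hdmi21_4k {40.0, 120.0};   // 2160p 40-120 Hz with an HDMI 2.1 output
    std::printf("40-60 Hz  LFC: %s\n", SupportsLfc(hdmi20_4k) ? "yes" : "no");
    std::printf("40-120 Hz LFC: %s\n", SupportsLfc(hdmi21_4k) ? "yes" : "no");
    std::printf("30 fps on 40-120 Hz -> repeat each frame %d times\n",
                FrameRepeatCount(hdmi21_4k, 30.0));
    return 0;
}
```

With a 40-60 Hz window, a rate like 35 fps has no multiple that lands inside the range (35 is too low, 70 is too high), which is why LFC wants max at least 2x min and why the HDMI 2.0 2160p case goes without it.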
Second, there is no evidence that more than three games can consume 11.01 GB or more of VRAM. At least TechPowerUp's reviews don't show it.
Third, judging by the VII's and 5700 XT's poor 4K performance, I don't think RDNA II is able to handle 4K gaming either.
So RDNA II's VRAM appeal is still nonsense. It's definitely for mining and the like, not for gaming.
AMD should not raise costs for larger but useless VRAM; it should improve its GPU core performance instead.
I've been gaming at 4K/75 Hz for two years, and 90% of games can easily be made to run at 4K with a Vega 64, so for about 10% of games I absolutely have to drop resolution.
8 GB is the new minimum/old minimum for me; more would hopefully make the same GPU last a year longer with probable compromises, something that in reality most gamers accept, like 95% of them. At least.
Sony's engineers have addressed this: the amount of RAM required and the game installation size are drastically reduced if a faster SSD is available, since there is no need to pre-load or keep multiple copies of the same assets just so they load faster. I've briefly touched on HDMI-CEC (when my sat receiver didn't want to play along with... wait a sec, an LG TV).
My observations:
1) HDMI implementations are a clusterf*ck of quirks, so adding yet another one is no big deal.
2) "Which vendor just connected?" is part of the standard handshake.
3) Vendor-specific codes are part of the standard. ("Oh, you are LG too? Let's speak Klingon!!!!") A rough sketch of that handshake follows below.
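To make point 3 concrete, here is a loose sketch of that exchange. It is not based on any real CEC library; the opcode values are what I recall from the HDMI-CEC spec (0x8C "Give Device Vendor ID", 0x87 "Device Vendor ID", 0xA0 "Vendor Command With ID"), and the OUI is a placeholder, not any vendor's real ID:

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// Loose sketch of the CEC exchange described above, not tied to any real CEC
// library. Opcode values are from my reading of the HDMI-CEC spec; the OUI
// below is purely illustrative.
constexpr uint8_t  kGiveDeviceVendorId  = 0x8C;
constexpr uint8_t  kDeviceVendorId      = 0x87;
constexpr uint8_t  kVendorCommandWithId = 0xA0;
constexpr uint32_t kOurVendorOui        = 0x123456;  // placeholder IEEE OUI

struct CecFrame {
    uint8_t initiator;            // logical address of the sender
    uint8_t destination;          // logical address of the receiver (0xF = broadcast)
    std::vector<uint8_t> payload; // opcode + parameters
};

// Ask the newly connected device who made it.
CecFrame QueryVendor(uint8_t us, uint8_t them) {
    return {us, them, {kGiveDeviceVendorId}};
}

// Decide whether to enable the private, vendor-specific command set
// ("Oh, you are LG too? Let's speak Klingon") based on the reply.
bool SameVendor(const CecFrame& reply) {
    if (reply.payload.size() < 4 || reply.payload[0] != kDeviceVendorId) return false;
    uint32_t oui = (reply.payload[1] << 16) | (reply.payload[2] << 8) | reply.payload[3];
    return oui == kOurVendorOui;
}

int main() {
    CecFrame query = QueryVendor(/*us=*/0x04, /*them=*/0x00);  // playback device -> TV
    std::printf("sent opcode 0x%02X to logical address %u\n",
                (unsigned)query.payload[0], (unsigned)query.destination);

    CecFrame reply{0x00, 0x04, {kDeviceVendorId, 0x12, 0x34, 0x56}};
    if (SameVendor(reply))
        std::printf("same vendor: switch to 0x%02X vendor commands\n",
                    (unsigned)kVendorCommandWithId);
    else
        std::printf("unknown vendor: stick to standard CEC messages\n");
    return 0;
}
```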
Of course, VRAM usage in games is an interesting topic in and of itself, given how much it can vary due to different factors. I've seen examples of the same game at the same resolution and settings, with similar performance, use several GB more VRAM on GPUs from one vendor than on the other (IIRC it was a case where Nvidia GPUs hit near 6 GB while AMD GPUs stayed around 4 GB). Whether that is due to the driver, something weird in the game code, or something else entirely is beyond me, but it's a good illustration of how this can't be discussed as a straightforward "game A at resolution X will ALWAYS need Y GB of VRAM" situation (a small measurement sketch is at the end of this post).

That's true, HDMI is indeed a mess, but my impression is that HDMI 2.1 is supposed to alleviate this by integrating a lot of optional things into the standard rather than having vendors make custom extensions. Then again, can you even call something a standard if it isn't universally applicable by some reasonable definition? I would say not. Unless, of course, you're a fan of XKCD. But nonetheless, even if "which vendor just connected" is part of the handshake, "if vendor = X, treat an HDMI 2.0 device as an HDMI 2.1 device" goes quite a ways beyond that.
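On the measurement side of the VRAM point above: on Windows, one way for a process to see how much local video memory it has actually committed, versus the budget the OS currently grants it, is IDXGIAdapter3::QueryVideoMemoryInfo. This is a minimal sketch of that call (it reports per-process usage, so it has to run inside the app being measured; external tools rely on other counters), which hints at why "VRAM used" is not a single well-defined number:

```cpp
// Windows-only sketch: poll how much local (on-card) video memory the current
// process has committed, via IDXGIAdapter3::QueryVideoMemoryInfo. CurrentUsage
// is per-process; Budget is what the OS currently grants this process.
// Build: cl /EHsc vram_poll.cpp dxgi.lib
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;  // skip WARP

        ComPtr<IDXGIAdapter3> adapter3;
        if (FAILED(adapter.As(&adapter3))) continue;

        DXGI_QUERY_VIDEO_MEMORY_INFO info{};
        if (SUCCEEDED(adapter3->QueryVideoMemoryInfo(
                0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info))) {
            std::wprintf(L"%ls: %.1f GB used of %.1f GB budget\n",
                         desc.Description,
                         info.CurrentUsage / (1024.0 * 1024.0 * 1024.0),
                         info.Budget / (1024.0 * 1024.0 * 1024.0));
        }
    }
    return 0;
}
```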
Pretty sure the new generation will reserve 3-4 GB for system use, if not more: operating system, caches and such. The 12-13 GB that remain include both RAM and VRAM for a game to use. There are some savings from not having to load textures into RAM and then transfer them to VRAM, but that does not make too big of a difference.
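As a back-of-envelope version of that split (using the numbers assumed above, which are guesses rather than official figures):

```cpp
#include <cstdio>

// Back-of-envelope split of a unified 16 GB console pool, using the assumptions
// from the comment above (roughly 3-4 GB reserved by the OS); the remaining pool
// is shared between what a PC would call "RAM" and "VRAM".
int main() {
    const double total_gb      = 16.0;
    const double os_reserve_gb = 3.5;   // assumed midpoint of the 3-4 GB guess
    const double game_pool_gb  = total_gb - os_reserve_gb;

    // Hypothetical splits: how much of the shared pool ends up as GPU-visible
    // assets versus CPU-side game data is entirely up to the title.
    for (double gpu_share = 0.5; gpu_share <= 0.81; gpu_share += 0.15) {
        std::printf("GPU share %.0f%%: %.1f GB 'VRAM', %.1f GB 'RAM'\n",
                    gpu_share * 100.0,
                    game_pool_gb * gpu_share,
                    game_pool_gb * (1.0 - gpu_share));
    }
    return 0;
}
```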
This time I wanted to switch to AMD for the first time in my life. Last Christmas I bought an LG C9 because of its 4K @ 120 Hz @ VRR (G-Sync / FreeSync). Nvidia's RTX generation can do VRR over HDMI 2.0 as G-Sync Compatible. Yesterday I read that LG's 2019 OLEDs won't work with Big Navi @ FreeSync. That means I have to buy Nvidia again...
[/QUOTE]
You've never had AMD? Forget it, they are demanding in every sense. Not that they aren't powerful..... but...
By the way, why doesn't the chiplet approach work with GPUs?
www.computerworld.com/article/2534312/the--640k--quote-won-t-go-away----but-did-gates-really-say-it-.html
Someone else might have said it with 16K of RAM, I don't know.
I would say DX12 multi-GPU was not a bad idea, but when it comes to completely relying on developers, that seems to be a no-go as a whole.
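For context on how much of that lands on developers: in D3D12's explicit multi-adapter model, every GPU is simply another adapter, and the application itself has to create a device per GPU and handle scheduling and cross-adapter copies. A minimal sketch of just the enumeration and device-creation step (error handling trimmed):

```cpp
// Minimal sketch of why D3D12 explicit multi-adapter leans on the developer:
// each GPU is just another adapter, and the app must create and drive a
// separate device (queues, heaps, cross-adapter copies) for each one.
// Build: cl /EHsc list_gpus.cpp d3d12.lib dxgi.lib
#include <d3d12.h>
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <vector>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory6> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;  // skip the WARP rasterizer

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            std::wprintf(L"Created device on: %ls (%u MB VRAM)\n",
                         desc.Description, (unsigned)(desc.DedicatedVideoMemory >> 20));
            devices.push_back(device);
            // From here on, nothing is automatic: the engine has to split the
            // frame, build command lists per device, and share resources via
            // cross-adapter heaps itself.
        }
    }
    std::wprintf(L"Usable GPUs: %u\n", (unsigned)devices.size());
    return 0;
}
```

Everything past this point, splitting the frame, sharing resources through cross-adapter heaps, synchronizing the queues, is the game engine's problem, which lines up with it being a no-go in practice.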
You are right about the required paradigm shift, but I'm not sure exactly what that will be. Right now, hardware vendors do not seem to have very good ideas for that either :(