My strongest candidate is a laptop version, heavily cut down (NVIDIA is doing the same).
I strongly doubt any memory-related change unless it was engineered from the start and is already in the design. The reason being they deliberately opted for GDDR6 (not 'X', not any HBM) and built that large on-chip cache to compensate. More memory means more cache or a different bus, so different silicon. A different memory type means more expense, plus the whole cache approach then becomes of questionable value (I guess they tested it extensively already, so it's possible but not probable).
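To make the 'different bus = different silicon' point concrete, here's a toy back-of-the-envelope calculation (my own illustration with rough numbers, not official specs):

```python
# Toy numbers only - just to show why a memory change isn't a drop-in tweak.
# Peak bandwidth scales with bus width and per-pin data rate, and the bus
# width is baked into the silicon (memory controllers on the die edge).

def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s = bus width (bits) * data rate (Gbps) / 8."""
    return bus_width_bits * data_rate_gbps / 8

# Plain GDDR6 on a 256-bit bus, roughly the current RDNA2 top-end setup:
print(bandwidth_gbs(256, 16))   # 512.0 GB/s - hence the big cache to compensate

# Getting GDDR6X-class bandwidth instead means faster (pricier) memory on the
# same bus, or a wider bus (e.g. 320-bit) - and a wider bus is new silicon:
print(bandwidth_gbs(320, 19))   # 760.0 GB/s
```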
A related guess is that the whole top line is mostly finished; there can be better binning or something like that, a price correction (badly needed), and the whole RDNA2 train heads toward the (desperately) needed mid/low range, APUs too. I don't know whether a lithography tweak like Zen to Zen+ is easy to do; if it is, then maybe that - but not NOW, maybe in Q4 2021...
Ampere and RDNA2 are new products with an unfinished lineup; both companies will surely look to recoup the expensive R&D investment, so there's no chance anyone goes crazy and launches a new generation this year. AMD got what they wanted, similar to Zen 1 vs. Intel - not besting them in every scenario, but being competitive and a bit cheaper. NVIDIA still has Jensen's stupid 'fastest overall' crown (they invested huge money in the past in similarly stupid cards which nobody sane bought), and what I dub the 'equally idiotic RT crown' (being a first adopter among buyers is... either genuine enthusiasm and supporting new tech because someone has to, OR being too rich or too uninformed, OR being desperate to play CP2077 or whatever 4 other games support it).
[Yeah, I was connected with or worked in that (rendering) field for three decades. I still read the news from professional/dedicated forums. It's a hobby now, but still well above average knowledge, if I may say so.]
Like 70% of games have exactly ZERO need for any RT - f00k, look at all those pixel-art games (the worst possible example) people like to play so much. Strategies. Simulations (except MS FS and the like). Let's face it, it's mainly FPS technology. What about cartoonish graphics? 2D? Both may look *somewhat* better with RT lights/shadows, but how much better, and how much is all of it worth to an average(-budget) player? Rasterized lights/shadows aren't catastrophically bad, and the whole low/mid market perhaps doesn't need it that much...
My (posted, initial) opinion was that GOOD RT needs at least 5 years. But some guy from the gaming industry said that real-time PHOTOREALISTIC gaming is 10 years away; I guess he knows it much better than I do. Also, RT isn't equal to photorealism at all.
Also, who wants ALL games to be photorealistic? Not all players, for sure.
[I've skipped all the tech details. I wanted to write an easy-to-understand, general article about RT - but the time...]
Back to the topic - NVIDIA has what they want now, and so does AMD. My opinion is that high-end improvements will come with better lithography - 6nm, 5nm, smaller; perhaps MCM - for both (and perhaps Intel, hahahaha).
It's not like any GPU producer will invest soooo much THIS year - probably not next year either.
So, perhaps binning and relatively small improvements - perhaps this year, larger ones next year or even later... There. My opinion. For both.
Oh, and NVIDIA knows much more about RT (and photorealism) than they advertise now. There is a good ebook about it on the NVIDIA site - yup, I've read it, and nothing in it disproves what I said here, or before. F00k, they have top-level 'guys' in the field - it would be weird otherwise; RT has existed for longer than I've been in it... True photorealism likely requires VR, because you need to track eyeball movement to follow the focus, just to name one tech detail...
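On the eyeball-tracking bit: the point is that you only need full detail where the eye is actually focused, which is what foveated rendering exploits. A rough toy calculation of the potential saving - purely my own illustration, with made-up numbers:

```python
import math

# Made-up numbers for illustration only: if eye tracking tells you where the
# viewer is looking, only a small foveal region needs full shading quality
# and the periphery can be shaded much more cheaply.

width, height = 3840, 2160              # hypothetical 4K eye buffer
total_px = width * height

fovea_radius_px = 400                   # hypothetical full-detail radius
fovea_px = math.pi * fovea_radius_px ** 2

periphery_px = total_px - fovea_px
effective_px = fovea_px + periphery_px / 4   # assume periphery at 1/4 resolution

print(f"Full-res shading work:  {total_px / 1e6:.1f} Mpx")
print(f"Foveated shading work:  {effective_px / 1e6:.1f} Mpx "
      f"({effective_px / total_px:.0%} of the original)")
```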