
NVIDIA RTX 50-series "GB20X" GPU Memory Interface Details Leak Out

T0@st

News Editor
Joined
Mar 7, 2023
Messages
2,077 (3.32/day)
Location
South East, UK
Earlier in the week it was revealed that NVIDIA had distributed next-gen AI GPUs to its most important ecosystem partners and customers—Dell's CEO expressed enthusiasm with his discussion of "Blackwell" B100 and B200 evaluation samples. Team Green's next-gen family of gaming GPUs has received less media attention in early 2024—a mid-February TPU report pointed to a rumored PCIe 6.0 CEM specification for upcoming RTX 50-series cards, but leaks have become uncommon since late last year. Top technology tipster kopite7kimi has broken the relative silence on Blackwell's gaming configurations—an early-hours tweet posits a slightly underwhelming scenario: "although I still have fantasies about 512 bit, the memory interface configuration of GB20x is not much different from that of AD10x."

Past disclosures have hinted at next-gen NVIDIA gaming GPUs sporting memory interface configurations comparable to the current crop of "Ada Lovelace" models. The latest batch of insider information suggests that Team Green's next flagship GeForce RTX GPU—GB202—will stick with a 384-bit memory bus. The beefiest current-gen GPU, AD102—as featured in GeForce RTX 4090 graphics cards—is specced with a 384-bit interface. A significant upgrade for GeForce RTX 50xx cards could arrive with a step up to next-gen GDDR7 memory—kopite7kimi reckons that top GPU designers will stick with 16 Gbit memory chip densities (2 GB). JEDEC officially announced its "GDDR7 Graphics Memory Standard" a couple of days ago. VideoCardz has kindly assembled the latest batch of insider info into a cross-generation comparison table (see below).
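For readers keeping score, the capacity and bandwidth implications of those figures are simple arithmetic. A quick sketch below—the GDDR7 data rate used is an assumed example for illustration, not a leaked figure:

```python
# Back-of-envelope capacity/bandwidth math for the rumored configs.
# The 28 Gbps GDDR7 rate is an assumed example, not a leaked figure.

def vram_capacity_gb(bus_width_bits: int, chip_density_gbit: int) -> int:
    """Each GDDR chip sits on a 32-bit channel; capacity = chips * density."""
    chips = bus_width_bits // 32
    return chips * chip_density_gbit // 8  # Gbit -> GB

def bandwidth_gbps(bus_width_bits: int, data_rate_gtps: float) -> float:
    """Peak bandwidth in GB/s = (bus width / 8 bits per byte) * per-pin rate."""
    return bus_width_bits / 8 * data_rate_gtps

# AD102 (RTX 4090): 384-bit, 16 Gbit (2 GB) GDDR6X chips at 21 Gbps
print(vram_capacity_gb(384, 16), bandwidth_gbps(384, 21.0))  # 24 GB, 1008 GB/s

# Rumored GB202: same 384-bit bus and 16 Gbit density, GDDR7 at e.g. 28 Gbps
print(vram_capacity_gb(384, 16), bandwidth_gbps(384, 28.0))  # 24 GB, 1344 GB/s
```

Same bus, same chip density, same 24 GB—the generational uplift would come entirely from GDDR7's higher per-pin data rate.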



View at TechPowerUp Main Site | Source
 
Joined
Mar 29, 2023
Messages
1,045 (1.73/day)
Processor Ryzen 7800x3d
Motherboard Asus B650e-F Strix
Cooling Corsair H150i Pro
Memory Gskill 32gb 6000 mhz cl30
Video Card(s) RTX 4090 Gaming OC
Storage Samsung 980 pro 2tb, Samsung 860 evo 500gb, Samsung 850 evo 1tb, Samsung 860 evo 4tb
Display(s) Acer XB321HK
Case Coolermaster Cosmos 2
Audio Device(s) Creative SB X-Fi 5.1 Pro + Logitech Z560
Power Supply Corsair AX1200i
Mouse Logitech G700s
Keyboard Logitech G710+
Software Win10 pro
If this is true, then it is rather disappointing. It will be the third flagship card in a row with 24 GB of VRAM, which is already a limiting factor in some games at 8K.
 
Joined
Nov 27, 2023
Messages
2,314 (6.41/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (23H2)
If this is true, then it is rather disappointing. It will be the third flagship card in a row with 24 GB of VRAM, which is already a limiting factor in some games at 8K.
Ah yes, gaming at 8K, a totally reasonable and sane workload. Not 5K which is actually the most likely high end standard to emerge after 4K is mainstream in… a decade maybe? Like, come on.
And are AMD GPUs doing significantly better at 8K with their wider buses? I have no idea, I am not in a habit of checking 12 FPS benchmarks for completely unrealistic scenarios.
 
Joined
Sep 26, 2022
Messages
216 (0.27/day)
If this is true, then it is rather disappointing. It will be the third flagship card in a row with 24 GB of VRAM, which is already a limiting factor in some games at 8K.

I speak for myself, but I really don't give a toss about 8K gaming. I'm planning on purchasing a 4K PC monitor this year, and my PlayStation 5 can barely output 30 FPS at native 4K lol

I don't see the point of engineering, and making your customers pay for, more than 24 GB of VRAM for a use case that would satisfy the 0.001% (0.000001%?) who actually own an 8K display :).

Once GPUs can comfortably master 4K, then we can evolve to 8K. We are really not there yet, at all.
 
Joined
Nov 22, 2023
Messages
195 (0.53/day)
Not surprising.

I'd even bet on L2 cache sizes shrinking or at least staying the same.

New GDDR generation means all the workarounds for bandwidth to shader ratios can be alleviated for a short while.

No need for bigger buses or caches. Clamshell or 3 GB chips if you need more capacity.
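The capacity options mentioned here work out as follows—a rough sketch, assuming each GDDR chip keeps its standard 32-bit interface and that clamshell mode hangs two chips off each channel:

```python
# VRAM capacity for a given bus width and chip density (illustrative math,
# not leaked specs). One chip per 32-bit channel; clamshell doubles the
# chip count by putting two half-width chips on each channel.
def capacity_gb(bus_bits: int, chip_gbit: int, clamshell: bool = False) -> float:
    chips = (bus_bits // 32) * (2 if clamshell else 1)
    return chips * chip_gbit / 8  # Gbit -> GB

print(capacity_gb(384, 16))         # 24.0 GB -- today's 384-bit flagship
print(capacity_gb(384, 24))         # 36.0 GB -- with 3 GB (24 Gbit) chips
print(capacity_gb(384, 16, True))   # 48.0 GB -- 2 GB chips in clamshell
```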
 
Joined
Mar 29, 2023
Messages
1,045 (1.73/day)
Processor Ryzen 7800x3d
Motherboard Asus B650e-F Strix
Cooling Corsair H150i Pro
Memory Gskill 32gb 6000 mhz cl30
Video Card(s) RTX 4090 Gaming OC
Storage Samsung 980 pro 2tb, Samsung 860 evo 500gb, Samsung 850 evo 1tb, Samsung 860 evo 4tb
Display(s) Acer XB321HK
Case Coolermaster Cosmos 2
Audio Device(s) Creative SB X-Fi 5.1 Pro + Logitech Z560
Power Supply Corsair AX1200i
Mouse Logitech G700s
Keyboard Logitech G710+
Software Win10 pro
Ah yes, gaming at 8K, a totally reasonable and sane workload. Not 5K which is actually the most likely high end standard to emerge after 4K is mainstream in… a decade maybe? Like, come on.
And are AMD GPUs doing significantly better at 8K with their wider buses? I have no idea, I am not in a habit of checking 12 FPS benchmarks for completely unrealistic scenarios.
I speak for myself, but I really don't give a toss about 8K gaming. I'm planning on purchasing a 4K PC monitor this year, and my PlayStation 5 can barely output 30 FPS at native 4K lol

I don't see the point of engineering, and making your customers pay for, more than 24 GB of VRAM for a use case that would satisfy the 0.001% (0.000001%?) who actually own an 8K display :).

Once GPUs can comfortably master 4K, then we can evolve to 8K. We are really not there yet, at all.

It gets so tiring with people spewing nonsense about stuff they obviously have no clue about, or experience with...


As seen above I AM playing at 8K, and it works great in a lot of titles, but some are limited by VRAM...

The Witcher 3 remaster with ultra ray tracing is an example of a game that runs fine at 8K - right until it runs out of VRAM, and FPS absolutely tanks due to VRAM swapping.


 
Last edited:
Joined
Nov 27, 2023
Messages
2,314 (6.41/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (23H2)
As seen above I AM playing at 8K, and it works great in a lot of titles, but some are limited by VRAM...
*sigh* So what level of DLSS do you have engaged on this “8K” screenshot? Looks like Quality, if I am not mistaken? With a 4090 to boot. At 60 FPS. Barely acceptable. Because when I talk about 8K I am strictly talking in terms of native res. Obviously, upsampling changes the game.
Oh, and Quality DLSS is 5K by the way. Exactly what I was talking about in terms of a next render target. We are more in agreement than not, it seems.
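The "Quality DLSS is 5K" arithmetic checks out: DLSS Quality renders internally at 2/3 of the output resolution per axis. A quick check, using the published per-axis scale factors:

```python
# Internal render resolution for a given DLSS output resolution and per-axis
# scale factor (Quality = 2/3, Balanced ~= 0.58, Performance = 1/2).
def render_res(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    return round(out_w * scale), round(out_h * scale)

print(render_res(7680, 4320, 2 / 3))   # (5120, 2880) -- "8K Quality" is 5K
print(render_res(3840, 2160, 2 / 3))   # (2560, 1440) -- "4K Quality" is 1440p
```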
 
Joined
Mar 29, 2023
Messages
1,045 (1.73/day)
Processor Ryzen 7800x3d
Motherboard Asus B650e-F Strix
Cooling Corsair H150i Pro
Memory Gskill 32gb 6000 mhz cl30
Video Card(s) RTX 4090 Gaming OC
Storage Samsung 980 pro 2tb, Samsung 860 evo 500gb, Samsung 850 evo 1tb, Samsung 860 evo 4tb
Display(s) Acer XB321HK
Case Coolermaster Cosmos 2
Audio Device(s) Creative SB X-Fi 5.1 Pro + Logitech Z560
Power Supply Corsair AX1200i
Mouse Logitech G700s
Keyboard Logitech G710+
Software Win10 pro
*sigh* So what level of DLSS do you have engaged on this “8K” screenshot? Looks like Quality, if I am not mistaken? With a 4090 to boot. At 60 FPS. Barely acceptable. Because when I talk about 8K I am strictly talking in terms of native res. Obviously, upsampling changes the game.
Oh, and Quality DLSS is 5K by the way. Exactly what I was talking about in terms of a next render target. We are more in agreement than not, it seems.

Of course I'm using DLSS - it would be idiotic not to, as at 8K there basically isn't a difference in quality. And as essentially all games support it these days, yes, 8K is very much doable.

As for 60 FPS barely being acceptable - well, you just stay at 1080p then, together with the other plebs. Meanwhile I will enjoy glorious 8K graphics ;)
 
Joined
Nov 27, 2023
Messages
2,314 (6.41/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (23H2)
@Dragam1337
Ah, so you aren’t actually interested in having a discussion on GPU specs, frametime costs and how resolution scaling affects those going forward. You are just shitposting. Fair enough, carry on.
 
Joined
Mar 29, 2023
Messages
1,045 (1.73/day)
Processor Ryzen 7800x3d
Motherboard Asus B650e-F Strix
Cooling Corsair H150i Pro
Memory Gskill 32gb 6000 mhz cl30
Video Card(s) RTX 4090 Gaming OC
Storage Samsung 980 pro 2tb, Samsung 860 evo 500gb, Samsung 850 evo 1tb, Samsung 860 evo 4tb
Display(s) Acer XB321HK
Case Coolermaster Cosmos 2
Audio Device(s) Creative SB X-Fi 5.1 Pro + Logitech Z560
Power Supply Corsair AX1200i
Mouse Logitech G700s
Keyboard Logitech G710+
Software Win10 pro
@Dragam1337
Ah, so you aren’t actually interested in having a discussion on GPU specs, frametime costs and how resolution scaling affects those going forward. You are just shitposting. Fair enough, carry on.

Evidently you are confusing yourself with me.
 
Joined
Dec 31, 2020
Messages
978 (0.69/day)
Processor E5-4627 v4
Motherboard VEINEDA X99
Memory 32 GB
Video Card(s) 2080 Ti
Storage NE-512
Display(s) G27Q
Case DAOTECH X9
Power Supply SF450
They are exactly the same. "Not much different" means the same. What could be different? Don't tell me they are considering 352-bit variants again.
 
Joined
Jun 18, 2021
Messages
2,547 (2.03/day)
The 384 bits on GB202 is fine, and even GB203 with 256 bits is OK - not great, not terrible. What's impressive, and really concerning, is seeing GB205 with 192 bits. So we'll have the same clown show of '70 cards having only 12 GB of VRAM? Outstanding :nutkick:

Or maybe they use GB203 for the '70 class, the '80 gets GB202, and they retire the '90 again? Or maybe 24 Gbit GDDR7 chips will change the math? Oh well, still more than 6 months to go, so not really much point in speculating; whatever it ends up at, it will certainly be the worst possible option, as usual.
 
Joined
Jul 13, 2016
Messages
3,272 (1.07/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Ah yes, gaming at 8K, a totally reasonable and sane workload. Not 5K which is actually the most likely high end standard to emerge after 4K is mainstream in… a decade maybe? Like, come on.
And are AMD GPUs doing significantly better at 8K with their wider buses? I have no idea, I am not in a habit of checking 12 FPS benchmarks for completely unrealistic scenarios.

I'm not so sure games in the next 3 years won't use 24GB at lower resolutions than 8K. It's more than a possibility. Flagship graphics cards are not supposed to be at their limit in consumer applications right out of the gate, especially when you are talking about an xx90-class card. Extra VRAM enables future games to push what they can do; it doesn't have to have an immediate and obvious impact. The 1080 Ti is a great example of that: the card is still relevant today thanks to its 11GB of VRAM.

Mind you, IMO the biggest problem with Nvidia sticking with 24GB for its flagship card would be that it curtails its usefulness for AI. I can already reach 32GB VRAM usage at 1024x1024 on SDXL, never mind newer AI models that are currently in testing that will certainly be out by the time this GPU drops. Nvidia's next-gen cards can be amazing performance-wise, but if AMD releases its next-gen flagship with 36 or 48GB, for example, that's going to attract a lot of high-end customers over to them.
 
Joined
Nov 27, 2023
Messages
2,314 (6.41/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (23H2)
I'm not so sure games in the next 3 years won't use 24GB at lower resolutions than 8K. It's more than a possibility. Flagship graphics cards are not supposed to be at their limit in consumer applications right out of the gate, especially when you are talking an xx90 class card. Extra VRAM enables future games to push what they can do, it doesn't have to have an immediate and obvious impact. The 1080 Ti is a great example of that, the card is still relevant today thanks to it's 11GB of VRAM.
The games will use what is available. Of course, in cases of extremely shit optimization it's possible to gobble up essentially endless VRAM. And just as possible is for developers to implement absurd graphics features or nonsense like, I dunno, 8K texture packs for their Ultra Nightmare Eldritch Horror settings preset that would bring even a hypothetical 5090 Ti Super Ultra to its knees. But the truth is, the vast majority of the market won't run top-tier 2000-buck cards. Consoles also don't have 24 gigs of memory and are unlikely to with the refreshes. As such, no developer who would actually like their games to sell would push for insane VRAM usage (not just allocation for cache) targets. I think most people on this enthusiast-oriented site forget (perhaps understandably) that Ultra settings at 4K with Path Tracing is more of a tech demo for the GPU market to show off, and for the big-money customers to feel good about their purchase, than the way developers actually intended the majority of people to experience the game.
 

Hugis

Moderator
Staff member
Joined
Mar 28, 2010
Messages
824 (0.15/day)
Location
Spain(Living) / UK(Born)
System Name Office / Gamer Mk IV
Processor i5 - 12500
Motherboard TUF GAMING B660-PLUS WIFI D4
Cooling Themalright Peerless Assassin 120 RGB
Memory 32GB (2x16) Corsair CMK32GX4M2D3600C18 "micron B die"
Video Card(s) UHD770 / PNY 4060Ti (www.techpowerup.com/review/pny-geforce-rtx-4060-ti-verto)
Storage SN850X - P41Plat - SN770 - 980Pro - BX500
Display(s) Philips 246E9Q 75Hz @ 1920 * 1080
Case Corsair Carbide 200R
Audio Device(s) Realtek ALC897 (On Board)
Power Supply Cooler Master V750 Gold v2
Mouse Cooler Master MM712
Keyboard Logitech S530 - mac
Software Windows 11 Pro
OK, you two, keep it on topic and stop bickering!
If you need to sling crap around, do it by PM.
 
Last edited:
Joined
Jul 13, 2016
Messages
3,272 (1.07/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
The games will use what is available. Of course, in cases of extremely shit optimization it's possible to gobble up essentially endless VRAM. And just as possible is for developers to implement absurd graphics features or nonsense like, I dunno, 8K texture packs for their Ultra Nightmare Eldritch Horror settings preset that would bring even a hypothetical 5090 Ti Super Ultra to its knees. But the truth is, the vast majority of the market won't run top-tier 2000-buck cards. Consoles also don't have 24 gigs of memory and are unlikely to with the refreshes. As such, no developer who would actually like their games to sell would push for insane VRAM usage (not just allocation for cache) targets. I think most people on this enthusiast-oriented site forget (perhaps understandably) that Ultra settings at 4K with Path Tracing is more of a tech demo for the GPU market to show off, and for the big-money customers to feel good about their purchase, than the way developers actually intended the majority of people to experience the game.

Consoles are different: they have a dedicated decompression chip that allows them to stream assets from disk with low latency, and they benefit from closer-to-the-metal optimizations. A game like the new Ratchet & Clank has to take up more VRAM and memory on PC because it cannot assume the storage subsystem can stream assets in a timely enough manner, whereas on console it's guaranteed.

And again, for AI, 24GB is simply not enough for a flagship card that should be able to run next-gen models. I have a 4090 in my Stable Diffusion rig and I will not upgrade it to a 5090 if they aren't increasing the VRAM size. If AMD comes out with a card with more VRAM, I'd likely upgrade to that instead, particularly as ROCm has been making strides performance-wise. I can't say I see the logic in Nvidia remaining stagnant for 3 generations in a row. That's silly given the continual push for AI.
 
Joined
Apr 14, 2022
Messages
745 (0.78/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600Mhz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Acer Nitro XV271UM3B IPS 180Hz
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Keyboard Asus ROG Falchion
Software Windows 11 64bit
Unless they use 256-bit and 4GB chips... so we get 32GB of VRAM. I don't know how the bandwidth is affected though, and the price as well. Are the 4GB chips that much more expensive than 2x2GB?

NVIDIA will definitely never give us 384-bit with 4GB chips, aka 48GB, in a consumer gaming GPU anytime soon.
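For what it's worth, the bandwidth side of that question can be sketched too. Assuming 21 Gbps for today's GDDR6X and the 32 Gbps JEDEC cites as a GDDR7 target (both assumptions for illustration, and ignoring whether 4 GB chips actually ship):

```python
# Peak bandwidth (GB/s) and capacity (GB) for some hypothetical configs.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin  # bits -> bytes, times per-pin rate

def capacity_gb(bus_bits: int, gb_per_chip: int) -> int:
    return (bus_bits // 32) * gb_per_chip  # one chip per 32-bit channel

print(capacity_gb(384, 2), bandwidth_gbs(384, 21))  # 24 GB, 1008 GB/s (4090 today)
print(capacity_gb(256, 4), bandwidth_gbs(256, 32))  # 32 GB, 1024 GB/s
print(capacity_gb(384, 4), bandwidth_gbs(384, 32))  # 48 GB, 1536 GB/s
```

Notably, a 256-bit GDDR7 bus at 32 Gbps would roughly match the 4090's total bandwidth despite the narrower interface.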
 
Joined
Nov 2, 2016
Messages
111 (0.04/day)
I'm not so sure games in the next 3 years won't use 24GB at lower resolutions than 8K. It's more than a possibility. Flagship graphics cards are not supposed to be at their limit in consumer applications right out of the gate, especially when you are talking about an xx90-class card. Extra VRAM enables future games to push what they can do; it doesn't have to have an immediate and obvious impact. The 1080 Ti is a great example of that: the card is still relevant today thanks to its 11GB of VRAM.

Mind you, IMO the biggest problem with Nvidia sticking with 24GB for its flagship card would be that it curtails its usefulness for AI. I can already reach 32GB VRAM usage at 1024x1024 on SDXL, never mind newer AI models that are currently in testing that will certainly be out by the time this GPU drops. Nvidia's next-gen cards can be amazing performance-wise, but if AMD releases its next-gen flagship with 36 or 48GB, for example, that's going to attract a lot of high-end customers over to them.
You're totally right: Nvidia is rent-seeking and sticking to their VRAM amounts, like Intel was sticking to quad cores even on their high end because AMD was not able to put any real pressure on them - much like in the GPU market today. Worse, Nvidia's moat is even bigger than Intel's used to be. Nvidia also doesn't want consumer GPUs to ever be too good at AI stuff when they sell a card with a 10% higher BOM at a 10x higher price.

But the "but 8K, the humanity" argument is one of those useless e-peen arguments. 4K gaming has been around for some time and is still not prevalent in the PC space. 8K won't be a real concern for enough people to matter in the next many years. The February 2024 Steam survey says ~60% have a 1080p monitor resolution and another ~20% have 1440p. 4K is in the low single digits. And 8K is that one guy above. On DLSS. At 60 FPS. If 8K gaming is your only or even just primary concern, then the 24GB of VRAM you buy today is going to be a true limitation around the time you need to upgrade anyway.
 
Joined
Apr 15, 2021
Messages
881 (0.67/day)
It gets so tiring with people spewing nonsense about stuff they obviously have no clue about, or experience with...


As seen above I AM playing at 8K, and it works great in a lot of titles, but some are limited by VRAM...

The Witcher 3 remaster with ultra ray tracing is an example of a game that runs fine at 8K - right until it runs out of VRAM, and FPS absolutely tanks due to VRAM swapping.
My question would be, "Is it really worth it?" - considering this from both a gamer's and a game developer's perspective. Are any games even able to take full advantage of 8K rendering? I highly doubt it. The textures would need to be huge, and given that there's only so much you can do with textures to keep model poly count down without negatively affecting appearance, that would have to be increased as well.
I'm sure it still looks good, but the only purpose I could see in wasting such an insane amount of resources to run 8K would be for a large display bigger than 48" where any pixelation is undesired. Beyond that, it makes no sense to me and becomes more of an issue of practicality (i.e. how far back a pair of eyes needs to be from the screen).

I remember 2 years ago someone I know was talking about running his console games at 8K on his 8K TV, and I correctly pointed out that those games are not being played at 8K - instead everything is being upscaled - and that he had just wasted a lot of money on an 8K display if that was his only intent.
IMO, 8K just isn't worth it at present (unless you have some kind of special need) and by far has more of a niche commercial application than gaming. When they do start making games for 8K, expect to start paying even heftier prices.
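The practicality point about viewing distance can be put in rough numbers, using the common ~1 arcminute visual acuity rule of thumb (the 48" panel size is just an example):

```python
import math

# Pixels per inch of a panel, and the distance beyond which a 20/20 eye can
# no longer resolve individual pixels (~1 arcminute subtended per pixel).
def ppi(w_px: int, h_px: int, diag_in: float) -> float:
    return math.hypot(w_px, h_px) / diag_in

def pixel_blend_distance_in(ppi_val: float) -> float:
    return 1 / (ppi_val * math.tan(math.radians(1 / 60)))

for (w, h), name in [((3840, 2160), "4K"), ((7680, 4320), "8K")]:
    d = ppi(w, h, 48)
    print(f'{name} at 48": {d:.0f} PPI, pixels blend beyond ~{pixel_blend_distance_in(d):.0f} in')
```

By this estimate, 8K on a 48" panel only resolves more detail than 4K if you sit closer than roughly three feet from the screen.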
 
Joined
May 8, 2018
Messages
1,568 (0.66/day)
Location
London, UK
Ah yes, gaming at 8K, a totally reasonable and sane workload. Not 5K which is actually the most likely high end standard to emerge after 4K is mainstream in… a decade maybe? Like, come on.
And are AMD GPUs doing significantly better at 8K with their wider buses? I have no idea, I am not in a habit of checking 12 FPS benchmarks for completely unrealistic scenarios.
While I like this push to 8K, I agree with you: three generations, or 10 years, to get to 120 FPS at 8K. I do believe the 5090 will achieve 8K 60 FPS in some modern games, and 4K 200 FPS.
 
Joined
Mar 5, 2024
Messages
113 (0.43/day)
I was one of the first people to buy a 4K display, ages ago. People back then said the same things about 4K that we (even myself) say about 8K. Who knows, the jump might be too great this time. Then again, 4K was a lot more demanding than 1080p and 1440p. Especially 1080p - back then every monitor and TV was mostly focusing on that.

TVs are a good example of this huge jump. They went from 1080p to 4K directly. 4K was slow to happen too; then every single TV and every second gaming monitor had that resolution. Consoles had it too. It's like a 4K switch was flipped and the planet was suddenly all ready. I kept reading FOR YEARS how there was NO 4K content - movies, TV channels and games supporting it. YEARS. Look where we are now. It's EVERYWHERE!

So yeah, who knows? 8K might happen. It is the next step, and we always keep going up when it comes to resolutions. I have yet to see a higher new resolution that was out there and not adopted eventually. The biggest issue atm is the lack of content and displays at good prices. I sure ain't gonna buy a first-gen 8K TV. These are terrible atm, and expensive. Bad combo. The first 4K display I got was actually very good. I still use it today. I doubt I will use (for a long time) the first 8K laggy, bad-contrast, bad-reflections, bad-CPU TV.

They say it's a chicken-and-egg kind of issue. Well, we kind of need both at the same time. Why buy a TV/monitor if you can't use it for gaming or movies? So yeah, we need better video cards too. Nvidia are again disappointing us on that front. They might be right, 'cause it doesn't seem to be the time for that yet... but someone has to push it. TV makers ain't, movies ain't... so perhaps gaming?
 
Joined
May 3, 2018
Messages
2,881 (1.20/day)
Ah yes, gaming at 8K, a totally reasonable and sane workload. Not 5K which is actually the most likely high end standard to emerge after 4K is mainstream in… a decade maybe? Like, come on.
And are AMD GPUs doing significantly better at 8K with their wider buses? I have no idea, I am not in a habit of checking 12 FPS benchmarks for completely unrealistic scenarios.
But, but, DLSS 2 + DLSS 3! I love me those fake frames in the morning.

But we are jumping the gun. Where is my 27" 8K OLED 480 Hz monitor?
 
Joined
Feb 24, 2023
Messages
3,013 (4.74/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC
Processor i5-12400F / 10600KF
Motherboard Gigabyte B760M DS3H / Z490 Vision D
Cooling Laminar RM1 / Gammaxx 400
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333
Video Card(s) RX 6700 XT / R9 380 2 GB
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 + 1 TB WD HDD
Display(s) Compit HA2704 / MSi G2712
Case Matrexx 55 / Junkyard special
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / DQ550ST [backup]
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 10 and 11
What matters more is whether bang per buck and bang per watt will improve (<25% doesn't count, ffs). I don't even have hope at this point. Why bother about VRAM if the compute power of a hypothetical 5060 Ti (at 500ish USD) is roughly on par with the 550ish USD 4070 non-Super? And this, considering AMD is unwilling to compete, is the most likely scenario.

P.S. 8K will be a thing in 2030s. We aren't even 50% through the 4K gaming era.
 
Joined
Mar 29, 2014
Messages
472 (0.12/day)
Rumours based on multiple misquotes. The Dell exec never said he had them in hand; he said he was excited about what it will bring for AI. Whatever it will be, it will be designed for AI first.
 
Joined
Mar 28, 2020
Messages
1,753 (1.03/day)
I was one of the first people to buy a 4K display, ages ago. People back then said the same things about 4K that we (even myself) say about 8K. Who knows, the jump might be too great this time. Then again, 4K was a lot more demanding than 1080p and 1440p. Especially 1080p - back then every monitor and TV was mostly focusing on that.

TVs are a good example of this huge jump. They went from 1080p to 4K directly. 4K was slow to happen too; then every single TV and every second gaming monitor had that resolution. Consoles had it too. It's like a 4K switch was flipped and the planet was suddenly all ready. I kept reading FOR YEARS how there was NO 4K content - movies, TV channels and games supporting it. YEARS. Look where we are now. It's EVERYWHERE!

So yeah, who knows? 8K might happen. It is the next step, and we always keep going up when it comes to resolutions. I have yet to see a higher new resolution that was out there and not adopted eventually. The biggest issue atm is the lack of content and displays at good prices. I sure ain't gonna buy a first-gen 8K TV. These are terrible atm, and expensive. Bad combo. The first 4K display I got was actually very good. I still use it today. I doubt I will use (for a long time) the first 8K laggy, bad-contrast, bad-reflections, bad-CPU TV.

They say it's a chicken-and-egg kind of issue. Well, we kind of need both at the same time. Why buy a TV/monitor if you can't use it for gaming or movies? So yeah, we need better video cards too. Nvidia are again disappointing us on that front. They might be right, 'cause it doesn't seem to be the time for that yet... but someone has to push it. TV makers ain't, movies ain't... so perhaps gaming?
4K is still niche when you consider the fact that most gamers are still on 1080p or 1440p. And if you observed, while the likes of Ampere and Ada launched as a good step towards 4K gaming, that did not last long, and from 2023 even the flagship could not play most new games at native resolution. So if maintaining smooth framerates at 4K is hard, you can imagine the challenge at 8K. Upscaled 4K is basically running at 1080p or 1440p.
Going back to the topic, it is not unexpected that Nvidia will not really change their product specs. If even the high-margin data center and AI cards are not getting top-end specs, like higher VRAM, you can imagine their care for gamers is even less. Probably right at the bottom of their priority list.
 