
32 GB NVIDIA RTX 5090 To Lead the Charge As 5060 Ti Gets 16 GB Upgrade and 5060 Still Stuck With Last-Gen VRAM Spec

Joined
Nov 13, 2024
Messages
88 (2.44/day)
System Name le fish au chocolat
Processor AMD Ryzen 7 5950X
Motherboard ASRock B550 Phantom Gaming 4
Cooling Peerless Assassin 120 SE
Memory 2x 16GB (32 GB) G.Skill RipJaws V DDR4-3600 DIMM CL16-19-19-39
Video Card(s) NVIDIA GeForce RTX 3080, 10 GB GDDR6X (ASUS TUF)
Storage 2 x 1 TB NVME & 2 x 4 TB SATA SSD in Raid 0
Display(s) MSI Optix MAG274QRF-QD
Power Supply 750 Watt EVGA SuperNOVA G5
I think there's about zero chance the RTX 5090 will be below $2000. This is the price movement of the RTX 4090: it went ballistic before they allegedly stopped production. But it's still widely available, so it's not scarcity.

This AI focus is going to be worse than the crypto madness, and people are still pretending it doesn't affect them.

[Attachment: RTX 4090 price chart]
for the love of god let the 8000 Series be bad at crypto, I meant AI oops
 
Last edited:
Joined
Sep 17, 2014
Messages
22,654 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
My kid loves gaming on his 8GB card. These are not AMD GPU's lol.. NV does things differently.
No, Nvidia also uses a framebuffer and cache, just like AMD does. And when it's saturated, your game will stutter like mad. It is that simple. What Nvidia can and does do is limit card performance at the driver level so you don't see the stutter as much, and if you already use frame smoothing you'll see even less of it, but you also lose net performance. It doesn't always turn out well and won't truly eliminate the stutter either, and it needs frequent Game Ready updates to remain functional. It's visible because the card starts cutting back on detail levels when the VRAM gets saturated.

So sure, if you have no clue what you're staring at, all is well, but it's a simple fact that VRAM limits are hard limits.
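For anyone who wants to watch that saturation point for themselves, here is a minimal sketch using the NVML Python bindings (installed as nvidia-ml-py, imported as pynvml) that logs how close a game sits to the physical VRAM limit. The 90% warning threshold is an arbitrary illustration, not an Nvidia-defined cutoff.

```python
# Minimal VRAM watcher using the NVML Python bindings (pip install nvidia-ml-py).
# Polls GPU 0 once per second and flags when usage nears the physical limit.
# The 90% threshold is an arbitrary illustration, not a vendor-defined cutoff.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used_gb = mem.used / 1e9
        total_gb = mem.total / 1e9
        status = "NEAR LIMIT" if mem.used / mem.total > 0.90 else "ok"
        print(f"VRAM: {used_gb:.1f} / {total_gb:.1f} GB  [{status}]")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```

Note that NVML reports total usage across all processes, so the overlay of a game plus browser tabs and the desktop compositor all count toward the figure.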
 
Joined
Aug 2, 2012
Messages
2,012 (0.44/day)
Location
Netherlands
System Name TheDeeGee's PC
Processor Intel Core i7-11700
Motherboard ASRock Z590 Steel Legend
Cooling Noctua NH-D15S
Memory Crucial Ballistix 3200/C16 32GB
Video Card(s) Nvidia RTX 4070 Ti 12GB
Storage Crucial P5 Plus 2TB / Crucial P3 Plus 2TB / Crucial P3 Plus 4TB
Display(s) EIZO CX240
Case Lian-Li O11 Dynamic Evo XL / Noctua NF-A12x25 fans
Audio Device(s) Creative Sound Blaster ZXR / AKG K601 Headphones
Power Supply Seasonic PRIME Fanless TX-700
Mouse Logitech G500S
Keyboard Keychron Q6
Software Windows 10 Pro 64-Bit
Benchmark Scores None, as long as my games run smooth.
Well, 8 GB is still enough for the low-end model; otherwise it's nice to see bumps. 32 GB is irrelevant, though, unless you do work with the 5090 that involves heavy VRAM usage. Apparently they still needed to go to 512-bit despite using GDDR7, which is way faster than G6X. GTX 280 says hello; that was the last time Nvidia used a 512-bit bus, completely different times (quick bandwidth arithmetic below).
Indeed, 8GB is still fine for 1080p60, which the 5060 is aimed at.

The 5070 Ti will be for 1440p120 and if you have 4K there will be the 5080/5090.

It's that simple, but apparently it isn't for most people.
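As a sanity check on the 512-bit / GDDR7 point quoted above: peak memory bandwidth is just bus width times per-pin data rate. A quick sketch of that arithmetic, where the 28 Gbps GDDR7 figure is the rumored launch speed rather than a confirmed spec:

```python
# Peak memory bandwidth (GB/s) = (bus width in bits / 8) * per-pin data rate (Gbps).
# The GDDR7 data rate below is the rumored launch figure, not a confirmed spec.
def peak_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gbps(512, 2.214))  # GTX 280: 512-bit GDDR3   -> ~142 GB/s
print(peak_bandwidth_gbps(384, 21))     # RTX 4090: 384-bit GDDR6X -> 1008 GB/s
print(peak_bandwidth_gbps(512, 28))     # rumored 512-bit GDDR7    -> 1792 GB/s
```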
 

AcE

Joined
Dec 3, 2024
Messages
243 (15.19/day)
No, Nvidia also uses a framebuffer and cache, just like AMD does. And when it's saturated, your game will stutter like mad.
Correct, VRAM management on Nvidia cards is just a bit better, that's it. Nvidia and AMD cards largely have a lot of parallels, and since these companies have been competing for decades, it makes sense their approaches to GPUs are related. Even Intel did nothing special and largely copied the other companies, more so Nvidia, with distinct RT units / AI units; Intel's cards are basically a near 1:1 copy of the Turing architecture.
 
Joined
Sep 17, 2014
Messages
22,654 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I don't know what kind of games you guys are playing, but I have no issues running any game with a GTX 1080 with 8GB of VRAM at 1080p. Basically ANY game I've played hovers over 50fps... and that's with MAX texture details.
Then you haven't been playing much that's recent, and you are definitely cutting back on various settings even at 1080p. Been there, done that, mate. Don't lie to yourself. Max textures run fine; post-processing, not so much. TW: WH3 absolutely nukes a 1080, for example; it's not smooth and it's not pleasant, especially on the campaign map. And that's not even a new game; try anything UE5 and enjoy the stutterfest.
 

AcE

Joined
Dec 3, 2024
Messages
243 (15.19/day)
Then you haven't been playing much that's recent, and you are definitely cutting back on various settings even at 1080p. Been there, done that, mate. Don't lie to yourself. Max textures run fine; post-processing, not so much.
Yes, but to be honest a lot of gamers are like him: they just play old stuff and/or don't care about Ultra settings, or only play competitive games like Valorant that don't need much GPU power and especially not much VRAM. I would say he is the typical use case, not a reviewer running a GPU through only AAA games at Ultra settings, which is an unrealistic use case.
 
Joined
May 11, 2018
Messages
1,292 (0.53/day)
256 more shaders, or what, 512 in the case of the vanilla 4080? It's not worth it; it will maybe be 10-20% faster, tops. Nvidia is doing exactly this because they know they have zero competition, otherwise this strategic move would be impossible, btw. Otherwise the 5080 would be bigger and this "5080" would really be a 5070 Ti, and nothing else. Welcome to a 100% monopoly in the high end. Again the same disaster as the RTX 2000 era, but I think this time even worse.

You're forgetting the new AI-DLSS that will utilize the new AI-Tensor cores, and won't work on older generations.

And AI NPC acceleration tech that will also only work on the new gen. A bunch of games will be announced, but basically nothing playable will be released before the next generation comes out.

And a bunch of other AI-related stuff; Nvidia will fully embrace the future tech!

I expect a repetition of Turing, where the RTX 2080 was basically only as fast as the GTX 1080 Ti in raster and was more expensive, but it had ray tracing, and DLSS! Although before those two technologies matured even a bit, Turing cards were already obsolete and couldn't run them properly.

And, of course, you will be able to do non gaming related AI acceleration, and Nvidia will heavily focus on that.
 
Joined
Sep 17, 2014
Messages
22,654 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Yes, but to be honest a lot of gamers are like him: they just play old stuff and/or don't care about Ultra settings, or only play competitive games like Valorant that don't need much GPU power and especially not much VRAM. I would say he is the typical use case, not a reviewer running a GPU through only AAA games at Ultra settings, which is an unrealistic use case.
And there is nothing wrong with that either. Until January last year I was playing on a 1080 too. It does the job, but you certainly can't push everything at max settings on that card anymore, even at 1080p. I think it's important to remain objective; the card's getting long in the tooth.

The reason the 1080 lasted so long is the excellent balance between core and VRAM. It has as much bandwidth as a high-end RDNA2 or Ampere card, and 8GB was perfect for it. Today's x60 has half the bandwidth and more core power. You can only fix so much of that with cache, and if the framebuffer is saturated, cache won't save you.

As always, the x60 remains the poor man's dGPU that really is just a waste of sand. E-waste, built for people who can't or won't save for something half decent. We can sugar-coat it, but it is what it is, and time proves that every single time. The 1060 6GB was an outlier in that sense, and ONLY because of its 6GB, going down in history as the longest-lasting x60 ever, I think.
 
  • Like
Reactions: AcE

AcE

Joined
Dec 3, 2024
Messages
243 (15.19/day)
You're forgetting the new AI-DLSS that will utilize the new AI-Tensor cores, and won't work on older generations.
I did not hear anything about that, but DLSS is AI anyway and Tensor cores are also AI, always were, nothing new.
And AI NPC acceleration tech that will also only work on new gen.
I don't think so my friend, games must run on a variety of video cards so they can sell them, and not only the newest GPUs. :)

Today's x60 has half the bandwidth and more core power. You can only fix so much of that with cache.
Yes, cache is just there to compensate for the bandwidth lost to the narrower bus, and it works well. All the cards with a big cache have worked well so far, starting with the RX 6000 series, then RX 7000; the RTX 40 series copied the concept and improved it with a large L2 cache instead of L3. Don't stress traditional bandwidth too much; look at "effective bandwidth" instead, which is bandwidth including the big cache, and is the realistic way these cards are used (a toy calculation after this post sketches the idea).
As always, the x60 remains the poor man's dGPU that really is just a waste of sand.
I mean, not always; it's only been like that since the RTX 40 era, and with AMD since the RX 7000 era. The x60 is now firmly a "poor man's GPU" (because the x60 is now low end, whereas before it was still mid range), especially with the RX 7600, which is still on the 6nm node and didn't even move to the 5nm node that the rest of the generation from both companies uses.

Edit on x60 cards aging well: the RX 480 / 580, which competed with the 1060, also had 8 GB versions, and the 3060 was originally a 12 GB card which later got a nerfed 8 GB version; these are all in the 1060 vein. But the 1060 having 6 GB is nothing special in my books; it's a step down from the 1070, which had 8 GB. The 3060 is more special here because it had *more* VRAM than the 3070 / 3060 Ti. The RX 580 also aged better than the 1060; the card simply has more oomph.
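To make the "effective bandwidth" idea above concrete, here is a toy model rather than any vendor's published formula: it simply blends cache and DRAM bandwidth by an assumed hit rate, and every number in it is illustrative.

```python
# Toy "effective bandwidth" model: hits are served at cache speed, misses fall
# through to DRAM. All numbers are illustrative assumptions, not real specs,
# and this is not the formula AMD or Nvidia actually publish.
def effective_bandwidth_gbps(dram_gbps: float, cache_gbps: float, hit_rate: float) -> float:
    """Blend cache and DRAM bandwidth by an assumed hit rate."""
    return hit_rate * cache_gbps + (1.0 - hit_rate) * dram_gbps

# Hypothetical 128-bit card: ~270 GB/s of DRAM bandwidth, a much faster on-die
# cache, and a hit rate that drops as the working set grows with resolution.
for res, hit in [("1080p", 0.60), ("1440p", 0.50), ("4K", 0.35)]:
    print(f"{res}: ~{effective_bandwidth_gbps(270, 2000, hit):.0f} GB/s effective")
```

The falling hit rate in the loop is the reason these narrow-bus cards tend to look worse the higher the resolution: once the working set outgrows the cache, the DRAM term dominates again.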
 
Last edited:
Joined
May 11, 2018
Messages
1,292 (0.53/day)
I did not hear anything about that, but DLSS is AI anyway and Tensor cores are also AI, always were, nothing new.

I don't think so my friend, games must run on a variety of video cards so they can sell them, and not only the newest GPUs.

DLSS 3 frame generation still only works on RTX 40x0, so 4060 yes, 3090 Ti no - because they lack electrolytes.

Nvidia will do the same with all the newly introduced "AI" tech. Mark my words
 
Joined
Sep 17, 2014
Messages
22,654 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I mean, not always; it's only been like that since the RTX 40 era, and with AMD since the RX 7000 era. The x60 is now firmly a "poor man's GPU", especially with the RX 7600, which is still on the 6nm node and didn't even move to the 5nm node that the rest of the generation from both companies uses.
Yes, always. The x60 and x50 Ti were always 'blessed' with a poor man's memory subsystem: asymmetric configurations, for example, or coming out in half a dozen OEM versions with handicapped memory, even down to stuff as bad as full-blown DDR instead of GDDR back in the day. The bar has moved up, but for this segment of cards it always moves in the most cost-effective way. If it has some semblance of running half-decently, it gets released then and there, screw everything else. What you also see in this segment is just plain older architectures, though not as much of that today, until you start factoring in mobile chips.
 
Joined
Aug 21, 2013
Messages
1,936 (0.47/day)
Try to bring technical arguments instead of empty words and just drama queen talk.
You want it in a format I can't provide. I cannot and will not transcribe an entire video for you.
I have TPU links below.
No, it didn't. 8 GB has stayed relevant way longer than those ever did. 4 GB, for example, was quickly outdated because 4 GB is just a low amount, whereas 8 GB isn't. 6 GB was just "replaced" by 8 GB; it simply vanished from the market. 2 GB had the same fate as 4 GB. Apples and kiwis.
Yes, it did. And after 8GB is obsolete and gone from new cards (the 60 series?), the same will happen with 10GB cards next, then 12GB, etc. It's inevitable.
Games constantly get more demanding.
4 GB already reached the critical zone years ago, while 8 GB is still far away from that. You're just technically wrong, and because you have no technical arguments you're just talking endlessly.
8GB is already in the critical zone. Only people like you are still in denial.
Lazy? No, I don't like videos; text is way better to digest. You are lazy. It is YOUR argument, so make it or lose the argument, life is simple. So far you haven't refuted any of my arguments; it's quite easy going for me.
You're the one trying to prove that 8GB is "youtuber drama". The onus is not on me to disprove your delusions.
Good joke, Nvidia surely won't compare their cards with a company that has 0% market share. This is like saying Apple will compare their phones with a brand that nobody is buying. Yeah, makes a ton of sense. ^^
Who cares what Nvidia compares? What matters is what consumers and reviewers compare. Nobody's buying Intel? Sure, keep telling yourself that. I guess all those cards they produced just vanished from the shelves all by themselves?
I don't need luck, Nvidia will sell the 5060 with 8 GB because 8 GB isn't an issue (given the rumours are true). The technical side (the tech companies) is on mine. Aside from reviewers like W1zzard proving it.
Proving how? By running at Ultra settings without an accompanying frametime graph, where 8GB cards get murdered?

I looked at all the performance benchmark reviews he has posted for this year's games.
11 games in total. At 1080p max settings (though not in all games, and without RT or FG) the average memory usage is 7614 MB.
7 games stay below 8GB at those settings; 4 games go over it.
6 games are run at the lowest settings, 1080p, no RT/FG, and despite that, half of them (3) still go over 8GB even at those low settings (the averaging arithmetic is sketched after this post).

Anyone looking at these numbers and seeing how close the average is to the 8GB limit should really think twice before buying an 8GB card today.
Next year, likely more than half of the tested games will surpass 8GB even at 1080p low, no RT/FG, and you have to remember that RT and FG both increase VRAM usage even more. To say nothing of frametimes on those 8GB cards: even if the entire buffer is not used up, the frametimes already take a nosedive, or in some cases textures simply refuse to load.

Links:
So why is he calling it a joke if 16 GB is so great? Maybe because 16 GB is 99% useless on a card that is mainly used for 1080p. :) 8 GB largely also works fine at 1440p, btw.
He's calling the 8GB 4060 Ti at $400 a joke. And it is. The 4060 8GB at $300 is not any better.
Off topic. The drama here is about low-end GPUs having 8 GB. In 2016 that was the 1080 and 1070, i.e. semi-high end and upper mid range, so completely different cards that have nothing to do with this discussion, other than saying "oh, 8 GB was also used back then on completely different cards".
AMD had the RX 480 in 2016 with 8GB, and that card is slower than the 1060. So even back then 8GB was not only for semi-high end or upper midrange like you claim.
The 1070 also cost only $379 for 8GB, which in 2016 was a good price for 8GB.

Eight years later, I and many other people expect more, because prices have risen but 8GB remains.
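The averaging in the post above is easy to reproduce once the per-game figures are in hand; the readings below are placeholders standing in for whatever the linked reviews report, so only the arithmetic carries over.

```python
# Back-of-envelope version of the averaging above. The per-game readings here
# are hypothetical placeholders, not the actual TPU review numbers; plug in
# the figures from the linked reviews to reproduce the real average.
vram_mb_at_1080p_max = {
    "Game A": 6900, "Game B": 7300, "Game C": 7600, "Game D": 7900,
    "Game E": 8200, "Game F": 8400, "Game G": 8700, "Game H": 7100,
    "Game I": 7500, "Game J": 7800, "Game K": 8300,
}

average = sum(vram_mb_at_1080p_max.values()) / len(vram_mb_at_1080p_max)
over_8gb = [game for game, mb in vram_mb_at_1080p_max.items() if mb > 8192]

print(f"average: {average:.0f} MB across {len(vram_mb_at_1080p_max)} games")
print(f"over the 8 GB (8192 MB) buffer: {len(over_8gb)} games -> {over_8gb}")
```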
 
Joined
Dec 14, 2011
Messages
1,083 (0.23/day)
Location
South-Africa
Processor AMD Ryzen 9 5900X
Motherboard ASUS ROG STRIX B550-F GAMING (WI-FI)
Cooling Noctua NH-D15 G2
Memory 32GB G.Skill DDR4 3600Mhz CL18
Video Card(s) ASUS GTX 1650 TUF
Storage SAMSUNG 990 PRO 2TB
Display(s) Dell S3220DGF
Case Corsair iCUE 4000X
Audio Device(s) ASUS Xonar D2X
Power Supply Corsair AX760 Platinum
Mouse Razer DeathAdder V2 - Wireless
Keyboard Corsair K70 PRO - OPX Linear Switches
Software Microsoft Windows 11 - Enterprise (64-bit)
I don't know what kind of games you guys are playing, but I have no issues running any game with a GTX 1080 with 8GB of VRAM at 1080p. Basically ANY game I've played hovers over 50fps... and that's with MAX texture details.

Not at 1080p, no, not as much, yet... DLSS reduces VRAM usage too if you use it, but at the same time, you will start needing a beefier CPU.

It's all a balance of things really, and I think what people, me included, are tired of is not being able to strike that balance, because nGreedia limits us with VRAM and forces our hands to upgrade every generation as developers keep pushing the boundaries, as they should; I don't blame them. I like to mod games too, and that can substantially increase the VRAM requirements.

I specifically upgraded to 1440p because I finally thought "this is it", things were finally looking up GPU-performance-wise, and I'd finally be able to run games at this resolution with high refresh rates. As I mentioned before, it also takes some strain off the CPU, so you can get away with a mid-range CPU. But oh boy, do I regret it thanks to nGreedia. I won't downgrade my resolution though.

Once again I will have to swallow a very, very bitter pill and buy a new GPU, but I couldn't stomach my RTX 3070 Ti at 1440p anymore. I picked up a bad habit because of it: I keep enabling and disabling my statistics overlay to see how close I am to the VRAM limit, so that I can close my game and re-launch it before stutter heaven occurs, or some sort of texture loading bug. Urgh, damn you nGreedia, damn you.
 
Joined
Jun 25, 2020
Messages
163 (0.10/day)
System Name The New, Improved, Vicious, Stable, Silent Gaming Space Heater
Processor Ryzen 7 5800X3D
Motherboard MSI B450 Tomahawk Max
Cooling be quiet! DRP4 (w/ added SilentWings3 120mm), 4x Noctua A14x25G2 (3 @ front, 1 @ back)
Memory Teamgroup DDR4 3600 16GBx2 @18-22-22-22-42 -> 18-20-20-20-40
Video Card(s) PowerColor RX7900XTX HellHound
Storage ADATA SX8200Pro 1TB, Crucial P3+ 4TB (w/riser, @Gen2x4), Seagate 3+1TB HDD, Micron 5300 7.68TB SATA
Display(s) Gigabyte M27U @4K150Hz, AOC 24G2 @1080p100Hz(Max144Hz) vertical, ASUS VP228H@1080p60Hz vertical
Case Phanteks P600S
Audio Device(s) Creative Katana V2X gaming soundbar
Power Supply Seasonic Vertex GX-1200 (ATX3.0 compliant)
Mouse Razer Deathadder V3 wired
Keyboard Keychron Q6Max w/ Gateron G Pro 3.0 Black linear switches
I'm still a bit mad that my 3070 8GB didn't like Forza Motorsport 8 with high textures at 1080p. (No, I really tried leaving only one 1080p monitor connected. And no, RT-related stuff is off. Everything else is high, not ultra.)
The game was manageable but frequently dropped below 30fps. It would probably be a bit better with everything else on low, but that wouldn't relieve any of the frame drops. Medium textures stay well over 60 fps consistently, but then it looks like an early-2010s game. And I'm allergic to abundant upscaling artifacts at 1080p.
So I had to choose between two very bad compromises: medium textures, or frequent frame drops.
Later patches supposedly relieved VRAM usage, but not before I angrily threw away the 3070.
(IIRC high textures with DLSS Quality would still only do 30fps in bad spots, but that part is muddy memory.)
Also, edge cases will eventually become the norm. So if anyone wants to market their card as a "GrEaT 1080p ChOiCe", I guess it's either a minimum of 10GB of VRAM or some compromises, likely in textures (is that still a great 1080p choice then?).



Good joke, Nvidia surely won't compare their cards with a company that has 0% market share. This is like saying Apple will compare their phones with a brand that nobody is buying. Yeah, makes a ton of sense. ^^
Xiaomi, anyone? No one is going to win by offering the same price-to-performance as US/EU/JP/KR branded products, but Xiaomi took over the Chinese phone and domestic electronics market (and then some) with comically aggressive pricing and a better price-to-performance ratio than anyone else, including other Chinese brands. Most of those devices are practically spyware-infested if you look at them a certain way, and Xiaomi phones in particular make no sense to me because of that, yet Xiaomi is apparently #2-#3 in the global smartphone market in monthly shipped devices (that is, Xiaomi has beaten Apple on that metric, albeit inconsistently). I'm really disgusted by this revelation, but I digress.


Here's what some of the so-called "drama queens" think about the B580.
Without going too far into the podcast, demand for B580 is much higher than the suppliers expected.

For the 5060 to make sense, Nvidia has to either:
- beat the B580 by a comfortable margin at 1080p while maintaining an okay price (good luck at 1440p),
- out-price the B580 (unlikely if it's still named 5060),
- price it similarly and compete on features (DLSS4 maybe? I don't know, I generally hate that kind of stuff, but at this point I feel like an old guy yelling at clouds),
- or compete on brand value alone (probably still works to a degree, if NVIDIA / shops play around with the availability of non-NVIDIA products).
 
Joined
Sep 25, 2023
Messages
159 (0.35/day)
Location
Finland
System Name RayneOSX
Processor Ryzen 9 5900x 185w
Motherboard Asus Strix x570-E
Cooling EKWB AIO 360 RGB (9 ekwb vardar RGB fans)
Memory Gskill Trident Z neo CL18 3600mhz 64GB
Video Card(s) MSI RTX 3080 Gaming Z Trio 10G LHR
Storage Wayy too many to list here
Display(s) Samsung Odyssey G5 1440p 144hz 27 x2 / Samsung Odyssey CRG5 1080p 144hz 24
Case LianLi 011D white w/ Vertical GPU
Audio Device(s) Sound Blaster Z / Swisssonic Audio 2
Power Supply Corsair RM1000x (2021)
Mouse Logitech G502 Hyperion Fury
Keyboard Ducky One 2 mini Cherry MX silent Nordic
VR HMD Quest 2
Software Win10 Pro / Ubuntu 20.04
Benchmark Scores Timespy 17 219 https://www.3dmark.com/spy/49036100 PERKELE!
Meanwhile me reloading to read the new comments.. I think he's dead already

 

AcE

Joined
Dec 3, 2024
Messages
243 (15.19/day)
Yes, always.
No, you cannot say these are a waste of sand; that's a massive exaggeration. :) They were mid range, now they are low end, it is what it is. Calling them a "waste of sand" just goes way too far. PC gaming has perhaps got more expensive, but that's what people wanted, because of the better graphics. That's also "it is what it is", the evolution of the times; with not enough foundries like TSMC able to produce chips, chips will be more expensive.
You want it in a format I can't provide. I cannot and will not transcribe an entire video for you.
Did not expect that; you can summarise it, there are various ways to make your point.
And after 8GB is obsolete and gone from new cards (the 60 series?), the same will happen with 10GB cards next, then 12GB, etc. It's inevitable.
So you basically agree with me then? Cool. I already said this is natural evolution. But 8 GB is still not at the end. Go check the performance of the 6500 XT and you will see what "the end" means. :) You seem to lack footing in reality.
8GB is already in the critical zone. Only people like you are still in denial.
It's not in the critical zone, it's in the "it's enough" zone, which is a notch above it. And "denial" is not relevant for people like me; I've been using high-end cards since 2014. :) This is purely a technical discussion to me, not an emotional one as it is to you. :)
You're the one trying to prove that 8GB is "youtuber drama". The onus is not on me to disprove your delusions.
"Trying"? I already did. Just because you're losing the argument doesn't mean you have to get mad and call me "delusional" btw. :)
Who cares what Nvidia compares? What matters is what consumers and reviewers compare. Nobody's buying Intel? Sure, keep telling yourself that.
Yes, and people will buy Nvidia, 90%, and then AMD and Intel will get 5% and later 0% like last time. Their products are just too far behind and their software stack is primitive.
Anyone looking at these numbers and seeing how close the average is to the 8GB limit should really think twice before buying an 8GB card today.
Doesn't make much sense, because the people who buy those mostly don't have more money, or they don't care about your edge cases; they get by just fine with these video cards. :)
I checked all the links; the 4060 has zero issues in all of those games. :) Reading and understanding seem to be two different things. The 4060 behaved perfectly normally in all those games, in fact. Thanks for proving all my points correct. :)
He's calling the 8GB 4060 Ti at $400 a joke. And it is. The 4060 8GB at $300 is not any better.
No, he's calling the 4060 Ti with 16 GB a $500 joke, which is the topic of the video thumbnail itself; everyone can see it. A fat upsell for something that brings you nearly 0% improvement aside from a few edge cases = burning money. Just buy a 4070 or a 7700 XT / 7800 XT instead.
AMD had the RX 480 in 2016 with 8GB, and that card is slower than the 1060. So even back then 8GB was not only for semi-high end or upper midrange like you claim.
The 1070 also cost only $379 for 8GB, which in 2016 was a good price for 8GB.
The 480 was firmly competitive with the 1060. And your assessment is wrong: the 1070 and 1080 were upper midrange and semi-high end and used 8 GB of VRAM, which is a historical fact btw, and I won't debate it with you. :)
Meanwhile me reloading to read the new comments.. I think he's dead already
If you have nothing to provide in this discussion, maybe stay away? Trolling isn't great, and to be honest, this is quite an easy discussion for me, as I have already said. He's not refuting a single word of mine, nothing. :) On the contrary, he provided all the TPU links to prove me right, thanks a lot. =)
 

Outback Bronze

Super Moderator
Staff member
Joined
Aug 3, 2011
Messages
2,041 (0.42/day)
Location
Walkabout Creek
System Name Raptor Baked
Processor 14900k w.c.
Motherboard Z790 Hero
Cooling w.c.
Memory 48GB G.Skill 7200
Video Card(s) Zotac 4080 w.c.
Storage 2TB Kingston kc3k
Display(s) Samsung 34" G8
Case Corsair 460X
Audio Device(s) Onboard
Power Supply PCIe5 850w
Mouse Asus
Keyboard Corsair
Software Win 11
Benchmark Scores Cool n Quiet.
Joined
Sep 6, 2013
Messages
3,391 (0.82/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 7600 / Ryzen 5 4600G / Ryzen 5 5500
Motherboard X670E Gaming Plus WiFi / MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2)
Cooling Aigo ICE 400SE / Segotep T4 / Νoctua U12S
Memory Kingston FURY Beast 32GB DDR5 6000 / 16GB JUHOR / 32GB G.Skill RIPJAWS 3600 + Aegis 3200
Video Card(s) ASRock RX 6600 + GT 710 (PhysX) / Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes / NVMes, SATA Storage / NVMe, SATA, external storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) / 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
Nvidia and Intel have the same philosophy about what the AVERAGE user needs. Nvidia will offer 8GB models because the AVERAGE user is at 1080p, medium settings, 30-60fps. Intel will offer UP TO 8 Performance cores, because 99% of the apps the AVERAGE user runs wouldn't take advantage of more than 8 performance cores anyway. And they can do it because what they lack in VRAM or P cores, they can replace with a shiny sticker. That's Intel on CPUs. On GPUs, where they are trying to gain market share, they are the FIRST to offer a minimum of 10GB of VRAM on TWO models with an MSRP lower than $250.

In any case, be it Intel, Nvidia or even AMD, what the average fanboy (a deliberate term to make a point, not to insult anyone) does, and it's wrong, is display loyalty to their favorite brand. Instead there should be criticism. Nvidia fans should be screaming about having a 4060 out there with 8GB of VRAM when there was a 3060 model with 12GB of VRAM; going back to 8GB is a step backwards and NO ONE should try to justify it (yes, but the 4060 comes with Frame Generation, blah blah blah). Intel fans should be screaming for more P cores, especially now that Intel has dropped hyperthreading, instead of finding excuses (but but but E cores are faster now, blah blah blah). AMD fans should be screaming about AMD using the 9800X3D to make the 7800X3D look like a bargain, when the 7800X3D was $350 a few months ago and now sells for $480, $40 over MSRP (but but but the 7800X3D is still the second fastest CPU for gaming, blah blah blah).
 
Joined
Sep 17, 2014
Messages
22,654 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
No, you cannot say these are a waste of sand; that's a massive exaggeration. :) They were mid range, now they are low end, it is what it is. Calling them a "waste of sand" just goes way too far. PC gaming has perhaps got more expensive, but that's what people wanted, because of the better graphics. That's also "it is what it is", the evolution of the times; with not enough foundries like TSMC able to produce chips, chips will be more expensive.
It's not an exaggeration. If you spend this kind of money on a GPU, it should not go obsolete this fast. You're always better off spending a bit more so you land on a well-balanced piece of hardware; even an x70, and oftentimes an x60 Ti, was a much, much better buy. But really, x70 and up.

The x60 is and was always a penny-wise, pound-foolish purchase. Sure, you pay less, but there is no resale value when you want to upgrade, because the card is by then completely obsolete, whereas an x70 will net you half the purchase price 3-4 years down the line. And you're not paying double the money for an x70 either, but less than that.

So yes, the x60 is e-waste, or put differently, PC gaming's hardware n00b trap. Save a bit more and you'll end up with better gaming and a more valuable product to sell... and fund another decent GPU with. An x60 is ready for the trash bin three times faster than an x70 tends to be.

It is, indeed, what it is, and there is a market of buyers for x60s, but it shouldn't be you ;) If you know a thing or two about this market, you know you should avoid these cards.
 

AcE

Joined
Dec 3, 2024
Messages
243 (15.19/day)
It's not an exaggeration. If you spend this kind of money on a GPU, it should not go obsolete this fast. You're always better off spending a bit more so you land on a well-balanced piece of hardware; even an x70, and oftentimes an x60 Ti, was a much, much better buy. But really, x70 and up.
No, this is normal. You pay more, you get more, one of the basic laws of capitalism (unless you spend the money on trash, which is not the case here). These are cards for people who either can't afford more or don't need more; perfectly normal, and not e-waste. Also, they all age normally and are all good for years; there are still users running a 1060 eight years later today, so what you said is just factually wrong. You can call x70 cards better, but you can't say everything under them is e-waste, aside maybe from the 6500 XT, and even there some people were perfectly happy with it. Again, use cases are different. :)

Brother, I even know someone, a friend, who still uses a GTX 960 today and is happy enough with it. ;) Just get some perspective.
 
Joined
Sep 17, 2014
Messages
22,654 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
These are cards for the people who either can't afford more or don't need more
Precisely, so they think these are good purchases, but even they would be better off buying a notch higher and then selling it off later. Because 5-6 years down the line they'll repeat the same counterproductive practice, and over a total lifespan of 2-3 GPUs they won't have spent less than I have buying a decent midranger, reselling it, then buying another.

I've lived this very thing for over a decade, bro. I know what I'm talking about. It's all a matter of perspective and, above all, experience. It's also simple math. And sure, if you never upgrade and ride your x60 until it can barely run Windows, then it's great value. But then you're not gaming properly.
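That "simple math" can be sketched with round numbers; every price, lifespan and resale fraction below is an invented assumption for illustration, not market data.

```python
# Rough sketch of the "simple math" above, over an assumed ~6-year window.
# Every price, lifespan, and resale fraction is an invented illustration,
# not market data: the point is only how the totals compare.
X60_PRICE, X60_LIFESPAN_YRS, X60_RESALE = 300, 2, 0.15
X70_PRICE, X70_LIFESPAN_YRS, X70_RESALE = 550, 3, 0.50
WINDOW_YRS = 6

def net_spend(price: float, lifespan_yrs: int, resale_fraction: float, window: int = WINDOW_YRS) -> float:
    """Buy a card every `lifespan_yrs` years, reselling each one you replace."""
    cards = window // lifespan_yrs              # purchases over the window
    resold = cards - 1                          # the last card is still in use
    return cards * price - resold * price * resale_fraction

print("x60 route:", net_spend(X60_PRICE, X60_LIFESPAN_YRS, X60_RESALE))  # 810.0
print("x70 route:", net_spend(X70_PRICE, X70_LIFESPAN_YRS, X70_RESALE))  # 825.0
```

With these assumed inputs the totals land close together, which is the shape of the argument being made: roughly the same outlay, but the x70 route keeps stronger hardware in the machine the whole time. Change the assumptions and the comparison shifts accordingly.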
 
Joined
Sep 15, 2011
Messages
6,759 (1.40/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Imagine telling people that their brand new, overly expensive laptop with its mobile RTX 4070 GPU and 8GB of VRAM sucks! :laugh: :laugh: :laugh:
And yet there is absolutely no game in existence that doesn't play properly on those laptops.
My cousin has one and uses it as a multimedia/gaming station while on his 6-month ship voyage tours, and he gets better FPS on his laptop (1080p) than I do with an RTX 3080 at 1440p. :)
Imagine that.
 

AcE

Joined
Dec 3, 2024
Messages
243 (15.19/day)
Precisely, so they think these are good purchases, but even they would be better off buying a notch higher and then selling it off later
No, mostly these cards are just the right decision for them. The only issue is that you have to lower details later; eh, graphics, there are way more important things in life. :) The game will still run easily and will look *good enough*.
I've lived this very thing for over a decade bro. I know what I'm talking about.
Yes, and I also have >25 years of experience in IT (I'm 38 years old), so we should just agree to disagree. :) I think your stance on this is way too extreme. And I say this as a high-end owner.
 
Last edited:
Joined
Jun 25, 2020
Messages
163 (0.10/day)
System Name The New, Improved, Vicious, Stable, Silent Gaming Space Heater
Processor Ryzen 7 5800X3D
Motherboard MSI B450 Tomahawk Max
Cooling be quiet! DRP4 (w/ added SilentWings3 120mm), 4x Noctua A14x25G2 (3 @ front, 1 @ back)
Memory Teamgroup DDR4 3600 16GBx2 @18-22-22-22-42 -> 18-20-20-20-40
Video Card(s) PowerColor RX7900XTX HellHound
Storage ADATA SX8200Pro 1TB, Crucial P3+ 4TB (w/riser, @Gen2x4), Seagate 3+1TB HDD, Micron 5300 7.68TB SATA
Display(s) Gigabyte M27U @4K150Hz, AOC 24G2 @1080p100Hz(Max144Hz) vertical, ASUS VP228H@1080p60Hz vertical
Case Phanteks P600S
Audio Device(s) Creative Katana V2X gaming soundbar
Power Supply Seasonic Vertex GX-1200 (ATX3.0 compliant)
Mouse Razer Deathadder V3 wired
Keyboard Keychron Q6Max w/ Gateron G Pro 3.0 Black linear switches