Monday, December 16th 2024

32 GB NVIDIA RTX 5090 To Lead the Charge As 5060 Ti Gets 16 GB Upgrade and 5060 Still Stuck With Last-Gen VRAM Spec

Zotac has apparently published webpages for NVIDIA's entire GeForce RTX 5000 series GPU line-up ahead of its January 2025 launch. According to the leak, spotted by VideoCardz, NVIDIA will launch a total of five RTX 5000 series GPUs next month: the RTX 5090, 5080, 5070 Ti, 5070, and the China-only 5090D. Zotac has since removed the premature listing, but screenshots taken by VideoCardz confirm previously leaked details, including what appears to be a 32 GB Blackwell GPU.

It's unclear which GPU will feature 32 GB of VRAM, but it stands to reason that it will be either the 5090 or the 5090D. When we last checked in on the RTX 5070 Ti, leaks suggested it would carry just 16 GB of GDDR7 VRAM, and there were murmurings of a 32 GB RTX 5090 back in September. Other leaks from Wccftech suggest the RTX 5060 and 5060 Ti will pack 8 GB and 16 GB of GDDR7, respectively. While the 5090's alleged 32 GB frame buffer will likely make it more adept at machine learning and other non-gaming tasks, the VRAM bumps given to the other RTX 5000 GPUs, particularly the Ti variants, should make them better suited to the ever-increasing demands of modern PC games.
Sources: VideoCardz, Wccftech

173 Comments on 32 GB NVIDIA RTX 5090 To Lead the Charge As 5060 Ti Gets 16 GB Upgrade and 5060 Still Stuck With Last-Gen VRAM Spec

#51
Vayra86
freeagentMy kid loves gaming on his 8GB card. These are not AMD GPU's lol.. NV does things differently.
No, Nvidia also uses a framebuffer and cache, just like AMD does. And when it's saturated, your game will stutter like mad. It is that simple. What Nvidia can and does do is limit card behavior at the driver level so you don't see the stutter as much, and if you already use frame smoothing you'll see even less of it - but you also lose net performance. It doesn't always turn out well, it won't truly eliminate the stutter either, and it needs frequent Game Ready updates to remain functional. It's visible because the card starts cutting back on detail levels when the VRAM gets saturated.

So sure, if you have no clue what you're staring at, all is well, but it's a simple fact that VRAM limits are hard limits.
Posted on Reply
#52
TheDeeGee
AcEWell 8 GB still enough for the low end model, otherwise it’s nice to see bumps, 32 GB is irrelevant though unless you do work with the 5090 that involves heavy vram usage. Apparently they still needed to go to 512 bit despite using GDDR7, which is way faster than G6X. GTX 280 says hello, that was the last time Nvidia used a 512 bit bus, completely different times.
Indeed, 8GB is still fine for 1080p60, which the 5060 is aimed at.

The 5070 Ti will be for 1440p120 and if you have 4K there will be the 5080/5090.

It's that simple, but apparently it isn't for most people.
Posted on Reply
#53
AcE
Vayra86No, Nvidia also uses a framebuffer and cache, just like AMD does. And when its saturated, your game will stutter like mad.
Correct, VRAM management on Nvidia cards is just a bit better, that's it. Nvidia and AMD cards largely parallel each other, and since these companies have been competing for decades, it makes sense their approaches to GPUs are related. Even Intel did nothing special and largely copied the other companies, more so Nvidia with its distinct RT and AI units; Intel's cards are nearly a 1:1 copy of the Turing architecture.
Posted on Reply
#54
Vayra86
Prima.VeraI don't know what kind of games you guys are playing, but I have no issues running any game with an GTX 1080 with 8GB of RAM on 1080p. Like basically ANY game I've played is hovering over 50fps.... And that's with MAX Texture details.
Then you haven't been playing much that is recent, and you are definitely cutting back on various settings even at 1080p. Been there, done that, mate. Don't lie to yourself. Max textures run fine; post-processing, not so much. TW: WH3 absolutely nukes a 1080, for example - it's not smooth and it's not pleasant, especially on the campaign map. And that's not even a new game; try anything UE5 and enjoy the stutterfest.
Posted on Reply
#55
AcE
Vayra86Then you haven't been playing much that is recent and you are definitely cutting back on various settings even at 1080p. Been there done that mate. Don't lie to yourself. Max textures run fine. Post processing not so much.
Yes, but to be honest a lot of gamers are like him: they just play old stuff and/or don't care about Ultra settings, or only play competitive games like Valorant that don't need much GPU power and especially not much VRAM. I would say he is a typical use case, not a reviewer running a GPU through only AAA games at Ultra settings, which is an unrealistic use case.
Posted on Reply
#56
Bwaze
AcE256 shaders more, or what 512 in case of 4080 vanilla, it's not worth it, will maybe be 10, 20% tops faster. Nvidia is doing exactly this because they know they have 0 competition, otherwise this strategic move would be impossible btw. Otherwise the 5080 would be bigger and this "5080" would really be a 5070 Ti, and nothing else. Welcome to 100% monopoly in high end. Again same disaster like RTX 2000 times, but I think this time even worse.
You're forgetting the new AI-DLSS that will utilize the new AI-Tensor cores, and won't work on older generations.

And AI NPC acceleration tech that will also only work on new gen. Bunch of games will be announced, but basically nothing playable will be released before next generation comes out.

And bunch of other AI related stuff, Nvidia will fully embrace the future tech!

I expect a repetition of Turing, where the RTX 2080 was basically only as fast as the GTX 1080 Ti in raster and more expensive, but it had ray tracing and DLSS! Although by the time those two technologies had matured even a bit, Turing cards were already obsolete and couldn't run them properly.

And, of course, you will be able to do non gaming related AI acceleration, and Nvidia will heavily focus on that.
Posted on Reply
#57
Vayra86
AcEYes, but to be honest a lot of gamers are like him, just play old stuff and/or don't care about Ultra settings, or even just play competitive games like Valorant that don't need much GPU power and especially not vram buffer. I would say he is a typical use case, and not a reviewer running a GPU through only AAA games with Ultra settings, which is a unrealistic use case.
And there is nothing wrong with that either. Until January last year I was playing on the 1080 too. It does the job, but you certainly can't push everything at max settings on that card anymore, even at 1080p. I think it's important to remain objective: the card's getting long in the tooth.

The reason the 1080 lasted so long is the excellent balance between core and VRAM. It has as much bandwidth as a high-end RDNA2 or Ampere card, and 8GB was perfect for it. Today's x60 has half the bandwidth and more core power. You can only fix so much of that with cache, and if the framebuffer is saturated, cache won't save you.

As always, the x60 remains the poor man's dGPU that really is just a waste of sand. E-waste, built for people who can't or won't save for something half decent. We can sugar-coat it, but it is what it is, and time proves that every single time. The 1060 6GB was an outlier in that sense, and ONLY because of its 6GB, going down in history as the longest-lasting x60 ever, I think.
Posted on Reply
#58
AcE
BwazeYou're forgetting the new AI-DLSS that will utilize the new AI-Tensor cores, and won't work on older generations.
I haven't heard anything about that, but DLSS is AI anyway and Tensor cores are also AI, always were - nothing new.
BwazeAnd AI NPC acceleration tech that will also only work on new gen.
I don't think so my friend, games must run on a variety of video cards so they can sell them, and not only the newest GPUs. :)
Vayra86Todays' x60 has half the bandwidth and more core power. You only fix so much of that with cache.
Yes, cache is there to compensate for the bandwidth lost to the narrower bus, and it works well. All cards with a big cache have worked well so far, starting with the RX 6000 series, then the 7000 series; the RTX 40 series copied the concept and improved it with a large L2 cache instead of L3. Don't stress traditional bandwidth too much; instead look at "effective bandwidth", which is bandwidth including the big cache, and which reflects the realistic usage of these cards.
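As a rough sketch of that idea (the hit rate and bandwidth numbers below are made up for illustration, not any vendor's figures), effective bandwidth can be modeled as a hit-rate-weighted blend of cache and DRAM bandwidth:

```python
def effective_bandwidth(dram_gbps: float, cache_gbps: float, hit_rate: float) -> float:
    """Blend cache and DRAM bandwidth by the fraction of accesses the cache serves."""
    return hit_rate * cache_gbps + (1.0 - hit_rate) * dram_gbps

# Hypothetical 128-bit card: 288 GB/s DRAM, 2000 GB/s L2, 50% hit rate
print(effective_bandwidth(288, 2000, 0.5))  # 1144.0
```

Under those made-up numbers, a narrow bus with a large cache lands well above its paper DRAM bandwidth, which is roughly the argument AMD made for Infinity Cache.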
Vayra86As always, x60 remains as the poor man's dGPU that really is just a waste of sand.
I mean, not always; it's only been like that since the RTX 40 series and, for AMD, since the RX 7000 series. The x60 is now firmly a "poor man's GPU" (because x60 is now low end, whereas before it was still mid range), especially the RX 7600, which is still on the 6nm node and didn't even move to the 5nm node the rest of the generation from both companies uses.

Edit on x60 cards aging well: the RX 480 / 580, which competed with the 1060, also had 8 GB versions, and the 3060 was originally a 12 GB card that later got a nerfed 8 GB version; these are all in the 1060 vein. But the 1060 having 6 GB is nothing special in my books; it's a step down from the 1070's 8 GB. The 3060 is more special here because it had *more* VRAM than the 3070 / 3060 Ti. The RX 580 also aged better than the 1060; the card simply has more oomph.
Posted on Reply
#59
Bwaze
AcEI did not hear anything about that, but DLSS is AI anyway and Tensor cores are also AI, always were, nothing new.

I don't think so my friend, games must run on a variety of video cards so they can sell them, and not only the newest GPUs.
DLSS 3 frame generation still only works on RTX 40x0, so 4060 yes, 3090 Ti no - because they lack electrolytes.

Nvidia will do the same with all the newly introduced "AI" tech. Mark my words.
Posted on Reply
#60
Vayra86
AcEI mean, not always, it's just like that since RTX 40 times and with AMD since RX 7000 times, it's now firmly "poor man's GPUs" especially with RX 7600 which is still on 6nm node and didn't even transfer to the 5nm node the rest of the generation uses from both companies.
Yes, always. The x60 and x50 Ti were always "blessed" with a poor man's memory subsystem: asymmetric, for example, or coming out in a half dozen OEM versions with handicapped memory, even down to stuff as bad as full-blown DDR instead of GDDR back in the day. The bar has moved up, but for this segment of cards it always moves in the most cost-effective way. If it has some semblance of running half-decently, it gets released then and there, screw everything else. What you also see in this segment is just plain older architectures, though not as much of that today, until you start factoring in mobile chips.
Posted on Reply
#61
Tomorrow
AcETry to bring technical arguments instead of empty words and just drama queen talk.
You want it in a format I can't provide. I cannot and will not transcribe an entire video for you.
I have TPU links below.
AcENo, it didn't. 8 GB is way longer in the tooth as those ever were. 4 GB for example was quickly outdated because 4 GB is just a low amount, whereas 8 GB isn't. 6 GB was just "replaced" by 8 GB, it just vanished from the market. 2 GB had the same fate as 4 GB. Apples and kiwis.
Yes it did. And after 8GB is obsolete and gone from new cards (60 series?) then the same will happen with 10GB cards next. Then 12GB etc. It's inevitable.
Games constantly get more demanding.
AcE4 GB already reached the critical zone years ago, while 8 GB is still far away from that, you're just technically wrong, and because you got no technical arguments your're just talking endlessly.
8GB is already in critical zone. Only people like you are still in denial.
AcELazy? No, I don't like videos, text is way better to digest. You are lazy. It is YOUR argument, so make it or lose the argument, life is simple. So far you didn't refute any of my arguments, it's quite easy going for me.
You're the one trying to prove 8GB is "youtuber drama". The onus is not on me to disprove your delusions.
AcEGood joke, Nvidia surely won't compare their cards with a company that has 0% market share. This is like you saying Apple will compare their phones with a brand that nobody is buying. Yea makes a ton of sense. ^^
Who cares what Nvidia compares. What matters is what consumers and reviewers compare. Nobody's buying Intel? Sure, keep telling yourself that. I guess all those cards they produced just vanished from the shelves by themselves?
AcEI don't need luck, Nvidia will sell 5060 with 8 GB because 8 GB isn't a issue (given the rumours are true). Technical side (tech companies) is on mine. Aside from reviewers like W1zzard proving it.
Proving how? That he runs at Ultra settings without an accompanying frametime graph where 8GB cards get murdered?

I looked at all the performance benchmark reviews he has posted for this year's games.
11 games in total. At 1080p max settings (though not in all games, and without RT or FG), the average memory usage is 7,614 MB.
7 games stay below 8GB at those settings; 4 games go over it.
6 games were run at the lowest 1080p settings, no RT/FG, and despite that, half of them (3) still go over 8GB.

Anyone looking at these numbers and seeing how close the average is to the 8GB limit should really think twice before buying an 8GB card today.
Next year, likely more than half of the tested games will surpass 8GB even at 1080p low with no RT/FG, and remember that RT and FG both increase VRAM usage further. To say nothing of frametimes on those 8GB cards: even before the entire buffer is used up, frametimes take a nosedive, or in some cases textures simply refuse to load.

Links:
www.techpowerup.com/review/horizon-forbidden-west-performance-benchmark/5.html
www.techpowerup.com/review/homeworld-3-benchmark/5.html
www.techpowerup.com/review/ghost-of-tsushima-benchmark/5.html
www.techpowerup.com/review/senuas-saga-hellblade-2-benchmark/5.html
www.techpowerup.com/review/black-myth-wukong-fps-performance-benchmark/5.html
www.techpowerup.com/review/star-wars-outlaws-fps-performance-benchmark/5.html
www.techpowerup.com/review/warhammer-40k-space-marine-2-fps-performance-benchmark/5.html
www.techpowerup.com/review/final-fantasy-xvi-fps-performance-benchmark/5.html
www.techpowerup.com/review/silent-hill-2-fps-performance-benchmark/5.html
www.techpowerup.com/review/dragon-age-the-veilguard-fps-performance-benchmark/5.html
www.techpowerup.com/review/stalker-2-fps-performance-benchmark/5.html
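The averaging and counting described above is easy to sanity-check. The per-game figures below are placeholders for illustration, not W1zzard's actual numbers:

```python
# Placeholder 1080p VRAM readings in MB for 11 games (illustrative, not TPU's data)
vram_mb = [7100, 7900, 8300, 6900, 8600, 7400, 8400, 8250, 7200, 7500, 7600]

average = sum(vram_mb) / len(vram_mb)
over_8gb = sum(1 for v in vram_mb if v > 8 * 1024)  # 8 GB = 8192 MB
print(f"average: {average:.0f} MB, over 8 GB: {over_8gb} of {len(vram_mb)}")
# average: 7741 MB, over 8 GB: 4 of 11
```

The point stands regardless of the exact per-game values: an average within a few hundred MB of the 8,192 MB ceiling means roughly a third of titles already spill over it.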
AcESo why is he calling it a joke if 16 GB is so great? Maybe because 16 GB is 99% useless on a card that is mainly used for 1080p. :) 8 GB largely also works fine with 1440p, btw.
He's calling the 8GB 4060 Ti at $400 a joke. And it is. The 4060 8GB at $300 is not any better.
AcEOff topic. The drama here is about low end GPUs having 8 GB, 2016 was a 1080 and 1070, that's semi high end and upper mid range, so completely different cards that have nothing to do with this discussion other than saying "oh 8 GB was also used back then on completely different cards".
AMD had the RX 480 in 2016 with 8GB, and it's slower than the 1060. So even back then, 8GB was not only for semi high end or upper midrange like you claim.
The 1070 also cost only $379 for 8GB, which was a good price for 8GB in 2016.

Eight years later, I and many other people expect more, because prices have risen but 8GB remains.
Posted on Reply
#62
Legacy-ZA
Prima.VeraI don't know what kind of games you guys are playing, but I have no issues running any game with an GTX 1080 with 8GB of RAM on 1080p. Like basically ANY game I've played is hovering over 50fps.... And that's with MAX Texture details.
Not at 1080p, no, not as much, yet... DLSS reduces VRAM usage too if you use it, but at the same time, you will start needing a beefier CPU.

It's all a balance of things, really, and I think what people, me included, are tired of is not being able to strike that balance, because nGreedia limits us with VRAM and forces our hands to upgrade every generation as developers keep pushing the boundaries - as they should; I don't blame them. I like to mod games too, and that can substantially increase VRAM requirements.

I specifically upgraded to 1440p because I finally thought "this is it": things were finally looking up GPU-performance-wise, and I'd be able to run games at this resolution with high refresh rates. As I mentioned before, it also takes some strain off the CPU, so you can get away with a mid-range CPU. But oh boy, do I regret it thanks to nGreedia - I won't downgrade my resolution, though.

Once again I will have to swallow a very, very bitter pill and buy a new GPU, but I couldn't stomach my RTX 3070 Ti at 1440p anymore. I picked up a bad habit because of it: I keep enabling/disabling my statistics overlay to see how close I am to the VRAM limit, so that I can close my game and re-launch it before stutter heaven occurs, or some sort of texture loading bug. Urgh, damn you nGreedia, damn you.
Posted on Reply
#63
Todestrieb
I'm still a bit mad that my 3070 8GB didn't like Forza Motorsport 8 with high textures at 1080p. (No, I really did try leaving only one 1080p monitor connected. And no, RT-related stuff is off. Everything else is high, not ultra.)
The game was manageable but frequently dropped below 30fps. It was probably a bit better with everything else on low, but that didn't relieve the frame drops. Medium textures would go well over 60 fps consistently, but it looked like an early-2010s game. And I'm allergic to abundant upscaling artifacts at 1080p.
So I had to choose between two very bad compromises: medium textures, or frequent frame drops.
Later patches supposedly relieved VRAM usage, but not before I angrily threw away the 3070.
(IIRC, with high textures and DLSS Quality it would still only do 30fps in bad spots, but that part is muddy memory.)
Also, edge cases eventually become the norm. So if anyone wants to market their card as a "GrEaT 1080p ChOiCe", I'd say it needs either a minimum of 10GB of VRAM or some compromises, likely in textures (is it still a great 1080p choice then?).
Good joke, Nvidia surely won't compare their cards with a company that has 0% market share. This is like you saying Apple will compare their phones with a brand that nobody is buying. Yea makes a ton of sense. ^^
Xiaomi, anyone? No one wins by offering the same price-to-performance as US/EU/JP/KR-branded products, but Xiaomi took over the Chinese phone and domestic electronics market (and then some) with comically aggressive pricing and a better price-to-performance ratio than anyone else, including other Chinese brands. Most of those devices are practically spyware-infested if you look at them a certain way - Xiaomi devices (especially phones) don't make sense to me because of the spyware - and yet Xiaomi is apparently #2-#3 in the global smartphone market by monthly shipped devices (that is, Xiaomi has beaten Apple in that metric, albeit inconsistently). I'm really disgusted by this revelation, but I digress.
Here's what some call "drama queens" think about the B580.
Without going too far into the podcast, demand for B580 is much higher than the suppliers expected.

For the 5060 to make sense, NVIDIA has to either:
beat the B580 by a comfortable margin at 1080p while maintaining an okay price (good luck at 1440p),
out-price the B580 (unlikely if it's still named 5060),
price it similarly and compete on features (DLSS4, maybe? I don't know; I generally hate that kind of stuff, but at this point I feel like an old guy yelling at clouds),
or compete on brand value alone (which will probably still work to a degree, if NVIDIA / shops play around with the availability of non-NVIDIA products).
Posted on Reply
#64
RayneYoruka
TomorrowYou want it in a format i cant provide. I cannot and will not transcribe entire video for you.
I have TPU links below.

Meanwhile me reloading to read the new comments.. I think he's dead already

Posted on Reply
#65
AcE
Vayra86Yes, always.
No, you cannot say these are a waste of sand; that's a massive exaggeration. :) They were mid range, now they are low end, it is what it is. Calling them a "waste of sand" just goes way too far. PC gaming got more expensive, perhaps, but that's what people wanted because of the better graphics. That's also "it is what it is": times evolve, and with too few fabs like TSMC able to produce chips, chips will be more expensive.
TomorrowYou want it in a format i cant provide. I cannot and will not transcribe entire video for you.
Did not expect that; you can summarise it, there are various ways to make a point.
TomorrowAnd after 8GB is obsolete and gone from new cards (60 series?) then the same will happen with 10GB cards next. Then 12GB etc. It's inevitable.
So you basically agree with me then? Cool. I already said this is natural evolution. But 8 GB is still not at the end. Go check performance of 6500 XT and you will see what "end" means. :) You seem to lack footing in reality.
Tomorrow8GB is already in critical zone. Only people like you are still in denial.
It's not in the critical zone, it's in the "it's enough" zone, which is a notch above it. And "denial" is not relevant for people like me; I have used high-end cards since 2014. :) This is purely a technical discussion to me, not an emotional one as it is to you. :)
TomorrowYou're the one trying to prove 8GB is "youtuber drama". The onus is not on me to disprove your delusions.
"Trying"? I already did. Just because you're losing the argument doesn't mean you have to get mad and call me "delusional" btw. :)
TomorrowWho cares what nvidia compares. What matters is what consumers and reviewers compare. Nobody's buying Intel? Sure. Keep telling yourself that.
Yes, and people will buy Nvidia - 90% - and then AMD and Intel will get 5% and later 0% like last time. Their products are just too far behind and their software stack is primitive.
TomorrowAnyone looking at these numbers and seeing how close the average is to the 8GB limit should really be considering twice when buying a 8GB card today.
That doesn't make much sense, because the people who buy those cards mostly don't have more money, or they don't care about your edge cases; they get by fine with these video cards. :)
TomorrowLinks:
I checked all the links; the 4060 has zero issues in all those games. :) Reading and understanding seem to be two different things. The 4060 behaved perfectly normally in all those games, in fact. Thanks for proving all my points correct. :)
TomorrowHe's calling the 8GB 4060 Ti at 400 a joke. And it is. 4060 8GB at 300 is not any better.
No, he's calling the 4060 Ti with 16 GB a $500 joke, which is the topic of the video thumbnail itself; everyone can see it. A fat upsell for something that brings you nearly 0% improvement aside from a few edge cases = burning money. Just buy a 4070 or a 7700 XT / 7800 XT instead.
TomorrowAMD had RX480 in 2016 with 8GB that's slower than 1060. So 8GB even back then was not for semi high end or upper midrange like you claim.
The 1070 also cost only $379 for 8GB, which was a good price for 8GB in 2016.
The 480 was firmly competitive with the 1060. And your assessment is wrong: the 1070 and 1080 are upper midrange and semi high end and used 8 GB of VRAM, which is a historical fact btw, and I won't debate this with you. :)
RayneYorukaMeanwhile me reloading to read the new comments.. I think he's dead already
If you have nothing to contribute to this discussion, maybe stay away? Trolling isn't great, and to be honest, this is quite an easy discussion for me, as I have already said. He's not refuting a single word of mine, nothing. :) On the contrary, he provided all the TPU links to prove me right, thanks a lot. =)
Posted on Reply
#67
john_
Nvidia and Intel have the same philosophy about what the AVERAGE user needs. Nvidia will offer 8GB models because the AVERAGE user is at 1080p, medium, 30-60fps. Intel will offer UP TO 8 performance cores, because 99% of the apps the AVERAGE user runs wouldn't take advantage of more than 8 performance cores anyway. And they can do it because what they lack in VRAM or P-cores, they can replace with a shiny sticker. That's Intel on CPUs. On GPUs, where they are trying to gain market share, they are the FIRST to offer a minimum of 10GB of VRAM on TWO models with an MSRP below $250.

In any case, be it Intel, Nvidia, or even AMD, what the average fanboy (a deliberate term to make a point, not to insult anyone) does, and it's wrong, is display loyalty to their favorite brand. Instead there should be criticism. Nvidia fans should be screaming about a 4060 out there with 8GB of VRAM when there was a 3060 model with 12GB; going back to 8GB is a step backwards and NO ONE should try to justify it (yes, but the 4060 comes with Frame Generation, blah blah blah). Intel fans should be screaming for more P-cores, especially now that Intel dropped hyperthreading, instead of finding excuses (but but but E-cores are faster now, blah blah blah). AMD fans should be screaming about AMD using the 9800X3D to make the 7800X3D look like a bargain, when the 7800X3D was $350 a few months ago and now sells for $480, $40 over MSRP (but but but the 7800X3D is still the second-fastest CPU for gaming, blah blah blah).
Posted on Reply
#68
Vayra86
AcENo, you cannot say these are a waste of sand, that's a massive exaggeration. :) They were mid range, now they are low end, it is what it is. Calling them "waste of sand" just goes way too far. PC gaming got more expensive perhaps, that's what people wanted, because of the better graphics. That's also "it is what it is", evolution of times, with not enough tech companies like TSMC able to produce chips, chips will be more expensive.
It's not an exaggeration. If you spend this kind of money on a GPU, it should not go obsolete this fast. You're always better off spending a bit more so you land on a well-balanced piece of hardware; even an x70, and oftentimes an x60 Ti, was a much, much better buy. But really, x70 and up.

The x60 is, and always was, a penny-wise, pound-foolish purchase. Sure, you pay less, but there is no resale value when you want to upgrade, because by then the card is completely obsolete, whereas an x70 will net you half the purchase price 3-4 years down the line. And you're not paying double the money for an x70 either, but less than that.

So yes, the x60 is e-waste, or put differently, PC gaming's hardware n00b trap. Save a bit more and you'll end up with better gaming and a more valuable product to sell... and fund another decent GPU with. An x60 is ready for the trash bin three times faster than an x70 tends to be.

It is, indeed, what it is, and there is a market of buyers for x60s, but it shouldn't be you ;) If you know a thing or two about this market, you know to avoid these cards.
Posted on Reply
#69
AcE
Vayra86Its not an exaggeration. If you spend this kind of money on a GPU it should not go obsolete this fast. You're always better off spending a bit more so you can land at a well balanced piece of hardware, even an x70 and often times an x60ti was a much, much better buy. But really, x70 and up.
No, this is normal. You pay more, you get more - one of the basic laws of capitalism (unless you spend the money on trash, but not in this case). These are cards for people who either can't afford more or don't need more; perfectly normal and not e-waste. Also, they all age normally and are good for years; there are still users on the 1060 eight years later, so what you said is just factually wrong. You can call x70 cards better, but you can't say everything under them is e-waste, aside maybe from the 6500 XT, and even there some people were perfectly happy with it - again, use cases differ. :)

Brother, I even know someone, a friend, who still uses a GTX 960 today and is happy enough with it. ;) Just get some perspective.
Posted on Reply
#70
Vayra86
AcEThese are cards for the people who either can't afford more or don't need more
Precisely - so they think these are good purchases, but even they would be better off buying a notch higher and selling it off later. Because 5-6 years down the line they'll repeat the same counterproductive practice, and over the total lifespan of 2-3 GPUs they haven't spent less than I have buying a decent midranger, reselling it, then buying another.

I've lived this very thing for over a decade, bro. I know what I'm talking about. It's all a matter of perspective and, above all, experience. It's also simple math. And sure, if you never upgrade and ride your x60 until it can barely run Windows, then it's great value. But then you're not gaming properly.
Posted on Reply
#71
Prima.Vera
Imagine telling people that their brand-new, overly expensive laptop with its mobile RTX 4070 GPU and 8GB of VRAM sucks! :laugh: :laugh: :laugh:
And yet there is absolutely no game in existence that doesn't play properly on those laptops.
My cousin has one and uses it as a multimedia/gaming station while on his 6-month ship voyage, and he gets better FPS on his laptop (1080p) than I do with an RTX 3080 at 1440p. :)
Imagine that.
Posted on Reply
#72
AcE
Vayra86Precisely, so they think these are good purchases, but even they were better off buying a notch higher and then selling it off later
No, mostly these cards are just the right decision for them. The only issue is that you have to lower details later - eh, graphics, there are way more important things in life. :) The game will still run easily and will look *good enough*.
Vayra86I've lived this very thing for over a decade bro. I know what I'm talking about.
Yes, and I also have >25 years of experience in IT (I'm 38 years old), so we should just agree to disagree. :) I think your stance on this is way too extreme. And I say this as a high-end owner.
Posted on Reply
#73
Todestrieb
Outback BronzeYou need to mod it: 16GB RTX 3070 Mod Shows Impressive Performance Gains | Tom's Hardware :)
I would have done that if the difficulty and cost were low enough …
Nah, I had no spare card at hand at the time.
Is it just me who feels hard done by, with 8GB of VRAM ruining image quality/frame rate at 1080p on non-ultra, non-RT settings?? For the sin of using supposedly high (non-ultra) texture settings??
Posted on Reply
#74
Xaled
LycanwolfenWhat was the price. 5999.00
Yeah the D in 5090D means Double... Double the price ...
Posted on Reply
#75
I_Want_Answers
AcE32 GB is irrelevant though unless you do work with the 5090 that involves heavy vram usage.
Yup, professional 3D modelers / sculptors and video editors will really appreciate the VRAM bump.
Posted on Reply