Monday, December 16th 2024

NVIDIA GeForce RTX 5070 Ti Leak Tips More VRAM, Cores, and Power Draw

It's an open secret by now that NVIDIA's GeForce RTX 5000 series GPUs are on the way, with an early 2025 launch on the cards. Now, preliminary details about the RTX 5070 Ti have leaked, revealing an increase in both VRAM and TDP and suggesting that the new upper mid-range GPU will finally address the higher VRAM demands of modern games. According to the leak from Wccftech, the RTX 5070 Ti will have 16 GB of GDDR7 VRAM, up from 12 GB on the RTX 4070 Ti, as we previously speculated. The new sources also corroborate earlier reports that the 5070 Ti will use the cut-down GB203 chip, although this leak points to a significantly higher TBP of 350 W. The new memory configuration will supposedly use a 256-bit memory bus running at 28 Gbps, for a total memory bandwidth of 896 GB/s, a significant boost over the RTX 4070 Ti.
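As a quick sanity check, the quoted bandwidth figure follows directly from bus width and per-pin data rate. A minimal sketch in Python, using the leaked 5070 Ti numbers and, for comparison, the RTX 4070 Ti's published 192-bit / 21 Gbps configuration:

# Peak memory bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8 bits per byte
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8

print(bandwidth_gbs(256, 28))  # leaked RTX 5070 Ti config: 896.0 GB/s
print(bandwidth_gbs(192, 21))  # RTX 4070 Ti for comparison: 504.0 GB/s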

Supposedly, the RTX 5070 Ti will also see a bump in total CUDA cores, from 7680 in the RTX 4070 Ti to 8960 in the RTX 5070 Ti. The new card will also switch to the revised 12V-2x6 power connector, replacing the 16-pin 12VHPWR connector used by the 4070 Ti. NVIDIA is expected to announce the RTX 5000 series graphics cards at CES 2025 in early January, but the RTX 5070 Ti will supposedly be the third card in the 5000-series launch cycle. That said, leaks suggest that the 5070 Ti will still launch in Q1 2025, meaning we may see an indication of specs at CES 2025, although pricing remains unclear.
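For context, that core-count bump works out to a roughly 17% increase over the 4070 Ti; a quick back-of-the-envelope check using the leaked figure:

rtx_4070_ti_cores = 7680   # published RTX 4070 Ti spec
rtx_5070_ti_cores = 8960   # leaked RTX 5070 Ti figure
uplift_pct = (rtx_5070_ti_cores / rtx_4070_ti_cores - 1) * 100
print(f"{uplift_pct:.1f}% more CUDA cores")  # ~16.7%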

Update Dec 16th: Kopite7kimi, the ubiquitous hardware leaker, has since responded to the RTX 5070 Ti leaks, stating that 350 W may be on the higher end for the RTX 5070 Ti: "...the latest data shows 285W. However, 350W is also one of the configs." This could mean that a TBP of 350 W is possible, although perhaps only on certain graphics card models, if competition is strong, or in certain boost scenarios.
Sources: Wccftech, Kopite7kimi on X

160 Comments on NVIDIA GeForce RTX 5070 Ti Leak Tips More VRAM, Cores, and Power Draw

#101
rv8000
N/AExcept every single time. Isn't the 3070 Ti as good as the 2080 Ti, and the 4070 Ti as good as the 3080 Ti? Same thing. It's entirely possible that the 5070 Ti is in close range to the 4090 at 1440p, with the 4090 remaining better at 4K, which is the saving grace for 4090 owners who are coming back for a 5090 this time and are now calculating the best time to sell. It really is a 3-4 year investment. Asking $100 more isn't making or breaking the deal.
A 5070 Ti is going to have ~45% fewer CUDA cores; it'll never come close to a 4090. There's absolutely no shot of a 40-50% generational IPC increase either.

It'll be amazing if it's even more than 2-3% faster than a 4080 Super. Going to be another generation of Nvidia giving you less for more.
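For what it's worth, rv8000's rough figure checks out against published core counts, assuming the leaked 8960-core configuration and the 4090's 16384 / 4080 Super's 10240 CUDA cores:

leaked_5070_ti_cores = 8960   # leaked figure from the article
rtx_4090_cores = 16384        # published spec
rtx_4080_super_cores = 10240  # published spec
print(f"{(1 - leaked_5070_ti_cores / rtx_4090_cores) * 100:.1f}% fewer cores than a 4090")         # ~45.3%
print(f"{(1 - leaked_5070_ti_cores / rtx_4080_super_cores) * 100:.1f}% fewer cores than a 4080 Super")  # ~12.5%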
#102
Wasteland
yfn_ratchetBut that doesn't make any sense. That prediction would be in line with 3070Ti ~ Titan RTX, 4070Ti ~ 3090 Ti... but they're not. With the holding pattern you describe, the 5070Ti would be in place to touch tips with... the 4080 SUPER.
Yes, the 4090 is much stronger, relative to the rest of Ada's product stack, than previous flagships were. It seems hopelessly optimistic to expect the 70-Ti-matches-previous-flagship trend to hold, going forward.
AusWolfAmpere was supposed to be good based on launch slides, but then, Nvidia ended up doubling shader count without actually doubling shader count (by calling half of them "dual issue"). Combine that with the fact that we didn't see a single model sold anywhere near MSRP through the entire life cycle of the generation, and we have the complete dumpster fire we call Ampere.

It's only exaggerated by the fact that Nvidia went full retard with prices on Ada, thinking that if gullible gamers are willing to pay thousands for a graphics card just because it comes in a green box, then maybe they should. The saddest part of it all is that time proved Nvidia right.
On the basis of performance-to-MSRP, Ampere initially looked good. We have to remember that the 20-series was widely regarded as a value turd at the time. The 30-series looked like a promising return to form by contrast. Then of course the crypto shortages hit, destroying our brief moment of GPU zen. Then Ada released at a high price premium, with its generational gains skewed towards the very top of the stack to an unprecedented degree, offering almost zilch in terms of perf-per-dollar relative to Ampere at any price point below $1,000. The Super cards later improved that situation, but not by leaps and bounds. Either way, there was a solid year in there when Frame Gen was basically the entire selling point.

So yeah, I can see where Vayra's coming from; Ampere certainly won't win any awards on the VRAM front, but relative to what came before and since, the mental image of Ampere's intended stack seems like an unattainable ideal.

VRAM will continue to be a sore point, I suspect, because AI workloads are extremely VRAM-intensive. Nvidia therefore has a very keen incentive to use VRAM as a market segmentation mechanism. If GPUs were primarily about gaming, this probably wouldn't be an issue, at least not to anywhere near the same degree.

It is funny how things shift over time, though. 20 years ago, adding lots of VRAM to weak cards was a common and scummy marketing tactic, so much so that PC hardware/gaming communities grew to view VRAM as unimportant/overrated. That legacy, I believe, explains why we still see so many people insisting that e.g. 8 GB is just fine ("those cards are too weak to use more anyway!"), even despite the growing mountain of evidence demonstrating otherwise, even despite the fact that adding better textures to older games is perhaps the easiest way to jazz them up, even despite expansive modding communities creating VRAM-intensive enhancements that can, in fact, run very well on lower-end cards, even despite the fact that new-fangled technologies like RT and frame generation cost non-trivial amounts of VRAM. Now, if anything, the tables have turned. VRAM is under-specced and widely under-valued, except by the people who make money selling it, of course.
#103
AusWolf
OnasiI see that this thread is going about as well as such things usually do.

Anyway, rumors are all well and good, but what will matter is performance and price. I am not hugely optimistic; NV essentially has a captive market and can price at whatever the hell they think said market will bear, but we'll see. Not too enthused about a potential TDP jump. I do realize that this is inevitable nowadays as a means to scrape out every last bit of performance, but it's not to my preference. It will probably end up being that you can limit the power significantly without losing much, but still.
I wouldn't want to resort to third-party software tools to limit power to reasonable levels on an x70-class GPU, but each to their own.
#104
wolf
Better Than Native
AusWolfI wouldn't want to resort to third-party software tools to limit power to reasonable levels on an x70-class GPU
To me, the class of card has very little to do with it; basically any modern Nvidia GPU can be easily and effectively undervolted to use less power and boost efficiency relative to stock form. I'd do it with a 4060 if I owned one.
#105
AusWolf
wolfTo me, the class of card has very little to do with it; basically any modern Nvidia GPU can be easily and effectively undervolted to use less power and boost efficiency relative to stock form. I'd do it with a 4060 if I owned one.
Each to their own, I suppose. :)

All my cards have been pretty reasonable with power out of the box so far, and I'd prefer to keep it that way. Mainly because I'm on Linux now, so my tools for software tuning, especially with Nvidia, are limited. But anyway, I trust that the engineers at AMD/Nvidia know what they're doing, so I don't feel any itch to tinker.
#106
wolf
Better Than Native
AusWolfso I don't feel any itch to tinker.
For my part, I can't possibly not tinker with basically everything I own, lol: PCs and their components, cars, motorcycles, e-scooters, even some appliances. If it can be modified, overclocked, undervolted, optimised, or even simply aesthetically tweaked to my liking, don't even bother trying to stop me :D
#107
AusWolf
wolfFor my part, I can't possibly not tinker with basically everything I own, lol: PCs and their components, cars, motorcycles, e-scooters, even some appliances. If it can be modified, overclocked, undervolted, optimised, or even simply aesthetically tweaked to my liking, don't even bother trying to stop me :D
Don't get me wrong, I do like to tinker with PC hardware very much! :D

It's just that I don't think I could ever do anything in software to make a meaningful difference (I don't care about +-10%), so I'd rather not waste my time on fruitless efforts. :)
#108
wolf
Better Than Native
AusWolfDon't get me wrong, I do like to tinker with PC hardware very much! :D

It's just that I don't think I could ever do anything in software to make a meaningful difference
Sure, I'm after the big gains first, but I'll chase tenths all day long too lol. Meaningful might be debatable, but I won't be happy till it's perfect.
#109
Visible Noise
N/AExcept every single time. Isn't the 3070 Ti as good as the 2080 Ti, and the 4070 Ti as good as the 3080 Ti? Same thing. It's entirely possible that the 5070 Ti is in close range to the 4090 at 1440p, with the 4090 remaining better at 4K, which is the saving grace for 4090 owners who are coming back for a 5090 this time and are now calculating the best time to sell. It really is a 3-4 year investment. Asking $100 more isn't making or breaking the deal.
Very, very few people that own a 4090 will have any need or desire to upgrade to a 5090.

Even at 4K, a 4090 is CPU-bound in many of the latest releases.
#110
Xaled
So you will get the performance of a 4080 for the price of a... 4080... just with a different name?
#111
AusWolf
XaledSo you will get the performance of a 4080 for the price of a... 4080... just with a different name?
That's pretty much the idea, as it has been for the last two generations, I suppose.
#112
yfn_ratchet
SRSThere are so many comments claiming that people buy Nvidia cards because of the branding. That's not true. The main reason people buy them is because the alternatives aren't as good on a technical level. If I were to be given a chunk of Elon's fortune to create the Potato GPU corporation, releasing a card with Vicki Lawrence's Mama on the striped and polka-dotted box, fake flowers and potpourri in the box with the GPU, influencer videos from me as the CEO mocking people for buying them — saying all their friends will make fun of them, and printing LAME! on the GPU shroud, they would sell out. Why? Because they'd offer better performance for less money than what Nvidia is offering, without the shortcomings. How?

1) More VRAM than Nvidia at each tier.
2) Competitive gaming performance per watt.
3) Better gaming performance per dollar.
4) Clearer naming strategy. No more Ti, Super, XT/XTX, products with the same name but different specs, products with a bigger number but worse performance, etc.
5) Each generation would be much better than the previous one in performance, not going down in performance per dollar especially.
6) Possibly moving the AI-oriented/RT-oriented hardware to a separate GPU, for a dual-GPU setup for those who care about those things. Possibly involving a new form factor to reduce latency.
7) Longer driver support than both Nvidia and AMD.
8) Better drivers out of the gate than Intel.
9) Top-end performance that's, at minimum, no lower than whatever Nvidia's top consumer card offers.
10) Serious commitment to performance in AI workloads.
11) Excellent Linux support, not just Windows support.
12) Quiet cooling.
---
There are endless comments trying to justify AMD's refusal to compete which is AMD's method of letting Nvidia set prices. (Soft collusion that also benefits Sony and MS by keeping "consoles" relevant.) They claim that there aren't enough customers to justify the creation of products, even though the 4090 was sold out for a long time. The argument that "consoles" are so good now (as compared to the pathetic Jaguar generation) has some merit but the video game market continues to expand, not contract.
---
If it were my company, I would ditch the archaic ATX form factor so that GPUs, which are the highest-wattage components by far, would have the form factor be about cooling them efficiently as the #1 design priority. Let's have some actual innovation (serious and committed) for once, instead of endless iteration of copy-cat products.
Are... are you serious? Are you for real? At what point in your critical process did you decide that, yeah, all of this is rational and makes sense and has no glaring logical issues? I can't figure out if I'm being prompted to chuckle in my seat at this or not.
#113
Prima.Vera
Question. Will this beat the 4090 or not??
#114
AusWolf
SRSThere are so many comments claiming that people buy Nvidia cards because of the branding. That's not true. The main reason people buy them is because the alternatives aren't as good on a technical level. If I were to be given a chunk of Elon's fortune to create the Potato GPU corporation, releasing a card with Vicki Lawrence's Mama on the striped and polka-dotted box, fake flowers and potpourri in the box with the GPU, influencer videos from me as the CEO mocking people for buying them — saying all their friends will make fun of them, and printing LAME! on the GPU shroud, they would sell out. Why? Because they'd offer better performance for less money than what Nvidia is offering, without the shortcomings. How?

1) More VRAM than Nvidia at each tier.
2) Competitive gaming performance per watt.
3) Better gaming performance per dollar.
4) Clearer naming strategy. No more Ti, Super, XT/XTX, products with the same name but different specs, products with a bigger number but worse performance, etc.
5) Each generation would be much better than the previous one in performance, not going down in performance per dollar especially.
6) Possibly moving the AI-oriented/RT-oriented hardware to a separate GPU, for a dual-GPU setup for those who care about those things. Possibly involving a new form factor to reduce latency.
7) Longer driver support than both Nvidia and AMD.
8) Better drivers out of the gate than Intel.
9) Top-end performance that's, at minimum, no lower than whatever Nvidia's top consumer card offers.
10) Serious commitment to performance in AI workloads.
11) Excellent Linux support, not just Windows support.
12) Quiet cooling.

One doesn't need smoke and mirrors to sell a superior product.
Care to share the technical details? You've obviously got it figured out to the last transistor, and I'm curious.
yfn_ratchetAre... are you serious? Are you for real? At what point in your critical process did you decide that, yeah, all of this is rational and makes sense and has no glaring logical issues? I can't figure out if I'm being prompted to chuckle in my seat at this or not.
Ssh! You're talking to the greatest GPU mastermind of our age. Let him speak. ;)
#115
Vayra86
SRSUntil someone from the tech community decides to step up and take Nvidia on in the serious consumer GPU segment, all of the energy people spend in posting comments is wasted.

It is not at all impossible. It will, though, take a lot of money. "Oh, Apple won't be able to beat Intel. Intel has decades of expertise and even has its own leading fabs. Apple should stick to the Intel contract. The idea of using ARM designs for high performance is laughable and pitiable. What kind of expertise does Apple have in CPU design? Zero."

AMD could switch its role from enabling Nvidia to set prices to actually competing. That's the biggest barrier facing a would-be serious competitor... AMD's intentional sandbagging. However, even with that, AMD will still want to allocate as many of its wafers to enterprise as possible. There is space, right now, for a serious competitor which AMD has vacated and hasn't occupied for many years. Claims that there isn't enough market aren't supported when discontinued GPUs sell out so quickly, regardless of whether or not something like a mining craze is happening. The cards sell. If there were no market, they wouldn't.

It strikes me as weak that people are so excited about the 9800X3D, even though it's mostly an overclocked (increased power budget) variant of the 7800X3D and what people really need are more affordable powerful GPUs. Oh boy... a faster CPU to use with massively overpriced GPUs. What value!

There are so many comments claiming that people buy Nvidia cards because of the branding. That's not true. The main reason people buy them is because the alternatives aren't as good on a technical level. If I were to be given a chunk of Elon's fortune to create the Potato GPU corporation, releasing a card with Vicki Lawrence's Mama on the striped and polka-dotted box, fake flowers and potpourri in the box with the GPU, influencer videos from me as the CEO mocking people for buying them — saying all their friends will make fun of them, and printing LAME! on the GPU shroud, they would sell out. Why? Because they'd offer better performance for less money than what Nvidia is offering, without the shortcomings. How?

1) More VRAM than Nvidia at each tier.
2) Competitive gaming performance per watt.
3) Better gaming performance per dollar.
4) Clearer naming strategy. No more Ti, Super, XT/XTX, products with the same name but different specs, products with a bigger number but worse performance, etc.
5) Each generation would be much better than the previous one in performance, not going down in performance per dollar especially.
6) Possibly moving the AI-oriented/RT-oriented hardware to a separate GPU, for a dual-GPU setup for those who care about those things. Possibly involving a new form factor to reduce latency.
7) Longer driver support than both Nvidia and AMD.
8) Better drivers out of the gate than Intel.
9) Top-end performance that's, at minimum, no lower than whatever Nvidia's top consumer card offers.
10) Serious commitment to performance in AI workloads.
11) Excellent Linux support, not just Windows support.
12) Quiet cooling.



One doesn't need smoke and mirrors to sell a superior product.

There are endless comments trying to justify AMD's refusal to compete which is AMD's method of letting Nvidia set prices. (Soft collusion that also benefits Sony and MS by keeping "consoles" relevant.) They claim that there aren't enough customers to justify the creation of products, even though the 4090 was sold out for a long time. The argument that "consoles" are so good now (as compared to the pathetic Jaguar generation) has some merit but the video game market continues to expand, not contract. I would like to see good data showing that the serious ("enthusiast") PC gaming market is too small for a company to be able to make a profit whilst undercutting Nvidia — and that the market wouldn't expand if people were to be able to purchase better-value PC gaming equipment at the enthusiast level. Instead, what I've seen are comments that could be written by AMD and Nvidia. "Oh... woe is us... there's nothing we can do... Here's my money..." fatalism.

Enthusiasts are the people who care about hardware specs. The claim that they're blinded by "team" this and that has been shown to be untrue. Enthusiasts are not Dell, chained to one vendor. When a truly superior product becomes available, they will abandon everything else unless they're being paid to use the competition's. Enthusiasts are debating the 7800X3D vs 9800X3D for gaming. They aren't blinded by Intel's history of better performance (particularly Sandy Bridge—Skylake.)

Pointing to historical situations in which Nvidia was able to sell inferior products at a higher rate than AMD/ATI seems to point to inadequate marketing. But even then, ATI and AMD cards had drawbacks, like inadequate coolers. The cooler AMD used for the 290X was embarrassingly underpowered, and I believe I recall that ASUS released a Strix version that was defectively designed. The current state of the Internet makes it very easy to get the word out about a superior product. A certain tech video review site, for instance, has millions of YT followers. Don't tell me people aren't going to learn about the superior product and will instead buy blindly. I don't buy it. I also don't think serious gamers care about what generic imagery is on the box, and that includes the brand logo and color scheme.

If it were my company, I would ditch the archaic ATX form factor so that GPUs, which are the highest-wattage components by far, would have the form factor be about cooling them efficiently as the #1 design priority. Let's have some actual innovation (serious and committed) for once, instead of endless iteration of copy-cat products.

Anyway... my 1 cent. That's how much I have to rub together to get Potato GPU corporation off the ground. I'm not friends with the guys who build flaming moats.
Amen to this whole story. Especially the latter part. Hear hear.

There is a caveat to the marketing stance, though. The overwhelming majority of the market is not enthusiasts, and they do fall for it. You see this in gaming too: if there were only enthusiast gamers, CoD and FIFA would not be this big, for example. And in the slipstream of the majority vote come the followers who 'play this anyway because friends do it too'. Peer pressure is powerful. We only need to recall that South Park episode...

#116
Hecate91
SRSUntil someone from the tech community decides to step up and take Nvidia on in the serious consumer GPU segment, all of the energy people spend in posting comments is wasted.

It is not at all impossible. It will, though, take a lot of money. "Oh, Apple won't be able to beat Intel. Intel has decades of expertise and even has its own leading fabs. Apple should stick to the Intel contract. The idea of using ARM designs for high performance is laughable and pitiable. What kind of expertise does Apple have in CPU design? Zero."

AMD could switch its role from enabling Nvidia to set prices to actually competing. That's the biggest barrier facing a would-be serious competitor... AMD's intentional sandbagging. However, even with that, AMD will still want to allocate as many of its wafers to enterprise as possible. There is space, right now, for a serious competitor which AMD has vacated and hasn't occupied for many years. Claims that there isn't enough market aren't supported when discontinued GPUs sell out so quickly, regardless of whether or not something like a mining craze is happening. The cards sell. If there were no market, they wouldn't.

It strikes me as weak that people are so excited about the 9800X3D, even though it's mostly an overclocked (increased power budget) variant of the 7800X3D and what people really need are more affordable powerful GPUs. Oh boy... a faster CPU to use with massively overpriced GPUs. What value!

There are so many comments claiming that people buy Nvidia cards because of the branding. That's not true. The main reason people buy them is because the alternatives aren't as good on a technical level. If I were to be given a chunk of Elon's fortune to create the Potato GPU corporation, releasing a card with Vicki Lawrence's Mama on the striped and polka-dotted box, fake flowers and potpourri in the box with the GPU, influencer videos from me as the CEO mocking people for buying them — saying all their friends will make fun of them, and printing LAME! on the GPU shroud, they would sell out. Why? Because they'd offer better performance for less money than what Nvidia is offering, without the shortcomings. How?

1) More VRAM than Nvidia at each tier.
2) Competitive gaming performance per watt.
3) Better gaming performance per dollar.
4) Clearer naming strategy. No more Ti, Super, XT/XTX, products with the same name but different specs, products with a bigger number but worse performance, etc.
5) Each generation would be much better than the previous one in performance, not going down in performance per dollar especially.
6) Possibly moving the AI-oriented/RT-oriented hardware to a separate GPU, for a dual-GPU setup for those who care about those things. Possibly involving a new form factor to reduce latency.
7) Longer driver support than both Nvidia and AMD.
8) Better drivers out of the gate than Intel.
9) Top-end performance that's, at minimum, no lower than whatever Nvidia's top consumer card offers.
10) Serious commitment to performance in AI workloads.
11) Excellent Linux support, not just Windows support.
12) Quiet cooling.



One doesn't need smoke and mirrors to sell a superior product.

There are endless comments trying to justify AMD's refusal to compete which is AMD's method of letting Nvidia set prices. (Soft collusion that also benefits Sony and MS by keeping "consoles" relevant.) They claim that there aren't enough customers to justify the creation of products, even though the 4090 was sold out for a long time. The argument that "consoles" are so good now (as compared to the pathetic Jaguar generation) has some merit but the video game market continues to expand, not contract. I would like to see good data showing that the serious ("enthusiast") PC gaming market is too small for a company to be able to make a profit whilst undercutting Nvidia — and that the market wouldn't expand if people were to be able to purchase better-value PC gaming equipment at the enthusiast level. Instead, what I've seen are comments that could be written by AMD and Nvidia. "Oh... woe is us... there's nothing we can do... Here's my money..." fatalism.

Enthusiasts are the people who care about hardware specs. The claim that they're blinded by "team" this and that has been shown to be untrue. Enthusiasts are not Dell, chained to one vendor. When a truly superior product becomes available, they will abandon everything else unless they're being paid to use the competition's. Enthusiasts are debating the 7800X3D vs 9800X3D for gaming. They aren't blinded by Intel's history of better performance (particularly Sandy Bridge—Skylake.)

Pointing to historical situations in which Nvidia was able to sell inferior products at a higher rate than AMD/ATI seems to point to inadequate marketing. But even then, ATI and AMD cards had drawbacks, like inadequate coolers. The cooler AMD used for the 290X was embarrassingly underpowered, and I believe I recall that ASUS released a Strix version that was defectively designed. The current state of the Internet makes it very easy to get the word out about a superior product. A certain tech video review site, for instance, has millions of YT followers. Don't tell me people aren't going to learn about the superior product and will instead buy blindly. I don't buy it. I also don't think serious gamers care about what generic imagery is on the box, and that includes the brand logo and color scheme.

If it were my company, I would ditch the archaic ATX form factor so that GPUs, which are the highest-wattage components by far, would have the form factor be about cooling them efficiently as the #1 design priority. Let's have some actual innovation (serious and committed) for once, instead of endless iteration of copy-cat products.

Anyway... my 1 cent. That's how much I have to rub together to get Potato GPU corporation off the ground. I'm not friends with the guys who build flaming moats.
I'm not even sure where to start, but blaming AMD for Nvidia's greed and near-monopoly on the market is an interesting take, although not a surprising one.
#117
TheinsanegamerN
rv8000A 5070 Ti is going to have ~45% fewer CUDA cores; it'll never come close to a 4090. There's absolutely no shot of a 40-50% generational IPC increase either.

It'll be amazing if it's even more than 2-3% faster than a 4080 Super. Going to be another generation of Nvidia giving you less for more.
Performance moving down one tier has been the norm for over a decade now. I doubt anyone will be upset at getting a $650-750 4080 Super.

Well, let me rephrase: MOST people will not be upset. There will be those angry that Nvidia doesn't give them a 4090 at $200, but meh. Can't please everyone.
#118
95Viper
Discuss the topic... not the members or their state of mind.
#119
Krit
TheinsanegamerNI doubt anyone will be upset at getting a $650-750 4080 Super.
After what Nvidia did with the RTX 4000 series GPUs, I don't see such low prices for the RTX 5070 Ti. At least $800-900, and those slaves with big math problems will be proud.
#120
Dawora
Vya DomusMan these things will be atrociously underpowered compared to their predecessors, 6% more shaders lol.
But core clocks are 2800-2900 MHz, plus there are IPC gains and more memory bandwidth.

So it will be much faster.
freeagentIt means that 5070Ti will smoke 4070Ti.

And many of the comments in this thread are from guys running AMD GPU's lol..
That's why Nvidia gets so much hate... looking at market share, it has to be that many AMD users just talk trash about Nvidia on forums at the moment.

Same story every time:
Low VRAM
It's going to be bad perf vs. the old gen
High price
Prima.VeraQuestion. Will this beat the 4090 or not??
It will be close at least..
#121
Glina
My base expectation is 10% more cores, 10% higher clock speed and 10% bump (at 4K mostly) from memory bandwidth. 1.1*1.1*1.1=1.33x performance. I think this is perfectly realistic and the only thing that can spoil the fun is price.
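A minimal sketch of that compounding estimate (the 10% figures are the commenter's assumptions, not leaked specs):

core_gain = 1.10       # assumed +10% cores
clock_gain = 1.10      # assumed +10% clocks
bandwidth_gain = 1.10  # assumed +10% effective gain from extra memory bandwidth (mostly at 4K)
print(f"{core_gain * clock_gain * bandwidth_gain:.2f}x")  # ~1.33x overall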
#122
Redwoodz
Does anyone really care? It will be what it is, and Nvidia is going to charge whatever it wants. The MSRP would have to be $499 for it to matter to me. I'm much more impressed with the B580. Let's see where the vanilla 5060 lands (not very hopeful).
#123
GhostRyder
Most of that sounds fine (spec-wise); I am more concerned about the price, because I'll bet this is going to be around the $900 mark or higher. Every time a card gets a bump in specs in some way, they bump the price up. I would not be surprised if the XX70 series starts to become the $1K price-point cards (meaning some below, some above).
#124
Nostras
DaworaIt will be close at least..
It will be within 5% of the 4080 Super for sure. If it truly was close to the 4090 it would've been called the 5080.
#125
Kyan
RedwoodzDoes anyone really care? It will be what it is, and Nvidia is going to charge whatever it wants. The MSRP would have to be $499 for it to matter to me. I'm much more impressed with the B580. Let's see where the vanilla 5060 lands (not very hopeful).
Same, I really want to see what the B750 and B770 will put on the table. Nvidia needs to lower prices to appeal to me. I want them to prove that they understand that people can't throw €700 at a mid-range GPU. €700 to €1,500 should be the range from high tier to the best card of the generation, not €1,000 to €2,000.