
NVIDIA GeForce RTX 5070 Ti Leak Tips More VRAM, Cores, and Power Draw

Joined
May 29, 2017
Messages
383 (0.14/day)
Location
Latvia
Processor AMD Ryzen™ 7 5700X
Motherboard ASRock B450M Pro4-F R2.0
Cooling Arctic Freezer A35
Memory Lexar Thor 32GB 3733Mhz CL16
Video Card(s) PURE AMD Radeon™ RX 7800 XT 16GB
Storage Lexar NM790 2TB + Lexar NS100 2TB
Display(s) HP X34 UltraWide IPS 165Hz
Case Zalman i3 Neo + Arctic P12
Audio Device(s) Airpulse A100 + Edifier T5
Power Supply Sharkoon Rebel P20 750W
Mouse Cooler Master MM730
Keyboard Krux Atax PRO Gateron Yellow
Software Windows 11 Pro
So approximately RTX 4080 Super performance plus a ~2-7% uplift, for about $100 less.

It's basically stagnation, but the average Nvidia buyer will swallow it without blinking. Even at the same price, a ~15-20% uplift is a pretty small step for a next-gen GPU if we look at GPU generation history. The RTX 4060, 4060 Ti, RTX 4070 and RTX 4070 Ti were and still are selling very well despite poor price-to-performance.
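To put rough numbers on that, here's a quick perf-per-dollar sketch in Python. It assumes the 4080 Super's $999 launch MSRP as the baseline and treats the rumored 2-7% uplift and ~$100 discount as the only inputs; none of these are confirmed specs, just the figures quoted above.

```python
# Rough perf-per-dollar sketch using the rumored figures above.
# Assumptions (not confirmed specs): RTX 4080 Super = 100% performance at its $999 MSRP;
# RTX 5070 Ti = 2-7% faster at either ~$100 less or the same price.
def perf_per_dollar(perf: float, price: float) -> float:
    return perf / price

baseline = perf_per_dollar(100.0, 999.0)  # RTX 4080 Super at launch MSRP

for uplift in (1.02, 1.05, 1.07):         # rumored 2-7% uplift
    for price in (899.0, 999.0):          # ~$100 off vs. same price
        gain = (perf_per_dollar(100.0 * uplift, price) / baseline - 1) * 100
        print(f"+{(uplift - 1) * 100:.0f}% perf at ${price:.0f}: {gain:+.1f}% perf/$ vs. 4080 Super")
```

Even the best case here (+7% at $899) only works out to roughly +19% perf per dollar, which is exactly the "15-20% at best" stagnation described above.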
 

SRS

New Member
Joined
Oct 12, 2024
Messages
4 (0.06/day)
Until someone from the tech community decides to step up and take Nvidia on in the serious consumer GPU segment, all of the energy people spend in posting comments is wasted.

It is not at all impossible. It will, though, take a lot of money. "Oh, Apple won't be able to beat Intel. Intel has decades of expertise and even has its own leading fabs. Apple should stick to the Intel contract. The idea of using ARM designs for high performance is laughable and pitiable. What kind of expertise does Apple have in CPU design? Zero."

AMD could switch its role from enabling Nvidia to set prices to actually competing. That's the biggest barrier facing a would-be serious competitor... AMD's intentional sandbagging. However, even with that, AMD will still want to allocate as many of its wafers to enterprise as possible. There is space, right now, for a serious competitor which AMD has vacated and hasn't occupied for many years. Claims that there isn't enough market aren't supported when discontinued GPUs sell out so quickly, regardless of whether or not something like a mining craze is happening. The cards sell. If there were no market, they wouldn't.

It strikes me as weak that people are so excited about the 9800X3D, even though it's mostly an overclocked (increased power budget) variant of the 7800X3D and what people really need are more affordable powerful GPUs. Oh boy... a faster CPU to use with massively overpriced GPUs. What value!

There are so many comments claiming that people buy Nvidia cards because of the branding. That's not true. The main reason people buy them is that the alternatives aren't as good on a technical level. If I were given a chunk of Elon's fortune to create the Potato GPU corporation, releasing a card with Vicki Lawrence's Mama on a striped and polka-dotted box, fake flowers and potpourri in the box with the GPU, influencer videos from me as the CEO mocking people for buying them (saying all their friends will make fun of them), and "LAME!" printed on the GPU shroud, they would sell out. Why? Because they'd offer better performance for less money than what Nvidia is offering, without the shortcomings. How?

1) More VRAM than Nvidia at each tier.
2) Competitive gaming performance per watt.
3) Better gaming performance per dollar.
4) Clearer naming strategy. No more Ti, Super, XT/XTX, products with the same name but different specs, products with a bigger number but worse performance, etc.
5) Each generation would be meaningfully faster than the previous one, and performance per dollar would never go down.
6) Possibly moving the AI-oriented/RT-oriented hardware to a separate GPU, for a dual-GPU setup for those who care about those things. Possibly involving a new form factor to reduce latency.
7) Longer driver support than both Nvidia and AMD.
8) Better drivers out of the gate than Intel.
9) Top-end performance that's, at minimum, no lower than whatever Nvidia's top consumer card offers.
10) Serious commitment to performance in AI workloads.
11) Excellent Linux support, not just Windows support.
12) Quiet cooling.



One doesn't need smoke and mirrors to sell a superior product.

There are endless comments trying to justify AMD's refusal to compete, which is AMD's method of letting Nvidia set prices. (Soft collusion that also benefits Sony and MS by keeping "consoles" relevant.) They claim that there aren't enough customers to justify the creation of products, even though the 4090 was sold out for a long time. The argument that "consoles" are so good now (compared to the pathetic Jaguar generation) has some merit, but the video game market continues to expand, not contract. I would like to see good data showing that the serious ("enthusiast") PC gaming market is too small for a company to make a profit whilst undercutting Nvidia, and that the market wouldn't expand if people were able to purchase better-value PC gaming equipment at the enthusiast level. Instead, what I've seen are comments that could have been written by AMD and Nvidia. "Oh... woe is us... there's nothing we can do... Here's my money..." fatalism.

Enthusiasts are the people who care about hardware specs. The claim that they're blinded by "team" this and that has been shown to be untrue. Enthusiasts are not Dell, chained to one vendor. When a truly superior product becomes available, they will abandon everything else unless they're being paid to use the competition's. Enthusiasts are debating the 7800X3D vs the 9800X3D for gaming. They aren't blinded by Intel's history of better performance (particularly from Sandy Bridge through Skylake).

Pointing to historical situations in which Nvidia was able to sell inferior products at a higher rate than AMD/ATI seems to point to inadequate marketing. But even then, ATI and AMD cards had drawbacks, like inadequate coolers. The cooler AMD used for the 290X was embarrassingly underpowered, and I believe I recall that ASUS released a Strix version that was defectively designed. The current state of the Internet makes it very easy to get the word out about a superior product. A certain tech video review site, for instance, has millions of YT followers. Don't tell me people aren't going to learn about the superior product and will instead buy blindly. I don't buy it. I also don't think serious gamers care about what generic imagery is on the box, and that includes the brand logo and color scheme.

If it were my company, I would ditch the archaic ATX form factor so that GPUs, which are by far the highest-wattage components, get a form factor designed around cooling them efficiently as the #1 priority. Let's have some actual innovation (serious and committed) for once, instead of endless iteration of copy-cat products.

Anyway... my 1 cent. That's how much I have to rub together to get Potato GPU corporation off the ground. I'm not friends with the guys who build flaming moats.
 
Joined
Apr 14, 2018
Messages
699 (0.29/day)
Except it happens every single time: isn't the 3070 Ti as good as the 2080 Ti, and the 4070 Ti as good as the 3080 Ti? Same thing here. It's entirely possible that the 5070 Ti comes within close range of the 4090 at 1440p, while the 4090 remains better at 4K; that's the saving grace for 4090 owners who are coming back for a 5090 this time and are now calculating the best time to sell. It really is a 3-4 year investment. Asking $100 more isn't making or breaking the deal.

A 5070 Ti is going to have ~45% (?) fewer CUDA cores; it'll never come close to a 4090. There's absolutely no shot of a 40-50% generational IPC increase either.

It'll be amazing if it's even 2-3% faster than a 4080 Super. It's going to be another generation of Nvidia giving you less for more.
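For what it's worth, a crude throughput estimate backs this up. The sketch below just multiplies core count by clock by an assumed per-core gain; the 4090's 16,384 cores and ~2.5 GHz boost are its published specs, while the ~8,960 cores and ~2.85 GHz for the 5070 Ti are only the rumored figures from this thread, and the per-core (IPC) gain is a free parameter.

```python
# Naive scaling estimate: relative throughput ~ cores x clock x per-core (IPC) gain.
# Ignores memory bandwidth, cache and architectural differences; it only shows how
# large a per-core gain would be needed for the rumored 5070 Ti to reach a 4090.
def relative_throughput(cores: int, clock_mhz: float, ipc_gain: float = 1.0) -> float:
    return cores * clock_mhz * ipc_gain

rtx_4090 = relative_throughput(16_384, 2_520)                  # published 4090 specs
for ipc_gain in (1.00, 1.10, 1.20, 1.30):
    rtx_5070_ti = relative_throughput(8_960, 2_850, ipc_gain)  # rumored 5070 Ti figures
    print(f"IPC gain {ipc_gain:.2f}: 5070 Ti is about {rtx_5070_ti / rtx_4090:.0%} of a 4090")
```

By this back-of-the-envelope math, even a 30% per-core improvement only lands around 80% of a 4090, which is why a 4080-Super-class result looks far more plausible.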
 
Joined
Jan 18, 2021
Messages
194 (0.14/day)
Processor Core i7-12700
Motherboard MSI B660 MAG Mortar
Cooling Noctua NH-D15
Memory G.Skill Ripjaws V 64GB (4x16) DDR4-3600 CL16 @ 3466 MT/s
Video Card(s) AMD RX 6800
Storage Too many to list, lol
Display(s) Gigabyte M27Q
Case Fractal Design Define R5
Power Supply Corsair RM750x
Mouse Too many to list, lol
Keyboard Keychron low profile
Software Fedora, Mint
But that doesn't make any sense. That prediction would be in line with 3070Ti ~ Titan RTX, 4070Ti ~ 3090 Ti... but they're not. With the holding pattern you describe, the 5070Ti would be in place to touch tips with... the 4080 SUPER.
Yes, the 4090 is much stronger, relative to the rest of Ada's product stack, than previous flagships were. It seems hopelessly optimistic to expect the 70-Ti-matches-previous-flagship trend to hold, going forward.

Ampere was supposed to be good based on launch slides, but then, Nvidia ended up doubling shader count without actually doubling shader count (by calling half of them "dual issue"). Combine that with the fact that we didn't see a single model sold anywhere near MSRP through the entire life cycle of the generation, and we have the complete dumpster fire we call Ampere.

It's only exaggerated by the fact that Nvidia went full retard with prices on Ada, thinking that if gullible gamers are willing to pay thousands for a graphics card just because it comes in a green box, then maybe they should. The saddest part of it all is that time proved Nvidia right.
On the basis of performance-to-MSRP, Ampere initially looked good. We have to remember that the 20 series was widely regarded as a value turd at the time. 30-series looked like a promising return to form by contrast. Then of course the crypto shortages hit, destroying our brief moment of GPU zen. Then Ada released, at a high price premium and with its generational gains skewed towards the very top of the stack to an unprecedented degree--offering almost zilch in terms of perf-per-dollar relative to Ampere, at any price point below $1,000. The Super cards later improved that situation, but not by leaps and bounds. Either way, there was a solid year in there when Frame Gen was basically the entire selling point.

So yeah, I can see where Vayra's coming from; Ampere certainly won't win any awards on the VRAM front--but relative to what came before and since, the mental image of Ampere's intended stack seems like an unattainable ideal.

VRAM will continue to be a sore point, I suspect, because AI workloads are extremely VRAM-intensive. Nvidia therefore has a very keen incentive to use VRAM as a market segmentation mechanism. If GPUs were primarily about gaming, this probably wouldn't be an issue, at least not to anywhere near the same degree.

It is funny how things shift over time, though. 20 years ago, adding lots of VRAM to weak cards was a common and scummy marketing tactic--so much so that PC-Hardware/gaming communities grew to view VRAM as unimportant/overrated. That legacy, I believe, explains why we still see so many people insisting that e.g. 8 GB is just fine ("those cards are too weak to use more anyway!"), even despite the growing mountain of evidence demonstrating otherwise--even despite the fact that adding better textures to older games is perhaps the easiest way to jazz them up, even despite expansive modding communities creating VRAM intensive enhancements that can, in fact, run very well on lower end cards, even despite the fact that new-fangled technologies like RT and frame generation cost non-trivial amounts of VRAM. Now, if anything, the tables have turned. VRAM is under-specced and widely under-valued, except by the people who make money selling it, of course.
 
Joined
Jan 14, 2019
Messages
12,576 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
I see that this thread is going about as well as such things usually do.

Anyway, rumors are all well and good, but what will matter is performance and price. I am not hugely optimistic; NV essentially has a captive market and can price at whatever the hell they think said market will bear, but we'll see. Not too enthused about a potential TDP jump. I do realize that this is inevitable nowadays as a means to scrape out every little bit of performance, but it's not to my preference. It will probably end up being that you can limit the power significantly without losing much, but still.
I wouldn't want to resort to third-party software tools to limit power to reasonable levels on an x70-class GPU, but each to their own.
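For what it's worth, this particular part doesn't strictly need third-party tools: NVIDIA's own nvidia-smi can cap board power, and it works on Linux too. A minimal sketch, wrapped in Python only for convenience; it assumes the proprietary driver is installed and the script runs with root privileges, and the 250 W figure is an arbitrary example rather than a recommendation for any particular card.

```python
# Minimal sketch: cap GPU board power with NVIDIA's own nvidia-smi (no third-party tools).
# Assumes the proprietary NVIDIA driver is installed and the script runs with root rights.
import subprocess

def set_power_limit(watts: int, gpu_index: int = 0) -> None:
    # Show the supported power range first, so we don't request an out-of-range value.
    query = subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "-q", "-d", "POWER"],
        capture_output=True, text=True, check=True,
    )
    print(query.stdout)
    # Apply the new limit; it holds until reboot or until it is reset.
    subprocess.run(["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)], check=True)

if __name__ == "__main__":
    set_power_limit(250)  # example value only, not a recommendation
```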
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,246 (1.28/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
I wouldn't want to resort to third-party software tools to limit power to reasonable levels on an x70-class GPU
To me, the class of card has very little to do with it; basically any modern Nvidia GPU can be easily and effectively undervolted to use less power and boost efficiency relative to stock form. I'd do it with a 4060 if I owned one.
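And whether an undervolt (or power cap) actually improved efficiency is easy to check with driver telemetry. Below is a minimal sketch using the pynvml bindings (the nvidia-ml-py package); it only reads power, clock and utilization, so you can log a stock run and a tuned run of the same workload and compare. The 30-sample loop and device index 0 are arbitrary choices.

```python
# Minimal efficiency logger via NVML (pynvml): samples board power, graphics clock and
# GPU utilization so a stock run and an undervolted run of the same game can be compared.
# Assumes the NVIDIA driver and the nvidia-ml-py (pynvml) package are installed.
import time
import pynvml

def sample(handle):
    power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # driver reports milliwatts
    clock_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
    util_pct = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu
    return power_w, clock_mhz, util_pct

if __name__ == "__main__":
    pynvml.nvmlInit()
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU; adjust if you have several
    try:
        for _ in range(30):                      # ~30 seconds of samples while a game runs
            power, clock, util = sample(gpu)
            print(f"{power:6.1f} W  {clock:5d} MHz  {util:3d}% util")
            time.sleep(1)
    finally:
        pynvml.nvmlShutdown()
```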
 
Joined
Jan 14, 2019
Messages
12,576 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
To me, the class of card has very little to do with it; basically any modern Nvidia GPU can be easily and effectively undervolted to use less power and boost efficiency relative to stock form. I'd do it with a 4060 if I owned one.
Each to their own, I suppose. :)

All my cards have been pretty reasonable with power out of the box so far, and I'd prefer to keep it that way. Mainly because I'm on Linux now, so my tools for software tuning, especially with Nvidia, are limited. But anyway, I trust that the engineers at AMD/Nvidia know what they're doing, so I don't feel any itch to tinker.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,246 (1.28/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
so I don't feel any itch to tinker.
On the other hand, I can't possibly not tinker with basically everything I own lol: PCs and their components, cars, motorcycles, e-scooters, even some appliances... lol. If it can be modified, overclocked, undervolted, optimised, or even simply aesthetically tweaked to my liking, don't even bother trying to stop me :D
 
Joined
Jan 14, 2019
Messages
12,576 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
On the other hand, I can't possibly not tinker with basically everything I own lol: PCs and their components, cars, motorcycles, e-scooters, even some appliances... lol. If it can be modified, overclocked, undervolted, optimised, or even simply aesthetically tweaked to my liking, don't even bother trying to stop me :D
Don't get me wrong, I do like to tinker with PC hardware very much! :D

It's just that I don't think I could ever do anything in software to make a meaningful difference (I don't care about +-10%), so I'd rather not waste my time on fruitless efforts. :)
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,246 (1.28/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Don't get me wrong, I do like to tinker with PC hardware very much! :D

It's just that I don't think I could ever do anything in software to make a meaningful difference
I start out going after the big gains, but I'll chase tenths all day long too lol. Meaningful might be debatable, but I won't be happy till it's perfect.
 
Joined
Jun 19, 2024
Messages
130 (0.70/day)
Except it happens every single time: isn't the 3070 Ti as good as the 2080 Ti, and the 4070 Ti as good as the 3080 Ti? Same thing here. It's entirely possible that the 5070 Ti comes within close range of the 4090 at 1440p, while the 4090 remains better at 4K; that's the saving grace for 4090 owners who are coming back for a 5090 this time and are now calculating the best time to sell. It really is a 3-4 year investment. Asking $100 more isn't making or breaking the deal.

Very, very few people who own a 4090 will have any need or desire to upgrade to a 5090.

Even at 4K, a 4090 is CPU-bound in many of the latest releases.
 
Joined
Jan 14, 2019
Messages
12,576 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
So you will get the performance of a 4080 for the price of a... 4080... just with a different name?
That's pretty much the idea, as it has been for the last two generations, I suppose.
 
Joined
Jul 31, 2024
Messages
185 (1.28/day)
Processor AMD Ryzen 7 5700X
Motherboard ASUS ROG Strix B550-F Gaming Wifi II
Cooling Noctua NH-U12S Redux
Memory 4x8G Teamgroup Vulcan Z DDR4; 3600MHz @ CL18
Video Card(s) MSI Ventus 2X GeForce RTX 3060 12GB
Storage WD_Black SN770, Leven JPS600, Toshiba DT01ACA
Display(s) Samsung ViewFinity S6
Case Fractal Design Pop Air TG
Power Supply Corsair CX750M
Mouse Corsair Harpoon RGB
Keyboard Keychron C2 Pro
VR HMD Valve Index
There are so many comments claiming that people buy Nvidia cards because of the branding. That's not true. The main reason people buy them is that the alternatives aren't as good on a technical level. If I were given a chunk of Elon's fortune to create the Potato GPU corporation, releasing a card with Vicki Lawrence's Mama on a striped and polka-dotted box, fake flowers and potpourri in the box with the GPU, influencer videos from me as the CEO mocking people for buying them (saying all their friends will make fun of them), and "LAME!" printed on the GPU shroud, they would sell out. Why? Because they'd offer better performance for less money than what Nvidia is offering, without the shortcomings. How?

1) More VRAM than Nvidia at each tier.
2) Competitive gaming performance per watt.
3) Better gaming performance per dollar.
4) Clearer naming strategy. No more Ti, Super, XT/XTX, products with the same name but different specs, products with a bigger number but worse performance, etc.
5) Each generation would be meaningfully faster than the previous one, and performance per dollar would never go down.
6) Possibly moving the AI-oriented/RT-oriented hardware to a separate GPU, for a dual-GPU setup for those who care about those things. Possibly involving a new form factor to reduce latency.
7) Longer driver support than both Nvidia and AMD.
8) Better drivers out of the gate than Intel.
9) Top-end performance that's, at minimum, no lower than whatever Nvidia's top consumer card offers.
10) Serious commitment to performance in AI workloads.
11) Excellent Linux support, not just Windows support.
12) Quiet cooling.
---
There are endless comments trying to justify AMD's refusal to compete, which is AMD's method of letting Nvidia set prices. (Soft collusion that also benefits Sony and MS by keeping "consoles" relevant.) They claim that there aren't enough customers to justify the creation of products, even though the 4090 was sold out for a long time. The argument that "consoles" are so good now (compared to the pathetic Jaguar generation) has some merit, but the video game market continues to expand, not contract.
---
If it were my company, I would ditch the archaic ATX form factor so that GPUs, which are by far the highest-wattage components, get a form factor designed around cooling them efficiently as the #1 priority. Let's have some actual innovation (serious and committed) for once, instead of endless iteration of copy-cat products.
Are... are you serious? Are you for real? At what point in your critical process did you decide that, yeah, all of this is rational and makes sense and has no glaring logical issues? I can't figure out if I'm being prompted to chuckle in my seat at this or not.
 
Joined
Sep 15, 2011
Messages
6,761 (1.39/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Question. Will this beat the 4090 or not??
 
Joined
Jan 14, 2019
Messages
12,576 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
There are so many comments claiming that people buy Nvidia cards because of the branding. That's not true. The main reason people buy them is that the alternatives aren't as good on a technical level. If I were given a chunk of Elon's fortune to create the Potato GPU corporation, releasing a card with Vicki Lawrence's Mama on a striped and polka-dotted box, fake flowers and potpourri in the box with the GPU, influencer videos from me as the CEO mocking people for buying them (saying all their friends will make fun of them), and "LAME!" printed on the GPU shroud, they would sell out. Why? Because they'd offer better performance for less money than what Nvidia is offering, without the shortcomings. How?

1) More VRAM than Nvidia at each tier.
2) Competitive gaming performance per watt.
3) Better gaming performance per dollar.
4) Clearer naming strategy. No more Ti, Super, XT/XTX, products with the same name but different specs, products with a bigger number but worse performance, etc.
5) Each generation would be meaningfully faster than the previous one, and performance per dollar would never go down.
6) Possibly moving the AI-oriented/RT-oriented hardware to a separate GPU, for a dual-GPU setup for those who care about those things. Possibly involving a new form factor to reduce latency.
7) Longer driver support than both Nvidia and AMD.
8) Better drivers out of the gate than Intel.
9) Top-end performance that's, at minimum, no lower than whatever Nvidia's top consumer card offers.
10) Serious commitment to performance in AI workloads.
11) Excellent Linux support, not just Windows support.
12) Quiet cooling.

One doesn't need smoke and mirrors to sell a superior product.
Care to share the technical details? You've obviously got it figured out to the last transistor, and I'm curious.

Are... are you serious? Are you for real? At what point in your critical process did you decide that, yeah, all of this is rational and makes sense and has no glaring logical issues? I can't figure out if I'm being prompted to chuckle in my seat at this or not.
Ssh! You're talking to the greatest GPU mastermind of our age. Let him speak. ;)
 
Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Until someone from the tech community decides to step up and take Nvidia on in the serious consumer GPU segment, all of the energy people spend in posting comments is wasted.

It is not at all impossible. It will, though, take a lot of money. "Oh, Apple won't be able to beat Intel. Intel has decades of expertise and even has its own leading fabs. Apple should stick to the Intel contract. The idea of using ARM designs for high performance is laughable and pitiable. What kind of expertise does Apple have in CPU design? Zero."

AMD could switch its role from enabling Nvidia to set prices to actually competing. That's the biggest barrier facing a would-be serious competitor... AMD's intentional sandbagging. However, even with that, AMD will still want to allocate as many of its wafers to enterprise as possible. There is space, right now, for a serious competitor which AMD has vacated and hasn't occupied for many years. Claims that there isn't enough market aren't supported when discontinued GPUs sell out so quickly, regardless of whether or not something like a mining craze is happening. The cards sell. If there were no market, they wouldn't.

It strikes me as weak that people are so excited about the 9800X3D, even though it's mostly an overclocked (increased power budget) variant of the 7800X3D and what people really need are more affordable powerful GPUs. Oh boy... a faster CPU to use with massively overpriced GPUs. What value!

There are so many comments claiming that people buy Nvidia cards because of the branding. That's not true. The main reason people buy them is that the alternatives aren't as good on a technical level. If I were given a chunk of Elon's fortune to create the Potato GPU corporation, releasing a card with Vicki Lawrence's Mama on a striped and polka-dotted box, fake flowers and potpourri in the box with the GPU, influencer videos from me as the CEO mocking people for buying them (saying all their friends will make fun of them), and "LAME!" printed on the GPU shroud, they would sell out. Why? Because they'd offer better performance for less money than what Nvidia is offering, without the shortcomings. How?

1) More VRAM than Nvidia at each tier.
2) Competitive gaming performance per watt.
3) Better gaming performance per dollar.
4) Clearer naming strategy. No more Ti, Super, XT/XTX, products with the same name but different specs, products with a bigger number but worse performance, etc.
5) Each generation would be meaningfully faster than the previous one, and performance per dollar would never go down.
6) Possibly moving the AI-oriented/RT-oriented hardware to a separate GPU, for a dual-GPU setup for those who care about those things. Possibly involving a new form factor to reduce latency.
7) Longer driver support than both Nvidia and AMD.
8) Better drivers out of the gate than Intel.
9) Top-end performance that's, at minimum, no lower than whatever Nvidia's top consumer card offers.
10) Serious commitment to performance in AI workloads.
11) Excellent Linux support, not just Windows support.
12) Quiet cooling.


One doesn't need smoke and mirrors to sell a superior product.

There are endless comments trying to justify AMD's refusal to compete, which is AMD's method of letting Nvidia set prices. (Soft collusion that also benefits Sony and MS by keeping "consoles" relevant.) They claim that there aren't enough customers to justify the creation of products, even though the 4090 was sold out for a long time. The argument that "consoles" are so good now (compared to the pathetic Jaguar generation) has some merit, but the video game market continues to expand, not contract. I would like to see good data showing that the serious ("enthusiast") PC gaming market is too small for a company to make a profit whilst undercutting Nvidia, and that the market wouldn't expand if people were able to purchase better-value PC gaming equipment at the enthusiast level. Instead, what I've seen are comments that could have been written by AMD and Nvidia. "Oh... woe is us... there's nothing we can do... Here's my money..." fatalism.

Enthusiasts are the people who care about hardware specs. The claim that they're blinded by "team" this and that has been shown to be untrue. Enthusiasts are not Dell, chained to one vendor. When a truly superior product becomes available, they will abandon everything else unless they're being paid to use the competition's. Enthusiasts are debating the 7800X3D vs the 9800X3D for gaming. They aren't blinded by Intel's history of better performance (particularly from Sandy Bridge through Skylake).

Pointing to historical situations in which Nvidia was able to sell inferior products at a higher rate than AMD/ATI seems to point to inadequate marketing. But even then, ATI and AMD cards had drawbacks, like inadequate coolers. The cooler AMD used for the 290X was embarrassingly underpowered, and I believe I recall that ASUS released a Strix version that was defectively designed. The current state of the Internet makes it very easy to get the word out about a superior product. A certain tech video review site, for instance, has millions of YT followers. Don't tell me people aren't going to learn about the superior product and will instead buy blindly. I don't buy it. I also don't think serious gamers care about what generic imagery is on the box, and that includes the brand logo and color scheme.

If it were my company, I would ditch the archaic ATX form factor so that GPUs, which are by far the highest-wattage components, get a form factor designed around cooling them efficiently as the #1 priority. Let's have some actual innovation (serious and committed) for once, instead of endless iteration of copy-cat products.

Anyway... my 1 cent. That's how much I have to rub together to get Potato GPU corporation off the ground. I'm not friends with the guys who build flaming moats.
Amen to this whole story. Especially the latter part. Hear hear.

There is a caveat to the marketing stance though. There is an overwhelming majority of market segment(s) that are not enthusiasts and they do fall for it. You see this in gaming too. If there were only enthusiast gamers... CoD and FIFA would not be this big, for example. And in the slipstream of the majority vote, come the followers who 'play this anyway because friends do it too'. Peer pressure is powerful. We only need to recall that South Park episode...

 
Joined
Nov 13, 2024
Messages
88 (2.26/day)
System Name le fish au chocolat
Processor AMD Ryzen 7 5950X
Motherboard ASRock B550 Phantom Gaming 4
Cooling Peerless Assassin 120 SE
Memory 2x 16GB (32 GB) G.Skill RipJaws V DDR4-3600 DIMM CL16-19-19-39
Video Card(s) NVIDIA GeForce RTX 3080, 10 GB GDDR6X (ASUS TUF)
Storage 2 x 1 TB NVME & 2 x 4 TB SATA SSD in Raid 0
Display(s) MSI Optix MAG274QRF-QD
Power Supply 750 Watt EVGA SuperNOVA G5
Are... are you serious? Are you for real? At what point in your critical process did you decide that, yeah, all of this is rational and makes sense and has no glaring logical issues? I can't figure out if I'm being prompted to chuckle in my seat at this or not.
Phew... I thought I was just stupid after my eyes glazed over reading his message ^^"
 
Joined
Dec 1, 2022
Messages
251 (0.33/day)
Until someone from the tech community decides to step up and take Nvidia on in the serious consumer GPU segment, all of the energy people spend in posting comments is wasted.

It is not at all impossible. It will, though, take a lot of money. "Oh, Apple won't be able to beat Intel. Intel has decades of expertise and even has its own leading fabs. Apple should stick to the Intel contract. The idea of using ARM designs for high performance is laughable and pitiable. What kind of expertise does Apple have in CPU design? Zero."

AMD could switch its role from enabling Nvidia to set prices to actually competing. That's the biggest barrier facing a would-be serious competitor... AMD's intentional sandbagging. However, even with that, AMD will still want to allocate as many of its wafers to enterprise as possible. There is space, right now, for a serious competitor which AMD has vacated and hasn't occupied for many years. Claims that there isn't enough market aren't supported when discontinued GPUs sell out so quickly, regardless of whether or not something like a mining craze is happening. The cards sell. If there were no market, they wouldn't.

It strikes me as weak that people are so excited about the 9800X3D, even though it's mostly an overclocked (increased power budget) variant of the 7800X3D and what people really need are more affordable powerful GPUs. Oh boy... a faster CPU to use with massively overpriced GPUs. What value!

There are so many comments claiming that people buy Nvidia cards because of the branding. That's not true. The main reason people buy them is that the alternatives aren't as good on a technical level. If I were given a chunk of Elon's fortune to create the Potato GPU corporation, releasing a card with Vicki Lawrence's Mama on a striped and polka-dotted box, fake flowers and potpourri in the box with the GPU, influencer videos from me as the CEO mocking people for buying them (saying all their friends will make fun of them), and "LAME!" printed on the GPU shroud, they would sell out. Why? Because they'd offer better performance for less money than what Nvidia is offering, without the shortcomings. How?

1) More VRAM than Nvidia at each tier.
2) Competitive gaming performance per watt.
3) Better gaming performance per dollar.
4) Clearer naming strategy. No more Ti, Super, XT/XTX, products with the same name but different specs, products with a bigger number but worse performance, etc.
5) Each generation would be meaningfully faster than the previous one, and performance per dollar would never go down.
6) Possibly moving the AI-oriented/RT-oriented hardware to a separate GPU, for a dual-GPU setup for those who care about those things. Possibly involving a new form factor to reduce latency.
7) Longer driver support than both Nvidia and AMD.
8) Better drivers out of the gate than Intel.
9) Top-end performance that's, at minimum, no lower than whatever Nvidia's top consumer card offers.
10) Serious commitment to performance in AI workloads.
11) Excellent Linux support, not just Windows support.
12) Quiet cooling.


One doesn't need smoke and mirrors to sell a superior product.

There are endless comments trying to justify AMD's refusal to compete, which is AMD's method of letting Nvidia set prices. (Soft collusion that also benefits Sony and MS by keeping "consoles" relevant.) They claim that there aren't enough customers to justify the creation of products, even though the 4090 was sold out for a long time. The argument that "consoles" are so good now (compared to the pathetic Jaguar generation) has some merit, but the video game market continues to expand, not contract. I would like to see good data showing that the serious ("enthusiast") PC gaming market is too small for a company to make a profit whilst undercutting Nvidia, and that the market wouldn't expand if people were able to purchase better-value PC gaming equipment at the enthusiast level. Instead, what I've seen are comments that could have been written by AMD and Nvidia. "Oh... woe is us... there's nothing we can do... Here's my money..." fatalism.

Enthusiasts are the people who care about hardware specs. The claim that they're blinded by "team" this and that has been shown to be untrue. Enthusiasts are not Dell, chained to one vendor. When a truly superior product becomes available, they will abandon everything else unless they're being paid to use the competition's. Enthusiasts are debating the 7800X3D vs the 9800X3D for gaming. They aren't blinded by Intel's history of better performance (particularly from Sandy Bridge through Skylake).

Pointing to historical situations in which Nvidia was able to sell inferior products at a higher rate than AMD/ATI seems to point to inadequate marketing. But even then, ATI and AMD cards had drawbacks, like inadequate coolers. The cooler AMD used for the 290X was embarrassingly underpowered, and I believe I recall that ASUS released a Strix version that was defectively designed. The current state of the Internet makes it very easy to get the word out about a superior product. A certain tech video review site, for instance, has millions of YT followers. Don't tell me people aren't going to learn about the superior product and will instead buy blindly. I don't buy it. I also don't think serious gamers care about what generic imagery is on the box, and that includes the brand logo and color scheme.

If it were my company, I would ditch the archaic ATX form factor so that GPUs, which are by far the highest-wattage components, get a form factor designed around cooling them efficiently as the #1 priority. Let's have some actual innovation (serious and committed) for once, instead of endless iteration of copy-cat products.

Anyway... my 1 cent. That's how much I have to rub together to get Potato GPU corporation off the ground. I'm not friends with the guys who build flaming moats.
I'm not even sure where to start, but blaming AMD for Nvidia's greed and monopolization of the market is an interesting take, although an unsurprising one.
 
Joined
Dec 28, 2012
Messages
3,954 (0.90/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
A 5070 Ti is going to have ~45% (?) fewer CUDA cores; it'll never come close to a 4090. There's absolutely no shot of a 40-50% generational IPC increase either.

It'll be amazing if it's even 2-3% faster than a 4080 Super. It's going to be another generation of Nvidia giving you less for more.
Performance moving down one tier has been the norm for over a decade now. I doubt anyone will be upset at getting a $650-750 4080 Super.

Well, let me rephrase: MOST people will not be upset. There will be those angry that Nvidia doesn't give them a 4090 at $200, but meh. Can't please everyone.
 

95Viper

Super Moderator
Staff member
Joined
Oct 12, 2008
Messages
13,050 (2.21/day)
Discuss the topic... not the members or their state of mind.
 
Joined
May 29, 2017
Messages
383 (0.14/day)
Location
Latvia
Processor AMD Ryzen™ 7 5700X
Motherboard ASRock B450M Pro4-F R2.0
Cooling Arctic Freezer A35
Memory Lexar Thor 32GB 3733Mhz CL16
Video Card(s) PURE AMD Radeon™ RX 7800 XT 16GB
Storage Lexar NM790 2TB + Lexar NS100 2TB
Display(s) HP X34 UltraWide IPS 165Hz
Case Zalman i3 Neo + Arctic P12
Audio Device(s) Airpulse A100 + Edifier T5
Power Supply Sharkoon Rebel P20 750W
Mouse Cooler Master MM730
Keyboard Krux Atax PRO Gateron Yellow
Software Windows 11 Pro
I doubt anyone will be upset at getting a $650-750 4080 Super.
After the things Nvidia did to the RTX 4000 series GPUs, I don't see such low prices for the RTX 5070 Ti. At least $800-900, and those slaves with big math problems will be proud.
 
Joined
Sep 19, 2014
Messages
73 (0.02/day)
Man these things will be atrociously underpowered compared to their predecessors, 6% more shaders lol.
But core clocks are 2800-2900 MHz,
plus IPC gains and more memory bandwidth.

So it will be much faster.

It means the 5070 Ti will smoke the 4070 Ti.

And many of the comments in this thread are from guys running AMD GPUs lol..
That's why Nvidia gets so much hate... going by market share, it has to be that many AMD users mostly just talk trash about Nvidia in forums atm.

Same story..
Low VRAM
It's gonna be bad perf vs the old gen
High price

Question. Will this beat the 4090 or not??
It will be close, at least..
 

Glina

New Member
Joined
Oct 28, 2024
Messages
9 (0.16/day)
My base expectation is 10% more cores, 10% higher clock speed, and a 10% bump (mostly at 4K) from memory bandwidth. 1.1 × 1.1 × 1.1 ≈ 1.33x performance. I think this is perfectly realistic, and the only thing that can spoil the fun is price.
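If anyone wants to play with that compounding estimate, it's one line of arithmetic; the tiny sketch below just makes the three factors explicit so they're easy to swap out. The 10% figures are this post's guesses, not leaked specs.

```python
# Compounding the three guessed gains: more cores x higher clocks x more bandwidth.
cores, clocks, bandwidth = 1.10, 1.10, 1.10    # the 10% guesses from the post above
print(f"Expected uplift: {cores * clocks * bandwidth:.2f}x")   # ~1.33x
```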
 
Joined
Mar 29, 2014
Messages
496 (0.13/day)
Does anyone really care? It will be what it is and Nvidia is going to charge whatever it wants. The MSRP would have to be $499 for it to matter to me. I'm much more impressed with the B580. Let's see where the vanilla 5060 lands. (not very hopeful)
 