
NVIDIA Readies New GeForce RTX 30-series SKU Positioned Between RTX 3070 and RTX 3080

Joined
Nov 11, 2016
Messages
3,459 (1.17/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.Skill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razer Death Adder v3
Keyboard Razer Huntsman V3 Pro TKL
Software win11
It's not just the settings, but the general assets of the game, texture quality, etc. You can lower your settings, but you can't lower them indefinitely. The fact that you can still play games fine with 4 GB VRAM has nothing to do with it. I currently play The Witcher 3 on my GT 1030 2 GB (since I sold my GTX 1660 Ti) and only use about 1.5 GB with 1080p medium settings. Microsoft Flight Simulator would be a totally different story, I guess. You can't say that 4 GB is enough just because it's enough for you.

Besides, if you're happy to play at lower settings, then why is it an issue for you if a graphics card doesn't have hardware-accelerated RT? ;)

Not to mention, if you're lowering your settings right now, then what kind of future proofing are we talking about?

By your definition, then, the Radeon VII is the most future-proof GPU right now :roll: (except the 3090).
Future-proofing means more than just VRAM; it's also the feature set. Anyway, the only people I know who correlate VRAM to performance are tech noobies.

It's not so much that; it's that the first go at a new tech is never worth it. What you buy it for is so new that it isn't established yet, and by the time it is, we'll be two or so generations further on, which is what it will take to handle it properly anyway.
The RTX 2000 series was the first go at RT, and even now with the 3000 series out, RT isn't really a thing yet. So if you bought an RTX 2000 series card for RT, you were just being silly, yet that is what you base your opinion on as to why the 5700 XT was obsolete on release.

I really have no clue what you are talking about; I finished 4 RTX games just fine. Who said you need 100+ FPS to enjoy single-player games? AMD owners? :roll:
Btw, I can probably max out CP2077's settings with my 2080 Ti and it would still be enjoyable.
 
Joined
Jan 14, 2019
Messages
12,592 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
By your definition, then, the Radeon VII is the most future-proof GPU right now :roll: (except the 3090).
Future-proofing means more than just VRAM; it's also the feature set. Anyway, the only people I know who correlate VRAM to performance are tech noobies.
Don't put words into my mouth. If I want to define something, I will.

All I'm saying is, you have to differentiate between features that your game requires, and features that you want. If you want to "future-proof" yourself with all the shine available, then of course you buy a high-tier GPU with all the current feature sets. There is nothing wrong with that - in fact, that's what I'm planning with my next build.

On the other hand, your game still needs a certain amount of processing power and RAM/VRAM even on the lowest settings. If you aim just to play a game on basically any setting, then a graphics card with RT is not going to be any more future-proof than one without, but you still need processing power and VRAM whether you use RT or not.

I hope I was clear enough this time.
 
Joined
Feb 20, 2019
Messages
8,341 (3.91/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
See? When you talk about RT and DLSS, it's "only in a handful of games", yet they were obviously the future-proofing features of that time. Do you know how many games require more than 10 GB of VRAM right now? Zero?
People who switch between price-to-performance and future-proofing arguments as they see fit really shouldn't comment on either matter.

Btw, DX12 Ultimate is more than just RT.
The best future-proofing by far is to buy something much closer to the performance/$ sweet spot and avoid the early-adopter/flagship/premium tax, then do the same thing again regularly, rather than blowing all of your saved budget once every 5-6 years.
  • Two years ago you could have bought a "future proof" 2080 Ti for $1200, or (at the time) a $400 1070 Ti, and put $800 aside.
  • You'd still be using your "future proof" 2080 Ti now, but if you'd bought the 1070 Ti you could jump on the RTX 3070 or 6800 XT and still have enough cash left over for another GPU two years from now.
  • Going into 2023 I'm pretty sure the 2080 Ti is going to be on the struggle bus. Meanwhile, the sweet-spot route still has $350 in the kitty for some RDNA3 or RTX 5000-series card... (rough math below)
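Putting rough numbers on that (these are just the prices quoted above; the ~$450 for the 2020 card is what the $350-leftover figure implies, not an exact street price):

Code:
# Two upgrade strategies over the same ~4-year window, using the rough
# prices quoted above ($450 for the 2020 card is implied, not exact).
flagship_cost = 1200                 # "future proof" 2080 Ti, bought once in 2018
sweet_spot_buys = [400, 450]         # 1070 Ti in 2018, then a 3070-class card in 2020

spent = sum(sweet_spot_buys)
leftover = flagship_cost - spent     # same total budget for both routes
print(f"Flagship route:   ${flagship_cost} spent, $0 left")
print(f"Sweet-spot route: ${spent} spent, ${leftover} left for an RDNA3/RTX 5000 card")

Both routes cost the same $1200 through 2020, but one of them ends up holding a much newer card plus $350 for the next generation.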
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
14,019 (2.34/day)
Location
Louisiana
Processor Core i9-9900k
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax ETS-T50 Black CPU cooler
Memory 32GB (2x16) Mushkin Redline DDR-4 3200
Video Card(s) ASUS RTX 4070 Ti Super OC 16GB
Storage 1x 1TB MX500 (OS); 2x 6TB WD Black; 1x 2TB MX500; 1x 1TB BX500 SSD; 1x 6TB WD Blue storage (eSATA)
Display(s) Infievo 27" 165Hz @ 2560 x 1440
Case Fractal Design Define R4 Black -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic Focus GX-1000 Gold
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
Is anyone else tired of this forum and its users propagating misinformation? I mean, you link/support/show, and yet here we are... users droning on about VRAM...

Crying shame that the reviews are so good, but when you hit the forum it's polarizing, toxic, fanboy-driven drivel. :(
You can count me in the Truth Brigade. I’m tired of correcting the FUD about VRAM though. These people are the ones that will buy anything as long as it has enough of whatever it is they think will increase their epeen, truth be damned.
 
Joined
Feb 11, 2009
Messages
5,574 (0.96/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Aorus Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb NVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case Antec 600 -> Thermaltake Tenor HTPC case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
I really have no clue what you are talking about; I finished 4 RTX games just fine. Who said you need 100+ FPS to enjoy single-player games? AMD owners? :roll:
Btw, I can probably max out CP2077's settings with my 2080 Ti and it would still be enjoyable.

Ok, this discussion is getting kinda pointless, but to sum it up:
1. The RX 5700, according to you, was outdated upon release, your reasoning being its lack of RT support.
2. The RTX 2000 series is the first iteration of RT and thus very inefficient at it: poor performance and mediocre looks. RT is not ready for prime time, so buying an RTX 2000 series card for RT is just silly.
3. The RTX 3000 series is out now, much better at RT, and it can finally sorta properly play the handful of RT titles currently out. I mean, sure, if you like mediocre framerates on a $700+ card... enjoy, I guess, but I would say that is a poor investment for that experience.

Basically, the RX 5700 was a perfect purchase for this time period, when RT is nothing more than a gimmick, a tech demo, something to experiment with.
By the time RT properly arrives, the RTX 2000 series won't be able to deal with it, so you will have to upgrade anyway, and by then people with an RX 5700 will also upgrade.

You have to understand: for games to actually REALLY go for RT, the world has to be able to make use of it, OR Nvidia would have to pump a TON of money into developers to compensate them for the low sales caused by the limited number of customers that could actually buy/play the game.
It won't be until everyone (low, mid and high end, consoles included) can do RT that games can be built on RT tech, and again, by that time the RTX 2000 series, and probably the 3000 series as well, will be just as obsolete as the RX 5700.

And "probably maxing out CP2077 on an RTX 2080 Ti" just means they are holding back on the RT in favor of a larger player base being able to use it.
That is kinda my problem with the whole VRAM debate: sure, maybe games don't need more than 8 GB of VRAM today for 4K gaming... but let's keep in mind that the only reason they don't need more is that the cards don't have more, so developers can't make use of it and games are held back.
If all new cards had, like, 12 GB and up, then devs could use higher-quality/resolution assets, because the cards in players' hands could support it. Basically, if you stick the cards with a relatively low amount, we are just held back in that regard for the near future, which is not a good thing imo, and that's why I'm disappointed.
 
Joined
Sep 3, 2019
Messages
3,586 (1.85/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 200W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (382W current) PowerLimit, 1060mV, Adrenalin v24.12.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2605), upgraded from Win10 to Win11 on Jan 2024
I've played a few games that over time filled the 8 GB of VRAM on my 5700 XT. It wasn't a problem and I really didn't notice lag or anything weird. As many have said, just because all the available VRAM is filled doesn't mean the GPU or the game needs all that data at once. Which brings us to the point that 10 GB is fine now and for the next year.
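If you want to watch this yourself, here's a minimal monitor sketch using NVIDIA's pynvml bindings (my assumptions: NVIDIA-only, pynvml installed; a Radeon card like my 5700 XT would need a different API). Note that it can only show allocated memory, not what the game actually touches each frame:

Code:
# Minimal VRAM monitor sketch using NVIDIA's NVML bindings (pynvml).
# NVML reports *allocated* memory; a game can fill the whole pool as a
# cache without actually needing all of that data every frame.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
try:
    for _ in range(30):                        # sample once a second for ~30 s
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 2**30:.2f} / {mem.total / 2**30:.2f} GiB")
        time.sleep(1.0)
finally:
    pynvml.nvmlShutdown()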

But nVidia has played it dirty, as it usually does. The previous flagship, the 2080 Ti, has 11 GB of VRAM. Why on earth does the next flagship and replacement have 1 GB less? Because the 3080 10GB is not what nVidia announced it to be.
The 3090 24GB is the real flagship and replacement for the 2080 Ti 11GB.
nVidia could have gone with TSMC 7 nm and built a real monster GPU: create a 2080 Super 8GB replacement with at least 12 GB (one side of the PCB fully populated), if not 24 GB, call it the 3080, give it +30% performance over the 2080 Ti 11GB, and give it an $800 price tag. 7 nm could have given nVidia the headroom to uplift performance another 30% over that 3080 12/24GB and create a 3080 Ti with 20/24 GB, price it around $1000~1100, and call it a day... RDNA2 could not have reached that level of performance, hence the crown, and the $1000++ price would be justified for the fastest GPU in the world, +60% above the 2080 Ti. I bet those cards would sell like hot cakes.

Instead, nVidia...
Deliberately chopped VRAM off the 3080, made it 10 GB, and calls that a flagship GPU, with 1 GB less than the previous one...
Created another GPU with an insignificant performance uplift, called it the 3090 instead of 3080 Ti to present it as a whole other tier when in reality it is not, gave it 24 GB, and disguised it under "Titan-level performance" just to justify the $1500 price and mislead users with a fake $1000 Titan discount. When in reality it is a 2080 Ti replacement with a +$300~400 price hike.
Because they were also planning to promote a 3080S/Ti 20GB with +5% performance and a $1000+ price tag.

But we all know the outcome...

If the 3090 is a Titan, call it a Titan, give it Titan performance and Titan drivers, and be done with it.
But no...
The master of deceptive marketing strikes again, making plans to maximize profit margins without giving real performance uplifts beyond the 3080 10GB, while charging a few hundred dollars for every +5% of performance.

The dumbest (rather, cheapest) decision of all was the choice of Samsung's 8 nm (really a 10++++) node, which led to terrible (small) perf/watt gains over Turing, which led them to push TBP to unprecedented levels of power draw (for nVidia), and... guess what... led to oversized, over-engineered, expensive coolers for both the FE and the AIB cards. That, combined with GDDR6X, diminishes profit in the $700~800 price range. That's why they wanted the 3080S/Ti 20GB and the fake 3080 Titan/3090 24GB.

That was some real good planning and business.
 
Joined
Nov 11, 2016
Messages
3,459 (1.17/day)
Ok, this discussion is getting kinda pointless, but to sum it up:
1. The RX 5700, according to you, was outdated upon release, your reasoning being its lack of RT support.
2. The RTX 2000 series is the first iteration of RT and thus very inefficient at it: poor performance and mediocre looks. RT is not ready for prime time, so buying an RTX 2000 series card for RT is just silly.
3. The RTX 3000 series is out now, much better at RT, and it can finally sorta properly play the handful of RT titles currently out. I mean, sure, if you like mediocre framerates on a $700+ card... enjoy, I guess, but I would say that is a poor investment for that experience.

Basically, the RX 5700 was a perfect purchase for this time period, when RT is nothing more than a gimmick, a tech demo, something to experiment with.
By the time RT properly arrives, the RTX 2000 series won't be able to deal with it, so you will have to upgrade anyway, and by then people with an RX 5700 will also upgrade.

You have to understand: for games to actually REALLY go for RT, the world has to be able to make use of it, OR Nvidia would have to pump a TON of money into developers to compensate them for the low sales caused by the limited number of customers that could actually buy/play the game.
It won't be until everyone (low, mid and high end, consoles included) can do RT that games can be built on RT tech, and again, by that time the RTX 2000 series, and probably the 3000 series as well, will be just as obsolete as the RX 5700.

And "probably maxing out CP2077 on an RTX 2080 Ti" just means they are holding back on the RT in favor of a larger player base being able to use it.
That is kinda my problem with the whole VRAM debate: sure, maybe games don't need more than 8 GB of VRAM today for 4K gaming... but let's keep in mind that the only reason they don't need more is that the cards don't have more, so developers can't make use of it and games are held back.
If all new cards had, like, 12 GB and up, then devs could use higher-quality/resolution assets, because the cards in players' hands could support it. Basically, if you stick the cards with a relatively low amount, we are just held back in that regard for the near future, which is not a good thing imo, and that's why I'm disappointed.

Well, for you, RTX, DLSS, VRS and Mesh Shaders are all gimmicks, yet more VRAM than necessary is not a gimmick :roll:. How many games benefit from >10 GB of VRAM right now? That's 0.

Turing GPUs are very capable of handling RT and Ultra details at 1440p and below; not sure why you declare them dead when 4K gaming is still a long way off.

Ultra details that require massive VRAM are rarely worth it visually; rasterization is long past its point of diminishing returns. That's why everyone is flocking to RT now: a less costly RT effect like reflections and transparent reflections can make a bigger visual impact than Ultra detail.


The best future-proofing by far is to buy something much closer to the performance/$ sweet spot and avoid the early-adopter/flagship/premium tax, then do the same thing again regularly, rather than blowing all of your saved budget once every 5-6 years.
  • Two years ago you could have bought a "future proof" 2080 Ti for $1200, or (at the time) a $400 1070 Ti, and put $800 aside.
  • You'd still be using your "future proof" 2080 Ti now, but if you'd bought the 1070 Ti you could jump on the RTX 3070 or 6800 XT and still have enough cash left over for another GPU two years from now.
  • Going into 2023 I'm pretty sure the 2080 Ti is going to be on the struggle bus. Meanwhile, the sweet-spot route still has $350 in the kitty for some RDNA3 or RTX 5000-series card...

I agree, so why are people bothered with future-proofing the 3070/3080 when they are just gonna upgrade to RTX 4000 anyway :D. Adding useless VRAM would just destroy the 3070/3080's price-to-performance advantage.

I've played a few games that over time filled the 8 GB of VRAM on my 5700 XT. It wasn't a problem and I really didn't notice lag or anything weird. As many have said, just because all the available VRAM is filled doesn't mean the GPU or the game needs all that data at once. Which brings us to the point that 10 GB is fine now and for the next year.

But nVidia has played it dirty, as it usually does. The previous flagship, the 2080 Ti, has 11 GB of VRAM. Why on earth does the next flagship and replacement have 1 GB less? Because the 3080 10GB is not what nVidia announced it to be.
The 3090 24GB is the real flagship and replacement for the 2080 Ti 11GB.
nVidia could have gone with TSMC 7 nm and built a real monster GPU: create a 2080 Super 8GB replacement with at least 12 GB (one side of the PCB fully populated), if not 24 GB, call it the 3080, give it +30% performance over the 2080 Ti 11GB, and give it an $800 price tag. 7 nm could have given nVidia the headroom to uplift performance another 30% over that 3080 12/24GB and create a 3080 Ti with 20/24 GB, price it around $1000~1100, and call it a day... RDNA2 could not have reached that level of performance, hence the crown, and the $1000++ price would be justified for the fastest GPU in the world, +60% above the 2080 Ti. I bet those cards would sell like hot cakes.

Instead, nVidia...
Deliberately chopped VRAM off the 3080, made it 10 GB, and calls that a flagship GPU, with 1 GB less than the previous one...
Created another GPU with an insignificant performance uplift, called it the 3090 instead of 3080 Ti to present it as a whole other tier when in reality it is not, gave it 24 GB, and disguised it under "Titan-level performance" just to justify the $1500 price and mislead users with a fake $1000 Titan discount. When in reality it is a 2080 Ti replacement with a +$300~400 price hike.
Because they were also planning to promote a 3080S/Ti 20GB with +5% performance and a $1000+ price tag.

But we all know the outcome...

If the 3090 is a Titan, call it a Titan, give it Titan performance and Titan drivers, and be done with it.
But no...
The master of deceptive marketing strikes again, making plans to maximize profit margins without giving real performance uplifts beyond the 3080 10GB, while charging a few hundred dollars for every +5% of performance.

The dumbest (rather, cheapest) decision of all was the choice of Samsung's 8 nm (really a 10++++) node, which led to terrible (small) perf/watt gains over Turing, which led them to push TBP to unprecedented levels of power draw (for nVidia), and... guess what... led to oversized, over-engineered, expensive coolers for both the FE and the AIB cards. That, combined with GDDR6X, diminishes profit in the $700~800 price range. That's why they wanted the 3080S/Ti 20GB and the fake 3080 Titan/3090 24GB.

That was some real good planning and business.

So you'd prefer another $1200 3080 Ti 11GB?
Not sure it would sell better than the 3080 at $700; probably a bit better than the 3090.
People whined about Turing's high price points too, but what are high-end PC gamers gonna do, buy an AMD GPU? :roll:. Let's just hope Big Navi brings big performance back to the table so high-end PC gamers have more choices this time.
 
Joined
Mar 18, 2015
Messages
2,963 (0.83/day)
Location
Long Island
Nvidia must be bloody terrified of what AMD has if it's moving GA102 down to 70 Ti levels

I'd guess about as terrified as they were for the 7xx, 9xx, 10xx and 20xx series ... the fact that they are dropping the higher-VRAM versions says nothing other than ... "it wasn't bringing anything to the table". AMD hasn't had a horse in the race since the 6xx series. AMD's slogan should be "AMD is gonna" ... the 290 / 290X lost to the 780 with both cards overclocked and never threatened the Ti ... the 970 outsold all AMD cards combined by more than 2 to 1 ... the Fury, Vega and Radeon VII were flops. I'd love to see AMD be competitive ... it's not like when you saw the Globetrotters play the Generals you expected a close game ... but every new generation it's "AMD is gonna", and it has yet to happen. Same with the VRAM ... the 480 / 580 w/ more RAM couldn't compete with the 1060 ... we have seen the same VRAM argument through the 6xx, 7xx, 9xx and 10xx eras and it never pans out. I hope it does this time, but with the 7xx series nVidia held the top 2 tiers, with 9xx the top 3, and with 10xx the top 4.

I truly hope they can make a top-tier version of the 5600 XT ... but every new generation it's been "AMD is gonna". They were the Chicago Cubs of the technology world ... hadn't won it all in 100 years ... but this was gonna be their year. After 100 years you couldn't help but root for the Cubbies, and I'm sitting here rooting for AMD ... I would love to see them do it. But I just can't get invested in the same ole hype we see every new generation only to be left with disappointment ... yet again ... I don't know how peeps can go through this every generation ... shouting from the rooftops for 2 months before the release how "this is gonna change everything", and then ya gotta show ya face when it's another 290X, Fury, Vega or Radeon VII.

It's going to begin and end with the GPU ... every time 2 VRAM versions of a card have come out, it's been shown that unless you work really hard to create a special situation, VRAM only makes a significant difference in fps when the frame rates are already so low as to make the game unplayable.


Yes, ya can find a game that appears to show a difference ... a poor console port ... but most games are unaffected ... you can also find games where the performance difference between the same card w/ different VRAM amounts goes DOWN when you increase resolution ... which makes no sense.
 
Joined
Sep 3, 2019
Messages
3,586 (1.85/day)
Location
Thessaloniki, Greece
So you'd prefer another $1200 3080 Ti 11GB?
Not sure it would sell better than the 3080 at $700; probably a bit better than the 3090.
People whined about Turing's high price points too, but what are high-end PC gamers gonna do, buy an AMD GPU? :roll:. Let's just hope Big Navi brings big performance back to the table so high-end PC gamers have more choices this time.
I think you are smarter than this, because I can't believe that this is what you took away from all I wrote. I didn't say anything like that.

But I will go into "broken record mode" and try to break it down so others can understand it better. You understood it just fine, but you're trying to confuse the public. I couldn't care less why.

nVidia should have gone with 7 nm to get a better perf/watt ratio, so they could have pushed performance even higher at the same or lower TBP.

And do...

3080 (at least) 12 GB, 250 W, +30% perf over the 2080 Ti 11GB, $700~800
3080 Ti 20/24 GB, 300+ W, +30% over the 3080, $1100 or even $1200
Titan 48 GB, $2500~3000, with real Titan performance and drivers

Instead, they did...

3080 10GB, 320 W, +20~30% perf over the 2080 Ti 11GB, $700~800
3090 24GB, 350+ W, +10% perf over the 3080 10GB, $1500+
Also wanted...
3080 Ti 20GB, 320+ W, +5% perf over the 3080 10GB, $1000+
So after the 3080 10GB, they want/wanted to charge +$300~500 for every +5% of performance uplift (quick math below).
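Quick math on that claim, using the rough figures from the list above (these are the claimed uplifts and prices, not measured data):

Code:
# Dollars charged per +5% performance step, using the rough figures
# from the list above (claimed uplifts and prices, not benchmark data).
cards = [
    # (name, perf relative to the 3080 10GB, price in USD)
    ("3080 10GB",    1.00,  700),
    ("3080 Ti 20GB", 1.05, 1000),  # the reportedly planned, later cancelled card
    ("3090 24GB",    1.10, 1500),
]
base_perf, base_price = cards[0][1], cards[0][2]
for name, perf, price in cards[1:]:
    steps = round((perf - base_perf) / 0.05)  # number of +5% steps over the 3080
    extra = price - base_price                # extra dollars over the 3080
    print(f"{name}: +${extra} for {steps} x 5% -> ${extra // steps} per +5%")

That lands at $300~400 per +5% step, right in the range claimed above.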

So, to sum up:
They went cheap with Samsung, were forced to over-engineer the coolers, and restricted the 3080 to 10 GB because they couldn't make good enough profit margins otherwise, pushing users toward $1000~1500+ GPUs with insignificant (+5%, +5%) performance uplifts:
the cancelled "3080 Ti 20GB" and the 3090 24GB.

They faked the 3090 24GB as the next Titan, with a fake $1000 discount but no Titan performance or drivers. That is no Titan...

As I said, it was a good business plan to maximize profit margins, and sure, they will find a way to survive this. But their original plan backfired pretty much everywhere. From their faces to Jensen's arse...

They could have won the "fight" single-handedly and been the clear king of performance and features, still with good enough margins.
But no... they chose lies, fake titles and user exploitation to the max.
 
Joined
Nov 11, 2016
Messages
3,459 (1.17/day)
I think you are smarter than this, because I can't believe that this is what you took away from all I wrote. I didn't say anything like that.

But I will go into "broken record mode" and try to break it down so others can understand it better. You understood it just fine, but you're trying to confuse the public. I couldn't care less why.

nVidia should have gone with 7 nm to get a better perf/watt ratio, so they could have pushed performance even higher at the same or lower TBP.

And do...

3080 (at least) 12 GB, 250 W, +30% perf over the 2080 Ti 11GB, $700~800
3080 Ti 20/24 GB, 300+ W, +30% over the 3080, $1100 or even $1200
Titan 48 GB, $2500~3000, with real Titan performance and drivers

Instead, they did...

3080 10GB, 320 W, +20~30% perf over the 2080 Ti 11GB, $700~800
3090 24GB, 350+ W, +10% perf over the 3080 10GB, $1500+
Also wanted...
3080 Ti 20GB, 320+ W, +5% perf over the 3080 10GB, $1000+
So after the 3080 10GB, they want/wanted to charge +$300~500 for every +5% of performance uplift.

So, to sum up:
They went cheap with Samsung, were forced to over-engineer the coolers, and restricted the 3080 to 10 GB because they couldn't make good enough profit margins otherwise, pushing users toward $1000~1500+ GPUs with insignificant (+5%, +5%) performance uplifts:
the cancelled "3080 Ti 20GB" and the 3090 24GB.

They faked the 3090 24GB as the next Titan, with a fake $1000 discount but no Titan performance or drivers. That is no Titan...

As I said, it was a good business plan to maximize profit margins, and sure, they will find a way to survive this. But their original plan backfired pretty much everywhere. From their faces to Jensen's arse...

They could have won the "fight" single-handedly and been the clear king of performance and features, still with good enough margins.
But no... they chose lies, fake titles and user exploitation to the max.

Not sure what Nvidia owes you, but you are demanding something quite outrageous there.
Anything else outrageous you'd like to demand while you're at it? World peace? The end of the pandemic?

The current 3080/3090 are not good enough? Don't buy them, problem solved.
FYI, TSMC is also charging ridiculous prices for its foundry capacity; there's nothing wrong with supporting Samsung's foundry to bring in some competition. Sounds very much like people wanting AMD to be competitive, eh :D
Also, AMD is not a charity either; they will try to increase their margins once they have the performance crown. Zen 3 would like to say hi :roll:
 
Joined
Sep 3, 2019
Messages
3,586 (1.85/day)
Location
Thessaloniki, Greece
Not sure what Nvidia owes you, but you are demanding something quite outrageous there.
Anything else outrageous you'd like to demand while you're at it? World peace? The end of the pandemic?

The current 3080/3090 are not good enough? Don't buy them, problem solved.
FYI, TSMC is also charging ridiculous prices for its foundry capacity; there's nothing wrong with supporting Samsung's foundry to bring in some competition. Sounds very much like people wanting AMD to be competitive, eh :D
Also, AMD is not a charity either; they will try to increase their margins once they have the performance crown. Zen 3 would like to say hi :roll:
It's not unethical to seek profit margins. Of course AMD, nVidia or Intel don't owe anything to anyone.
My two examples, of what nVidia should have done versus what it actually did, have a point:
they chose to do what they have done, and are trying to present it as my "should" option. That's what it was all about.

At least in the last 3-4 years, any price uplift from AMD came with something: a nice performance uplift, not fake titles (not all of them, but some).
 
Joined
Jul 10, 2015
Messages
754 (0.22/day)
Location
Sokovia
System Name Alienation from family
Processor i7 7700k
Motherboard Hero VIII
Cooling Macho revB
Memory 16gb Hyperx
Video Card(s) Asus 1080ti Strix OC
Storage 960evo 500gb
Display(s) AOC 4k
Case Define R2 XL
Power Supply Be f*ing Quiet 600W M Gold
Mouse NoName
Keyboard NoNameless HP
Software You have nothing on me
Benchmark Scores Personal record 100m sprint: 60m
I was shocked that the 128 MB Ti 4200 was slower than the 64 MB one back in 2002.
 