
NVIDIA GeForce RTX 4070 Ti Launched at $799 with Performance Matching RTX 3090 Ti

Joined
Dec 24, 2008
Messages
2,062 (0.35/day)
Location
Volos, Greece
System Name ATLAS
Processor Intel Core i7-4770 (4C/8T) Haswell
Motherboard GA-Z87X-UD5H, Dual Intel LAN, 10x SATA, 16x power phase.
Cooling ProlimaTech Armageddon - Dual GELID 140 Silent PWM
Memory Mushkin Blackline DDR3 2400 997123F 16GB
Video Card(s) MSI GTX1060 OC 6GB (single fan) Micron
Storage WD Raptors 73Gb - Raid1 10.000rpm
Display(s) DELL U2311H
Case HEC Compucase CI-6919 Full tower (2003) moded .. hec-group.com.tw
Audio Device(s) Creative X-Fi Music + mods, Audigy front Panel - YAMAHA quad speakers with Sub.
Power Supply HPU-4M780-PE refurbished 23-3-2022
Mouse MS Pro IntelliMouse 16.000 Dpi Pixart Paw 3389
Keyboard Microsoft Wired 600
Software Win 7 Pro x64 ( Retail Box ) for EU
design, and you get DOUBLE the ram bandwidth compared to the RTX 4070 Ti. That's how crap the 4070 Ti is. Terrible design. It's '60 class.

but 2.75 1080 Tis would draw about 300 W more than a single 4090

This is the only thing worth talking about today: power consumption, RTX 3000 vs RTX 4000.
RTX 3000 failed to deliver reasonable power consumption relative to its transistor count.
And many poor kids wasted piles of money on thermal pads and magic pills, and achieved nothing.
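
For anyone who wants to sanity-check the "double the RAM bandwidth" claim quoted above, here is a minimal back-of-the-envelope sketch. The 21 Gbps GDDR6X data rate is assumed for both cards purely for illustration; check the spec sheets for the exact figures:

```python
# Rough peak memory bandwidth in GB/s: (bus width in bits / 8) * per-pin data rate in Gbps
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

rtx_3090_ti = peak_bandwidth_gbs(384, 21.0)  # 384-bit bus -> ~1008 GB/s
rtx_4070_ti = peak_bandwidth_gbs(192, 21.0)  # 192-bit bus -> ~504 GB/s
print(rtx_3090_ti, rtx_4070_ti, rtx_3090_ti / rtx_4070_ti)  # ratio comes out to ~2x
```

At the same data rate, halving the bus width does halve the raw bandwidth; whether Ada's much larger L2 cache makes up for it is the counter-argument raised later in the thread.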
 
Joined
Jun 11, 2017
Messages
283 (0.10/day)
Location
Montreal Canada
I see, 6.3 times more transistors; 2.75 times more fps:

but 2.75 1080ti would take 300w more watts than a single 4090

Funny thing: two 1080 Tis in SLI give pretty much the same result as the new card today. It took them four years to compete with the greatness they already had but killed off.
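
A quick arithmetic sketch of the power comparison above, using nominal board-power figures (250 W for the 1080 Ti, 450 W for the 4090; both are assumptions, and measured gaming power will differ):

```python
# "2.75x 1080 Ti performance" scaled linearly in power vs a single 4090
GTX_1080_TI_TDP_W = 250   # assumed nominal board power
RTX_4090_TDP_W = 450      # assumed nominal board power

scaled_pascal_w = 2.75 * GTX_1080_TI_TDP_W   # ~690 W for 2.75x the performance
extra_w = scaled_pascal_w - RTX_4090_TDP_W   # ~240 W more than one 4090
print(f"{scaled_pascal_w:.0f} W vs {RTX_4090_TDP_W} W -> about {extra_w:.0f} W extra")
```

Depending on which power figures you plug in, the gap lands somewhere in the 230-300 W range, which is the point being made: the perf/W jump is real even if the exact numbers are debatable.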
 
  • Like
Reactions: Lei

Deleted member 185088

Guest
Another overpriced GPU; hopefully no one buys it. Fortunately that seems to be the case for all the new releases: the 4090, 4080 and 7900 are all available to buy at their actual retail price.
 
Joined
Oct 6, 2021
Messages
1,605 (1.37/day)
Ngreedia's profit margin tells a different story. They hit 65%, an all-time high, before the mining bust, and AMD is no different. It's greed, not the cost of dies, that's hiking the prices.
View attachment 277398
This comes from server and AI GPUs that sell for multiple times the price of mainstream consumer GPUs.

The current cost of GPUs comes from multiple factors. The main one is that advances in manufacturing-process density are getting slower, while cost per wafer practically doubles due to complexity. Don't expect it to get any cheaper going forward, with TSMC swimming ahead alone with no competition.

Second, the obsession with RT and high resolutions requires much larger GPUs with dedicated cores, plus all the additional cost of R&D, and both RT and high resolutions pull in massive amounts of (also increasingly expensive) video memory; think $300+ in memory alone on recent GPUs.
 
Joined
Mar 28, 2020
Messages
1,761 (1.02/day)
Die size is not the only thing determining fab costs; if that were the case, then we would see very different wafer prices across the industry, but that's just not the case. 5nm/4N is expensive AF, no way to shake that. The GTX 1080 was on 16 nm, and that process was dirt cheap relative to fab costs at <10 nm today. Fab costs have gone up exponentially as nodes have gotten smaller, and TSMC has been announcing price increases for the last couple of years, and just did so again recently.
I don't disagree that fab costs have increased exponentially over the years. But assuming costs have increased 2x, we are only looking at the cost of the die. In other words, if it cost $150 to manufacture the GA104 and it's now $300 for the AD104, I don't think that's enough to account for all the price increases. It's got more GDDR6X, but the cost of VRAM and other components has likely come down due to stable supply and low demand at this point. In my opinion, they are just raising prices, knowing that demand is going to be weak, so that they can maintain their level of profit by widening the margins.
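
To put a rough shape on the $150-vs-$300 guess above, here is a sketch using the classic dies-per-wafer approximation. The die areas are the published figures (GA104 ≈ 392 mm², AD104 ≈ 295 mm²), but the wafer prices and the yield are pure placeholder assumptions, not foundry quotes:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Classic approximation: usable wafer area minus an edge-loss correction."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(wafer_cost_usd: float, die_area_mm2: float, yield_rate: float) -> float:
    return wafer_cost_usd / (dies_per_wafer(die_area_mm2) * yield_rate)

# Placeholder wafer prices and yield, for illustration only
print(round(cost_per_good_die(6_000, 392, 0.8)))    # GA104-class die on a ~$6k wafer  -> ~$51
print(round(cost_per_good_die(16_000, 295, 0.8)))   # AD104-class die on a ~$16k wafer -> ~$100
```

Even with the newer wafer assumed to cost well over twice as much, the per-die delta under these made-up numbers is on the order of $50-100, which is exactly the argument being made: the die alone does not explain a several-hundred-dollar jump in MSRP.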

This is the only thing worth talking about today: power consumption, RTX 3000 vs RTX 4000.
RTX 3000 failed to deliver reasonable power consumption relative to its transistor count.
And many poor kids wasted piles of money on thermal pads and magic pills, and achieved nothing.
Power consumption is clearly an advantage for Ada. All retail Ampere chips were built on Samsung's 10 nm, even though they call it 8 nm. Looking at the power characteristics of the Samsung vs TSMC nodes, the latter clearly has a significant advantage. This is very apparent in the mobile SoC space; Qualcomm's recent move from Samsung to TSMC, with the Snapdragon 8 Gen 1 vs 8+ Gen 1, is a good example. So moving from Samsung's 10 nm to TSMC's 4 nm is a good two-node upgrade or more.

Also, thermal pads are meant for the VRAM, not for the core. At least in my case, changing the thermal pads clearly brought the memory hotspot temps down significantly.
 
Joined
May 3, 2018
Messages
2,881 (1.19/day)
Just look at the pure raster numbers of this rip-off. It's not faster than the 3090. Without DLSS to save its arse it's an utter joke. It's a $599 pig dressed up with lipstick for $799. Even if it had a 256-bit bus and 16 GB, just with a lot fewer CUDA cores than the 4080, it shouldn't be more than $699.
 
Joined
Jan 20, 2019
Messages
1,593 (0.74/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
It amazes me how quickly people jump to justify each and every NVIDIA price point. It doesn't matter what the 4070 Ti goes for (MSRP), you will always get people jumping on the YES-MERCHANT bandwagon and fighting tooth and nail for these holy Nvidia revelations. If I were a shareholder or investor it would make sense, but as a "consumer", yep, just a barebones "consumer", it's difficult to digest the mid-range XX70 segment being hijacked for XXX-profiteering. Not that I'm surprised... seeing the exorbitant 4080 hitting the shelves, no doubt the 4070 was always destined to rip holes in the general consumer's wallet (a pretty large portion of the consumer base). What is also inevitable is seeing the XX60/XX50 segments suffer equally from this greasy profit war machine, and no doubt AMD will sadly follow.

Personally, I might just pull the trigger on one, as I stated previously that I'm willing to fork out $800 for a GPU (not a dime more)... but I expected way more for this sort of money, and it's hardly an "exciting" buy for the money. I feel for the guys on a budget... most will probably have a fat enough budget that just doesn't cut it nowadays with this ridiculous post-pandemic pocket-pinch price scheming. I dunno, I might even give the 40 series/RDNA3 a miss... I'm just not feeling the "GPU" upgrade impulse nowadays (well, since the 30 series anyway). Closely trailing RDNA3 doesn't seem exciting either... all PANTS if you ask me.

Looks like most gamers looking for a spectacular eye-candy gaming experience + snappy frame rendering will have to settle for less. I'm glad I kept the impulse to move to 4K at bay... that would have sucked for me! 1440p it is for another 5 years or so, it seems.
 
Joined
Dec 25, 2020
Messages
7,023 (4.81/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
It's not bad at all; what is not to like? The only real complaint is the very fact that it's not 256-bit like it should be.

But take into account that the 3080 Ti, the 3090 and all the halo products were discounted to under $1K from their original $1,199-$1,999 MSRPs, and the 3080 12 GB went for as low as $750.

Clearly 48 MB of L2$ with a 192-bit bus is as effective as 384-bit with 6 MB, and the price has been slashed from the $999 RTX 3090 Ti 24 GB to a more reasonable $799, losing half the bus and half the memory while keeping the same ~40 TFLOPS.

The one to get is the 4070, the 5888-CUDA version, and if it delivers 3070 Ti + 10%, as good as that gets, I'll take it. 3x faster than my 980 Ti.

3x performance in 8 years with a relative increase in price is not an accomplishment. The GTX 980 Ti is an ancient graphics card at this point. If the market was anywhere close to healthy, you'd have $150 low-end graphics cards giving it a biblical spanking. But instead, anything below $200 cannot beat it in performance, only in power consumption. That's just sad!

It amazes me how quickly people jump to justify each and every NVIDIA price point. It doesn't matter what the 4070 Ti goes for (MSRP), you will always get people jumping on the YES-MERCHANT bandwagon and fighting tooth and nail for these holy Nvidia revelations. If I were a shareholder or investor it would make sense, but as a "consumer", yep, just a barebones "consumer", it's difficult to digest the mid-range XX70 segment being hijacked for XXX-profiteering. Not that I'm surprised... seeing the exorbitant 4080 hitting the shelves, no doubt the 4070 was always destined to rip holes in the general consumer's wallet (a pretty large portion of the consumer base). What is also inevitable is seeing the XX60/XX50 segments suffer equally from this greasy profit war machine, and no doubt AMD will sadly follow.

Personally, I might just pull the trigger on one, as I stated previously that I'm willing to fork out $800 for a GPU (not a dime more)... but I expected way more for this sort of money, and it's hardly an "exciting" buy for the money. I feel for the guys on a budget... most will probably have a fat enough budget that just doesn't cut it nowadays with this ridiculous post-pandemic pocket-pinch price scheming. I dunno, I might even give the 40 series/RDNA3 a miss... I'm just not feeling the "GPU" upgrade impulse nowadays (well, since the 30 series anyway). Closely trailing RDNA3 doesn't seem exciting either... all PANTS if you ask me.

Looks like most gamers looking for a spectacular eye-candy gaming experience + snappy frame rendering will have to settle for less. I'm glad I kept the impulse to move to 4K at bay... that would have sucked for me! 1440p it is for another 5 years or so, it seems.

Agreed, though, I'm unsure I can call either of the Navi 31 duo "settling". They perform very well, even if AD102 is ahead of the curve. It will be some time until games cannot run well on that.
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
13,012 (2.49/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | LG 24" IPS 1440p
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Corsair K65 Plus 75% Wireless - USB Mode
Software Windows 11 Pro 64-Bit
The cable issue is due to the terrible card design. Instead of making a longer PCB with the connector on the far right, they did the opposite: a very short PCB with the connector in the middle.
If you look at your ATX case and its cable management, you will see that it is this design choice that led to the "cable issue".



RTX 4070 Ti is too slow for this market tier.
Look, if the RTX 4080 were within 5-10% of the RTX 4090, and the RTX 4070 Ti then within 5-10% of the RTX 4080, it would be better.
As it is, the RTX 4070 Ti will be at around 60% of the performance of the RTX 4090! Not OK.
Too slow? If it's anywhere close to RTX 3090 performance at $799, that ain't that bad. The RTX 3080 MSRP was $699 and the 4070 Ti will likely be faster.

12nm didn't cost that low :kookoo:
12nm wafer costs roughly the same as 16nm since they are mostly the same thing.
 
Joined
Dec 25, 2020
Messages
7,023 (4.81/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Too slow? If it's anywhere close to RTX 3090 performance at $799, that ain't that bad. The RTX 3080 MSRP was $699 and the 4070 Ti will likely be faster.


12nm wafer costs roughly the same as 16nm since they are mostly the same thing.

The primary problem is that performance should increase generationally, yet the price should not; otherwise it negates the generational leap almost entirely, since these GPUs have no major breakthrough features that differentiate them from the previous generation (e.g. Ampere already supports DirectX 12 Ultimate to its fullest, and the vast majority of software doesn't take advantage of its capabilities yet).

So overall we have NVIDIA's marketing team tryharding with ridiculous lies like "4070 Ti + DLSS = 3x 3090 Ti", as if that 8-month-old $2,000 GPU were already an ancient relic, all in an attempt to get unsuspecting buyers on board. It's beyond low, it's just pathetic. Not two years ago they were touting the RTX 3090 as an 8K-ready, next-generation product built for the future, and now they're calling it "mainstream gamer's hardware", a polite dig of "what are you, poor?", which to me just implies a medium-settings 1080p experience throughout; the 40 fps or so that my RTX 3090 manages in Portal RTX seems to confirm that, at least.

A 192-bit GPU with such a modest BoM, in this precise segment, should not be over $499.
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
13,012 (2.49/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | LG 24" IPS 1440p
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Corsair K65 Plus 75% Wireless - USB Mode
Software Windows 11 Pro 64-Bit
The primary problem is that performance should increase generationally, yet the price should not; otherwise it negates the generational leap almost entirely, since these GPUs have no major breakthrough features that differentiate them from the previous generation (e.g. Ampere already supports DirectX 12 Ultimate to its fullest, and the vast majority of software doesn't take advantage of its capabilities yet).

So overall we have NVIDIA's marketing team tryharding with ridiculous lies like "4070 Ti + DLSS = 3x 3090 Ti", as if that 8-month-old $2,000 GPU were already an ancient relic, all in an attempt to get unsuspecting buyers on board. It's beyond low, it's just pathetic. Not two years ago they were touting the RTX 3090 as an 8K-ready, next-generation product built for the future, and now they're calling it "mainstream gamer's hardware", a polite dig of "what are you, poor?", which to me just implies a medium-settings 1080p experience throughout; the 40 fps or so that my RTX 3090 manages in Portal RTX seems to confirm that, at least.

A 192-bit GPU with such a modest BoM, in this precise segment, should not be over $499.
I mean, if we are going to use a GPU's memory bus width as a metric for GPU pricing, sure... I guess.
 
Joined
Dec 25, 2020
Messages
7,023 (4.81/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
I mean, if we are going to use a GPU's memory bus width as a metric for GPU pricing, sure... I guess.

Bus width is, imho, a decent metric for a couple of reasons: G6X ICs are still relatively expensive, and the more of them you add, the more PCB complexity is involved, which also increases the BoM.

A good case study is the original RTX 3090: it initially carried hundreds of dollars in memory alone*, due to very high prices early on and needing 24 ICs, plus a PCB and power delivery system to match.

* The rumor was that it had roughly $600 of memory on it back then, even accounting for economies of scale. I don't know if this is true, but given that the chips used on it still fetch $24.50 each on the low-volume market, it may very well be.

The RTX 4070 Ti, in contrast, should use a much simpler design with only six 16 Gbit G6X ICs, and AD104 is a relatively small processor with a 250-280 W footprint, which doesn't require as advanced a VRM as AD103 or AD102.
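
As a rough illustration of how IC count drives the memory part of the BoM, here is a sketch that reuses the $24.50-per-chip spot price quoted in the footnote above. Applying the same price to the denser 16 Gbit parts on the 4070 Ti is an assumption made purely for comparison; actual contract pricing is not public:

```python
# Memory BoM comparison driven purely by the number of GDDR6X packages on the board
ASSUMED_PRICE_PER_G6X_IC_USD = 24.50   # spot price cited in the post above

rtx_3090_memory_cost = 24 * ASSUMED_PRICE_PER_G6X_IC_USD    # 24x 8 Gbit ICs  -> ~$588
rtx_4070_ti_memory_cost = 6 * ASSUMED_PRICE_PER_G6X_IC_USD  # 6x 16 Gbit ICs  -> ~$147
print(rtx_3090_memory_cost, rtx_4070_ti_memory_cost)
```

Fewer packages also means a simpler board layout and power delivery, which is the wider point about bus width acting as a proxy for board cost.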
 
Joined
Jun 21, 2013
Messages
608 (0.14/day)
Processor Ryzen 9 3900x
Motherboard MSI B550 Gaming Plus
Cooling be quiet! Dark Rock Pro 4
Memory 32GB GSkill Ripjaws V 3600CL16
Video Card(s) 3060Ti FE 0.9v
Storage Samsung 970 EVO 1TB, 2x Samsung 840 EVO 1TB
Display(s) ASUS ProArt PA278QV
Case be quiet! Pure Base 500
Audio Device(s) Edifier R1850DB
Power Supply Super Flower Leadex III 650W
Mouse A4Tech X-748K
Keyboard Logitech K300
Software Win 10 Pro 64bit
Waiting for 4000-series prices to fall is delusional; just look at current 3000-series prices. It's over. Cheap or affordable GPUs are a thing of the past.
Is it? GPU sales are at a 20-year low, so it's only a matter of time before the scalping corporations have to make significant price cuts just to free up storage space.
 
Joined
Sep 6, 2013
Messages
3,393 (0.82/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 7600 / Ryzen 5 4600G / Ryzen 5 5500
Motherboard X670E Gaming Plus WiFi / MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2)
Cooling Aigo ICE 400SE / Segotep T4 / Νoctua U12S
Memory Kingston FURY Beast 32GB DDR5 6000 / 16GB JUHOR / 32GB G.Skill RIPJAWS 3600 + Aegis 3200
Video Card(s) ASRock RX 6600 + GT 710 (PhysX) / Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes / NVMes, SATA Storage / NVMe, SATA, external storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) / 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
Is it? GPU sales are at a 20-year low, so it's only a matter of time before the scalping corporations have to make significant price cuts just to free up storage space.
No, he is right. Nvidia didn't drop prices on the cheaper 3000 models, only on the expensive ones. Nvidia dropped prices only on the 3090/Ti, and the tech press celebrated with "Nvidia dropped prices" articles, creating the misleading impression that prices dropped on all models. AMD did price drops across the line; the tech press relegated those to footnotes, while people kept buying Nvidia cards.

The environment is so toxic, so monopolistic, so favourable to Nvidia, that Nvidia, controlling 90% of the market, has no real reason to drop prices. Intel is still too far behind to be a factor, and AMD doesn't seem very excited about the GPU market; they are probably more concerned about AM5. Both Intel and AMD gain long term if Nvidia does all the necessary work of establishing higher prices in the retail GPU market. Also, I doubt OEMs pay these prices. Nvidia advertises the 4070 Ti at $800, but what does Dell, for example, pay to get one? $800? $700? $600? $500? Those high prices are beneficial for every company out there making gaming equipment, just not for consumers. MS and Sony can price their next-gen consoles higher, as can Valve and others building gaming handhelds; and Dell, to use that example again, can buy a 4070 Ti from Nvidia at, let's say, $600, price it at $700, and make its prebuilt PCs look like bargains, while of course AMD and Intel price their own offerings higher. They will sell less stuff, but they will achieve higher profit margins, and if TSMC doesn't have enough capacity for everyone, selling less stuff at higher margins is crucial. Nvidia also looks more at AI, robotics and the like today than at gaming; AMD is concentrating more on servers while it still has an advantage there; as for Intel, they build GPUs out of necessity, not to make gamers happy.

Cheap GPUs are dead. We have already witnessed the death of sub-$100 GPUs these last few years; for sub-$200 we only got the laptop-chip RX 6500 XT, then the beta-quality A380, and finally the abysmal insult that is the old-architecture GTX 1630. Under $300 we have only one good option, the RX 6600, which no one buys because it doesn't have an Nvidia logo on it. Instead they buy the RTX 3050. Go figure...

Nvidia dropping prices? Why?
 

zer0day777

New Member
Joined
Jan 4, 2023
Messages
23 (0.03/day)
sweet, exactly what I need, nothing more.


RTX 3080 VENTUS 3X PLUS 10G OC - cheapest 3080 in my country, price is exactly 1k EUR
lol, I'd need AAA 4K ray tracing, sorry. And I definitely mean that WITH sane pricing. Oh, what's that? nuVidia™ RTX™ can't even perform on the third iteration? pfffft

Facts: I've had a 1440p 75 Hz panel since the start of the pandemic, and I've been gaming at 1440p with an RDNA1 card because RTX gen 1 cost too much money in comparison ($500 floor minimum on the 2070 Super, aka the 2070 Ti; $410 for a much nicer-looking 5700 XT with the same performance, up to $170 cheaper than the custom AIB 2070 Supers I was looking at back then). So, to me, I'm going to need 4K ray tracing at affordable prices by this point, considering we are now up to RTX gen 3. nuVidia no longer has any excuse on this. It's even worse now that RDNA3 has shown up and, underwhelming though it is, still isn't that far behind in RT ultimately.

So basically, if it can't even do that, I've got ZERO reason to ever upgrade, because going by that logic I don't have to replace my monitor: the card is literally unable to do ray tracing at 1440p native, even to this day. Really, all the DLSS bull$%^& and FSR nonsense just tries to muddy the water (literally, lel) when the clear issue is PRICE TO PERFORMANCE. That's it. I don't care what s&%$ this company throws at the fan to trick and confuse people at this point. Jensen honestly bet (and lost his gamble, badly) on mining farms still having demand for scalped-af mining GPUs; that demand has vanished. So now they're literally stuck with gamers, whom they've been screwing for generations, while trying to justify this HORSE %$#&. IT WAS NEVER "INFLATION".

IT WAS LITERALLY ALWAYS THEIR PRICE/PROFIT MODEL. So bad that nobody purchased a 2080, because why tf would you when you could get a 1080 Ti instead for cheap. And that was a thousand dollars/euros at retail, which the 2080 Ti then bumped to $1,200. Think about the MSRP of the GTX 1070 or 1070 Ti. Well, then the 2070 Ti (aka 2070 Super) is $500, right? That's the point at which I decided the price didn't make sense and switched to AMD, which hasn't failed me yet (and was frankly an excellent mining card; it literally paid for itself with extra profit in 2020-2021). That was BEFORE the $600 RTX 3070 Ti, btw, which WAS BEFORE THE ALLEGED "INFLATION" EVEN STARTED HAPPENING. It's a SCAM.

So, no, Jensen can go F himself, and his shareholders too. Anyone who buys this card at anywhere near those prices is a simp, a moron, and a cuck of the highest calibre. It's like they are playing some alleyway scam on total idiots, where you show bad thing 1, and it sucks less than bad thing 2, so they con themselves into thinking bad thing 1 is "a better deal." No, they're both TERRIBLE deals. Seriously, I hopped off team green when RTX 2000 pricing got unbelievable, and as disappointing as RDNA3 has been, it's nowhere near the disaster that is Lovelace. I wouldn't even consider it without them literally halving the prices.
 

Joined
Dec 24, 2008
Messages
2,062 (0.35/day)
Location
Volos, Greece
System Name ATLAS
Processor Intel Core i7-4770 (4C/8T) Haswell
Motherboard GA-Z87X-UD5H, Dual Intel LAN, 10x SATA, 16x power phase.
Cooling ProlimaTech Armageddon - Dual GELID 140 Silent PWM
Memory Mushkin Blackline DDR3 2400 997123F 16GB
Video Card(s) MSI GTX1060 OC 6GB (single fan) Micron
Storage WD Raptors 73Gb - Raid1 10.000rpm
Display(s) DELL U2311H
Case HEC Compucase CI-6919 Full tower (2003) moded .. hec-group.com.tw
Audio Device(s) Creative X-Fi Music + mods, Audigy front Panel - YAMAHA quad speakers with Sub.
Power Supply HPU-4M780-PE refurbished 23-3-2022
Mouse MS Pro IntelliMouse 16.000 Dpi Pixart Paw 3389
Keyboard Microsoft Wired 600
Software Win 7 Pro x64 ( Retail Box ) for EU
I don't disagree that fab costs have increased exponentially over the years. But assuming costs have increased 2x, we are only looking at the cost of the die. In other words, if it cost $150 to manufacture the GA104 and it's now $300 for the AD104, I don't think that's enough to account for all the price increases. It's got more GDDR6X, but the cost of VRAM and other components has likely come down due to stable supply and low demand at this point. In my opinion, they are just raising prices, knowing that demand is going to be weak, so that they can maintain their level of profit by widening the margins.


Power consumption is clearly an advantage for Ada. All retail Ampere chips were built on Samsung's 10 nm, even though they call it 8 nm. Looking at the power characteristics of the Samsung vs TSMC nodes, the latter clearly has a significant advantage. This is very apparent in the mobile SoC space; Qualcomm's recent move from Samsung to TSMC, with the Snapdragon 8 Gen 1 vs 8+ Gen 1, is a good example. So moving from Samsung's 10 nm to TSMC's 4 nm is a good two-node upgrade or more.

Also, thermal pads are meant for the VRAM, not for the core. At least in my case, changing the thermal pads clearly brought the memory hotspot temps down significantly.

This is the point you've got wrong: it's true that the pads serve the VRAM, but adequate VRAM cooling also removes a portion of the heat generated by the GPU.
It was 100% NVIDIA's responsibility to think first and improve the electronics design, instead of regular people becoming beta testers and searching for solutions on their own.
NVIDIA is the one that got your money, and you should receive a trouble-free product.

My advice to all: just be more careful about your choices from now on.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.61/day)
Location
Ex-usa | slava the trolls
I don't disagree that fab costs have increased

It is exaggerated. Fab costs can actually be lower than in years past thanks to more efficient technologies and optimisations: independent power supply from their own photovoltaics, more energy-efficient buildings and equipment, lower employee overhead, and other cost reductions.

There is no scientific proof that the newest technology today is more expensive than the newest technology of 5 or 10 years ago.
It could be fake news, speculation and lies meant to justify the profit margins, the private jets and the yachts for top management and stockholders.
 

zer0day777

New Member
Joined
Jan 4, 2023
Messages
23 (0.03/day)
No, he is right. Nvidia didn't drop prices on the cheaper 3000 models, only on the expensive ones. Nvidia dropped prices only on the 3090/Ti, and the tech press celebrated with "Nvidia dropped prices" articles, creating the misleading impression that prices dropped on all models. AMD did price drops across the line; the tech press relegated those to footnotes, while people kept buying Nvidia cards.

The environment is so toxic, so monopolistic, so favourable to Nvidia, that Nvidia, controlling 90% of the market, has no real reason to drop prices. Intel is still too far behind to be a factor, and AMD doesn't seem very excited about the GPU market; they are probably more concerned about AM5. Both Intel and AMD gain long term if Nvidia does all the necessary work of establishing higher prices in the retail GPU market. Also, I doubt OEMs pay these prices. Nvidia advertises the 4070 Ti at $800, but what does Dell, for example, pay to get one? $800? $700? $600? $500? Those high prices are beneficial for every company out there making gaming equipment, just not for consumers. MS and Sony can price their next-gen consoles higher, as can Valve and others building gaming handhelds; and Dell, to use that example again, can buy a 4070 Ti from Nvidia at, let's say, $600, price it at $700, and make its prebuilt PCs look like bargains, while of course AMD and Intel price their own offerings higher. They will sell less stuff, but they will achieve higher profit margins, and if TSMC doesn't have enough capacity for everyone, selling less stuff at higher margins is crucial. Nvidia also looks more at AI, robotics and the like today than at gaming; AMD is concentrating more on servers while it still has an advantage there; as for Intel, they build GPUs out of necessity, not to make gamers happy.

Cheap GPUs are dead. We have already witnessed the death of sub-$100 GPUs these last few years; for sub-$200 we only got the laptop-chip RX 6500 XT, then the beta-quality A380, and finally the abysmal insult that is the old-architecture GTX 1630. Under $300 we have only one good option, the RX 6600, which no one buys because it doesn't have an Nvidia logo on it. Instead they buy the RTX 3050. Go figure...

Nvidia dropping prices? Why?
The thing is, though, they don't; people just keep trying to claim everyone is buying those. Partly this is because most people buy prebuilts, and nVidia still hasn't managed to F up their contracts (yet) with all the SIs and major brick-and-mortar stores, so every piece-of-crap prebuilt has a 1060 or 1650 or 3060 or something like that in it, no matter how bad the hardware actually is or how outrageous the price. Like, tf you think I'd pay that much for a 1650 Max-Q? Hell, Craigslist is hysterical, or sad, not sure which: I saw somebody trying to sell, no joke, an RX 460 2 GB system "bought new from Best Buy, never opened" with some other garbage hardware for $700. On the same page, someone was selling a used 2080 Ti for under $400.

I think it's that the megacorpos just love ignorance, hell Capitalistic excess generally favours NPCs, the ignorant, the impulsive, the most childlike creature imaginable to be useless at anything than being a consumer drone, a cheap biorobot worker, and expendable canon fodder (the rest gets shipped to private schools to be the middle-management biorobots). So it's even more obvious to us in tech where we clearly can see the ripoff and are used to working with numbers, and these scumbag dirtbag corporations just pull literally the exact same sort of scammer from Mumbai "hello your PC is broken we need your Social Security number to unlock it from virus" crap on old people, kids, and NORPs that don't know better. And they try making it sound like computers are "really complicated" and they are not, you can easily teach yourself how to not get swindled, it's just they hold back some of the info and try playing 3 card shell games, like calling it "i7 with 8gb RAM" yeah single stick, not dual rank, lowest speed dustbin RAM with an ancient low-tier 9700 or something.

I'm so outraged by nuVidia I'm just not buying anymore. They became nuVidia at RTX, frankly; it's embarrassing. Like, they had their problems and have always been a scummy, scammy company, from the alleged GTX 1060 "3GB model" to the 3.5 GB GTX 970, but at least Pascal and Maxwell were really good. They didn't even age poorly, no matter how much nuVidia wishes they did. This is because their dumb crap usually doesn't take off, since no one wants to deal with them and their proprietary nonsense; nobody uses HairWorks, for example; can you imagine selling 1080 Tis purely on "it does HairWorks and AMD cannot"?? They try forcing the market to obsolescence more quickly, but really, RTX can't even run RT natively, and their bottom-end SKUs can't even manage acceptable frames with the smudgiest DLSS on. The alleged "RTX" 3050 should be a GTX 3050. But the real problem is lots and lots of morons too braindead to understand why a 3050 is a horrific deal at $400; it's literally slapped to s&%$ and thrown in a dumpster by the 5700 XT, even the 5600 XT iirc.
I mean, if we are going to use a GPU's memory bus width as a metric for GPU pricing, sure... I guess.
I think the bigger deal is the fail across every conceivable metric: pricing, performance, power consumption, hell, even memory bus and anything else you can think of. There's no reason to buy a Lovelace GPU at all. You'd have to be a moron, or so completely misinformed that you already spent your money like a simp and only found this thread in like 2025. They SUCK. I think the last time nVidia released a generation this terrible was the GTX 280? Wasn't that when they charged $650 for a card that AMD's top-end HD something beat at $400, so they had to lower prices?

That's the problem: they not only refused to lower prices, they jacked them up. I saw those TDPs long ago and thought, "alright, AMD basically needs to not screw up, or nVidia needs to lower prices, because I'm not paying that for something so inefficient I need a new PSU." Lo and behold, they not only jacked prices, they jacked them outrageously, to literally scalper levels.

Stop giving nuVidia your money.
Too slow? If it's anywhere close to RTX 3090 performance at $799, that ain't that bad. The RTX 3080 MSRP was $699 and the 4070 Ti will likely be faster.


12nm wafer costs roughly the same as 16nm since they are mostly the same thing.
Like this: why are any of you even rationalizing this? It's like watching heroin addicts and alcoholics trying to justify why dying in a gutter is a great life decision. Also, the 3090 wasn't much better than a 3080, facts. Lately, every nVidia generation has felt like paying $500 for a 2.5% uplift. That's literally within the margin of error, where you can get that kind of performance just by cleaning the dust out of your GPU and repasting, or getting a different AIB model or something.

You know the funniest thing to me? They've also released such a terrible last few generations that not only have people been having all kinds of performance problems due to dumb s--- like driver bloat, but even just browsing Steam it's constantly "how come my 3080 gets such low frames", "hello, why is my 3070 getting this problem", and you go through them and quickly realize people are like "I have an RX 6800 and it just werks for me."
It amazes me how quickly people jump to justify each and every NVIDIA price point. It doesn't matter what the 4070 Ti goes for (MSRP), you will always get people jumping on the YES-MERCHANT bandwagon and fighting tooth and nail for these holy Nvidia revelations. If I were a shareholder or investor it would make sense, but as a "consumer", yep, just a barebones "consumer", it's difficult to digest the mid-range XX70 segment being hijacked for XXX-profiteering. Not that I'm surprised... seeing the exorbitant 4080 hitting the shelves, no doubt the 4070 was always destined to rip holes in the general consumer's wallet (a pretty large portion of the consumer base). What is also inevitable is seeing the XX60/XX50 segments suffer equally from this greasy profit war machine, and no doubt AMD will sadly follow.

Personally, I might just pull the trigger on one, as I stated previously that I'm willing to fork out $800 for a GPU (not a dime more)... but I expected way more for this sort of money, and it's hardly an "exciting" buy for the money. I feel for the guys on a budget... most will probably have a fat enough budget that just doesn't cut it nowadays with this ridiculous post-pandemic pocket-pinch price scheming. I dunno, I might even give the 40 series/RDNA3 a miss... I'm just not feeling the "GPU" upgrade impulse nowadays (well, since the 30 series anyway). Closely trailing RDNA3 doesn't seem exciting either... all PANTS if you ask me.

Looks like most gamers looking for a spectacular eye-candy gaming experience + snappy frame rendering will have to settle for less. I'm glad I kept the impulse to move to 4K at bay... that would have sucked for me! 1440p it is for another 5 years or so, it seems.
lmao
"I will pay the merchant $1299, but not a dime more!"
I wouldn't be willing to pay $600 for a 4070 super, period. I literally skipped the RTX 2070 Super based on it being $500 minimum, EVGA models $570+, while Radeon released comparably performing $400 cards. "But it doesn't have ray tracing!" So tell me, was your DLSS 1.0 RT experience on that card really worth the extra $200? And it mined worse too, so it cost less on eBay even during the scalping.

That's what you are asking me to do: make my next upgrade a card in a segment where $500 was already getting a bit steep for my target, and at $600 I'm starting to expect xx80 performance. Considering the GTX 980 was $550, not even all that long ago, and was a MUCH better made GPU, I don't find it terribly unreasonable to say "I'm not spending a dime more than $600 on a 70 Ti custom card, and that's firm." And that's also asking me to do it for a generation that, let's face it, just plain sucks.
We all know it. It's literally a MEME. Like, an honest-to-God joke GPU. It's a clowncarPU. And I can't even fit the stupid things in my case anyway; they take up 4 freaking slots, so goodbye to literally anything else (yes, I do use those x1 and x8 slots, btw), and they require a completely monstrous PSU, which suddenly puts my budget way outside the range of a thousand dollars. Because think about it:
as the average gamer, you have, what, like $500 for a GPU? Well, let's think of it this way: you've got a new monitor in store too, so let's double that budget. You aren't going to magically have $500 appear from thin air in your account, but let's imagine you do. $1,000 to play games at better visual quality. So let's say $350 for the monitor; that's $650 for the GPU, and now how much is that PSU going to cost? So suddenly you've dropped from being able to afford 3080-class quality to 3060 Ti-class at best. Meanwhile, for the same money you can get 6800 XT-class quality, because you don't have to get a bigger PSU.
That's why Lovelace is such a bad deal.

I mean really it's just s#@$ from every conceivable angle. The sole thing nuVidia accomplished was padding shareholder pockets and getting simps to cheer on their own robberies "yeah but it benefits shareholders so it's so smart!" Oh, and allegedly better performance, but when you slam that much additional power and it costs more, you really didn't advance at all, just expanded the range of halo products into more luxury price halo products. You could LN2 a 3080 and do the same thing more or less.

Just look at the pure raster numbers of this rip-off. It's not faster than the 3090. Without DLSS to save its arse it's an utter joke. It's a $599 pig dressed up with lipstick for $799. Even if it had a 256-bit bus and 16 GB, just with a lot fewer CUDA cores than the 4080, it shouldn't be more than $699.
This.

But again, this is part of a broader problem. It's like, if a guy is a heroin addict, he's stealing from people, stealing from friends, has a big criminal record, are we really going to spend that much time arguing how he ripped off grandma? It isn't even that he stole grandma's checkbook at that point. It's that Jensen has simply become so freaking brazen about it he's all but throwing a towel on the floor of jail cell "pick up my towel for me, bish." Jensen is openly daring you all to bend over, because he feels like he pushed up on you so much he can do literally anything to you he wants and you'll love him for it.

This is the point you've got wrong: it's true that the pads serve the VRAM, but adequate VRAM cooling also removes a portion of the heat generated by the GPU.
It was 100% NVIDIA's responsibility to think first and improve the electronics design, instead of regular people becoming beta testers and searching for solutions on their own.
NVIDIA is the one that got your money, and you should receive a trouble-free product.

My advice to all: just be more careful about your choices from now on.
Man...lol that's the problem, and again I cannot emphasize enough these kind of things is why I hopped off nVidia's dik years ago and got an RDNA1 card instead. People gave the 5700XT so much shit (it deserved it on release, but became a dead horse meme as people realized it got fixed and was one of the best value cards of the pandemic) but really, that's a ~$400 card and meanwhile crickets when you bring up the nonstop issues of a $1200 GPU like the 2080ti. I mean, if I spent over a grand on a GPU, you damn well better work right from the box. Instead, somehow the 2080ti never got the meme status s%$* it rightfully deserved for being 5700XT release tier...and costing over a thousand dollars.

But it didn't end there, did it. Ampere was the same thing, all kinds of problems plagued that series of cards, right up to their halo products bursting into flames. 3090, not 4090, which also burst into flames apparently. POSCAPs was a big issue, they had bad drivers, I remember they actually downclocked the 3080 so hilariously became the anti-finewine of performing worse with drivers over time, because they couldn't stop its crashing otherwise.

It happens because nobody wants to hold nVidia accountable, and you'll notice the actual businessmen who make money not just play with toys all dropped nVidia. Apple, EVGA, Sony, Microsoft, nobody really wants to work with that company and this includes gamedevs. So I think it's even funnier thinking about this and realizing all that AAA gaming is being finetuned on AMD-only hardware, Xbox and PS5 take Ryzen processors and RDNA2 graphics cards, so I'm not sure what people are thinking buying these and trying to claim some AAA gimmick.
3x performance in 8 years with a relative increase in price is not an accomplishment. The GTX 980 Ti is an ancient graphics card at this point. If the market was anywhere close to healthy, you'd have $150 low-end graphics cards giving it a biblical spanking. But instead, anything below $200 cannot beat it in performance, only in power consumption. That's just sad!



Agreed, though, I'm unsure I can call either of the Navi 31 duo "settling". They perform very well, even if AD102 is ahead of the curve. It will be some time until games cannot run well on that.
Yeah see this guy also gets it, if you actually remembered the 1gb graphics cards being "huge" because we finally hit the gigabyte mark on VRAM...well, not really impressive lately per performance. We had something like this sorta with Fermi, but not even Fermi was anywhere near these TDPs and the 590 was like 365w. Meanwhile we have such pathetic things like GTX 1630 pricing, companies realizing they can sell literally anything and people will buy it. The same crap happened with scalping, because the mining farms set the new standard because it was a direct economic calculation of how much money can I make and how long will it take me to turn my investment from net loss into a monetary gain, that's why they cost like $2000 because you could still turn a profit at that mark. Meanwhile the occasional moron gamer would buy a GPU at these prices to play games on. But when you go "all the way back" to like 10 years ago, a GTX 980ti was a beast at 240w or whatever it was.

If you remember then you're also mentally comparing any card today to the GTX 980 for $550, both in terms of uplift as well as pricing. Same goes for GTX 980ti and GTX 1080ti, the two generations nVidia was undisputably good, before the dark times, before RTX. Every single year since then they've jacked prices and delivered far less. They've had even worse standards and all kinds of problems everywhere, poorer physical products (no $500+ card should have a plastic backplate, the 900 series was all metal backplates though to be fair, they started using more plastic shrouds there was a time when it was all metal) numerous bugs, I mean really nuVidia has zero room to try and poke fun at AMD when bugs-wise nuVidia's been a complete disaster. And it wouldn't even be such a big deal were it not the fact they're charging these ridiculously outrageous prices for a throttling inferno. Like, I can't even imagine paying $800 for a 70ti. That's insane. So, a 4070 is NOT in my plans, period. But the problem is AMD has no reason to set a 7700XT at "normal" $500> if people are stupid enough to buy an nVidia card. So it's like the same problem as people rewarding the scalpers, only now nVidia is the scalper.
Would I publicly shame a man for giving his money to a scalper?
Yes I would.
 


W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,970 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Everything here gets a 5/5, no matter if it underdelivered on performance, efficiency, affordability and cooling, all at the same time. Look at the RX 7900 XT review. I would remove the star rating entirely (for past reviews too), because it makes TPU look like sponsored influencers.

We removed numeric scores a long time ago, because people were crazy about the number and the drama disrupted all other discussion.

I guess you mean the rating stars in Google search results? Google awards the ability to display those stars to sites that it considers authoritative on the topic. The stars make the result stand out more, so more clicks. More clicks = higher placement in the search results. Unfortunately, a majority of visitors (I polled people on this!) think that the stars are a rating for the article, not for the tested product.

Removing the stars would lower the click rates, which would push us down in the search results, and people would no longer find our reviews; then why even write reviews?
 

zer0day777

New Member
Joined
Jan 4, 2023
Messages
23 (0.03/day)
With 27% VAT, the $800 MSRP becomes about 1,020 USD (VAT only), up to 1,300 USD (with a bonus for the "design" or for energy prices, lol) depending on how much retailers put on top, so for the 4070 Ti that still counts as damn cheap here (though it's not).
The cheapest 3070 Ti is 790 USD (with VAT); you can't get a 3080 below 1,060 USD (with VAT, and the average price is still over 1,160 USD); the cheapest 3080 Ti is 1,670 USD (with VAT, average 1,850); the best 3090 deal of the month is 1,965 USD (with VAT, but it's a single card, and once that's gone the next average minimum is 2,500 USD with VAT. GG Hungary, look how nice our retailers are). And there are no new 3090 Tis.

Most 4080s sit at 1,560-1,690 USD (with VAT); the cheapest 4090 is 2,100 but the average is 2,600 USD (with VAT).
The 7900 XT, oh boy, goes for 1,170-1,270 USD with VAT, and the AIB design models go for 1,320-1,450 USD; the 7900 XTX is 1,530 (standard only) to 1,800 (feel the premium; just feel it, you can't see it).

So with that performance, I can finally see it as an actual long-overdue buy at that price, with 3090 performance; still, it could be cheaper for an XX70 card, like 600-650 + VAT. I use a 970, and in 2021 I wanted to upgrade to a 3070 Ti or a 3080, HAHAHA lol; still using my 970.
The funniest thing, the only reason to even buy a card today is your old Maxwell/Pascal broke, or because you're getting a 4k or 1440p monitor.
Seriously, that's it. And they can shill this RT b.s. all they want but AMD has RT too so nuVidia's not special, it's just another HairWorks far as I (or a lot of other gamers) are concerned. Games like Battlefleet Gothic Armada II and The Ascent look very pretty regardless, because Unreal is a pretty engine. nVidia would try calling those particle effects "PTXtm ParticleCorestm" or something and try comparing how well their card does that compared to AMD.
Meanwhile in all reality, they're more or less equivalent brands in performance, and the real reason to upgrade is going higher resolution because lots of games also aren't super demanding and it still feels like stagnation in the gaming industry too.

I only even wanted a new card because I was thinking about 4k144hz, and frankly that's what EVERYONE wants. That's why it's so nuts they went with the stupid displayport gimping because 4k144 is quite literally the new normative standard. It's solely about whoever can do 4k144 native better and at a better cost. That's the metric. Nobody is buying an RTX 4000 card because of raytracing or whatever. So basically, if you're still using a 1080p panel, or 1440p75 or anything below 1080p240 you have zero reason to upgrade. 970/980/1060 is more or less what every game needs today anyway.

It's not bad at all; what is not to like? The only real complaint is the very fact that it's not 256-bit like it should be.

But take into account that the 3080 Ti, the 3090 and all the halo products were discounted to under $1K from their original $1,199-$1,999 MSRPs, and the 3080 12 GB went for as low as $750.

Clearly 48 MB of L2$ with a 192-bit bus is as effective as 384-bit with 6 MB, and the price has been slashed from the $999 RTX 3090 Ti 24 GB to a more reasonable $799, losing half the bus and half the memory while keeping the same ~40 TFLOPS.

The one to get is the 4070, the 5888-CUDA version, and if it delivers 3070 Ti + 10%, as good as that gets, I'll take it. 3x faster than my 980 Ti.
Jesus do you guys really not get this
>as good as it gets
Maybe that's why, you have zero standards. How someone has a 980ti and thinks like this is beyond me.

Yes of course it's cheaper and discounted it's 2 freaking years old! Like, why tf would you expect it to still cost that much? And it's not "as low as" the f'ing things cost $700 at retail brand new at the end of 2020. It's 2023 now. If your 980ti cost "only" $900 today, would you call that a great deal?

>to a more reasonable
No it's not, because that TDP isn't being slashed, and my PSU isn't magically going to do 1 kilowatt. So it's altogether a much worse deal buying powerhog halo products. I'd consider 1080ti or 980ti used but that's more because they're really efficient and just noice cards with a special place in our hearts. Like, imagine GTX 590 performance at over 300w. That's why there's a sort of balance to older and used, because it reaches a certain limit in inefficiency, and the problem with Lovelace and Ampere they're literally the worst inefficient af graphics cards since Fermi, in fact it's literally worse than Fermi. So anyone buying these cards is also going to have a much harder time offloading them on the used market. Like the kind of person that has a 600w PSU and is gaming on a R5 1600 looking for used parts wouldn't be looking at a used 4090 or 3090ti.
 
Joined
Sep 1, 2022
Messages
25 (0.03/day)
Processor i5 4570
Motherboard Asus h81m-p
Memory G.skill ddr3 1600 2x4 gb
Video Card(s) MSI RX 470 4gb
Storage Crucial MX 500 500gb ssd , WD 500 gb hdd
Display(s) LG m237wdp
Audio Device(s) Microlab FC 330/Realtek 887
Power Supply Be quiet 400w
Mouse Logitech g203
Keyboard Logitech wireless
Everything here gets a 5/5, no matter if it underdelivered on performance, efficiency, affordability and cooling, all at the same time. Look at the RX 7900 XT review. I would remove the star rating entirely (for past reviews too), because it makes TPU look like sponsored influencers.
I browse the TPU forum only. Take a close look at the reviews and news sections ("leading manufacturer of..." and the like, as if every tech company were a god-saviour). TPU is the most anti-consumer website out there: just buy it, it's the best, and shut up. Also, many posts on the forum are fake and promote buying overpriced stuff. Once I saw a guy who wanted to cancel a 4080 order because of the backlash from other users... and guess what? This W1zzard advised him to buy that scam product anyway. Don't trust anything you read here, especially the reviews!
 
Joined
Dec 28, 2012
Messages
3,956 (0.90/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
The funniest thing, the only reason to even buy a card today is your old Maxwell/Pascal broke, or because you're getting a 4k or 1440p monitor.
Seriously, that's it. And they can shill this RT b.s. all they want but AMD has RT too so nuVidia's not special, it's just another HairWorks far as I (or a lot of other gamers) are concerned. Games like Battlefleet Gothic Armada II and The Ascent look very pretty regardless, because Unreal is a pretty engine. nVidia would try calling those particle effects "PTXtm ParticleCorestm" or something and try comparing how well their card does that compared to AMD.
Meanwhile in all reality, they're more or less equivalent brands in performance, and the real reason to upgrade is going higher resolution because lots of games also aren't super demanding and it still feels like stagnation in the gaming industry too.

I only even wanted a new card because I was thinking about 4k144hz, and frankly that's what EVERYONE wants. That's why it's so nuts they went with the stupid displayport gimping because 4k144 is quite literally the new normative standard. It's solely about whoever can do 4k144 native better and at a better cost. That's the metric. Nobody is buying an RTX 4000 card because of raytracing or whatever. So basically, if you're still using a 1080p panel, or 1440p75 or anything below 1080p240 you have zero reason to upgrade. 970/980/1060 is more or less what every game needs today anyway.


Jesus do you guys really not get this
>as good as it gets
Maybe that's why, you have zero standards. How someone has a 980ti and thinks like this is beyond me.

Yes of course it's cheaper and discounted it's 2 freaking years old! Like, why tf would you expect it to still cost that much? And it's not "as low as" the f'ing things cost $700 at retail brand new at the end of 2020. It's 2023 now. If your 980ti cost "only" $900 today, would you call that a great deal?

>to a more reasonable
No it's not, because that TDP isn't being slashed, and my PSU isn't magically going to do 1 kilowatt. So it's altogether a much worse deal buying powerhog halo products. I'd consider 1080ti or 980ti used but that's more because they're really efficient and just noice cards with a special place in our hearts. Like, imagine GTX 590 performance at over 300w. That's why there's a sort of balance to older and used, because it reaches a certain limit in inefficiency, and the problem with Lovelace and Ampere they're literally the worst inefficient af graphics cards since Fermi, in fact it's literally worse than Fermi. So anyone buying these cards is also going to have a much harder time offloading them on the used market. Like the kind of person that has a 600w PSU and is gaming on a R5 1600 looking for used parts wouldn't be looking at a used 4090 or 3090ti.
Bruh, chill out LMFAO. It's just a GPU; why do you have to be mad?

If power usage is that big a deal to you, you shouldn't be buying $500+ GPUs in the first place.

And LOL at thinking the ONLY reason you'd replace a Maxwell card is if it broke or you were going 4K. Bud, Maxwell were great GPUs, but it's 8 years old now; games have moved on.
 
Joined
Jun 3, 2012
Messages
1,954 (0.43/day)
Location
Denmark
Processor Ryzen 7 7700
Motherboard Asrock B650 PG LIgtning
Cooling artic freezer 36
Memory G.Skill Flare X5 DDR5-6000 - 32GB - CL30
Video Card(s) ASUS Dual GeForce RTX 4070 EVO
Storage 1x2tb KC3000 & 2tb samsung 970 evo plus, 2 x 2 tb external usb harddrives
Display(s) LG 32GP850, IIyama G2470HSU-B1
Case Corsair 5000D airflow tg
Audio Device(s) Marantz PM 6007 Triangle Esprit Titus Ez
Power Supply Corsair RM850X White
Mouse Logitech G PRO Superlight
Keyboard Corsair K70 RGB TKL Champion
Software Windows 11 64 bit, Free bitdefender
Joined
Sep 1, 2009
Messages
1,237 (0.22/day)
Location
CO
System Name 4k
Processor AMD 5800x3D
Motherboard MSI MAG b550m Mortar Wifi
Cooling ARCTIC Liquid Freezer II 240
Memory 4x8Gb Crucial Ballistix 3600 CL16 bl8g36c16u4b.m8fe1
Video Card(s) Nvidia Reference 3080Ti
Storage ADATA XPG SX8200 Pro 1TB
Display(s) LG 48" C1
Case CORSAIR Carbide AIR 240 Micro-ATX
Audio Device(s) Asus Xonar STX
Power Supply EVGA SuperNOVA 650W
Software Microsoft Windows10 Pro x64
Why? It's faster and much cheaper than an RTX 3090 Ti.
You can't compare the price of a halo product to a **70 Ti. Halo products have their own pricing, which is usually unreasonable because it's the best. Just because it comes in as fast or slower for cheaper does not mean it's good value.
 