
ASUS GeForce RTX 4070 Ti Super TUF

Joined
Nov 26, 2021
Messages
1,641 (1.51/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
NOT!
Leaks pointed to near-4080 performance.
It seems the leaks were launched by NVIDIA themselves to overhype fans.
The 4070 Ti Super is barely any faster than the non-Super card.

I maintain my statement: underwhelming refresh.
Those leaks were always unbelievable. We have known for some time that the 4070 Ti Super would have 66 SMs compared to the 76 SMs of the 4080. In other words, the 4080 has 15% more SMs than this SKU. All Ada cards have clocked in the same range so the difference in SMs would have to be made up by a clock speed boost of 10% or so which would have been unprecedented.
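A quick sanity check of that arithmetic, under the simplifying assumption that performance scales linearly with SM count times clock speed:

```python
# Sanity check of the SM math above.
# Simplifying assumption: performance ~ SM count x clock speed.
sm_4080 = 76
sm_4070_ti_super = 66

sm_ratio = sm_4080 / sm_4070_ti_super
print(f"4080 has {sm_ratio - 1:.1%} more SMs")            # ~15.2%

# Clock uplift the 4070 Ti Super would need for full 4080 parity:
print(f"Clock boost needed for parity: ~{sm_ratio - 1:.0%}")
```

Fully closing the gap would actually take roughly 15% more clock, so even the ~10% figure mentioned above would have been out of line with every other Ada SKU.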
 
Joined
Jan 29, 2021
Messages
1,843 (1.33/day)
Location
Alaska USA
Those leaks were always unbelievable. We have known for some time that the 4070 Ti Super would have 66 SMs compared to the 76 SMs of the 4080. In other words, the 4080 has 15% more SMs than this SKU. All Ada cards have clocked in the same range so the difference in SMs would have to be made up by a clock speed boost of 10% or so which would have been unprecedented.
I hate to admit it but those leaks had me thinking this card was going to be a big hit come review time.
 
Joined
Nov 26, 2021
Messages
1,641 (1.51/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
I hate to admit it but those leaks had me thinking this card was going to be a big hit come review time.
It has happened to most of us at one time or another. Nowadays, I just wait for reviews before forming my own opinion. However, the leaks were right about the specifications; they just overestimated the performance gains.
 
Last edited:
Joined
Sep 20, 2021
Messages
431 (0.37/day)
Processor Ryzen 7 9700x
Motherboard Asrock B650E PG Riptide WiFi
Cooling Underfloor CPU cooling
Memory 2x32GB 6200MT/s
Video Card(s) 4080 SUPER Noctua OC Edition
Storage Kingston Fury Renegade 1TB, Seagate Exos 12TB
Display(s) MSI Optix MAG301RF 2560x1080@200Hz
Case Phanteks Enthoo Pro
Power Supply NZXT C850 850W Gold
Mouse Bloody W95 Max Naraka
@W1zzard

I want to ask about this:
For better real-life applicability, all game tests use custom in-game test scenes, not the integrated benchmarks

Is it possible to know where these "in-game test scenes" are if we want to test them on our own computers?
 
Joined
Apr 12, 2013
Messages
7,516 (1.77/day)
At least we ain't stuck in the 2010-17 era with 4C8T CPUs being a go-to $300+ option for Intel and dud CPUs being a go-to option for AMD in every market segment.

$ per FPS still decreases. That's fine.
The counter to that is that between 2017/18 and 2023 we had arguably the best VFM desktop CPUs released in probably two decades, if not more, and generally the value of CPUs is only getting better! Can't say the same about dGPUs though, can you? They're getting stupidly expensive and are literally a side project for the biggest GPU vendor out there :wtf:
 
Joined
Feb 24, 2023
Messages
2,986 (4.73/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC
Processor i5-12400F / 10600KF
Motherboard Gigabyte B760M DS3H / Z490 Vision D
Cooling Laminar RM1 / Gammaxx 400
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333
Video Card(s) RX 6700 XT / R9 380 2 GB
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 + 1 TB WD HDD
Display(s) Compit HA2704 / MSi G2712
Case Matrexx 55 / Junkyard special
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / DQ550ST [backup]
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 10 and 11
they're getting stupidly expensive
They were. Now, we only have a couple NV GPUs boasting this "feature," namely the RTX 4090 (which gets a pass: if you're the top card, you can ask for whatever money you want) and the RTX 4060 Ti 16 GB.
Today's MSRP of 800 USD for the 4070 Ti Super roughly equals 650 dollars from 2020 (the inflation percentage for 2024 hasn't been calculated yet), and it offers about a third more performance than Ampere/RDNA2 GPUs at a comparable MSRP. Not to mention the lower TDP and better feature set.
(attachment: 1706112947621.png)

This is not awesome by any stretch of the imagination, true. But it's better than we could've expected from NV. I dunno what would've stopped the 4070 TS from selling at 900 USD, considering it demolishes any AMD GPU at RT, and this is the price range where RT performance does matter. The 4070 TS also ain't leagues behind in pure raster: it's just behind the 7900 XTX (essentially 83% as fast) and on par with the 7900 XT (<1% slower on average at 4K). I'm still pessimistic about Blackwell-series GPUs because AMD don't even promise anything spectacular, let alone actually produce it.
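For what it's worth, the inflation math checks out roughly. Here's a sketch using approximate US CPI year-over-year rates (the exact figures are my assumptions, not official data):

```python
# Rough deflation of the $800 (2024) MSRP into 2020 dollars.
# Approximate US CPI year-over-year rates; ballpark assumptions only.
cpi = {2021: 0.047, 2022: 0.080, 2023: 0.041}

factor = 1.0
for rate in cpi.values():
    factor *= 1 + rate

msrp = 800
print(f"${msrp} in early 2024 ~= ${msrp / factor:.0f} in 2020 dollars")
# -> ~$680; adding partial-2024 inflation brings it near the ~$650 above.
```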
 
Joined
Dec 12, 2012
Messages
773 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
In Poland this model is selling for $915 before taxes ($1125 with taxes), which is very far from the MSRP reported in this article. Not surprising, though, considering the cooling performance.

Only the really basic models are going for $800-825 before taxes.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,780 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
In Poland this model is selling for $915 before taxes ($1125 with taxes), which is very far from the MSRP reported in this article. Not surprising, though, considering the cooling performance.

Only the really basic models are going for $800-825 before taxes.
Meh... I checked Newegg at 3 PM and the TUF non-OC really was listed at MSRP; now it's marked as out of stock

edit: back in stock
(attachment: 1706115157603.png)
 
Joined
Nov 4, 2005
Messages
36 (0.01/day)
Meh... I checked Newegg at 3 PM and the TUF non-OC really was listed at MSRP; now it's marked as out of stock

edit: back in stock

Please hold them all accountable and honest in the days after store release by updating the first line of the value section on the conclusion page to reflect the actual store price, and whether it's a paper launch for the reviewed model.

They get the reviews at MSRP prices, put up low stock for the reviewed version, and upsell the OC ones.

Thanks for doing the price update on the 4070 Ti Super Strix!
 
Last edited:
Joined
Jul 13, 2016
Messages
3,268 (1.07/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Oh boy, are the children going to cry about that statement. Doesn't change its truth though.

I can't say I'm a fan of the argument that said issue being rare makes it a non-issue. What you are implying is that VRAM will only become an issue once game devs start making games that perform terribly on most people's systems, which is simply illogical and unlikely to ever happen. It's a condition that's nearly impossible to satisfy, because it's not how the market works. Software follows the hardware, not the other way around. Game devs have chimed in that having to optimize games for 8 GB VRAM buffers for years, as VRAM size has stagnated, has crimped creativity and created technical strain on their teams. Makes sense: Nvidia, being the majority market leader, first needs to increase VRAM allotments in order for devs to be able to utilize said VRAM. At the end of the day the logic is circular: saying VRAM shouldn't be increased because games don't need it means games won't be made that require more VRAM.

In addition, most people are buying cards to last 3-7 years, not just to play games available at launch. Given how cheap VRAM is, there's really no excuse not to include more on mid-range products to begin with, other than to force people to upgrade at a quicker cadence. The only reason this is a discussion is because we had / have games exceeding the VRAM buffer of new, moderately expensive cards, some crashing and others having to constantly swap textures, causing visual degradation. Not that I expect a ton of games to do that within 1-2 years of a video card's launch; as explained above, game devs have to optimize for existing hardware.
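To put the texture-swapping point in perspective, here's a back-of-the-envelope sketch of raw texture memory costs (assuming uncompressed RGBA8 with full mip chains; real engines use block compression, so treat the numbers as illustrative only):

```python
# Back-of-the-envelope VRAM cost of uncompressed textures.
# Real engines use block compression (BC/ASTC), so actual footprints are
# smaller; this only illustrates why texture settings eat VRAM so quickly.
def texture_mib(width, height, bytes_per_pixel=4, mipmaps=True):
    base = width * height * bytes_per_pixel
    # A full mip chain adds roughly one third on top of the base level.
    return base * (4 / 3 if mipmaps else 1) / 2**20

print(f"2K RGBA8 texture: {texture_mib(2048, 2048):.1f} MiB")  # ~21.3 MiB
print(f"4K RGBA8 texture: {texture_mib(4096, 4096):.1f} MiB")  # ~85.3 MiB
# A scene streaming a few hundred 4K materials dwarfs an 8 GB buffer fast.
```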
 
Last edited:
Joined
Feb 3, 2012
Messages
134 (0.03/day)
Location
Medina, Ohio
System Name Daily driver
Processor i9 13900k
Motherboard Z690 Aorus Master
Cooling Custom loop
Memory 2x16 GB GSkill DDR5 @ 6000
Video Card(s) RTX4090 FE
Storage 2x 2TB 990 Pro SSD 1x 2TB 970 evo SSD, 1x 4TB HDD
Display(s) LG 32" 2560x1440
Case Fractal Design Meshify 2 XL
Audio Device(s) onboard
Power Supply beQuiet Dark Power 12 1000W
Mouse Razer Death adder
Keyboard Razer blackwidow v3
VR HMD n/a
Software Windows 11 pro
Benchmark Scores Heaven 4.0 @ 2560x1440 270.5 FPS
I don't understand why all the negativity around this. You're getting more for the same money. End of story. Nvidia could have just let things lie the way they were, but we're getting more choices, and that is never a bad thing.
 
Joined
Jul 13, 2016
Messages
3,268 (1.07/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
I don't understand why all the negativity around this. You're getting more for the same money. End of story. Nvidia could have just let things lie the way they were, but we're getting more choices, and that is never a bad thing.

People are negative because these are the specs the cards should have released with from the start.
 
Joined
Jan 29, 2023
Messages
1,520 (2.31/day)
Location
France
System Name KLM
Processor 7800X3D
Motherboard B-650E-E Strix
Cooling Arctic Cooling III 280
Memory 16x2 Fury Renegade 6000-32
Video Card(s) 4070-ti PNY
Storage 500+512+8+8+2+1+1+2+256+8+512+2
Display(s) VA 32" 4K@60 - OLED 27" 2K@240
Case 4000D Airflow
Audio Device(s) Edifier 1280Ts
Power Supply Shift 1000
Mouse 502 Hero
Keyboard K68
Software EMDB
Benchmark Scores 0>1000
I don't understand why all the negativity around this. You're getting more for the same money. End of story. Nvidia could have just let things lie the way they were, but we're getting more choices, and that is never a bad thing.
 
Joined
Dec 25, 2020
Messages
6,643 (4.67/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
I don't understand why all the negativity around this. You're getting more for the same money. End of story. Nvidia could have just let things lie the way they were, but we're getting more choices, and that is never a bad thing.

It doesn't take a genius to figure it out.

NGREEDIA is an exceptionally greedy megacorp that peddles overpriced wares and whose HQ is located in one of the inner circles of Hell and run by a leather-clad incarnation of Beelzebub himself. Not even Dante Alighieri could describe the unspeakable evil perpetrated by the monsters who run this ruthless global corporation, nor could the works of H.P. Lovecraft even begin to compare to the horrors they have unleashed upon the world. Remember, a decade ago they faced a class action lawsuit over a product's design! 13 years ago some drivers killed some random cards! THEY ARE EVIL. Did you know that the "nVIDIA" driver caused lots of BSODs back when Vista came out!? THEY LITERALLY RUINED MICROSOFT'S REPUTATION. Every time you buy a GeForce, a kitten dies.

Think that's hyperbole? I've got an example on this thread!

They were. Now, we only have a couple NV GPUs boasting this "feature," namely the RTX 4090 (which gets a pass: if you're the top card, you can ask for whatever money you want) and the RTX 4060 Ti 16 GB.
Today's MSRP of 800 USD for the 4070 Ti Super roughly equals 650 dollars from 2020 (the inflation percentage for 2024 hasn't been calculated yet), and it offers about a third more performance than Ampere/RDNA2 GPUs at a comparable MSRP. Not to mention the lower TDP and better feature set.
View attachment 331219

This is not awesome by any stretch of the imagination, true. But it's better than we could've expected from NV. I dunno what would've stopped the 4070 TS from selling at 900 USD, considering it demolishes any AMD GPU at RT, and this is the price range where RT performance does matter. The 4070 TS also ain't leagues behind in pure raster: it's just behind the 7900 XTX (essentially 83% as fast) and on par with the 7900 XT (<1% slower on average at 4K). I'm still pessimistic about Blackwell-series GPUs because AMD don't even promise anything spectacular, let alone actually produce it.

Honestly the pricing is better than I expected; Ada's initial pricing was very similar to Turing's, very high... Perhaps Blackwell will take a page from the Ampere playbook and be more accessible, even if the performance figures don't end up "far higher". Even then, we're going to conclude this generation without a product that fully realizes the Ada Lovelace architecture's potential, as the 4090 is severely cut down and, as far as AD102 is concerned, Nvidia is prioritizing the AI market instead.

The 4080 Super may very well represent the "finest" in consumer-grade Ada, IMO. This card succeeds more at being a "lite" 4080 than at being a "buffed" 4070 Ti, and that's because it shares more in nature with the 4080 than with the 4070 Ti; understandable, as the 4070 Ti was already full-die AD104.
 
Joined
May 19, 2009
Messages
1,861 (0.33/day)
Location
Latvia
System Name Personal \\ Work - HP EliteBook 840 G6
Processor 7700X \\ i7-8565U
Motherboard Asrock X670E PG Lightning
Cooling Noctua DH-15
Memory G.SKILL Trident Z5 RGB Black 32GB 6000MHz CL36 \\ 16GB DDR4-2400
Video Card(s) ASUS RoG Strix 1070 Ti \\ Intel UHD Graphics 620
Storage 2x KC3000 2TB, Samsung 970 EVO 512GB \\ OEM 256GB NVMe SSD
Display(s) BenQ XL2411Z \\ FullHD + 2x HP Z24i external screens via docking station
Case Fractal Design Define Arc Midi R2 with window
Audio Device(s) Realtek ALC1150 with Logitech Z533
Power Supply Corsair AX860i
Mouse Logitech G502
Keyboard Corsair K55 RGB PRO
Software Windows 11 \\ Windows 10
So it came out in Latvia - the cheapest is 960 EUR. Suffice to say, the normal Ti is cheaper and was cheaper from the very start, but it's still 900 EUR, as it was.
I hate greed.
 
Joined
Jan 20, 2019
Messages
1,547 (0.73/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
Oh boy, are the children going to cry about that statement. Doesn't change its truth though.

It's a true statement. Rarely does VRAM capacity make a difference at product launch. It doesn't change the discussion one bit, though. The fact is that most people are buying cards to last 3-7 years, and with that scope in mind it certainly will make a difference for most, assuming the card was straddling the line to begin with. Given how cheap VRAM is, there's really no excuse not to include more to begin with. It's akin to choosing between 16GB and 32GB of main system memory. In 95% of scenarios you are just fine with 16GB, but why in the world would you not spend the small bit extra to ensure your PC operates smoothly all the time and into the future? It'd be one thing if memory were expensive, but it's dirt cheap; it's illogical to be spending this much on parts and not justify the tiny bit extra. That Nvidia didn't include extra VRAM to begin with is purely to force more people to upgrade sooner.

Generally speaking:

I don't necessarily recall anyone suggesting 16GB was a requirement to lift immediate performance limitations on a broader spectrum. At best, a few newer games are showing signs of utilizing more than 8GB/12GB (whether at 1080p/1440p, etc.). Furthermore, we witnessed two reviews (one from HU; I can't recall the other) where less VRAM can maintain excellent performance, but at the cost of compromising on visual quality through texture streaming or dynamic asset swapping. Whether it's adaptive quality, compression, or dynamic pruning, in some examples less VRAM observably produced different levels of texture detail, and it didn't look pretty. If I'm selecting "high" quality settings, I'd like my selection to pay off and not be compromised in favour of pushing the FPS "ranking" slide.

The good news: it's a minor issue at the moment, hence not of grave concern. 8GB still works great at 1080p and 12GB at 1440p, although we don't have much material evidence comparing every GPU-intense game to see what level of visual-quality compromise is taking shape. Lucky for me, the games I play at 1440p on 10GB and 11GB cards (3080, 2080 Ti) are holding up well, although in rare instances enforced and poorly administered smart asset swapping is noticeable. All in all, there is strong context here compelling the need to lift hardware limitations, especially considering VRAM isn't going to stretch production costs to any significant level; and mid-tier graphics cards cost a bomb anyway, a big enough MSRP to strap on 16GB and not feel a pinch.

The other question being... why is more VRAM problematic for the consumer? Naturally we should be inclined to demand more. If I'm investing in hardware, I don't look for something which will serve me well today but for something with some extra juice to keep me hydrated tomorrow. We're finally seeing 16GB across all high-performance tiers; I hope next gen will have 16GB standardised from the get-go, alongside affordable lower-VRAM cards for gamers on low resolutions or less GPU-intensive titles. Not only is this future-proofing at its best, but as a consumer I want these doors wide open, aiding developers in creating visually demanding and resource-intensive applications as opposed to strategically having to manage and optimise in limited space at a constant rate, to the extent that we eventually collide with the adaptive diminishing-quality wall of "great FPS performance but shit for show quality preservation".

If we eliminate the price apprehension of $800, 16GB on the 4070 Ti Super looks real pretty! Not fashionably pretty, but it should have been the norm for the 70-class from the get-go. The trim on cache does raise an eyebrow, but I'm not technically furnished to query whether it presents a bottleneck in some shape or form.
 
Joined
Feb 3, 2012
Messages
134 (0.03/day)
Location
Medina, Ohio
System Name Daily driver
Processor i9 13900k
Motherboard Z690 Aorus Master
Cooling Custom loop
Memory 2x16 GB GSkill DDR5 @ 6000
Video Card(s) RTX4090 FE
Storage 2x 2TB 990 Pro SSD 1x 2TB 970 evo SSD, 1x 4TB HDD
Display(s) LG 32" 2560x1440
Case Fractal Design Meshify 2 XL
Audio Device(s) onboard
Power Supply beQuiet Dark Power 12 1000W
Mouse Razer Death adder
Keyboard Razer blackwidow v3
VR HMD n/a
Software Windows 11 pro
Benchmark Scores Heaven 4.0 @ 2560x1440 270.5 FPS
People are negative because these are the specs the cards should have released with from the start.


How so? VRAM debate aside - the x70 parts have traditionally been cut down x04 parts. The bus width is the only thing you can argue.

GTX 670 = cut-down GK104
GTX 970 = cut-down GM204
GTX 1070 Ti = cut-down GP104
RTX 2070 Super = cut-down TU104
RTX 3070 Ti = fully enabled GA104
RTX 4070 Ti = fully enabled AD104

Now we are complaining that they gave us an extra card based on cut-down AD103, but this is what it should have been to begin with?
 
Joined
Jul 13, 2016
Messages
3,268 (1.07/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
How so? VRAM debate aside - the x70 parts have traditionally been cut down x04 parts. The bus width is the only thing you can argue.

GTX 670 = cut-down GK104
GTX 970 = cut-down GM204
GTX 1070 Ti = cut-down GP104
RTX 2070 Super = cut-down TU104
RTX 3070 Ti = fully enabled GA104
RTX 4070 Ti = fully enabled AD104

Now we are complaining that they gave us an extra card based on cut-down AD103, but this is what it should have been to begin with?

You sort of answered your own question. The 670, 970, 1070 Ti, 2070 Super, and 3070 Ti all have 256-bit memory buses. The 4070 Ti is a 192-bit card with 12GB of VRAM and an MSRP of $800. Not only is the price high (people expect a lot at $800), it also provides less memory bandwidth than the previous-gen 3070 Ti. It's pretty easy to see why people say the 4070 Ti Super is what the card should have originally been: it's the only xx104 die in that list paired with a 192-bit bus.

I'd also like to point out that, accounting for inflation, the MSRP of the 970 comes out to $432.75. Nvidia's margins are absolutely ridiculous. People should demand more, a lot more, for $800.
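The bandwidth regression is easy to verify from the reference memory specs; a minimal sketch (peak bandwidth = bus width ÷ 8 × effective data rate, ignoring the L2 cache differences):

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective data rate (Gbps).
# Reference specs: 3070 Ti = 256-bit @ 19 Gbps, 4070 Ti = 192-bit @ 21 Gbps.
def bandwidth_gb_s(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

print(f"3070 Ti: {bandwidth_gb_s(256, 19):.0f} GB/s")  # 608 GB/s
print(f"4070 Ti: {bandwidth_gb_s(192, 21):.0f} GB/s")  # 504 GB/s
# Ada's much larger L2 cache offsets part of this, but the raw figure regressed.
```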

I don't necessarily recall anyone suggesting 16GB was a requirement to lift immediate performance limitations on a broader spectrum. At best, a few newer games are showing signs of utilizing more than 8GB/12GB (whether at 1080p/1440p, etc.). Furthermore, we witnessed two reviews (one from HU; I can't recall the other) where less VRAM can maintain excellent performance, but at the cost of compromising on visual quality through texture streaming or dynamic asset swapping. Whether it's adaptive quality, compression, or dynamic pruning, in some examples less VRAM observably produced different levels of texture detail, and it didn't look pretty. If I'm selecting "high" quality settings, I'd like my selection to pay off and not be compromised in favour of pushing the FPS "ranking" slide.

The good news: it's a minor issue at the moment, hence not of grave concern. 8GB still works great at 1080p and 12GB at 1440p, although we don't have much material evidence comparing every GPU-intense game to see what level of visual-quality compromise is taking shape. Lucky for me, the games I play at 1440p on 10GB and 11GB cards (3080, 2080 Ti) are holding up well, although in rare instances enforced and poorly administered smart asset swapping is noticeable. All in all, there is strong context here compelling the need to lift hardware limitations, especially considering VRAM isn't going to stretch production costs to any significant level; and mid-tier graphics cards cost a bomb anyway, a big enough MSRP to strap on 16GB and not feel a pinch.

Please re-read my original comment; you quoted content from in between edits.

"I can't say I'm a fan of the argument that because said issue is rare makes it's a non-issue. What you are implying is that VRAM will only be an issue once game devs start making games that perform terribly on most people's systems, which is simply illogical and is unlikely to ever happen. It's a condition that nearly impossible to statisfy because it's not how the market works. Software follows the hardware, not the other way around. Game devs have chimmed in that having to optimize games for the 8 VRAM buffers for years as VRAM size has stagnated has crimped creativity and created technical strain on their teams. Makes sense, Nvidia first needs to increase VRAM allotments being the majority market leader in order for devs to be able to utilize said VRAM. At the end of the day the logic is circular, saying VRAM shouldn't be increased because games don't need it means games won't be made requiring more VRAM."

Regardless of how you quantify the strain on developers or end users, you cannot expect games to utilize additional VRAM until cards come out with more VRAM.

FYI, we witnessed five reviews of games running into VRAM limitations, not two. That's not a lot, but that's missing the point. Statements like "x and y VRAM cards still work fine at 1080p," or only looking at which games are impacted by a lack of VRAM right now, are the actual definition of not being able to see the forest for the trees, for the reasons demonstrated above.
 
Joined
Feb 3, 2012
Messages
134 (0.03/day)
Location
Medina, Ohio
System Name Daily driver
Processor i9 13900k
Motherboard Z690 Aorus Master
Cooling Custom loop
Memory 2x16 GB GSkill DDR5 @ 6000
Video Card(s) RTX4090 FE
Storage 2x 2TB 990 Pro SSD 1x 2TB 970 evo SSD, 1x 4TB HDD
Display(s) LG 32" 2560x1440
Case Fractal Design Meshify 2 XL
Audio Device(s) onboard
Power Supply beQuiet Dark Power 12 1000W
Mouse Razer Death adder
Keyboard Razer blackwidow v3
VR HMD n/a
Software Windows 11 pro
Benchmark Scores Heaven 4.0 @ 2560x1440 270.5 FPS
Right. I said the memory bus width is the only thing you can argue, and I agree that it should be wider on AD104 and AD106.
Price is something totally different; you said specs in your post, not price.
Yes, the prices are too high.
The base model RTX 4070 Ti, fully enabled AD104, should have been 256-bit and $500.
 
Joined
Jan 20, 2019
Messages
1,547 (0.73/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
Please re-read my original comment; you quoted content from in between edits.

"I can't say I'm a fan of the argument that because said issue is rare makes it's a non-issue. What you are implying is that VRAM will only be an issue once game devs start making games that perform terribly on most people's systems, which is simply illogical and is unlikely to ever happen. It's a condition that nearly impossible to statisfy because it's not how the market works. Software follows the hardware, not the other way around. Game devs have chimmed in that having to optimize games for the 8 VRAM buffers for years as VRAM size has stagnated has crimped creativity and created technical strain on their teams. Makes sense, Nvidia first needs to increase VRAM allotments being the majority market leader in order for devs to be able to utilize said VRAM. At the end of the day the logic is circular, saying VRAM shouldn't be increased because games don't need it means games won't be made requiring more VRAM."

Regardless of how you quantify the strain on developers or end users, you cannot expect games to utilize additional VRAM until cards come out with more VRAM.

FYI, we witnessed five reviews of games running into VRAM limitations, not two. That's not a lot, but that's missing the point. Statements like "x and y VRAM cards still work fine at 1080p," or only looking at which games are impacted by a lack of VRAM right now, are the actual definition of not being able to see the forest for the trees, for the reasons demonstrated above.

We're on the same page :)
 

tom_tang

New Member
Joined
Jan 25, 2024
Messages
1 (0.00/day)
Thank you very much for adding a GPU compute part to your reviews, especially Stable Diffusion and Topaz AI. Please do that consistently!
In addition to the 512x512 generation, one piece of advice for Stable Diffusion: maybe you can include higher-resolution generations like 768x1024 with Hires fix 2X enabled, and/or SDXL resolutions like 1024x1024 and above plus Hires fix 2X (2x2 = 4 times the data), which will clearly show the advantage of 24GB of VRAM.
 
Joined
Dec 28, 2013
Messages
151 (0.04/day)
I don't understand why all the negativity around this. You're getting more for the same money. End of story. Nvidia could have just let things lie the way they were, but we're getting more choices, and that is never a bad thing.
Except it is not the same money; the 4070 Ti has dropped a bit in price since launch, which gave it more value... Now that's going away and we're back to square one.
 
Last edited:
Joined
Jul 13, 2016
Messages
3,268 (1.07/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
"We used the Automatic 1111 GUI with Stable Diffusion 1.5 to request ten "dog swimming in water" images at a resolution of 512x512 with 50 steps, the reported number is images generated per minute."


Topaz is a great addition to the test suite, but I'm not sure I'd want to do a benchmark based on Automatic1111 GUI for stable diffusion. Performance is highly variable depending on the front end used, even when using the same model. For example, with my RTX 4080 I get significantly higher performance in Automatic1111 than in Easy Diffusion, but I prefer Easy Diffusion as its interface is just leaps and bounds better. Fooocus is pretty performant and has a simple interface (too simple, IMO).

In addition, Automatic1111's Windows version only supports a DirectML implementation for AMD cards; it doesn't yet have a ROCm implementation like the Linux version does. AMD also recommend optimizing AI models using Microsoft Olive for up to a 9x boost. I'm not an AMD user, so I cannot comment on how to optimize for AMD cards, but it's not a pick-up-and-go experience like it is for Nvidia cards. The program was essentially designed around Nvidia cards, because that's what came first, so you have to take extra steps to ensure an AMD card is getting its full performance. There really isn't much information about these factors in the test description.

I'm also not entirely sure how the FP precision of the model, whether it's a standard or XL model, or other Automatic1111 settings may impact performance. There's no "typical" workload yet, so it's hard to say what settings should be used. What I do know is that for many people the default Automatic1111 settings are not optimal; you definitely want to tweak a lot of things, and plug-ins are a requirement if you want to do things that are now considered basic, like ControlNet and inpainting.
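For anyone who wants to run a comparable test at home, here's a minimal sketch against the Automatic1111 web UI's HTTP API (assuming the UI was launched with the --api flag on the default port; the prompt and parameters mirror the review's description, and this is not TPU's actual harness):

```python
import time
import requests

# Minimal sketch of a txt2img throughput test against a local Automatic1111
# instance (launched with --api). Parameters mirror the review's description.
URL = "http://127.0.0.1:7860/sdapi/v1/txt2img"
payload = {
    "prompt": "dog swimming in water",
    "steps": 50,
    "width": 512,
    "height": 512,
    "batch_size": 1,
    "n_iter": 10,  # ten images, as in the review
}

start = time.time()
images = requests.post(URL, json=payload, timeout=600).json()["images"]
elapsed = time.time() - start
print(f"{len(images)} images in {elapsed:.1f} s "
      f"-> {len(images) / elapsed * 60:.1f} images/minute")
```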
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,780 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
a benchmark based on Automatic1111 GUI for stable diffusion
I'm using the same version, parameters and setup for all cards, so the results should be comparable. No doubt tweaking the parameters can affect the scores slightly; so can resolution and other settings chosen to make cards look better if they have more than x GB of VRAM

AMD also recommend optimizing AI models using Microsoft Olive for up to a 9x boost
That's what I did, even once for each different GPU. Before testing I reached out to all GPU vendors and asked for their thoughts and guidance on GPU compute. I guess you know who replied first, who replied second, and who I had to send a "did you see this?" email ;)

how the FP precision of the model
They quantize "optimize" it down to FP16

There's no "typical" workload yet
I'm aware of that and I'll be making changes to compute testing throughout the year. Really appreciate the feedback

Definitely a great suggestion, but AMD support is even worse, there's no Intel OneAPI either, and I think more people can relate to A1111
 

Teddy1983

New Member
Joined
Jan 25, 2024
Messages
17 (0.06/day)
On the day of the premiere in Poland, the lowest store price was close to MSRP, i.e. PLN 4099 at the Proshop store. That's obviously more like the suggested price, but what can you do. Personally, I bought this card at another Polish store for an even lower price of PLN 4070, but that was a so-called golden shot. I suggest you also take an interest in the Gainward Phoenix GS (or non-GS) card: it runs cooler than the TUF, has eight heatpipes instead of six, consumes less power, and has a higher clock than the TUF.
 