
NVIDIA GeForce RTX 3070 Ti Founders Edition

Joined
Aug 24, 2016
Messages
9 (0.00/day)
Why are you worried about efficiency when this is obviously the same GA-104 silicon being pushed harder? OF COURSE efficiency is down, that's what happens when you push higher voltages and higher clocks. This isn't new silicon, and the laws of physics always apply.

As for pricing, I covered that by saying this will never be attainable at MSRP. The botters have proven that they can get the lion's share of Founders Edition stocks, and so few of those are made in the first place that whatever's left for end-users to actually buy might as well be vaporware.
The 3070 Ti could have been less efficient for any number of reasons; that makes zero difference to whether someone likes or dislikes the loss of efficiency. This is like saying "why do you care about ice cream having sugar in it? OBVIOUSLY they added it, so it's fine because we understand it has sugar!" Or "captain, we shouldn't arrest him, because we understand that criminals are unstable people who make bad decisions, so why bother with a man who just assaulted someone?" No, that is not how judgments work.

I'm honestly tired of seeing this sentiment every time there is a discussion about Ampere's poor power efficiency, where people try to defend it and tell others what their opinion should be because we uNdErStAnD why. IT DOES NOT MATTER WHY the power efficiency is what it is. Samsung 8nm, GDDR6X, architecture: it is all irrelevant to someone's opinion of a generation that is barely more efficient than five-year-old Pascal. People need to stop telling others what their opinion of a graphics card should be, because it is incredibly annoying.
 
Joined
Aug 20, 2007
Messages
21,529 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 905p Optane 960GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
Like the one on the overclocking page... just like in my last 700 (!) reviews?
OT but GeCube Radeon sounds like a Radeon that really identifies as a GeForce.
 
Joined
Sep 24, 2020
Messages
227 (0.15/day)
Location
Stehekin, Washington
System Name (2008) Dell XPS 730x H2C
Processor Intel Extreme QX9770 @ 3.8GHz (No OC)
Motherboard Dell LGA 775 (Dell Proprietary)
Cooling Dell AIO Ceramic Water Cooling (Dell Proprietary)
Memory Corsair Dominator Platinum 16GB (4 x 4) DDR3
Video Card(s) EVGA GTX 980ti 6GB (2016 ebay-used)
Storage (2) WD 1TB Velociraptor & (1) WD 2TB Black
Display(s) Alienware 34" AW3420DW (Amazon Warehouse)
Case Stock Dell 730x with "X" Side Panel (65 pounds fully decked out)
Audio Device(s) Creative X-FI Titanium & Corsair SP2500 Speakers
Power Supply PSU: 1000 Watt (Dell Proprietary)
Mouse Alienware AW610M (Amazon Warehouse)
Keyboard Corsair K95 XT (Amazon Warehouse)
Software Windows 7 Ultimate & Alienware FX Lighting
Benchmark Scores No Benchmarking & Overclocking
It was reported that STALKER 2 will arrive by mid-2022 with a very high barrier of entry in terms of minimum GPU requirements. What that signals is that RTX 3070 and RTX 3080 cards will essentially be outclassed, or by then just barely meeting minimum requirements, in most new and upcoming AAA titles. Game developers never stood still, and the GPU shortage, with 3000-series cards still largely unobtainable, has considerably shrunk the future-proofing window those cards offer. Ultrawide monitors are also arriving in waves, putting even more strain on the GPU performance needed. With that, NVIDIA had better come out in the next 10 months or so with a brand new (more powerful mainstream) GPU to catch up! It was already a shame that NVIDIA was not listening to the masses and shipped its new 3080 Ti with a paltry 12GB of memory. Surely it's all about marketing by NVIDIA, of which I know very little, being a simple man on the street.
 
Joined
Feb 20, 2019
Messages
8,331 (3.91/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
It was reported that STALKER 2 will arrive by mid-2022 with a very high barrier of entry in terms of minimum GPU requirements. What that signals is that RTX 3070 and RTX 3080 cards will essentially be outclassed, or by then just barely meeting minimum requirements, in most new and upcoming AAA titles. Game developers never stood still, and the GPU shortage, with 3000-series cards still largely unobtainable, has considerably shrunk the future-proofing window those cards offer. Ultrawide monitors are also arriving in waves, putting even more strain on the GPU performance needed. With that, NVIDIA had better come out in the next 10 months or so with a brand new (more powerful mainstream) GPU to catch up! It was already a shame that NVIDIA was not listening to the masses and shipped its new 3080 Ti with a paltry 12GB of memory. Surely it's all about marketing by NVIDIA, of which I know very little, being a simple man on the street.
Whether 8GB is enough VRAM isn't really a question anyone can answer.
What we do know for sure, from Steam surveys and market data, is that 12GB and 16GB cards are being adopted, and game devs looking for a graphical edge will absolutely start taking advantage of that extra VRAM.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,933 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
It was reported that STALKER 2 will arrive by mid-2022 with a very high barrier of entry in terms of minimum GPU requirements
Haven't we heard that countless times, and then the game was a huge downgrade from E3, or terrible gameplay, or some other fail?

will absolutely start taking advantage of that extra VRAM
We'll see if they go significantly beyond what the consoles offer; I don't think they ever have.
 
Joined
Nov 20, 2015
Messages
68 (0.02/day)
It was reported that STALKER 2 will arrive by mid-2022 with a very high barrier of entry in terms of minimum GPU requirements. What that signals is that RTX 3070 and RTX 3080 cards will essentially be outclassed, or by then just barely meeting minimum requirements, in most new and upcoming AAA titles. Game developers never stood still, and the GPU shortage, with 3000-series cards still largely unobtainable, has considerably shrunk the future-proofing window those cards offer. Ultrawide monitors are also arriving in waves, putting even more strain on the GPU performance needed. With that, NVIDIA had better come out in the next 10 months or so with a brand new (more powerful mainstream) GPU to catch up! It was already a shame that NVIDIA was not listening to the masses and shipped its new 3080 Ti with a paltry 12GB of memory. Surely it's all about marketing by NVIDIA, of which I know very little, being a simple man on the street.
Yeah, history is repeating itself once again. I think most know by now that NVIDIA (and AMD to a lesser extent) cards released at the start of a new console cycle age like boxed wine: Kepler was the prime example. Developers never make games for "those who didn't upgrade in time"; they develop around consoles and their limits, and PC gets unoptimized scraps that we just bulldoze through with raw power.
 
Joined
Jan 23, 2016
Messages
96 (0.03/day)
Location
Sofia, Bulgaria
Processor Ryzen 5 5600X I Core i7 6700K
Motherboard B550 Phantom Gaming 4 I Asus Z170-A ATX
Video Card(s) RX 6900 XT PowerColor Red Devil I RTX 3080 Palit GamingPro
Storage Intel 665P 2TB I Intel 660p 2TB
Case NZXT S340 Elite I Some noname case lmao
Mouse Logitech G Pro Wired
Keyboard Wooting Two Lekker Edition
Haven't we heard that countless times, and then the game was a huge downgrade from E3, or terrible gameplay, or some other fail?


We'll see if they go significantly beyond what the consoles offer; I don't think they ever have.

STALKER is an ironic example. It was massively downgraded in gameplay and storytelling from the original design documents and even beta versions... yet it is still head and shoulders above Metro Exodus, Far Cry 5, and Fallout 4 as a game. And its mods are usually absolute top tier for open-world FPS gameplay.

As for VRAM, W1zzard: in Cyberpunk 2077 with RT on, 8GB is not enough. Even DF (who don't test complex areas) and Computerbase agree on this. Hell, Computerbase sees other problems too, not just in that game.

You can also install Wolfenstein II and manually max out every setting, since the Mein Leben preset doesn't actually set everything to max. It will stutter or crash at 1440p on an 8GB card. There is also Doom Eternal at 4K as well, but yeah.

I used an RTX 2080 at 4K and an RX 5700 XT at 4K as well. Both were generally good, but I often had to lower textures and/or other VRAM-heavy settings that hurt long-term gameplay stability. Because yeah, an 8GB card can handle a single loaded room in Urdak, but once you move to the next one the pain starts.
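Not from the thread itself, but for anyone wanting to see how close their own 8GB card gets to its ceiling: a minimal Python sketch that polls nvidia-smi once a second while a game is running and tracks the peak VRAM use. It assumes nvidia-smi is on the PATH; the query fields used are standard nvidia-smi ones.

```python
# Minimal VRAM-headroom logger: sample memory use every second and track the peak.
import subprocess
import time

QUERY = [
    "nvidia-smi",
    "--query-gpu=memory.used,memory.total",
    "--format=csv,noheader,nounits",
]

peak_mib = 0
try:
    while True:
        # First line = first GPU; values come back as "used, total" in MiB.
        line = subprocess.check_output(QUERY, text=True).splitlines()[0]
        used_mib, total_mib = (int(v) for v in line.split(","))
        peak_mib = max(peak_mib, used_mib)
        print(f"VRAM: {used_mib}/{total_mib} MiB (peak {peak_mib} MiB)")
        time.sleep(1.0)
except KeyboardInterrupt:
    print(f"Peak VRAM observed: {peak_mib} MiB")
```

Leave it running in the background during a play session and stop it with Ctrl+C to get the peak.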
 

Jordlr

New Member
Joined
Jul 7, 2021
Messages
1 (0.00/day)
Personally I don't care about the extra power consumption or the extra cost above the 3070.

In the real world, the 70 W difference equates to about 1.2 pence per hour extra in electricity costs in the UK. I just managed to pick up a 3070 Ti for £600. The 3070 is still unobtainable, and when you can get one it's in the £800 region, so I would need to play 24/7 at max load for roughly 22 months before the costs evened out. In real terms, the 3070 Ti plus the extra electricity is still going to be cheaper than buying a 3070 or 6800 GPU over the years before I upgrade again, all while enjoying my extra 7% performance and RTX gaming.
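For anyone who wants to redo the break-even maths with their own numbers, here is a back-of-the-envelope sketch using the figures from the post above. The ~17p/kWh UK unit rate is an assumption; it roughly reproduces the quoted 1.2p/hour for a 70 W delta and lands close to the ~22-month figure.

```python
# Break-even estimate: price gap between the cards vs. extra electricity cost.
price_gap_gbp = 800 - 600       # rough 3070 street price minus the £600 paid for the 3070 Ti
extra_power_w = 70              # extra board power of the 3070 Ti vs. the 3070
tariff_gbp_per_kwh = 0.17       # assumed UK electricity unit rate

extra_cost_per_hour = extra_power_w / 1000 * tariff_gbp_per_kwh   # kWh per hour * tariff
break_even_hours = price_gap_gbp / extra_cost_per_hour

print(f"Extra running cost: £{extra_cost_per_hour:.4f} per hour")                 # ~£0.012
print(f"Break-even at 24/7 full load: {break_even_hours / 24 / 30:.0f} months")   # ~23
```

Swap in your own prices and tariff; the conclusion only flips if the price gap shrinks or the card runs flat out for years.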
 
Joined
Dec 10, 2019
Messages
27 (0.01/day)
Location
United Kingdom
Processor 13700K 5.6GHz 8/16
Motherboard PRO Z690-A DDR4
Memory V10 2x16GB 4100MHz CL15
Video Card(s) 3080 Ti Gamerock
Benchmark Scores 3DMark CPU Pr. Test /No-L. Nitrogen 13700K @6.0 1300-14543 (5.)
I replaced my 3070 with a 3070 Ti based on current market prices, and on the assumption that poorly set voltage curves are the cause of the serious increase in power consumption, and that it can be fixed. I've looked through multiple reviews of several models, and in all of them the default operating voltages seemed strikingly high; it was also suspicious to me that the power draw couldn't be explained by the improved memory type alone.

After playing around with a card from the same brand (Gigabyte), I can confirm this has proven right: with a bit of tweaking, the difference can on average be reduced to around 30 W at settings where the two cards run as close to neck and neck as possible.

The first picture is the default voltage curve in the case of the 3070 Aorus, and the second that of the 3070 Gaming.
It can be seen that the already poorly set curve is further botched by reaching the same voltage levels 90 MHz lower than in the case of the 3070.
The second picture contains what little data I can share, taken with a 9900K at 5.1 GHz on all cores.

Life is Strange: TC is a fairly poorly optimized game; Trackmania, on the other hand, runs quite well on newer-gen hardware. The power draw is 22 W higher in the former and 31 W higher in the latter.
I did not want to make more of these screenshots, but in the other games I've tested so far, the differences are similar (on some occasions in the upper 30s of watts).

The poor default curve kinda makes me question whether it is a deliberate act by NVIDIA. Would they really be this negligent?
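If anyone wants to put numbers on these kinds of ~30 W deltas themselves, here is a rough sketch of the approach (not from the post above): sample board power and SM clock over a fixed window with the stock curve, then again after tweaking, and compare the averages. The curve editing itself is done in a tool like MSI Afterburner and isn't shown here; nvidia-smi is assumed to be on the PATH.

```python
# Average board power / SM clock over a sampling window, for stock-vs-tweaked comparisons.
import statistics
import subprocess
import time

QUERY = [
    "nvidia-smi",
    "--query-gpu=power.draw,clocks.sm",
    "--format=csv,noheader,nounits",
]

def sample(duration_s: float = 60.0, interval_s: float = 1.0):
    """Return (mean board power in W, mean SM clock in MHz) over the window."""
    watts, clocks = [], []
    end = time.time() + duration_s
    while time.time() < end:
        line = subprocess.check_output(QUERY, text=True).splitlines()[0]
        w, mhz = (float(v) for v in line.split(","))
        watts.append(w)
        clocks.append(mhz)
        time.sleep(interval_s)
    return statistics.mean(watts), statistics.mean(clocks)

if __name__ == "__main__":
    avg_w, avg_mhz = sample()
    print(f"Average board power: {avg_w:.1f} W at {avg_mhz:.0f} MHz")
```

Run it once with the default curve and once after the tweak, in the same scene, and the difference in the averages is the saving.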
 

Attachments

  • Screenshot - 08_10_2021, 13_54_24.png (261.7 KB)
  • grsd.jpg (302.6 KB)
Joined
Dec 25, 2020
Messages
6,978 (4.80/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
I replaced my 3070 with a 3070 Ti based on current market prices, and on the assumption that poorly set voltage curves are the cause of the serious increase in power consumption, and that it can be fixed. I've looked through multiple reviews of several models, and in all of them the default operating voltages seemed strikingly high; it was also suspicious to me that the power draw couldn't be explained by the improved memory type alone.

After playing around with a card from the same brand (Gigabyte), I can confirm this has proven right: with a bit of tweaking, the difference can on average be reduced to around 30 W at settings where the two cards run as close to neck and neck as possible.

The first picture is the default voltage curve in the case of the 3070 Aorus, and the second that of the 3070 Gaming.
It can be seen that the already poorly set curve is further botched by reaching the same voltage levels 90 MHz lower than in the case of the 3070.
The second picture contains what little data I can share, taken with a 9900K at 5.1 GHz on all cores.

Life is Strange: TC is a fairly poorly optimized game; Trackmania, on the other hand, runs quite well on newer-gen hardware. The power draw is 22 W higher in the former and 31 W higher in the latter.
I did not want to make more of these screenshots, but in the other games I've tested so far, the differences are similar (on some occasions in the upper 30s of watts).

The poor default curve kinda makes me question whether it is a deliberate act by NVIDIA. Would they really be this negligent?

This is some pretty good data, thanks for sharing... and yeah, the curve is less than optimal throughout. The sweet spot for my 3090 is 0.83 V, and even at such a low voltage it still power throttles... What you're observing there is the pressure on the power budget coming from the use of G6X, which makes the curve even less efficient...

I personally wouldn't mind if NVIDIA weren't locking down every single card from all AIBs with a signed VBIOS; I could fix that myself... a 450 W power limit, static clocks, and I'd be good to go... but I guess you already have the hard evidence to back my point up. Cheers
 