
NVIDIA GeForce RTX 3080 Ti Founders Edition

Joined
Apr 10, 2020
Messages
504 (0.29/day)
Pascal was the last great generation from Nvidia: the 1060 offered unbelievable value for 1080p gaming, the 1070 gave you 980 Ti performance at 40% less power (150 W vs. 250 W TDP), the 1080 was great for 1440p, and how can you forget the 1080 Ti, a card so good it blew the industry away, armed with 11 GB of GDDR5X.

Ever since the RTX 2000 series, Nvidia has made no real improvements in performance or power efficiency, just focused on ray tracing and DLSS. The RTX 3000 series is even worse: abysmal power efficiency (up to 500 W on some RTX 3090 cards!), lackluster VRAM (aside from the 3060 and 3090), overheating GDDR6X memory, no stock, and pointless SKUs like this 3080 Ti... what is going on at Nvidia?!

Just when high-resolution, high-refresh-rate gaming started to become a reality for everyone, Nvidia went full L with the RTX 2000 series. No one wants ray tracing; we want 4K 144 fps gaming. Look how the RTX 3060 promises RTX 2060 Super performance at 170 W, while the 2060 Super gave you GTX 1080 performance at 190 W, and the GTX 1080 was a 180 W GPU! NO REAL POWER EFFICIENCY IMPROVEMENTS SINCE 2016, AND THEY CHARGE YOU MORE.
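
Taking those wattage figures at face value, here is a rough back-of-the-envelope check of the perf-per-watt math (a sketch only: the performance equivalences are the ones claimed above, normalized to the GTX 1080, not measured data):

Code:
# Back-of-the-envelope perf/W check using only the numbers quoted above.
# Performance is normalized to GTX 1080 = 1.0 per the claimed equivalences;
# the wattages are the board-power figures from the post, not measurements.
cards = {
    "GTX 1080 (2016)":       {"perf": 1.0, "watts": 180},
    "RTX 2060 Super (2019)": {"perf": 1.0, "watts": 190},  # "GTX 1080 performance"
    "RTX 3060 (2021)":       {"perf": 1.0, "watts": 170},  # "2060 Super performance"
}

baseline = cards["GTX 1080 (2016)"]["perf"] / cards["GTX 1080 (2016)"]["watts"]
for name, c in cards.items():
    ppw = c["perf"] / c["watts"]
    print(f"{name}: {ppw:.4f} perf/W ({ppw / baseline - 1:+.1%} vs GTX 1080)")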

It's maybe not all bad... According to RedGamingTech's leaks, Intel's Xe DG2 dGPU program is shaping up quite well (3070-3080 level of performance) and, more importantly, Raja has allegedly got Intel's backing to primarily target the $200-300 mainstream market. Plus no AIB partners, and strict distributor and retailer pricing policies in place, just like with its CPUs. The 2022 dGPU market might look much, much better if the mining craze ends. Granted, RDNA3 and next-gen Ampere will be a tier above Intel's offerings performance-wise, but hey, I'll gladly buy a GPU with 3070-3080 level performance for 300 bucks if it has half-decent drivers, instead of whatever Ngreedia & AMD will try to charge for their new dGPU lineups.
 
Joined
Jan 20, 2019
Messages
1,592 (0.74/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
The only way we can solve the problem is going back to 360p gaming and sticking with integrated graphics. Just sit closer to the screen so it feels like a 27"/34" panel... and for the wide-screeners, just squint. As long as we have "huge" graphical-fidelity ambitions, we will always be robbed by profiteering Ngreed'ism & co. I even regretted purchasing a 1080 TI initially for £600/£700; once up and running, it didn't feel like my money's worth. Although it did turn out to be a nice investment eventually.
 
Joined
Dec 26, 2006
Messages
3,862 (0.59/day)
Location
Northern Ontario Canada
Processor Ryzen 5700x
Motherboard Gigabyte X570S Aero G R1.1 BiosF5g
Cooling Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm
Memory Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A)
Video Card(s) AMD RX 6800 - Asus Tuf
Storage Kingston KC3000 1TB & 2TB & 4TB Corsair MP600 Pro LPX
Display(s) LG 27UL550-W (27" 4k)
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220-VB
Power Supply SuperFlower Leadex V Gold Pro 850W ATX Ver2.52
Mouse Mionix Naos Pro
Keyboard Corsair Strafe with browns
Software W10 22H2 Pro x64
I wonder if W1zzard has to get additional insurance with all these cards in his studio????
 
Joined
Jul 19, 2016
Messages
484 (0.16/day)
What's the point? Nvidia could package up toenail clippings for a couple thousand and countless idiots would buy them, jacking up the prices for everyone else (for the graphics cards, not the toenails, to be clear).

I liked PC gaming before the middle-class kids and casuals got interested in it about five or six years ago. Now they all want the best graphics cards, so they save up a whole month's worth of their McDonald's counter salary to buy one. These people don't have bills or kids.
 
Joined
Oct 22, 2014
Messages
14,170 (3.81/day)
Location
Sunshine Coast
System Name H7 Flow 2024
Processor AMD 5800X3D
Motherboard Asus X570 Tough Gaming
Cooling Custom liquid
Memory 32 GB DDR4
Video Card(s) Intel ARC A750
Storage Crucial P5 Plus 2TB.
Display(s) AOC 24" Freesync 1m.s. 75Hz
Mouse Lenovo
Keyboard Eweadn Mechanical
Software W11 Pro 64 bit
What's the point? Nvidia could package up toenail clippings for a couple thousand and countless idiots would buy them, jacking up the prices for everyone else (for the graphics cards, not the toenails, to be clear).

I liked PC gaming before the middle-class kids and casuals got interested in it about five or six years ago. Now they all want the best graphics cards, so they save up a whole month's worth of their McDonald's counter salary to buy one. These people don't have bills or kids.
Wow, so many stupid assumptions.
 
Joined
Apr 23, 2017
Messages
22 (0.01/day)
Processor i7-13900K
Motherboard ROG Maximus Z690
Video Card(s) RTX 3090
@W1zzard

Thanks for the review!

Is there something wrong with the Cyberpunk 2077 ray-tracing results? The RX 6000 series' non-RT and RT numbers are identical...

 
Deleted member 177333

Guest
Why is everybody so concerned with NVIDIA's MSRP? It's just a meaningless number. They probably didn't want to lower the x80 Ti MSRP compared to the 2080 Ti, so they picked $1,200 to not look bad when they announce a 4080 Ti.

You will not be able to buy the 3080 Ti at that price, probably ever. As much as that sucks for all of us, that's what will happen. Look at what the card offers compared to what's available at what price, and make a decision based on that. I tried to go through some options and examples in my conclusion.

Eh, the 2080 Ti was significantly overpriced at $1,200; its performance wasn't all that impressive. That's one of the key reasons I waited and picked one up used on eBay for just over half its MSRP (with a pre-installed waterblock to boot).

As you mentioned in a later post, though, people do need to stop buying this stuff at these price points or prices won't ever come down.
 

64K

Joined
Mar 13, 2014
Messages
6,773 (1.72/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) Temporary MSI RTX 4070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Temporary Viewsonic 4K 60 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
PC gaming isn't going away. It just needs a temporary adjustment in our thinking. People want to downplay the conditions wrought by the pandemic, and we have a hell of a mess to clean up going forward, but it will happen.
 
Joined
Dec 25, 2020
Messages
7,020 (4.81/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (Spanish layout)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
You're giving it a Thumbs up/Pro for being 8nm? As opposed to what?

Personal opinion, but I believe the Samsung 8 nm process isn't the reason these GPUs are so "power inefficient" (as in, hungry). This is most noticeable on the RTX 3070; I would go as far as saying it's quite lean on power for the awesome amount of performance it provides. I'm quite eager to see w1zz's review and the impact of GDDR6X on the 3070 Ti's power consumption and frame-time stability, given that, for all we know from the rumor mills, it's coming with a similar power limit to the vanilla 3070. Being on this node is also positive for yields, as it doesn't have to compete with the numerous other products and orders that require TSMC 7 nm capacity, like AMD's entire product stack. "nm" is just marketing anyway; the actual transistor pitch isn't that small.

The biggest issue to me, so far, is the GDDR6X: it consumes an absolutely insane amount of power. This was measured in the middle of a 3DMark Time Spy Extreme run. Look at this: even at 61% memory controller load, the MVDDC (memory subsystem) rail is pulling 115 W(!) of the 375 W budget my card has... and there are games and workloads that demand more of it.

[Attachment: mvddc.png (sensor readout of MVDDC power draw)]


I must say, AMD's Infinity Cache solution to the bandwidth problem is simply ingenious, and downright elegant compared to hot and hungry PAM4 memory.
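
To put that 115 W in perspective, here is a quick sketch of the share-of-budget math (the linear extrapolation at the end is purely illustrative; real scaling won't be linear):

Code:
# Share-of-budget math for the GDDR6X figures above: 115 W on the
# MVDDC rail out of a 375 W board power limit, observed at 61%
# memory-controller load.
mvddc_w, board_limit_w, mcu_load = 115, 375, 0.61

share = mvddc_w / board_limit_w
print(f"Memory subsystem share of board power: {share:.0%}")  # ~31%

# Naive linear extrapolation to full memory-controller load,
# an assumption for illustration only.
est_full_load_w = mvddc_w / mcu_load
print(f"Extrapolated MVDDC draw at 100% load: ~{est_full_load_w:.0f} W")  # ~189 W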
 
Joined
Dec 28, 2012
Messages
3,956 (0.90/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
I love the peanut gallery throwing out anecdotes about how PC gaming is totally dying and everyone is going to go buy consoles. Yeah, I can't get a GPU right now, so I'm going to drop high-end GPU money on a console that can't do 4K60/1440p144 at ALL and can barely do 4K30/1440p60 (1080p60 for the PS5, since it can't even output 1440p, LMFAO), with a totally closed environment, no competition, and joystick controls. :rolleyes:

Whatever you're smoking to come up with that argument, you can keep it, 'cause it's garbage.

Also, daily reminder that the 8800 Ultra launched in 2007 for the equivalent of $1,100 today. Prices go up, prices go down. :roll: LOLCALMDOWN :roll:
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.91/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
On the circuit board analysis page, it would be nice to have some notes explaining the differences between the 3080 FE, 3080 Ti FE, and 3090 FE: the chokes, VRMs, and memory modules.
Since the cards are so similar, something as simple as "the 3080 has X memory modules, the Ti has two more, and the 3090 has them doubled onto the back of the PCB" would be really informative to those reading this first, without the background knowledge.

According to the HWiNFO developer, GDDR6X modules are rated to throttle at around 110°C. They're toasty and consume a lot of power; any 3090 owner will attest to that :D
*Begins crying*
*Uses tears to fill my EK block and watercool the VRAM with an active backplate*


Yeah, it's a problem, and something the Ti should have resolved. They clearly just reused the existing cooling setup with zero changes.
 
Joined
Sep 25, 2007
Messages
5,966 (0.95/day)
Location
New York
Processor AMD Ryzen 9 5950x, Ryzen 9 5980HX
Motherboard MSI X570 Tomahawk
Cooling Be Quiet Dark Rock Pro 4(With Noctua Fans)
Memory 32Gb Crucial 3600 Ballistix
Video Card(s) Gigabyte RTX 3080, Asus 6800M
Storage Adata SX8200 1TB NVME/WD Black 1TB NVME
Display(s) Dell 27 Inch 165Hz
Case Phanteks P500A
Audio Device(s) IFI Zen Dac/JDS Labs Atom+/SMSL Amp+Rivers Audio
Power Supply Corsair RM850x
Mouse Logitech G502 SE Hero
Keyboard Corsair K70 RGB Mk.2
VR HMD Samsung Odyssey Plus
Software Windows 10
Personal opinion, but I believe the Samsung 8 nm process isn't the reason these GPUs are so "power inefficient" (as in, hungry). This is most noticeable on the RTX 3070; I would go as far as saying it's quite lean on power for the awesome amount of performance it provides. I'm quite eager to see w1zz's review and the impact of GDDR6X on the 3070 Ti's power consumption and frame-time stability, given that, for all we know from the rumor mills, it's coming with a similar power limit to the vanilla 3070. Being on this node is also positive for yields, as it doesn't have to compete with the numerous other products and orders that require TSMC 7 nm capacity, like AMD's entire product stack. "nm" is just marketing anyway; the actual transistor pitch isn't that small.

The biggest issue to me, so far, is the GDDR6X: it consumes an absolutely insane amount of power. This was measured in the middle of a 3DMark Time Spy Extreme run. Look at this: even at 61% memory controller load, the MVDDC (memory subsystem) rail is pulling 115 W(!) of the 375 W budget my card has... and there are games and workloads that demand more of it.

[Attachment: mvddc.png (sensor readout of MVDDC power draw)]

I must say, AMD's Infinity Cache solution to the bandwidth problem is simply ingenious, and downright elegant compared to hot and hungry PAM4 memory.

Agree 100%. Even on my 3080 I see the memory eating into the power budget. You can undervolt the core by a good amount on Ampere and bring power consumption down, but the memory will still draw so much that you can sometimes see it using more power than the core in games that don't load the core heavily. AMD's solution was, as you said, pretty great in that regard.
 
Joined
Jul 13, 2016
Messages
3,334 (1.08/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
I love the peanut gallery throwing out anecdotes about how PC gaming is totally dying and everyone is going to go buy consoles. Yeah, I can't get a GPU right now, so I'm going to drop high-end GPU money on a console that can't do 4K60/1440p144 at ALL and can barely do 4K30/1440p60 (1080p60 for the PS5, since it can't even output 1440p, LMFAO), with a totally closed environment, no competition, and joystick controls. :rolleyes:

Whatever you're smoking to come up with that argument, you can keep it, 'cause it's garbage.

Also, daily reminder that the 8800 Ultra launched in 2007 for the equivalent of $1,100 today. Prices go up, prices go down. :roll: LOLCALMDOWN :roll:

You misunderstood the argument prior commenters were making.

The problem is the pricing increases of GPUs in general and the complete lack of any improvement in the budget market, not temporary pricing during the pandemic. The pandemic is a separate problem that inflates prices across the board.

The pandemic is not forever. What people are worried about is that even when it ends, there will still be little room for budget options, and it won't change the fact that Nvidia is charging $1,200 for this card. Consoles, on the other hand, will return to their $500 MSRP.

Most people are aware of the drawbacks of consoles; you don't have to point them out. That said, at $500, if Nvidia and AMD completely fail to address the budget market, you can't really blame people for considering a console when Nvidia and AMD aren't even providing products most people can afford. PC elitists seem to forget that the PC market is held up mostly by the budget and midrange segments, where the vast majority of gamers reside. No amount of "well, PC can do this..." will change the price. If a person can't afford it, they can't buy it; if a person thinks it isn't worth it, they will spend their money elsewhere.

Speaking of the 8800 Ultra:

"The 8800 Ultra, retailing at a higher price,[clarification needed] is identical to the GTX architecturally, but features higher clocked shaders, core and memory. Nvidia later[when?] told the media the 8800 Ultra was a new stepping,[clarification needed] creating less heat[clarification needed] therefore clocking higher. Originally retailing from $800 to $1000, most users thought the card to be a poor value, offering only 10% more performance than the GTX but costing hundreds of dollars more. Prices dropped to as low as $200 before being discontinued on January 23, 2008."


At the time that card released, it was roundly criticized by the press as extremely poor value, and that was for a 10% gain at a roughly 30% price increase. The 3080 Ti is a ~7% performance increase for 70% more money. I'm glad you brought it up, because it objectively shows how piss-poor the 3080 Ti's value is even compared to the more extreme historical examples. Mind you, that was still a single overpriced card; Nvidia has been increasing the ASP across its entire GPU stack, not just a single model.
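
Putting both halo cards through the same yardstick, using the percentages above (a tiny sketch; the inputs are this thread's round numbers, not exact street prices):

Code:
# Performance gained per unit of extra price paid, for the two
# "halo" cards discussed above (round numbers from this thread).
def value_ratio(perf_gain, price_gain):
    """Fraction of performance gained per fraction of price added."""
    return perf_gain / price_gain

ultra = value_ratio(0.10, 0.30)  # 8800 Ultra vs 8800 GTX
ti    = value_ratio(0.07, 0.70)  # RTX 3080 Ti vs RTX 3080

print(f"8800 Ultra: {ultra:.2f}")  # 0.33 -- already called poor value then
print(f"3080 Ti:    {ti:.2f}")     # 0.10 -- about three times worse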
 
Joined
Aug 11, 2020
Messages
245 (0.15/day)
Location
2nd Earth
Processor Ryzen 5700X
Motherboard Gigabyte AX-370 Gaming 5, BIOS F51h
Cooling MSI Core Frozr L
Memory 32GB 3200MHz CL16
Video Card(s) MSI GTX 1080 Ti Trio
Storage Crucial MX300 525GB + Samsung 970 Evo 1TB + 3TB 7.2k + 4TB 5.4k
Display(s) LG 34UC99 3440x1440 75Hz + LG 24MP88HM
Case Phanteks Enthoo Evolv ATX TG Galaxy Silver
Audio Device(s) Edifier XM6PF 2.1
Power Supply EVGA Supernova 750 G3
Mouse Steelseries Rival 3
Keyboard Razer Blackwidow Lite Stormtrooper Edition
Pascal was the last great generation from Nvidia: the 1060 offered unbelievable value for 1080p gaming, the 1070 gave you 980 Ti performance at 40% less power (150 W vs. 250 W TDP), the 1080 was great for 1440p, and how can you forget the 1080 Ti, a card so good it blew the industry away, armed with 11 GB of GDDR5X.

Ever since the RTX 2000 series, Nvidia has made no real improvements in performance or power efficiency, just focused on ray tracing and DLSS. The RTX 3000 series is even worse: abysmal power efficiency (up to 500 W on some RTX 3090 cards!), lackluster VRAM (aside from the 3060 and 3090), overheating GDDR6X memory, no stock, and pointless SKUs like this 3080 Ti... what is going on at Nvidia?!

Just when high-resolution, high-refresh-rate gaming started to become a reality for everyone, Nvidia went full L with the RTX 2000 series. No one wants ray tracing; we want 4K 144 fps gaming. Look how the RTX 3060 promises RTX 2060 Super performance at 170 W, while the 2060 Super gave you GTX 1080 performance at 190 W, and the GTX 1080 was a 180 W GPU! NO REAL POWER EFFICIENCY IMPROVEMENTS SINCE 2016, AND THEY CHARGE YOU MORE.
I can still remember clearly when consumers and the media were amazed that the GTX 1080 needed only a single 8-pin connector to deliver flagship performance. Even the GTX 1080 Ti, with its 8+6-pin, was considered power hungry at the time. I thought we were heading in a good direction with the 20, 30, and 40 series and beyond in terms of power efficiency; apparently not :(

The 1060 was a phenomenal card; Nvidia will not be able to match it at today's ever-increasing MSRPs. The xx60 tier will reach the xx80 tier's price in the near future, as @RedelZaVedno said here:
It's not meaningless in the long run... just look at the direction MSRPs are headed: GTX 680 = $499 / 780 Ti = $699 / 980 Ti = $649 / 1080 Ti = $699 / 2080 Ti = $999 / 3080 Ti = $1,199... a 140% price hike (2.4x) in 9 years, against ~18% inflation over the period. Elevated MSRPs are here to stay even after the mining craze ends.
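
Checking that claim in nominal and inflation-adjusted terms (a quick sketch using only the MSRPs and the ~18% cumulative inflation figure quoted above):

Code:
# Nominal vs inflation-adjusted increase of the x80-tier flagship MSRP,
# GTX 680 (2012, $499) -> RTX 3080 Ti (2021, $1,199), using the ~18%
# cumulative inflation figure cited in the quote.
msrp_2012, msrp_2021, cum_inflation = 499, 1199, 0.18

nominal = msrp_2021 / msrp_2012
real = msrp_2021 / (msrp_2012 * (1 + cum_inflation))

print(f"Nominal increase:            {nominal:.2f}x")  # ~2.40x
print(f"Inflation-adjusted increase: {real:.2f}x")     # ~2.04x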
 

r9

Joined
Jul 28, 2008
Messages
3,300 (0.55/day)
System Name Primary|Secondary|Poweredge r410|Dell XPS|SteamDeck
Processor i7 11700k|i7 9700k|2 x E5620 |i5 5500U|Zen 2 4c/8t
Memory 32GB DDR4|16GB DDR4|16GB DDR4|32GB ECC DDR3|8GB DDR4|16GB LPDDR5
Video Card(s) RX 7800xt|RX 6700xt |On-Board|On-Board|8 RDNA 2 CUs
Storage 2TB m.2|512GB SSD+1TB SSD|2x256GBSSD 2x2TBGB|256GB sata|512GB nvme
Display(s) 50" 4k TV | Dell 27" |22" |3.3"|7"
VR HMD Samsung Odyssey+ | Oculus Quest 2
Software Windows 11 Pro|Windows 10 Pro|Windows 10 Home| Server 2012 r2|Windows 10 Pro
Joined
Nov 11, 2016
Messages
3,459 (1.17/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
I can still remember clearly when consumers and the media were amazed that the GTX 1080 needed only a single 8-pin connector to deliver flagship performance. Even the GTX 1080 Ti, with its 8+6-pin, was considered power hungry at the time. I thought we were heading in a good direction with the 20, 30, and 40 series and beyond in terms of power efficiency; apparently not :(

The 1060 was a phenomenal card; Nvidia will not be able to match it at today's ever-increasing MSRPs. The xx60 tier will reach the xx80 tier's price in the near future, as @RedelZaVedno said here:

Who cares about maximum power consumption when you can tweak the power limit to your liking? Reducing power consumption increases efficiency, as demonstrated by the mobile GPUs.

In fact, you should be thankful that Nvidia and AMD keep increasing the maximum power limits on their desktop GPUs, because they have to design better VRMs to accommodate the higher limits, and a better VRM means higher VRM efficiency. Say you previously had a 6-phase VRM with 20 W of power loss at 150 W TGP; now you have a 10+ phase VRM with only 10 W of power loss at the same 150 W TGP.

The 1660 Super was a super fine GPU at $230.
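
The VRM arithmetic from that hypothetical example, worked through (a sketch; the 20 W and 10 W loss figures are the made-up numbers above, not measurements):

Code:
# VRM-loss arithmetic for the hypothetical example above: a 6-phase
# VRM losing 20 W vs a 10+ phase VRM losing 10 W, both delivering
# the same 150 W TGP to the GPU.
def vrm_efficiency(tgp_w, loss_w):
    """Fraction of input power that actually reaches the GPU."""
    return tgp_w / (tgp_w + loss_w)

for phases, loss_w in (("6-phase", 20), ("10+ phase", 10)):
    eff = vrm_efficiency(150, loss_w)
    print(f"{phases}: {eff:.1%} efficient ({loss_w} W lost as heat)")
# 6-phase:   88.2% efficient
# 10+ phase: 93.8% efficient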
 
Joined
Nov 19, 2019
Messages
108 (0.06/day)
Great review as usual.

Disappointing that $500 more doesn't even get you better thermal pads than the 3080. That card at MSRP was exciting; this one, not so much. Same number of VRM phases as well, although they repositioned one? Looks like very, very limited availability for the FE too. I guess that was to be expected, but this time around it seems even lower, with Best Buy only selling in person at a limited number of stores.
 
Joined
Apr 30, 2011
Messages
2,716 (0.54/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
And here is why the use of the 5800X is a bottleneck for top-of-the-line GPU reviews nowadays, since some games properly utilise more threads:
[Attachment 202635: benchmark screenshot from the HU review]

Surely our @W1zzard tested somewhere else in the game, but the difference between the 3080 and 6900 XT in his review is 0%, compared to 13% in the HU review.
 
Joined
Jul 13, 2016
Messages
3,334 (1.08/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Who cares about maximum power consumption when you can tweak the power limit to your liking? Reducing power consumption increases efficiency, as demonstrated by the mobile GPUs.

In fact, you should be thankful that Nvidia and AMD keep increasing the maximum power limits on their desktop GPUs, because they have to design better VRMs to accommodate the higher limits, and a better VRM means higher VRM efficiency. Say you previously had a 6-phase VRM with 20 W of power loss at 150 W TGP; now you have a 10+ phase VRM with only 10 W of power loss at the same 150 W TGP.

The 1660 Super was a super fine GPU at $230.

The vast majority of consumers aren't going to tweak power limits. IMO it's frankly annoying to have yet another program running in the background, and another source of potential issues.

That's not a problem customers should have to solve, either. This is just like the AMD users who claimed Vega was power efficient once you undervolted it. That's great and all, but it doesn't mean squat to the vast majority of users. Companies should ship products that hit their target markets out of the box; customers should not have to fiddle with products after the fact. That's for enthusiasts, if they want to spend the extra effort.
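
For context, this is roughly what "tweaking the power limit" involves on an Nvidia card: a minimal sketch shelling out to the stock nvidia-smi CLI (requires administrator rights; the 220 W target is an arbitrary example and must fall within the card's supported power-limit range):

Code:
# Minimal power-limit tweak via the stock nvidia-smi CLI.
# Needs admin/root; the 220 W cap is an arbitrary example.
import subprocess

# Inspect the current power draw and the min/max/default limits.
subprocess.run(["nvidia-smi", "-q", "-d", "POWER"], check=True)

# Cap board power at 220 W (resets on reboot unless persistence
# mode or a startup task reapplies it).
subprocess.run(["nvidia-smi", "-pl", "220"], check=True)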
 
Joined
Dec 14, 2011
Messages
1,087 (0.23/day)
Location
South-Africa
Processor AMD Ryzen 9 5900X
Motherboard ASUS ROG STRIX B550-F GAMING (WI-FI)
Cooling Noctua NH-D15 G2
Memory 32GB G.Skill DDR4 3600Mhz CL18
Video Card(s) ASUS GTX 1650 TUF
Storage SAMSUNG 990 PRO 2TB
Display(s) Dell S3220DGF
Case Corsair iCUE 4000X
Audio Device(s) ASUS Xonar D2X
Power Supply Corsair AX760 Platinum
Mouse Razer DeathAdder V2 - Wireless
Keyboard Corsair K70 PRO - OPX Linear Switches
Software Microsoft Windows 11 - Enterprise (64-bit)
Is anyone here on this forum a reseller who can actually get these cards at MSRP?
 
Joined
Mar 28, 2020
Messages
1,761 (1.02/day)
Personal opinion, but I believe the Samsung 8 nm process isn't the reason these GPUs are so "power inefficient" (as in, hungry). This is most noticeable on the RTX 3070; I would go as far as saying it's quite lean on power for the awesome amount of performance it provides. I'm quite eager to see w1zz's review and the impact of GDDR6X on the 3070 Ti's power consumption and frame-time stability, given that, for all we know from the rumor mills, it's coming with a similar power limit to the vanilla 3070. Being on this node is also positive for yields, as it doesn't have to compete with the numerous other products and orders that require TSMC 7 nm capacity, like AMD's entire product stack. "nm" is just marketing anyway; the actual transistor pitch isn't that small.

The biggest issue to me, so far, is the GDDR6X: it consumes an absolutely insane amount of power. This was measured in the middle of a 3DMark Time Spy Extreme run. Look at this: even at 61% memory controller load, the MVDDC (memory subsystem) rail is pulling 115 W(!) of the 375 W budget my card has... and there are games and workloads that demand more of it.

[Attachment: mvddc.png (sensor readout of MVDDC power draw)]

I must say, AMD's Infinity Cache solution to the bandwidth problem is simply ingenious, and downright elegant compared to hot and hungry PAM4 memory.
I feel GDDR6X is a stopgap until a faster GDDR standard arrives, much like GDDR5X, which never had a future beyond Nvidia's Pascal. Pushing clock speeds that much higher than GDDR6 requires a lot of power. I wasn't sure GDDR6X used that much power until I compared the TGP of the RTX 3070 vs. the 3070 Ti, and in that case it's only 8x 1 GB GDDR6X modules. When you have 10, 12, or 24 hot and power-hungry modules on board, the power requirement rises drastically. And I do agree that AMD's Infinity Cache is a great way to get around this power requirement while still achieving better or comparable effective memory bandwidth.

As for Samsung's 8 nm, while it is certainly more efficient than what it's replacing, I don't necessarily think it's good. It's been shown that Samsung's 7 nm is not as good as TSMC's 7 nm, not to mention this supposed 8 nm is basically a refined version of Samsung's 10 nm. Most of these RTX 3000 cards run at a fairly conservative clock speed, i.e. around 1.8 GHz, to keep power consumption in check. You can push further into the 1.9 GHz range, but generally only with a +15% power limit applied. The saving grace here is probably Nvidia's Ampere architecture with its ample memory bandwidth, and less so the 8 nm Samsung node, in my opinion.

The vast majority of consumers aren't going to tweak power limits. IMO it's frankly annoying to have yet another program running in the background, and another source of potential issues.

That's not a problem customers should have to solve, either. This is just like the AMD users who claimed Vega was power efficient once you undervolted it. That's great and all, but it doesn't mean squat to the vast majority of users. Companies should ship products that hit their target markets out of the box; customers should not have to fiddle with products after the fact. That's for enthusiasts, if they want to spend the extra effort.
Companies ship products that work. Therefore, they ship with settings they deem "safe," to make sure the product works according to spec. They can't possibly test every chip that comes in and provide a custom setting each time.

In my opinion, it's the savvy people who will figure out something is not right and try to fix it, i.e. fiddle with the power limits and so on. People who are not savvy will probably live with it, since, while it runs hot, it works.
 
Joined
Feb 23, 2019
Messages
6,106 (2.87/day)
Location
Poland
Processor Ryzen 7 5800X3D
Motherboard Gigabyte X570 Aorus Elite
Cooling Thermalright Phantom Spirit 120 SE
Memory 2x16 GB Crucial Ballistix 3600 CL16 Rev E @ 3600 CL14
Video Card(s) RTX3080 Ti FE
Storage SX8200 Pro 1 TB, Plextor M6Pro 256 GB, WD Blue 2TB
Display(s) LG 34GN850P-B
Case SilverStone Primera PM01 RGB
Audio Device(s) SoundBlaster G6 | Fidelio X2 | Sennheiser 6XX
Power Supply SeaSonic Focus Plus Gold 750W
Mouse Endgame Gear XM1R
Keyboard Wooting Two HE
And here is why the use of the 5800X is a bottleneck for top-of-the-line GPU reviews nowadays, since some games properly utilise more threads:
[Attachment 202635: benchmark screenshot from the HU review]
Surely our @W1zzard tested somewhere else in the game, but the difference between the 3080 and 6900 XT in his review is 0%, compared to 13% in the HU review.
Dude, watch any YT video with RivaTuner running on a 5950X and a 3090. An engine designed around Jaguar cores isn't going to utilize 32 threads:
[Attachment: RivaTuner overlay screenshot showing per-thread CPU utilization]
 
Joined
Jul 13, 2016
Messages
3,334 (1.08/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Companies ship products that work. Therefore, they ship with settings they deem "safe," to make sure the product works according to spec. They can't possibly test every chip that comes in and provide a custom setting each time.

In my opinion, it's the savvy people who will figure out something is not right and try to fix it, i.e. fiddle with the power limits and so on. People who are not savvy will probably live with it, since, while it runs hot, it works.

This is simply not true, given that both AMD (for CPUs and GPUs) and Nvidia have dynamic boost features that give the end user extra performance depending on the specific silicon's quality and temperature. AMD's dynamic boosting for its CPUs in particular does an excellent job, to the point where manual tuning isn't needed and can actually yield less performance than the automatic boosting system.
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.38/day)
System Name Meh
Processor 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Phantom Spirit
Memory 32GB G.Skill @ 6000/CL30
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR
Case Fractal Design North XL
Audio Device(s) FiiO DAC
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
The 3080 Ti and 3090 have their place... for 4K-5K gaming.

I will keep my 3080 till the 4070/4080 launch in late 2022, though, or wait for the refreshes in 2023 if pricing and availability haven't normalized by 2H 2022.

Or maybe I will consider a Radeon 7800 XT/8800 XT, if AMD can keep up the pace.
 