
will gpu continue to have crazy TDP?

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
19,596 (2.86/day)
Location
Piteå
System Name White DJ in Detroit
Processor Ryzen 5 5600
Motherboard Asrock B450M-HDV
Cooling Be Quiet! Pure Rock 2
Memory 2 x 16GB Kingston Fury 3400mhz
Video Card(s) XFX 6950XT Speedster MERC 319
Storage Kingston A400 240GB | WD Black SN750 2TB |WD Blue 1TB x 2 | Toshiba P300 2TB | Seagate Expansion 8TB
Display(s) Samsung U32J590U 4K + BenQ GL2450HT 1080p
Case Fractal Design Define R4
Audio Device(s) Plantronics 5220, Nektar SE61 keyboard
Power Supply Corsair RM850x v3
Mouse Logitech G602
Keyboard Cherry MX Board 1.0 TKL Brown
Software Windows 10 Pro
Benchmark Scores Rimworld 4K ready!
My point here is that Pascal was exclusively a gaming architecture, optimized for DX11 to the fullest extent. An arch that is purely made for gaming will always be much more efficient. RDNA 2 was AMD's Pascal.

Aren't there Pascal-based Tesla cards?
 
Joined
Sep 3, 2019
Messages
3,523 (1.84/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 220W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (375W current) PowerLimit, 1060mV, Adrenalin v24.10.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2161), upgraded from Win10 to Win11 on Jan 2024
This is a good reference, but I can't help wondering how many tens of thousands of times the performance of the hardware has improved since the Rage 128. Even if you disregard the functionality, that's quite fascinating really.
Yeah, I was wondering that myself.
From 2008-2010 until today it's roughly a 250-300x increase. Going another 10 years back?

Memory bandwidth, though, has improved about 1,000x since then (ATI Rage 128 GL).

These numbers aren't doing the subject justice...
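A quick sanity check of those ballpark factors (a sketch using only the rough numbers quoted in this post, not measured data): they imply a surprisingly steady compound growth rate.

```python
def annual_growth(total_factor: float, years: float) -> float:
    """Compound annual growth rate implied by a total improvement factor."""
    return total_factor ** (1.0 / years) - 1.0

# ~250-300x GPU performance over ~14 years (2010 -> 2024)
print(f"GPU perf:  {annual_growth(250, 14):.0%} to {annual_growth(300, 14):.0%} per year")
# ~1000x memory bandwidth since the Rage 128 GL (~1998 -> 2024)
print(f"Bandwidth: {annual_growth(1000, 26):.0%} per year")
```

Both work out to a few tens of percent per year, which is why the totals look so different over different time spans.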
 
Joined
Sep 17, 2014
Messages
22,484 (6.03/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
This is a good reference, but I can't help wondering how many tens of thousands of times the performance of the hardware has improved since the Rage 128. Even if you disregard the functionality, that's quite fascinating really.
Yeah, I think it's hard to deny we've made massive progress since the first GPUs. And that progress (every time...) has reinforced our belief that it can't ever stop, but reality is starting to knock on the door at this point.
 
Joined
Apr 18, 2019
Messages
2,373 (1.15/day)
Location
Olympia, WA
System Name Sleepy Painter
Processor AMD Ryzen 5 3600
Motherboard Asus TuF Gaming X570-PLUS/WIFI
Cooling FSP Windale 6 - Passive
Memory 2x16GB F4-3600C16-16GVKC @ 16-19-21-36-58-1T
Video Card(s) MSI RX580 8GB
Storage 2x Samsung PM963 960GB nVME RAID0, Crucial BX500 1TB SATA, WD Blue 3D 2TB SATA
Display(s) Microboard 32" Curved 1080P 144hz VA w/ Freesync
Case NZXT Gamma Classic Black
Audio Device(s) Asus Xonar D1
Power Supply Rosewill 1KW on 240V@60hz
Mouse Logitech MX518 Legend
Keyboard Red Dragon K552
Software Windows 10 Enterprise 2019 LTSC 1809 17763.1757
To provide some numbers for this statement: with the stock 200W power limit, my RTX 3060 Ti LHR averaged 26.2 FPS in Unigine Superposition (1440p ultra). After raising the power limit to 220W via a VBIOS flash, it averaged 27.2 FPS. Sample size is roughly 18 test passes at each power target. In both cases, the card was throttling against its power limit.

If you bought a fancier 3060 Ti, it would have come out of the box with the higher 220 W power limit. If you have one of those cards then, based on my data, you could shave roughly 9% off power consumption for a performance loss of under 4%, and that's without even touching the V/F curve.
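For what it's worth, the trade-off in those numbers works out as follows (a quick sketch using only the figures posted above):

```python
def pct_change(new: float, old: float) -> float:
    """Percent change from old to new."""
    return (new - old) / old * 100

# 3060 Ti, Unigine Superposition 1440p ultra, ~18 passes per power target
print(f"power:      {pct_change(200, 220):+.1f} %")    # 220 W -> 200 W
print(f"perf:       {pct_change(26.2, 27.2):+.1f} %")  # 27.2 -> 26.2 FPS
print(f"efficiency: {26.2 / 200:.3f} vs {27.2 / 220:.3f} FPS/W")
```

So the 200 W limit trades roughly a 9% power cut for under a 4% FPS loss, and the card comes out ahead in FPS per watt.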
It's true that almost every CPU/GPU today is pushed right to the edge by default, beyond its sweet spot on the power/efficiency curve, for the sake of competition.
Once upon a time, we had 'whole*' GPUs, (primarily) delineated by clocks and RAM (bus width/type/amount).
The Radeon 9800series and GeForce 6800series are decent examples. The GF 4MX, GF3, and GF2 are also great examples.
*At least in the US, "Whole Milk" has already had the butterfat and cream removed.

I just recently had the opportunity to play with a couple Dell OEM 6800s (Vanilla).
They're 100% bus-powered (<75 W), with 12 pipes and 256 MB of 256-bit DDR, and will OC to 425 MHz core and 390/780 MHz DDR (the max 'factory' limits) with ease.
Contrast that with the 'steps up' that required auxiliary power and much beefier coolers. Performance increased, but not in line with the additional heat and power.

IMHO, *that* is the reason we don't see 'fat' low-clocked GPUs. It'd be like the old days, where custom cooling and a quick OC was a 'free upgrade'.
With how chips are made today, it would cannibalize the rest of the product stack.

Aren't there Pascal-based Tesla cards?
Correct.
 
Last edited:
Joined
Sep 3, 2019
Messages
3,523 (1.84/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 220W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (375W current) PowerLimit, 1060mV, Adrenalin v24.10.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2161), upgraded from Win10 to Win11 on Jan 2024
My first GPU (2001) was a PowerVR KYRO II, with maybe 2x+ the performance of the Rage 128.
And my next one was a 6800 GT (2004), with an "out of this world" performance uplift.
 
Joined
Jan 14, 2019
Messages
12,359 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
It still is physics; in the end, it all comes down to that. So yes, it really is. You won't change natural laws, and no one ever will.

Without AI we would have no Nvidia GPU for gamers right now, lol. Ada GPUs are 100% AI; gaming GPUs are just the reject chips that couldn't make it as AI chips selling for 10k plus.

Yes, dedicated gaming GPUs would be best, but that's wishful thinking.

So unless they find a completely new way of making GPUs, we will never come down on the watts again. Unless you'd be happy with the same performance as now; then they can bring the watts down, but they will never release a next gen with the same power as before (which, actually, they already did: 3060 Ti to 4060 Ti, and RX 6800 XT to 7800 XT). That's how close we are to the wall.
AI (Tensor cores) doesn't take up as much die real-estate as you think. As far as I know, it's about 10-15% on Ada, and way, way less on RDNA 3. The huge majority of a GPU is still the rendering pipeline.
 

Deleted member 237813

Guest
While that is true, I don't think you understood what my point was. Fucking language barrier, lol.
 
Joined
Jan 14, 2019
Messages
12,359 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
While that is true, I don't think you understood what my point was. Fucking language barrier, lol.
If you mean that Ada is wholly meant for AI work, then you might not be entirely wrong, although I'm not sure how long it'll stay feasible to carve AI and gaming GPUs out of the same silicon. I'm sure Nvidia could cram a lot more AI power into a chip by seriously cutting down the unnecessary 3D rendering bits. Actually, isn't this the direction they started with Hopper?
 

Deleted member 237813

Guest
Nvidia is slowly but surely nearing Apple and Microsoft with it. You can even feel that Nvidia gives zero fucks about gamers anymore: terrible drivers (I've had more issues with Nvidia drivers in the last year than in the previous 15), late game-ready drivers, or plain no optimisation for new games for weeks. They used to be on time pretty much always.

AI comes first; makes sense, right? From Nvidia's standpoint, why sell cheap chips to gamers when they can sell them for ten times the price to companies? Ada gaming GPUs are simply the reject chips from the AI segment. Nvidia is selling features, not power. Rasterization performance is terrible on Nvidia cards if we go by value. A high-end card should never need upscaling or framegen to get playable frames, lmao.

Weirdly enough, my 7900 XT rarely needs it; it's always above 60 FPS at 4K native without upscaling, as it should be for 700€. Yet the 4070 Super, which would have been 40 bucks cheaper, is often easily 30% slower at 4K native. No wonder, with a 192-bit bus (for 600+ €, lmao; that's a 60-class card right there). Nvidia wants to dictate what resolution you play at. The 4070 Ti, a 1440p card by marketing, for 900€? I lost my shit. :slap: :laugh: That tier was a 299€ card years ago, and people pay three times the price now for a card that's VRAM-crippled and has crippled 4K performance too.
 
Joined
Jan 14, 2019
Messages
12,359 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Nvidia is slowly but surely nearing Apple and Microsoft with it. You can even feel that Nvidia gives zero fucks about gamers anymore: terrible drivers (I've had more issues with Nvidia drivers in the last year than in the previous 15), late game-ready drivers, or plain no optimisation for new games for weeks. They used to be on time pretty much always.

AI comes first; makes sense, right? From Nvidia's standpoint, why sell cheap chips to gamers when they can sell them for ten times the price to companies? Ada gaming GPUs are simply the reject chips from the AI segment. Nvidia is selling features, not power. Rasterization performance is terrible on Nvidia cards if we go by value. A high-end card should never need upscaling or framegen to get playable frames, lmao.

Weirdly enough, my 7900 XT rarely needs it; it's always above 60 FPS at 4K native without upscaling, as it should be for 700€. Yet the 4070 Super, which would have been 40 bucks cheaper, is often easily 30% slower at 4K native. No wonder, with a 192-bit bus (for 600+ €, lmao; that's a 60-class card right there). Nvidia wants to dictate what resolution you play at. The 4070 Ti, a 1440p card by marketing, for 900€? I lost my shit. :slap: :laugh: That tier was a 299€ card years ago, and people pay three times the price now for a card that's VRAM-crippled and has crippled 4K performance too.
Those are fair concerns, not too far from my own. I don't know much about the driver issues you're talking about, as I refused to pay for the Ampere price hike, not to mention the Ada one. I have a 1660 Ti and a 2070 (which is half dead, unfortunately), but that's it; no more Nvidia for me until they get things right with the pricing. I'm 100% happy with my 7800 XT anyway.

The way I see the price hike is two-fold:
  1. For one, they charge way more money than they used to for chips of similar size and PCBs of similar complexity. Let's compare the 4060 and the 1060 6 GB. The 4060 has a 25% smaller GPU, a bone-simple PCB and VRM, and 4 memory chips instead of 6, but costs more than the 1060 did back in the day. Wtf, seriously!?
  2. Secondly, as you said, Nvidia is starting to become a datacentre/software company, so they charge more for features you never asked for in the first place. I mean, do we need AI cores for upscaling? I don't. I'd much rather have more performance so that I don't need to rely on upscaling in the first place. Unfortunately, that's not an option these days with AMD jumping onto the bandwagon with FSR. Luckily, that one doesn't need super-duper fancy AI cores, but then that begs the question: why does RDNA 3 have them? What's the point? I just want to play my freakin' games! :laugh: :ohwell:
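Point 1 can be put in rough numbers (a back-of-envelope sketch; the MSRPs are the commonly cited launch prices and are my assumption, the ~25% die-size gap is taken from the post above, and it ignores inflation and the higher per-wafer cost of newer nodes, which offset part of the difference):

```python
msrp_1060 = 249       # GTX 1060 6 GB launch MSRP, USD (assumed)
msrp_4060 = 299       # RTX 4060 launch MSRP, USD (assumed)
rel_die_4060 = 0.75   # 4060 die ~25% smaller than the 1060's, per the post

# Normalize price to relative die area (1060 = 1.0)
ratio = (msrp_4060 / rel_die_4060) / (msrp_1060 / 1.0)
print(f"4060 charges ~{ratio:.2f}x as much per unit of die area as the 1060")
```

Even as a crude estimate, the price per unit of silicon goes up noticeably, which is the core of the complaint above.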
 

Deleted member 237813

Guest
Well, I can tell you GPU utilisation was rarely at 99% without a CPU limit in many games; that's an absolute no-go, and was by far the worst of it. Frametimes weren't as smooth as they should be, and no, it's not the game when it runs just fine on an AMD GPU with a flat line. But most of the problems come from the beta features they release. RR was terrible at first (those faces, yuck) and framegen still is, imo. Why should I ever use a technique that feels horrible? If you had told a PC gamer from 2015 that we would upscale and frame-generate, they would have laughed at you, because it's garbage. And no, upscaling is not better than native; how blind are you? Maybe on a phone screen. Yet the raw power barely went up, which is the only important thing about a GPU (besides the VRAM, of course), still, and will be for many years. Because fuck that framegen bullshit. Sure, I'll buy a 700-1000€ GPU to play horrible-feeling fake frames. And yes, I know all frames are fake, but in this case they really are just doubled. Absolute garbage feature from both vendors. Fuck off with that; give me TFLOPS and increase RT core performance, that's all I need.

The price hike started with the 600 series from Nvidia, aka Kepler. That's where it all started. AMD didn't; they started heavily in 2019 with Navi, aka RDNA 1.

The RX 5700 XT was an RX 480/580 successor at best, yet it was double the price.

People always cried about the prices, even when 500 bucks got you the best GPU on the market, but back then we at least had awesome 130-buck GPUs too.

Since Turing, though, it went really, really downhill. Even Pascal was expensive as fuck on release, but at least they cut the prices fast, and once it was readily available it was really good.

But Ampere and Ada? Lmao, throw them in the trash bin. Nobody should buy any GPU for one generation so these fuckers finally wake up, but people here think they got a good price when they get a 4090 for 1500€; completely brainwashed and out of touch. The 4090 is such a cut-down chip it barely qualifies as an 80 Ti card. Even with all the inflation and whatnot, 999€ would still be expensive as fuck and Nvidia's margin would still be gigantic. But keep on buying, folks. :laugh:
 
Joined
Jan 14, 2019
Messages
12,359 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Well, I can tell you GPU utilisation was rarely at 99% without a CPU limit in many games; that's an absolute no-go, and was by far the worst of it. Frametimes weren't as smooth as they should be, and no, it's not the game when it runs just fine on an AMD GPU with a flat line. But most of the problems come from the beta features they release. RR was terrible at first (those faces, yuck) and framegen still is, imo. Why should I ever use a technique that feels horrible? If you had told a PC gamer from 2015 that we would upscale and frame-generate, they would have laughed at you, because it's garbage. And no, upscaling is not better than native; how blind are you? Maybe on a phone screen. Yet the raw power barely went up, which is the only important thing about a GPU (besides the VRAM, of course), still, and will be for many years. Because fuck that framegen bullshit. Sure, I'll buy a 700-1000€ GPU to play horrible-feeling fake frames. And yes, I know all frames are fake, but in this case they really are just doubled. Absolute garbage feature from both vendors. Fuck off with that; give me TFLOPS and increase RT core performance, that's all I need.
All I can say is, I agree.

The price hike started with the 600 series from Nvidia, aka Kepler. That's where it all started. AMD didn't; they started heavily in 2019 with Navi, aka RDNA 1.

The RX 5700 XT was an RX 480/580 successor at best, yet it was double the price.

People always cried about the prices, even when 500 bucks got you the best GPU on the market, but back then we at least had awesome 130-buck GPUs too.

Since Turing, though, it went really, really downhill. Even Pascal was expensive as fuck on release, but at least they cut the prices fast, and once it was readily available it was really good.

But Ampere and Ada? Lmao, throw them in the trash bin. Nobody should buy any GPU for one generation so these fuckers finally wake up, but people here think they got a good price when they get a 4090 for 1500€; completely brainwashed and out of touch. The 4090 is such a cut-down chip it barely qualifies as an 80 Ti card. Even with all the inflation and whatnot, 999€ would still be expensive as fuck and Nvidia's margin would still be gigantic. But keep on buying, folks. :laugh:
Oh man, how much fun I had with my 750 Ti, or my passively cooled 1050 Ti, back in the day! :cool: Now, anybody tell me how much fun you're having with your current gen x50 series card! Anyone? No one? :laugh:
 
Joined
Mar 21, 2016
Messages
2,508 (0.79/day)
Efficiency will pretty much always improve over time with GPU generations, but TDP relative to performance targets can change quite a bit. Sometimes chips aren't pushed aggressively; other times they're pushed well beyond the sweet spot of the power/efficiency curve, and TDP relative to performance skyrockets in line with that.
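One way to see why the top of the curve is so expensive: a toy model (illustrative only, not measured data) where dynamic power scales as C·V²·f and voltage has to rise roughly linearly with clock near the top of the V/f curve, so power grows with roughly the cube of frequency while performance grows only linearly.

```python
def rel_power(clock_ratio: float) -> float:
    """Relative dynamic power if voltage scales linearly with clock.

    P ~ C * V^2 * f; with V proportional to f this gives P ~ f^3.
    """
    return clock_ratio ** 3

for f in (1.00, 1.05, 1.10, 1.15):
    print(f"+{(f - 1) * 100:2.0f}% clock -> +{(rel_power(f) - 1) * 100:4.1f}% power")
```

Under this (crude) model, the last 15% of clock costs about 50% more power, which is roughly the shape of the gap between stock tuning and the sweet spot.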
 

Deleted member 237813

Guest
I don't know; I know some who enjoy their 300-buck 4050 (sold as the 4060). :laugh: We had 200-buck 8 GB GPUs 8 years ago, lmao. That RT and DLSS sure is worth it on a card like that, though. :slap: I actually hope it gets even more expensive; then people will cry, and I will sip the tears, since the same people enabled these prices.

Look at this and start thinking; maybe some will finally wake up from the marketing brainwash.

 

Morgoth

Fueled by Sapphire
Joined
Aug 4, 2007
Messages
4,242 (0.67/day)
Location
Netherlands
System Name Wopr "War Operation Plan Response"
Processor 5900x ryzen 9 12 cores 24 threads
Motherboard aorus x570 pro
Cooling air (GPU Liquid graphene) rad outside case mounted 120mm 68mm thick
Memory kingston 32gb ddr4 3200mhz ecc 2x16gb
Video Card(s) sapphire RX 6950 xt Nitro+ 16gb
Storage 300gb hdd OS backup. Crucial 500gb ssd OS. 6tb raid 1 hdd. 1.8tb pci-e nytro warp drive LSI
Display(s) AOC display 1080p
Case SilverStone SST-CS380 V2
Audio Device(s) Onboard
Power Supply Corsair 850MX watt
Mouse corsair gaming mouse
Keyboard Microsoft brand
Software Windows 10 pro 64bit, Luxion Keyshot 7, fusion 360, steam
Benchmark Scores timespy 19 104
I would love to have a GPU that goes all out, no limits attached, with a power draw of 700 W and a preinstalled water-cooling block.
 

Deleted member 237813

Guest
All Nvidia GPUs are going all out by default with their boosting behaviour, which has an algorithm behind it; there's a reason overclocking Nvidia GPUs has been almost pointless since Pascal. Stock, you get like 95% of the max performance the card has to offer, and the last 5% won't make any difference in any game at all. Especially without an FPS counter, you wouldn't even feel 10% or more.
 
Joined
Jun 27, 2019
Messages
2,109 (1.06/day)
Location
Hungary
System Name I don't name my systems.
Processor i5-12600KF 'stock power limits/-115mV undervolt+contact frame'
Motherboard Asus Prime B660-PLUS D4
Cooling ID-Cooling SE 224 XT ARGB V3 'CPU', 4x Be Quiet! Light Wings + 2x Arctic P12 black case fans.
Memory 4x8GB G.SKILL Ripjaws V DDR4 3200MHz
Video Card(s) Asus TuF V2 RTX 3060 Ti @1920 MHz Core/@950mV Undervolt
Storage 4 TB WD Red, 1 TB Silicon Power A55 Sata, 1 TB Kingston A2000 NVMe, 256 GB Adata Spectrix s40g NVMe
Display(s) 29" 2560x1080 75Hz / LG 29WK600-W
Case Be Quiet! Pure Base 500 FX Black
Audio Device(s) Onboard + Hama uRage SoundZ 900+USB DAC
Power Supply Seasonic CORE GM 500W 80+ Gold
Mouse Canyon Puncher GM-20
Keyboard SPC Gear GK630K Tournament 'Kailh Brown'
Software Windows 10 Pro
I don't know; I know some who enjoy their 300-buck 4050 (sold as the 4060). :laugh: We had 200-buck 8 GB GPUs 8 years ago, lmao. That RT and DLSS sure is worth it on a card like that, though. :slap: I actually hope it gets even more expensive; then people will cry, and I will sip the tears, since the same people enabled these prices.

Look at this and start thinking; maybe some will finally wake up from the marketing brainwash.


The average Joe will never really care about the % of cores or any of that stuff.
To be honest, I don't care either, and it's not what I base my GPU upgrades on.
Price in my country, performance, power draw, and feature set are what I'm mainly interested in; if those are OK with me, that's all I care about, and I couldn't care less what die size or core % it has compared to previous gens, or whatever else.

To be fair, I upgraded from a GTX 1070 to my second-hand 3060 Ti in September 2022, so it was close enough.
 

Deleted member 237813

Guest
Very wise upgrade, from one 8 GB GPU to the next one. :laugh: In my honest opinion, gamers are the dumbest consumers on the planet; video games prove it, and hardware does too.
 
Joined
Jun 27, 2019
Messages
2,109 (1.06/day)
Location
Hungary
System Name I don't name my systems.
Processor i5-12600KF 'stock power limits/-115mV undervolt+contact frame'
Motherboard Asus Prime B660-PLUS D4
Cooling ID-Cooling SE 224 XT ARGB V3 'CPU', 4x Be Quiet! Light Wings + 2x Arctic P12 black case fans.
Memory 4x8GB G.SKILL Ripjaws V DDR4 3200MHz
Video Card(s) Asus TuF V2 RTX 3060 Ti @1920 MHz Core/@950mV Undervolt
Storage 4 TB WD Red, 1 TB Silicon Power A55 Sata, 1 TB Kingston A2000 NVMe, 256 GB Adata Spectrix s40g NVMe
Display(s) 29" 2560x1080 75Hz / LG 29WK600-W
Case Be Quiet! Pure Base 500 FX Black
Audio Device(s) Onboard + Hama uRage SoundZ 900+USB DAC
Power Supply Seasonic CORE GM 500W 80+ Gold
Mouse Canyon Puncher GM-20
Keyboard SPC Gear GK630K Tournament 'Kailh Brown'
Software Windows 10 Pro
Very wise upgrade, from one 8 GB GPU to the next one. :laugh: In my honest opinion, gamers are the dumbest consumers on the planet; video games prove it, and hardware does too.

Except that I have yet to play any game where 8 GB was the issue rather than the lack of raw GPU power. But sure, you know everyone's use case, right? :rolleyes: 'I also don't plan on upgrading my 2560x1080 monitor anytime soon, which is fewer pixels than 1440p, so VRAM issues aren't pressing for me yet on High settings, even in current AAA games.'
+ I'm also using DLSS whenever possible because I like it better than native TAA.
 
Joined
Jan 14, 2019
Messages
12,359 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
The average Joe will never really care about the % of cores or any of that stuff.
To be honest, I don't care either, and it's not what I base my GPU upgrades on.
Price in my country, performance, power draw, and feature set are what I'm mainly interested in; if those are OK with me, that's all I care about, and I couldn't care less what die size or core % it has compared to previous gens, or whatever else.

To be fair, I upgraded from a GTX 1070 to my second-hand 3060 Ti in September 2022, so it was close enough.
It doesn't matter if you look at the number of cores vs the top-end model, PCB complexity, chip design, or just raw performance vs price, it's plain obvious that Ada is an artificial upsell of Ampere.

Except that I have yet to play any game where 8 GB was the issue rather than the lack of raw GPU power. But sure, you know everyone's use case, right? :rolleyes: 'I also don't plan on upgrading my 2560x1080 monitor anytime soon, which is fewer pixels than 1440p, so VRAM issues aren't pressing for me yet on High settings, even in current AAA games.'
+ I'm also using DLSS whenever possible because I like it better than native TAA.
That depends on what you play. (the below link is time-stamped to the point)

TLDR: 8 GB cards use low quality assets in Halo: Infinite, and they stutter in The Last of Us, even at 1080p.
 
Joined
Jun 27, 2019
Messages
2,109 (1.06/day)
Location
Hungary
System Name I don't name my systems.
Processor i5-12600KF 'stock power limits/-115mV undervolt+contact frame'
Motherboard Asus Prime B660-PLUS D4
Cooling ID-Cooling SE 224 XT ARGB V3 'CPU', 4x Be Quiet! Light Wings + 2x Arctic P12 black case fans.
Memory 4x8GB G.SKILL Ripjaws V DDR4 3200MHz
Video Card(s) Asus TuF V2 RTX 3060 Ti @1920 MHz Core/@950mV Undervolt
Storage 4 TB WD Red, 1 TB Silicon Power A55 Sata, 1 TB Kingston A2000 NVMe, 256 GB Adata Spectrix s40g NVMe
Display(s) 29" 2560x1080 75Hz / LG 29WK600-W
Case Be Quiet! Pure Base 500 FX Black
Audio Device(s) Onboard + Hama uRage SoundZ 900+USB DAC
Power Supply Seasonic CORE GM 500W 80+ Gold
Mouse Canyon Puncher GM-20
Keyboard SPC Gear GK630K Tournament 'Kailh Brown'
Software Windows 10 Pro
It doesn't matter if you look at the number of cores vs the top-end model, PCB complexity, chip design, or just raw performance vs price, it's plain obvious that Ada is an artificial upsell of Ampere.


That depends on what you play. (the below link is time-stamped to the point)

I'm familiar with that video; I've watched HUB for years now.
Like I said, my res is somewhere between 1080p and 1440p in terms of pixel count and VRAM usage in games. At most I drop from Ultra to High, and it's 100% issue-free at this res.
+ I don't play competitive games either. :)

In fact, I'm starting to run out of raster performance on my GPU in the most demanding games lately, mainly the Unreal Engine 5 games I've played, where I was perfectly fine on VRAM but the engine hammers my GPU completely and only DLSS saved me. 'Immortals of Aveum, for example: no VRAM issues at all, but it was killing my GPU natively on High settings.'

The Last of Us was more or less fixed a while ago; it's all good at my resolution and High settings, even natively. 'Hogwarts too; I can't really do RT in that game anyway, but no-RT High is not a problem now.'
No issues with The Last of Us at 1080p High here; heck, even 1440p is doable.
 
Last edited:

Deleted member 237813

Guest
RT in Hogwarts is a useless gimmick anyway; Cyberpunk is the only game where RT really makes sense relative to the performance hit. Metro was nice too; every other implementation was very forgettable. Raster is 99% of it and still will be for many, many years; some of us won't be alive when RT overtakes raster completely.
 
Joined
Feb 1, 2019
Messages
3,610 (1.69/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
Feels like my 4070 is the most efficient GPU I've ever had. Tons and tons of processing power, and usually in the 180 W range. The CPU can pull more power depending on the scenario!
It's kind of weird.

I undervolt my 3080 and I play games capped at either 30 or 60 FPS. The result is very favourable wattage vs what I had on something like my 1080 Ti and even my 1070. So I agree with you.

My issue, I suppose, is that Nvidia, like Intel, have basically started treating the very inefficient part of the V/f curve as part of normal boost clocks. So the cards can peak at very high levels of power (and heat). Good hardware, bad firmware. Unless, of course, you don't care about power and heat efficiency and just want every ounce of performance you can get.

That's the issue: they are more efficient, but not enough to get the performance without very high max power.
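The frame-cap point can be made concrete with energy per frame, which is what actually turns into heat (all numbers here are hypothetical, just to show the shape of it):

```python
def energy_per_frame(power_w: float, fps: float) -> float:
    """Joules spent per rendered frame."""
    return power_w / fps

# Hypothetical 3080-class figures: uncapped at the power limit vs
# a 60 FPS cap plus undervolt that lets it sit lower on the V/f curve
uncapped = energy_per_frame(320, 100)
capped = energy_per_frame(180, 60)
print(f"{uncapped:.1f} J/frame uncapped vs {capped:.1f} J/frame capped")
```

Even when the per-frame saving is modest, the sustained board power (and heat into the room) drops a lot, which matches the favourable wattage described above.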
 
Last edited:
Joined
Feb 1, 2019
Messages
3,610 (1.69/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
Now start Metro Exodus: no matter your undervolt, you will sit at 300 watts or close to it. ;) That's why it's so funny reading stuff like 'hur dur, my GPU is undervolted, 200 watts max', until you launch a game that really makes the shaders glow, and your undervolt, while it does sip less power, is significantly slower than stock.

Also, undervolting modern Nvidia GPUs is dumb and useless; Nvidia does it by itself. Igor said this, not me. ;)

And when you see how small the difference is between reducing the power limit and undervolting via the frequency curve, you will see he is right.

Well, I won't, as I'm pretty sure I also lowered the power limit. :p But I don't own the game, so I can't test. You mean the Enhanced Edition, I assume? Probably a version that's had RT nonsense added to really work the card.
 

Deleted member 237813

Guest
RT is no nonsense in Metro Exodus EE. The complete lighting is RT; it doesn't even work on non-RT-capable cards. It's also extremely performant for what it is. :) Even my AMD card can get 4K native 60-plus, and Nvidia can use DLSS.

One of the best RT implementations on the market.

This applies to more games; that was just the most intense example. Unreal Engine 4 games are usually very heavy too, but there you can at least get down to 270-290 watts with the 3080. I had one too, in 2020. Great card, 320-bit bus.

If it had had 16 or 20 GB of VRAM like it should have, it would have rendered all Ada cards with less VRAM useless, especially the 70-class cards with their garbage 192-bit bus. That's the reason the 3080 easily outperforms the 4070 at 4K.
 
Last edited by a moderator: