
NVIDIA GeForce RTX 3080 Founders Edition

Joined
Sep 25, 2007
Messages
5,966 (0.95/day)
Location
New York
Processor AMD Ryzen 9 5950x, Ryzen 9 5980HX
Motherboard MSI X570 Tomahawk
Cooling Be Quiet Dark Rock Pro 4(With Noctua Fans)
Memory 32Gb Crucial 3600 Ballistix
Video Card(s) Gigabyte RTX 3080, Asus 6800M
Storage Adata SX8200 1TB NVME/WD Black 1TB NVME
Display(s) Dell 27 Inch 165Hz
Case Phanteks P500A
Audio Device(s) IFI Zen Dac/JDS Labs Atom+/SMSL Amp+Rivers Audio
Power Supply Corsair RM850x
Mouse Logitech G502 SE Hero
Keyboard Corsair K70 RGB Mk.2
VR HMD Samsung Odyssey Plus
Software Windows 10
The only time I think you really have to worry about power is in cases like the RX 480 and Vega (for example), where some cards ship with a core voltage so high that it causes you to hit the power limit and throttle, and you have to undervolt (to reduce power draw) to get more performance. That's the only real case I can think of where it matters.
 
Joined
Dec 22, 2011
Messages
3,890 (0.83/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
If you're truly worried about the power consumption, play games on your phone.

Fermi = bad, 290X = good, 295X2 = perfectly acceptable, Vega = underclocker's dream.

It's the years they have to wait for the same performance which drives them crazy, I think.
 
Joined
Oct 25, 2018
Messages
1,432 (0.65/day)
Location
SortOfGrim
System Name Merc v8| Project DRS
Processor Ryzen 7800X3D|5800X
Motherboard Gigabyte X670 Gaming X AX|B550 AORUS ELITE
Cooling Noctua NH-D15|Arctic LFII-360
Memory 2x 16GB-6000|2x16GB-3600
Video Card(s) MSI RTX 4070 Ti SUPER|RTX 3070 Ventus
Storage Solidigm P44 1TB & 2TB, Crucial MX500 2TB|2x1TB SSD
Display(s) Acer Predator XB271HU|LG 34GP950G
Case Caselabs Mercury S8|Phanteks G500A
Audio Device(s) Schiit Magni & Modi, Edifier S351DB|DT 770 PRO headphones
Power Supply Seasonic Vertex PX-850|Prime TX-750
Mouse Logitech G600
Keyboard Epomaker F75|Logitech K400
Software W11Pro
I can't wait to play RDR2 at 90 ish fps. Although seeing the tear-down, considering I'm gonna water cool it, I might wait for one of the board partners.
W1zzard, do you already have some of board partners cards?
 
Joined
Feb 18, 2017
Messages
688 (0.24/day)
The only time I think you really have to worry about power is in cases like the RX 480 and Vega (for example), where some cards ship with a core voltage so high that it causes you to hit the power limit and throttle, and you have to undervolt (to reduce power draw) to get more performance. That's the only real case I can think of where it matters.
Well, if you consider there was a node change, that extra 80-90 W of consumption can hardly be defended. Just check these two graphs:



I know the 980-1080 comparison is at 1440p, but 4K shows the same difference (63% instead of 64%).

And regarding performance (the increase between the 980 Ti and 1080 is around 37%, while between the 2080 Ti and 3080 it's around 31-32%), you should note the OC headroom of the two cards: the 1080 could be overclocked by 13%, whereas the 3080 manages only around 4%. So there is a roughly 10-percentage-point difference between the two.

Fermi = bad, 290X = good, 295X2 = perfectly acceptable, Vega = underclocker's dream.

It's the years they have to wait for the same performance which drives them crazy, I think.
I don't see you laughing at the 3080's power consumption or its small efficiency gain. I'm quite sure you did with Vega, or even the RX 480. :)
 
Joined
Mar 14, 2014
Messages
1,375 (0.35/day)
Processor i7-4790K 4.6GHz @1.29v
Motherboard ASUS Maximus Hero VII Z97
Cooling Noctua NH-U14S
Memory G. Skill Trident X 2x8GB 2133MHz
Video Card(s) Asus Tuf RTX 3060 V1 FHR (Newegg Shuffle)
Storage OS 120GB Kingston V300, Samsung 850 Pro 512GB , 3TB Hitachi HDD, 2x5TB Toshiba X300, 500GB M.2 @ x2
Display(s) Lenovo y27g 1080p 144Hz
Case Fractal Design Define R4
Audio Device(s) AKG Q701's w/ O2+ODAC (Sounds a little bright)
Power Supply EVGA Supernova G2 850w
Mouse Glorious Model D
Keyboard Rosewill Full Size. Red Switches. Blue Leds. RK-9100xBRE - Hate this. way to big
Software Win10
Benchmark Scores 3DMark FireStrike Score : needs updating
For the 3080, Full HD really is more of an academic resolution; I doubt anyone will buy that card for 1080p
I mean, RDR2 says otherwise. Yeah, it's 100 fps, but there are 165 Hz 1440p displays. Let's see what full-blown CP2077 thinks of the 3080.
 
Joined
Dec 22, 2011
Messages
3,890 (0.83/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
B-fucking-real said:
I don't see you laughing at the 3080's power consumption or its small efficiency gain. I'm quite sure you did with Vega, or even the RX 480. :)

Hmm? Its performance per watt seems to top the charts at 4K?

Vega was a late 1080 competitor, and Polaris was super juicy too; as those graphs you just posted show, the RX 590 was soooooo bad.
 
Joined
Jun 28, 2018
Messages
299 (0.13/day)
Comparing this with Vega is tremendous intellectual dishonesty. If we take things out of context, they lose their meaning.

Vega's consumption was criticized because of the performance-per-watt ratio it offered: it needed about 300 W to match an Nvidia card delivering equivalent performance at 170-180 W. If the Vega 64 had humiliated the 1080 Ti, you can be sure its consumption would have been talked about far less.

If AMD presents something similar to the 3080 that consumes much less power, you will see the same criticisms that were once aimed at Vega, just as Intel's CPU power consumption is criticized against Ryzen at the moment.
 
Joined
Mar 9, 2020
Messages
80 (0.05/day)
There's no way i'm buying ANYTHING from someone who wears a leather jacket in the kitchen. :kookoo:
 
Joined
Feb 18, 2017
Messages
688 (0.24/day)
Hmm? Its performance per watt seems to top the charts at 4K?

Vega was a late 1080 competitor, and Polaris was super juicy too; as those graphs you just posted show, the RX 590 was soooooo bad.
Have I said it's not the leader in the chart? Some were trying to defend the 3080's efficiency by saying it leads the chart. My point was that, compared to the 980-to-1080 efficiency jump, the 3080's gain over the 2080 is nowhere near it (55% vs. 17%); in fact, it's only a little better than the 1080-to-2080 gain (12%). When the 3070 or 3060 arrive, they will almost certainly take the lead from the 3080, and judging by the TDPs, I assume even the 3090 will. And don't forget RX 5000 caught up with NV: the 5600 XT even took first place, nearly equalling the leading 1650 at FHD and surpassing it at 1440p, and the RX 5700 was also better than its rivals, the 2070 and 2070S, in perf/W. So there is a good chance they bring efficient cards again.
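For what it's worth, the generation-to-generation efficiency deltas being argued about here fall out of just two numbers each: relative performance and board power. A rough sketch in Python, using approximate relative-performance figures and official TDPs (all values here are ballpark placeholders, not W1zzard's measured data):

```python
# Approximate perf/W gain of each new card over its predecessor.
# Tuples are (relative performance of the new card, old TDP, new TDP);
# all figures are ballpark, for illustration only.
gens = {
    "980 -> 1080":  (1.64, 165, 180),
    "1080 -> 2080": (1.35, 180, 215),
    "2080 -> 3080": (1.66, 215, 320),
}

def efficiency_gain(perf_gain, tdp_old, tdp_new):
    """Perf/W of the new card vs. the old one, in percent."""
    return (perf_gain * tdp_old / tdp_new - 1) * 100

for label, (perf, old_w, new_w) in gens.items():
    print(f"{label}: {efficiency_gain(perf, old_w, new_w):+.0f}% perf/W")
```

With these inputs, the 980-to-1080 jump lands around +50% while 2080-to-3080 lands near +12%, in the same ballpark as the percentages quoted above; swap in the exact chart numbers to reproduce them precisely.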

And on a personal note: may I ask you not to play with my name? Thank you! :)
 
Joined
Feb 13, 2017
Messages
143 (0.05/day)
People should forget about this card's performance, power efficiency and price/performance at 1080p and 1440p - this card is made for 4K, unless you wanna get 500 fps @ 1080p in your favourite online shooter. Period.

Except for the ridiculously low VRAM size: 10GB is not enough for 4K now and won't be in the future!

About the card: good performance bump, but the VRAM size (to be fixed with a 20GB Ti variant), power usage and heat are pathetic. I have an HTPC, so I do care about thermals; looking forward to RDNA2 and its promised superior efficiency. If it underperforms, then a 3070 Ti 16GB (being more efficient in theory, at 220 W) could be the deal.
 
Joined
Dec 26, 2006
Messages
3,806 (0.58/day)
Location
Northern Ontario Canada
Processor Ryzen 5700x
Motherboard Gigabyte X570S Aero G R1.1 BiosF5g
Cooling Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm
Memory Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A)
Video Card(s) AMD RX 6800 - Asus Tuf
Storage Kingston KC3000 1TB & 2TB & 4TB Corsair MP600 Pro LPX
Display(s) LG 27UL550-W (27" 4k)
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220-VB
Power Supply SuperFlower Leadex V Gold Pro 850W ATX Ver2.52
Mouse Mionix Naos Pro
Keyboard Corsair Strafe with browns
Software W10 22H2 Pro x64
Wow that is a crazy amount of transistors!!!!!

~50% to almost 100% performance hit with RTX, wow.

Need a relative performance per transistor chart.

The frame-time charts are not very friendly to someone with a red/green colour deficiency. Orange and blue, or something with more contrast, would be way easier to distinguish.

Nice review.

I will wait for the CAD $350-range budget cards though.
 

gabiroli

New Member
Joined
Sep 17, 2020
Messages
2 (0.00/day)
I think the people who are impressed by this card are blinded by Nvidia. This card is only impressive because the 20 series was way overpriced... $1200 for a 2080 Ti... meanwhile the 2080 was a rehashed 1080 Ti.
A 20-30% improvement over the 2080 Ti is still good, especially for the money. But if the 2080 Ti had been $700 like the 1080 Ti was, this card would be irrelevant as well. 0GB VRAM is very low... for 4k
Think about it: the 1080 Ti had 11GB almost 4 years ago. It is enough today... but it won't be enough 2 years from now. It's planned obsolescence. One of the selling features of this GPU is NVCache, which takes advantage of VRAM - which this GPU doesn't have ANY spare of at 4K anyway. All major games will take advantage of it, since the PS5 and XbSX will have it too...
 
Joined
Mar 20, 2019
Messages
118 (0.06/day)
Processor R7 5800X
Motherboard Asus Rog Strix B550 I
Cooling Fractal Celsius
Memory 32
Video Card(s) MSI Ventus RTX 4080 OC
Storage Lots
Display(s) LG 4k, Dell 1440p
Case Fractal Nano S
Audio Device(s) Vintage
Power Supply EVGA 650 SFF
Mouse Pwnage SYM2
Keyboard EVGA Z15
I think the people who are impressed by this card are blinded by Nvidia. This card is only impressive because the 20 series was way overpriced... $1200 for a 2080 Ti... meanwhile the 2080 was a rehashed 1080 Ti.
A 20-30% improvement over the 2080 Ti is still good, especially for the money. But if the 2080 Ti had been $700 like the 1080 Ti was, this card would be irrelevant as well. 0GB VRAM is very low... for 4k
Think about it: the 1080 Ti had 11GB almost 4 years ago. It is enough today... but it won't be enough 2 years from now. It's planned obsolescence. One of the selling features of this GPU is NVCache, which takes advantage of VRAM - which this GPU doesn't have ANY spare of at 4K anyway. All major games will take advantage of it, since the PS5 and XbSX will have it too...
Very nice of you to make an account and warn the world about Nvidia. Thanks.
 
Joined
Apr 18, 2013
Messages
1,260 (0.30/day)
Location
Artem S. Tashkinov
I think the people who are impressed by this card are blinded by Nvidia. This card is only impressive because the 20 series was way overpriced... $1200 for a 2080 Ti... meanwhile the 2080 was a rehashed 1080 Ti.

A 20-30% improvement over the 2080 Ti is still good, especially for the money. But if the 2080 Ti had been $700 like the 1080 Ti was, this card would be irrelevant as well. 0GB VRAM is very low... for 4k

Think about it: the 1080 Ti had 11GB almost 4 years ago. It is enough today... but it won't be enough 2 years from now. It's planned obsolescence. One of the selling features of this GPU is NVCache, which takes advantage of VRAM - which this GPU doesn't have ANY spare of at 4K anyway. All major games will take advantage of it, since the PS5 and XbSX will have it too...

It's amazing you've just signed up to spoil the launch with your very valuable opinion. Also, remember, we live in a free society, so please stop it with "overpriced". Did NVIDIA put a gun against your head and ask you to buy any of their GPUs? No? Then how on Earth are they overpriced? Also, I agree, "0GB VRAM is very low... for 4k". Except this card features 10GB and we have close to zero games which actually require more than 8GB of VRAM at 4K. Also read the rest of my comment.

Speaking of "planned obsolescence" due to a lack of VRAM:



See how the GTX 1060 3GB still works relatively OK despite not having enough VRAM even 5 years ago. Yes, it's very slow, in fact 33% slower, but not 2 or 3 times as slow as its 6GB brother. Also, see how both cards are unusable at this resolution.

Except for the ridiculously low VRAM size: 10GB is not enough for 4K now and won't be in the future!

About the card: good performance bump, but the VRAM size (to be fixed with a 20GB Ti variant), power usage and heat are pathetic. I have an HTPC, so I do care about thermals; looking forward to RDNA2 and its promised superior efficiency. If it underperforms, then a 3070 Ti 16GB (being more efficient in theory, at 220 W) could be the deal.
  • Game developers target the most popular GPUs, and most of them contain 8GB of VRAM or even less. Consoles also won't really offer more than 8GB of VRAM, because their memory pool is shared between the OS core, game code and VRAM, and all of that has to fit into 16GB or less. And both consoles feature ultra-fast texture streaming off their storage.
  • Also, it's been shown time and again that NVIDIA has superb drivers and their GPUs' performance is not seriously affected even when there's not "enough" VRAM. E.g. Call of Duty: Modern Warfare eats VRAM for breakfast (over 9.5GB of VRAM use at 4K), yet are NVIDIA cards with 6GB of VRAM affected? Not at all. Besides, with RTX IO it all becomes moot.
  • Lastly, by the time 10GB of VRAM is not enough, this GPU's performance will be too low even if it had twice as much VRAM.
Still, you can always buy a future-proof GPU from AMD. BTW, do you remember the Radeon R9 Fury? It was released with a paltry 4GB of VRAM, which was strangely enough for AMD fans.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,147 (1.27/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
@W1zzard , you say undervolting isn't possible, can you elaborate on that? I'm thinking perhaps you refer to the boost curve editor in MSI AB that most would use to undervolt, but is it possible to set the power limit lower than 100%? Say 90%, which would effectively underclock and undervolt the card.

I'm just curious about the opinions on power consumption of those people who in the old days made fun of the 290X or 390X, or even the 480 or Vega cards. What do they say now about the extra 90 W it needs compared to the 2080, or about the efficiency increase from the 2080 to the 3080 versus the one from the 980 to the 1080?

OK, let's take Vega: the problem is that it was ~1 year late to the party in matching GTX 1080 performance, and on top of that it wanted a LOT more power. If it had wanted more power but offered more performance, it would have been a lot less of an issue IMO.

With the 3080, absolutely, it wants a lot of power, perhaps alarmingly so for some, but it's the undisputed top dog, totally unmatched for now, and it still managed to push the bar in efficiency; they've just ratcheted up the board power to deliver the big gains. Could it be better? Sure. But it's certainly not a repeat of Vega. If AMD swoops in now and beats its perf/watt ratio then it might look worse, but it was still first to the party.
 

gabiroli

New Member
Joined
Sep 17, 2020
Messages
2 (0.00/day)
I won't buy an AMD GPU. As a matter of fact, I have a 1080 Ti in my system right now. No matter what AMD brings to the table, I have to buy Nvidia for Moonlight streaming. That doesn't mean I have to like everything Nvidia tries to sell me. I'm not trying to spoil the launch for everyone, and I will be picking up an Nvidia graphics card that has adequate RAM... just not the 3080. Maybe the 3080 Ti. It still doesn't change the fact that this GPU is way overhyped now. Just to prove I am not a FANBOY: I'll take whichever side has the features and performance I need, and for remote streaming I prefer Moonlight right now.



It's amazing you've just signed up to spoil the launch with your very valuable opinion. Also, remember, we live in a free society, so please stop it with "overpriced". Did NVIDIA put a gun against your head and ask you to buy any of their GPUs? No? Then how on Earth are they overpriced? Also, I agree, "0GB VRAM is very low... for 4k". Except this card features 10GB and we have close to zero games which actually require more than 8GB of VRAM at 4K. Also read the rest of my comment.


  • Game developers target the most popular GPUs, and most of them contain 8GB of VRAM or even less. Consoles also won't really offer more than 8GB of VRAM, because their memory pool is shared between the OS core, game code and VRAM, and all of that has to fit into 16GB or less. And both consoles feature ultra-fast texture streaming off their storage.
  • Also, it's been shown time and again that NVIDIA has superb drivers and their GPUs' performance is not seriously affected even when there's not "enough" VRAM. E.g. Call of Duty: Modern Warfare eats VRAM for breakfast (over 9.5GB of VRAM use at 4K), yet are NVIDIA cards with 6GB of VRAM affected? Not at all. Besides, with RTX IO it all becomes moot.
  • Lastly, by the time 10GB of VRAM is not enough, this GPU's performance will be too low even if it had twice as much VRAM.
Still, you can always buy a future-proof GPU from AMD. BTW, do you remember the Radeon R9 Fury? It was released with a paltry 4GB of VRAM, which was strangely enough for AMD fans.

You do realise that the reason the PS5 and Xbox Series X come with PCIe Gen 4 SSDs is to use some of that storage as RAM and video RAM for games, right???

 

Joined
Apr 18, 2013
Messages
1,260 (0.30/day)
Location
Artem S. Tashkinov
@W1zzard , you say undervolting isn't possible, can you elaborate on that? I'm thinking perhaps you refer to the boost curve editor in MSI AB that most would use to undervolt, but is it possible to set the power limit lower than 100%? say 90% which would effectively underclock and undervolt the card.

Power limiting is possible and has been tested by computerbase.de: https://www.computerbase.de/2020-09/geforce-rtx-3080-test/6/

The card becomes a lot more power efficient once it's limited, which means NVIDIA has clocked it as high as possible to gain extra performance at the expense of efficiency. I expect both the RTX 3090 and the 3070 to be more power efficient than the RTX 3080, with the RTX 3070 the most efficient of the three.
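The efficiency gain from power limiting is easy to quantify: near the top of the voltage/frequency curve, performance scales sub-linearly with board power, so cutting the limit costs far less performance than it saves in watts. A toy Python model; the roughly -4% performance at a 270 W limit approximates what computerbase measured, while the indexed figures are purely illustrative:

```python
# Toy model: perf/W at stock vs. with a reduced power limit.
# Performance is indexed to 100 at stock; the ~4% loss at a
# 270 W limit approximates computerbase's measurement.
def perf_per_watt(perf, watts):
    return perf / watts

stock   = perf_per_watt(100.0, 320.0)  # 3080 FE at its 320 W default
limited = perf_per_watt(96.0, 270.0)   # ~-4% performance at 270 W

print(f"perf/W change from limiting: {(limited / stock - 1) * 100:+.0f}%")
```

A ~16% power cut for a ~4% performance cut works out to roughly +14% perf/W, which is why the card looks pushed past its sweet spot from the factory.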

Adequate ram

I'd like to see older cards with "inadequate" VRAM where the lack of it, and not the lack of GPU power, causes them to severely underperform in modern games.

Edit: found two, but both games use the idTech Vulkan engine, which probably could have been optimized better for high-res textures.
and

Edit 2: here's a very interesting write-up on VRAM usage: https://www.resetera.com/threads/vram-in-2020-2024-why-10gb-is-enough.280976/ Looks like monitoring tools are often reporting incorrect data.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,147 (1.27/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Power limiting is possible and has been tested by computerbase.de: https://www.computerbase.de/2020-09/geforce-rtx-3080-test/6/

The card becomes a lot more power efficient once it's being limited which means NVIDIA has clocked it as high as possible to gain extra performance at the expense of efficiency.

Ooh, many thanks for that link! Good news personally, as I intend to at least try and buy one; upgrading from a GTX 1080, the performance uplift is so big I would be happy to shave a little off the top for a cooler/quieter/less hungry card.
 
Joined
Apr 18, 2013
Messages
1,260 (0.30/day)
Location
Artem S. Tashkinov
Actually I've found a review with undervolting: https://www.gpureport.cz/recenze/26...ektivita-undervolting.aspx?article=268&page=9

-50W power consumption for the loss of 1fps in the Division 2 at 4K.

For my NVIDIA RTX 3080 Founders Edition graphics card, for example, it was 0.806 V at 1800 MHz. With this setup I not only saved a few tens of watts, degrees and decibels, but, what's more, I lost virtually no performance. This is also evidenced by the following video, in which you will also find instructions on how to perform undervolting on GeForce RTX 3080 graphics cards.
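The arithmetic behind that undervolt tells the same story as power limiting: the quoted result was roughly 50 W saved for about 1 fps lost in The Division 2 at 4K. A quick check in Python; the 63 fps stock frame rate and 320 W stock draw are assumed baselines for illustration, only the deltas come from the review quoted above:

```python
# Perf/W effect of the quoted undervolt (-50 W for roughly -1 fps).
# The stock 63 fps / 320 W baseline is an assumption for illustration.
stock_fps, stock_w = 63.0, 320.0
uv_fps, uv_w = stock_fps - 1.0, stock_w - 50.0

gain = (uv_fps / uv_w) / (stock_fps / stock_w) - 1
print(f"undervolted perf/W: {gain * 100:+.0f}% vs. stock")
```

With those assumptions the undervolt nets on the order of +17% perf/W, roughly in line with the power-limiting results.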

 

Stefem

New Member
Joined
Mar 17, 2019
Messages
12 (0.01/day)
As I expected: a +10~20% increase in perf/watt and +20~30% more performance than the 2080 Ti, far from the 1.9x declared in the NVIDIA slides. :laugh:
I guess they are really worried about the arrival of RDNA2
It isn't NVIDIA's fault if you can't read a chart, it's 1.9x at the same performance...
 
Joined
Apr 8, 2010
Messages
1,003 (0.19/day)
Processor Intel Core i5 8400
Motherboard Gigabyte Z370N-Wifi
Cooling Silverstone AR05
Memory Micron Crucial 16GB DDR4-2400
Video Card(s) Gigabyte GTX1080 G1 Gaming 8G
Storage Micron Crucial MX300 275GB
Display(s) Dell U2415
Case Silverstone RVZ02B
Power Supply Silverstone SSR-SX550
Keyboard Ducky One Red Switch
Software Windows 10 Pro 1909
First time I've seen a card so overkill for my needs that doesn't even tickle my upgrade itch.
Looking forward to single 8-pin cards in this generation and see if there is any magic in perf/watt at 1080/1440p
 
Joined
Nov 11, 2016
Messages
3,393 (1.16/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
First time I've seen a card so overkill for my needs that doesn't even tickle my upgrade itch.
Looking forward to single 8-pin cards in this generation and see if there is any magic in perf/watt at 1080/1440p

It's easy to kill the performance if you so wish though, DSR for example...
Fans of maximum efficiency can reduce the TGP of the 3080 to something like 180 W and still get 2080 Ti performance, though the 3070 can do that too at its 220 W TGP and costs less.
 
Joined
Apr 12, 2013
Messages
7,476 (1.77/day)
Vega's consumption was criticized, due to the performance/watt ratio offered
Vega was also on an inferior node & admittedly an inferior uarch, which compounded their problem. But what's clear is that there's still bias against AMD & towards Nvidia: JHH can launch a proverbial turd (remember Fermi?) & still get accolades, while AMD not only has to please the (gaming) audience but also pay them to do so :shadedshu:

I'm willing to bet that even if AMD comes within 5~15% of Nvidia's performance & perf/W, lots of users will still want their cards to be a lot cheaper!
 
Joined
Feb 13, 2012
Messages
523 (0.11/day)
Math, my man. :(

That is 46% faster. You realize that 2x = 100%, right? For example, if card A ran at 100 FPS and card B ran at 146 FPS, card B is 46% faster than card A. If it were "double", it would be 100%.
Actually, you are incorrect, my friend. 54% of the performance of the 1080 Ti means it performs about half as fast. It's 100/54, not 100 - 54.
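The two posters are doing different calculations on the same kind of number, and the distinction matters: "B has 54% of A's performance" and "A is 54% faster than B" are not the same claim. A quick sketch, using the 54% figure from the exchange above:

```python
# "Card B delivers 54% of card A's performance" -- how much faster is A?
share = 54.0                    # B's performance as a percentage of A's
ratio = 100.0 / share           # A is ~1.85x as fast as B ...
faster_pct = (ratio - 1) * 100  # ... i.e. ~85% faster, not 46%

print(f"A is {ratio:.2f}x as fast as B ({faster_pct:.0f}% faster)")
```

The subtraction reading (100 - 54 = 46%) only approximates the ratio when the gap is small; at around half the performance, the two readings diverge badly.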
 
Joined
Mar 20, 2019
Messages
118 (0.06/day)
Processor R7 5800X
Motherboard Asus Rog Strix B550 I
Cooling Fractal Celsius
Memory 32
Video Card(s) MSI Ventus RTX 4080 OC
Storage Lots
Display(s) LG 4k, Dell 1440p
Case Fractal Nano S
Audio Device(s) Vintage
Power Supply EVGA 650 SFF
Mouse Pwnage SYM2
Keyboard EVGA Z15
Vega was also on an inferior node & admittedly an inferior uarch, which compounded their problem. But what's clear is that there's still bias against AMD & towards Nvidia: JHH can launch a proverbial turd (remember Fermi?) & still get accolades, while AMD not only has to please the (gaming) audience but also pay them to do so :shadedshu:

I'm willing to bet that even if AMD comes within 5~15% of Nvidia's performance & perf/W, lots of users will still want their cards to be a lot cheaper!
Well, if it's slower it should be cheaper, no?
 