
NVIDIA Project Beyond GTC Keynote Address: Expect the Expected (RTX 4090)

Joined
Jan 14, 2019
Messages
12,344 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
No mention of TDP, and the prices are a slap in the face. I don't care about omniverses, either. What a disappointment!

Portal RTX looks cool, though.
 
Joined
Aug 25, 2021
Messages
1,170 (0.98/day)
For me, too. I always look for the next Radeon generation.
The question is whether AMD cares about its customers or not. I mean, we don't want an RX 7700 XT for $879 lol
It's going to be interesting to see the pricing of RDNA3 in November. Brace for some surprises, as there is a flood of GPUs on secondary markets now.
The 7700 XT is expected to perform similarly to or better than the 3090, which has ~69% better raster performance in relative terms. If the RDNA3 architecture brings this kind of uplift, which it should, then AMD needs to convince people to buy the 7700 XT instead of a used 3090 or a new 4070.

If you can buy a 3090 for less than $700 on eBay, AMD will need to offer a convincing price for the 7700 XT to attract buyers. It cannot cost more than a used 3090, so I expect $600-$650 max. But this is far away, in H1 2023. Things can change a bit.
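The logic here is easy to sanity-check. A quick sketch (all numbers hypothetical, just restating the post's assumption that a 7700 XT would match a used 3090 in raster performance):

```python
# Back-of-envelope perf-per-dollar check (hypothetical figures, only to
# illustrate the argument): if the new card performs the same as the used
# one, its perf/$ advantage is just the price ratio.
USED_3090_PRICE = 700  # assumed eBay price, USD

def perf_per_dollar_advantage(new_price: float, used_price: float = USED_3090_PRICE) -> float:
    """Percent perf/$ advantage of the new card over the used one,
    assuming equal performance."""
    return (used_price / new_price - 1) * 100

for price in (600, 650, 700, 750):
    print(f"7700 XT at ${price}: {perf_per_dollar_advantage(price):+.0f}% perf/$ vs used 3090")
```

At $650 the new card is only ~8% ahead on perf/$, which is roughly where warranty and lower power draw start to tip the scales; above $700 it loses outright.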
 
Last edited:
Joined
Dec 12, 2012
Messages
774 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?

The 4080s look pretty bad in games without ray-tracing. The 3090 Ti is only 25% faster than the 3080, and the "4080-12" is actually slower here. It must be the terrible memory bandwidth, considering how high GPU clocks are.

I will hold off until a game with some insane RT visuals comes out (that I actually want to play). My 3080 is more than enough for rasterization. Maybe they will drop the prices next year. No way they can keep this up after the 30 series is gone.
 
Joined
Jan 27, 2015
Messages
1,716 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
---------------------------------------------------
Just look at the size of the Aorus, it's ridiculous!

3090 Strix vs 4090 Strix :

[attachment: 1663710269225.png]
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.64/day)
Location
Ex-usa | slava the trolls
The 4080s look pretty bad in games without ray-tracing. The 3090 Ti is only 25% faster than the 3080, and the "4080-12" is actually slower here. It must be the terrible memory bandwidth, considering how high GPU clocks are.

I will hold off until a game with some insane RT visuals comes out (that I actually want to play). My 3080 is more than enough for rasterization. Maybe they will drop the prices next year. No way they can keep this up after the 30 series is gone.

Yeah, it just shows that it's really a 4060 Ti, because the 3060 Ti's memory is very similar.
And hey, what is a "4080 - 12" with a 192-bit bus? What's next? A 4060 with a 64-bit bus? lol
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.64/day)
Location
Ex-usa | slava the trolls
I don't think it's going to be that interesting; my guess is that the 7900 XT will undercut Nvidia by $100, release at a $1,499 MSRP, and call it a day.

It really depends on how AMD wants to manage its sales. It now has a golden chance to dismantle Nvidia completely and make its lineup DOA.
If I were AMD, I would launch an RX 7600 XT with "4080-12" performance for $649 and call it a day.
 
Joined
Jun 21, 2021
Messages
3,121 (2.48/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG UltraFine 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
VR HMD Oculus Rift S (hosted on a different PC)
Software macOS Sonoma 14.7
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
It really depends on how AMD wants to manage its sales. It now has a golden chance to dismantle Nvidia completely and make its lineup DOA.
If I were AMD, I would launch an RX 7600 XT with "4080-12" performance for $649 and call it a day.

NVIDIA has something like 80% of the discrete desktop graphics card market. I doubt that AMD can defeat NVIDIA with one card with sharp pricing.

At most, AMD has a chance of nibbling away at NVIDIA's market share.

Let's not forget that the NVIDIA GPU chip has excellent adoption as a compute engine (AI, etc.) and that NVIDIA's Gaming business is now smaller than their Data Center business.
 
Joined
May 2, 2017
Messages
7,762 (2.80/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
NVIDIA has something like 80% of the discrete desktop graphics card market. I doubt that AMD can defeat NVIDIA with one card with sharp pricing.

At most, AMD has a chance of nibbling away at NVIDIA's market share.

Let's not forget that the NVIDIA GPU chip has excellent adoption as a compute engine (AI, etc.) and that NVIDIA's Gaming business is now smaller than their Data Center business.
AMD catching up to Nvidia would need to be a concerted multi-year effort, true. But they're not going to get there at all unless they outprice Nvidia.

As for that last point, it has nothing to do with GeForce sales or desktop GPU market share.
 

DarkS0ul

New Member
Joined
Sep 17, 2022
Messages
3 (0.00/day)
The lack of the "new" DLSS 3.0 on the earlier RTX series is a spit in the face of players. I could understand series like RTX 2000 and RTX 3000 differing in performance by 10-30% depending on how strong the card is. But now it seems as if everything was planned from the beginning: the cryptocurrency boom is over, and suddenly we have technology available only on the latest series? After all, this is not a technological break like GTX vs RTX. So maybe it is a pre-planned process: a sudden, huge leap on one hand, and no (full) support within the same RTX family on the other?! It's like showing the middle finger to players.
We'll have to wait for reliable tests of the technology, but it looks like the current duopoly, and the cryptocurrency boom before it, has turned out to be deadly for players while handing Nvidia and AMD huge, disproportionate profits. Gaming on PC is becoming an abstraction.

If this is confirmed, I don't fully understand game producers and publishers either. They limit their potential number of customers with questionable technology. Currently, RT looks like a simple on/off switch for lighting, shadows, etc., yet many titles without RT looked great. A true technological leap would make sense, but this looks like a lack of support. It's as if you bought a new iPhone and after a year your iPhone 12 or 13 were no longer usable for basic functions (a performance drop of several dozen percent). With an update, your phone loses its usefulness within a year or two, and you have to switch to the new product after each launch because the life cycle is extremely short.

Someone will say: great, we have a huge performance increase. And I will say: the RTX 2000 series gave a very small increase in performance in games without RT, barely 7-10% at the same price as the previous series, and the few RT games had issues and mediocre performance. The GTX 1070-1080 Ti, by contrast, cost more than their predecessors but brought much greater performance gains.
For example: https://www.techpowerup.com/review/nvidia-geforce-gtx-1070/24.html - what we got for $379 ;)
Even in the next series, the fastest card, the $2,000 RTX 3090 Ti, only managed 25-50 fps at 4K with RT in some games. For $2,000! If the big leap in performance now comes only with DLSS 3.0, it would suggest that Nvidia, sitting on its cryptocurrency profits, didn't have to worry about what players think. One market (cryptocurrency) is over, so now, if you want to play, everyone "has" to buy the new series.
 
Joined
Jun 21, 2021
Messages
3,121 (2.48/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG UltraFine 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
VR HMD Oculus Rift S (hosted on a different PC)
Software macOS Sonoma 14.7
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
Reviews are gonna be tricky. No DLSS vs no DLSS, DLSS2 vs DLSS2, & DLSS2 vs DLSS3.

Graphics card reviewers have already been dealing with this for a while.

The better reviewers typically compare a number of games on pure 3D rasterization first. They then add some comparisons with RT on but no enhancements like DLSS or FSR. Finally, there are some comparisons between DLSS and FSR, particularly in the handful of titles that support both. With each passing week, more gaming titles support these two technologies.

And we will soon see the addition of XeSS to the mix.

It's not like hardware reviewers are being blindsided by DLSS 3.0.

For the most part, today's graphics cards have enough 3D rasterization performance on a single PCB, which is why NVLink wasn't even included on the Ada cards. Super sampling technologies like DLSS, FSR, and XeSS are most beneficial when ray tracing is enabled.
 
Joined
May 2, 2017
Messages
7,762 (2.80/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
No mention of TDP, and the prices are a slap in the face. I don't care about omniverses, either. What a disappointment!
Full specs are on their site - but it's rather telling that you have to go looking for them, and look pretty hard too - you have to click and scroll through several meaningless feature lists to get to some actual specs - making it seem like Nvidia really doesn't want to oversell these cards, at least for now (my guess is they won't be pushing this until 30 series supplies are much lower). Here's everything relevant that they've posted:
[attachment: 1663711782457.png]

[attachment: 1663711801813.png]

So, stock power limits match the 30 series, though there are strong indications that AIB partner cards will dramatically exceed them. FE coolers are huge chunguses, except for the 12GB which ... doesn't have one? Uh, okay? The 4080 12GB should be very, very noticeably slower than the 16GB.
 
Joined
Jun 21, 2021
Messages
3,121 (2.48/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG UltraFine 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
VR HMD Oculus Rift S (hosted on a different PC)
Software macOS Sonoma 14.7
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
AMD catching up to Nvidia would need to be a concerted multi-year effort, true. But they're not going to get there at all unless they outprice Nvidia.

As for that last point, it has nothing to do with Geforce sales nor desktop GPU marketshare.

To my knowledge, AMD Radeon has historically been less expensive than comparable GeForce products, at least over the past five years. On a performance-per-dollar basis in 3D rasterization, Radeon is the better buy at 1080p and 1440p. However, AMD's position weakens at 4K and 8K, as well as with other features like RT and DLSS/FSR.

With more gaming titles adopting RT and DLSS/FSR (and now XeSS), the performance comparison now takes multiple charts, tables, and graphs. The old paradigm of "run this 3D benchmark utility to compare scores" from five years ago is increasingly obsolete.

And cherry picking one or two games (MS Flight Simulator, Cyberpunk 2077, Red Dead Redemption, whatever) isn't particularly useful. The better analyses use a battery of games to average out architectural advantages and disadvantages of each manufacturer, whether it be DX11 vs. DX12 or other newer features like super sampling.
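For what it's worth, the "battery of games" summary most reviewers use boils down to a geometric mean of per-game relative FPS, which keeps one outlier title from skewing the average. A minimal sketch with made-up numbers:

```python
from statistics import geometric_mean

# Hypothetical per-game results: relative FPS of card A vs card B across a
# small battery of titles (1.0 = equal). The geometric mean is less swayed
# by a single architecture-friendly outlier than the arithmetic mean.
relative_fps = {
    "Title 1 (DX11)": 1.10,
    "Title 2 (DX12)": 0.95,
    "Title 3 (DX12)": 1.02,
    "Title 4 (RT-heavy)": 1.40,  # outlier favouring one architecture
}
arith = sum(relative_fps.values()) / len(relative_fps)
geo = geometric_mean(relative_fps.values())
print(f"arithmetic mean: {arith:.3f}")
print(f"geometric mean:  {geo:.3f}")
```

The geometric mean lands below the arithmetic one here precisely because the outlier counts for less, which is why it's the standard summary for relative-performance charts.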

I don't know about anyone else here but I happen to play some older titles on newer hardware, including games that aren't always part of a graphics card review.
 
Joined
Aug 25, 2021
Messages
1,170 (0.98/day)
I don't think it's going to be that interesting; my guess is that the 7900 XT will undercut Nvidia by $100, release at a $1,499 MSRP, and call it a day.
The highest tier of the market is not going to be interesting for most people, so there is no excitement in 7900 XT vs 4090. You are right.
More interesting will be the upper mid-range and mid-range. Do not forget that AMD will also offer DP 2.0 ports and a PCIe 5.0 x16/x8 interface, as well as more VRAM, for more future-proof products.

It really depends on how AMD wants to manage its sales. It has now a golden chance to dismantle nvidia completely and make its lineup DOA.
If I were AMD, I would launch an RX 7600 XT with "4080-12" performance for $649 and call it a day.
It seems to me that it's the 7800 that will try to compete with the 4080 12 GB, the 7800 XT with the 4080 16 GB, and the 7700 XT with the 4070.
 
Joined
Feb 20, 2019
Messages
8,288 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Those prices are garbage
I feel like Nvidia might be losing the plot here. We on TPU (according to the survey) definitely fit into the minority niche group of hardware enthusiasts. Something approaching 99% of the population will ask themselves "should I buy a graphics card for my PC, or should I buy a PS5/Xbox?" When the GPU costs so much more than a decent console and comes with the additional burden of needing a new PSU as well, it's not really relevant to the vast majority.

If, when the sub-$500 cards are launched, they can be run on a reasonable PSU that someone with a last-gen PC might have (so probably a 650-850W PSU), then we can tell for sure if Nvidia have lost the plot or not.
 
Joined
Jun 16, 2019
Messages
373 (0.19/day)
System Name Cyberdyne Systems Core
Processor AMD Sceptre 9 3950x Quantum neural processor (384 nodes)
Motherboard Cyberdyne X1470
Cooling Cyberdyne Superconduct SC5600
Memory 128TB QRAM
Storage SK 16EB NVMe PCI-E 9.0 x8
Display(s) Multiple LG C9 3D Matrix OLED Cube
Software Skysoft Skynet
My favourite part is where they decided to call the 4070 the 4080 12GB instead. That will sell more units. Watch the 4060 come out as the 4080 8GB.
 
Joined
Feb 20, 2019
Messages
8,288 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Full specs are on their site
The specs are kinda useless since Nvidia have once again redefined what units in the architecture count as "one CUDA core". Per CUDA core, Turing was vastly better than Ampere, but the numbers went up regardless. Until we get a full independent review of Lovelace vs Ampere, the core and clock numbers are meaningless because we don't know how potent each core is in relation to the current gen.
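The spec-sheet math behind those headline numbers is simple (core counts and boost clocks below are the official figures), and it shows exactly why they mislead: paper throughput assumes every "CUDA core" is equal across generations, which it isn't.

```python
# Paper FP32 throughput: cores x boost clock x 2 FLOPs per clock (one FMA).
# Comparable only within one architecture -- Nvidia has redefined what
# counts as "one CUDA core" between generations, so per-core potency differs.
def fp32_tflops(cuda_cores: int, boost_ghz: float) -> float:
    return cuda_cores * boost_ghz * 2 / 1000

print(f"RTX 3090 (spec): {fp32_tflops(10496, 1.70):.1f} TFLOPS")
print(f"RTX 4090 (spec): {fp32_tflops(16384, 2.52):.1f} TFLOPS")
```

A >2x jump on paper, but as the post says, how much of that survives in games depends on per-core behaviour that only independent reviews can establish.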

My favourite part is where they decided to call the 4070 the 4080 12GB instead. That will sell more units. Watch the 4060 come out as the 4080 8GB.
I'm in the queue for my low-profile, single-slot 4080 4GB DDR4 :D
 
Joined
Aug 25, 2021
Messages
1,170 (0.98/day)
AMD catching up to Nvidia would need to be a concerted multi-year effort, true. But they're not going to get there at all unless they outprice Nvidia.
Not just that, but RDNA3 cards will offer PCIe 5.0 connectivity, DP 2.0 and more VRAM. It should be an attractive package. We shall see.
 
Joined
Oct 27, 2020
Messages
791 (0.53/day)
Full specs are on their site - but it's rather telling that you have to go looking for them, and look pretty hard too - you have to click and scroll through several meaningless feature lists to get to some actual specs - making it seem like Nvidia really doesn't want to oversell these cards, at least for now (my guess is they won't be pushing this until 30 series supplies are much lower). Here's everything relevant that they've posted:
View attachment 262383
View attachment 262384
So, stock power limits match the 30 series, though there are strong indications that AIB partner cards will dramatically exceed them. FE coolers are huge chunguses, except for the 12GB which ... doesn't have one? Uh, okay? The 4080 12GB should be very, very noticeably slower than the 16GB.
I expect around -18% at QHD and a little bit more at 4K.
The reference 3090 with a 350 W TBP needed two PCIe 8-pins, and the 4080 with 320 W needs three?
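The connector math behind that question, going by the usual PCIe CEM ratings (150 W per 8-pin, 75 W from the slot), says two 8-pins already cover 320 W on paper; a third would presumably be headroom for transient spikes and AIB overclocks. A quick sketch:

```python
# Maximum board power deliverable per the PCIe CEM connector ratings:
# each 8-pin auxiliary connector is rated for 150 W, the x16 slot for 75 W.
def max_board_power(eight_pin_count: int) -> int:
    return eight_pin_count * 150 + 75

print(max_board_power(2))  # 375 W: enough, on paper, for a 350 W 3090 or 320 W 4080
print(max_board_power(3))  # 525 W: headroom for transients and overclocking
```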
 
Joined
Sep 10, 2018
Messages
6,926 (3.05/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
That's what VAT does to you when comparing to US MSRPs, which are always without tax. Germany even has it pretty good at 19%; in Norway and Sweden it's 25%.

Still, that's like $1,730 vs €1,950... At least the 4080 16GB is getting the same cooler as the 4090 and not the super-gimped version they used on everything below the 3090 last gen.
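The VAT math is easy to sketch (assuming rough USD/EUR parity, as in autumn 2022; the exchange rate here is illustrative, and actual EU prices also bake in Nvidia's own regional adjustment):

```python
# US MSRPs exclude sales tax; EU shelf prices include VAT.
# German VAT (19%) alone explains most of the sticker-shock gap.
def eu_shelf_price(us_msrp: float, vat: float = 0.19, usd_per_eur: float = 1.00) -> float:
    return us_msrp / usd_per_eur * (1 + vat)

print(f"4080 16GB: $1199 -> ~{eu_shelf_price(1199):.0f} EUR incl. 19% VAT")
print(f"4090:      $1599 -> ~{eu_shelf_price(1599):.0f} EUR incl. 19% VAT")
```

That puts the 4090 at roughly €1,900 before any regional markup, which is close to the quoted €1,950.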
 
Last edited:
Joined
Jan 14, 2019
Messages
12,344 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Full specs are on their site - but it's rather telling that you have to go looking for them, and look pretty hard too - you have to click and scroll through several meaningless feature lists to get to some actual specs - making it seem like Nvidia really doesn't want to oversell these cards, at least for now (my guess is they won't be pushing this until 30 series supplies are much lower). Here's everything relevant that they've posted:
View attachment 262383
View attachment 262384
So, stock power limits match the 30 series, though there are strong indications that AIB partner cards will dramatically exceed them. FE coolers are huge chunguses, except for the 12GB which ... doesn't have one? Uh, okay? The 4080 12GB should be very, very noticeably slower than the 16GB.
I guess there's no 4080 12 GB FE, only AIB cards. It really should have been called the 4070. I also guess it would have looked really bad to release a 285 W x70-series card; hence the two vastly different 4080s. I'm curious what the x60 and x50 tiers will look like, and how the 4070 will be positioned if there's going to be one.

I still hold my position - not impressed.
 