
NVIDIA Project Beyond GTC Keynote Address: Expect the Expected (RTX 4090)

Joined
Jun 20, 2022
Messages
302 (0.34/day)
Location
Germany
System Name Galaxy Tab S8+
Processor Snapdragon 8 gen 1 SOC
Cooling passive
Memory 8 GB
Storage 256 GB + 512 GB SD
Display(s) 2,800 x 1,752 Super AMOLED
Power Supply 10,090 mAh
Software Android 12
Only mentioning efficiency and not actual power consumption is a bad sign for end customers, while it is good news for the professional sector. The cards - already coming with a nice markup - will be even more expensive once their power needs are taken into consideration. The only good news: right now it's just an announcement. Undervolting tests will show whether this generation might still be a worthy upgrade. Time will tell. Right now, RTX 4000 is rather disappointing.
 
Joined
Jun 21, 2021
Messages
3,121 (2.49/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG UltraFine 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
VR HMD Oculus Rift S (hosted on a different PC)
Software macOS Sonoma 14.7
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
Only mentioning efficiency and not actual power consumption is a bad sign. The cards - already coming with a nice markup - will be even more expensive once their power needs are taken into consideration. The only good news: right now it's just an announcement. Undervolting tests will show whether this generation might still be a worthy upgrade. Time will tell. Right now, RTX 4000 is rather disappointing.

There are published specifications for power.

As is usual in all of these kinds of presentations, NVIDIA decides what to highlight and what to relegate to a press release or specification page on the website. It's not like they're going to prattle on for 4-5 hours about every single data point.

And any power specifications they publish would be for reference models or their limited Founders Edition cards.

I don't know if you realize this, but AIB partners often build models that exceed NVIDIA's standard specifications. NVIDIA builds in a buffer to let people extract extra performance.

It's really up to third-party reviewers to test individual cards to provide more useful real-world performance metrics. We have to wait a few days/weeks for those to trickle in.
 
Joined
Jan 27, 2015
Messages
1,715 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
I don't disagree with your take, because neither of us is wrong or right; everyone has to do their own research and decide, based on a number of factors, whether a $900 GPU is right for them. A car is a necessity for most people, a GPU is a commodity and much less important, but I don't necessarily look at cars any differently: I decide what I want to spend and buy whatever offers me the most in that price range.

Oh, everyone does that, but there's a lot of analysis people do off the cuff, like the housing example. Go back to the 1960s and houses were 1,300 sq ft, but people will compare the average price then to today, when the average is upwards of 2,200 sq ft. All these companies (for everything) keep trying to move upscale and basically ignore the lower end. If I were to put this in a social context, which I don't normally like to do, they are catering to the upper middle class and ignoring the bottom 70% or so.
 
Joined
Sep 10, 2018
Messages
6,921 (3.05/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
Only mentioning efficiency and not actual power consumption is a bad sign. The cards - already coming with a nice markup - will be even more expensive once their power needs are taken into consideration. The only good news: right now it's just an announcement. Undervolting tests will show whether this generation might still be a worthy upgrade. Time will tell. Right now, RTX 4000 is rather disappointing.

Well, 450 W for a 4090 that doubles the performance of a 3090 Ti at the same wattage is impressive gen on gen. My guess is the 4080s will be 350-400 W for the 16 GB variant and 300-350 W for the 12 GB variant... I am a little puzzled that the 4090 only seems around 20-30% faster than the 4080 16 GB card, though the specs seem much higher, with around 60% more CUDA cores.

I'm also very interested in how the 12 GB variant performs and whether it warrants its $900 price tag.

Maybe I missed it in the keynote, but I'm also wondering if DLSS 3.0 is exclusive to Ada.
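
To put rough numbers on that, here's a quick spec-sheet sketch using NVIDIA's announced figures (CUDA core counts, board power, MSRP); the 20-30% gap is only a read of the keynote charts, not a measured result:

```python
# Rough spec-sheet comparison from NVIDIA's announced figures (not measured data).
specs = {
    "RTX 4090":      {"cuda": 16384, "tgp_w": 450, "msrp": 1599},
    "RTX 4080 16GB": {"cuda": 9728,  "tgp_w": 320, "msrp": 1199},
    "RTX 4080 12GB": {"cuda": 7680,  "tgp_w": 285, "msrp": 899},
}

top = specs["RTX 4090"]
for name, s in specs.items():
    core_ratio = top["cuda"] / s["cuda"]
    print(f"{name:14}  4090 has x{core_ratio:.2f} the cores  "
          f"{s['tgp_w']} W  ${s['msrp']}")

# The 4090 has ~1.68x (about 68% more) CUDA cores than the 4080 16 GB, yet the
# keynote charts only imply a ~20-30% lead -- cores rarely scale linearly once
# clocks, memory bandwidth, and power limits come into play.
```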
 
Joined
Jun 3, 2013
Messages
38 (0.01/day)
When does GTA 6 come out? I mean... At first I was convinced I'd sell my 3080 and get a 4080. But GTAs usually push hardware to the limit in their own time...

So... Maybe I'll sit this one out and keep playing on my power-sipping 320 W 3080.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.65/day)
Location
Ex-usa | slava the trolls
Performance:

(attached performance chart)

NVIDIA introduces GeForce RTX 4090/4080 series, RTX 4090 launches October 12th for 1599 USD - VideoCardz.com
 
Joined
Nov 26, 2021
Messages
1,648 (1.50/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
The difference between the flagship and the next best GPU is the greatest it has ever been:
Well, 450 W for a 4090 that doubles the performance of a 3090 Ti at the same wattage is impressive gen on gen. My guess is the 4080s will be 350-400 W for the 16 GB variant and 300-350 W for the 12 GB variant... I am a little puzzled that the 4090 only seems around 20-30% faster than the 4080 16 GB card, though the specs seem much higher, with around 60% more CUDA cores.

I'm also very interested in how the 12 GB variant performs and whether it warrants its $900 price tag.

Maybe I missed it in the keynote, but I'm also wondering if DLSS 3.0 is exclusive to Ada.
I think you're mistaken. The 4080 will be much slower than the 4090. I'll quote from the story above this one on the front page:

The RTX 4090 is the world's fastest gaming GPU ... In full ray-traced games, the RTX 4090 with DLSS 3 is up to 4x faster compared to last generation's RTX 3090 Ti with DLSS 2. It is also up to 2x faster in today's games while maintaining the same 450 W power consumption.

The RTX 4080 16 GB has 9,728 CUDA cores and 16 GB of high-speed Micron GDDR6X memory, and with DLSS 3 is 2x as fast in today's games as the GeForce RTX 3080 Ti and more powerful than the GeForce RTX 3090 Ti at lower power
Notice that they only say it's faster than the 3090 Ti at lower power, and only 2x faster than the 3080 Ti if you use DLSS 3, whereas the 4090 is 2x faster than the 3090 Ti in today's games without DLSS 3. This suggests that the 4080 16 GB isn't more than 25-30% faster than the 3090 Ti, which makes it much slower than the 4090. Anyone spending that much money should go for the 4090. The 4080 16 GB is only there to upsell whales to the 4090.
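
As a quick sanity check on that inference (the 2x figure is NVIDIA's marketing claim; the 25-30% figure is the estimate above), the implied gap between the two cards works out to roughly 55-60%:

```python
# Implied 4090 vs 4080 16 GB gap, taking the claims above at face value:
# "2x the 3090 Ti" is NVIDIA's marketing figure; 1.25-1.30x is the guess for
# the 4080 16 GB based on how the press release is worded.
ratio_4090_vs_3090ti = 2.0
for ratio_4080_vs_3090ti in (1.25, 1.30):
    implied_gap = ratio_4090_vs_3090ti / ratio_4080_vs_3090ti
    print(f"4080 16GB at {ratio_4080_vs_3090ti:.2f}x a 3090 Ti -> "
          f"4090 would be ~{implied_gap:.2f}x the 4080 16GB")
# Prints ~1.60x and ~1.54x, i.e. the 4090 ends up 54-60% ahead of the 4080 16 GB.
```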
 
Joined
Feb 23, 2019
Messages
6,069 (2.88/day)
Location
Poland
Processor Ryzen 7 5800X3D
Motherboard Gigabyte X570 Aorus Elite
Cooling Thermalright Phantom Spirit 120 SE
Memory 2x16 GB Crucial Ballistix 3600 CL16 Rev E @ 3800 CL16
Video Card(s) RTX3080 Ti FE
Storage SX8200 Pro 1 TB, Plextor M6Pro 256 GB, WD Blue 2TB
Display(s) LG 34GN850P-B
Case SilverStone Primera PM01 RGB
Audio Device(s) SoundBlaster G6 | Fidelio X2 | Sennheiser 6XX
Power Supply SeaSonic Focus Plus Gold 750W
Mouse Endgame Gear XM1R
Keyboard Wooting Two HE
Lol, those prices are nuts.
 
Joined
Sep 10, 2018
Messages
6,921 (3.05/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
The difference between the flagship and the next best GPU is the greatest it has ever been:

I think you're mistaken. The 4080 will be much slower than the 4090. I'll quote from the story above this one on the front page:


Notice that they only say it's faster than the 3090 Ti at lower power, and only 2x faster than the 3080 Ti if you use DLSS 3, whereas the 4090 is 2x faster than the 3090 Ti in today's games without DLSS 3. This suggests that the 4080 16 GB isn't more than 25-30% faster than the 3090 Ti, which makes it much slower than the 4090. Anyone spending that much money should go for the 4090. The 4080 16 GB is only there to upsell whales to the 4090.

That's why I can't wait for reviews to see what real-world performance looks like: no DLSS vs. no DLSS, not a 3090 Ti without DLSS vs. a 4090 with DLSS 3.0, which is a stupid comparison.
 
Joined
Jun 21, 2021
Messages
3,121 (2.49/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG UltraFine 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
VR HMD Oculus Rift S (hosted on a different PC)
Software macOS Sonoma 14.7
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
When does GTA 6 come out? I mean... At first I was convinced I'd sell my 3080 and get a 4080. But GTAs usually push hardware to the limit in their own time...

So... Maybe I'll sit this one out and keep playing on my power-sipping 320 W 3080.

Perhaps you should wait until GTA 6 comes out. However, I'm pretty sure that Rockstar realizes that if GTA 6 only runs acceptably on a 4090, they aren't going to sell many copies.

Say, doesn't GTA usually debut on consoles?
 
Joined
Aug 10, 2008
Messages
294 (0.05/day)
Location
Saigon city
System Name Kurise PC
Processor i7 5820k 4,7ghz / Ryzen 1700x 4ghz / 8700k
Motherboard Asus X99 deluxe / MSI x370 gaming pro carbon / z370i strix
Cooling EK evo, xspc slim 360 rad, D5 pump, dual alpha cool GPU mono block, dual xspc 240 radiator, DDC 18w
Memory Crucial sport white 16gb x 8 128gb 2666mhz/ Crucial sport white 16gb x 4 64gb 2933 / ddr4 chinese 32
Video Card(s) GTX 1080Ti SLI / HP gtx 1080 SLI 1850/1520 / 2080ti ref
Storage Lite on 512GB x 3 / Plextor m2 256gb / samsung 970 evo
Display(s) AOC I2769Vm, AOC U3477PQU, AOC I2769Vm / Koios 40''/ eizo EV2730QFX 1:1
Case Xigmatek Elysium / Corsair 750D / Bitfenix prodigy M
Audio Device(s) creative blaster ZX / Blaster ZXR / Blaster x7 lmt + burson v5i upgraded
Power Supply Be Quiet 1200 / Thermaltake toughpower 1200w / chinese 750w sfx PSU
Mouse Asus Echelon/ steelseries black ops II/ james donkey
Keyboard Cm storm quickfire pro / Fire rose steampunk kb/ corsair k70
Software Windows 10
The 3090 Ti struggles at 1080p with Cyberpunk (it doesn't even reach 80-100 fps), so the 4090 might be able to play at 1440p at 80-100 fps with RT on. That's my best bet; still, NVIDIA and AMD can't provide anything with ray tracing at 4K above 80 fps until 2025.
 
Joined
Sep 10, 2018
Messages
6,921 (3.05/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
Perhaps you should wait until GTA 6 comes out. However, I'm pretty sure that Rockstar realizes that if GTA 6 only runs acceptably on a 4090, they aren't going to sell many copies.

Say, doesn't GTA usually debut on consoles?

Yeah, with a PC port a year later with minor improvements.
 
Joined
Jan 27, 2015
Messages
1,715 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
... the 4090 is 2x faster than the 3090 Ti in today's games without DLSS 3...

That's the comment they made, and I'm wondering if it translates to other SKUs.

I doubt it, for some of the reasons you stated. But if it does, then it'll be huge. It would imply that at 4K (which they're likely talking about) a 4060 would perform like a 3080, and a 4070 would outperform anything from Ampere.

But if they follow their standard pattern for the last couple of releases, it is more likely a 4060 will get a 30% bump, meaning it will perform like a 3060 Ti at 4K.
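
To make the two scenarios concrete, here is a tiny sketch for a hypothetical 4060; the Ampere reference points are rough, illustrative placeholders rather than measured numbers:

```python
# Two scenarios for a hypothetical "RTX 4060", using the RTX 3060 as the 4K
# baseline (= 1.0). The 2x and ~1.3x multipliers come from the post above;
# the Ampere reference points are rough, illustrative placeholders only.
ampere_4k_relative = {"RTX 3060": 1.0, "RTX 3060 Ti": 1.3, "RTX 3080": 2.0}

scenarios = [
    ("NVIDIA's 2x claim carries down the stack", 2.0),
    ("typical ~30% gen-on-gen bump", 1.3),
]
for label, uplift in scenarios:
    hypothetical_4060 = ampere_4k_relative["RTX 3060"] * uplift
    nearest = min(ampere_4k_relative,
                  key=lambda card: abs(ampere_4k_relative[card] - hypothetical_4060))
    print(f"{label}: a 4060 lands at ~{hypothetical_4060:.1f}x a 3060, "
          f"roughly {nearest} territory")
```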
 
Joined
May 18, 2009
Messages
2,958 (0.52/day)
Location
MN
System Name Personal / HTPC
Processor Ryzen 5900x / Ryzen 5600X3D
Motherboard Asrock x570 Phantom Gaming 4 /ASRock B550 Phantom Gaming
Cooling Corsair H100i / bequiet! Pure Rock Slim 2
Memory 32GB DDR4 3200 / 16GB DDR4 3200
Video Card(s) EVGA XC3 Ultra RTX 3080Ti / EVGA RTX 3060 XC
Storage 500GB Pro 970, 250 GB SSD, 1TB & 500GB Western Digital / lots
Display(s) Dell - S3220DGF & S3222DGM 32"
Case CoolerMaster HAF XB Evo / CM HAF XB Evo
Audio Device(s) Logitech G35 headset
Power Supply 850W SeaSonic X Series / 750W SeaSonic X Series
Mouse Logitech G502
Keyboard Black Microsoft Natural Elite Keyboard
Software Windows 10 Pro 64 / Windows 10 Pro 64
Can't say I'm impressed. I'll happily keep playing with my 3080 and she should last me a good 5+ years, like my 980Ti did. By the time I need to move on to something new, if there aren't artificial GPU shortages again, I'll be looking to go with AMD or possibly Intel (if they decide to stick around and keep at it).
 
Joined
Jun 21, 2021
Messages
3,121 (2.49/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG UltraFine 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
VR HMD Oculus Rift S (hosted on a different PC)
Software macOS Sonoma 14.7
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
Yeah, with a PC port a year later with minor improvements.

So, if GTA 6 (hypothetically) debuts in 2023 followed by a PC port in 2024, it would be a PC title based on console hardware that debuted in late 2020 - nearly four-year-old graphics architecture by then.
 
Joined
Jun 3, 2013
Messages
38 (0.01/day)
Perhaps you should wait until GTA 6 comes out. However, I'm pretty sure that Rockstar realizes that if GTA 6 only runs acceptably on a 4090, they aren't going to sell many copies.

Say, doesn't GTA usually debut on consoles?
If you want to push for maximum settings at 4K and still have acceptable FPS...

I remember I had 2x HD7970 Matrix Platinum to run GTA V at maximum graphics...
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.65/day)
Location
Ex-usa | slava the trolls
That's why I can't wait for reviews to see what real-world performance looks like: no DLSS vs. no DLSS, not a 3090 Ti without DLSS vs. a 4090 with DLSS 3.0, which is a stupid comparison.

It's not about the reviews; maybe NVIDIA is leaving an empty slot for a card targeted against the imminent Radeon RX 7800 XT launch later in November.

The 3090 Ti struggles at 1080p with Cyberpunk (it doesn't even reach 80-100 fps), so the 4090 might be able to play at 1440p at 80-100 fps with RT on. That's my best bet; still, NVIDIA and AMD can't provide anything with ray tracing at 4K above 80 fps until 2025.

You can always lower the settings and game at 4K!
 
Joined
May 2, 2017
Messages
7,762 (2.81/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Definitely agree, but everything is getting more expensive, unfortunately: cars, phones, electricity, food... GPUs aren't immune to this, and personally I'd rather they go all out doubling performance vs. giving us 35% more performance at the same price and calling it a flagship. I'm as happy about the pricing as everyone else, but I also try to be realistic. Again, once reviews come out and competing RDNA3 cards are released, I'll decide how good or bad these cards are at a given price.
True - but most of those other industries haven't been driving up prices for years already, padding out margins (at least not to the degree that GPU makers have). So those industries are - to some extent - driving up prices due to higher costs and needing to maintain some semblance of stability, while GPUs are seeing price hike on price hike.

Of course this also comes down to GPUs and PC gaming becoming interesting to more people, including more rich people, and thus addressing a broader (and wealthier) market, plus a whole heap of other factors. But it's undeniable that GPU makers' margins have been increasing rapidly in the last few years.
I'm personally rooting for much better-priced AMD cards; I'm also not holding my breath.
Same here. I'd be happy to see that, but not surprised if they kept pace, sadly.
 
Joined
Jun 21, 2021
Messages
3,121 (2.49/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG UltraFine 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
VR HMD Oculus Rift S (hosted on a different PC)
Software macOS Sonoma 14.7
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
If you want to push for maximum settings at 4K and still have acceptable FPS...

I remember I had 2x HD7970 Matrix Platinum to run GTA V at maximum graphics...

Rockstar will write GTA 6 so it runs well on Xbox Series and PS5. The title will make most of its revenue from console sales.

My guess is that the later PC port will run pretty well on mid-tier graphics cards. After all, who would want to play it on a PC using an $800 graphics card if it's only marginally better graphics-wise than a $500 console?
 

Dux

Joined
May 17, 2016
Messages
511 (0.16/day)
$100 price increase from RTX 3090 to 4090. Greedvidia at it again.
 
Joined
Feb 6, 2021
Messages
2,899 (2.09/day)
Location
Germany
Processor AMD Ryzen 7 7800X3D
Motherboard ASRock B650E Steel Legend Wifi
Cooling Arctic Liquid Freezer III 280
Memory 2x16GB Corsair Vengeance RGB 6000 CL30 (A-Die)
Video Card(s) RTX 4090 Gaming X Trio
Storage 1TB Samsung 990 PRO, 4TB Corsair MP600 PRO XT, 1TB WD SN850X, 4x4TB Crucial MX500
Display(s) Alienware AW2725DF, LG 27GR93U, LG 27GN950-B
Case Streacom BC1 V2 Black
Audio Device(s) Bose Companion Series 2 III, Sennheiser GSP600 and HD599 SE - Creative Soundblaster X4
Power Supply bequiet! Dark Power Pro 12 1500w Titanium
Mouse Razer Deathadder V3
Keyboard Razer Black Widow V3 TKL
VR HMD Oculus Rift S
Software ~2000 Video Games
I always love new tech. The 4090 looks really good... but damn, selling a 4070 as a 4080 to justify 80-class pricing is ridiculous.

Let me guess: an actual board partner card in Europe after taxes will cost like €2,200+...
 
Joined
Jun 5, 2018
Messages
237 (0.10/day)
Prices are fine, I expected $2,000 or $2,500 for the 4090.

Hah, no, not for me. Jensen is completely out of touch with reality. $899/$1,199 for a 4080? I guess the "layering on top of Ampere" is now complete. Personally, I don't care how good these cards are; if NVIDIA is making me pay 900 bucks for the entry high end, I'm out. I can skip this gen.
 
Joined
Jun 21, 2021
Messages
3,121 (2.49/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG UltraFine 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
VR HMD Oculus Rift S (hosted on a different PC)
Software macOS Sonoma 14.7
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
I always love new tech. The 4090 looks really good... but damn, selling a 4070 as a 4080 to justify 80-class pricing is ridiculous.

Let me guess: an actual board partner card in Europe after taxes will cost like €2,200+...

I really think it's ill-advised to pass judgment on new PC hardware that hasn't been reviewed simply based on the nomenclature/model number pattern, especially since NVIDIA has frequently changed how they use these model numbers.

A wiser approach would be to wait for third-party PC reviewers to assess performance and then decide yourself on the value proposition that each product has in your market.

I've only visited Germany as a tourist so I don't know if residents have different ideas about how to buy things there. But that would seem to be a more sensible strategy than to look at a model number on a box.

At least here in the USA, Joe Consumer will heavily lean toward whatever is cheaper. Today's cards aren't mass-market models, though. Joe Consumer buys Toyota Celicas, not Mercedes-Benz S600s, or whatever.
 
Joined
Jul 4, 2018
Messages
120 (0.05/day)
Location
Seattle area, Wa
System Name Not pretty
Processor Ryzen 9 9950x
Motherboard Crosshair X870E
Cooling 420mm Arctic LF III, for now
Memory 64GB, DDR5-6000 cl30, G.Skill
Video Card(s) EVGA FTW3 RTX 3080ti
Storage 1TB Samsung 980 Pro (Win10), 2TB WD SN850X (Win11)
Display(s) old 27" Viewsonic 1080p, Asus 1080p, Viewsonic 4k
Case Corsair Obsidian 900D
Power Supply Super Flower
Benchmark Scores Cinebench r15, w/ 1680v2 @ 4.6ghz and XMP enabled, 1648 1680v2 @ 4.7ghz RAM @ stock 1333MT/s, 1696
Prices are in line with what?
The last xx80 tier card was $699.

About a week or two ago, rumors for the 4070 disappeared, then reappeared as the 12 GB 4080. Make no mistake, this is a renamed 4070 with a higher price tag.

I expected the 4090 to be $2,000 and the 16 GB 4080 to be about $1,000... but I see what NVIDIA did there. Why buy an AIB 16 GB 4080 when a 4090 FE will be just $100-200 more?

NVIDIA is trying to grab as much money as they can until worldwide economic conditions take a plunge. Also, shareholders don't know any better; they think high prices mean NVIDIA has a really good product that can command such a high price.
 