
NVIDIA Project Beyond GTC Keynote Address: Expect the Expected (RTX 4090)

Joined
May 31, 2016
Messages
4,437 (1.43/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800MHz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1V@2400MHz
Storage M.2 Samsung 970 Evo Plus 500GB / Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtek 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Nvidia in 2014: "Look, here's twice the performance for the same price."
Nvidia in 2022: "You need to pay more to get more. That's just the way it is."
WTF?
Exactly that WTF. I understand that prices can go up due to the economic turbulence we're currently in, but this is ridiculous, and using that angle to justify unjustified price hikes that NV has been preparing for the last few years is not OK. I think the whole Russia-Ukraine war and post-COVID situation makes it easier for companies to justify these hikes. But all of this is understandable from a company perspective: use whatever angle you can to make people pay more. What I don't understand is the people here who are OK with these hikes (a blind man would have noticed what NV has been doing since Turing) and justify them because the performance is higher. Yes, it's supposed to be higher every generation so you get more for the same price, but it has to be evident. If you get scraps as "more" instead, is that still OK? I disagree with the prices and with the angle that the economy is suffering and there's inflation. Just because there is a little choke in the economy, prices should not be twice as high. Everything is being turned upside down, and the worst part is that more and more people are OK with it. What a disgrace. AMD has a chance to attack NV hard on pricing, but they won't. They'll go the same price-hike route and price their products according to NV's. At least that's what I'm betting on.

About DLSS 3: do I understand correctly that only the 4000-series cards can use DLSS 3.0, or will Turing and Ampere be able to use it too?
 
Joined
May 2, 2017
Messages
7,762 (2.81/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Nvidia in 2014: "Look, here's twice the performance for the same price."
Nvidia in 2022: "You need to pay more to get more. That's just the way it is."
WTF?
Not only that, but the years previous to 2014 had seen relatively little price movement, while prices since then have already doubled or more.
I disagree with the prices and angle that economy is suffering and inflation. Just because there is a little choke in the economy, prices should not be twice as high.
This is a really, really important point, as there is a crucial difference between the current recession and previous ones: this one isn't caused by a systemic economic downturn from the bottom up (like the sub-prime mortgage crisis), but rather by the pandemic showcasing the precarity of our "efficient" (read: stripped to the bone) supply and value chains, and the capitalist class responding to the crisis by explicitly saying "Yes, this is a good opportunity to increase profits." What is the main driver of this recession? Is it a failing major industry, fundamental economic instability, an overactive economy, a loss of productivity? No, it's skyrocketing prices on basic goods and services and massive wealth accumulation, coupled with stagnant or dropping wages, leaving most people worse off and thereby slowing down the economy. That's also why increasing interest rates is so ineffective this time around: the crisis isn't caused by people borrowing too much or overspending and running themselves into the ground, but by the super-rich hoarding ever more money. Increasing interest rates won't solve that; only wealth taxes will. The staggering amounts of money stolen from public coffers through the consistent tax evasion and wage theft of the wealthy are the real cause of this recession, and Nvidia is exemplifying this with this launch. What a chilling economy like this needs is for corporations to accept lower profit margins and to properly take on their important function of providing living wages and useful products to people, two crucial, fundamental responsibilities that neoliberal late-stage capitalism does its utmost to let them ignore or deny.
 
Joined
Jan 14, 2019
Messages
12,337 (5.76/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
About DLSS 3: do I understand correctly that only the 4000-series cards can use DLSS 3.0, or will Turing and Ampere be able to use it too?
From another post here on TPU:


As I understand, the hardware is there right from Turing, but whether DLSS 3 works on it or not, we'll see.

Exactly that WTF. I understand that prices can go up due to the economic turbulence we're currently in, but this is ridiculous, and using that angle to justify unjustified price hikes that NV has been preparing for the last few years is not OK. I think the whole Russia-Ukraine war and post-COVID situation makes it easier for companies to justify these hikes. But all of this is understandable from a company perspective: use whatever angle you can to make people pay more. What I don't understand is the people here who are OK with these hikes (a blind man would have noticed what NV has been doing since Turing) and justify them because the performance is higher. Yes, it's supposed to be higher every generation so you get more for the same price, but it has to be evident. If you get scraps as "more" instead, is that still OK? I disagree with the prices and with the angle that the economy is suffering and there's inflation. Just because there is a little choke in the economy, prices should not be twice as high. Everything is being turned upside down, and the worst part is that more and more people are OK with it. What a disgrace. AMD has a chance to attack NV hard on pricing, but they won't. They'll go the same price-hike route and price their products according to NV's. At least that's what I'm betting on.
Let's be honest:
1. The economic recession didn't start with covid. It started with the way governments around the world reacted to it. They could have implemented sensible safety measures, but noooo... they had to press the big red stop button on the economy. They created an artificial supply chain problem that didn't have a reason to exist. Of course companies like Nvidia jumped on it to make profit.
2. With international companies stopping business with Russia, they are ridding themselves of a huge market. It's a loss that has to be recouped by artificial price hikes. Fewer products sold = higher margins per unit. All of this while Russia still gets everything through black/grey import and China. Who's the real loser in this situation? We are. No Western company should have ever stopped business with Russia.

We tend to say that we're so much more enlightened than we were during the world wars and before, yet here we are, suffering the costs of political ideology once again.
 
This is a really, really important point, as there is a crucial difference between the current recession and previous ones: this one isn't caused by a systemic economic downturn from the bottom up (like the sub-prime mortgage crisis), but rather by the pandemic showcasing the precarity of our "efficient" (read: stripped to the bone) supply and value chains, and the capitalist class responding to the crisis by explicitly saying "Yes, this is a good opportunity to increase profits." What is the main driver of this recession? Is it a failing major industry, fundamental economic instability, an overactive economy, a loss of productivity? No, it's skyrocketing prices on basic goods and services and massive wealth accumulation, coupled with stagnant or dropping wages, leaving most people worse off and thereby slowing down the economy. That's also why increasing interest rates is so ineffective this time around: the crisis isn't caused by people borrowing too much or overspending and running themselves into the ground, but by the super-rich hoarding ever more money. Increasing interest rates won't solve that; only wealth taxes will. The staggering amounts of money stolen from public coffers through the consistent tax evasion and wage theft of the wealthy are the real cause of this recession, and Nvidia is exemplifying this with this launch. What a chilling economy like this needs is for corporations to accept lower profit margins and to properly take on their important function of providing living wages and useful products to people, two crucial, fundamental responsibilities that neoliberal late-stage capitalism does its utmost to let them ignore or deny.
Everything you see nowadays: price hikes, the economy on the verge of collapse, people complaining about inflation. It's all there to make people poorer, because the elites noticed that they aren't wealthy enough, or that the gap between them and everyone else is narrowing. What I don't understand is the ignorance of the people around me. They literally don't care, since it hasn't touched them in a way that hurts yet. It's like touching a hot iron: most people have to touch it even though you tell them it's hot. Governments, in my opinion, do nothing to mitigate these economic chokes. Actually, they welcomed them, since it makes it easier to justify their own doings.
Consider electricity prices in Norway: 10x the normal price. Literally ten times more. Previously it was 2-3x, supposedly due to COVID (infected cables, I suppose). Now it's the war in Ukraine. There is always a "valid" reason the elites can use to justify their doings, and normal people have to pay for it so the elites can feel superior. And yet Norway has so much renewable energy, which people paid for in the first place so that it would contribute to the economy. Unfortunately, to get rich you have to take from someone else.
NV is doing this, and I'm sure AMD will follow. All because someone wants more power or more money, a lot of people have to pay for it.
 
Everything you see nowadays: price hikes, the economy on the verge of collapse, people complaining about inflation. It's all there to make people poorer, because the elites noticed that they aren't wealthy enough, or that the gap between them and everyone else is narrowing. What I don't understand is the ignorance of the people around me. They literally don't care, since it hasn't touched them in a way that hurts yet. It's like touching a hot iron: most people have to touch it even though you tell them it's hot. Governments, in my opinion, do nothing to mitigate these economic chokes. Actually, they welcomed them, since it makes it easier to justify their own doings.
Consider electricity prices in Norway: 10x the normal price. Literally ten times more. Previously it was 2-3x, supposedly due to COVID (infected cables, I suppose). Now it's the war in Ukraine. There is always a "valid" reason the elites can use to justify their doings, and normal people have to pay for it so the elites can feel superior. And yet Norway has so much renewable energy, which people paid for in the first place so that it would contribute to the economy. Unfortunately, to get rich you have to take from someone else.
NV is doing this, and I'm sure AMD will follow. All because someone wants more power or more money, a lot of people have to pay for it.
What the war in Ukraine has to do with energy price hikes in a country that basically produces its own energy from renewables is beyond me. Or is it?

But back to topic: I like the idea of the 4080 12 GB, I really do. But I can't justify spending nearly a grand on a graphics card alone. It's more like the price of a complete system overhaul for me - which I might do if AMD comes up with better pricing. Or maybe the Arc A770 ends up being dirt cheap (it'll have to be to compete), so maybe I'll just buy one and call it a day. At least it'll be something different.
 
Joined
Sep 10, 2018
Messages
6,912 (3.05/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
What the war in Ukraine has to do with energy price hikes in a country that basically produces its own energy from renewables is beyond me. Or is it?

But back to topic: I like the idea of the 4080 12 GB, I really do. But I can't justify spending nearly a grand on a graphics card alone. It's more like the price of a complete system overhaul for me - which I might do if AMD comes up with better pricing. Or maybe the Arc A770 ends up being dirt cheap (it'll have to be to compete), so maybe I'll just buy one and call it a day. At least it'll be something different.

The A770, even if priced well, will come with too many headaches to buy over a competing Ampere or RDNA2 GPU.

On a separate note, even with these crazily priced GPUs, I'll still be happy for the tech enthusiasts who pick them up this fall/winter.

It's a shame Nvidia is pricing a lot of people out, but I still wonder if that's down to them likely sitting on over a year of Ampere stock... The 4080 12GB looks like it will at best match a 3090 that's already priced about the same. Yes, you get DLSS 3.0, but other than RT that's basically zero progress at a similar price, so Turing 2.0 it seems. The next 4-6 months should be interesting.

As far as AMD goes, one of two things happened: either Nvidia caught wind of actual RDNA3 performance and wasn't very impressed, or they're super arrogant and will be blindsided by its performance... Hoping for the latter.
 
Joined
Feb 9, 2015
Messages
41 (0.01/day)
System Name Raistlin
Processor Ryzen 5 5600X
Motherboard MSI X470 Gaming Pro
Cooling Noctua NH-D15S with dual fans
Memory 32GB G.Skill 3600MHz DDR4 CL16 (F4-3600C16-16GTZNC)
Video Card(s) Nvidia RTX 3090 (MSI Suprim X)
Storage 1 x 960GB SX8200, 1 x 1TB SX8200, 1 x 2TB Seagate HDD
Display(s) LG 34GP950G, 2x DELL S2721D, LG 48" C2 OLED (OLED48C2PUA), HiSense 75U78KM (75" Mini-LED 4K TV)
Case Thermaltake Core X9
Audio Device(s) Topping E30 + Drop O2 Amplifer + Sennheiser HD 600 / HIFIMAN HE4XX / Sound BlasterX Katana
Power Supply EVGA SuperNova 1300 G2
Mouse Razer Naga Pro wireless
Keyboard Ducky One 2 full size
VR HMD HP Reverb G2
Software Windows 10 Professional
And you've thus also clearly had the privilege of having that hard work actually pay off, unlike a lot of people. Generally, the hardest-working people you'll find are those working two or three shit jobs, barely covering rent and basic living expenses. Please don't confuse being lucky enough to have things work out for you with less lucky people not having worked as hard, or not deserving good things.

Also, 6-tier GPUs are distinctly not Ferraris; they're supposed to be the Toyotas of the GPU world. And if supposedly cheap Toyotas start being priced like Ferraris, there's something seriously wrong going on.
I never said anything about anyone else's financial situations, how hard they worked, or anything else you ranted about. Don't put words in my mouth.
 
Everything you see nowadays, price hikes, the economy at the collapse, people whining about the inflation everything is to make people poor because those elites noticed that they are not wealthy enough or the difference between them is narrowing. What I dont understand is the ignorance of people around. They literally dont care since this has not touched them in the way it would hurt. It is like touching a hot iron. Most people have to touch it even though you tell them it is hot. Governments do nothing in my opinion to mitigate these economical chokes. Actually they welcomed these since it is easier to justify something of their doing.
Consider prices in Norway for electricity. x10 the normal price. Literally 10 times more. Previously it was due to covid x2 x3 or so. (infected cables I suppose) Now there is war in Ukraine. There is always a valid reason that will be used for the elites to justify their doings and normal people will have to pay for it. So these elites can feel superior. And yet Norway has so much renewable energy which people paid for in the first place in order for the renewable energy to contribute to the economy. So in order to get rich you have to take it from someone else unfortunately.
NV is doing this and I'm sure AMD will follow all because someone wants to have more power or more money a lot of people have to pay for it.
What the war in Ukraine has to do with energy price hikes in a country that basically produces its own energy from renewables is beyond me. Or is it?
This is surprisingly simple, really: Norway is tied into the EU's energy trade system through being an EEA member, and the EU's energy trade system is an incredibly dumb system built around an assumption that there will never, ever be an energy shortage. I mean, the concept is blatantly idiotic: Whoever places the highest bid for the most expensive form of energy sets the price for all kinds of energy for everyone (within their sales region). It's easy enough to understand why this was set up: to protect more expensive forms of energy production (which used to be renewables, but is now Russian gas), so that they wouldn't be left with no sales in periods of excess energy production, protecting the industry and allowing for innovation even if it isn't immediately profitable. The obvious problem with this arises as soon as there is even a tiny shortage, especially if said shortage is linked to a specific form of energy. Which is the current situation.

It would also be trivially simple to solve this: go from a "highest bid sets the universal price" system to a system that distributes the price difference from more expensive forms of energy onto cheaper ones, averaging out prices rather than raising them all to match the most expensive. Sure, this would be more complex than the current system (you wouldn't be able to set final prices until after all sales were settled), but it really wouldn't be that hard - it would just require active regulation. Instead, we're maintaining a system that now works explicitly to shovel money into the coffers of whoever is producing cheap energy, as they literally can't sell it anywhere near cost. (Of course, there's also a somewhat unique situation in southern Norway with a very dry summer and thus little water in hydroelectric reservoirs, which drives up costs through fear of a future shortage, alongside increases in consumption from increased electrification of industry and widespread adoption of electric cars, which have yet to be met by increased production of electricity.)
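The difference between the two clearing rules described above can be made concrete with a toy market model (all offers and numbers are invented for illustration; real European market coupling is far more complex than this):

```python
# Toy model of two clearing rules for an electricity market.
# Offers are (cost_per_MWh, capacity_MWh); all figures are made up.

def dispatch(offers, demand):
    """Run the cheapest generators first until demand is met."""
    used = []
    remaining = demand
    for cost, cap in sorted(offers):
        if remaining <= 0:
            break
        mwh = min(cap, remaining)
        used.append((cost, mwh))
        remaining -= mwh
    return used

def marginal_price(offers, demand):
    # Current rule: the most expensive generator that had to run
    # sets the price for every MWh sold ("highest bid sets the price").
    return max(cost for cost, _ in dispatch(offers, demand))

def average_price(offers, demand):
    # The alternative sketched above: spread total generation cost
    # over all MWh sold, averaging prices instead of raising them all.
    d = dispatch(offers, demand)
    return sum(cost * mwh for cost, mwh in d) / sum(mwh for _, mwh in d)

offers = [(5, 80), (30, 40), (200, 20)]   # hydro, wind, gas
print(marginal_price(offers, 130))  # 200 -- gas sets everyone's price
print(average_price(offers, 130))   # ~27.7 -- expensive gas averaged out
```

Under the marginal rule, a 10 MWh sliver of expensive gas reprices all 130 MWh sold; under the averaging rule it only nudges the blended price, which is exactly the point about cheap hydro being sold far above cost.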

As I understand, the hardware is there right from Turing, but whether DLSS 3 works on it or not, we'll see.
From what I gathered, DLSS 3 is reliant on a degree of FP8 support that no generation previous to Lovelace has. I mean, they're very explicit about previous generations only supporting DLSS2 on their site:
[Screenshot: Nvidia's DLSS support table]


But back to topic: I like the idea of the 4080 12 GB, I really do.
I think the 4080 12GB will probably be a good GPU in a vacuum - it's just such an insulting proposition alongside the 16GB, the 4090, and the 30 series. A $200 price increase - and this isn't even the 8-tier card, but an explicitly and significantly cut-down variant? The only "value" proposition of this is "hey, look, it's supposed to match the 3090 at a much lower price" - which of course ignores the fact that the 3090 was a stupidly overpriced card to begin with.

If they called this the 4070 Ti and sold it for $600, it would be fantastic even if that would still represent a significant per-tier price hike. Instead, they're framing this as "8-tier for those of you who can't afford the full-fat version", on top of a $200 price increase. It's just such an obvious slap in the face.

I never said anything about anyone else's financial situations, how hard they worked, or anything else you ranted about. Don't put words in my mouth.
I'm not saying that you did, but the rhetoric of "I worked hard for my wealth" can't be extracted from its inherent politics or implications. Claiming that such statements don't strongly imply that less wealthy people either aren't working as hard or are just less deserving overall is, IMO, impossible; after all, what you said strongly references meritocracy and the idea that there's a linear(ish) relation between work and rewards (both of which are pure and utter BS). Which is what I was pointing out. I'm not saying that you were explicitly saying this, but the implication is there whether you like it or not. There is no other logically sound argument to be extracted from what you said; after all, you were quite explicit in your "if it's too expensive, then this just isn't for you" approach. I'm not putting words in your mouth, just making the very strong implications of your rhetoric and point of view explicit. You're very welcome to argue that you mean something different, but that would require actual arguments. Beyond that, putting the two statements of your post together very strongly implies "well, if you can't afford this, you haven't been working hard enough, unlike me". That might not be what you intended to say, but it's right there in what you said regardless of intention.
 
Joined
Oct 10, 2018
Messages
147 (0.07/day)
I believe Turing and Ampere will get DLSS 3.0 support once FSR 3.0 is released. Also, frame generation is pointless to me: it's an illusion that increases latency. Nvidia's presentation slides contain detailed information. In the Cyberpunk video, they used Reflex together with DLSS 3.0 to decrease latency. Even with Reflex (version unknown) and DLSS set to performance mode, latency is almost the same as DLSS 2.0 without Reflex. I assume that without Reflex, DLSS 3.0 becomes a disaster, with latency around 80-100 ms. It's very disappointing. Meanwhile, FSR is getting a 2.1 update, and it's a nice improvement compared to 2.0.

[Nvidia presentation slides comparing latency]

From what I gathered, DLSS 3 is reliant on a degree of FP8 support that no generation previous to Lovelace has. I mean, they're very explicit about previous generations only supporting DLSS2 on their site:
Please look at that; it's evidence of what I said. Waiting for RDNA 3 to kick Nvidia off the throne.
 
I believe Turing and Ampere will get DLSS 3.0 support once FSR 3.0 is released. Also, frame generation is pointless to me: it's an illusion that increases latency. Nvidia's presentation slides contain detailed information. In the Cyberpunk video, they used Reflex together with DLSS 3.0 to decrease latency. Even with Reflex (version unknown) and DLSS set to performance mode, latency is almost the same as DLSS 2.0 without Reflex. I assume that without Reflex, DLSS 3.0 becomes a disaster, with latency around 80-100 ms. It's very disappointing. Meanwhile, FSR is getting a 2.1 update, and it's a nice improvement compared to 2.0.


Please look at that; it's evidence of what I said. Waiting for RDNA 3 to kick Nvidia off the throne.
I see that tweet as saying "they could run DLSS3 on Ampere, but without Lovelace's FP8 support, it would be so slow as to make it very laggy". Which, again, translates to it not having the required hardware to really run it. I mean, you can run any FP8 operation on an FP16 core, but unless that core can pack two FP8 operations into a single instruction, FP8 work runs no faster than FP16 work, leaving the older hardware at half the throughput of a core with native FP8 packing. In other words: it could technically run it, but at such a low performance level that it would be unusable. This is also a very strong argument for DLSS3 never being back-ported to previous architectures, unless they could make it work in FP16 at a similar performance level (which is highly unlikely).

If Lovelace delivers packed FP8 instructions (i.e. 1xFP32, 2xFP16, 4xFP8) through the tensor cores, then for anything actually running FP8 that is an instant 2x performance increase over previous generations running the same workload, with no real way of working around it.
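As a sanity check on that 2x figure, the throughput arithmetic can be sketched as follows (core counts, clocks, and per-clock issue rates are hypothetical, not published Ada specifications):

```python
# Hypothetical packed-issue rates: each halving of operand width
# doubles the number of operations a lane can issue per clock.
PACK = {"fp32": 1, "fp16": 2, "fp8": 4}

def peak_ops(cores, clock_hz, dtype):
    """Peak operations per second under the packed-issue assumption."""
    return cores * clock_hz * PACK[dtype]

# An older core without FP8 packing runs FP8 work through its FP16
# lanes, so FP8 throughput is capped at the FP16 rate...
older = peak_ops(cores=128, clock_hz=2.0e9, dtype="fp16")
# ...while a core with packed FP8 issues twice as many ops per lane.
newer = peak_ops(cores=128, clock_hz=2.0e9, dtype="fp8")
print(newer / older)  # 2.0
```

The ratio is independent of the made-up core count and clock: the 2x comes purely from fitting two FP8 operations where one FP16 operation went.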

This also doesn't really say that DLSS3 is necessarily laggy on Lovelace - as long as the FP8 can keep up, delivering that interpolated frame in the gap between the previous and next frame, then it should be completely fine. There's no reason why this should increase input lag unless the interpolated frame causes the next "real" frame to be delayed.
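The caveat in that last sentence can be illustrated with a toy frame-pacing model (timings invented for illustration, not Nvidia's actual pipeline). Because an interpolated frame needs both of its neighbours, each real frame has to be held back half an interval before it can be shown:

```python
# Toy timeline: one interpolated frame inserted into each gap between
# rendered frames. render_ms and frame counts are made-up numbers.

def timeline(render_ms, n_frames):
    """Return (display_time_ms, kind) for every presented frame."""
    shown = []
    for i in range(1, n_frames):
        done = (i + 1) * render_ms                    # frame i+1 finishes rendering
        shown.append((done, "interpolated"))          # midpoint frame between i and i+1
        shown.append((done + render_ms / 2, "real"))  # frame i+1, held half a frame
    return shown

frames = timeline(render_ms=16.0, n_frames=5)
gaps = [b[0] - a[0] for a, b in zip(frames, frames[1:])]
print(gaps)  # every presented gap is 8.0 ms: doubled frame rate
# ...but each real frame appears 8 ms after it finished rendering; that
# half-frame hold is exactly the "next real frame delayed" case.
```

So in this model the displayed frame rate doubles, while whether input lag grows depends on that half-interval hold on the real frames, which is the open question about Lovelace's pipeline.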
 
This is surprisingly simple, really: Norway is tied into the EU's energy trade system through being an EEA member, and the EU's energy trade system is an incredibly dumb system built around an assumption that there will never, ever be an energy shortage. I mean, the concept is blatantly idiotic: Whoever places the highest bid for the most expensive form of energy sets the price for all kinds of energy for everyone. It's easy enough to understand why this was set up: to protect more expensive forms of energy production (which used to be renewables, but is now Russian gas), so that they wouldn't be left with no sales in periods of excess energy production, protecting the industry and allowing for innovation even if it isn't immediately profitable. The obvious problem with this arises as soon as there is even a tiny shortage, especially if said shortage is linked to a specific form of energy. Which is the current situation.
Not really. Nobody has a problem with energy prices if you ask people in Germany (oil prices, yes, but not electricity), or in other countries like Poland. Electricity prices there are up a bit, OK, but still not 10x, in a country that literally produces its own energy, energy that is supposed to be free. People who pay taxes in Norway have contributed to that "free energy" source, and yet they still have to pay 10x (even more now) for electricity. That is absurd.

From what I gathered, DLSS 3 is reliant on a degree of FP8 support that no generation previous to Lovelace has. I mean, they're very explicit about previous generations only supporting DLSS2 on their site:
So DLSS 3 is exclusive to Ada? Well, that's even worse than I expected. It's like you have to pay for the right to use DLSS 3, and obviously people here will say it's open source and worth it. I really don't understand that approach.
I think the 4080 12GB will probably be a good GPU in a vacuum - it's just such an insulting proposition alongside the 16GB, the 4090, and the 30 series. A $200 price increase - and this isn't even the 8-tier card, but an explicitly and significantly cut-down variant? The only "value" proposition of this is "hey, look, it's supposed to match the 3090 at a much lower price" - which of course ignores the fact that the 3090 was a stupidly overpriced card to begin with.

If they called this the 4070 Ti and sold it for $600, it would be fantastic even if that would still represent a significant per-tier price hike. Instead, they're framing this as "8-tier for those of you who can't afford the full-fat version", on top of a $200 price increase. It's just such an obvious slap in the face.
The price for the 4080 12GB, which is really a 4070 Ti or thereabouts, is ridiculous. To me, it looks like NV wants to confuse people. They buy with their eyes, and a 4080 is a 4080 despite the 12 vs 16 GB. I'm sure 90% of people will not know that the difference between the 4080 12GB and 16GB is not just VRAM capacity. It is exactly like the mobile market, where you buy a laptop with a 3080 that barely matches a 3070's performance. A neat trick, I call it.

It is evidence of what I said. I'm waiting for RDNA 3 to knock Nvidia off the throne.
GTX never got DLSS support. The price hikes aren't about NV offering DLSS 3 to previous-gen cards. The point of DLSS 3 is to convince consumers to ditch their previous-gen cards and go for the expensive 4000 series because of the performance (2x has been advertised constantly) and DLSS 3, which I'm pretty sure will stay exclusive to Ada despite the new FSR release.
 
Joined
Jan 14, 2019
Messages
12,337 (5.76/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
So DLSS 3 is exclusive to Ada? Well, that is even worse than I expected. It's like you have to pay for the right to use DLSS 3, and obviously people here will say it's open source and worth it. I really don't understand that approach.
They're basically saying "Thanks for saving up to use our new exclusive features with Turing and Ampere. Now it's time to save up for our next super-exclusive feature with Ada. Chop chop."

One can think of it as Nvidia innovating gaming technologies, but in a world where we also have FSR that runs even on Intel Xe integrated graphics, I'd rather not.

The price for the 4080 12GB, which is really a 4070 Ti or thereabouts, is ridiculous. To me, it looks like NV wants to confuse people. They buy with their eyes, and a 4080 is a 4080 despite the 12 vs 16 GB. I'm sure 90% of people will not know that the difference between the 4080 12GB and 16GB is not just VRAM capacity. It is exactly like the mobile market, where you buy a laptop with a 3080 that barely matches a 3070's performance. A neat trick, I call it.
Yes. It's a x70-tier card with a x70-tier GPU (AD104 - the xx4 has always been x60 or x70 level) and x70-tier VRAM, but with an x80 name so they can sell it for more money. Disgusting.

This also doesn't really say that DLSS3 is necessarily laggy on Lovelace - as long as the FP8 can keep up, delivering that interpolated frame in the gap between the previous and next frame, then it should be completely fine. There's no reason why this should increase input lag unless the interpolated frame causes the next "real" frame to be delayed.
I can see it being laggy if the generated frames don't rely on user input - and I don't see how they would.
 
Joined
May 2, 2017
Messages
7,762 (2.81/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Not really. If you ask people in Germany, nobody has a problem with energy (oil prices, yes, but not electricity), and the same goes for other countries like Poland. The prices for electricity are up a bit, OK, but still not 10x in a country that literally produces energy which is supposed to be free. People who pay taxes in Norway contribute to that "free energy" source, and yet they still have to pay 10x (even more now) for electricity. That is absurd.
Sorry, but this is nonsense. There have been literal protests across Europe, including Germany, due to rising energy costs, and Norway is - by far! - not the country suffering the most from this. The Baltic states have been hit extremely hard with very high energy costs - they've been the only ones to reach the mandated maximum price for EU energy trading (€4/kWh, IIRC), and have generally had far higher prices than even the worst regions in Norway. Also, most of Europe uses a lot of gas for heating (both houses and water), which means their energy systems work somewhat differently, but as Norway exclusively uses electricity for this you then need to include gas prices in any such comparison (which is also why these are traded within the same EU system, unlike oil or other sources of energy that are used for different things). Nobody is protesting oil prices, they're protesting gas and electric prices, which are tied together through the same system of trading.

The absurd thing here is the system as I described it above, and how it is built so that it shovels money into the pockets of whoever is generating cheap electricity during a shortage. This has nothing to do with taxation or who is paying for what, but is down to a fundamentally flawed system of trade regulation, which needs to be fixed.
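The "highest accepted bid sets the price for everyone" mechanism described above (marginal, pay-as-clear pricing) can be sketched in a few lines. The bid numbers here are entirely made up for illustration; only the mechanism matters:

```python
# Sketch of marginal ("pay-as-clear") electricity pricing: bids are sorted
# into merit order and accepted until demand is met; the most expensive
# accepted bid sets the price that every accepted generator is paid.

def clearing_price(bids, demand):
    """bids: list of (cost_per_unit, capacity); returns the clearing price."""
    supplied = 0
    price = 0.0
    for cost, capacity in sorted(bids):  # cheapest generation first
        supplied += capacity
        price = cost  # tracks the most expensive bid accepted so far
        if supplied >= demand:
            return price
    raise ValueError("demand exceeds total supply")

bids = [
    (0.05, 50),  # hydro: cheap, 50 units of capacity (made-up numbers)
    (0.07, 30),  # wind
    (0.40, 40),  # gas: expensive
]

print(clearing_price(bids, 70))  # hydro + wind cover demand -> 0.07
print(clearing_price(bids, 90))  # gas is needed -> everyone is paid 0.40
```

The second call shows the problem being described: as soon as expensive gas is needed to cover even a sliver of demand, the cheap hydro producers are paid the gas price too.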
So DLSS 3 is an exclusive to ADA? Well, that is even worse than I expected it to be.
They've been pretty explicit about this from the launch - it's one of the main selling points of Lovelace.
The price for the 4080 12GB, which is really a 4070 Ti or thereabouts, is ridiculous. To me, it looks like NV wants to confuse people. They buy with their eyes, and a 4080 is a 4080 despite the 12 vs 16 GB. I'm sure 90% of people will not know that the difference between the 4080 12GB and 16GB is not just VRAM capacity. It is exactly like the mobile market, where you buy a laptop with a 3080 that barely matches a 3070's performance. A neat trick, I call it.
Yep, obviously. It does a lot of things at once: it makes the $1200(!!!) 16GB look less completely absurd ("there's a cheaper 8-tier too!"); it delivers a "reasonable" 8-tier GPU ("it's only +$200 from the 3080, and it's much faster!"); it hides the massive performance difference between the two (they're both "the 4080"); and it allows Nvidia to keep pushing prices ever higher without implementing a much-needed change to their model tier scheme. I don't see it as a parallel to mobile though - mobile has real-world physical limitations that necessitate performance differences, and it's a distinct and discrete product segment from desktops even if they share core naming conventions. A mobile 3080 is still the 8-tier mobile GPU, the fastest or second fastest there is - you just can't expect a 150W mobile part to match a 330W desktop part. That is fundamentally different from this - the 12GB and 16GB 4080s are just very, very different hardware, with the 12GB having drastically fewer shaders, less memory and lower memory bandwidth, with cost being the only explanation - and it's still wildly expensive. It just makes zero sense outside of a perspective that only cares about Nvidia's profits.
 
Yes. It's a x70-tier card with a x70-tier GPU (AD104 - the xx4 has always been x60 or x70 level) and x70-tier VRAM, but with an x80 name so they can sell it for more money. Disgusting.
Somebody here even calculated that the difference in resources between the 4090 and the 4080 12GB is like the difference between a 3090 and a 3060 Ti. Imagine that. I need to verify this, and obviously check the performance of the new stuff, but I'm shocked by the arrogance of the company.
I hope AMD will do better, though the prices will probably be OK-ish rather than good. Prices have been going up since Turing, and the hikes are accelerating.
 
I can see it being laggy if the generated frames don't rely on user input - and I don't see how they would.
That's true, but there's a real question of how perceptible that lag will be, assuming framerates are sufficiently high. 120fps visuals with 60fps input lag will clearly not be as responsive as full 120fps, but it will feel smoother than 60fps in meaningful ways. The big question is whether the oscillation between real input/interpolated motion/real input etc. will be very perceptible, or if their motion interpolation is good enough at guessing at what motion is coming up.
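The trade-off can be put into rough numbers. Assuming frame generation doubles the displayed frame rate while input is still only sampled on rendered frames (a simplification of how interpolation works), the arithmetic looks like this:

```python
# Rough latency arithmetic for frame interpolation, assuming interpolated
# frames carry no new user input (the scenario discussed above).

def frame_times_ms(rendered_fps, interpolation_factor=2):
    """Return (input sampling interval, displayed frame interval) in ms."""
    rendered_ft = 1000 / rendered_fps                  # gap between "real" frames
    displayed_ft = rendered_ft / interpolation_factor  # perceived smoothness
    return rendered_ft, displayed_ft

for fps in (15, 30, 60):
    real_ft, shown_ft = frame_times_ms(fps)
    print(f"{fps:3d} fps rendered -> looks like {1000 / shown_ft:.0f} fps, "
          f"but input is still sampled every {real_ft:.1f} ms")
```

At 60 fps rendered you get 120 fps visuals with a 16.7 ms input interval, which is probably fine; at 15 fps rendered the 66.7 ms input interval is exactly the "15 fps input response" problem raised above.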
 
Joined
Jul 9, 2015
Messages
3,413 (1.00/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Sorry, but this is nonsense. There have been literal protests across Europe, including Germany, due to rising energy costs, and Norway is - by far! - not the country suffering the most from this. The Baltic states have been hit extremely hard with very high energy costs - they've been the only ones to reach the mandated maximum price for EU energy trading (€4/kWh, IIRC), and have generally had far higher prices than even the worst regions in Norway. Also, most of Europe uses a lot of gas for heating (both houses and water), which means their energy systems work somewhat differently, but as Norway exclusively uses electricity for this you then need to include gas prices in any such comparison. Nobody is protesting oil prices, they're protesting gas and electric prices, which are tied together through the same system of trading.

The absurd thing here is the system as I described it above, and how it is built so that it shovels money into the pockets of whoever is generating cheap electricity during a shortage. This has nothing to do with taxation or who is paying for what, but is down to a fundamentally flawed system of trade regulation, which needs to be fixed.
Nonsense, really? I have friends in Germany and Poland. They pay peanuts for energy - literally next to nothing per month. To give you an example:
In Poland, my friend pays 320 NOK per TWO MONTHS of electricity, using around 250-300 kWh per month (you get it?). Two fucking months. You know how much I paid for 300 kWh last month? 2200 NOK.
In Germany, the last time I checked, they were only concerned about the gas price. When I mentioned electricity, they asked me: what electricity prices?
"Energy" meaning the GAS PRICE, not electricity per se. How do I know? My wife lived there for 7 months before moving to Norway. Of course they have had price hikes, but not ten times over like in Norway. No way you're going to tell me that the price in Germany or Poland (also France and Spain) went up ten times over the last 2 years.
So no, it is not nonsense to me; what's nonsense is the price of electricity here in Norway, when you clearly produce electricity from renewable sources, which is supposed to be free.
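For what it's worth, the figures quoted above work out to roughly the following per-kWh rates (rough arithmetic only, using the midpoint of the 250-300 kWh range):

```python
# Per-kWh comparison using the figures quoted above (all amounts in NOK).
poland_bill_nok = 320        # one bill covering two months
poland_months = 2
poland_kwh_per_month = 275   # midpoint of the 250-300 kWh range
norway_bill_nok = 2200       # one month
norway_kwh = 300

poland_rate = poland_bill_nok / (poland_months * poland_kwh_per_month)
norway_rate = norway_bill_nok / norway_kwh

print(f"Poland: ~{poland_rate:.2f} NOK/kWh")
print(f"Norway: ~{norway_rate:.2f} NOK/kWh")
print(f"Ratio:  ~{norway_rate / poland_rate:.0f}x")
```

That comes out to roughly 0.58 vs 7.33 NOK/kWh, a ~13x gap - consistent with the "ten times or more" claim, taking the quoted bills at face value.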
They've been pretty explicit about this from the launch - it's one of the main selling points of Lovelace.
Main selling point, or downfall once people see they're getting stiffed? DLSS 3 is out, so who cares about DLSS 2 development?
 
Yep, obviously. It does a lot of things at once: it makes the $1200(!!!) 16GB look less completely absurd ("there's a cheaper 8-tier too!"); it delivers a "reasonable" 8-tier GPU ("it's only +$200 from the 3080, and it's much faster!"); it hides the massive performance difference between the two (they're both "the 4080"); and it allows Nvidia to keep pushing prices ever higher without implementing a much-needed change to their model tier scheme. I don't see it as a parallel to mobile though - mobile has real-world physical limitations that necessitate performance differences, and it's a distinct and discrete product segment from desktops even if they share core naming conventions. A mobile 3080 is still the 8-tier mobile GPU, the fastest or second fastest there is - you just can't expect a 150W mobile part to match a 330W desktop part. That is fundamentally different from this - the 12GB and 16GB 4080s are just very, very different hardware, with the 12GB having drastically fewer shaders, less memory and lower memory bandwidth, with cost being the only explanation - and it's still wildly expensive. It just makes zero sense outside of a perspective that only cares about Nvidia's profits.
It's not just the amount of VRAM, shaders, memory controllers, etc. - it's literally a smaller chip, with more dies per wafer, fewer defects, lower power (VRM) and PCB complexity requirements, and so on. It's a MUCH cheaper card to manufacture at a SLIGHTLY lower price, just because it's got an '80' in the name. That is the disgusting thing about it.
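A back-of-the-envelope illustration of that point, using approximate public die sizes for AD103 (~379 mm²) and AD104 (~295 mm²) and an assumed defect density - the numbers are illustrative, not Nvidia's actual costs:

```python
# Rough illustration of why a smaller die is much cheaper to make: more
# candidate dies per 300 mm wafer AND better yield. Defect density is an
# assumed value for illustration only.
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Classic gross-die estimate accounting for edge loss."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def yield_fraction(die_area_mm2, defects_per_cm2=0.1):
    """Simple Poisson yield model: exp(-defect_density * die_area)."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

for name, area in (("AD103 (4080 16GB)", 379), ("AD104 (4080 12GB)", 295)):
    n = dies_per_wafer(area)
    y = yield_fraction(area)
    print(f"{name}: {n} gross dies/wafer, ~{y:.0%} yield, ~{n * y:.0f} good dies")
```

Under these assumptions the smaller AD104 yields roughly 40-45% more good dies per wafer than AD103, before even counting the simpler board and power delivery.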

That's true, but there's a real question of how perceptible that lag will be, assuming framerates are sufficiently high. 120fps visuals with 60fps input lag will clearly not be as responsive as full 120fps, but it will feel smoother than 60fps in meaningful ways. The big question is whether the oscillation between real input/interpolated motion/real input etc. will be very perceptible, or if their motion interpolation is good enough at guessing at what motion is coming up.
If that's the case, then I guess it'll be just another feature for the "300+ fps gaming ftw" mob. I'm fine with 30-45 fps 99% of the time, but if it comes with a 15 fps input response, then it'll be just as unplayable as 15 fps on screen.

Main selling point, or downfall once people see they're getting stiffed? DLSS 3 is out, so who cares about DLSS 2 development?
Personally, I find it hard to look at it as a selling point as long as the cheapest card that can run it costs 900 bucks.
 
If that's the case, then I guess it'll be just another feature for the "300+ fps gaming ftw" mob. I'm fine with 30-45 fps 99% of the time, but if it comes with a 15 fps input response, then it'll be just as unplayable as 15 fps on screen.
I think if it had been laggy on Turing or Ampere, the driver could have been improved to mitigate that problem.
 
Joined
Aug 10, 2021
Messages
166 (0.14/day)
System Name Main
Processor 5900X
Motherboard Asrock 570X Taichi
Memory 32GB
Video Card(s) 6800XT
Display(s) Odyssey C49G95T - 5120 x 1440
Nord Pool, the exchange for energy trading, is a private institution, not an EU exchange - 66% ENX and 34% TSO Holding owned.
Compared to Sweden and Denmark at least, Norway has a far higher percentage of consumers on spot prices instead of fixed long-term contracts. I'm assuming it's similar in the other places mentioned. I've seen people from the UK being nervous about what their new fixed-price plan will be when their current one expires.
Of course energy prices will increase when demand is above or close to supply.


Won't DLSS fall away as a selling point now? Will new games stop getting DLSS 2? Can 4xxx buyers trust that they'll be able to use DLSS 3 (or its next versions), or will it just disappear when the 5xxx series comes with DLSS 4 (if it does - who knows, not buyers)?
 
Personally, I find it hard to look at it as a selling point as long as the cheapest card that can run it costs 900 bucks.
I know, same here. But believe me, it is - in NV's eyes and in some people's. Remember how people here were hyping DLSS 2.0, how great it was and how it was here to stay? Now it's being replaced by DLSS 3, and it will not get any further development because NV will focus on DLSS 3 - which owners of previous-gen cards are refused. Buy the new, expensive stuff from NV and you'll be able to use it. The problem is that next year (is it next year?) a new generation of NV cards comes out, and DLSS 4 will only work with those. Whichever card you buy from NV, you're left behind with all the hyped features you bought the card for in the first place.
 
I know, same here. But believe me, it is - in NV's eyes and in some people's. Remember how people here were hyping DLSS 2.0, how great it was and how it was here to stay? Now it's being replaced by DLSS 3, and it will not get any further development because NV will focus on DLSS 3 - which owners of previous-gen cards are refused. Buy the new, expensive stuff from NV and you'll be able to use it. The problem is that next year (is it next year?) a new generation of NV cards comes out, and DLSS 4 will only work with those. Whichever card you buy from NV, you're left behind with all the hyped features you bought the card for in the first place.
... before the previous hyped feature even got proper support in games (other than the ones sponsored by Nvidia), I might add.
 
Joined
Mar 10, 2010
Messages
11,878 (2.21/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
It's not just the amount of VRAM, shaders, memory controllers, etc. - it's literally a smaller chip, with more dies per wafer, fewer defects, lower power (VRM) and PCB complexity requirements, and so on. It's a MUCH cheaper card to manufacture at a SLIGHTLY lower price, just because it's got an '80' in the name. That is the disgusting thing about it.


If that's the case, then I guess it'll be just another feature for the "300+ fps gaming ftw" mob. I'm fine with 30-45 fps 99% of the time, but if it comes with a 15 fps input response, then it'll be just as unplayable as 15 fps on screen.


Personally, I find it hard to look at it as a selling point as long as the cheapest card that can run it costs 900 bucks.
Well, DLSS 1/2 suffered FPS hitches with extreme mouse movement, i.e. a 180° spin, and it's worth noting DLSS 3 was only shown moving in one direction, without head turns.

Not sure I like it.
 
Well, DLSS 1/2 suffered FPS hitches with extreme mouse movement, i.e. a 180° spin, and it's worth noting DLSS 3 was only shown moving in one direction, without head turns.

Not sure I like it.
Let's get to the point: no circus trick can beat good old native resolution gaming. ;)
 
Somebody here even calculated that the difference in resources between the 4090 and the 4080 12GB is like the difference between a 3090 and a 3060 Ti. Imagine that. I need to verify this, and obviously check the performance of the new stuff, but I'm shocked by the arrogance of the company.
I hope AMD will do better, though the prices will probably be OK-ish rather than good. Prices have been going up since Turing, and the hikes are accelerating.
Honestly, I kind of feel like Nvidia went too far with the 4090, and this is part of why they're making themselves look so damn terrible right now. Sure, it's not an absurdly large die in and of itself - it's even smaller than its predecessor. But the density jump from Samsung 8 to TSMC N4 is so large that to fill that space they just made it too big - making everything else look ridiculously bad in comparison.

I can see two reasons for choosing this path: competitive pressure from AMD, and a desire to sell hyper-expensive ultra-flagships. Without either of those, they could have scaled AD102 down to a die size like the GP102 instead, making a much smaller and more reasonably sized die that would have delivered tons of performance still. But no - they chose to go balls-to-the-wall, on a super expensive node.

I mean, what if they went a bit more moderate with this instead - or at least didn't try to make a smooth gradient in pricing from a 380mm2 die to a 600mm2 one? If TPU's die size for the AD103 is correct, charging $1200 for that and $1600 for the 4090 - at 60% more die area, and 50% more VRAM - is downright absurd. But Nvidia has clearly chosen the path of "we'll sell on mindshare and flagship cred alone, screw any idea of value".
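Putting rough numbers on that, using TPU's approximate die sizes (AD103 ~379 mm², AD102 ~608 mm²) and the list prices discussed above:

```python
# Price per mm^2 of die, using TPU's approximate die sizes and list prices.
cards = {
    "RTX 4080 16GB (AD103)": (1200, 379),  # (USD list price, die area mm^2)
    "RTX 4090 (AD102)":      (1600, 608),
}

for name, (price, area) in cards.items():
    print(f"{name}: ${price / area:.2f} per mm^2 of die")

print(f"Die-area ratio: {608 / 379:.2f}x for {1600 / 1200:.2f}x the price")
```

The 4090 actually charges less per mm² of silicon than the 16GB 4080 - 60% more die area and 50% more VRAM for only 33% more money - which is what makes the gradient so absurd.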

Nonsense, really? I have friends in Germany and Poland. They pay peanuts for energy - literally next to nothing per month. To give you an example:
In Poland, my friend pays 320 NOK per TWO MONTHS of electricity, using around 250-300 kWh per month (you get it?). Two fucking months. You know how much I paid for 300 kWh last month? 2200 NOK.
In Germany, the last time I checked, they were only concerned about the gas price. When I mentioned electricity, they asked me: what electricity prices?
"Energy" meaning the GAS PRICE, not electricity per se. How do I know? My wife lived there for 7 months before moving to Norway. Of course they have had price hikes, but not ten times over like in Norway. No way you're going to tell me that the price in Germany or Poland (also France and Spain) went up ten times over the last 2 years.
So no, it is not nonsense to me; what's nonsense is the price of electricity here in Norway, when you clearly produce electricity from renewable sources, which is supposed to be free.
Sorry, but where are you getting the idea that energy from renewable resources is supposed to be free? Do you imagine that building, running and maintaining a power plant doesn't have a cost? They're cheap in the long run, but not free.

Also, did you read anything at all of the post I responded to? The countries you mention do not use electricity for heating, they use gas for heating. Gas is their major domestic energy expenditure, and gas is traded in the same system as electricity in the EU. There are of course exceptions - there is a move towards less reliance on natural gas for heating, but pricing hasn't followed yet, meaning anyone there using electricity for heating is in a pretty good position.

As for gas prices: more than 5x higher according to this source (and historically, EU energy prices have been significantly higher than Norwegian prices, which makes any increase here look bigger). Here are reports of recent energy pricing protests in Germany; energy prices are a big part of the ongoing and hotly protested cost of living crisis across the continent. It's not getting much exclusive attention because it's tied into rising prices for food and other necessities as well, but energy prices are a central part of this crisis.

I mean, you're explicitly contradicting yourself here, on the one hand you say "they pay peanuts for energy", and on the other you say "they pay for gas, not electricity", which ... well, if you read my previous post, maybe you'd understand why this is effectively the same thing - it's the core domestic source of energy, is traded within the same system, and is used for the same things. Norway uses mainly electricity and that's where prices are high; continental Europe uses mostly gas and that's where prices are high. Trying to separate these two is actively misleading and just downright misrepresenting reality - the current situation in Norway is in no way unique.
Nord pool, the exchange for energy trading is a private institute, not EU exchange. 66% ENX and 34% TSO Holding owned.
At least vs Sweden and Denmark, Norway have a way higher percentage use spot prices, instead of fixed long term prices. I'm assuming it's similar in the mentioned places. I've seen people from UK being nervous about what their new fixed price plan will be when their current expires.
Of course prices in energy will increase, when demand is above/close to supply.
Yes, it is a private exchange - just as most stock markets are private - but it's still subject to EU trade regulations. And the EU could quite easily mandate a change in their trade policies. And crucially, nobody here is arguing that it's unnatural for prices to rise, it's just the magnitude of this rise and how it's intrinsically tied into the mechanisms of the trade system rather than any even remotely sensible system of price setting that makes this problematic. The solution is simple: price regulation targeted towards averaging out prices closer to the average cost of production/sourcing the energy. This will of course eat into corporate profits, but, well, tough luck. Corporations do not have a right to exploit people in a crisis.

If that's the case, then I guess it'll be just another feature for the "300+ fps gaming ftw" mob. I'm fine with 30-45 fps 99% of the time, but if it comes with a 15 fps input response, then it'll be just as unplayable as 15 fps on screen.
This might well be - but then, games that are playable at 30-45fps are typically not all that reliant on smooth and rapid input, so the difference will also be less noticeable. But if the "source" framerate is indeed as low as 15fps, this will most likely be unplayable, yes.

Let's get to the point: no circus trick can beat good old native resolution gaming. ;)
True to some extent, but native-resolution gaming is also a rather silly brute-force solution as resolutions scale higher, simply because the perceptible increase in detail and sharpness is pretty much inversely proportional to the resolution increase at this point. 4K has 4x the pixels of 1080p, and is clearly sharper even at 27", but it's not night and day. 8K is 4x the pixels of 4K, and the increase in sharpness is essentially imperceptible unless you're sitting very close to a very large TV. And as new nodes and increased transistor density become more difficult, we need to abandon simplistic brute-force solutions for improved visual fidelity - they're getting too expensive. If moving up one step in resolution has a 4x compute cost but, let's say, a 50% increase in perceptible detail/sharpness, then that is terrible, and never worth it. Upscaling is really the only viable way forward - though precisely how said upscaling will work is another question entirely.
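The raw pixel counts behind that argument - each resolution step roughly quadruples the work while the perceived gain shrinks:

```python
# Pixel counts (and thus rough compute cost) across common resolutions.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:5.1f} MP, {pixels / base:4.1f}x the pixels of 1080p")
```

4K is exactly 4x the pixels of 1080p and 8K is 16x - so rendering 8K natively costs four 4K frames' worth of shading for a barely perceptible improvement.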

I think cables are a good analogy for this: the signal requirements for the massive bandwidth of DP 2.0 and HDMI 2.1 aren't bringing with them thumb-thick cables, but rather a shift to active cabling instead of passive copper. This takes us from a simple, brute-force solution to a more complex one. Where the analogy falls apart is that active cabling is easily 10x the BOM cost of passive, while upscaling is about as close to a free performance upgrade as you'll find. But it's another example of needing to find more complex, smarter solutions to a problem as the older brute-force ones fail.
 