
NVIDIA 2025 International CES Keynote: Liveblog

Joined
Jan 20, 2019
Messages
1,607 (0.74/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
Goodness Gracious Me

I like reading people's comments and then playing catch-up with each new one via TPU notifications. But 11 pages already!? F that!

50-series:

They've got us by the balls again! It's the same old game: toss out rumours and leaks to make us think prices are going to be through the roof, so when launch day hits and it's still expensive, just not as expensive as our rumour-fuelled predictions, we're somehow sitting there like "Oh thank goodness, what a bargain!" Classic hustle.

Not gonna lie, I'm kinda relieved though. Definitely ready to upgrade and finally break free from my current GPU bottleneck. A $750 5070 Ti might just do the job. If the 5080 actually delivers 4090-level or better performance, it's definitely up for consideration. The least I'm expecting, without FG/DLSS, at 1440p, is a 50% increase in performance over my 3080 - if the 5070 Ti can manage that, it's a BUY.

Come on Whizzy, jump over them NDAs and drop them reviews already!! :clap:
 
Joined
Jun 14, 2020
Messages
3,741 (2.24/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Yeah, I don't understand how people do the math either. Ever since we knew the 5080 was half the 5090, it was a given it would never be faster than the 4090. It's also the best outcome for Nvidia, because now they just position the 4090 inside the new line-up and it sits there just fine. Effectively, nothing happened between Ada and Blackwell if you think about it. The 4090 is what, $1,499? It slots perfectly into the middle there, and the perf/$ metric has moved exactly zero (rough numbers at the end of this post). You're just paying for the extra performance with a higher power target, i.e. power consumption - that's not on Nvidia's bill at all. It is complete stagnation. But hey, here's DLSS 4! hahaha. And look at my leather jacket.

And here we have people saying 'muh, good prices'. :roll::roll: what the fck

The 5080 is also a big nothingburger if you know the 4080 Super exists. Same shader count. Same VRAM amount, just slightly faster memory. Similar price. Every last bit of extra perf is probably paid for by having to buy a new PSU, as this fantastic x80 is the first one to consume power like a flagship card on an OC.
5070 = 4090 bro, what are you even talking about

/s
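
To make the perf/$ point concrete, here's a quick back-of-envelope in Python. The prices are the announced MSRPs (4090 as quoted above), but the relative performance values are placeholder assumptions purely for illustration, not benchmark results:

```python
# Back-of-envelope perf-per-dollar check for the stagnation argument above.
# Prices: announced MSRPs (4090 as quoted in the post). The relative
# performance numbers are rough placeholder assumptions, NOT benchmarks.
cards = {
    #              price ($), assumed relative raster perf (4080 Super = 1.00)
    "4080 Super": (999,  1.00),
    "4090":       (1499, 1.25),
    "5080":       (999,  1.10),
    "5090":       (1999, 1.70),
}

for name, (price, perf) in cards.items():
    print(f"{name:>10}: {1000 * perf / price:.2f} perf per $1,000")
```

With placeholder numbers like these, the 5090 and 4090 land at essentially the same perf per dollar, which is the point being made; real review data can be swapped in once it exists.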
 
Joined
Jan 8, 2017
Messages
9,579 (3.28/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
FG doesn't necessarily double your framerate, therefore I assume the new FG isn't going to quadruple it either, so whatever comparison you are trying to make from that screenshot is wrong.
Yes, it does double the framerate - that's the whole point: it inserts a frame between every two rendered ones, thus doubling it. The scaling isn't 100% because there is a cost per interpolated frame.

I think we're entering the realm of some serious coping right here. 2X FG had ~90-95% scaling, and 4X clearly has similar scaling going from 2X, perhaps ever so slightly worse at 85-90%. The 5080 with no FG is barely any faster than a 4080 Super in this game; the writing is on the wall.
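
For what it's worth, the "doubling minus a per-frame cost" behaviour is easy to model. A minimal sketch with made-up frame times (both timing inputs are assumptions, just to show where ~90% and ~50% effective scaling can each come from):

```python
# Toy model of 2X frame generation: one interpolated frame is inserted between
# every two rendered frames, but generating it costs time of its own.
# Both timing inputs are made-up placeholders, not measurements.

def fg2x_multiplier(render_ms: float, interp_ms: float) -> float:
    """Effective FPS multiplier of 2X FG versus no FG."""
    base_fps = 1000.0 / render_ms                    # rendered frames only
    fg_fps = 2 * 1000.0 / (render_ms + interp_ms)    # 2 shown frames per (render + interp)
    return fg_fps / base_fps

print(fg2x_multiplier(render_ms=16.7, interp_ms=1.0))  # ~1.89x: close to an ideal doubling
print(fg2x_multiplier(render_ms=8.3,  interp_ms=3.0))  # ~1.47x: the fixed cost dominates
```

The same fixed interpolation cost eats a much bigger share of the budget when the base frame time is short (high fps, or a heavier 4K interpolation pass), which is one way the "~90-95%" and the "40-70%" observations can both be true.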
 
Joined
Jun 14, 2020
Messages
3,741 (2.24/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Yes, it does double the framerate - that's the whole point: it inserts a frame between every two rendered ones, thus doubling it. The scaling isn't 100% because there is a cost per interpolated frame.

I think we're entering the realm of some serious coping right here. 2X FG had ~90-95% scaling, and 4X clearly has similar scaling going from 2X, perhaps ever so slightly worse at 85-90%. The 5080 with no FG is barely any faster than a 4080 Super in this game; the writing is on the wall.
Clearly you haven't used it. It doesn't always translate to 90-95% scaling. Especially at 4K I'm usually seeing 40-70% scaling, e.g. Ghost of Tsushima, which was the most recent game I played with FG.

What would I be coping about? I'm just explaining to you how the thing works. Whatever.
 
Joined
Oct 3, 2019
Messages
166 (0.09/day)
Processor Ryzen 3600 / intel i7 11800H
Motherboard MSI X470 Gaming Plus Max / whatever's in razer blade 15 2021
Cooling stock noisy AMD wraith cooler / noisy razer blade fans
Memory Corsair Vengeance RGB Pro 16GB DDR4-3200MHz / Samsung 16GB DDR4-3200 so dimms
Video Card(s) Sapphire Nitro RX580 8GBs / nVidia Geforce 3070 mobile
Storage Adata Gammix S11 Pro 1TB nvme / Kingston KC3000 2TB
Display(s) Gigabyte M32U
Case Corsair Carbide Air 540 / Razer Blade 15 2021 laptop
Mouse anything with optical switches
Here's a more general question. Is there a point of GPU price inflation at which console makers start re-examining their outside options, or developing their own custom graphics? They don't have to account for backwards compatibility, after all, and that's a big barrier to entry in discrete graphics that they wouldn't have to deal with. Or are the designs simply too complex for such a project to have any hope of succeeding?

I know it sounds like craziness, but Apple isn't doing all that badly with their Imagination Technologies-derived graphics silicon, after all.
 
Joined
Feb 24, 2023
Messages
3,261 (4.76/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
Here's a more general question. Is there a point of GPU price inflation at which console makers start re-examining their outside options, or developing their own custom graphics? They don't have to account for backwards compatibility, after all, and that's a big barrier to entry in discrete graphics that they wouldn't have to deal with. Or are the designs simply too complex for such a project to have any hope of succeeding?

I know it sounds like craziness, but Apple isn't doing all that badly with their Imagination Technologies-derived graphics silicon, after all.
It's us average Joes who go to our local Micro Centers, Best Buys, Amazons, eBays, AliExpresses and the like and pay $20 for a GPU, $30 for the cooling, $600 in NV tax, $50 in ASUS tax and $100 in other taxes, so that a die that cost NVIDIA $20 to print ends up as an $800 retail GPU. Console makers buy chips in bulk on much friendlier terms. There is no way they start inventing their own GPUs, because first off, reaching RTX 2000-series performance is already extremely problematic, if not impossible, for a complete dGPU market newbie, and the chips are coming in cheap for them anyway.

Apple are way more experienced in this regard than any other "non-GPU" player out here.

On topic: I expected crystal-clear, vast nothingness from this Blackwell generation, but it proved me slightly wrong, as prices aren't THAT insane. I assume the 5070 Ti will make short work of the 4080 in virtually every scenario. 15% IPC gains would be enough for this GPU to become my likely purchase, as it'll crawl dangerously close to 8 times the RT performance I have now. And since my pure raster needs are basically covered by any GPU beefier than a 3080, it's totally a hmmmmmmmm.
 
Joined
Jun 14, 2020
Messages
3,741 (2.24/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
It's us average Joes who go to our local Micro Centers, Best Buys, Amazons, eBays, AliExpresses and the like and pay $20 for a GPU, $30 for the cooling, $600 in NV tax, $50 in ASUS tax and $100 in other taxes, so that a die that cost NVIDIA $20 to print ends up as an $800 retail GPU. Console makers buy chips in bulk on much friendlier terms. There is no way they start inventing their own GPUs, because first off, reaching RTX 2000-series performance is already extremely problematic, if not impossible, for a complete dGPU market newbie, and the chips are coming in cheap for them anyway.

Apple are way more experienced in this regard than any other "non-GPU" player out here.

On topic: I expected crystal-clear, vast nothingness from this Blackwell generation, but it proved me slightly wrong, as prices aren't THAT insane. I assume the 5070 Ti will make short work of the 4080 in virtually every scenario. 15% IPC gains would be enough for this GPU to become my likely purchase, as it'll crawl dangerously close to 8 times the RT performance I have now. And since my pure raster needs are basically covered by any GPU beefier than a 3080, it's totally a hmmmmmmmm.
Short work? I wouldn't bet on that; I think the 5070 Ti will be close to the 4080.
 
Joined
Feb 24, 2023
Messages
3,261 (4.76/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
Short work? I wouldn't bet on that; I think the 5070 Ti will be close to the 4080.
70 vs 76 SMs (a slight disadvantage), but higher clocks, possibly higher IPC and significantly higher VRAM bandwidth, at a lower price and lower TGP, will make it an overall better GPU. Of course it's very subtle, and sometimes you'll need a microscope to see a performance difference, but all in all it's more interesting. Especially considering overbuilt coolers, my 1 kW PSU and an absolute crap ton of cold days per year. If it can be OC'd beyond 3200 MHz on air, then it's awesome. Expensive, but awesome.
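
For the SM/clock/bandwidth trade-off, a rough ratio check shows the shape of it. In this sketch the SM counts are the ones discussed above, while the clock and bandwidth figures are approximate assumptions, so treat the output as illustrative only:

```python
# Rough throughput comparison for the 70-vs-76-SM point above.
# SM counts are from the discussion; clocks and bandwidth are approximate
# assumptions, so the ratios are only indicative.
specs = {
    #              SMs, boost clock (GHz, assumed), memory bandwidth (GB/s, approx.)
    "RTX 4080":    (76, 2.51, 717),
    "RTX 5070 Ti": (70, 2.45, 896),
}

base_sm_ghz = specs["RTX 4080"][0] * specs["RTX 4080"][1]
base_bw = specs["RTX 4080"][2]

for name, (sms, ghz, bw) in specs.items():
    print(f"{name:>11}: {sms * ghz / base_sm_ghz:.2f}x shader throughput, "
          f"{bw / base_bw:.2f}x memory bandwidth (vs 4080)")
```

Roughly a 10% raw shader deficit against a ~25% bandwidth advantage, which is why the difference could indeed need a microscope in shader-bound games while swinging the other way wherever bandwidth matters.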
 
Joined
Jan 8, 2017
Messages
9,579 (3.28/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 MHz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Clearly you haven't used it. It doesn't always translate to 90-95% scaling. Especially at 4K I'm usually seeing 40-70% scaling, e.g. Ghost of Tsushima, which was the most recent game I played with FG.

What would I be coping about? I'm just explaining to you how the thing works. Whatever.
I think the problem is that most of you just don't know the math, and that's why everybody is so mystified. In reality we have all the information we need.

View attachment 378843


This is from the same video: 580% to 1000% is a ~72% increase, which is the scaling from 2X to 4X. From the previous screenshot I posted, in order to reach a final figure of 185% at 4X, the starting value must be around ~108%.

This puts the 5080 at a meager ~8-10% faster with 2X FG vs the 4080 Super.
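
The back-calculation itself is just two divisions; a minimal sketch using the percentages quoted above (all inputs are the chart figures cited in this thread, not independent data):

```python
# Reproducing the arithmetic above from the quoted chart percentages.
fg2x, fg4x = 580, 1000            # the 2X- and 4X-FG figures quoted from the same video
scaling_2x_to_4x = fg4x / fg2x    # ~1.72, i.e. ~72% more frames going from 2X to 4X

final_4x = 185                    # quoted 5080 4X-FG figure vs the 4080 Super
implied_2x = final_4x / scaling_2x_to_4x

print(f"2X -> 4X scaling: {scaling_2x_to_4x:.2f}x")
print(f"implied 5080 figure at 2X FG: ~{implied_2x:.0f}%")   # ~107-108%, i.e. roughly an 8% lead
```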
 
Joined
Jun 14, 2020
Messages
3,741 (2.24/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
I think the problem is that most of you just don't know the math, and that's why everybody is so mystified. In reality we have all the information we need.

View attachment 378843

This is from the same video: 580% to 1000% is a ~72% increase, which is the scaling from 2X to 4X. From the previous screenshot I posted, in order to reach a final figure of 185% at 4X, the starting value must be around ~108%.

This puts the 5080 at a meager ~8-10% faster with 2X FG vs the 4080 Super.
Ok bud
 
Joined
Jan 19, 2023
Messages
269 (0.37/day)
I think the problem is that most of you just don't know the math, and that's why everybody is so mystified. In reality we have all the information we need.

View attachment 378843

This is from the same video: 580% to 1000% is a ~72% increase, which is the scaling from 2X to 4X. From the previous screenshot I posted, in order to reach a final figure of 185% at 4X, the starting value must be around ~108%.

This puts the 5080 at a meager ~8-10% faster with 2X FG vs the 4080 Super.
And that's in a game that heavily utilises the RT cores. What will happen in games that don't have RT, or only a light implementation?
 
Joined
Feb 20, 2019
Messages
8,439 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
In the meantime, DF released a video essentially confirming that basically all of that performance comes from 4X FG.

View attachment 378809
It's slower. Twice as many fake frames, not twice as many frames per second.

Net result: input lag gets even worse, and input lag is the main reason people don't like fake frames in the first place. Not the only reason, but definitely the main one.

More seriously, of the thousand or so demanding titles from the last half-decade, only a tiny, tiny handful (under 50) actually even support Nvidia's frame-gen.
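
On the input-lag point, a toy model makes it visible why the fps counter and the latency move independently once generated frames are in the mix. All numbers are made up, and "one buffered rendered frame" is a simplifying assumption about how the interpolation pipeline works:

```python
# Toy latency model for frame generation, with made-up numbers.
# Assumption: interpolation needs the *next* rendered frame before it can show
# anything, so roughly one extra rendered frame sits buffered whenever FG is on.

def approx_latency_ms(render_ms: float, fg_factor: int, interp_ms: float = 1.0) -> float:
    """Very rough input-to-photon latency: one frame in flight, plus one
    buffered rendered frame and the interpolation cost when FG is active."""
    if fg_factor <= 1:
        return render_ms
    return 2 * render_ms + interp_ms

render_ms = 16.7                              # assume the GPU renders ~60 fps either way
for factor in (1, 2, 4):
    shown_fps = factor * 1000.0 / render_ms   # what the fps counter reports
    print(f"{factor}X: ~{shown_fps:.0f} fps shown, ~{approx_latency_ms(render_ms, factor):.0f} ms latency")
```

In this simplification, 4X shows twice the frames of 2X at essentially the same latency, and both are worse than no FG at all; a real pipeline (Reflex, queue depth, per-frame generation cost) shifts the exact numbers but not the basic shape.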
 
Joined
Sep 26, 2022
Messages
2,190 (2.62/day)
Location
Brazil
System Name G-Station 2.0 "YGUAZU"
Processor AMD Ryzen 7 5700X3D
Motherboard Gigabyte X470 Aorus Gaming 7 WiFi
Cooling Freezemod: Pump, Reservoir, 360mm Radiator, Fittings / Bykski: Blocks / Barrow: Meters
Memory Asgard Bragi DDR4-3600CL14 2x16GB
Video Card(s) Sapphire PULSE RX 7900 XTX
Storage 240GB Samsung 840 Evo, 1TB Asgard AN2, 2TB Hiksemi FUTURE-LITE, 320GB+1TB 7200RPM HDD
Display(s) Samsung 34" Odyssey OLED G8
Case Lian Li Lancool 216
Audio Device(s) Astro A40 TR + MixAmp
Power Supply Cougar GEX X2 1000W
Mouse Razer Viper Ultimate
Keyboard Razer Huntsman Elite (Red)
Software Windows 11 Pro, Garuda Linux
More seriously, of the thousand or so demanding titles from the last half-decade, only a tiny, tiny handful (under 50) actually even support Nvidia's frame-gen.
DLSS 4 MFG will be a driver-level toggle, won't it? Will it only apply to games that already have DLSS 3 FG enabled, or will it work on anything, like AFMF does?
 

3x0

Joined
Oct 6, 2022
Messages
967 (1.17/day)
Processor AMD Ryzen 7 5800X3D
Motherboard MSI MPG B550I Gaming Edge Wi-Fi ITX
Cooling Scythe Fuma 2 rev. B Noctua NF-A12x25 Edition
Memory 2x16GiB G.Skill TridentZ DDR4 3200Mb/s CL14 F4-3200C14D-32GTZKW
Video Card(s) PowerColor Radeon RX7800 XT Hellhound 16GiB
Storage Western Digital Black SN850 WDS100T1X0E-00AFY0 1TiB, Western Digital Blue 3D WDS200T2B0A 2TiB
Display(s) Dell G2724D 27" IPS 1440P 165Hz, ASUS VG259QM 25” IPS 1080P 240Hz
Case Cooler Master NR200P ITX
Audio Device(s) Altec Lansing 220, HyperX Cloud II
Power Supply Corsair SF750 Platinum 750W SFX
Mouse Lamzu Atlantis Mini Wireless
Keyboard HyperX Alloy Origins Aqua
DLSS 4 MFG will be a driver-level toggle, won't it? Will it only apply to games that already have DLSS 3 FG enabled, or will it work on anything, like AFMF does?
It's based on DLSS 3 frame gen; it's not universal.
 
Joined
Jul 24, 2024
Messages
314 (1.85/day)
System Name AM4_TimeKiller
Processor AMD Ryzen 5 5600X @ all-core 4.7 GHz
Motherboard ASUS ROG Strix B550-E Gaming
Cooling Arctic Freezer II 420 rev.7 (push-pull)
Memory G.Skill TridentZ RGB, 2x16 GB DDR4, B-Die, 3800 MHz @ CL14-15-14-29-43 1T, 53.2 ns
Video Card(s) ASRock Radeon RX 7800 XT Phantom Gaming
Storage Samsung 990 PRO 1 TB, Kingston KC3000 1 TB, Kingston KC3000 2 TB
Case Corsair 7000D Airflow
Audio Device(s) Creative Sound Blaster X-Fi Titanium
Power Supply Seasonic Prime TX-850
Mouse Logitech wireless mouse
Keyboard Logitech wireless keyboard
Architecture-level improvements. And the 40 series is not being deprived of any feature; it just won't support frame generation at factors above 2x...
That's the thing - architectural-level improvements are non-existent. Blackwell is a shrunken Ada on steroids. Brute force. Nvidia added as many compute units as possible and tried to balance it power-wise.

You can tell from the 5090's specs that efficiency is also a problem now. The 4090 has about 5,000 fewer shaders but also a much lower TGP than the 5090. Were there any significant architectural changes, it would not end up like that. Nvidia brute-forced everything towards the so-called AI features (DLSS, FG). Jensen has already stated that this is the only way forward for the next stage of gaming. I have my doubts, though.

As for the new DLSS, please don't say that the 4000 series will be deprived of nothing and that it basically just won't support something here, something there and god knows where else as well. The RTX 4090 is surely capable (hardware-wise) of the new DLSS tech if the slower 5080 is capable (and anything below the 5080 as well). Or change my mind: give me one real reason why the 4090 would not be capable.

The RTX 5080 will not beat the RTX 4090 in native rendering. Because:
- not enough computing power
- it would negatively affect 4090 sales, which is Jensen's golden goose; they can't just release something more powerful and price it 20-30% less, or they would cripple their own sales
- there will be an RTX 5080 Ti with around 14k shaders, and that one may be on par with the 4090

Performance-wise, from best:
RTX 5090
RTX 4090
RTX 5080 Ti (Super) with around 400W TGP
RTX 5080
RTX 5070 Ti (Super)
RTX 5070
 
Joined
Sep 10, 2018
Messages
7,266 (3.14/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
FG doesn't necessarily double your framerate, therefore I assume the new FG isn't going to quadruple it either, so whatever comparison you are trying to make from that screenshot is wrong.

I agree, though - apples to apples, it's looking like the 5080 probably isn't much faster than the 4080.
 
Joined
Dec 28, 2012
Messages
4,029 (0.92/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
Here's a more general question. Is there a point of GPU price inflation at which console makers start re-examining their outside options, or developing their own custom graphics? They don't have to account for backwards compatibility, after all, and that's a big barrier to entry in discrete graphics that they wouldn't have to deal with. Or are the designs simply too complex for such a project to have any hope of succeeding?

I know it sounds like craziness, but Apple isn't doing all that badly with their Imagination Technologies-derived graphics silicon, after all.
The console makers are not paying GPU inflation. Console chips are, famously, very low-margin designs for chip makers, which is part of why Nvidia is happy to let AMD have that business.

It's also monstrously expensive; if Sony/MS had to design their own, neither one would be making any profit from consoles, even with software sales. Apple gets away with it because they sell more iPhones in six months than the Xbox Series X/S and PS5/Pro have sold combined over the ENTIRE generation, and that tech is also used across all their iPads and Macs.
 
Joined
Jan 14, 2019
Messages
13,229 (6.05/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
You take this like a personal offence? Why?
It's AMD, not you! Don't get your feelings hurt if someone says something bad about a tech company.
I'm not taking it personally. I'd just like to stay on topic. I am equally disappointed in the AMD keynote, but there is a place to discuss that, which isn't here. I have expectations on certain products, but I do not have feelings for either company. It rather looks like you have feelings for Nvidia which you're trying to justify by convincing me. Believe me, it's pointless.

60 FPS is not smooth at all when you've been playing at 100+ FPS for years.
Of course you can't see the difference if you're using a 60 Hz monitor.
I'm on 144 Hz. Our perceptions are different. What's smooth for you might not be for me and vice versa. I want stability in my performance, and I want low input lag with no graphical glitches. Whether it's at 60 or 100 or 200 FPS, I don't care. But unfortunately, when I only have 30 FPS, I can't make 60 out of it with FG without introducing other problems; that's why I don't like the tech.

But can you just cool off and wait for reviews? We got your point already. OK?
You asked for it.

Because Nvidia has the best GPUs and also the best features.
No need to use FG, but it's still there when needed.
AI is the future.
Believe that if it makes you feel better. Personally, I see the same games running on Nvidia and AMD cards (yes, I have both). The colour of the box doesn't matter. Price and performance do.

The console makers are not paying GPU inflation. Console chips are, famously, very low-margin designs for chip makers, which is part of why Nvidia is happy to let AMD have that business.
More because Nvidia doesn't do APUs (if you don't count low-performance designs with ARM cores like in the Nintendo Switch).
 
Joined
Sep 10, 2018
Messages
7,266 (3.14/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
It's slower. Twice as many fake frames, not twice as many frames per second.

Net result: input lag gets even worse, and input lag is the main reason people don't like fake frames in the first place. Not the only reason, but definitely the main one.

More seriously, of the thousand or so demanding titles from the last half-decade, only a tiny, tiny handful (under 50) actually even support Nvidia's frame-gen.

I'm annoyed with how Nvidia portrayed the 50 series, but honestly the best announcement is that DLSS RR, DLSS SR and DLAA are getting meaningful improvements, and that's coming to all RTX owners.
 
Joined
Dec 25, 2023
Messages
60 (0.16/day)
Nvidia didn't show any raster gains, and AMD didn't even show their GPUs. I was telling people that AMD has exited the GPU business. They'll argue that UDNA is coming. They killed off RDNA. They don't want to pour any money into graphics; you'll be gaming on their compute units. It might work, it might not, but they don't care about graphics and they made it very clear.

I was expecting the 5080 on down to be similar to the Super update: single-digit improvements. They focused on everything other than raster. Can't even buy any of the old stuff - shelves are clear at Micro Center. 5080, here I come. I can't believe how hard it is to replace my 6950 XT. The 7900 XTX is only a 50% gain, and the 4080 is close to the same. I prefer to at least double my FPS when I upgrade, so this will be the saddest upgrade ever for me. I previously went from a 1060 to a 6950 XT - that's a 4-5x FPS improvement. Are we reaching diminishing returns? The good news is that we won't have to upgrade as often with such measly gains.
 
Joined
Dec 25, 2020
Messages
7,210 (4.88/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
You're not being serious when you say you also believe a 5080 is going to be 30% faster than a 4090?

We'll have to wait and see, but I personally don't think it's impossible.
 
Joined
Jan 14, 2019
Messages
13,229 (6.05/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
I've read comments similar to yours a hundred times this past week. You are not doing your side any favors. Honestly, on the list of reasons why I'm not buying AMD GPUs, "obnoxious comments by the company's fans" is at the top.
If I limited my choices by a few idiots and blind fans on an online forum, then I wouldn't have a PC at all (let alone three). There's plenty of them in every camp.
 
Joined
Dec 25, 2020
Messages
7,210 (4.88/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
That's the thing - architectural-level improvements are non-existent. Blackwell is a shrunken Ada on steroids. Brute force. Nvidia added as many compute units as possible and tried to balance it power-wise.

You can tell from the 5090's specs that efficiency is also a problem now. The 4090 has about 5,000 fewer shaders but also a much lower TGP than the 5090. Were there any significant architectural changes, it would not end up like that. Nvidia brute-forced everything towards the so-called AI features (DLSS, FG). Jensen has already stated that this is the only way forward for the next stage of gaming. I have my doubts, though.

As for the new DLSS, please don't say that the 4000 series will be deprived of nothing and that it basically just won't support something here, something there and god knows where else as well. The RTX 4090 is surely capable (hardware-wise) of the new DLSS tech if the slower 5080 is capable (and anything below the 5080 as well). Or change my mind: give me one real reason why the 4090 would not be capable.

The RTX 5080 will not beat the RTX 4090 in native rendering. Because:
- not enough computing power
- it would negatively affect 4090 sales, which is Jensen's golden goose; they can't just release something more powerful and price it 20-30% less, or they would cripple their own sales
- there will be an RTX 5080 Ti with around 14k shaders, and that one may be on par with the 4090

Performance-wise, from best:
RTX 5090
RTX 4090
RTX 5080 Ti (Super) with around 400W TGP
RTX 5080
RTX 5070 Ti (Super)
RTX 5070

Following this logic, there would never be a generational uplift over the previous halo part, yet such an uplift has happened for the past few generations. It's possible, but personally I'm optimistic about at least a match. We'll have to wait and see. After all, it's pretty much what AMD is proposing with the 9070 XT: a leaner and meaner chip that will go toe to toe with their previous-generation flagship with fewer raw hardware resources.
 
Joined
Sep 10, 2018
Messages
7,266 (3.14/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
As much as I don't get all the hating on DLSS 4 etc., the fact of the matter is that they are not actual frames (they do not affect the game engine) and therefore they just shouldn't be on a framerate graph or slide, whether from Nvidia's marketing or from reviewers.

On the other hand, it's really hard to demonstrate what FG actually does, so what other way do you have besides putting them on a graph?

I have no issue with them showing how they've improved frame generation - personally, I think it's awesome that they are. I'm just not a fan of them omitting the actual apples-to-apples performance difference, especially when turning frame generation on increases latency at each step from 2x to 3x to 4x. I will say I'm impressed it isn't much higher after the first step, but until there are no noticeable artifacts and latency goes down at each step, it shouldn't be sold as extra performance.

I am happy that the core DLSS technologies are improving for all RTX owners - probably the best announcement, period.
 