
AMD Announces FidelityFX Super Resolution 3 (FSR 3) Fluid Motion Rivaling DLSS 3, Broad Hardware Support

Status
Not open for further replies.
Joined
Sep 10, 2018
Messages
6,966 (3.03/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
That's proof that it works better on Ada. It's not proof that it doesn't work on Turing and Ampere.
If Nvidia could enable RT on cards that have zero dedicated RT hardware (GTX 1000 series), then this shouldn't be a problem, either.

My guess is they're in a damned-if-they-do, damned-if-they-don't situation and decided against releasing it on Turing/Ampere.

Let's say they did release it, and it performed like crap with a ton of artifacts: people would say they gimped it on purpose, so they'd be in basically the same situation they're in now.

Whenever I buy a GPU, I buy it for the performance it gives me that day. I think most people just buy whatever performs best within their budget regardless. Anyone buying a GPU because the box is red or green, when there are better options at the same price point, is only doing themselves a disservice.
 
Joined
Apr 12, 2013
Messages
7,563 (1.77/day)
I buy it for the RGB :pimp:
Bad Boy Deal With It GIF by TikTok
Cat Swag GIF by MOODMAN
Anyway, future-proofing, or gaining more features over time, is hardly something potential buyers should pass up! This is also why a lot of phone makers these days promise 3-4 years of major OS updates, even on mid-range phones. Which is to say that expecting (or hoping for) a lot more out of your GPU or CPU down the line is not unfair, as far as I'm concerned.
 
Joined
Dec 28, 2013
Messages
151 (0.04/day)
Back in my GTX 970 days, I had the daily black screen with the "Display Driver Stopped Responding and Has Recovered" error for years.
Don't talk to me about Nvidia's perfect driver stability. Just check the changelogs of their driver and hotfix releases.
 
Joined
Jul 20, 2020
Messages
1,152 (0.71/day)
System Name Gamey #1 / #3
Processor Ryzen 7 5800X3D / Ryzen 7 5700X3D
Motherboard Asrock B450M P4 / MSi B450 ProVDH M
Cooling IDCool SE-226-XT / IDCool SE-224-XTS
Memory 32GB 3200 CL16 / 16GB 3200 CL16
Video Card(s) PColor 6800 XT / GByte RTX 3070
Storage 4TB Team MP34 / 2TB WD SN570
Display(s) LG 32GK650F 1440p 144Hz VA
Case Corsair 4000Air / TT Versa H18
Power Supply EVGA 650 G3 / EVGA BQ 500
That's proof that it works better on Ada. It's not proof that it doesn't work on Turing and Ampere.
If Nvidia could enable RT on cards that have zero dedicated RT hardware (GTX 1000 series), then this shouldn't be a problem, either.

I think the difference is that one is adding a new feature, RT. The other is not and is instead adding perceived smoothness, FrameGen.

RT sucks the frames out of your card no matter which one you use, so having it cut to 20-25% on a 1xxx series card was merely 2-3x worse than a Turing card but still allowed the user to "see what they were missing." It's a decent advertising gimmick.

Frame Generation exists specifically to make more frames to increase perceived smoothness. If adding FrameGen to Turing and Ampere ends up adding few or no additional frames, then you are getting nothing yet taking a hit on latency in the process.

One (RT) adds something while the other (FG) adds nothing on "unsupported" cards, which is why RT got added to those cards and FG wasn't.
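That latency hit is easy to ballpark with a toy model (my own sketch, not Nvidia's numbers): interpolation-style frame generation has to hold back one real frame so it can blend toward the next one, which adds roughly one real-frame-time of presentation delay.

```python
def added_latency_ms(real_fps: float) -> float:
    """Toy model of frame generation's latency cost.

    Interpolation must buffer one real frame to synthesise an in-between
    image, adding roughly one real-frame-time of delay (driver/queue
    details ignored).
    """
    return 1000.0 / real_fps

# At 60 real fps that's ~16.7 ms of extra latency; at 30 real fps it's
# ~33.3 ms, which is exactly the low-fps case where you'd most want the
# extra frames.
```

The asymmetry is the point: the slower the card renders, the bigger the latency penalty for turning FG on, which is one plausible reason to gate it to faster hardware.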
 
Joined
Sep 10, 2018
Messages
6,966 (3.03/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
I think the difference is that one is adding a new feature, RT. The other is not and is instead adding perceived smoothness, FrameGen.

RT sucks the frames out of your card no matter which one you use, so having it cut to 20-25% on a 1xxx series card was merely 2-3x worse than a Turing card but still allowed the user to "see what they were missing." It's a decent advertising gimmick.

Frame Generation exists specifically to make more frames to increase perceived smoothness. If adding FrameGen to Turing and Ampere ends up adding few or no additional frames, then you are getting nothing yet taking a hit on latency in the process.

One (RT) adds something while the other (FG) adds nothing on "unsupported" cards, which is why RT got added to those cards and FG wasn't.

The argument now is that if AMD, despite being nearly a year late, can get it working with asynchronous compute, then surely Nvidia could have too. We'll have to wait until the technology is actually out before critiquing how close it comes to the competing technology; so far, according to Digital Foundry's hands-off demonstration, it's promising.
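As an illustration of why the async-compute route is plausible at all (a toy sketch of my own, not FSR 3's actual algorithm): stripped of the motion-vector/optical-flow warping that makes real frame generation look good, synthesising an in-between frame is just shader-friendly math on two rendered images, e.g. a naive linear blend:

```python
import numpy as np

def naive_inbetween(prev_frame: np.ndarray, next_frame: np.ndarray,
                    t: float = 0.5) -> np.ndarray:
    """Blend two rendered frames into a synthetic middle frame.

    Real frame generation warps pixels along motion vectors / optical flow
    instead of blending in place -- that's where dedicated hardware comes
    in -- so this is only the toy version of the idea.
    """
    a = prev_frame.astype(np.float32)
    b = next_frame.astype(np.float32)
    return ((1.0 - t) * a + t * b).round().astype(np.uint8)
```

The interesting engineering question is purely where this per-pixel work runs: on dedicated units (Nvidia's optical flow accelerator) or on spare general-purpose shader time via async compute, which is the route AMD is claiming works.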
 
Joined
Jan 14, 2019
Messages
12,577 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
My guess is they're in a damned-if-they-do, damned-if-they-don't situation and decided against releasing it on Turing/Ampere.

Let's say they did release it, and it performed like crap with a ton of artifacts: people would say they gimped it on purpose, so they'd be in basically the same situation they're in now.

Whenever I buy a GPU, I buy it for the performance it gives me that day. I think most people just buy whatever performs best within their budget regardless. Anyone buying a GPU because the box is red or green, when there are better options at the same price point, is only doing themselves a disservice.
I think the difference is that one is adding a new feature, RT. The other is not and is instead adding perceived smoothness, FrameGen.

RT sucks the frames out of your card no matter which one you use, so having it cut to 20-25% on a 1xxx series card was merely 2-3x worse than a Turing card but still allowed the user to "see what they were missing." It's a decent advertising gimmick.

Frame Generation exists specifically to make more frames to increase perceived smoothness. If adding FrameGen to Turing and Ampere ends up adding few or no additional frames, then you are getting nothing yet taking a hit on latency in the process.

One (RT) adds something while the other (FG) adds nothing on "unsupported" cards, which is why RT got added to those cards and FG wasn't.
Maybe, maybe not. We won't know unless they do decide to roll it out for Turing and Ampere in the future.

Personally, I don't like all this "new tech" Nvidia introduces with every generation. One may see it as something new and exciting, but to me, it's just gimmicks to make people spend money on an upgrade even when they otherwise wouldn't have to. I'm more of an advocate of unified, hardware-agnostic standards and a level playing field, where the only major qualities of a graphics card are its computing power and price. If Nvidia is really a software company, as some claim, then they should develop software that runs on everything, instead of dedicated hardware designed to take away people's choice when buying a GPU.
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
That's proof that it works better on Ada. It's not proof that it doesn't work on Turing and Ampere.
If Nvidia could enable RT on cards that have zero dedicated RT hardware (GTX 1000 series), then this shouldn't be a problem, either.
But nvidia themselves said that yes, it can work on older hardware. It will just look like crap.
 
Joined
Jan 14, 2019
Messages
12,577 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
But nvidia themselves said that yes, it can work on older hardware. It will just look like crap.
I'd rather judge that for myself than believe Nvidia without any evidence presented.
 
Joined
Oct 28, 2012
Messages
1,195 (0.27/day)
Processor AMD Ryzen 3700x
Motherboard asus ROG Strix B-350I Gaming
Cooling Deepcool LS520 SE
Memory crucial ballistix 32Gb DDR4
Video Card(s) RTX 3070 FE
Storage WD sn550 1To/WD ssd sata 1To /WD black sn750 1To/Seagate 2To/WD book 4 To back-up
Display(s) LG GL850
Case Dan A4 H2O
Audio Device(s) sennheiser HD58X
Power Supply Corsair SF600
Mouse MX master 3
Keyboard Master Key Mx
Software win 11 pro
Maybe, maybe not. We won't know unless they do decide to roll it out for Turing and Ampere in the future.

Personally, I don't like all this "new tech" Nvidia introduces with every generation. One may see it as something new and exciting, but to me, it's just gimmicks to make people spend money on an upgrade even when they otherwise wouldn't have to. I'm more of an advocate of unified, hardware-agnostic standards and a level playing field, where the only major qualities of a graphics card are its computing power and price. If Nvidia is really a software company, as some claim, then they should develop software that runs on everything, instead of dedicated hardware designed to take away people's choice when buying a GPU.
You need to imagine how someone who isn't a big tech nerd might react to a lesser implementation of DLSS 3: they'll toggle the setting out of curiosity, see that it looks/performs like crap, and base their whole opinion of the tech on that personal experience. They're not going to research why DLSS 3 performs best from a specific generation onwards, or that the hardware used for FG is more powerful there. Letting customers try out everything they want is a double-edged sword: if it doesn't work, they'll still expect you to fix it somehow, and if you don't fix it, you hurt your brand image and the product gets deemed crap. Nvidia isn't the first brand that would rather not poke that bear. The more mainstream something is meant to be, the less control you'll have over it, because tech support doesn't want to get swarmed by people who don't understand what "not officially supported" means :D

Sometimes the industry needs a push. Vulkan was born from Mantle, and the things TressFX and GameWorks did are now standard features in game engines.
 
Joined
Jan 14, 2019
Messages
12,577 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Sometimes the industry needs a push. Vulkan was born from Mantle, and the things TressFX and GameWorks did are now standard features in game engines.
Yes, but those were all hardware-agnostic, just like they are now. The industry needs a push, but not one by company X to buy only company X's cards.

As for the longer part of your post: I guess I see the point. It's just not how I would prefer it. Nvidia could at least release some footage of a Turing GPU running FG like crap.
 
Joined
Oct 28, 2012
Messages
1,195 (0.27/day)
Processor AMD Ryzen 3700x
Motherboard asus ROG Strix B-350I Gaming
Cooling Deepcool LS520 SE
Memory crucial ballistix 32Gb DDR4
Video Card(s) RTX 3070 FE
Storage WD sn550 1To/WD ssd sata 1To /WD black sn750 1To/Seagate 2To/WD book 4 To back-up
Display(s) LG GL850
Case Dan A4 H2O
Audio Device(s) sennheiser HD58X
Power Supply Corsair SF600
Mouse MX master 3
Keyboard Master Key Mx
Software win 11 pro
Yes, but they were all hardware-agnostic, just like they are now. The industry needs a push, but not by X company to buy only X company's cards.
Honestly, the impression I'm getting from the GPU market right now is that there are growing pains around machine learning hardware. For a while, Nvidia was the only one to have it; Intel followed, but their ML hardware isn't software-compatible with Nvidia's; and AMD doesn't seem to think that dev-accessible AI on a consumer GPU is the future, and it's MIA on the consoles.
-So Nvidia wants to power everything with machine learning.
-Intel wants to do it as well, but they still propose an agnostic solution, apparently because they can't make XeSS work with Nvidia's tensor cores.
-AMD just wants to use the basic GPU hardware, since that seems to be the only workable agnostic solution at the moment.
-DirectML is supposed to be hardware-agnostic, but no one uses it for upscaling or frame generation? (genuine question)

Upscaling/FG seems to suffer from a difference of philosophy about the means of achieving it, and from the fact that each company seems unable to make use of the others' specialised hardware. So there's something to clean up and standardise there... but I think Microsoft would need to make a DirectX 12_3 ("DirectX Ultimate ML"), where every vendor gets a guideline for what the ML hardware needs to be able to do to be compliant.
 
Joined
Oct 27, 2009
Messages
1,190 (0.21/day)
Location
Republic of Texas
System Name [H]arbringer
Processor 4x 61XX ES @3.5Ghz (48cores)
Motherboard SM GL
Cooling 3x xspc rx360, rx240, 4x DT G34 snipers, D5 pump.
Memory 16x gskill DDR3 1600 cas6 2gb
Video Card(s) blah bigadv folder no gfx needed
Storage 32GB Sammy SSD
Display(s) headless
Case Xigmatek Elysium (whats left of it)
Audio Device(s) yawn
Power Supply Antec 1200w HCP
Software Ubuntu 10.10
Benchmark Scores http://valid.canardpc.com/show_oc.php?id=1780855 http://www.hwbot.org/submission/2158678 http://ww
3dfx killed itself. Stop making up stupid bullshit to justify your lack of actual argument.

Ah yes the good old "I don't actually have an argument so I'm going to bring up everything that I think NVIDIA has ever done wrong". I could do the same for ATI/AMD, but I won't, because I'm smart enough to know that that's not an argument, it's just stupid whining by a butthurt AMD fanboy.
You asked for a history lesson, don't be annoyed you got one.
 
Joined
Apr 13, 2023
Messages
38 (0.06/day)
Are you not happy with your 13900KS/4090 setup? The inevitable and excessive crying never ends for team green/blue. :D
 
Last edited by a moderator:
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
That's proof that it works better on Ada. It's not proof that it doesn't work on Turing and Ampere.
Of course it isn't. But you keep claiming that it will work on Turing and Ampere, also without any proof. Do you see your hypocrisy?

You asked for a history lesson, don't be annoyed you got one.
I didn't ask for a history lesson, and the stupid bullshit you made up in a pathetic attempt to support your not-argument wasn't one. Unless it's a history of your inability to make a coherent argument.
 
Joined
May 8, 2018
Messages
1,571 (0.65/day)
Location
London, UK
The way AMD presented it, it seems too good to be true.
 
Joined
Oct 27, 2009
Messages
1,190 (0.21/day)
Location
Republic of Texas
System Name [H]arbringer
Processor 4x 61XX ES @3.5Ghz (48cores)
Motherboard SM GL
Cooling 3x xspc rx360, rx240, 4x DT G34 snipers, D5 pump.
Memory 16x gskill DDR3 1600 cas6 2gb
Video Card(s) blah bigadv folder no gfx needed
Storage 32GB Sammy SSD
Display(s) headless
Case Xigmatek Elysium (whats left of it)
Audio Device(s) yawn
Power Supply Antec 1200w HCP
Software Ubuntu 10.10
Benchmark Scores http://valid.canardpc.com/show_oc.php?id=1780855 http://www.hwbot.org/submission/2158678 http://ww
I didn't ask for a history lesson, and the stupid bullshit you made up in a pathetic attempt to support your not-argument wasn't one. Unless it's a history of your inability to make a coherent argument.
Since you forgot,

You asked for a history of anti-consumer behavior.
And frankly, I'm not sure why you're in denial of it: both the historical facts and having asked for it, lol.
 
Joined
Jan 14, 2019
Messages
12,577 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Of course it isn't. But you keep claiming that it will work on Turing and Ampere, also without any proof. Do you see your hypocrisy?
No - I keep claiming that we have proof that Turing and Ampere have the necessary hardware, and that we have no proof that it doesn't work. Nvidia is kindly asking us to believe whatever they say at face value. If they provided a video to compare how it runs across Turing/Ampere/Ada, so we could see with our own eyes why they chose to only enable it on Ada, it would be the difference of night and day.

Edit: Here's a little info morsel on the topic:

Honestly, the impression I'm getting from the GPU market right now is that there are growing pains around machine learning hardware. For a while, Nvidia was the only one to have it; Intel followed, but their ML hardware isn't software-compatible with Nvidia's; and AMD doesn't seem to think that dev-accessible AI on a consumer GPU is the future, and it's MIA on the consoles.
-So Nvidia wants to power everything with machine learning.
-Intel wants to do it as well, but they still propose an agnostic solution, apparently because they can't make XeSS work with Nvidia's tensor cores.
-AMD just wants to use the basic GPU hardware, since that seems to be the only workable agnostic solution at the moment.
-DirectML is supposed to be hardware-agnostic, but no one uses it for upscaling or frame generation? (genuine question)

Upscaling/FG seems to suffer from a difference of philosophy about the means of achieving it, and from the fact that each company seems unable to make use of the others' specialised hardware. So there's something to clean up and standardise there... but I think Microsoft would need to make a DirectX 12_3 ("DirectX Ultimate ML"), where every vendor gets a guideline for what the ML hardware needs to be able to do to be compliant.
That makes perfect sense. And I agree - standardisation is needed.
 
Joined
Oct 27, 2009
Messages
1,190 (0.21/day)
Location
Republic of Texas
System Name [H]arbringer
Processor 4x 61XX ES @3.5Ghz (48cores)
Motherboard SM GL
Cooling 3x xspc rx360, rx240, 4x DT G34 snipers, D5 pump.
Memory 16x gskill DDR3 1600 cas6 2gb
Video Card(s) blah bigadv folder no gfx needed
Storage 32GB Sammy SSD
Display(s) headless
Case Xigmatek Elysium (whats left of it)
Audio Device(s) yawn
Power Supply Antec 1200w HCP
Software Ubuntu 10.10
Benchmark Scores http://valid.canardpc.com/show_oc.php?id=1780855 http://www.hwbot.org/submission/2158678 http://ww
No - I keep claiming that we have proof that Turing and Ampere have the necessary hardware, and that we have no proof that it doesn't work. Nvidia is kindly asking us to believe whatever they say at face value. If they provided a video to compare how it runs across Turing/Ampere/Ada, so we could see with our own eyes why they chose to only enable it on Ada, it would be the difference of night and day.


That makes perfect sense. And I agree - standardisation is needed.
I would say AMD didn't go the DirectML route, as that would cut out all their older cards; RX 7000 is the first with "tensor cores".
https://gpuopen.com/learn/wmma_on_rdna3/ edit: it looks like DirectML could work, just slowly on older cards...
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,246 (1.28/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
we have proof that Turing and Ampere have the necessary hardware
It's a piece of hardware with the same name but significantly less capability, so at least by their own measure, it's not the necessary hardware.
 
Joined
Oct 27, 2009
Messages
1,190 (0.21/day)
Location
Republic of Texas
System Name [H]arbringer
Processor 4x 61XX ES @3.5Ghz (48cores)
Motherboard SM GL
Cooling 3x xspc rx360, rx240, 4x DT G34 snipers, D5 pump.
Memory 16x gskill DDR3 1600 cas6 2gb
Video Card(s) blah bigadv folder no gfx needed
Storage 32GB Sammy SSD
Display(s) headless
Case Xigmatek Elysium (whats left of it)
Audio Device(s) yawn
Power Supply Antec 1200w HCP
Software Ubuntu 10.10
Benchmark Scores http://valid.canardpc.com/show_oc.php?id=1780855 http://www.hwbot.org/submission/2158678 http://ww
It's a piece of hardware with the same name, but lesser capability by a significant amount, so at least by their own measure, it's not the necessary hardware.
That is the problem with a lack of transparency in product details.
AMD has this same issue with the WMMA, they have dedicated AI cores but... that's all we know, they will do things sometime in the future...
In the same way Nvidia doesn't mention the differences between its consumer tensor core implementation and workstation tensor cores.
 
Joined
Jan 14, 2019
Messages
12,577 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
It's a piece of hardware with the same name, but lesser capability by a significant amount, so at least by their own measure, it's not the necessary hardware.
I would still like to see how it handles (or doesn't handle) FG instead of believing Nvidia's claims without a second thought.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,246 (1.28/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
I would still like to see how it handles (or doesn't handle) FG instead of believing Nvidia's claims without a second thought.
I too would be very interested to know, and benefit of the doubt goes in all directions: I need to see certain claims tested before I'm willing to accept AMD's word for it, especially after they've shown extensive "we can be dodgy and anti-consumer" chops recently.
 
Joined
Jan 14, 2019
Messages
12,577 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
I too would be very interested to know, and benefit of the doubt goes in all directions, I need to see certain claims tested before I am willing to accept AMD's word for it, especially after showing they have extensive "we can be dodgy and anti consumer" chops, especially recently.
Absolutely. Marketing material is never to be believed from any company.
 
Joined
Aug 25, 2021
Messages
1,183 (0.97/day)
I admit, I said mean things about the queen and some other royal family and was told by someone on Twitter they were going to phone me into the local police. However living in Montana I really don't give a fuck who they call, and even offered to donate to get the local constable a row boat to make the trip.
Some places aren't speech nazis.
Montana? The voice from the depth of whale's belly.
 
Joined
Feb 1, 2019
Messages
3,667 (1.70/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
So AMD just boosted Ampere cards on behalf of Nvidia lol.

Nvidia meanwhile will continue to use software to sell hardware.

In my gtx 970 days I Had the daily black screen with the "Display Driver Stopped Responding and Has Recovered" error for years.
Don't talk to me about nvidia's perfect driver stability. Just check their driver & hotfix releases change log
I used to get that; I now routinely increase the driver timeout out of paranoia.
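For anyone wanting to do the same, the timeout in question is Windows' TDR (Timeout Detection and Recovery) watchdog; the commonly cited tweak is raising `TdrDelay` (in seconds, default 2) under the GraphicsDrivers key, then rebooting. Registry edit, so at your own risk:

```shell
:: Raise the GPU driver watchdog timeout to 10 seconds (run from an elevated prompt, then reboot)
reg add "HKLM\SYSTEM\CurrentControlSet\Control\GraphicsDrivers" /v TdrDelay /t REG_DWORD /d 10 /f
```

This doesn't fix whatever makes the driver stall; it just gives it longer to recover before Windows resets it.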
 