
NVIDIA Bundles Metal Gear Solid V: The Phantom Pain with Select GeForce Products

Joined
May 2, 2013
Messages
170 (0.04/day)
@newtekie1
Do you have any proof of the GTX 960 being better at MSAA?

With regard to your stance on CPU "physics", well... good luck getting the DX11 runtime to manage 10 threads submitting work at once. It's simply not possible with the current, or rather previous-gen, APIs.
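To be clear about what I mean: the best DX11 offers is deferred contexts, where each worker thread records into its own deferred context, but every resulting command list still has to be replayed one by one on the single immediate context, and that's exactly where it serializes. A rough sketch (assuming a device and immediate context you've already created, error checking omitted):

```cpp
#include <d3d11.h>

// Worker thread: record commands into a deferred context and hand back
// a command list the main thread can replay later.
ID3D11CommandList* RecordWork(ID3D11Device* device)
{
    ID3D11DeviceContext* deferred = nullptr;
    device->CreateDeferredContext(0, &deferred);

    // ... issue Draw/Dispatch calls on `deferred` here ...

    ID3D11CommandList* cmdList = nullptr;
    deferred->FinishCommandList(FALSE, &cmdList);
    deferred->Release();
    return cmdList;
}

// Main thread: the bottleneck. Every command list is executed serially
// on the one immediate context, no matter how many threads recorded them.
void Submit(ID3D11DeviceContext* immediate, ID3D11CommandList* cmdList)
{
    immediate->ExecuteCommandList(cmdList, FALSE);
    cmdList->Release();
}
```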

I'd say that AMD's HSA chips are quite a good option for heavy simulation and AI processing. The GPU in these chips can address the whole virtual memory space, and communication between the CPU and GPU is handled in hardware without any context switches. All you need to do is add pointers in your code where you want execution to switch between the CPU and the GPU. AMD's HSA chips seem like the most sensible platform for these kinds of workloads.
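To make that concrete: on an HSA APU with an OpenCL 2.0 driver you can allocate shared virtual memory once and hand the same pointer to either side, with no buffer copies. This is just a minimal sketch of the idea, assuming a context, queue and compiled kernel already exist:

```cpp
#include <CL/cl.h>

// Minimal SVM sketch (OpenCL 2.0, fine-grained shared virtual memory).
// Assumes ctx, queue and kernel were set up elsewhere.
void run_on_apu(cl_context ctx, cl_command_queue queue, cl_kernel kernel, size_t n)
{
    // One allocation, visible to both CPU and GPU through the same pointer.
    float* data = (float*)clSVMAlloc(ctx,
                                     CL_MEM_READ_WRITE | CL_MEM_SVM_FINE_GRAIN_BUFFER,
                                     n * sizeof(float), 0);

    // CPU side: just write through the pointer.
    for (size_t i = 0; i < n; ++i)
        data[i] = (float)i;

    // GPU side: pass the very same pointer as a kernel argument.
    clSetKernelArgSVMPointer(kernel, 0, data);
    clEnqueueNDRangeKernel(queue, kernel, 1, nullptr, &n, nullptr, 0, nullptr, nullptr);
    clFinish(queue);

    // CPU can read the results straight back, no clEnqueueReadBuffer needed.
    float first = data[0];
    (void)first;

    clSVMFree(ctx, data);
}
```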
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,171 (2.81/day)
It actually runs well on my R9 290... until you decide to put it on Ultra, and suddenly it's the crappiest frame rate I've ever seen. I also tried with Crossfire and it's the same crappy frame rate, so clearly it's doing something that the GPU doesn't know how to do.
That sounds like you exceeded 4GB of video memory. CFX won't make a game go faster if you're already running out of VRAM, since memory isn't pooled; with AFR each card needs its own copy of everything. That's most of the reason why I upgraded from 2x 1GB 6870s to a 390. I finally started hitting a usability limit in CFX with a 1GB framebuffer and I wasn't playing many newer games because I knew they would run like crap.
Do you have any proof of the GTX 960 being better at MSAA?
We can do some math to figure this one out. AA is mainly handled by the ROPs. Both GPUs have the same number of ROPs. As a result, the GPU with a higher core clock is most likely going to be better at anti-aliasing and have more pixel pumping power. It doesn't take too long to notice which GPU tends to be clocked higher.
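For the curious, the back-of-the-envelope version (assuming reference clocks): pixel fill rate is roughly ROPs × core clock, so a GTX 960 gives about 32 × 1127 MHz ≈ 36 Gpixels/s while an R9 285 gives about 32 × 918 MHz ≈ 29 Gpixels/s, roughly a 23% gap on paper.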
 
Joined
May 2, 2013
Messages
170 (0.04/day)
We can do some math to figure this one out. AA is mainly handled by the ROPs. Both GPUs have the same number of ROPs. As a result, the GPU with a higher core clock is most likely going to be better at anti-aliasing and have more pixel pumping power. It doesn't take too long to notice which GPU tends to be clocked higher.

Are you an electrical engineer? You and the other guy sound too confident in your assumptions. I'm no expert on this, so I'm not gonna voice an opinion. You could be right, but then a 25% higher frequency gives over twice the performance of the 285? Nice math!
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,171 (2.81/day)
Are you an electrical engineer? You and the other guy sound too confident in your assumptions. I'm no expert on this, so I'm not gonna voice an opinion. You could be right, but then 25% higher frequency amounts to less than half the performance hit from AA?
I'm a software engineer, so while this isn't my exact area of expertise, I have tried to do a little bit of research on the matter, so my understanding might be a little rough around the edges. It seems to me that AA is processed by the ROPs because it's a pixel-level change to the scene being rendered, and it's at this point that the pixels themselves get written to the frame buffer, so it seems to be the optimal time to do it. Wikipedia (so take it for what it is) also seems to suggest that the ROPs handle this, and the rest of the article seems to be in line with that, so it makes sense to me.
Wikipedia said:
The Render Output Pipeline is an inherited term, and more often referred to as the render output unit. Its job is to control the sampling of pixels (each pixel is a dimensionless point), so it controls antialiasing, where more than one sample is merged into one pixel. All data rendered has to travel through the ROP in order to be written to the framebuffer, from there it can be transmitted to the display.

Therefore, the ROP is where the GPU's output is assembled into a bitmapped image ready for display.
Wiki article

If you read a little bit about how AA works, it makes sense that it gets done at this point. I'm not saying I'm definitely right, just that given the information I have, it seems to be the case and it makes sense. It has also been my observation that AMD excels at texture-level operations whereas NVIDIA excels at pixel-level operations in terms of performance.
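Just to illustrate what "more than one sample is merged into one pixel" means: the resolve step is basically an average of the sub-samples. The ROPs do it in fixed-function hardware, but a toy 4x MSAA resolve written out in code looks something like this (purely illustrative, nothing vendor-specific):

```cpp
#include <cstddef>
#include <vector>

struct Color { float r, g, b, a; };

// Toy 4x MSAA resolve: each output pixel is the average of its 4 samples.
// Real hardware does this in the ROPs as samples are written out to the
// framebuffer, which is why ROP throughput matters for MSAA.
std::vector<Color> resolve_msaa4(const std::vector<Color>& samples,
                                 size_t width, size_t height)
{
    std::vector<Color> resolved(width * height);
    for (size_t p = 0; p < width * height; ++p) {
        Color acc{0, 0, 0, 0};
        for (size_t s = 0; s < 4; ++s) {          // 4 samples per pixel
            const Color& c = samples[p * 4 + s];
            acc.r += c.r; acc.g += c.g; acc.b += c.b; acc.a += c.a;
        }
        resolved[p] = {acc.r / 4, acc.g / 4, acc.b / 4, acc.a / 4};
    }
    return resolved;
}
```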
 
Joined
Feb 18, 2011
Messages
1,259 (0.25/day)
I had a huge bunch of Nvidia cards in the last decade, and I never received a single game from them....
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.10/day)
I had a huge bunch of Nvidia cards in the last decade, and I never received a single game from them....

Batman was the first one I got from them. I always seem to buy the cards right before they start offering games with them...
 
Joined
Feb 18, 2011
Messages
1,259 (0.25/day)
Batman was the first one I got from them. I always seem to buy the cards right before they start offering games with them...
Perhaps the best solution would be some kind of utility like GPU-Z that would read the "id" (serial?) of the card so you could redeem the currently offered game (only once per card, of course).
I think you can do it with Batman now; at least I saw a page where you could, but they asked for a scan of the receipt and I couldn't be bothered to scan it, and now I just can't find that page.

Edit: I found it. I will report back if I get the game.
https://redeem.geforce.com/en-in/invoice-information
 