Thursday, July 23rd 2015

NVIDIA Bundles Metal Gear Solid V: The Phantom Pain with Select GeForce Products

NVIDIA, which recently withdrew its Batman: Arkham Knight GeForce bundle after the game's spectacular failure forced its publisher to pull it from the shelves, has announced a new game bundle. The company will be giving away "Metal Gear Solid V: The Phantom Pain" with the GeForce GTX 980 Ti, GTX 980, GTX 970, and GTX 960, and with notebooks featuring the GTX 980M and GTX 970M. There's no word on whether the game got the company's GameWorks varnish (GeForce-exclusive effects), but NVIDIA will give it a proper support package (a Game Ready driver with SLI profiles and GeForce Experience settings). As one of the comments on the press release suggests, it would be nice if the people who fell for the Batman: Arkham Knight bundle could be compensated with keys to this game.
Source: NVIDIA

31 Comments on NVIDIA Bundles Metal Gear Solid V: The Phantom Pain with Select GeForce Products

#26
Aquinus
Resident Wat-man
profoundWHALE: It actually runs well on my R9 290... until you decide to put it on Ultra, and suddenly it's the crappiest frame rate I've ever seen. I also tried with Crossfire and it's the same crappy frame rate, so clearly it's doing something that the GPU doesn't know how to do.
That sounds like you exceeded 4 GB of video memory. CFX won't make a game go faster if you're already running out of VRAM. That's most of the reason why I upgraded from 2x 1 GB 6870s to a 390: I finally started hitting a usability limit in CFX with a 1 GB framebuffer, and I wasn't playing many newer games because I knew they would run like crap.
FrustratedGarrett: Do you have proof of the GTX 960 being better at MSAA?
We can do some math to figure this one out. AA is mainly handled by the ROPs. Both GPUs have the same number of ROPs. As a result, the GPU with a higher core clock is most likely going to be better at anti-aliasing and have more pixel pumping power. It doesn't take too long to notice which GPU tends to be clocked higher.
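For what it's worth, here's that math as a quick back-of-the-envelope script. The reference boost clocks below are my assumption for illustration; retail cards (especially factory-overclocked ones) will differ:

def pixel_fillrate_gpixel_s(rops, core_clock_mhz):
    # Theoretical peak pixel fillrate: ROPs x clock, in GPixel/s
    return rops * core_clock_mhz / 1000.0

# Both GPUs have 32 ROPs; clocks are approximate reference boost clocks (assumed)
gpus = {"GTX 960 (~1178 MHz)": (32, 1178), "R9 285 (~918 MHz)": (32, 918)}
for name, (rops, clock) in gpus.items():
    print(name, "->", round(pixel_fillrate_gpixel_s(rops, clock), 1), "GPixel/s")

That works out to roughly 37.7 vs. 29.4 GPixel/s, a gap of about 28% in raw pixel throughput. It's only a theoretical peak, not a whole-game benchmark.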
Posted on Reply
#27
FrustratedGarrett
Aquinus: We can do some math to figure this one out. AA is mainly handled by the ROPs. Both GPUs have the same number of ROPs. As a result, the GPU with a higher core clock is most likely going to be better at anti-aliasing and have more pixel pumping power. It doesn't take too long to notice which GPU tends to be clocked higher.
Are you an electrical engineer? You and the other guy sound too confident in your assumptions. I'm no expert on this, so I'm not going to voice an opinion. You could be right, but then a 25% higher frequency gives over twice the performance of the 285? Nice math!
Posted on Reply
#28
Aquinus
Resident Wat-man
FrustratedGarrett: Are you an electrical engineer? You and the other guy sound too confident in your assumptions. I'm no expert on this, so I'm not gonna voice an opinion. You could be right, but then 25% higher frequency amounts to less than half the performance hit from AA?
I'm a software engineer, so while this isn't my exact area of expertise, I have tried to do a bit of research on the matter, so my understanding might be a little rough around the edges. It seems to me that AA is processed by the ROPs because it's a pixel-level change to the scene being rendered, and it's at this point that the pixels themselves are written to the framebuffer, so it seems like the optimal time to do it. Wikipedia (so take it for what it is) also seems to suggest that the ROPs handle this, and the rest of the article seems to be in line with that, so it makes sense to me.
Wikipedia: The Render Output Pipeline is an inherited term, and more often referred to as the render output unit. Its job is to control the sampling of pixels (each pixel is a dimensionless point), so it controls antialiasing, where more than one sample is merged into one pixel. All data rendered has to travel through the ROP in order to be written to the framebuffer, from there it can be transmitted to the display.

Therefore, the ROP is where the GPU's output is assembled into a bitmapped image ready for display.
Wiki article

If you read a little bit about how AA works, it makes sense that it gets done at this point. I'm not saying I'm right, I'm just saying that, given the information I have, it seems to be the case and makes sense. It has been my observation that AMD excels at texture-level operations, whereas NVIDIA excels at pixel-level operations in terms of performance.
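To make the "more than one sample is merged into one pixel" part concrete, here's a minimal toy sketch of an MSAA-style resolve step. The 4x sample colours are made up for illustration, and real hardware does this in the ROPs, not in Python:

def resolve_pixel(samples):
    # Average a pixel's coverage samples into one output colour (RGB tuples)
    n = len(samples)
    return tuple(sum(channel) / n for channel in zip(*samples))

# 4x MSAA edge pixel: two samples hit the (white) triangle, two miss (black background)
samples = [(1.0, 1.0, 1.0), (1.0, 1.0, 1.0), (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)]
print(resolve_pixel(samples))  # (0.5, 0.5, 0.5) - a smoothed edge pixel

The more samples per pixel, the more of that merging work there is, which is why the ROP count and clock matter for AA.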
Posted on Reply
#29
Ikaruga
I had a huge bunch of Nvidia cards in the last decade, and I never received a single game from them....
Posted on Reply
#30
newtekie1
Semi-Retired Folder
Ikaruga: I had a huge bunch of Nvidia cards in the last decade, and I never received a single game from them...
Batman was the first one I got from them. I always seem to buy the cards right before they start offering games with them...
Posted on Reply
#31
Ikaruga
newtekie1: Batman was the first one I got from them. I always seem to buy the cards right before they start offering games with them...
Perhaps the best thing would be some kind of utility like GPU-Z, which would read the "id" (serial?) of the card so you could redeem the currently offered game (only once per card, of course).
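Something along these lines, for example. This is just a rough sketch assuming an NVIDIA card with nvidia-smi on the PATH; the redemption step itself is hypothetical, and consumer cards often report the serial as N/A anyway:

import subprocess

def read_gpu_identity():
    # Ask nvidia-smi for the card's name, UUID and serial (one CSV line per GPU)
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=name,uuid,serial", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    first_gpu = out.stdout.strip().splitlines()[0]
    name, uuid, serial = [field.strip() for field in first_gpu.split(",")]
    return {"name": name, "uuid": uuid, "serial": serial}

print(read_gpu_identity())  # e.g. {'name': 'GeForce GTX 970', 'uuid': 'GPU-...', 'serial': '...'}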
I think you can do it with Batman now; at least I saw some page where you can do it, but they asked for a scan of the receipt and I couldn't be bothered to scan it, and now I just can't find that page.

Edit: I found it. I will report back if I get the game.
redeem.geforce.com/en-in/invoice-information
Posted on Reply