Tuesday, June 27th 2023

AMD Announced as Starfield's Exclusive Partner on PC

AMD and Bethesda have today revealed that Starfield will be best experienced on a Ryzen processor and Radeon graphics card-equipped PC. Team Red has been announced as the giant open world game's official CPU and GPU partner, but its Xbox Series hardware also gets a couple of friendly shout-outs. Todd Howard, director and executive producer at Bethesda Game Studios, stated in the video presentation: "We have AMD engineers in our code base working on FSR (FidelityFX Super Resolution) 2.0 image processing and upscaling and it looks incredible. You're going to get the benefits of that obviously on your PC but also on Xbox. We're super excited and can't wait to show everybody more."

Jack Huynh, Senior Vice President and General Manager of the Computing and Graphics Group at AMD, added: "Making this game even more special is the close collaboration between Bethesda and AMD to unlock the full potential of Starfield. We have worked hand-in-hand with Bethesda Game Studios to optimize Starfield for both Xbox and PC with Ryzen 7000 series processors and Radeon 7000 series graphics. The optimizations both accelerate performance and enhance the quality of your gameplay using highly multi-threaded code that both Xbox and PC players will get to take advantage of."
AMD is proud to announce that we are Bethesda's exclusive PC partner for the next-generation role-playing game, Starfield. Watch this special announcement video to learn how AMD and Bethesda are working together to bring the galaxy to all players this September:


About Starfield
Starfield is the first new universe in over 25 years from Bethesda Game Studios, the award-winning creators of The Elder Scrolls V: Skyrim and Fallout 4. In this next generation role-playing game set amongst the stars, create any character you want and explore with unparalleled freedom as you embark on an epic journey to answer humanity's greatest mystery. In the year 2330, humanity has ventured beyond our solar system, settling new planets, and living as a spacefaring people. You will join Constellation - the last group of space explorers seeking rare artifacts throughout the galaxy - and navigate the vast expanse of space in Bethesda Game Studios' biggest and most ambitious game.

AMD is the exclusive PC partner for Starfield, promising to deliver the most complete PC gaming experience in the galaxy. We cannot wait to explore the universe with you this September. Ready to learn more?
Source: AMD

226 Comments on AMD Announced as Starfield's Exclusive Partner on PC

#126
phanbuey
Dr. Dro

The official stance is the "no comment" card now. GN inquired and got deflected with something that all but amounts to a yes.

I suspect that the bad PR may yet result in all techs being released for this game. I really really really mean it when I say I'm interested in Starfield, I'm pre-ordering the Premium Edition as soon as the first previews go live and already started amassing the coins on my Steam wallet.

As I mentioned earlier, if this game is particularly strong on Radeon I may even be willing to flip my 3090 and get an XTX, however bad of a deal that could be otherwise. I already upgraded my CPU in anticipation of it, after all.
It is a Bethesda game so giving it a few weeks to iron things out might make you live longer.
Posted on Reply
#127
ratirt
phanbueyIn quite a few cases the 4070TI plays games like Hogwarts Legacy, Atomic Heart, Cyberpunk etc. better than a 7900XTX in real life --- why? Because the nvidia nerd just enables DLSS 2/3, sets it to Balanced, and BOOM - the game plays smoother and looks better than it does on the 7900XTX, no matter what settings the AMD owner uses. How do I know? Just built a 4070ti 5800X3D upgrade rig for a friend, and another 12600K 7900xtx mini-ITX build... And those were the games I happened to be testing with at the time.

The 7900XTX is super powerful card and a MUCH better card in raw stats and raster, but technology is a thing -- you can get to a good gaming experience without brute force-only.

That's why there's so much drama in this thread -- nvidia's software shenanigans actually work well, and when we're forced to use Raster only or (god forbid) FSR it's a big deal for people that use NVidia because it materially degrades the gaming experience simply because AMD can't compete with their vaseline smear upscaler.

I'm not mad at AMD for what they did, I'm just generally mad that I'm probably going to have to subject my eyeballs to FSR if I can't get the FPS. Hopefully they do a good job like in Cyberpunk so it's not too bad.
Weird. 4070 Ti faster than a 7900 XTX in Hogwarts?
This is the 7900 XT, not the XTX. I have included RT as well.
Look at the 4K results with RT - very interesting.

phanbuey

Raw raster-hogwarts legacy...

Now look at the difference with DLSS3 on vs off.... You can run 4k with RT no issues. At 4k it smashes the 7900xtx by 40FPS just with DLSS3 alone.
Hogwarts Legacy - DLSS 3 test @ INNO3D RTX 4070 Ti | 160W TDP limit - YouTube

Now turn on DLSS 3 - and you get over 150 FPS on the 4070ti.

Or you can build the two rigs and see for yourself.

Or let's do atomic heart native TAA vs DLSS vs FSR
Atomic Heart: FSR 2.2 vs. DLSS 2 vs. DLSS 3 Comparison Review | TechPowerUp

"Speaking of FSR 2.2 image quality, the FSR 2.2 implementation comes with noticeable compromises in image quality—in favor of performance in most sequences of the game. We spotted excessive shimmering and flickering in motion on vegetation, tree leaves and thin steel objects, which might be quite distracting for some people."

"DLSS Frame Generation technology, which has the ability to bypass CPU limitations and increase the framerate. With DLSS Super Resolution in Quality mode and DLSS Frame Generation enabled, you can expect almost doubled performance at 1440p and 1080p, and during our testing, overall gameplay felt very smooth and responsive, we haven't spotted any issues with input latency."

^ from TPU reviewers.

I've played on both, and I can tell you there are quite a few games that the 4070ti outright smashes the 7900xtx in gaming experience due to the settings it allows purely due to DLSS. And in just DLSS2 games, no frame gen, DLSS 2 balanced still looks better than any FSR 2 Quality - so you're basically getting the same performance at a better IQ.
Medium quality? What will we do in the future, test at low quality and call it representative of performance?
DLSS is an upscaler; it does not matter here.
Posted on Reply
#128
MarsM4N
kapone32I am so glad I watched the MSI Gaming livestream this week. They showed DLSS3 with Frame Gen and the person playing could not shoot anyone in an FPS and admitted the floaty feeling and lag that those "innovations" introduced into the Game. If you like them, good for you. I spent my money on VRAM, as the 127 FPS that Hitman 3 shows is perfectly smooth. Then I have an X3D chip for the 1% lows, so I am Golden.
I didn't even know that DLSS & FSR get the magic FPS boost by running the game at a lower resolution and then scaling it up, lol. :laugh: Which could explain why hitreg is messed up. Maybe not because of the lag, but because the game is running at a lower resolution with fewer pixels, which could result in less pixel accuracy?
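For reference, a minimal sketch of the arithmetic behind that "magic FPS boost", assuming the commonly published FSR 2 preset ratios (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x per axis); individual games can expose different factors:

```python
# Sketch: internal render resolution a temporal upscaler actually draws before
# scaling back up to the output resolution. Ratios assume the commonly
# published FSR 2 presets; specific games may differ.

PRESET_RATIO = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    """Per-axis resolution the GPU renders for a given upscaler preset."""
    ratio = PRESET_RATIO[preset]
    return round(out_w / ratio), round(out_h / ratio)

for preset in PRESET_RATIO:
    w, h = internal_resolution(3840, 2160, preset)
    share = (w * h) / (3840 * 2160)
    print(f"4K {preset}: renders {w}x{h} (~{share:.0%} of the output pixels)")
```

So at a 4K output on the "Quality" preset the GPU only draws roughly 2560x1440 and the upscaler reconstructs the rest, which is where the frame-rate gain comes from.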
Posted on Reply
#129
HOkay
kapone32I am so glad I watched the MSI Gaming livestream this week. They showed DLSS3 with Frame Gen and the person playing could not shoot anyone in an FPS and admitted the floaty feeling and lag that those "innovations" introduced into the Game. If you like them, good for you. I spent my money on VRAM, as the 127 FPS that Hitman 3 shows is perfectly smooth. Then I have an X3D chip for the 1% lows, so I am Golden.
Hey now, Frame Generation is great...as long as you have a solid 100+ fps before the Frame Generation is added...& therefore don't really need the extra frames anyway...
It's a fun tech for the top end, but having tried it a little myself, I will definitely lower settings instead. DLSS by itself is getting very close to good enough with motion artifacts that it's kinda 50:50 whether I'll use it or lower settings, depending on the implementation in the specific game.
Posted on Reply
#130
phanbuey
ratirtWeird. 4070 Ti faster than a 7900 XTX in Hogwarts?
This is the 7900 XT, not the XTX. I have included RT as well.
Look at the 4K results with RT - very interesting.

Medium quality? What will we do in the future, test at low quality and call it representative of performance?
DLSS is an upscaler; it does not matter here.
That’s without any upscaling tech - enable dlss and everything changes; that was the point of that post.

Back to the thread — that’s why people who have dlss are pissed it’s not there.
Posted on Reply
#131
ratirt
phanbueyThat’s without any upscaling tech - enable dlss and everything changes; that was the point of that post.
Exactly, this is without an upscaler and that is what this card is capable of. Since when do we measure the performance of a card with upscalers? It is pointless in my opinion to even mention it as a performance metric of a graphics card when an upscaler is on. The 7900 XT can also use an upscaler, you know.
Posted on Reply
#132
phanbuey
ratirtExactly, this is without an upscaler and that is what this card is capable of. Since when do we measure the performance of a card with upscalers? It is pointless in my opinion to even mention it as a performance metric of a graphics card when an upscaler is on. The 7900 XT can also use an upscaler, you know.
Because it can materially enhance the experience. I don’t actually care what card I use or what settings are enabled — I care about the actual gaming experience and visual quality.

At the end of the day I will enable the settings that give the best game experience - and so do most gamers.

And when you do that, things change massively— it's only with everything disabled that benchmarks are run for reviews, but it's not really how people play.

Even hwunboxed in that review said his first numbers were so massively in favor of Nvidia he had to turn off dlss because it was bugged and on — he had to turn off features to make it competitive.

I'm not saying that the 4070ti is a superior product - at all. I'm saying these features are so huge that they propel this product +2 tiers when they're available, and that's why gamers who use them get their panties in a wad when they're not.

DLSS 2 + reflex in shooters +30-40% fps boost, DLSS 3 + reflex in other games (Diablo 4, hw legacy etc) +80% fps…
Posted on Reply
#133
Unregistered
We live in a weird world where people defend a company that not only locked other GPUs out of its technology but, worse, even its own customers.
Rather than blaming AMD, blame nVidia for not allowing their DLSS to work on all GPUs (at least recent ones). Intel and AMD have shown that it is possible to have upscaling working without requiring specialised hardware; nVidia, à la Apple, wants to lock everyone to their hardware.
Very disappointed by the community.
#134
ratirt
phanbueyBecause it can materially enhance the experience. I don’t actually care what card I use or what settings are enabled — I care about the actual gaming experience and visual quality.

At the end of the day I will enable the settings that give the best game experience - and so do most gamers.

And when you do that, things change massively— it's only with everything disabled that benchmarks are run for reviews, but it's not really how people play.

Even hwunboxed in that review said his first numbers were so massively in favor of Nvidia he had to turn off dlss because it was bugged and on — he had to turn off features to make it competitive.
Tell me:
When you buy a graphics card, do you pay for the performance the card gives?
Or do you just buy any card and pay for the upscaling technique to play a game at a playable frame rate? Take Hogwarts at 4K on the 4070 Ti with RT. Not even DLSS 3 can lift it off the ground.

You know what, I wish NV made the upscaler so good that you literally pay $1k for a 4050 and then just use the upscaler to get 60 or 100 fps in a game like Hogwarts. I really cheer for that and encourage NV to think about that approach. I'm sure the likes of you will admire NV's effort to make the upscaler so good that you can literally play on any graphics card newly released by NV.
Posted on Reply
#135
phanbuey
ratirtTell me:
When you buy a graphics card, do you pay for the performance the card gives?
Or do you just buy any card and pay for the upscaling technique to play a game at a playable frame rate? Take Hogwarts at 4K on the 4070 Ti with RT. Not even DLSS 3 can lift it off the ground.

You know what, I wish NV made the upscaler so good that you literally pay $1k for a 4050 and then just use the upscaler to get 60 or 100 fps in a game like Hogwarts. I really cheer for that and encourage NV to think about that approach. I'm sure the likes of you will admire NV's effort to make the upscaler so good that you can literally play on any graphics card newly released by NV.
That’s not true - enable dlss 3 with rt - dlss 2 lifts it way off the ground - and dlss 3 almost doubles it.

When I buy a gfx card I buy to game - and I do take upscalers into account as do most gamers.
Posted on Reply
#136
HOkay
phanbueyThat’s not true - enable dlss 3 with rt - dlss 2 lifts it way off the ground - and dlss 3 almost doubles it.

When I buy a gfx card I buy to game - and I do take upscalers into account as do most gamers.
I do think you're right in that the majority of people buying computers to game on will care more about the performance with all the tricks enabled. I'm curious whether it "feels" ok to you when you've got e.g. 30fps doubled to 60fps? I really hated the fact it still felt like 30fps due to the responsiveness to inputs still being at the lower frame rate.
Posted on Reply
#137
ratirt
phanbueyThat’s not true - enable dlss 3 with rt - dlss 2 lifts it way off the ground - and dlss 3 almost doubles it.

When I buy a gfx card I buy to game - and I do take upscalers into account as do most gamers.
It is hard to imagine you buy a graphics card to plow a field. That's not the point.
You pay for the performance the GPU offers as it is, not with upscalers. With that notion you would purchase cards like a 4050 for thousands of dollars, since the upscaler will do the trick to make it playable at high res. I need to remind you, it is still an upscaler and it has flaws.
Posted on Reply
#138
TheoneandonlyMrK
phanbueyThat’s not true - enable dlss 3 with rt - dlss 2 lifts it way off the ground - and dlss 3 almost doubles it.

When I buy a gfx card I buy to game - and I do take upscalers into account as do most gamers.
I don't think your opinion is as widespread as you think. Even if it were widespread within the enthusiast community, I think the majority cannot afford it, does not consider it at all in reality, and just wants to play Roblox or CoD on whatever their parents got talked into buying at Currys.

I also think FSR 2.2 on Quality is not as bad as many make out, and IF you need to use it, it is free.

To be honest I previously wouldn't use scaling, and I still would rather not, since none of them are without negative points - DLSS 3 being a total no-no on CoD, Apex, BattleBit, or any racing game, especially rallying.

But when I have used it, it was always on really low-end hardware, like a Steam Deck or a 2060 laptop on Dead Space etc. The hate for FSR is hyped; when I have used it, and that's rarely for either tech, it was fine in use.

AMD passed a no comment; clearly Nvidia got its shill army to kick up a fuss after AMD rejected Streamline, a way for Nvidia to leverage DLSS into every game menu - a proprietary competitive tech they wanted AMD to assist in spreading.
I get the annoyance, but as a company that backs open source (and does so to push their own sales), I can't see any issue even with them asking for exclusivity - but they can't contract such for features. A dev team can choose to use or incorporate any technology they want, and that shouldn't be forcibly limited by contract by a partner; asking, though, would be fine, to me.

As is refusing that request.

The whole story is probably more nuanced.
Posted on Reply
#139
phanbuey
ratirtIt is hard to imagine you buy a graphics card to plow a field. That's not the point.
You pay for the performance the GPU offers as it is, not with upscalers. With that notion you would purchase cards like a 4050 for thousands of dollars, since the upscaler will do the trick to make it playable at high res. I need to remind you, it is still an upscaler and it has flaws.
What?

You buy a graphics card to play games... some people buy it to debate hardware online - but those are usually contrarian hardware enthusiasts. There is a difference between gamers and hardware guys, which is why nvidia dominates in gamer mind share. If AMD comes out with better FSR, or even if FSR improves substantially (or if there's another upscaler that becomes popular) that gives +30% to cards universally, you will see AMD's GPU market share spike.
HOkayI do think you're right in that the majority of people buying computers to game on will care more about the performance with all the tricks enabled. I'm curious whether it "feels" ok to you when you've got e.g. 30fps doubled to 60fps? I really hated the fact it still felt like 30fps due to the responsiveness to inputs still being at the lower frame rate.
30 FPS is a nightmare -- 60 FPS with FG is better but still feels awful. It's really good at like 50 to 80-100 FPS and even better at 70 -> 140 FPS - anything really below 50 FPS is not great in general.
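To put rough numbers on the "still feels like the base rate" point, a small illustrative sketch (it assumes frame generation inserts one generated frame per rendered frame and that input is only sampled on rendered frames; the figures are illustrative, not measurements of any specific game):

```python
# Illustrative only: frame generation doubles the presented frame rate, but
# input is still sampled on rendered frames, so responsiveness tracks the
# base frame rate rather than the on-screen one.

def frame_gen_feel(base_fps: float) -> str:
    rendered_interval_ms = 1000.0 / base_fps        # gap between input-sampled frames
    presented_fps = base_fps * 2                     # one generated frame per rendered frame
    presented_interval_ms = 1000.0 / presented_fps   # gap between frames shown on screen
    return (f"{base_fps:.0f} fps base -> {presented_fps:.0f} fps shown: "
            f"screen updates every {presented_interval_ms:.1f} ms, "
            f"input still sampled every {rendered_interval_ms:.1f} ms")

for fps in (30, 50, 70):
    print(frame_gen_feel(fps))
```

Which matches the feel described above: 30 -> 60 still leaves roughly 33 ms between input samples, while 70 -> 140 is down to about 14 ms, where the extra smoothness costs little in responsiveness.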
Posted on Reply
#140
kapone32
phanbueyThat’s not true - enable dlss 3 with rt - dlss 2 lifts it way off the ground - and dlss 3 almost doubles it.
Frame Generation (Fujisu Line Doubling) is not a good example when what you see is not the reality, if the game gives you 150 FPS with all of those applied and I already get 120 FPS. Do you really believe that a Freesync monitor that gives you butter-smooth frames from 45-144 Hz, with a 20 GB GPU that maintains at least 2600 MHz GPU clock in gaming, would feel worse than a card that runs at 57 FPS native and then uses other technology to render the exact same frames?

Though not direct, FPS and Hz are alike in that your monitor will directly affect your enjoyment of a product. When everyone had 60 Hz monitors it didn't matter. Now we have panels that go as high as 400 Hz+, but a lot of us have 120+ Hz monitors at various resolutions. That directly translates to GPU tier. If you have a 6600/3060/4060, a 1080P 120 Hz Freesync (or G-Sync) panel will be great. If you have a 3070/3080/4070/6700-6750 XT/6800/6800 XT, 1440P 144 Hz+ panels are excellent. Then you have the highest tier with the 6900-6950 XT, 7900 XT/XTX, 4080, 3090, 3090 Ti and 4090, which are for 4K 144 Hz+ panels, but from the 3070 you can get away with 4K 60 Hz. As long as you stick to that principle you will enjoy Gaming like never before.

The thing is, that was what was needed in the space. About 10+ years ago you could buy 1440P monitors from Korea (Qnix) that had no scaler and were just basically raw panels, but with GPU scaling no one who owned one of those complained. Just do a Google search on forums circa 2012-2013 on Korean 1440P panels for $300 from Newegg. That meant that 60 Hz had been exposed. But I was not convinced until I was playing The Division on my 4K 60 Hz panel and struggled with the Machine Gun. I bought a 1440P 144 Hz panel (Qnix) next and was blown away that I could use the Machine Gun with a scope to make headshots. That is tangible, and that is why I bought a FV43U after my 1440P 165 Hz 32QC, but I bought that when I heard about the announcement of 7000 GPUs. The promise of the increase in VRAM and clock speed vs the 6800 XT that I was running meant that I no longer had to run at 1440P for some games using that panel, and 4K Mini LED is better than OLED to me for one reason: power draw.

I love James Webb images and you should see those on this panel I am typing on with the contrast and colour turned up. That also means that Gaming is F'ing sweet. Diablo 4, Grid Legends, Spiderman, Guardians, Witcher 3, Everspace 2, Total War anything, Fear, Hellblade, Hitman and the rest of the library of games I own all look spectacular and run butter smooth. I am not even sure if any of those support DLSS, but DLSS is not in even 1% of all games - but neither is FSR.
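As a quick summary of that pairing guideline, a small sketch (the tiers and panel suggestions just paraphrase this post; they are a rule of thumb, not benchmark data):

```python
# Rule-of-thumb GPU-to-monitor pairing, paraphrasing the guideline above.
# Purely illustrative; not derived from benchmarks.

PAIRING_GUIDELINE = [
    ({"RX 6600", "RTX 3060", "RTX 4060"},
     "1080p, 120 Hz+ Freesync/G-Sync"),
    ({"RTX 3070", "RTX 3080", "RTX 4070", "RX 6700 XT", "RX 6750 XT",
      "RX 6800", "RX 6800 XT"},
     "1440p, 144 Hz+"),
    ({"RX 6900 XT", "RX 6950 XT", "RX 7900 XT", "RX 7900 XTX",
      "RTX 4080", "RTX 3090", "RTX 3090 Ti", "RTX 4090"},
     "4K, 144 Hz+"),
]

def suggested_panel(gpu: str) -> str:
    for tier, panel in PAIRING_GUIDELINE:
        if gpu in tier:
            return panel
    return "unknown tier"

print(suggested_panel("RTX 4070"))     # -> 1440p, 144 Hz+
print(suggested_panel("RX 7900 XTX"))  # -> 4K, 144 Hz+
```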

One example of what I am talking about is how cheated some Nvidia owners feel because Starfield won't support DLSS. It's 1 game in a space that has Homeworld 3, Armored Core 6, Avatar whatever-it's-called, HFB whenever it launches on PC, and games like Aliens Fireteam Elite that are plenty of fun. We are missing the mark anyway, as playing a game like Titanfall 2 with modern hardware is blissful, and I doubt anyone really knows the number of games that have been made for PC, because new ones come out every week.

I will go even further down the rabbit hole you opened. Get a 5600/10-131to400 and a 6650 XT/3060 12GB/maybe a 4060, buy yourself a 1080P high-refresh panel, do a 1-year subscription to Humble Choice, and actually play the games, and you will laugh at people making these Hot Wheels vs Matchbox, Tyco vs AFX, f me, Green Machine vs the Big Wheel (that is actually a proper analogy) arguments. If you do that you will enjoy gaming for what it is: exploring the minds of the human experience by projecting the mind's eye onto the screen. Games you have never heard of or thought about. Genres that you wouldn't typically play. There is a platformer called Raji that is one of the most beautiful games I have ever seen - think Prince of Persia with a South Asian theme. If you have read the Mahabharata (I think) it will be relevant.

In fact I am going to recommend, to all the people who feel cheated by Starfield: take that money and get 4 months of Humble Choice. You will be impressed and maybe enjoy gaming enough to not be so uptight about software variables.
Posted on Reply
#141
Vayra86
phanbueyIt is a Bethesda game so giving it a few years weeks to iron things out might make you live longer.
FTFY. I'm in anticipation of the Starfield GOTY, the Legendary edition, the Remastered GOTY legendary with horse armor edition... oh yes, I'll get them all!
Xex360We live in a weird world where people defend a company that not only locked other GPUs out of its technology but, worse, even its own customers.
Rather than blaming AMD, blame nVidia for not allowing their DLSS to work on all GPUs (at least recent ones). Intel and AMD have shown that it is possible to have upscaling working without requiring specialised hardware; nVidia, à la Apple, wants to lock everyone to their hardware.
Very disappointed by the community.
Yep, it's a bunch of fools blindly believing marketing. We have a world full of them, unfortunately.

Nobody is denying the advantage the tech brings. But we should not accept the conditions under which we get it. We are ALL best served by a unified technology push. We can do it for various AA methods, so why not here? Why are devs constantly bothered with up to THREE implementations to support? Why is the tech not available on every gpu with varying perf impact? Etc.

I mean look at Freesync. It's been the best thing that happened to us in the past decade wrt gaming, while Gsync is near dead despite Nvidia leading the charge. DLSS is the Gsync thing all over again. It won't last. It will be abused to the max.
kapone32Frame Generation (Fujisu Line Doubling) is not a good example when what you see is not the reality if the Game gives you 150 FPS with all of those applied and I already get 120 FPS. Do you really believe on a Freesync monitor that gives you butter smooth frames from 45-144 Hz using a 20 GB GPU with a card that maintains at least 2600MHz GPU clock in Gaming would feel worse than a card that runs at 57 FPS native to them use other technology to render the exact same frame.

Though not direct, FPS and Hz are alike in that your monitor will directly effect your enjoyment of a product. When everyone had 60Hz monitors it didn't matter. Now we have panels that go as high as 400 HZ+ but a lot of us have 120+ hz monitors at various resolutions. That directly translates with GPU tier. If you have a 6600/3060/4060 a 1080P 120hz Freesync (Or Gsync) panel will be great. If you have a 3070/3080/4070/6700-50XT/6800/6800XT 1440P 144 Hz+ panels are excellent. Then you have the highest tier with that 6900-50XT7900XT,XTX,4080,3090,3090 TI and 4090 which are for 4K 144Hz+ panels but from the 3070 you can get away with 4K 60Hz. As long as you stick to that principle you will enjoy Gaming like never before. The thing is that was what was needed in the space. About 10+ years ago you could buy 1440P monitors from Korea (Qnix) that had no scalar and were just basically raw panels but with GPU scaling no one who owned one of those complained. Just do a Google search on Forums circa 2012-2013 on Korean 1440P panels for $300 from Newegg. That meant that 60Hz had been exposed. But I was not convinced until I was playing the Division on my 4K 60Hz panel and struggled with the Machine Gun. I bought a 1440P 144 Hz panel (Qnix) next and was blown away that I could use the Machine Gun with a Scope to make head shots. That is tangible and that is why I bought a FV43U after my 1440P 165Z 32QC but I bought that when I heard about the announcement of 7000 GPUs. The promise of the increase in VRAM and Clockspeed vs the 6800XT that I was running meant that I no longer had to run at 1440P for some Games using that panel and 4K Mini LED is better than OLED to me for one reason Power draw. I love James Webb images and you should see those on this panel I am typing on with the Contrast and Colour turned up. That also means that Gaming is Fing sweet. Diablo 4, Grid Legends, Spiderman, Guardians, Witcher 3, Everspace 2, Total War Anything, Fear, Hellblade, Hitman and any of the other library of Games I own all look spectacular and run butter smooth. I am not even sure if any of those support DLSS but DLSS is not in even 1% of all Games but neither is FSR.

One of the examples of what I am talking about is how cheated some Nvidia owners feel because Starfield won't support DLSS. Like 1 Game in a space that has Homeworld 3, Armored Core 6, Avatar Whatever it's called HFB whenever it launches on PC and Game like Aliens Fire Team Elite that are are plenty of fun. We are missing the mark anyways as playing a Game like Titanfall 2 with modern hardware is blissful and I doubt anyone really knows the number of Games that have been made for PC because new ones come out every week.

I will go even further down the rabbit hole you opened. Get a 5600/10-131to400 and a 6650XT/3060 12GB/ and maybe 4060 buy yourself a 1080P high refresh panel and do a 1 year subscription to Humble Choice and actually play the Games and you will laugh at people making these Hotwheels vs Matchbox, Tyco vs AFX f me Green Machine vs the Big Wheel (That is actually a proper analogy)arguments. If you do that you will enjoy Gaming for what it is. Exploring the minds of the Human experience by projecting the mind's eye onto the screen. Games you have never heard of or thought about. Genres that you wouldn't typically play. There is a platformer called Raji that is one of the most beautiful Games I have ever seen think Prince of Persia with a South Asian theme. If you have read the Mahabarata (I think) it will be relevant.

In fact I am going to recommend that all the people who feel cheated by Starfield. Take that money and get 4 months of Humble Choice. You will be impressed and maybe enjoy Gaming enough to not be so uptight about software variables.
Well spoken, graphics are just presentation but some people seem to think they're the game.
phanbueyThat’s without any upscaling tech - enable dlss and everything changes; that was the point of that post.

Back to the thread — that’s why people who have dlss are pissed it’s not there.
That's indeed the point. It goes to show you can also just invest the money in a 7900XT and have the frames without tying yourself to DLSS ;) You even have the VRAM then to push (nearly?) the full bells & whistle factory on 4K in Hogwarts, where a 4070ti goes to 11 minimum FPS. Not that 22 is a fantastic number, but that is 2x the frames and it shows why 12GB kills the 4070ti - the GPU core power of each one is much closer than that gap. The real question is why RTX On Ultra @ 4K kills the 4070ti so hard, given the tier it is in and the price it has got. Not how fantastic Nvidia is capable of masking it with an upscale rendering just 1/8th of a frame's actual load.

Thát is precisely the rationale we need, and the one I had looking at the vendor lock-in this technology presents. Underneath the DLSS guise is a very weak piece of silicon. These GPU vendors can ALL push whatever goddamn silicon they want, in all fairness, but ONLY if and when they work together to get a full industry going on the technology that crutches those pieces of silicon. Until then though? We are delusional buying into it; there are way too many moving parts in gaming land to maintain all that content for a solid period of time.
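For what it's worth, the "1/8th of a frame's actual load" remark roughly checks out as back-of-envelope pixel math if you assume DLSS Performance mode plus frame generation; a quick sketch (an upper-bound illustration only, since frame cost is not purely proportional to pixel count):

```python
# Back-of-envelope check of the "1/8th of a frame's actual load" remark.
# Assumes DLSS Performance mode (50% scale per axis -> 1/4 of the pixels)
# plus frame generation (every second presented frame is interpolated rather
# than rendered). Illustration only; real frame cost is not purely pixel-bound.

output_pixels = 3840 * 2160                   # native 4K target
render_pixels = (3840 // 2) * (2160 // 2)     # Performance mode renders 1920x1080
pixel_fraction = render_pixels / output_pixels    # -> 0.25
with_frame_gen = pixel_fraction / 2               # half the presented frames are generated
print(f"Rendered share of the presented pixel load: {with_frame_gen:.3f} (~1/8)")
```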
Posted on Reply
#142
phanbuey
Vayra86That's indeed the point. It goes to show you can also just invest the money in a 7900XT and have the frames without tying yourself to DLSS ;) You even have the VRAM then to push (nearly?) the full bells & whistle factory on 4K in Hogwarts, where a 4070ti goes to 11 minimum FPS. Not that 22 is a fantastic number, but that is 2x the frames and it shows why 12GB kills the 4070ti - the GPU core power of each one is much closer than that gap. The real question is why RTX On Ultra @ 4K kills the 4070ti so hard, given the tier it is in and the price it has got. Not how fantastic Nvidia is capable of masking it with an upscale rendering just 1/8th of a frame's actual load.

Thát is precisely the rationale we need, and the one I had looking at the vendor lock-in this technology presents. Underneath the DLSS guise is a very weak piece of silicon. These GPU vendors can ALL push whatever goddamn silicon they want, in all fairness, but ONLY if and when they work together to get a full industry going on the technology that crutches those pieces of silicon. Until then though? We are delusional buying into it.
Correct -- that's why I used the 4070 - 4070ti as an example -- they are objectively weak, but can be made extremely strong using DLSS... my point wasn't about the product, my point was about the impact of not releasing with DLSS day 1 for Starfield and how much it can hurt in these newer more demanding games. 4070 users, instead of getting 60 FPS min, will go to 11. Hence the salt.
Posted on Reply
#143
Vayra86
phanbueyCorrect -- that's why i used the 4070 - 4070ti as an example -- they are objectively weak, but can be made extremely strong using DLSS... my point wasn't about the product, my point was about the impact of not releasing with DLSS day 1 for starfield and how much it can hurt in these newer more demanding games. 4070 users, instead of getting 60 FPS min will go to 11. Hence the salt.
Maybe it's a good byproduct of evolutionary processes, where people hopefully actually get wiser one day before buying into a POS.
Posted on Reply
#144
gffermari
phanbueyCorrect -- that's why i used the 4070 - 4070ti as an example -- they are objectively weak, but can be made extremely strong using DLSS... my point wasn't about the product, my point was about the impact of not releasing with DLSS day 1 for starfield and how much it can hurt in these newer more demanding games. 4070 users, instead of getting 60 FPS min will go to 11. Hence the salt.
Although I agree with your posts above....

If I want to game no matter what, I would purchase a 6800XT or 7900XT or 7900XTX. The brute force is so big that you don't rely on anything, meaning upscaling tech etc. The only thing that I would sacrifice is the RT which objectively doesn't work with AMD cards.
On the other hand, with the nVidias, while all the bells and whistles give an amazing result, you practically rely on them.
Ex. The 3080 is an amazing gpu but cannot run CP77 Overdrive due to lack of DLSS3. The same gpu or a 4070Ti will crash if the games in 1-2-3 years require 12-16GB of VRAM and don't support DLSS.

I like nVidia's tech and I'm one of the lucky ones who can afford buying a high-end gpu. But in any case I wouldn't risk not being able to game at all because of a lack of DLSS.

**I don't have to mention that FSR 2+ is fine. And even if it's not at DLSS level, when you want to play a game and don't have the power for it, ok....I wouldn't mind if the cable lines 500 meters away look broken.
Posted on Reply
#145
JAB Creations
AdmiralThrawnWhy?????

Why would they do this when 70% of PC players are on NVIDIA and Intel systems? You are cutting more than half the market share from your player base by telling them that unless they buy AMD their game will run badly. I am starting to get very tired of these exclusive bullshit releases.

Edit: This also means no DLSS or RTX for starfield. One of the aspects of this game I was most excited for was RTX and the visuals. But since I have a card made by a company with green letters I am not allowed to experience the game at "maximum overdrive super ultra mega resolution 9000 plus + Max" graphics.
Because I wouldn't give two damns about optimizing for corporations that are anti-competitive and anti-consumer.

I also wouldn't care about people dumb enough to stop a high FPS game, take a screenshot and waste time posting about it saying FSR is "inferior".

I also wouldn't care about people dumb enough to protect Nvidia when their DLSS only works on Nvidia cards.

I also wouldn't care about people who don't comprehend that RTX isn't the sole way ray-tracing is implemented.

I also wouldn't care about people willing to pay 60% more for a 10% increase in FPS when the FPS is already 150FPS+.

I also wouldn't care about people not intelligent enough to buy video cards with enough VRAM so they would last longer than two years.

I also wouldn't care about anyone dumb enough to buy an Intel/Nvidia system with a 12GB 3080 for an MSRP $1,200 at launch!
Posted on Reply
#146
Dr. Dro
JAB CreationsBecause I wouldn't give two damns about optimizing for corporations that are anti-competitive and anti-consumer.
After what AMD did to us X370 owners... i'll lump em on the anti-competitive, anti-consumer and highly opportunistic baskets at the same time. But anyway, I digress.

I was sent this on Discord, and I presume it's what the whole fuss is about:

Posted on Reply
#147
TheoneandonlyMrK
Dr. DroAfter what AMD did to us X370 owners... i'll lump em on the anti-competitive, anti-consumer and highly opportunistic baskets at the same time. But anyway, I digress.

I was sent this on Discord, and I presume it's what the whole fuss is about:

Oh, two more tangents.
Posted on Reply
#148
kapone32
Dr. DroAfter what AMD did to us X370 owners... i'll lump em on the anti-competitive, anti-consumer and highly opportunistic baskets at the same time. But anyway, I digress.

I was sent this on Discord, and I presume it's what the whole fuss is about:

What are you trying to establish? All I see is the advancement of adoption for both technologies. Indeed X370 owners were shafted because of what? PCI 3.0 support? I struggle to see how a platform that was released in 2017 that is still viable is anti consumer?
Posted on Reply
#149
sLowEnd
kapone32What are you trying to establish? All I see is the advancement of adoption for both technologies. Indeed X370 owners were shafted because of what? PCI 3.0 support? I struggle to see how a platform that was released in 2017 that is still viable is anti consumer?
The issue was AMD was wishy washy about Zen 3 support for 300/400 series boards until the community complained enough and they eventually capitulated. AMD threw up excuses like "the BIOS is too large" and dragged their feet for months.
Posted on Reply
#150
Unregistered
Vayra86Haha. It's a great reality check for those counting on that free Nvidia TLC.

Welcome to the Nvidia clusterfuck because you're also lacking the VRAM to run native at some point.

As predicted. Fools & Money will be parted

I don't 'count' on FSR either. We all need to judge raw raster performance and nothing else. This confirms it.
Ya, while I have only played a title or two with DLSS, it's a nice technology along with FSR. But what I'll take away from things like this is that when I peruse the forums for games like Jedi Survivor, and people are talking about how badly the game runs in stock form but how much better it runs with DLSS (there's a mod out there apparently), it's now another variable that one has to account for depending on their setup.

As you noted, people who 100% count on it may find a game unplayable if it lacks those "boosts" that they are expecting. Be mindful of refund windows and the like and do research on games before buying them.
Posted on Reply