
AMD Announced as Starfield's Exclusive Partner on PC

Joined
Dec 25, 2020
Messages
7,011 (4.81/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~

The official stance is the "no comment" card now. GN inquired and got deflected with something that all but amounts to a yes.

I suspect that the bad PR may yet result in all upscaling techs being supported in this game. I really, really, really mean it when I say I'm interested in Starfield: I'm pre-ordering the Premium Edition as soon as the first previews go live, and I've already started building up funds in my Steam wallet.

As I mentioned earlier, if this game turns out particularly strong on Radeon, I may even be willing to flip my 3090 and get an XTX, however bad a deal that might be otherwise. I already upgraded my CPU in anticipation of it, after all.
 
Joined
Nov 13, 2007
Messages
10,845 (1.74/day)
Location
Austin Texas
System Name stress-less
Processor 9800X3D @ 5.42GHZ
Motherboard MSI PRO B650M-A Wifi
Cooling Thermalright Phantom Spirit EVO
Memory 64GB DDR5 6400 1:1 CL30-36-36-76 FCLK 2200
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse DeathadderV2 X Hyperspeed
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.

The official stance is the "no comment" card now. GN inquired and got deflected with something that all but amounts to a yes.

I suspect that the bad PR may yet result in all upscaling techs being supported in this game. I really, really, really mean it when I say I'm interested in Starfield: I'm pre-ordering the Premium Edition as soon as the first previews go live, and I've already started building up funds in my Steam wallet.

As I mentioned earlier, if this game turns out particularly strong on Radeon, I may even be willing to flip my 3090 and get an XTX, however bad a deal that might be otherwise. I already upgraded my CPU in anticipation of it, after all.

It is a Bethesda game so giving it a few weeks to iron things out might make you live longer.
 
Joined
May 31, 2016
Messages
4,440 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
In quite a few cases the 4070 Ti plays games like Hogwarts Legacy, Atomic Heart, Cyberpunk etc. better than a 7900XTX in real life -- why? Because the nvidia nerd just enables DLSS 2/3, sets it to Balanced and BOOM - the game plays smoother and looks better than it does on the 7900XTX, no matter what settings the AMD owner uses. How do I know? I just built a 4070 Ti / 5800X3D upgrade rig for a friend, and another 12600K / 7900XTX mini-ITX build... and those were the games I happened to be testing with at the time.

The 7900XTX is a super powerful card and MUCH better in raw stats and raster, but technology is a thing -- you don't need brute force alone to get to a good gaming experience.

That's why there's so much drama in this thread -- nvidia's software shenanigans actually work well, and being forced to use raster only or (god forbid) FSR is a big deal for people on Nvidia hardware, because it materially degrades the gaming experience; AMD's vaseline-smear upscaler simply can't compete.

I'm not mad at AMD for what they did, I'm just generally mad that I'm probably going to have to subject my eyeballs to FSR if I can't get the FPS. Hopefully they do a good job like in Cyberpunk so it's not too bad.
Weird. 4070 Ti faster than a 7900XTX in Hogwarts?
This is the 7900XT, not the XTX. I have included RT as well.
Look at the 4K results with RT; very interesting.


View attachment 302891

Raw raster-hogwarts legacy...

Now look at the difference with DLSS 3 on vs. off... You can run 4K with RT, no issues. At 4K it smashes the 7900XTX by 40 FPS just with DLSS 3 alone.
Hogwarts Legacy - DLSS 3 test @ INNO3D RTX 4070 Ti | 160W TDP limit - YouTube

Now turn on DLSS 3 - and you get over 150 FPS on the 4070ti.

Or you can build the two rigs and see for yourself.

Or let's do atomic heart native TAA vs DLSS vs FSR
Atomic Heart: FSR 2.2 vs. DLSS 2 vs. DLSS 3 Comparison Review | TechPowerUp

"Speaking of FSR 2.2 image quality, the FSR 2.2 implementation comes with noticeable compromises in image quality—in favor of performance in most sequences of the game. We spotted excessive shimmering and flickering in motion on vegetation, tree leaves and thin steel objects, which might be quite distracting for some people."

"DLSS Frame Generation technology, which has the ability to bypass CPU limitations and increase the framerate. With DLSS Super Resolution in Quality mode and DLSS Frame Generation enabled, you can expect almost doubled performance at 1440p and 1080p, and during our testing, overall gameplay felt very smooth and responsive, we haven't spotted any issues with input latency."

^ from TPU reviewers.

I've played on both, and I can tell you there are quite a few games where the 4070 Ti outright smashes the 7900XTX in gaming experience, purely because of the settings DLSS makes viable. And in DLSS 2-only games, no frame gen, DLSS 2 Balanced still looks better than any FSR 2 Quality - so you're basically getting the same performance at better IQ.
Medium quality? What's next, testing at low quality and calling that representative of performance?
DLSS is an upscaler; it does not matter here.
 
Last edited:
Joined
Apr 6, 2021
Messages
1,131 (0.83/day)
Location
Bavaria ⌬ Germany
System Name ✨ Lenovo M700 [Tiny]
Cooling ⚠️ 78,08% N² ⌬ 20,95% O² ⌬ 0,93% Ar ⌬ 0,04% CO²
Audio Device(s) ◐◑ AKG K702 ⌬ FiiO E10K Olympus 2
Mouse ✌️ Corsair M65 RGB Elite [Black] ⌬ Endgame Gear MPC-890 Cordura
Keyboard ⌨ Turtle Beach Impact 500
I am so glad I watched the MSI Gaming livestream this week. They showed DLSS 3 with Frame Gen, and the person playing could not shoot anyone in an FPS and admitted the floaty feeling and lag that those "innovations" introduced into the game. If you like them, good for you. I spent my money on VRAM, as the 127 FPS that Hitman 3 shows is perfectly smooth. Then I have an X3D chip for the 1% lows, so I am golden.

I didn't even know that DLSS & FSR get the magic FPS boost by running the game at a lower resolution and then scaling it up, lol. :laugh: Which could explain why hitreg is messed up. Maybe not because of the lag, but because the game is running at a lower resolution with fewer pixels, which could result in less pixel accuracy?
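For anyone curious about the actual numbers, here is a minimal sketch assuming the commonly cited per-axis scale factors for the usual upscaler quality modes (roughly 0.67 Quality, 0.58 Balanced, 0.50 Performance); the real factors can vary by game and upscaler version:

```python
# Rough internal render resolutions for a 3840x2160 output, assuming commonly
# cited per-axis scale factors (these may differ per game and upscaler version).
SCALE = {"Quality": 0.67, "Balanced": 0.58, "Performance": 0.50}

def internal_res(out_w, out_h, factor):
    """Return (width, height, share of native pixels) for a given per-axis scale factor."""
    w, h = round(out_w * factor), round(out_h * factor)
    return w, h, (w * h) / (out_w * out_h)

for mode, f in SCALE.items():
    w, h, share = internal_res(3840, 2160, f)
    print(f"{mode:>11}: {w}x{h} (~{share:.0%} of native pixels)")
```

Rendering half or less of the native pixels is where the FPS boost comes from; whether that also costs hit accuracy is a separate question.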
 
Joined
Apr 9, 2013
Messages
309 (0.07/day)
Location
Chippenham, UK
System Name Hulk
Processor 7800X3D
Motherboard Asus ROG Strix X670E-F Gaming Wi-Fi
Cooling Custom water
Memory 32GB 3600 CL18
Video Card(s) 4090
Display(s) LG 42C2 + Gigabyte Aorus FI32U 32" 4k 120Hz IPS
Case Corsair 750D
Power Supply beQuiet Dark Power Pro 1200W
Mouse SteelSeries Rival 700
Keyboard Logitech G815 GL-Tactile
VR HMD Quest 2
I am so glad I watched the MSI Gaming livestream this week. They showed DLSS 3 with Frame Gen, and the person playing could not shoot anyone in an FPS and admitted the floaty feeling and lag that those "innovations" introduced into the game. If you like them, good for you. I spent my money on VRAM, as the 127 FPS that Hitman 3 shows is perfectly smooth. Then I have an X3D chip for the 1% lows, so I am golden.
Hey now, Frame Generation is great...as long as you have a solid 100+ fps before the Frame Generation is added...& therefore don't really need the extra frames anyway...
It's a fun tech for the top end, but having tried it a little myself, I will definitely lower settings instead. DLSS by itself is getting very close to good enough on motion artifacts, so it's kinda 50:50 whether I'll use it or lower settings, depending on the implementation in the specific game.
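As a rough rule of thumb (assuming frame generation roughly doubles the displayed rate, which is only an approximation), the card still has to render half of the target natively, and that native half is what your inputs actually respond to:

```python
# Rule of thumb: frame generation ~doubles displayed FPS (approximation only),
# so the natively rendered rate your inputs respond to is about half the target.
def base_fps_needed(target_displayed_fps, fg_multiplier=2.0):
    return target_displayed_fps / fg_multiplier

for target in (60, 90, 120, 165):
    base = base_fps_needed(target)
    print(f"{target} fps displayed -> ~{base:.0f} fps rendered "
          f"(~{1000 / base:.1f} ms between real frames)")
```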
 
Joined
Nov 13, 2007
Messages
10,845 (1.74/day)
Location
Austin Texas
System Name stress-less
Processor 9800X3D @ 5.42GHZ
Motherboard MSI PRO B650M-A Wifi
Cooling Thermalright Phantom Spirit EVO
Memory 64GB DDR5 6400 1:1 CL30-36-36-76 FCLK 2200
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse DeathadderV2 X Hyperspeed
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
Weird. 4070 Ti faster than a 7900XTX in Hogwarts?
This is the 7900XT, not the XTX. I have included RT as well.
Look at the 4K results with RT; very interesting.
View attachment 302941
View attachment 302940


Medium quality? What's next, testing at low quality and calling that representative of performance?
DLSS is an upscaler; it does not matter here.
That’s without any upscaling tech - enable dlss and everything changes; that was the point of that post.

Back to the thread — that’s why people who have dlss are pissed it’s not there.
 
Last edited:
Joined
May 31, 2016
Messages
4,440 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
That’s without any upscaling tech - enable dlss and everything changes; that was the point of that post.
Exactly, this is without an upscaler, and that is what this card is capable of. Since when do we measure the performance of a card with upscalers? It is pointless in my opinion to even mention it as a performance metric of a graphics card when an upscaler is on. The 7900XT can also use an upscaler, you know.
 
Joined
Nov 13, 2007
Messages
10,845 (1.74/day)
Location
Austin Texas
System Name stress-less
Processor 9800X3D @ 5.42GHZ
Motherboard MSI PRO B650M-A Wifi
Cooling Thermalright Phantom Spirit EVO
Memory 64GB DDR5 6400 1:1 CL30-36-36-76 FCLK 2200
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse DeathadderV2 X Hyperspeed
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
Exactly, this is without an upscaler, and that is what this card is capable of. Since when do we measure the performance of a card with upscalers? It is pointless in my opinion to even mention it as a performance metric of a graphics card when an upscaler is on. The 7900XT can also use an upscaler, you know.
Because it can materially enhance the experience. I don’t actually care what card I use or what settings are enabled — I care about the actual gaming experience and visual quality.

At the end of the day I will enable the settings that give the best game experience - and so do most gamers.

And when you do that, things change massively — it's only with everything disabled that benchmarks are run for reviews, but that's not really how people play.

Even Hardware Unboxed in that review said his first numbers were so massively in favor of Nvidia that he had to turn off DLSS, because it was bugged and left on — he had to turn off features to make it competitive.

I'm not saying that the 4070 Ti is a superior product - at all. I'm saying these features are so huge that they propel this product up two tiers when they're available, and that's why gamers who use them get their panties in a wad when they're not.

DLSS 2 + Reflex in shooters: a +30-40% fps boost; DLSS 3 + Reflex in other games (Diablo 4, Hogwarts Legacy etc.): +80% fps…
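To make those claimed uplifts concrete, a tiny sketch applying the percentages above (they are the ballpark figures quoted in this post, not measured data) to an arbitrary 60 fps baseline:

```python
# Apply the ballpark uplifts quoted above to an arbitrary 60 fps baseline.
# These percentages are the poster's rough figures, not benchmark results.
baseline_fps = 60
uplifts = {
    "DLSS 2 + Reflex (shooters)": 0.35,       # midpoint of the quoted 30-40%
    "DLSS 3 + Reflex (Diablo 4 etc.)": 0.80,  # the quoted ~80%
}
for name, uplift in uplifts.items():
    print(f"{name}: {baseline_fps} fps -> ~{baseline_fps * (1 + uplift):.0f} fps")
```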
 
Last edited:
D

Deleted member 185088

Guest
We live in a weird world where people defend a company that not only locked other GPUs out of its technology but, worse, even its own customers.
Rather than blaming AMD, blame nVidia for not allowing DLSS to work on all GPUs (at least recent ones). Intel and AMD have shown it is possible to have upscaling working without requiring specialised hardware; nVidia, à la Apple, wants to lock everyone to their hardware.
Very disappointed by the community.
 
Joined
May 31, 2016
Messages
4,440 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Because it can materially enhance the experience. I don’t actually care what card I use or what settings are enabled — I care about the actual gaming experience and visual quality.

At the end of the day I will enable the settings that give the best game experience - and so do most gamers.

And when you do that, things change massively — it's only with everything disabled that benchmarks are run for reviews, but that's not really how people play.

Even Hardware Unboxed in that review said his first numbers were so massively in favor of Nvidia that he had to turn off DLSS, because it was bugged and left on — he had to turn off features to make it competitive.
Tell me:
when you buy a graphics card, do you pay for the performance the card gives?
Or do you just buy any card and pay for the upscaling technique to get a playable frame rate? Take Hogwarts at 4K on a 4070 Ti with RT. Not even DLSS 3 can lift it off the ground.

You know what, I wish NV made the upscaler so good that you literally pay $1k for a 4050 and then just use the upscaler to get 60 or 100 fps in a game like Hogwarts. I really cheer for that and encourage NV to think about that approach. I'm sure the likes of you will admire NV's effort to make the upscaler so good that you can literally play on any graphics card newly released by NV.
 
Joined
Nov 13, 2007
Messages
10,845 (1.74/day)
Location
Austin Texas
System Name stress-less
Processor 9800X3D @ 5.42GHZ
Motherboard MSI PRO B650M-A Wifi
Cooling Thermalright Phantom Spirit EVO
Memory 64GB DDR5 6400 1:1 CL30-36-36-76 FCLK 2200
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse DeathadderV2 X Hyperspeed
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
Tell me:
when you buy a graphics card, do you pay for the performance the card gives?
Or do you just buy any card and pay for the upscaling technique to get a playable frame rate? Take Hogwarts at 4K on a 4070 Ti with RT. Not even DLSS 3 can lift it off the ground.

You know what, I wish NV made the upscaler so good that you literally pay $1k for a 4050 and then just use the upscaler to get 60 or 100 fps in a game like Hogwarts. I really cheer for that and encourage NV to think about that approach. I'm sure the likes of you will admire NV's effort to make the upscaler so good that you can literally play on any graphics card newly released by NV.
That’s not true - enable dlss 3 with rt - dlss 2 lifts it way off the ground - and dlss 3 almost doubles it.

When I buy a gfx card I buy to game - and I do take upscalers into account as do most gamers.
 
Joined
Apr 9, 2013
Messages
309 (0.07/day)
Location
Chippenham, UK
System Name Hulk
Processor 7800X3D
Motherboard Asus ROG Strix X670E-F Gaming Wi-Fi
Cooling Custom water
Memory 32GB 3600 CL18
Video Card(s) 4090
Display(s) LG 42C2 + Gigabyte Aorus FI32U 32" 4k 120Hz IPS
Case Corsair 750D
Power Supply beQuiet Dark Power Pro 1200W
Mouse SteelSeries Rival 700
Keyboard Logitech G815 GL-Tactile
VR HMD Quest 2
That’s not true - enable dlss 3 with rt - dlss 2 lifts it way off the ground - and dlss 3 almost doubles it.

When I buy a gfx card I buy to game - and I do take upscalers into account as do most gamers.
I do think you're right in that the majority of people buying computers to game on will care more about the performance with all the tricks enabled. I'm curious whether it "feels" ok to you when you've got e.g. 30fps doubled to 60fps? I really hated the fact it still felt like 30fps due to the responsiveness to inputs still being at the lower frame rate.
 
Joined
May 31, 2016
Messages
4,440 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
That’s not true - enable dlss 3 with rt - dlss 2 lifts it way off the ground - and dlss 3 almost doubles it.

When I buy a gfx card I buy to game - and I do take upscalers into account as do most gamers.
It is hard to imagine you buy a graphics card to plow a field. That's not the point.
You pay for the performance the GPU offers as it is, without upscalers. With that notion you would purchase a card like a 4050 for thousands of dollars, since the upscaler will do the trick to make it playable at high res. Need I remind you, it is still an upscaler and it has flaws.
 
Joined
Mar 10, 2010
Messages
11,878 (2.20/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
That’s not true - enable dlss 3 with rt - dlss 2 lifts it way off the ground - and dlss 3 almost doubles it.

When I buy a gfx card I buy to game - and I do take upscalers into account as do most gamers.
I don't think your opinion is as widespread as you think. Even if it were widespread within the enthusiast community, I think the majority cannot afford to and does not consider it at all; in reality they just want to play Roblox or CoD on whatever their parents got talked into buying at Currys.

I also think FSR 2.2 on Quality is not as bad as many make out IF you need to use it, and it is free.

To be honest I previously wouldn't use scaling, and I still would rather not, since none of them are without negative points - DLSS 3 being a total no-no in CoD, Apex, BattleBit, or any racing game, especially rallying.

But when I have used it on low-end hardware like a Steam Deck or a 2060 laptop, in Dead Space etc., it was always on really; the hate for FSR is overhyped. When I have used it, and that's rarely for either tech, it was fine in use.

AMD passed a no comment; clearly Nvidia got its shill army to kick up a fuss after AMD rejected Streamline, a way for Nvidia to leverage DLSS into every game menu - a proprietary, competitive tech they wanted AMD to assist in spreading.
I get the annoyance now, but for a company that backs open source (though they do so to push their own sales), I can't see any issue even with them asking for exclusivity - but they can't contract such a limit on features. A dev team can choose to use or incorporate any technology they want, and that shouldn't be forcefully limited by contract by a partner; asking would be fine, to me.

As is refusing that request.

The whole story is probably more nuanced.
 
Joined
Nov 13, 2007
Messages
10,845 (1.74/day)
Location
Austin Texas
System Name stress-less
Processor 9800X3D @ 5.42GHZ
Motherboard MSI PRO B650M-A Wifi
Cooling Thermalright Phantom Spirit EVO
Memory 64GB DDR5 6400 1:1 CL30-36-36-76 FCLK 2200
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse DeathadderV2 X Hyperspeed
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
It is hard to imagine you buy a graphics card to plow a field. That's not the point.
You pay for the performance the GPU offers as it is, without upscalers. With that notion you would purchase a card like a 4050 for thousands of dollars, since the upscaler will do the trick to make it playable at high res. Need I remind you, it is still an upscaler and it has flaws.
What?

You buy a graphics card to play games... some people buy it to debate hardware online - but those are usually contrarian hardware enthusiasts. There is a difference between gamers and hardware guys, which is why nvidia dominates in gamer mind share. If AMD comes out with a better FSR, or even if FSR improves substantially (or another upscaler becomes popular) and gives +30% to cards universally, you will see AMD's GPU market share spike.


I do think you're right in that the majority of people buying computers to game on will care more about the performance with all the tricks enabled. I'm curious whether it "feels" ok to you when you've got e.g. 30fps doubled to 60fps? I really hated the fact it still felt like 30fps due to the responsiveness to inputs still being at the lower frame rate.

30 FPS is a nightmare -- 60 FPS with FG is better but still feels awful. It's really good at like 50 -> 80-100 FPS and even better at 70 -> 140 FPS - anything below 50 FPS base is not great in general.
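A rough way to see why 30 doubled to 60 still feels like 30: a minimal latency sketch, assuming inputs are only sampled on natively rendered frames and that frame generation holds back roughly one extra real frame for interpolation (real pipelines vary, this is just the shape of it):

```python
# Very simplified input-latency model for frame generation:
# inputs are sampled on real frames only, and interpolation holds back ~1 real frame.
def approx_latency_ms(base_fps, frame_gen=True):
    frame_time = 1000.0 / base_fps
    # one real frame to sample + render, plus roughly one more held back for interpolation
    return frame_time * (2.0 if frame_gen else 1.0)

for base in (30, 50, 70):
    print(f"{base} fps base -> {base * 2} fps displayed, "
          f"~{approx_latency_ms(base):.0f} ms of frame-time latency")
```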
 
Last edited:
Joined
Jun 2, 2017
Messages
9,370 (3.39/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
That’s not true - enable dlss 3 with rt - dlss 2 lifts it way off the ground - and dlss 3 almost doubles it.
Frame Generation (Fujitsu line doubling) is not a good example when what you see is not the reality: the game gives you 150 FPS with all of those applied, and I already get 120 FPS. Do you really believe that a Freesync monitor giving you butter-smooth frames from 45-144 Hz, driven by a 20 GB GPU that maintains at least a 2600 MHz GPU clock while gaming, would feel worse than a card that runs at 57 FPS native and then uses other technology to render the exact same frame?

Though not directly related, FPS and Hz are alike in that your monitor will directly affect your enjoyment of a product. When everyone had 60 Hz monitors it didn't matter. Now we have panels that go as high as 400 Hz+, but a lot of us have 120+ Hz monitors at various resolutions, and that maps directly to GPU tier. If you have a 6600/3060/4060, a 1080P 120 Hz Freesync (or Gsync) panel will be great. If you have a 3070/3080/4070/6700-6750XT/6800/6800XT, 1440P 144 Hz+ panels are excellent. Then you have the highest tier with the 6900-6950XT, 7900XT, XTX, 4080, 3090, 3090 Ti and 4090, which are for 4K 144 Hz+ panels, though from the 3070 up you can get away with 4K 60 Hz. As long as you stick to that principle you will enjoy gaming like never before. The thing is, that was what was needed in the space. About 10+ years ago you could buy 1440P monitors from Korea (Qnix) that had no scaler and were basically raw panels, but with GPU scaling no one who owned one of those complained. Just do a Google search on forums circa 2012-2013 for Korean 1440P panels for $300 from Newegg. That meant 60 Hz had been exposed. But I was not convinced until I was playing The Division on my 4K 60 Hz panel and struggled with the machine gun. I bought a 1440P 144 Hz panel (Qnix) next and was blown away that I could use the machine gun with a scope to make headshots. That is tangible, and that is why I bought a FV43U after my 1440P 165 Hz 32QC; I bought that when I heard about the announcement of the 7000 GPUs. The promise of the increase in VRAM and clock speed vs the 6800XT I was running meant that I no longer had to run some games at 1440P on that panel, and 4K Mini LED is better than OLED to me for one reason: power draw. I love James Webb images, and you should see those on this panel I am typing on with the contrast and colour turned up. That also means that gaming is F'ing sweet. Diablo 4, Grid Legends, Spiderman, Guardians, Witcher 3, Everspace 2, Total War anything, Fear, Hellblade, Hitman and the rest of the library of games I own all look spectacular and run butter smooth. I am not even sure if any of those support DLSS, but DLSS is not in even 1% of all games - and neither is FSR.

One of the examples of what I am talking about is how cheated some Nvidia owners feel because Starfield won't support DLSS. It's like one game in a space that has Homeworld 3, Armored Core 6, Avatar (whatever it's called), HFB whenever it launches on PC, and games like Aliens: Fireteam Elite that are plenty of fun. We are missing the mark anyway, as playing a game like Titanfall 2 with modern hardware is blissful, and I doubt anyone really knows the number of games that have been made for PC, because new ones come out every week.

I will go even further down the rabbit hole you opened. Get a 5600 / 10400-13400 and a 6650XT / 3060 12GB / maybe a 4060, buy yourself a 1080P high-refresh panel, do a 1-year subscription to Humble Choice, and actually play the games, and you will laugh at people making these Hot Wheels vs Matchbox, Tyco vs AFX, f me, Green Machine vs Big Wheel (that is actually a proper analogy) arguments. If you do that you will enjoy gaming for what it is: exploring the minds of the human experience by projecting the mind's eye onto the screen. Games you have never heard of or thought about. Genres that you wouldn't typically play. There is a platformer called Raji that is one of the most beautiful games I have ever seen; think Prince of Persia with a South Asian theme. If you have read the Mahabharata (I think) it will be relevant.

In fact I am going to recommend this to all the people who feel cheated by Starfield: take that money and get 4 months of Humble Choice. You will be impressed and maybe enjoy gaming enough to not be so uptight about software variables.
 
Joined
Sep 17, 2014
Messages
22,666 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
It is a Bethesda game so giving it a few [s]weeks[/s] years to iron things out might make you live longer.
FTFY. I'm in anticipation of the Starfield GOTY, the Legendary Edition, the Remastered GOTY Legendary with Horse Armor Edition... oh yes, I'll get them all!

We live in a weird world where people defend a company that not only locked other GPUs out of its technology but, worse, even its own customers.
Rather than blaming AMD, blame nVidia for not allowing DLSS to work on all GPUs (at least recent ones). Intel and AMD have shown it is possible to have upscaling working without requiring specialised hardware; nVidia, à la Apple, wants to lock everyone to their hardware.
Very disappointed by the community.
Yep, it's a bunch of fools blindly believing marketing. We have a world full of them, unfortunately.

Nobody is denying the advantage the tech brings, but we should not accept the conditions under which we get it. We are ALL best served by a unified technology push. We can do it for various AA methods, so why not here? Why are devs constantly bothered with up to THREE implementations to support? Why is the tech not available on every GPU, with varying performance impact? Etc.

I mean, look at Freesync. It's been the best thing that happened to us in the past decade wrt gaming, while Gsync is near dead despite Nvidia leading the charge. DLSS is the Gsync thing all over again. It won't last. It will be abused to the max.

Frame Generation (Fujitsu line doubling) is not a good example when what you see is not the reality: the game gives you 150 FPS with all of those applied, and I already get 120 FPS. Do you really believe that a Freesync monitor giving you butter-smooth frames from 45-144 Hz, driven by a 20 GB GPU that maintains at least a 2600 MHz GPU clock while gaming, would feel worse than a card that runs at 57 FPS native and then uses other technology to render the exact same frame?

Though not directly related, FPS and Hz are alike in that your monitor will directly affect your enjoyment of a product. When everyone had 60 Hz monitors it didn't matter. Now we have panels that go as high as 400 Hz+, but a lot of us have 120+ Hz monitors at various resolutions, and that maps directly to GPU tier. If you have a 6600/3060/4060, a 1080P 120 Hz Freesync (or Gsync) panel will be great. If you have a 3070/3080/4070/6700-6750XT/6800/6800XT, 1440P 144 Hz+ panels are excellent. Then you have the highest tier with the 6900-6950XT, 7900XT, XTX, 4080, 3090, 3090 Ti and 4090, which are for 4K 144 Hz+ panels, though from the 3070 up you can get away with 4K 60 Hz. As long as you stick to that principle you will enjoy gaming like never before. The thing is, that was what was needed in the space. About 10+ years ago you could buy 1440P monitors from Korea (Qnix) that had no scaler and were basically raw panels, but with GPU scaling no one who owned one of those complained. Just do a Google search on forums circa 2012-2013 for Korean 1440P panels for $300 from Newegg. That meant 60 Hz had been exposed. But I was not convinced until I was playing The Division on my 4K 60 Hz panel and struggled with the machine gun. I bought a 1440P 144 Hz panel (Qnix) next and was blown away that I could use the machine gun with a scope to make headshots. That is tangible, and that is why I bought a FV43U after my 1440P 165 Hz 32QC; I bought that when I heard about the announcement of the 7000 GPUs. The promise of the increase in VRAM and clock speed vs the 6800XT I was running meant that I no longer had to run some games at 1440P on that panel, and 4K Mini LED is better than OLED to me for one reason: power draw. I love James Webb images, and you should see those on this panel I am typing on with the contrast and colour turned up. That also means that gaming is F'ing sweet. Diablo 4, Grid Legends, Spiderman, Guardians, Witcher 3, Everspace 2, Total War anything, Fear, Hellblade, Hitman and the rest of the library of games I own all look spectacular and run butter smooth. I am not even sure if any of those support DLSS, but DLSS is not in even 1% of all games - and neither is FSR.

One of the examples of what I am talking about is how cheated some Nvidia owners feel because Starfield won't support DLSS. It's like one game in a space that has Homeworld 3, Armored Core 6, Avatar (whatever it's called), HFB whenever it launches on PC, and games like Aliens: Fireteam Elite that are plenty of fun. We are missing the mark anyway, as playing a game like Titanfall 2 with modern hardware is blissful, and I doubt anyone really knows the number of games that have been made for PC, because new ones come out every week.

I will go even further down the rabbit hole you opened. Get a 5600 / 10400-13400 and a 6650XT / 3060 12GB / maybe a 4060, buy yourself a 1080P high-refresh panel, do a 1-year subscription to Humble Choice, and actually play the games, and you will laugh at people making these Hot Wheels vs Matchbox, Tyco vs AFX, f me, Green Machine vs Big Wheel (that is actually a proper analogy) arguments. If you do that you will enjoy gaming for what it is: exploring the minds of the human experience by projecting the mind's eye onto the screen. Games you have never heard of or thought about. Genres that you wouldn't typically play. There is a platformer called Raji that is one of the most beautiful games I have ever seen; think Prince of Persia with a South Asian theme. If you have read the Mahabharata (I think) it will be relevant.

In fact I am going to recommend this to all the people who feel cheated by Starfield: take that money and get 4 months of Humble Choice. You will be impressed and maybe enjoy gaming enough to not be so uptight about software variables.
Well spoken; graphics are just presentation, but some people seem to think they're the game.

That’s without any upscaling tech - enable dlss and everything changes; that was the point of that post.

Back to the thread — that’s why people who have dlss are pissed it’s not there.
That's indeed the point. It goes to show you can also just invest the money in a 7900XT and have the frames without tying yourself to DLSS ;) You even have the VRAM then to push (nearly?) the full bells & whistle factory on 4K in Hogwarts, where a 4070ti goes to 11 minimum FPS. Not that 22 is a fantastic number, but that is 2x the frames and it shows why 12GB kills the 4070ti - the GPU core power of each one is much closer than that gap. The real question is why RTX On Ultra @ 4K kills the 4070ti so hard, given the tier it is in and the price it has got. Not how fantastic Nvidia is capable of masking it with an upscale rendering just 1/8th of a frame's actual load.

Thát is precisely the rationale we need, and the one I had looking at the vendor lock-in this technology presents. Underneath the DLSS guise is a very weak piece of silicon. These GPU vendors can ALL push whatever goddamn silicon they want, in all fairness, but ONLY if and when they work together to get a full industry going on the technology that crutches those pieces of silicon. Until then though? We are delusional buying into it; there are way too many moving parts in gaming land to maintain all that content for a solid period of time.
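On the "1/8th of a frame's actual load" figure, here is a quick sketch of where a number like that can come from, assuming Performance-mode upscaling (0.5 per-axis scale, so a quarter of the pixels) combined with frame generation (only every other displayed frame rendered natively); real GPU cost is not purely pixel-count, so treat it as an illustration:

```python
# Illustration of the "~1/8th of the native load per displayed frame" figure.
# Assumes Performance-mode upscaling plus frame generation; real GPU cost is
# not strictly proportional to pixel count, so this is only the arithmetic.
per_axis_scale = 0.5                    # Performance-mode render scale (assumed)
pixel_share = per_axis_scale ** 2       # 1/4 of native pixels per rendered frame
real_frame_share = 0.5                  # with frame gen, half the displayed frames are real
load_per_displayed_frame = pixel_share * real_frame_share
print(f"~{load_per_displayed_frame:.3f} of the native shading work per displayed frame")  # ~0.125
```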
 
Last edited:
Joined
Nov 13, 2007
Messages
10,845 (1.74/day)
Location
Austin Texas
System Name stress-less
Processor 9800X3D @ 5.42GHZ
Motherboard MSI PRO B650M-A Wifi
Cooling Thermalright Phantom Spirit EVO
Memory 64GB DDR5 6400 1:1 CL30-36-36-76 FCLK 2200
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse DeathadderV2 X Hyperspeed
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
That's indeed the point. It goes to show you can also just invest the money in a 7900XT and have the frames without tying yourself to DLSS ;) You even have the VRAM then to push (nearly?) the full bells & whistle factory on 4K in Hogwarts, where a 4070ti goes to 11 minimum FPS. Not that 22 is a fantastic number, but that is 2x the frames and it shows why 12GB kills the 4070ti - the GPU core power of each one is much closer than that gap. The real question is why RTX On Ultra @ 4K kills the 4070ti so hard, given the tier it is in and the price it has got. Not how fantastic Nvidia is capable of masking it with an upscale rendering just 1/8th of a frame's actual load.

Thát is precisely the rationale we need, and the one I had looking at the vendor lock-in this technology presents. Underneath the DLSS guise is a very weak piece of silicon. These GPU vendors can ALL push whatever goddamn silicon they want, in all fairness, but ONLY if and when they work together to get a full industry going on the technology that crutches those pieces of silicon. Until then though? We are delusional buying into it.

Correct -- that's why I used the 4070 / 4070 Ti as an example -- they are objectively weak, but can be made extremely strong using DLSS... My point wasn't about the product; my point was about the impact of not releasing with DLSS on day 1 for Starfield and how much it can hurt in these newer, more demanding games. 4070 users, instead of getting a 60 FPS minimum, will go to 11. Hence the salt.
 
Joined
Sep 17, 2014
Messages
22,666 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Correct -- that's why I used the 4070 / 4070 Ti as an example -- they are objectively weak, but can be made extremely strong using DLSS... My point wasn't about the product; my point was about the impact of not releasing with DLSS on day 1 for Starfield and how much it can hurt in these newer, more demanding games. 4070 users, instead of getting a 60 FPS minimum, will go to 11. Hence the salt.
Maybe it's a good byproduct of evolutionary processes, where people hopefully actually get wiser one day before buying into a POS.
 
Joined
Apr 14, 2022
Messages
758 (0.77/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600Mhz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Acer Nitro XV271UM3B IPS 180Hz
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Keyboard Asus ROG Falchion
Software Windows 11 64bit
Correct -- that's why I used the 4070 / 4070 Ti as an example -- they are objectively weak, but can be made extremely strong using DLSS... My point wasn't about the product; my point was about the impact of not releasing with DLSS on day 1 for Starfield and how much it can hurt in these newer, more demanding games. 4070 users, instead of getting a 60 FPS minimum, will go to 11. Hence the salt.

Although I agree with your posts above....

If I want to game no matter what, I would purchase a 6800XT, 7900XT or 7900XTX. The brute force is so big that you don't rely on anything, meaning upscaling tech etc. The only thing I would sacrifice is RT, which objectively doesn't work well on AMD cards.
On the other hand, the nVidia cards, with all the bells and whistles on, give an amazing result, but you practically rely on them.
E.g. the 3080 is an amazing GPU but cannot run CP77 Overdrive due to the lack of DLSS 3. The same GPU, or a 4070 Ti, will crash if games in 1-2-3 years require 12-16GB of VRAM and don't support DLSS.

I like nVidia's tech and I'm one of the lucky ones who can afford to buy a high-end GPU. But in any case I wouldn't risk not gaming at all because of a lack of DLSS.

**I don't have to mention that FSR 2+ is fine. And even if it's not at DLSS level, when you want to play a game and don't have the power for it, OK... I wouldn't mind if the cable lines 500 meters away look broken.
 
Joined
May 13, 2015
Messages
632 (0.18/day)
Processor AMD Ryzen 3800X / AMD 8350
Motherboard ASRock X570 Phantom Gaming X / Gigabyte 990FXA-UD5 Revision 3.0
Cooling Stock / Corsair H100
Memory 32GB / 24GB
Video Card(s) Sapphire RX 6800 / AMD Radeon 290X (Toggling until 6950XT)
Storage C:\ 1TB SSD, D:\ RAID-1 1TB SSD, 2x4TB-RAID-1
Display(s) Samsung U32E850R
Case be quiet! Dark Base Pro 900 Black rev. 2 / Fractal Design
Audio Device(s) Creative Sound Blaster X-Fi
Power Supply EVGA Supernova 1300G2 / EVGA Supernova 850G+
Mouse Logitech M-U0007
Keyboard Logitech G110 / Logitech G110
Why?????

Why would they do this when 70% of PC players are on NVIDIA and Intel systems? You are cutting more than half the market share from your player base by telling them that unless they buy AMD, their game will run badly. I am starting to get very tired of these exclusive bullshit releases.

Edit: This also means no DLSS or RTX for Starfield. One of the aspects of this game I was most excited for was RTX and the visuals. But since I have a card made by a company with green letters, I am not allowed to experience the game at "maximum overdrive super ultra mega resolution 9000 plus + Max" graphics.
Because I wouldn't give two damns about optimizing for corporations that are anti-competitive and anti-consumer.

I also wouldn't care about people dumb enough to stop a high FPS game, take a screenshot and waste time posting about it saying FSR is "inferior".

I also wouldn't care about people dumb enough to protect Nvidia when their DLSS only works on Nvidia cards.

I also wouldn't care about people who don't comprehend that RTX isn't the sole way ray-tracing is implemented.

I also wouldn't care about people willing to pay 60% more for a 10% increase in FPS when the FPS is already 150FPS+.

I also wouldn't care about people not intelligent enough to buy video cards with enough VRAM so they would last longer than two years.

I also wouldn't care about anyone dumb enough to buy an Intel/Nvidia system with a 12GB 3080 at an MSRP of $1,200 at launch!
 
Joined
Dec 25, 2020
Messages
7,011 (4.81/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Because I wouldn't give two damns about optimizing for corporations that are anti-competitive and anti-consumer.

After what AMD did to us X370 owners... I'll lump 'em in the anti-competitive, anti-consumer and highly opportunistic baskets all at the same time. But anyway, I digress.

I was sent this on Discord, and I presume it's what the whole fuss is about:

View attachment 303021
 
Joined
Mar 10, 2010
Messages
11,878 (2.20/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
After what AMD did to us X370 owners... I'll lump 'em in the anti-competitive, anti-consumer and highly opportunistic baskets all at the same time. But anyway, I digress.

I was sent this on Discord, and I presume it's what the whole fuss is about:

View attachment 303021
Oh, two more tangents.
 
Joined
Jun 2, 2017
Messages
9,370 (3.39/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
After what AMD did to us X370 owners... I'll lump 'em in the anti-competitive, anti-consumer and highly opportunistic baskets all at the same time. But anyway, I digress.

I was sent this on Discord, and I presume it's what the whole fuss is about:

View attachment 303021
What are you trying to establish? All I see is the advancement of adoption for both technologies. Indeed, X370 owners were shafted because of what? PCIe 3.0 support? I struggle to see how a platform that was released in 2017 and is still viable is anti-consumer.
 
Joined
Sep 27, 2008
Messages
1,210 (0.20/day)
What are you trying to establish? All I see is the advancement of adoption for both technologies. Indeed, X370 owners were shafted because of what? PCIe 3.0 support? I struggle to see how a platform that was released in 2017 and is still viable is anti-consumer.

The issue was that AMD was wishy-washy about Zen 3 support for 300/400-series boards until the community complained enough and they eventually capitulated. AMD threw up excuses like "the BIOS is too large" and dragged their feet for months.
 