• Welcome to TechPowerUp Forums, Guest! Please check out our forum guidelines for info related to our community.

AMD Radeon RX 9070 XT Benchmarked in 3D Mark Time Spy Extreme and Speed Way

Joined
Jan 14, 2019
Messages
13,278 (6.07/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
Did you notice that the 3DMark test was done with a 285K?
If that makes any difference...
In Speed Way? No, it makes absolutely no difference.
 
Joined
Sep 20, 2024
Messages
13 (0.12/day)
Location
Italy
System Name Biscuit maker
Processor AMD Ryzen 5 7600 PBO
Motherboard Asus TUF B650 Plus
Cooling Thermalright Frozen 360
Memory Silicon Power 2x16 DDR5 6000 CL30
Video Card(s) AMD Radeon RX 6800 OC
Storage 2TB P5 Plus
Display(s) ROG Strix XG27UCS 4K 160HZ
Case Phanteks XT PRO ULTRA
Power Supply PSU Asus TUF 850G ATX 3.0
Software W11 PRO
Exactly. 3DMark is a mess for this stupid reason, and that's why I asked; honestly, the numbers are totally meaningless without stock results to compare to.

Does any website test GPUs at stock settings with 3DMark and publish scores of GPUs that are representative of what people actually own?
On the 3DMark website, in the search section, you can enter the stock GPU and memory clock values of the 7900 XT or XTX and then compare results without overclocking.

5900X + 7900XTX (TBP 366+10%=402W, GPU clock 2620~2670MHz, VRAM 2600MHz)

View attachment 379325
What brand is your GPU? Sapphire?
 
Joined
Aug 3, 2006
Messages
167 (0.02/day)
Location
Austin, TX
Processor Ryzen 6900HX
Memory 32 GB DDR4LP
Video Card(s) Radeon 6800m
Display(s) LG C3 42''
Software Windows 11 home premium
Something is very off about all of this if you think about it.

  • The die size of the 9070 XT is bigger than the 4080's, yet it has 2k fewer shaders than the 7900 XT despite being on a smaller process node.
  • AMD has explicitly stated in public that the “performance figures” are all wrong.
  • Rumors say they have deliberately sent out gimped drivers to throw off leaks.
Something else is under the hood of the 9070 that AMD is not sharing with us, something that is taking up valuable real estate. It could be a fruit of their purchase of Xilinx, or some new type of programmable unit.

If the 9700 Pro was "Isildur" slicing the fingers off of Nvidia's hand, the 9070 XT might be the return of the King.
 
Joined
Apr 18, 2019
Messages
2,427 (1.16/day)
Location
Olympia, WA
System Name Sleepy Painter
Processor AMD Ryzen 5 3600
Motherboard Asus TuF Gaming X570-PLUS/WIFI
Cooling FSP Windale 6 - Passive
Memory 2x16GB F4-3600C16-16GVKC @ 16-19-21-36-58-1T
Video Card(s) MSI RX580 8GB
Storage 2x Samsung PM963 960GB nVME RAID0, Crucial BX500 1TB SATA, WD Blue 3D 2TB SATA
Display(s) Microboard 32" Curved 1080P 144hz VA w/ Freesync
Case NZXT Gamma Classic Black
Audio Device(s) Asus Xonar D1
Power Supply Rosewill 1KW on 240V@60hz
Mouse Logitech MX518 Legend
Keyboard Red Dragon K552
Software Windows 10 Enterprise 2019 LTSC 1809 17763.1757
Something is very off about all of this if you think about it.

  • The die size of the 9070 XT is bigger than the 4080's, yet it has 2k fewer shaders than the 7900 XT despite being on a smaller process node.
  • AMD has explicitly stated in public that the “performance figures” are all wrong.
  • Rumors say they have deliberately sent out gimped drivers to throw off leaks.
Something else is under the hood of the 9070 that AMD is not sharing with us, something that is taking up valuable real estate. It could be a fruit of their purchase of Xilinx, or some new type of programmable unit.
Perhaps. I've wondered where the die real estate went, myself.
I do hope AMD is letting disappointing leaks out and that the final product is much more impressive, but... even AMD themselves are positioning the RX 9070 (XT) as a replacement for the 7900 GRE to XT 'range'.
(attached: AMD's CES positioning slide)
That slide alone is what had me pull the trigger on a $900+ 24 GB XTX right after CES. AMD has no replacement card at that tier.

Funny enough, AMD seems to be ending the separation of RDNA 'Graphics' and CDNA 'Compute' the exact same way it started:
The RX 7900 XTX becomes the 'Radeon VII of Today'
and the
RX 9070 (XT) becomes the 'RX 5700 (XT) of Today'

If the 9700 Pro was "Isildur" slicing the fingers off of Nvidia's hand, the 9070 XT might be the return of the King.
Extraordinarily wishful thinking.
The situation has changed considerably since then. Nvidia is a monster of a company today, resources and all.
 
Joined
Jan 14, 2019
Messages
13,278 (6.07/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
Something is very off about all of this if you think about it.

  • The die size of the 9070 XT is bigger than the 4080's, yet it has 2k fewer shaders than the 7900 XT despite being on a smaller process node.
Perhaps by "improved RT" they meant they're giving us more RT cores? Or maybe the AI that does FSR 4 is taking up space? It could explain why some models need so much power. Personally, as long as it's a fine card for a decent price, I don't care.

  • AMD has explicitly stated in public that the “performance figures” are all wrong.
Where?

  • Rumors say they have deliberately sent out gimped drivers to throw off leaks.
Why would they have done that?

Something else is under the hood of the 9070 that AMD is not sharing with us, something that is taking up valuable real estate. It could be a fruit of their purchase of Xilinx, or some new type of programmable unit.

If the 9700 Pro was "Isildur" slicing the fingers off of Nvidia's hand, the 9070 XT might be the return of the King.
If the price is right, it very well may be.
 
Joined
Sep 3, 2019
Messages
3,663 (1.87/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 200W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3600MT/s 1.38V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (382W current) PowerLimit, 1060mV, Adrenalin v24.12.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2605), upgraded from Win10 to Win11 on Jan 2024
Joined
Oct 30, 2020
Messages
291 (0.19/day)
Location
Toronto
System Name GraniteXT
Processor Ryzen 9950X
Motherboard ASRock B650M-HDV
Cooling 2x360mm custom loop
Memory 2x24GB Team Xtreem DDR5-8000 [M die]
Video Card(s) RTX 3090 FE underwater
Storage Intel P5800X 800GB + Samsung 980 Pro 2TB
Display(s) MSI 342C 34" OLED
Case O11D Evo RGB
Audio Device(s) DCA Aeon 2 w/ SMSL M200/SP200
Power Supply Superflower Leadex VII XG 1300W
Mouse Razer Basilisk V3
Keyboard Steelseries Apex Pro V2 TKL
Honestly, I couldn't give a rat's tit about RT performance, but if raster comes in around the same as a 7900 XT, it would be pretty decent. I'm more interested in figuring out what they've done die-wise, because it looks strangely similar to two 9060 XTs side by side. Whether they're up to some sort of modular architecture I'm not sure, but I want that die annotation.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,328 (1.29/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Have you checked Cyberpunk, Indiana Jones or Alan Wake II with full RT? It's a huge hit to performance, yes, but it's beautiful.
I wouldn't expect the usual suspects here to admit that even if they had seen it, tbh. It'd be easy enough to showcase some gorgeous differences and cherry-pick those screenshots or video segments (like what was done to show how little difference it can make, which I don't deny depending on the game or scene), but I certainly wouldn't expect to convince those so vocally against it anyway; they've already made up their minds and appear to enjoy patting themselves on the back for it.

-------------------------------------------------------------------------------------------------------------

Personally, this is shaping up to be ever more appetising to me as my upgrade path. If it really is:
  • A raster match~ish for a 7900XTX or 4080/S
  • RT performance that is generationally ahead of Ampere
  • FSR4 (or what they directly called a research project) is as good as what we saw in their booth for all, or at least most, games (not just fine-tuned for one title), and is easily adopted widely or able to be substituted in place of 3.1, as has been rumoured
  • Some AIB cards have 2x HDMI 2.1
  • And of course, priced to party...
Well then I'm going to have a hard time justifying to myself paying a bare minimum of $1519 AUD for a 5070Ti or better.

Bring on the release and reviews.
 
Joined
Jun 18, 2015
Messages
355 (0.10/day)
Location
Perth , West Australia
System Name schweinestalle
Processor AMD Ryzen 7 3700 X
Motherboard Asus Prime - Pro X 570 + Asus PCI -E AC68 Dual Band Wi-Fi Adapter
Cooling Standard Air
Memory Kingston HyperX 2 x 16 gb DDR 4 3200mhz
Video Card(s) AMD Radeon RX 7800 XT 16GB Pulse
Storage Crucial 1TB M.2 SSD
Display(s) Asus XG 32 V ROG
Case Corsair AIR ATX
Audio Device(s) Realtech standard
Power Supply Corsair 850 Modular
Mouse CM Havoc
Keyboard Corsair Cherry Mechanical
Software Win 10
Benchmark Scores Soon !
I'm talking about Cyberpunk specifically. I've poured 600+ hours into that game -- the RT is one of the best implementations I've seen, and it still looks like crap (IMO).

View attachment 379240


It's grainier, blurrier:

View attachment 379241

Pick the RT shot -- it's the one on the left.
Maybe my eyes are bad, but it's hardly any different, IMO.
 
Joined
Nov 13, 2007
Messages
10,895 (1.74/day)
Location
Austin Texas
System Name stress-less
Processor 9800X3D @ 5.42GHZ
Motherboard MSI PRO B650M-A Wifi
Cooling Thermalright Phantom Spirit EVO
Memory 64GB DDR5 6400 1:1 CL30-36-36-76 FCLK 2200
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse DeathadderV2 X Hyperspeed
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
Maybe my eyes are bad, but it's hardly any different, IMO.
Right, it's almost the same, with RT being slightly blurrier... for -60% FPS.

And it's not like I had the lowest RT setting turned on; this was a 4090 with everything cranked. Some scenes look cool, but then you turn off RT and realize they're just as cool with it off, except now you also get 150 FPS.
 

Outback Bronze

Super Moderator
Staff member
Joined
Aug 3, 2011
Messages
2,077 (0.42/day)
Location
Walkabout Creek
System Name Raptor Baked
Processor 14900k w.c.
Motherboard Z790 Hero
Cooling w.c.
Memory 48GB G.Skill 7200
Video Card(s) Zotac 4080 w.c.
Storage 2TB Kingston kc3k
Display(s) Samsung 34" G8
Case Corsair 460X
Audio Device(s) Onboard
Power Supply PCIe5 850w
Mouse Asus
Keyboard Corsair
Software Win 11
Benchmark Scores Cool n Quiet.
I'm talking about Cyberpunk specifically. I've poured 600+ hours into that game -- the RT is one of the best implementations I've seen, and it still looks like crap (IMO).

View attachment 379240


It's grainier, blurrier:

View attachment 379241

Pick the RT shot -- it's the one on the left.

RT reminds me of what Nvidia did several moons ago with HDR and Pixel Shader 3.0. If my memory serves me correctly, it was the 6xxx series that brought HDR, which was their sales pitch.

ATi at the time, with their X800 series, was only running Pixel Shader 2.0. I'm going to leave an example here of the difference between HDR on and off from back in the old days with The Elder Scrolls IV: Oblivion.

With HDR:
(screenshot attached)

Without HDR:
(screenshot attached)

I used to think HDR was the shizz nizz back in the day, but as I've been playing over the years, I've noticed that the HDR colours were very bright and that bloom was a more accurate representation of real-life colours.

You could also run 8x AA with bloom but not with HDR, which I have done in the example screenshots shown above.

Which one do you think is better?

Great screenshots of your RT implementation, btw. Nice work!
 
Joined
Jan 14, 2019
Messages
13,278 (6.07/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
Right, it's almost the same, with RT being slightly blurrier... for -60% FPS.

And it's not like I had the lowest RT setting turned on; this was a 4090 with everything cranked. Some scenes look cool, but then you turn off RT and realize they're just as cool with it off, except now you also get 150 FPS.
Exactly. No one denies that RT is nice. The problem is the performance cost even on Nvidia, and the fact that it isn't really a night and day difference, just a little icing on the cake. If you turn it on, you see it's nice. But then you turn it off and still enjoy your game just the same. After 5-10 minutes, you don't even care.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,328 (1.29/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Maybe my eyes are bad, but it's hardly any different, IMO.
Someone posts screenshots chosen to demonstrate no/little difference;

Wow, there's hardly any difference! :rolleyes:
 
Joined
Nov 13, 2007
Messages
10,895 (1.74/day)
Location
Austin Texas
System Name stress-less
Processor 9800X3D @ 5.42GHZ
Motherboard MSI PRO B650M-A Wifi
Cooling Thermalright Phantom Spirit EVO
Memory 64GB DDR5 6400 1:1 CL30-36-36-76 FCLK 2200
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse DeathadderV2 X Hyperspeed
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
RT reminds me of what Nvidia did several moons ago with HDR and Pixel Shader 3.0. If my memory serves me correctly, it was the 6xxx series that brought HDR, which was their sales pitch.

ATi at the time, with their X800 series, was only running Pixel Shader 2.0. I'm going to leave an example here of the difference between HDR on and off from back in the old days with The Elder Scrolls IV: Oblivion.

With HDR:
View attachment 379337

Without HDR:
View attachment 379338

I used to think HDR was the shizz nizz back in the day, but as I've been playing over the years, I've noticed that the HDR colours were very bright and that bloom was a more accurate representation of real-life colours.

You could also run 8x AA with bloom but not with HDR, which I have done in the example screenshots shown above.

Which one do you think is better?

Great screenshots of your RT implementation, btw. Nice work!
That game was amazing -- I am partial to the HDR oversaturated mushroom-trip version, especially with the expansions. Bethesda at their peak.

The 8x AA looks 'better', but the crisp no-AA look with the oversaturated colors kind of has that Oblivion mood. When I got my mitts on the 8800 GT, you could do HDR with CSAA and a 60 FPS vsync lock, which back then was like the pinnacle of gaming graphics for me.
 

Outback Bronze

Super Moderator
Staff member
Joined
Aug 3, 2011
Messages
2,077 (0.42/day)
Location
Walkabout Creek
System Name Raptor Baked
Processor 14900k w.c.
Motherboard Z790 Hero
Cooling w.c.
Memory 48GB G.Skill 7200
Video Card(s) Zotac 4080 w.c.
Storage 2TB Kingston kc3k
Display(s) Samsung 34" G8
Case Corsair 460X
Audio Device(s) Onboard
Power Supply PCIe5 850w
Mouse Asus
Keyboard Corsair
Software Win 11
Benchmark Scores Cool n Quiet.
When I got my mitts on the 8800GT

I was running an X800 XT PE at the time of Oblivion, so only Pixel Shader 2.0 when I first started playing it. Then I got a 7800 GT, which let me run Pixel Shader 3.0. Yes, they were great graphics for that era, but it wasn't until the 8800 GTS 640 MB, when I was running Crysis in DX10, that I thought I'd hit the pinnacle of graphics, and for some time, I might add.

I haven't read the whole thread guys, but the 9070 XT doesn't look too bad if they price it competitively.

Anybody got any idea of these cards' pricing atm?
 
Joined
Jan 14, 2019
Messages
13,278 (6.07/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
That game was amazing -- I am partial to the HDR oversaturated mushroom-trip version, especially with the expansions. Bethesda at their peak.

The 8x AA looks 'better', but the crisp no-AA look with the oversaturated colors kind of has that Oblivion mood. When I got my mitts on the 8800 GT, you could do HDR with CSAA and a 60 FPS vsync lock, which back then was like the pinnacle of gaming graphics for me.
I agree. That game made me swap my amazing ATi X800 XT for an overheating, loud mess of a card known as the 7800 GS AGP, just to be able to play it with HDR. Good old times! :)

And we have people here saying that I don't care about features. Of course I do, when they're good. The problem with features these days is that they either make your game run like a slideshow (RT) or make it a blurry mess (upscaling), not to mention that manufacturers use them as excuses to charge more for cards that have no business being in the price range they're in, which I find disgusting.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,328 (1.29/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
With HDR:
Honestly, I love this one, with the colour saturation, sky highlight, and seemingly deeper contrast. The 6800 Ultra was the first top GPU I ever bought, and it was great to taste those visuals.

Pity for me, much like CP2077, it's just not quite my kind of game from an actual gameplay perspective.

For RT, clearly I'm an enjoyer, but that doesn't mean I vouch for universally turning it on in every game, every situation, and so on. But boy, there have been times it has absolutely added to the visual immersion and, to an extent, blown me away.

AMD's talk and posturing would seem to suggest they are focusing on it too, seeing its merit in attracting customers and in a more rounded, capable product. They already have a fairly dedicated crowd of people who are all about their cards; their bigger issue is getting the ones who don't currently use them, and perhaps haven't for a few generations, to come back or jump on.
 
Joined
Dec 25, 2020
Messages
7,229 (4.89/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
RT reminds me of what Nvidia did several moons ago with HDR and Pixel Shader 3.0. If my memory serves me correctly, it was the 6xxx series that brought HDR, which was their sales pitch.

ATi at the time, with their X800 series, was only running Pixel Shader 2.0. I'm going to leave an example here of the difference between HDR on and off from back in the old days with The Elder Scrolls IV: Oblivion.

With HDR:
View attachment 379337

Without HDR:
View attachment 379338

I used to think HDR was the shizz nizz back in the day, but as I've been playing over the years, I've noticed that the HDR colours were very bright and that bloom was a more accurate representation of real-life colours.

You could also run 8x AA with bloom but not with HDR, which I have done in the example screenshots shown above.

Which one do you think is better?

Great screenshots of your RT implementation, btw. Nice work!

Ah, that was DirectX 9.0c. Shader Model 3.0 brought parity between DirectX on Windows and the Xbox 360's graphics capabilities; Oblivion's bloom shader was a fallback path for older DirectX 9.0b cards like the GeForce FX series, which was about three years old when the game came out. Oblivion really stretched the pre-unified-shader GPUs to the max, and IMO it still looks stunning to this day. No official pricing info on the new Radeon cards either; I reckon it's coming soon. That being said...

STOP RIGHT THERE, CRIMINAL SCUM. Nobody plays Oblivion with the nasty bloom shader on my watch. I'm confiscating your stolen goods. Now pay your fine, or it's off to jail.
 

Outback Bronze

Super Moderator
Staff member
Joined
Aug 3, 2011
Messages
2,077 (0.42/day)
Location
Walkabout Creek
System Name Raptor Baked
Processor 14900k w.c.
Motherboard Z790 Hero
Cooling w.c.
Memory 48GB G.Skill 7200
Video Card(s) Zotac 4080 w.c.
Storage 2TB Kingston kc3k
Display(s) Samsung 34" G8
Case Corsair 460X
Audio Device(s) Onboard
Power Supply PCIe5 850w
Mouse Asus
Keyboard Corsair
Software Win 11
Benchmark Scores Cool n Quiet.
Nobody plays Oblivion with the nasty bloom shader on my watch.

It's funny, you know. I used to not touch bloom when I was running GeForce cards, then one day I tried it and, pow, I was hooked. Not sure if it was the 8x AA that was helping through the forests or what.

I remember when I first turned on HDR vs bloom with the GeForce cards; gees, it chewed the card. Kinda why I'm comparing it a bit to what RT does this day and age.

Thread/
 
Joined
Dec 25, 2020
Messages
7,229 (4.89/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
It's funny, you know. I used to not touch bloom when I was running GeForce cards, then one day I tried it and, pow, I was hooked. Not sure if it was the 8x AA that was helping through the forests or what.

I remember when I first turned on HDR vs bloom with the GeForce cards; gees, it chewed the card. Kinda why I'm comparing it a bit to what RT does this day and age.

Thread/

It's true, though. In the beginning it was hardware transform and lighting, then HDR rendering effects, then GPU-accelerated physics, tessellation, instancing, and now ray tracing; each generation of games has brought its own challenges to hardware and graphics drivers. RT is one of the most complex graphics techniques ever, widely considered the holy grail of computer graphics because it enables truly photorealistic scene generation; the problem is the ginormous amount of compute required to pull it off. Nvidia uses AI as a crutch to achieve that goal, but true full RT is probably within the next five GPU generations, IMO.
 
Joined
Jan 14, 2019
Messages
13,278 (6.07/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
It's funny, you know. I used to not touch bloom when I was running GeForce cards, then one day I tried it and, pow, I was hooked. Not sure if it was the 8x AA that was helping through the forests or what.

I remember when I first turned on HDR vs bloom with the GeForce cards; gees, it chewed the card. Kinda why I'm comparing it a bit to what RT does this day and age.

Thread/
My favourite workaround to gain more performance was enabling HDR while disabling grass. Grass ate your GPU even harder than HDR, I'd say. Not to mention you could find your missed arrows a lot more easily without it. :laugh:
 
Joined
Mar 31, 2012
Messages
867 (0.19/day)
Location
NL
System Name SIGSEGV
Processor AMD Ryzen 9 9950X
Motherboard MSI MEG ACE X670E
Cooling Noctua NF-A14 IndustrialPPC Fan 3000RPM | Arctic P14 MAX
Memory Fury Beast 64 Gb CL30
Video Card(s) TUF 4090 OC
Storage 1TB 7200/256 SSD PCIE | ~ TB | 970 Evo | WD Black SN850X 2TB
Display(s) 27" /34"
Case O11 EVO XL
Audio Device(s) Realtek
Power Supply FSP Hydro TI 1000
Mouse g402
Keyboard Leopold|Ducky
Software LinuxMint
Benchmark Scores i dont care about scores
I really wish AMD would develop an open AI accelerator ecosystem and dethrone CUDA.
Don't sell only gimmicks. Prove it, AMD!!
 
Joined
Feb 20, 2019
Messages
8,455 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
You literally said:

So which is it?
It's speculation. Nobody knows for sure.
It would be great if Navi 44 were a 192-bit design with 12 GB in its full configuration. That would give us a 12 GB XT model and maybe an 8 GB cut-down variant as the vanilla 9060.
Let's face it though: nobody outside of AMD and its partners really knows yet, but the die-size leaks suggest that Navi 44 is much smaller than Navi 48, which doesn't make sense if it's supposed to have 75% of Navi 48's hardware. If the die-size leaks are accurate, Navi 44 is somewhere between Navi 24 and Navi 23/33 in size, i.e. somewhere between the 6500 XT and the 6600 series.

Asus TUF leaks/rumours from WCCF, for example:
(attached screenshot)


DigitalTrends:
(attached screenshot)
 
Joined
Jan 2, 2024
Messages
30 (0.08/day)
Exactly. No one denies that RT is nice. The problem is the performance cost even on Nvidia, and the fact that it isn't really a night and day difference, just a little icing on the cake. If you turn it on, you see it's nice. But then you turn it off and still enjoy your game just the same. After 5-10 minutes, you don't even care.
This always leaves me astonished. How is it possible that such a useless and, at the same time, very, very expensive feature (from both a hardware and a price standpoint) has reached such a prominent role in EVERY GPU discussion among users?
Why do people ALWAYS pop up with "eh, but the ray tracing..."?
 
Joined
Jan 19, 2023
Messages
273 (0.38/day)
This always leaves me astonished. How is it possible that such a useless and, at the same time, very, very expensive feature (from both a hardware and a price standpoint) has reached such a prominent role in EVERY GPU discussion among users?
Why do people ALWAYS pop up with "eh, but the ray tracing..."?
Because it's not useless. If you want games with dynamic lighting and open worlds that also look good, you need RT in some way, shape or form.
It doesn't need to be full-on path tracing; it can be mixed with probes, done in software like Lumen when it's tracing against SDFs, or even AMD's version of GI that also uses RT acceleration. It's either RT, or we go back to baked lighting and probes, stagnate, and have the same-looking games forever.
Shadows are the same story.

Sure, many games implement it just to say they have RT, and there it's only a gimmick that can be turned off, but that's on the devs of those games, not on the tech itself.
 