
Next-Gen GPUs: Pricing and Raster 3D Performance Matter Most to TPU Readers

Joined
Aug 12, 2010
Messages
130 (0.02/day)
Location
Brazil
Processor Ryzen 7 7800X3D
Motherboard ASRock B650M PG Riptide
Cooling Wraith Max + 2x Noctua Redux NF-P12
Memory 2x16GB ADATA XPG Lancer Blade DDR5-6000 CL30
Video Card(s) Powercolor RX 7800 XT Fighter OC
Storage ADATA Legend 970 2TB PCIe 5.0
Display(s) Dell 32" S3222DGM - 1440P 165Hz + P2422H
Case HYTE Y40
Audio Device(s) Microsoft Xbox TLL-00008
Power Supply Cooler Master MWE 750 V2
Mouse Alienware AW320M
Keyboard Alienware AW510K
Software Windows 11 Pro
In some cases RT might make games look better, but it isn't worth the performance penalty, nor is it a revolutionary technology.
I'm personally more interested in future Unreal Engine 5 implementations, such as Global Illumination.
 
Joined
Apr 13, 2022
Messages
1,174 (1.23/day)
The worst part is that only some elements (water, lighting, etc.) are ray traced. The performance drop for this partial quality improvement is enormous, and it eventually becomes unnoticeable after playing a fast-paced game for a while.

As more elements are ray traced, performance will drop to zero FPS on today's cards, which effectively 'zeros out' any chance of future-proofing.

Ray tracing is a scam that tries to justify high GPU prices. All manufacturers are in on it, but none worse than Nvidia. I look forward to AMD and Intel bringing some sense back to the GPU market. Hopefully PC enthusiasts will reward those GPU makers with their hard-earned cash, because hoping that better competition brings down Nvidia prices makes no sense if the vast majority only buys Nvidia and refuses to consider other GPUs out of brand loyalty or internet myths about quality. That didn't work out so well for Intel fans over the past two generations of CPUs.

Ray tracing is not a scam. The hardware isn't there yet, and it's going to take several generations for it to actually get there. We aren't going to see good ray tracing with good performance at 4K for any remotely reasonable price (a little over 1,000 USD) until consoles can pull off RT at 4K 120 FPS on their SoC.

RT is also not just for gaming. Get gaming out of your head for a moment; it's not the be-all and end-all. RT is used in professional editing, and it was there long before it arrived on Nvidia cards. Having it on the card, however, makes it much faster for professionals than doing it on workstations or clusters. Since these GPUs cover consumer (gaming), creative, professional and AI purposes, you're not getting RT or AI stripped off them. It's just going to take a while until you see a benefit in gaming.

The frustration with all this and Nvidia is that you keep looking at a GPU as something solely for gaming, but it has never truly been that, and when the 8800 GTX hit with CUDA, gaming was no longer even close to the biggest focus of a GPU.

AI upscaling is take-it-or-leave-it, but most people need it to actually use a 4K monitor, and people have been screaming for 4K playability. It just so happens that the same hardware that produces massive gains in actual productivity can also help hit 4K. Better to have it than to leave unused something that has to be in every GPU now anyway.
 


Joined
Nov 1, 2011
Messages
332 (0.07/day)
System Name 3D Vision & Sound Blaster
Processor Intel Core i5 2500K @ 4.5GHz (stock voltage)
Motherboard Gigabyte P67A-D3-B3
Cooling Thermalright Silver Arrow SB-E Special Edition (with 3x 140mm Black Thermalright fans)
Memory Crucial Ballistix Tactical Tracer 16GB (2x8GB 1600MHz CL8)
Video Card(s) Nvidia GTX TITAN X 12288MB Maxwell @1350MHz
Storage 6TB of Samsung SSDs + 12TB of HDDs
Display(s) LG C1 48 + LG 38UC99 + Samsung S34E790C + BenQ XL2420T + PHILIPS 231C5TJKFU
Case Fractal Design Define R4 Windowed with 6x 140mm Corsair AFs
Audio Device(s) Creative SoundBlaster Z SE + Z906 5.1 speakers/DT 990 PRO
Power Supply Seasonic Focus PX 650W 80+ Platinum
Mouse Logitech G700s
Keyboard CHERRY MX-Board 1.0 Backlit Silent Red Keyboard
Software Windows 7 Pro (RIP) + Winbloat 10 Pro
Benchmark Scores 2fast4u,bro...
Glad to see the results of this poll -- lines up perfectly with what I think.

Also, if Ngreedia doesn't want to add more VRAM to its GPUs for fear of cannibalising its AI GPU sales, it can cut away most of the tensor cores and replace them with good old-fashioned CUDA cores, TMUs and ROPs. Problem solved.
 
Joined
Oct 6, 2009
Messages
2,827 (0.51/day)
Location
Midwest USA
System Name My Gaming System
Processor AMD Ryzen 7 7800X3D
Motherboard Gigabyte b650 Aorus Elite AX
Cooling Phanteks Glacier One 360D30
Memory G.Skill 32 GB DDR5 6000 MHz
Video Card(s) ASRock Phantom 7900XT OC
Storage 4 TB NVMe Total
Case Hyte y40
Power Supply Corsair 850 Modular PSU
Software Windows 11 Home Premium
While this poll says TPU users care most about rasterization and pricing, it seems contradictory that most of the cards in use, according to Steam, are Nvidia cards. (I would also assume that stat is mirrored on TPU.) It just shows how good Nvidia's marketing department has been over the last few generations; otherwise AMD would have a much larger share of the market.
And before the fanboys get all up in arms: I've given both AMD and Nvidia my money several times. I go with whoever offers the best performance vs. value! But I refuse to buy another Nvidia card until they stop gouging their customers and stop selling chips that should have been classified as a lower model for a ridiculous price. I get that they have a business to run, but their tactics are just shady right now. Selling RTX xx50-class cards for 400+ dollars as RTX xx60 or 60 Ti cards is ridiculous, when they should cost $250 at most, even with inflation.
AMD isn't a perfect little angel either, but it's nowhere near as bad at the moment.
 
Joined
Sep 17, 2014
Messages
22,442 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
While TPU voters may care about raster performance, 80% of buyers care about RT performance and DLSS. That's Nvidia's market share.
Even Sony pressured AMD to get its sh!t together, improve RT performance and stop fooling around like they did with RDNA3.

Personally, I'm going to insist on what I said the day the RX 7900 XTX/XT reviews came out. RT performance must be a priority, because that's where all the marketing is. Also, upscaling and frame generation are seen today as a godsend, not as cheating; we are not in the 2000s, when cheating was exposed as something negative. Today it's a feature. This means raster performance is more than enough when combined with upscaling and frame generation, so what AMD needs to do is focus on RT performance. Only then can they level the field with Nvidia in performance and force Nvidia to search for another gimmick to differentiate their cards, while of course sabotaging the competition.
So 80% of buyers are idiots that can't see what's happening in front of them, then.

I think that's a good match for the realistic market conditions of the mainstream vs. the niche. I bet the same-ish 80% listens only to the top music: whatever gets aired, they listen. I bet the same happens with console ownership vs. the gaming PC; 80/20 seems about right.

But 20% of the market is still a multi-billion-dollar market, even if it's a niche within a niche, go figure.
There's a place for all of it, and funneling all markets into a situation where they're overpaying for shitty graphics isn't The Way.

I don't think Nvidia sells cards better because of RT and DLSS. They position their products better, they market them better, their time to market is shorter, and they're first rather than last with new features; features being much more than RT and DLSS, those are just the examples that are live today. It's really quite amazing AMD held on to something like a 40% share for so long, given its performance over the last few decades.

They simply need to do better and be consistent for a change. There are almost no two consecutive generations where AMD has made a simple move forward, doing what they did last time and executing their successful product strategy not once but twice. It hasn't happened a single time since Nvidia's Kepler at least; well, MAYBE with the HD 7000 series, but then they just rebranded it to the R-series for God knows what reason, and here we are: no consistency. Suddenly a 7970 was a 280X... They've been all over the place, and the customer loses trust. It's only logical, and that's where that extra 20% of market share loss was created. AMD has definitely bled some fanbase over the last few years, and they can blame only themselves. Also, bad product positioning/strategy overall: the Fury X 4GB was a complete misfire, got eclipsed by the 980 Ti 6GB (go figure, Nvidia pulled the VRAM card on AMD, but the 980 Ti even destroyed it at 1080p and overclocked much better), and a year after release it had lost nearly all game support/optimization. Again: this kills trust.

Heck, even I am not so sure I'll dive into another AMD GPU right now. Look at the per-game performance in some new titles; it's abysmal. Forget RT: AMD needs full focus on the basics first. Every time, AMD needs another kick in the nuts to keep doing things right. RDNA2 was great; the consoles forced them into a very solid driver and support cadence. Apparently they've reached that milestone now and the focus is off again. It's like... WTF, dudes?
 
Joined
Feb 3, 2017
Messages
3,754 (1.32/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
In some cases RT might make games look better, but it isn't worth the performance penalty, nor is it a revolutionary technology.
I'm personally more interested in future Unreal Engine 5 implementations, such as Global Illumination.
You do understand that the next step in UE5 GI (which already does ray tracing) will move more and more toward hardware-accelerated ray tracing, right?
 
Joined
Sep 17, 2014
Messages
22,442 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Fact is, Ray Tracing is easier to work with for developers compared to baked lighting.
Right, so they're pushing part of the development cost into our lap. Thanks, I guess?

The amount of bullshit they need to stack on top of one another to get there kills the performance, but ironically also kills the image quality.
 
Joined
Sep 5, 2024
Messages
33 (0.40/day)
Processor Ryzen 3700X
Motherboard MSI B550
Cooling DeepCool AK620. 3 x 140mm intake fans, 1 x 140mm exhaust fan
Memory 32 Gb DDR4 3000
Video Card(s) RX 6750 XT
Storage NVME, SATA SSD and NAS HDD
Display(s) Dell 24" 1440p, Samsung 24" 1080p
Case Fractal Design Define 7
Audio Device(s) Onboard
Power Supply Super Flower ATX 3.0 850w
Mouse Corsair M65
Keyboard Corsair mechanical
Software Win 10, Ubuntu 24.04 LTS
I'm honestly surprised Energy Efficiency got 16%. I must be a Luddite, because I DGAF about it.

Each to their own I suppose. I have an inverter, battery and solar setup so even though I game on a desktop when I play at night I'm essentially on battery.
 
Joined
Nov 27, 2022
Messages
57 (0.08/day)
Ray tracing is good. It's nice.
Microsoft's ray tracing API is bad. It's a black box, so when you implement it in your game you have only a vague idea of what it's going to do. That's all the HUB video proves.
That's why Unreal does its own version of RT.

Another thing: the speed of ray tracing in games would be acceptable if all of the CUDA cores could do RT. Instead, what we got is a very small part of the whole GPU that can do RT (1/128 to be exact on Ada cards). This also means we are very, very, very far away from games looking awesomely ray traced and running fast at the same time.

In terms of Nvidia's market share, it's not about the average Joe buying a video card: OEMs and system integrators sell their PCs with Nvidia cards 90% of the time (not to mention notebooks). Why? Because "AMD driver bad"; at least that's what the management at these companies thinks it knows about AMD, and they don't want to deal with it. And once someone has bought their first PC/notebook, if it works as intended, they most likely won't switch to AMD.
So it's not about marketing. NV doesn't do jackshiet marketing because there is no need.

Btw, this poll was conducted in a very small enthusiast bubble on the internet. These enthusiast bubbles tend to be more knowledgeable than average, and tend to contain more AMD users than average.
So view the results accordingly.
 
Joined
Feb 3, 2017
Messages
3,754 (1.32/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
Microsoft's ray tracing API is bad. It's a black box, so when you implement it in your game you have only a vague idea of what it's going to do. That's all the HUB video proves.
That's why Unreal does its own version of RT.
UE does hardware accelerated raytracing using DX12 DXR.
 
Joined
Dec 24, 2022
Messages
78 (0.11/day)
How much die space do AI and RT take up on Ada and RDNA3? I wonder what the cost would be if they were cut out, or how much more raster hardware you could fit in the same die space. RT is take-it-or-leave-it, and FG/upscaling can still be decent/good and could be made even better using regular old shaders. It seems like in the near future every chip is going to have AI on it; I'd rather buy a dedicated AI card. Your CPU has an NPU, the integrated GPU has NPUs, your dedicated video card has NPUs. Let's just make the NPU its own dedicated chip.

Could RT work be split out to something like a daughterboard, or a dedicated card, with the RT calculations offloaded onto it?

I've always wondered whether we could get more out of RT, AI and traditional GPUs if each were split out onto its own individual card. Combined, you'd have a ton more die space. Imagine a dedicated RT card the size of big Ada, running fully path-traced games.
That'd actually be a great idea. There's nothing stopping the motherboard manufacturers from adding a simple socket for an additional chip. It'd add some latency and all, but it's not like that can't be mitigated in software.
 
Joined
Nov 27, 2022
Messages
57 (0.08/day)
UE does hardware accelerated raytracing using DX12 DXR.
Yeah, but they "simplified" it so it can run on hardware without RT capabilities and doesn't rely solely on the MS API. MegaLights wants to circumvent the API.
Btw, Ubisoft did their own thing in Avatar with the Snowdrop engine (HUB somehow forgot to look at it); they circumvented the "black box", and look at that game.

The MS API is good for the hardware sales.
 
Joined
Feb 3, 2017
Messages
3,754 (1.32/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
Yeah, but they "simplified" it so it can run on hardware without RT capabilities and doesn't rely solely on the MS API. MegaLights wants to circumvent the API.
Did you happen to wonder what Lumen HWRT means?

Edit:
For all the topics brought up, it is pretty surprising how many strange misunderstandings there are.

Why the hate for DXR? It is just an API, part of DX12. There is also Vulkan and Vulkan ray tracing, but that seems to have less support and clout: partly because it came noticeably later, and partly because Vulkan itself needed a push for adoption that never really came in the AAA space. Unreal Engine 5 is basically built to run on DX12, which is also a Microsoft API. As a side note, DX12 adoption was also very slow until Nvidia came along with the RTX push, which required a proper DX12 engine underneath to even start using DXR.

Unreal Engine, or Lumen when we're talking about lighting solutions, is not a separate thing. Lumen is a marketing term for the Unreal Engine 5 lighting engine. While the classical lighting pipeline is still there, everything beyond it is concentrated under Lumen. Practically, Lumen is a global illumination system that aims to replace a number of traditional components. In terms of the technologies it utilizes and the hardware it can take advantage of, it covers a pretty wide scale. It gives the game developer a range of configuration targets: starting with a software ray tracing solution based on distance fields, then a hardware-accelerated hybrid ray tracing mode, and eventually full path tracing. The quality of the resulting image, and the hardware and performance requirements, go up along that same scale.

Why the question about Lumen HWRT above? Because what is being demonstrated there is a HardWare-accelerated hybrid Ray-Tracing solution; really, the differences between a path-traced result and a less performance-intensive configuration.
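To make that scale concrete, this is roughly how the choice surfaces in UE5 console variables. A minimal sketch from general UE5 documentation, not from this thread; exact cvar names and defaults vary by engine version, so treat them as assumptions:

```ini
; Hypothetical DefaultEngine.ini fragment (UE5); verify against your engine version.
[SystemSettings]
r.DynamicGlobalIlluminationMethod=1   ; 1 = Lumen global illumination
r.ReflectionMethod=1                  ; 1 = Lumen reflections
r.Lumen.HardwareRayTracing=0          ; 0 = software RT via distance fields
;r.Lumen.HardwareRayTracing=1         ; 1 = hardware-accelerated hybrid RT (needs RT-capable GPU)
```

The path tracer sits at the top of the same scale and is switched on separately (as a render mode), rather than through the Lumen cvars above.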
 
Joined
Feb 3, 2017
Messages
3,754 (1.32/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
Can someone explain to me what is it with this strange obsession with software when it comes to ray-tracing these days? :)
 
Joined
Jul 24, 2024
Messages
224 (1.79/day)
System Name AM4_TimeKiller
Processor AMD Ryzen 5 5600X @ all-core 4.7 GHz
Motherboard ASUS ROG Strix B550-E Gaming
Cooling Arctic Freezer II 420 rev.7 (push-pull)
Memory G.Skill TridentZ RGB, 2x16 GB DDR4, B-Die, 3800 MHz @ CL14-15-14-29-43 1T, 53.2 ns
Video Card(s) ASRock Radeon RX 7800 XT Phantom Gaming
Storage Samsung 990 PRO 1 TB, Kingston KC3000 1 TB, Kingston KC3000 2 TB
Case Corsair 7000D Airflow
Audio Device(s) Creative Sound Blaster X-Fi Titanium
Power Supply Seasonic Prime TX-850
Mouse Logitech wireless mouse
Keyboard Logitech wireless keyboard
In some cases RT might make games look better, but isn't worth the performance penalty nor is it a revolutionary technology.
I'm personally more interested in Unreal Engine 5 future implementations, such as Global Illumination.
Unlike DLSS-like stuff and fake frame generation, RT actually improves the image. It is a step forward in achieving more realistic images, but RT is extremely taxing on hardware resources. It reminds me of the times when tessellation was the new thing, or even earlier, when 8xMSAA was a performance killer. RT performance will get better over time. Today it's nice, but expensive.

Then there is the other approach: let's render the image at a lower resolution and upscale it to native resolution while guessing the missing image data with interpolation or similar algorithms. This is a step backwards. It deviates from image realism, and bundling such techniques on top of each other just makes it deviate even more. Sometimes I wonder: what the heck is the goal of game devs nowadays? They add RT to games, but in order to run the game at reasonable FPS you need to turn on DLSS/FSR/XeSS and frame generation. What's the point of adding RT then? You're increasing realism and then immediately f*cking it up.
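The render-low-then-upscale trade-off described above can be sketched in a few lines. The preset names and per-axis scale factors below are the commonly cited ones for DLSS/FSR-style upscalers, not figures from this thread, so treat them as illustrative assumptions:

```python
# Sketch of how upscalers trade internal render resolution for output
# resolution. Scale factors are the commonly cited per-axis values for
# DLSS/FSR-style quality presets; actual values depend on the upscaler.

PRESET_SCALE = {
    "Quality": 0.667,            # renders at ~2/3 of output resolution per axis
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and preset."""
    s = PRESET_SCALE[preset]
    return round(out_w * s), round(out_h * s)

def pixel_savings(out_w: int, out_h: int, preset: str) -> float:
    """Fraction of shaded pixels saved versus rendering at native resolution."""
    w, h = internal_resolution(out_w, out_h, preset)
    return 1 - (w * h) / (out_w * out_h)

if __name__ == "__main__":
    for preset in PRESET_SCALE:
        w, h = internal_resolution(3840, 2160, preset)
        saved = pixel_savings(3840, 2160, preset)
        print(f"4K {preset}: renders at {w}x{h}, {saved:.0%} fewer shaded pixels")
```

For example, "4K Performance" mode shades a 1920x1080 image, a quarter of the native pixel count, which is where the large FPS gains (and the guessed image data) come from.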

When Crysis or Metro came out, they were considered benchmarks ("etalons") of game graphics. Performance was terrible, but at least it was in the service of image quality. The same will happen with RT over time.
 
Joined
Apr 10, 2020
Messages
504 (0.30/day)
It's still all about raster performance for me. Sure, DLSS is nice to have, but then again, higher raw raster frame rates also translate into higher DLSS frame rates. My rule of thumb is to upgrade the GPU when the new thing is at least 50% faster than the old one, and 30% for a CPU. I used to upgrade every 2nd gen on average when it came to GPUs and every 3rd gen for CPUs; now it looks like I'm going to be upgrading every 3rd gen or even less frequently for GPUs, and maybe every 4th-5th gen for CPUs. It looks like Ngreedia/AMD/Intel don't want our money anymore, as it's all about AI and servers atm. Well, things might change if/when the AI bubble bursts.
 
Joined
Dec 24, 2022
Messages
78 (0.11/day)
Unlike DLSS-like stuff and fake frame generation, RT actually improves the image. It is a step forward in achieving more realistic images, but RT is extremely taxing on hardware resources. It reminds me of the times when tessellation was the new thing, or even earlier, when 8xMSAA was a performance killer. RT performance will get better over time. Today it's nice, but expensive.

Then there is the other approach: let's render the image at a lower resolution and upscale it to native resolution while guessing the missing image data with interpolation or similar algorithms. This is a step backwards. It deviates from image realism, and bundling such techniques on top of each other just makes it deviate even more. Sometimes I wonder: what the heck is the goal of game devs nowadays? They add RT to games, but in order to run the game at reasonable FPS you need to turn on DLSS/FSR/XeSS and frame generation. What's the point of adding RT then? You're increasing realism and then immediately f*cking it up.

When Crysis or Metro came out, they were considered benchmarks ("etalons") of game graphics. Performance was terrible, but at least it was in the service of image quality. The same will happen with RT over time.
I've seen plenty of games with great visuals, and those were before RT. Every game with RT I've seen is basically shadows and lighting. That's not very impressive. There's a greater change in visual quality simply from going from low to high settings, and the devs could use those game engines and APIs to the fullest.
 
Joined
Apr 30, 2011
Messages
2,703 (0.55/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
100+ FPS @1080p and high quality settings are more than enough for me. VFM is the most critical parameter in my GPU purchasing decisions. Next come stability, cooling capacity, efficiency and features, in that specific order.
 
Joined
Dec 14, 2011
Messages
1,035 (0.22/day)
Location
South-Africa
Processor AMD Ryzen 9 5900X
Motherboard ASUS ROG STRIX B550-F GAMING (WI-FI)
Cooling Corsair iCUE H115i Elite Capellix 280mm
Memory 32GB G.Skill DDR4 3600Mhz CL18
Video Card(s) ASUS GTX 1650 TUF
Storage Sabrent Rocket 1TB M.2
Display(s) Dell S3220DGF
Case Corsair iCUE 4000X
Audio Device(s) ASUS Xonar D2X
Power Supply Corsair AX760 Platinum
Mouse Razer DeathAdder V2 - Wireless
Keyboard Redragon K618 RGB PRO
Software Microsoft Windows 11 - Enterprise (64-bit)
Yeah, but they "simplified" it so it can run on hardware without RT capabilities and doesn't rely solely on the MS API. MegaLights wants to circumvent the API.
Btw, Ubisoft did their own thing in Avatar with the Snowdrop engine (HUB somehow forgot to look at it); they circumvented the "black box", and look at that game.

The MS API is good for the hardware sales.
The graphics sure are impressive; however, I like the "fantasy" setting games have. I don't want it too real. Dunno about others, but that's how I feel about it. :)
 
Joined
Sep 13, 2020
Messages
139 (0.09/day)
I would love to trade these for a better GPU or a cheaper one:
  1. RT
  2. Upscaling & frame gen
  3. AI
RT is cool, but it tanks FPS and is not a big deal.
Frame generation is a sin, and you guys will all go to hell for using it.
AI is just a promise for gaming; I would not pay for it rn. Imagine the AAA trash released recently, but with "AI features"... yeah...
 
Joined
Mar 31, 2009
Messages
99 (0.02/day)
Yeah, UE5 Lumen does a great job! Senua's Saga: Hellblade II has the best PC graphics at the moment.
 
Joined
Aug 10, 2020
Messages
314 (0.20/day)
A lot of armchair engineering going on here. Rasterization is a dead end; caring about more performance there instead of RT is pointless. There are too many things you can't push any further with rasterization, now that we have acceptable RT performance with RTX or competing cores (including all the related tech: DLSS, RR, FG, etc.). By the next console generation, everything will have some element of RT for much more realistic lighting, shadows, reflections, global illumination and so on, and plain rasterization will start to look like the "hacks" to simply avoid (SSR and SSAO are good examples that never looked very good).
 
Joined
May 6, 2020
Messages
78 (0.05/day)
Fact is, Ray Tracing is easier to work with for developers compared to baked lighting.


As if AMD, Intel and NVIDIA are going to listen, lol
Up to them. The next GPU generation (RTX 6xxx / RX 8xxx) will surely handle 4K at 60+ FPS easily, so there will be no need for that bullshit, especially if graphics don't make a huge evolutionary leap!!
 