> Control? It was THE poster game for RT (and DLSS) with Turing.

Ah, I've got a mod bookmarked, but it looks like that just improves upon what is already there. It's on the backlog!
System Name | Bragging Rights |
---|---|
Processor | Atom Z3735F 1.33GHz |
Motherboard | It has no markings but it's green |
Cooling | No, it's a 2.2W processor |
Memory | 2GB DDR3L-1333 |
Video Card(s) | Gen7 Intel HD (4EU @ 311MHz) |
Storage | 32GB eMMC and 128GB Sandisk Extreme U3 |
Display(s) | 10" IPS 1280x800 60Hz |
Case | Veddha T2 |
Audio Device(s) | Apparently, yes |
Power Supply | Samsung 18W 5V fast-charger |
Mouse | MX Anywhere 2 |
Keyboard | Logitech MX Keys (not Cherry MX at all) |
VR HMD | Samsung Odyssey, not that I'd plug it into this, though... |
Software | W10 21H1, barely |
Benchmark Scores | I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000. |
> I don't think the 5090 and 5080 are that important in this regard. They're both priced way out of reach of people looking for a 5070-level card. We've also learned in the last 2-3 generations that the performance of the halo card has little to no effect on the rest of the product stack. Personally, I won't even read their reviews in their entirety, just the architectural differences, because how much faster, hungrier and more expensive we can go above the 4090, I honestly don't care.

The 5090 is going to be vaporware outside of the very wealthy who are willing to pay more than the businesses that will be queueing for stock and able to justify spending $3,000, $4,000, $5,000+ on a 5090.
System Name | The Workhorse |
---|---|
Processor | AMD Ryzen R9 5900X |
Motherboard | Gigabyte Aorus B550 Pro |
Cooling | CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front |
Memory | GSkill Trident Z 3200CL14 |
Video Card(s) | NVidia GTX 1070 MSI QuickSilver |
Storage | Adata SX8200Pro |
Display(s) | LG 32GK850G |
Case | Fractal Design Torrent (Solid) |
Audio Device(s) | FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone |
Power Supply | Corsair RMx850 (2018) |
Mouse | Razer Viper (Original) on a X-Raypad Equate Plus V2 |
Keyboard | Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black) |
Software | Windows 11 Pro (24H2) |
> Because if NVIDIA doesn't increase its generation-on-generation performance in every aspect, the reviewers and buying public are going to trash them, and their investors will be mad. So NVIDIA, which literally doesn't care about rasterisation anymore, has to keep delivering linear rasterisation improvements anyway with more of the same old fixed-function rasterisation hardware, at the same time as they deliver far greater RT improvements with new fixed-function hardware. At some stage I expect that they will either merge the rasterisation and RT hardware somewhat to prevent so much duplication, or be able to reuse the RT hardware to emulate rasterisation workloads, or possibly some combination of the two. Knowing NVIDIA they're already working hard on this problem.

There's another reason that's not exactly gaming related, or at least not fully: a lot of professional software - be it 3D rendering, video editing or compute - is still built around using shader cores for what it does. NV can't really just come out with an architecture lacking those capabilities, since they'd very much like to sell to that market as well. Tensor cores and RT cores are a solution to THAT particular problem as much as (well, more so than) to the gaming one - separating pure compute and RT hardware to make sure shader resources are still present for the most common professional uses down the line, when unified shaders start to get wound down.
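To make that point concrete, here is a minimal, hypothetical sketch (assuming the CUDA toolkit; the kernel and names are illustrative, not from any poster or tool mentioned in the thread) of the bread-and-butter work that rendering, encoding and compute software dispatches to a GPU. Everything here runs on the ordinary SM/shader ALUs; RT cores and tensor cores are never involved, which is why they were added alongside the existing shader hardware rather than in place of it.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Illustrative only: a plain FMA-heavy kernel of the sort that professional
// tools issue constantly. It uses the SM's standard FP32 ALUs ("shader cores");
// no RT or tensor units are touched.
__global__ void saxpy(int n, float a, const float* x, float* y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %f\n", y[0]);   // expect 5.0
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```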
Processor | Various Intel and AMD CPUs |
---|---|
Motherboard | Micro-ATX and mini-ITX |
Cooling | Yes |
Memory | Overclocking is overrated |
Video Card(s) | Various Nvidia and AMD GPUs |
Storage | A lot |
Display(s) | Monitors and TVs |
Case | The smaller the better |
Audio Device(s) | Speakers and headphones |
Power Supply | 300 to 750 W, bronze to gold |
Mouse | Wireless |
Keyboard | Mechanical |
VR HMD | Not yet |
Software | Linux gaming master race |
> they deliver far greater RT improvements with new fixed-function hardware.

Do they really?

> At some stage I expect that they will either merge the rasterisation and RT hardware somewhat to prevent so much duplication, or be able to reuse the RT hardware to emulate rasterisation workloads, or possibly some combination of the two. Knowing NVIDIA they're already working hard on this problem.

It doesn't look like it for now. This is Turing vs Blackwell. I assume that if Nvidia really was working on this problem, then we would see at least some indication by the 4th generation of their RT hardware.

> Close your eyes and tell me how much you see. Now reconsider your statement that light is "just a portion" of the world.

I worded it badly. I'm tired after work.

> There are multiple reasons for this.
> - Rasterisation has become incredibly good at simulating the real world, so good that the basic RT we currently have isn't able to outperform it visually. That's a consequence of literally decades of work on rasterisation, and far less on RT.

Possibly.

> - You've become used to how rasterisation simulates the real world, so games using rasterisation don't look "off" to you, even when their rendering is actually incorrect compared to the real world.

Why would you think that RT is always correct? I've seen it make errors. There was someone posting a screenshot in another thread not long ago of RT casting shadows that it logically shouldn't.

> - Conversely, your brain gets used to RT quickly, because the latter does such a good job at simulating the real world.

Then why does my brain get used to no RT equally quickly? My conclusion is that both raster and RT do a pretty good job these days, just differently.
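For what it's worth, the "correct vs incorrect" framing in the exchange above has a concrete reference point: a path tracer is a numerical estimator of the rendering equation, while raster-era pipelines approximate its individual terms with stand-ins (shadow maps, screen-space AO, baked GI), which is where the artefacts on both sides come from. The standard form, for reference only (not something either poster wrote):

```latex
% The rendering equation (Kajiya, 1986): outgoing radiance at point x in
% direction w_o is emission plus the cosine-weighted integral of incoming
% radiance over the hemisphere, modulated by the BRDF f_r.
\[
L_o(\mathbf{x}, \omega_o) = L_e(\mathbf{x}, \omega_o)
  + \int_{\Omega} f_r(\mathbf{x}, \omega_i, \omega_o)\,
    L_i(\mathbf{x}, \omega_i)\,(\omega_i \cdot \mathbf{n})\,\mathrm{d}\omega_i
\]
```

Ray tracing evaluates the visibility and incoming-radiance terms by querying the scene directly; rasterisation substitutes precomputed or screen-space approximations for them.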
> The 5080 will be the GPU to try and get, and if you want one, get in the limited-stock queue for the $999 Founders Edition directly from Nvidia, because all of the AIB models are being listed at $1,200+.

So much markup on AIB cards! You could buy a whole GPU for that kind of money. Total rip-off by the AIBs, total rip-off for consumers.
System Name | Tiny the White Yeti |
---|---|
Processor | 7800X3D |
Motherboard | MSI MAG Mortar b650m wifi |
Cooling | CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3 |
Memory | 32GB Corsair Vengeance 30CL6000 |
Video Card(s) | ASRock RX7900XT Phantom Gaming |
Storage | Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB |
Display(s) | Gigabyte G34QWC (3440x1440) |
Case | Lian Li A3 mATX White |
Audio Device(s) | Harman Kardon AVR137 + 2.1 |
Power Supply | EVGA Supernova G2 750W |
Mouse | Steelseries Aerox 5 |
Keyboard | Lenovo Thinkpad Trackpoint II |
VR HMD | HD 420 - Green Edition ;) |
Software | W11 IoT Enterprise LTSC |
Benchmark Scores | Over 9000 |
> Completely and utterly wrong. Real-time ray tracing is the future of graphics rendering, despite how many times you and others like you attempt to poo-poo it. Its uptake has simply been delayed by a number of factors:
>
> - It's a complete paradigm shift compared to rasterisation - you don't just need the appropriate tools, but the appropriate mindset. Game developers who were born and raised on rasterisation are going to take time to get to grips with RT, and especially will have to unlearn a vast quantity of the stupid bulls**t hackery required to coerce rasterisation into rendering things somewhat realistically.
> - Game development is no longer about pushing the boundaries of technology, but about making money. Even if developers want to implement RT, their managers aren't necessarily going to let them because of the extra training and development time, and thus cost. This creates inertia.
> - Hardware isn't quite powerful enough to handle it yet. You might say "then it shouldn't have been introduced", but you need to make game developers aware of and comfortable with a technology sooner rather than later.
> - Hardware isn't getting powerful enough at a fast enough rate to handle it. Unfortunately RT was introduced just before we hit the Moore's Law wall, which is particularly important given how hardware-intensive RT is.
>
> RT has been the holy grail of graphics rendering forever. We may not yet be able to hold that grail, but we can at least touch it. If you'd suggested the latter would be a possibility to any computer graphics researcher a decade ago, they'd have laughed you out of the room - and yet here we are.
>
> You don't like RT, we get it, but stop allowing that irrational dislike to blind you to the fact that RT is, in every aspect, the future of realistic graphics rendering, superior to rasterisation in every conceivable way. In another decade, the only conversation about the latter will be in relation to graphics from before the RT era.

It's the future, but not the way it is pushed today. You've pointed out the issues, but the introduction of RT 'just before we hit the Moore's Law wall' is a business strategy Nvidia deployed knowing there is an insurmountable challenge to be met. It's a fantastic business model: you can never solve this issue in real time; GPUs will never be fast enough to brute force everything. Remarkably similar to AI.
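The "hackery vs mindset" bullet in the quoted post above can be illustrated with a toy: in the RT mindset you answer a visibility question by firing a ray at the scene itself, rather than approximating it with a shadow map rendered from the light's point of view. A hypothetical sketch follows (CUDA, brute-force sphere list rather than a real BVH or any DXR/Vulkan RT/OptiX API; all names are illustrative):

```cuda
// Toy shadow query in the "RT mindset": visibility is answered by tracing a
// ray against the scene (here a flat list of spheres), not by sampling a
// depth map rendered from the light. Real engines would drive the RT cores
// through DXR / Vulkan RT / OptiX and an acceleration structure instead.
struct Sphere { float cx, cy, cz, r; };

__device__ bool shadowRayBlocked(const Sphere* spheres, int numSpheres,
                                 float3 p, float3 light)
{
    // Unit direction and distance from the shaded point towards the light.
    float dx = light.x - p.x, dy = light.y - p.y, dz = light.z - p.z;
    float dist = sqrtf(dx * dx + dy * dy + dz * dz);
    dx /= dist; dy /= dist; dz /= dist;

    for (int i = 0; i < numSpheres; ++i) {
        // Standard ray-sphere intersection: solve t^2 + 2bt + c = 0.
        float ox = p.x - spheres[i].cx, oy = p.y - spheres[i].cy, oz = p.z - spheres[i].cz;
        float b = ox * dx + oy * dy + oz * dz;
        float c = ox * ox + oy * oy + oz * oz - spheres[i].r * spheres[i].r;
        float disc = b * b - c;
        if (disc > 0.0f) {
            float t = -b - sqrtf(disc);
            if (t > 1e-4f && t < dist) return true;   // occluder between point and light
        }
    }
    return false;
}

__global__ void shadeShadows(const Sphere* spheres, int numSpheres,
                             const float3* points, int numPoints,
                             float3 light, int* inShadow)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < numPoints)
        inShadow[i] = shadowRayBlocked(spheres, numSpheres, points[i], light) ? 1 : 0;
}
```

The shape of the question - "is anything between this point and the light?" - is what the RT pipeline hardware accelerates; the raster-era equivalent involves rendering, filtering and biasing a shadow map to fake the same answer.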
> It's the future, but not the way it is pushed today. You've pointed out the issues, but the introduction of RT 'just before we hit the Moore's Law wall' is a business strategy Nvidia deployed knowing there is an insurmountable challenge to be met. It's a fantastic business model: you can never solve this issue in real time; GPUs will never be fast enough to brute force everything.

Yet here we are, brute forcing our way into everything for four Nvidia generations straight. The architecture doesn't change much; we just get more parts crammed into a tighter space.
> Yet here we are, brute forcing our way into everything for four Nvidia generations straight. The architecture doesn't change much; we just get more parts crammed into a tighter space.

Yeah, we are... with abysmal performance and artifacting everywhere. Wooptiedoo.

If RT really was the future, then I'd like to see some indication that we're moving towards more RT-oriented architectures. But for now, RT cores are still just an add-on, and not really improved, either.
> It does look awesome in places. I just think you need a director's touch. Reminds me of procedural generation in that aspect.

It does, and hey... isn't that director's touch exactly the same touch you wanted on that archaic lighting in all those awesome games we already have?