> Control? It was THE poster game for RT (and DLSS) with Turing.

Ah, I've got a mod bookmarked, but it looks like that just improves upon what is already there. It's on the backlog!
System Name | Bragging Rights |
---|---|
Processor | Atom Z3735F 1.33GHz |
Motherboard | It has no markings but it's green |
Cooling | No, it's a 2.2W processor |
Memory | 2GB DDR3L-1333 |
Video Card(s) | Gen7 Intel HD (4EU @ 311MHz) |
Storage | 32GB eMMC and 128GB Sandisk Extreme U3 |
Display(s) | 10" IPS 1280x800 60Hz |
Case | Veddha T2 |
Audio Device(s) | Apparently, yes |
Power Supply | Samsung 18W 5V fast-charger |
Mouse | MX Anywhere 2 |
Keyboard | Logitech MX Keys (not Cherry MX at all) |
VR HMD | Samsung Odyssey, not that I'd plug it into this though.... |
Software | W10 21H1, barely |
Benchmark Scores | I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000. |
> I don't think the 5090 and 5080 are that important in this regard. They're both priced way out of reach of people looking for a 5070-level card. We've also learned in the last 2-3 generations that the performance of the halo card has little to no effect on the rest of the product stack. Personally, I won't even read their reviews in their entirety, just the architectural differences, because how much faster, hungrier and more expensive we can go above the 4090, I honestly don't care.

The 5090 is going to be vaporware outside of the very wealthy who are willing to pay more than businesses, and who will be queueing for stock and able to justify spending $3,000, $4,000, $5,000+ on a 5090.
System Name | The Workhorse |
---|---|
Processor | AMD Ryzen R9 5900X |
Motherboard | Gigabyte Aorus B550 Pro |
Cooling | CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front |
Memory | GSkill Trident Z 3200CL14 |
Video Card(s) | NVidia GTX 1070 MSI QuickSilver |
Storage | Adata SX8200Pro |
Display(s) | LG 32GK850G |
Case | Fractal Design Torrent (Solid) |
Audio Device(s) | FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone |
Power Supply | Corsair RMx850 (2018) |
Mouse | Razer Viper (Original) on a X-Raypad Equate Plus V2 |
Keyboard | Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black) |
Software | Windows 11 Pro (24H2) |
> Because if NVIDIA doesn't increase its generation-on-generation performance in every aspect, the reviewers and buying public are going to trash them, and their investors will be mad. So NVIDIA, which literally doesn't care about rasterisation anymore, has to keep delivering linear rasterisation improvements anyway with more of the same old fixed-function rasterisation hardware, at the same time as they deliver far greater RT improvements with new fixed-function hardware. At some stage I expect that they will either merge the rasterisation and RT hardware somewhat to prevent so much duplication, or be able to reuse the RT hardware to emulate rasterisation workloads, or possibly some combination of the two. Knowing NVIDIA, they're already working hard on this problem.

There's another reason that's not exactly gaming related, or at least not fully: a lot of professional software, be it 3D rendering, video editing or compute, is still based around using shader cores for the work it does. NV can't really just come out with an architecture lacking those capabilities, since they'd very much like to sell to that market as well. Tensor cores and RT cores are a solution for THAT particular problem as much as (well, more so than) for gaming: separating pure compute and RT hardware makes sure they are still present for the most common professional uses down the line, when unified shaders start to get wound down.
Processor | Various Intel and AMD CPUs |
---|---|
Motherboard | Micro-ATX and mini-ITX |
Cooling | Yes |
Memory | Overclocking is overrated |
Video Card(s) | Various Nvidia and AMD GPUs |
Storage | A lot |
Display(s) | Monitors and TVs |
Case | The smaller the better |
Audio Device(s) | Speakers and headphones |
Power Supply | 300 to 750 W, bronze to gold |
Mouse | Wireless |
Keyboard | Mechanical |
VR HMD | Not yet |
Software | Linux gaming master race |
> they deliver far greater RT improvements with new fixed-function hardware.

Do they really?
> At some stage I expect that they will either merge the rasterisation and RT hardware somewhat to prevent so much duplication, or be able to reuse the RT hardware to emulate rasterisation workloads, or possibly some combination of the two. Knowing NVIDIA they're already working hard on this problem.

It doesn't look like it for now. This is Turing vs Blackwell. I assume that if Nvidia really were working on this problem, we would see at least some indication of it by the fourth generation of their RT hardware.
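To put rough numbers on that, here's a minimal back-of-envelope sketch using the commonly published full-die SM and RT core counts (treat the figures as approximate and check them against official spec sheets): the RT-core-to-SM ratio has sat at 1:1 from Turing through Blackwell.

```python
# Back-of-envelope check: RT cores per SM across Nvidia generations.
# Figures are commonly published full-die counts, not cut-down retail SKUs;
# treat them as approximate.
full_die_specs = {
    "TU102 (Turing)":    {"sms": 72,  "rt_cores": 72},
    "GA102 (Ampere)":    {"sms": 84,  "rt_cores": 84},
    "AD102 (Ada)":       {"sms": 144, "rt_cores": 144},
    "GB202 (Blackwell)": {"sms": 192, "rt_cores": 192},
}

for die, spec in full_die_specs.items():
    ratio = spec["rt_cores"] / spec["sms"]
    print(f"{die}: {spec['rt_cores']} RT cores / {spec['sms']} SMs = {ratio:.1f} per SM")
```

Per-RT-core throughput has improved each generation, but the overall layout is still "one RT core bolted onto each SM", which is the ratio point being made above.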
> Close your eyes and tell me how much you see. Now reconsider your statement that light is "just a portion" of the world.

I worded it badly. I'm tired after work.
> There are multiple reasons for this.
> - Rasterisation has become incredibly good at simulating the real world, so good that the basic RT we currently have isn't able to outperform it visually. That's a consequence of literally decades of work on rasterisation, and far less on RT.

Possibly.

> - You've become used to how rasterisation simulates the real world, so games using rasterisation don't look "off" to you, even when their rendering is actually incorrect compared to the real world.

Why would you think that RT is always correct? I've seen it make errors. There was someone posting a screenshot in another thread not long ago of RT casting shadows that it logically shouldn't.

> - Conversely, your brain gets used to RT quickly, because the latter does such a good job of simulating the real world.

Then why does my brain get used to no RT equally quickly? My conclusion is that both raster and RT do a pretty good job these days, just differently.
> The 5080 will be the GPU to try and get, and if you want one, get in the limited-stock queue for the $999 Founders Edition directly from Nvidia, because all of the AIB models are being listed at $1200+.

So much markup on AIB cards! You could buy a whole GPU for that price. Total ripoff by the AIBs, total ripoff of consumers.
System Name | Tiny the White Yeti |
---|---|
Processor | 7800X3D |
Motherboard | MSI MAG Mortar b650m wifi |
Cooling | CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3 |
Memory | 32GB Corsair Vengeance 30CL6000 |
Video Card(s) | ASRock RX7900XT Phantom Gaming |
Storage | Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB |
Display(s) | Gigabyte G34QWC (3440x1440) |
Case | Lian Li A3 mATX White |
Audio Device(s) | Harman Kardon AVR137 + 2.1 |
Power Supply | EVGA Supernova G2 750W |
Mouse | Steelseries Aerox 5 |
Keyboard | Lenovo Thinkpad Trackpoint II |
VR HMD | HD 420 - Green Edition ;) |
Software | W11 IoT Enterprise LTSC |
Benchmark Scores | Over 9000 |
> Completely and utterly wrong. Real-time ray tracing is the future of graphics rendering, despite how many times you and others like you attempt to poo-poo it. Its uptake has simply been delayed by a number of factors:
> - It's a complete paradigm shift compared to rasterisation - you don't just need appropriate tools, but the appropriate mindset. Game developers who were born and raised on rasterisation are going to take time to get to grips with RT, and especially will have to unlearn a vast quantity of the stupid bulls**t hackery required to coerce rasterisation into rendering things somewhat realistically.
> - Game development is no longer about pushing the boundaries of technology, but making money. Even if developers want to implement RT, their managers aren't necessarily going to let them because of the extra training and development time, and thus cost. This creates inertia.
> - Hardware isn't quite powerful enough to handle it yet. You might say "then it shouldn't have been introduced", but you need to make game developers aware of and comfortable with a technology sooner rather than later.
> - Hardware isn't getting powerful enough at a fast enough rate to handle it. Unfortunately, RT was introduced just before we hit the Moore's Law wall, which is particularly important given how hardware-intensive RT is.
> RT has been the holy grail of graphics rendering forever. We may not yet be able to hold that grail, but we can at least touch it. If you'd suggested the latter would be a possibility to any computer graphics researcher a decade ago, they'd have laughed you out of the room - and yet here we are.
> You don't like RT, we get it, but stop allowing that irrational dislike to blind you to the fact that RT is, in every aspect, the future of realistic graphics rendering, superior to rasterisation in every conceivable way. In another decade, the only conversation about the latter will be in relation to graphics from before the RT era.

It's the future, but not the way it is pushed today. You've pointed out the issues, but the introduction of RT "just before we hit the Moore's Law wall" is a business strategy Nvidia deployed knowing there is an insurmountable challenge to be met. It's a fantastic business model: you can never solve this issue in real time; GPUs will never be fast enough to brute-force everything. Remarkably similar to AI.
> It's the future, but not the way it is pushed today. You've pointed out the issues, but the introduction of RT "just before we hit the Moore's Law wall" is a business strategy Nvidia deployed knowing there is an insurmountable challenge to be met. It's a fantastic business model: you can never solve this issue in real time; GPUs will never be fast enough to brute-force everything.

Yet, here we are, brute-forcing our way into everything for four Nvidia generations straight. The architecture doesn't change much; we just get more parts crammed into a tighter space.
> Yet, here we are, brute-forcing our way into everything for four Nvidia generations straight. The architecture doesn't change much; we just get more parts crammed into a tighter space.

Yeah we are... with abysmal performance and artifacting everywhere. Wooptiedoo.
If RT really was the future, then I'd like to see some indication that we're moving towards more RT-oriented architectures. But for now, RT cores are still just an add-on, and not really improved, either.
> It does look awesome in places. I just think you need a director's touch. Reminds me of procedural generation in that aspect.

It does, and hey... isn't that director's touch exactly the same touch you wanted on that archaic lighting in all those awesome games we already have?
Processor | Core i7-12700 |
---|---|
Motherboard | MSI B660 MAG Mortar |
Cooling | Noctua NH-D15 |
Memory | G.Skill Ripjaws V 64GB (4x16) DDR4-3600 CL16 @ 3466 MT/s |
Video Card(s) | AMD RX 6800 |
Storage | Too many to list, lol |
Display(s) | Gigabyte M27Q |
Case | Fractal Design Define R5 |
Power Supply | Corsair RM750x |
Mouse | Too many to list, lol |
Keyboard | Keychron low profile |
Software | Fedora, Mint |
> The 5090 is a 750mm2 GPU - it hits 29 FPS in PT Cyberpunk.

Yep. I'm a broken record on this, but I must point out that Cyberpunk's path tracing is also limited to two rays and two bounces.
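To give a feel for why even that small ray/bounce budget is so heavy, here's a minimal, purely illustrative back-of-envelope sketch (not Cyberpunk's actual renderer) of how per-frame ray counts scale with resolution, rays per pixel and bounce depth:

```python
# Rough ray-budget arithmetic for a path-traced frame.
# Purely illustrative; real renderers reuse, sort and denoise rays far more
# cleverly than this naive estimate suggests.
width, height = 3840, 2160     # 4K output
rays_per_pixel = 2             # primary samples per pixel
max_bounces = 2                # secondary bounces per sample

# Each sample costs one primary ray plus up to max_bounces secondary rays.
rays_per_frame = width * height * rays_per_pixel * (1 + max_bounces)
print(f"~{rays_per_frame / 1e6:.0f} million rays per frame")

target_fps = 60
print(f"~{rays_per_frame * target_fps / 1e9:.1f} billion rays per second at {target_fps} FPS")
```

Even with that deliberately low budget, the numbers land in the billions of rays per second before any denoising, which goes some way to explaining both the 29 FPS figure and why upscaling gets leaned on so hard.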
> It does, and hey... isn't that director's touch exactly the same touch you wanted on that archaic lighting in all those awesome games we already have?

I don't think Star Citizen uses ray tracing, but I think they blend static and dynamic lighting together well, along with procedural and more tailored environments. They are probably a good example of how unfeasible it is to master your own engine these days, although I did hear mismanagement plays its part. Look at scientific progress, disregarding the lone antivax geniuses. How did Carmack's foray into AI go?! Speaking of which, haven't "they" started to spit out PhD-level theses? DOGE comes to mind. It would certainly be nice if progress translated into gameplay / more interactive environments. I mentioned that Black Parade mod for Thief earlier; compare that to Thief 4. I've also read that Unreal Engine 5.4 is an improvement in many ways, although I'm not sure how much there is beneath the veneer in that Matrix demo. Apt that it's set in New York.
That's the point. Shitty devs aren't going to be any less shitty because they can optimize a workflow now. They're just going to have a lower budget to work with. It's the same thing @BSim500 just pointed out, and I did too in another post elsewhere: those hours saved on doing lighting aren't going to be spent elsewhere. They're going to be cut. I have yet to see the numbers for both approaches, too. Is it really faster, really cheaper? Or will you never really master your own engine and game that way, and never develop the same efficiency yourself? We're already seeing that happen in front of us with the stream of UE-based games that run like absolute horse manure and don't even look good doing so. The gameplay is often nothing to write home about either. But yeah, they managed to release a game. Yay. They also managed to pass part of their bill on to our GPUs.
> The thing I always come back to is the analogy to movies. The main reason that movies are so much more expensive (and more importantly, the main reason they tend to look so much more expensive) than traditional television is the lighting. In traditional TV, you don't have time to mess around with the lighting; you just film the scene on a set with static lighting and call it done. In movies, by contrast, lighting is meticulously micromanaged, sometimes altered several times in the same scene.

The whole trick of using lights to give the illusion of windows in reflections sprang to mind!
> No one in his right mind enabled "Volumetric Clouds" in e.g. AC Odyssey. RT's in a similar spot, not always, but most of the time. The main difference is hype.

They sure look pretty in Flight Sims though, and Star Citizen!
> They sure look pretty in Flight Sims though, and Star Citizen!

Right, and I guess that's part of the point too. Volumetric clouds aren't always bad, but in that particular game, the highest setting incurred something like a 40% perf penalty, IIRC even indoors, and the visual difference, even if you turned your camera up to stare at the sky, was undetectable. Odyssey's Volumetric Clouds are an extreme example, but most AAA games have at least one or two expensive-but-pointless settings, which is why running at stock Ultra is widely considered dumb.
System Name | Unimatrix |
---|---|
Processor | Intel i9-9900K @ 5.0GHz |
Motherboard | ASRock Z390 Taichi Ultimate |
Cooling | Custom Loop |
Memory | 32GB GSkill TridentZ RGB DDR4 @ 3400MHz 14-14-14-32 |
Video Card(s) | EVGA 2080 with Heatkiller Water Block |
Storage | 2x Samsung 960 Pro 512GB M.2 SSD in RAID 0, 1x WD Blue 1TB M.2 SSD |
Display(s) | Alienware 34" Ultrawide 3440x1440 |
Case | CoolerMaster P500M Mesh |
Power Supply | Seasonic Prime Titanium 850W |
Keyboard | Corsair K75 |
Benchmark Scores | Really Really High |
> If that's the case, then why do we have the same ratio of raster vs RT hardware even on Nvidia GPUs since Turing? If RT is the way to go, then surely we should be seeing some indication of at least Nvidia investing in it more heavily than in raster and/or general performance, right? This isn't a form of disagreement, more like a genuine question.
> RT only simulates lights and shadows as far as I know. They're just a portion of the world around us, not the entirety of it.

Nvidia is investing tons in RT. It's people poo-pooing it: "it's so hard", "performance sucks", etc., etc. The minute Nvidia doesn't invest as much in raster and general performance (like in the 50xx) and develops new tech to improve performance instead, people complain that it's "fake frames", blah blah blah. Frame generation was designed so that more people can run RT.
System Name | Main PC |
---|---|
Processor | 13700k |
Motherboard | Asrock Z690 Steel Legend D4 - Bios 13.02 |
Cooling | Noctua NH-D15S |
Memory | 32 Gig 3200CL14 |
Video Card(s) | 4080 RTX SUPER FE 16G |
Storage | 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red |
Display(s) | LG 27GL850 |
Case | Fractal Define R4 |
Audio Device(s) | Soundblaster AE-9 |
Power Supply | Antec HCG 750 Gold |
Software | Windows 10 21H2 LTSC |
> It's the future, but not the way it is pushed today. You've pointed out the issues, but the introduction of RT "just before we hit the Moore's Law wall" is a business strategy Nvidia deployed knowing there is an insurmountable challenge to be met. It's a fantastic business model: you can never solve this issue in real time; GPUs will never be fast enough to brute-force everything. Remarkably similar to AI.

Also, that futuristic Turing generation runs like crap on modern RT as well; in the 5080 review, only the 2080 Ti made it onto the graph (near the bottom).
I have no issues with RT. As pointed out, it's already actively being used in lots of games. I hate doing it in real time, on an entire scene, introducing an ungodly amount of latency, and I also hate paying excessive money for it like we see today. The 5090 is a 750mm2 GPU - it hits 29 FPS in PT Cyberpunk. And the cost of that 750mm2 GPU isn't going down either. The gap's just too large, and as long as upscaling isn't perfect (and it's not), it will remain so. We can be all happy about DLSS4 now, but the latency is here to stay regardless.
So far, the overall situation and the deal I'm being offered still look unconvincing, and more like an Nvidia clusterfuck than anything else. Not convinced. Not buying into it.
It's a similar thing to me as VR. Sure, there are some niche situations where it really makes a dent (especially if you run into your TV)... but it's not viable economically yet. You require an expensive headset (that's not perfect either), higher FPS and thus more GPU, and a special suite of games. It hasn't taken off, and it won't, with that set of conditions. Now, for RT, you need an expensive GPU (that's not going to last either, and effectively already struggles from day one), you need upscaling to get playable FPS, and you need a special suite of games. See the similarities?
Now, some reflection on the beginning of this circus:
Back when SIGGRAPH happened and Huang told us this was the future, and Turing launched shortly after... a lot of people shared the idea that this could take 2-3 generations before it actually took off, and 10 years for the real change. Where are we now? Three generations and six years later... 95%+ of all games are still built entirely on a non-RT framework. So we have four years left for that paradigm shift. I think it's safe to add another six on top.
> Also, that futuristic Turing generation runs like crap on modern RT as well; in the 5080 review, only the 2080 Ti made it onto the graph (near the bottom).

It's funny how a highly efficient, tried and tested approach is now called "a hack". Way to downplay decades of refinement that runs well and looks good on hardware costing a fraction of what you need today.

Also, for me, visuals got good enough during the PS4's lifetime. I don't care if hacks were used to get there; that's not my concern as a consumer. I don't play games for realism, I play them to escape from it.
> Nvidia is investing tons in RT. It's people poo-pooing it: "it's so hard", "performance sucks", etc., etc. The minute Nvidia doesn't invest as much in raster and general performance (like in the 50xx) and develops new tech to improve performance instead, people complain that it's "fake frames", blah blah blah. Frame generation was designed so that more people can run RT.

But at what cost? Terrible latency, artefacts and other visual glitches? No, thanks.
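On the latency point, here's a rough, hypothetical sketch of where the extra delay comes from with interpolation-based frame generation (numbers invented for illustration; real pipelines add Reflex, render queues and generation overhead on top):

```python
# Illustrative latency arithmetic for interpolation-based frame generation.
# The figures below are made up for the example.
base_fps = 40                   # what the GPU actually renders
render_ms = 1000 / base_fps     # 25 ms per real frame

# To interpolate between frame N and N+1, frame N+1 must already exist,
# so presenting frame N is held back by roughly one render interval.
added_latency_ms = render_ms
displayed_fps = base_fps * 2    # one generated frame per real frame

print(f"Displayed: ~{displayed_fps} FPS, but input-to-photon latency grows by roughly {added_latency_ms:.0f} ms")
```

The smoothness goes up, but responsiveness stays tied to the underlying render rate, which is roughly what the "fake frames" complaint boils down to.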
> But at what cost? Terrible latency, artefacts and other visual glitches? No, thanks.

Hold that thought.
> I've just remembered the biggest reason that AMD are failing hard in the market after watching/reading a bunch of 5080 reviews - it reminded me of a couple of conversations I'd had with people over the last couple of years.
> When AMD waits until after Nvidia's launch, the POPULAR, HEAVILY-SEARCHED, IMPORTANT coverage of the new Nvidia cards *doesn't* include AMD's answer.
> For the next 18-month product cycle, people googling for "RTX 5080" are going to read or watch today's reviews. That's right, the "$1000" 5080 is better than the 7900XTX. "WTF is a 9070, man? It's not even on the charts!"
> Where is the 9070XT? Nowhere in sight:
> Zero exposure. Zero coverage in the first-impressions review cycle. Zero recommendations. Zero inclusion in the discussion.
> The 5080 will feature in 9070XT reviews, but the potentially abysmal price/performance compared to what AMD keep promising they'll deliver won't matter, because the people who search for RTX 5080 will never see those reviews.

There does seem to be a lot of negative sentiment toward Nvidia at the moment.
> Hold that thought.
> There does seem to be a lot of negative sentiment toward Nvidia at the moment.

If you geared that comment towards me, then no. I'm not negative against Nvidia. I'm negative against a technology that doesn't work. I hate the AMD and Nvidia versions of it equally.
"You come for the king"...
> If you geared that comment towards me, then no. I'm not negative against Nvidia. I'm negative against a technology that doesn't work. I hate the AMD and Nvidia versions of it equally.

Hold that thought as in you might be pleasantly surprised by the improvements in FSR etc. The negativity aspect was mainly about AMD having to release a decent product across the board; the negativity toward Nvidia at the moment, not necessarily gaming related, might give them an opportunity to win people over. I'm sure people have argued that it'll have to be better than decent given perception (like renewable uptake, if we're going on tangents).
> Hold that thought as in you might be pleasantly surprised by the improvements in FSR etc. The negativity aspect was mainly about AMD having to release a decent product across the board; the negativity toward Nvidia at the moment, not necessarily gaming related, might give them an opportunity to win people over. I'm sure people have argued that it'll have to be better than decent given perception (like renewable uptake, if we're going on tangents).

Well, AMD f*ed up their marketing (surprise, surprise), while Nvidia just launched the 40-series again under a new name with an overpriced power hog on top. Let's see which blunder ends up being bigger.
> Well, AMD f*ed up their marketing (surprise, surprise), while Nvidia just launched the 40-series again under a new name with an overpriced power hog on top. Let's see which blunder ends up being bigger.

Easy to say as we pick one up off the shelf (or click a button). If people could see germs...

Negativity is through the roof in both camps, that's for sure.