| System Name | Meh |
|---|---|
| Processor | 7800X3D |
| Motherboard | MSI X670E Tomahawk |
| Cooling | Thermalright Phantom Spirit |
| Memory | 32GB G.Skill @ 6000/CL30 |
| Video Card(s) | Gainward RTX 4090 Phantom / Undervolt + OC |
| Storage | Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server |
| Display(s) | 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR |
| Case | Fractal Design North XL |
| Audio Device(s) | FiiO DAC |
| Power Supply | Corsair RM1000x / Native 12VHPWR |
| Mouse | Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro |
| Keyboard | Corsair K60 Pro / MX Low Profile Speed |
| Software | Windows 10 Pro x64 |
Do you have access to a review, or have a CPU in hand? Otherwise you're making baseless assumptions. We have no idea how these CPUs will perform. They'll likely be faster, but by how much, and in which scenarios? Without knowing this, speculation is useless.
Good for you? Most people don't play PC games on their TVs, though that is definitely gaining popularity. And as I said, DLSS is great, and I think smart upscaling tech is a major part of the future of real-time 3D graphics. But closed-off, single-vendor stuff doesn't tend to last.
Poorly executed amateur injections don't say anything about the quality of FSR; they say something about the skill and/or methods of those implementing it. I, at least, expect game developers to be reasonably competent. And I never said DLSS wasn't superior - it tends to win out slightly in quality, framerate, or both. But that ultimately doesn't matter if FSR is more broadly adopted due to ease of implementation or other factors. We'll see how this plays out.
Yes, but who has said that here? Nobody. So, can you please stop with the straw man arguments?
There are still plenty of people buying reasonably high-end GPUs for 1080p gaming - those with 360 Hz displays, for example. That's also a small group, but I'd wager it's bigger than the group with 2160p120/144 displays. I don't think that's reasonable either, but then my tastes aren't universal. I would assume yours aren't either - that's generally not how tastes work, after all.
They're going to have access to the 5nm node when they have products ready for it - fab orders are placed well in advance, after all, and products are taped out long before they are put into mass production.
Lolwut? The 6600 XT outperforms the 3060 in pretty much everything, and at MSRP it's very evenly matched with the Ti (a bit cheaper and a bit weaker). The 6700 XT is a bit of an odd duck in terms of value, that's true - the 6800 makes it look bad, and IMO it should have been a $399 card. But I don't control that, so ... *shrug*
As for who sold better - so what? Nvidia has a huge market share and mindshare advantage. Given their rough 80/20 market split, it would be shocking if the 3060/3060 Ti didn't outsell similarly priced AMD GPUs by ~4x. That's what the market share tells us to expect, after all.
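That ~4x figure is just the share ratio itself; a minimal back-of-the-envelope sketch in Python, assuming a clean 80/20 discrete-GPU split (which is only approximate):

```python
# Back-of-the-envelope: with an ~80/20 Nvidia/AMD discrete GPU split and
# similar buying behaviour per vendor, the expected sales ratio is simply
# the ratio of the shares.
nvidia_share = 0.80  # assumed approximate market share
amd_share = 0.20

print(f"Expected sales ratio: ~{nvidia_share / amd_share:.0f}x")  # -> ~4x
```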
Well, that's the mindshare thing I mentioned. AMD is struggling with an undeserved bad image around driver issues and a lack of features - their current drivers are no more buggy than Nvidia's, and they generally have a wider featureset, though the 5700 XT did have some serious issues that marred a years-long track record of well-maintained AMD drivers. That image takes a very long time to rectify, especially when you're a much smaller player in the market. Going for value/price is one way forward, but also one that risks reinforcing that stigma among users ("Nvidia is best, AMD is cheap"), which is ultimately counterproductive. They're clearly playing a long game to position themselves as Nvidia's equal, which they largely are despite their much smaller size and budgets. IMO, that's the only way to grow or even maintain their share in a competitive market. Positioning yourself as a cheaper also-ran is not a recipe for success.

How Intel will play into this, I have absolutely no idea - that depends on how their GPUs perform, how bad their drivers and support are, and how much they're willing to put into marketing. I'm guessing it will start with a combined bombardment of ads plus terrible drivers, but how it develops from there is anyone's guess.
AMD's next consumer CPUs will be Zen3 with V-cache. For AM4. This is not news. AMD confirmed this months ago.
If you buy a 6900 XT or 3080 Ti/3090 for 1080p gaming, even though you are running 360 Hz, then you are clueless about how gaming PCs work. You will be CPU (and somewhat RAM) bound anyway, and a card like a 6700 XT/3060 Ti will deliver pretty much the same performance in the esports titles that people buying these monitors play anyway. Going 1080p/360 Hz for maxing out AAA games is downright stupid, because the roughly 80% pixel increase going to 1440p will deliver far more impressive visuals while still delivering very high fps.
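Just to show where that "roughly 80%" number comes from, here's a quick arithmetic check in Python (standard 1920x1080 vs 2560x1440 resolutions, nothing more assumed):

```python
# Pixel counts for the two standard resolutions being compared.
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_1440p = 2560 * 1440   # 3,686,400

increase = (pixels_1440p / pixels_1080p - 1) * 100
print(f"1440p renders {increase:.0f}% more pixels than 1080p")  # -> ~78%, i.e. roughly 80%
```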
Most "pro gamers" are NOT using high-end GPUs for 1080p and below. Games like CS:GO require pretty much nothing from the GPU. Streamers generally use higher-end cards, but they tend to run 1440p and above too.
I'm running 1440p at 240 Hz and I'm CPU bound in most cases too, unless I'm maxing out demanding AAA games. Even in Warzone I'm pretty much at 200+ fps at all times, with GPU usage dropping. BF2042 later today should probably require more - I still expect 120 fps minimum, but I will aim for 200+ as usual, using custom settings while retaining ~95% of Ultra image quality. The game has DLSS though, so that will be tested.
There have been plenty of Alder Lake leaks showing big leaps in several benchmarks. I don't deny them, unlike some people. I'm not interested in buying Alder Lake at all, though (maybe for a laptop, not for a desktop).