> It is. And it's, in my opinion, currently some $250 overpriced. That'd be how much I value Nvidia's driver support and ecosystem features, $250.

What driver support exactly? That they support their cards 10 years instead of 8 years from AMD?
Pretty control panels don't make a quality KMD. Much less a feature complete one. Has AMD already implemented DX11 driver command lists?
Feature complete it is for AMD. Nvidia is playing catchup with the app.
Most games released now are DX12. Why devote resources to DX11?
> I expect a better control panel when the leather jacket man charges a premium for it. Nvidia claims they're a software company, after all.

Exactly. For a "software company", their consumer-facing software sure sucks.
AMD has their share of screw-ups, but the mindshare effect works because reviewers are always finding reasons to criticize AMD.
> And how are we completely sure it's gonna be just $500?

Nothing is ever certain in life except death and taxes, but it makes sense. Since RDNA4's high end (meaning the $899 and $999 cards) was canned, that leaves only the $500-range 8700/8800 series.
> To leave the dGPU market. Have you noticed that they haven't released new software since October?

One month ago. The release was at the end of October. Is there something not working that they need to release new drivers NOW?
That is two months ago!
> Now, if Radeon 8000 turns out to be a flop and the sales do not improve, their market share will go under 5%, maybe even 2-3%. In which case they will no longer have money for R&D, and they will stop all dGPU projects, if there currently are any.

Ah yes. How many times have the doomsayers been predicting AMD's downfall? I'm still waiting. When Intel entered the market, people were claiming that AMD would quickly fall to under 5%, and yet here we are years later with Intel at 0%.
Tell me, what was the last AMD GPU flop? I think it might have been either Fiji, due to its 4GB, or the Radeon VII. That was more than six years ago.
> People don't play games with a frame counter visible. While it is essential, don't expect people out there to be looking at how to secure a 60+ fps frame rate. Even 20-30fps will look like smooth gameplay to many out there, and if you ask them, they wouldn't know what framerate they have. Don't forget that consoles many times target 30 fps, not 60.

Consoles are fast moving to 60fps. Even Mark Cerny said that he was surprised at this development, but it makes sense. Standards have risen. Once console players got a taste of 60fps, they were bound to reject 30fps even if it came with better visuals. On PC the minimum acceptable framerate has risen even higher. I now see many people saying 90 is their minimum. 144Hz monitors are very cheap now, so it makes perfect sense.
> Also, someone having paid $2000 would want to see that the $2000 graphics card can be 3-4 times faster than the $1000 model from the other company. And for that person, 60fps will be more than enough.

A buyer who spends $2000 on a card will not be satisfied with 60fps even if it's 4x faster than the competition. Why would this buyer care about the competition at that point? They already bought the card. Now they want to enjoy maxed-out, high-refresh-rate gaming, not some 60fps slog in a tech demo that's nice to look at for a while but gets boring really fast.
> You tell me that they wouldn't lower prices? They will. Even in gaming, the fact that they lowered prices and released the Super models when AMD's prices got too low shows that they will react.

GDDR7 will not be cheap, and Nvidia will use it across the board on the 50 series, from only one supplier. Super models are a scam: slightly lower prices for a minuscule performance improvement, enticing people to upgrade. In reality it's one step back from a two-steps-forward situation. But sure, you wait for your lower Blackwell prices. I reckon you'll be waiting a while...
> That was literally never the case. I had a bunch of high-end cards over the last 20 years. They all struggled to play the demanding games of their time. Watch Dogs 2 was dropping to 20 frames at 1080p on my brand new 1080 Ti. What are you talking about, man?

Yes it was. Well before this whole RT thing came about and tanked performance on even the fastest cards to near-unacceptable levels...
I really don't know how you managed 20fps in WD2 on a 1080 Ti, as TPU's review shows it achieving 40+ even at 4K, and this was in 2017, when 4K was much rarer than today. Unless you ran at 8K, or at 4K with a really weak CPU, it makes little sense. https://www.techpowerup.com/review/nvidia-geforce-gtx-1080-ti/27.html