
NVIDIA Stares at Sales Ban as US-ITC Rules in Samsung's Favor in Patent Dispute

I don't think they lost, as the DX12 race hasn't even begun. Pascal will be out before Arctic Islands, and if history plays any favorites, I have a feeling we all know who will win in the performance segment. On that note, I don't expect any true DX12 games that properly utilize the API to be released next year, only ones that partially utilize it, like we had in the beginnings of DX11. I hope I'm wrong, but only time will tell.
NVIDIA is almost certainly going to TSMC 16 nm where AMD is going to GlobalFoundries/Samsung 14 nm. Even if NVIDIA maintains its architectural performance edge (it probably will), AMD may be able to pack in more transistors for the same price, erasing that edge. Not to mention, a huge advantage of DirectX 12 is async compute shaders, and if NVIDIA doesn't make big changes in that area with Pascal, AMD will be way ahead wherever async compute is used (it erases the difference in frame time).


Side note: maybe this lawsuit is why NVIDIA isn't considering Samsung fabs.
 
I doubt it, but we have to stay optimistic for AMD, right?

Maxwell V2 doesn't have that many async queues because the target was DX11 performance. It'd be pretty stupid of Nvidia not to fix that with Pascal. Nvidia has been ahead of the game performance-wise for a while, and they should be able to see that developers are jumping onto DX12 like a fat kid on cupcakes. If not, it's their loss, and I'll have AMD GPUs.
 

Nvidia has already stated that the way they kept energy consumption and heat lower with Maxwell was by leaving out async hardware, since it wasn't needed for DX11. They knew what they were doing and have stated it will be back with Pascal. This was deliberate; check my post earlier in the thread. They did things this way because they had the money to do it.
 
Oh, I already know. I've already dabbled in conversations about it; even though it's not "enterprise," I still ask questions about the consumer side at work. Although I'm cleared for NDA material, they still don't say much, just "we're prepared" and such. They didn't like it when I told them I was disappointed by Titan X performance.
 
GCN already does async compute correctly where Maxwell cuts corners. Assuming Pascal fixes that, the mere implementation of the fix may lengthen Pascal's frame times to be similar to AMD's. Async compute has overhead that AMD has carried for years while NVIDIA has dodged it to this day.

Async compute is something that will become commonplace over the next few years because it vastly improves GPU resource utilization (very important for consoles and tablets).
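To illustrate why overlapping graphics and compute work improves utilization, here's a toy frame-time model. The workload numbers and the flat scheduling-overhead cost are made up for illustration; this is a back-of-the-envelope sketch, not real GPU scheduling:

```python
# Toy model: per-frame graphics and compute workloads, in milliseconds.
# Serial execution runs them back to back; async compute runs them on
# independent hardware queues, so the frame takes roughly the longer of
# the two plus some queue-management overhead (all numbers hypothetical).

GRAPHICS_MS = 10.0       # hypothetical graphics workload per frame
COMPUTE_MS = 4.0         # hypothetical compute workload per frame
ASYNC_OVERHEAD_MS = 0.5  # hypothetical cost of managing extra queues

def serial_frame_time(gfx, comp):
    """Graphics and compute run one after the other."""
    return gfx + comp

def async_frame_time(gfx, comp, overhead):
    """Graphics and compute overlap; the frame is bound by the longer one."""
    return max(gfx, comp) + overhead

serial = serial_frame_time(GRAPHICS_MS, COMPUTE_MS)
overlapped = async_frame_time(GRAPHICS_MS, COMPUTE_MS, ASYNC_OVERHEAD_MS)
print(f"serial: {serial} ms/frame, async: {overlapped} ms/frame")
# serial: 14.0 ms/frame, async: 10.5 ms/frame
```

The model also shows the trade-off discussed above: when there is no compute work to overlap (pure DX11-style rendering), the async path is all overhead and no win, which is consistent with skipping that hardware for a DX11-focused chip.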
 
NVIDIA doesn't guess. They use their money to control it. AMD tries to innovate, but the other, more powerful players pull the strings.

I'm sorry, but let's be real here. I'll defend AMD and Nvidia both, when they deserve a defense. Somehow linking a proposed ban on Nvidia products, because the lawyers got overzealous and tried to use patent law to retard the development of their "competition," to AMD being the underdogs whose "innovation" is lost is 100% insane fanboy territory.

If this were actually the case, I'd like you to answer a few questions.
1) How's Vulkan going...? Oh wait, that was subsumed by DX12.
2) How's the whole HBM thing going? Oh wait, the technology is only present on insanely high-end cards and is limited in capacity. That means AMD's current "innovation" is a memory technology less useful and more costly than the GDDR5 available everywhere else. It has potential, but potential innovations aren't exactly quantifiable.
3) How does any of this relate to Nvidia? If Nvidia supposedly controlled everything, why not just murder AMD? That level of control would mean Nvidia shouldn't need to fear the FTC.


I'm sorry, but this is fanboy territory. You've gone out of your way to focus on AMD in an article about Nvidia. Either you're a troll, a fanboy, or incapable of reading the thread's headline. As you seem able to respond, the third option is out. Telling which of the first two you are is impossible, so I'll go with the low-hanging fruit of the fanboy. I'd be glad if you could prove otherwise and get back to the discussion about Nvidia and Samsung.
 
They didn't cut corners; they didn't include what wasn't needed. Do we have any on-the-shelf games that utilize it fully? No? Then why put it in when the target of Maxwell V2 was DX11? Nvidia's async performance is great when the queue length is proper for what the chip can handle; the Ashes thread had plenty of examples showing this. I fully expect Pascal to have more queueing for async for future games that actually utilize the technology.

As an owner of three Titan Xs, I really don't care about async compute when there's nothing that uses it currently. By the time it's around and used, we will have Pascal, or maybe a better-performing AMD chip. I can't speak for others, but I have no issue jumping ship if AMD is the clear winner in both arenas at the time (DX11 and DX12).

I don't know about you, but async is a bit off topic, and this thread isn't really the place to raw dog Nvidia on their architecture decision-making (which seemed pretty spot on to me), but rather raw dog them on poking the fire and getting burned :laugh:
 
And THAT they certainly did! It's like a mouse attacking a lion.
 
I forgo politically correct terms, and instead use the correct political terms... Corporate Raiders, people! Don't expect them to respect their customers anytime soon....
 
You've not paid attention. Read through the thread. By that philosophy of yours, one should expect the same treatment from Samsung. It doesn't jibe.
 
The problem with that is it often takes that long to bring the idea to market.

You have a point. But I am very, very, very tired of being kept back (as a market, as a species) by greedy, shortsighted assholes. Do you think Musk is stupid to freely share many of Tesla's patents? That bullshit argument led to 3D printing being out of reach for 25 years. Stop and think for a second where we would be if we had had that tech available since the '70s...

PS: I'm waiting for the moment the Nassholes cross Intel's path. It will be so epic to see them get a taste of their own medicine.
 