
Editorial: NVIDIA's 20-series Could be Segregated via Lack of RTX Capabilities in Lower-tier Cards

Too bad that you're practically selling your soul by using nVidia's closed-source drivers. I'm still waiting for them to release their firmware so nouveau can suck a little less, but obviously nVidia has no intention of making the open source community happy.
Aah, that same old BS. First of all, AMD's "open" driver still relies on proprietary firmware, and it's a bloated non-native "Gallium" driver. Secondly, nearly everyone who complains about Nvidia's "closed" driver doesn't use the claimed "open" driver from AMD anyway.
 
No...?
It's been 10 years since I stopped overclocking, modding, unlocking and so on. It was fun when I was a teenager, but it really seems to have been a waste of time. I should have just spent more time outside or learned German instead. :)

No. "Gaming" cards are not altered "pro" cards. It's just a chip which can do multiple things. Firmware and drivers tell him what to do.
The reason why gaming chips have ECC functionality included (but blocked) is fairly simple: there's nothing special about ECC. Pretty much all CPUs and GPUs sold today are ECC-compliant.
Manufacturers are using ECC as a factor in lineup segmentation, which really makes sense.
It's not like ECC is crucial for what most people do with their PCs and ECC RAM is more expensive.
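
If you want to see how little magic there is, here's a toy Hamming(7,4) example in C - obviously nothing like how a real memory controller or GPU wires its ECC, just an illustration of the same idea: a few extra parity bits let you locate and flip a single corrupted bit.
[CODE]
/* Toy illustration only (made-up example, not how any real GPU or memory
 * controller implements ECC): Hamming(7,4). Three parity bits protect four
 * data bits and let you locate and flip any single corrupted bit. */
#include <stdio.h>

static unsigned get_bit(unsigned word, unsigned pos)   /* pos is 1-based */
{
    return (word >> (pos - 1)) & 1;
}

/* Pack 4 data bits into a 7-bit codeword: parity at positions 1,2,4,
 * data at positions 3,5,6,7. */
static unsigned encode(unsigned data)
{
    unsigned d1 = data & 1, d2 = (data >> 1) & 1,
             d3 = (data >> 2) & 1, d4 = (data >> 3) & 1;
    unsigned p1 = d1 ^ d2 ^ d4;     /* covers positions 3,5,7 */
    unsigned p2 = d1 ^ d3 ^ d4;     /* covers positions 3,6,7 */
    unsigned p3 = d2 ^ d3 ^ d4;     /* covers positions 5,6,7 */
    return p1 | (p2 << 1) | (d1 << 2) | (p3 << 3) |
           (d2 << 4) | (d3 << 5) | (d4 << 6);
}

/* Recompute the parities; a non-zero syndrome is the 1-based position
 * of the flipped bit. */
static unsigned correct(unsigned c)
{
    unsigned s1 = get_bit(c,1) ^ get_bit(c,3) ^ get_bit(c,5) ^ get_bit(c,7);
    unsigned s2 = get_bit(c,2) ^ get_bit(c,3) ^ get_bit(c,6) ^ get_bit(c,7);
    unsigned s3 = get_bit(c,4) ^ get_bit(c,5) ^ get_bit(c,6) ^ get_bit(c,7);
    unsigned syndrome = s1 | (s2 << 1) | (s3 << 2);
    return syndrome ? c ^ (1u << (syndrome - 1)) : c;
}

int main(void)
{
    unsigned sent = encode(0xB);            /* data bits 1011 */
    unsigned got  = sent ^ (1u << 5);       /* simulate one bit flipped "in memory" */
    printf("sent 0x%02x, corrupted 0x%02x, corrected 0x%02x\n",
           sent, got, correct(got));
    return 0;
}
[/CODE]
The real thing is just wider (typically 64 data bits plus 8 check bits per word, SECDED), so the logic cost is small - which is exactly why it's a segmentation lever rather than a technical challenge.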

Sure, we could suddenly make all computers use ECC.
But sooner or later someone would notice that consumers are paying a premium for a feature they don't need, so it must be a conspiracy by chip makers! Just look how people reacted to RTX. :)
And the overclocking crowd would soon decide that they miss the non-ECC times, because they could OC higher. :)
So you're happy with Nvidia moving their ethos towards Apple? I get the ECC thing, but then security happened and at the minute CPU vendors are having a nightmare, which is temporarily theirs to deal with. The general compute abilities of GPUs are being targeted in the same way and could develop vulnerabilities, especially without ECC.
That still wouldn't bother most people, but it needs more consideration going forward.

Oh, and that was 20 years ago, not 10. ;)
 
No...?
It's been 10 years since I stopped overclocking, modding, unlocking and so on. It was fun when I was a teenager, but it really seems to have been a waste of time. I should have just spent more time outside or learned German instead. :)

To be fair, it's much easier than back then too.

English is the only German I'm willing to know.
 
English is the only German I'm willing to know.
Off topic, but it’s refreshing to see someone who knows that English is a Germanic language. :)
 
Off topic, but it’s refreshing to see someone who knows that English is a Germanic language. :)

Ah, I only know because my other hobby is early literature. Although I suppose English is the least Germanic of the Germanic languages.

I should add that I'm not overclocking this 7820X I own yet. I'm happy with Intel's turbo boost these days. And I haven't touched the Vega. Tbh, I don't know a thing about it... I heard they should be undervolted.
 
To be fair, it's much easier than back then too.
IMO it's a lot more pointless as well.
I don't see the point of OC anymore (both CPU and GPU) with all the automatic stuff going on. Of course I'm talking about performance, not just the fun of the process.
As for unlocking GPUs and so on... I don't think there's literally a single purely software feature of Quadro that I miss.
But sure: if it were possible to buy a non-RTX card and unlock the tensor / RT cores, I would at least meditate on the idea.
English is the only German I'm willing to know.
I guess people in Europe have a different perspective. ;-)
And personally, since I've been working for 2 Austrian companies for the last 7 years and have now moved to a German one... knowing the language wouldn't hurt, for sure. :-)
 
Aah, that same old BS. First of all, AMD's "open" driver still relies on proprietary firmware, and it's a bloated non-native "Gallium" driver. Secondly, nearly everyone who complains about Nvidia's "closed" driver doesn't use the claimed "open" driver from AMD anyway.
At least AMD provides their firmware so open source drivers can actually be developed, and why would you want to redo that work? You end up with nouveau, which runs like crap as a result and can't really do anything more useful than boot your machine so you can install drivers that are 100% proprietary. You can't even get nVidia's firmware to try to make a driver. People are left having to figure everything out for themselves because nVidia doesn't provide anything: no documentation, no white papers, no firmware. AMD provides firmware and documentation, which is a heck of a lot more than nVidia provides. You literally can't compare the two on this front.

I am using AMDGPU (not Pro), and the only bit of proprietary code is the firmware, but if I have a problem I can crack open the kernel source and investigate. I can also use the latest kernel as a result, which counts for something too. There's a reason Linus said that nVidia is the single worst company they've ever had to deal with, and it's not because they're willing to help or share. So saying that this is "the same old BS" is itself BS, because you're basically equating AMD, which is, let's say, 10% proprietary (and that's generous, on top of the fact that AMD provides documentation), with nVidia, which is 100% proprietary.

...and I do use the open source AMDGPU (not Pro) driver.
 
Aah, that same old BS. First of all, AMD's "open" driver still relies on proprietary firmware, and it's a bloated non-native "Gallium" driver. Secondly, nearly everyone who complains about Nvidia's "closed" driver doesn't use the claimed "open" driver from AMD anyway.
Have you ever used nouveau?
That bloated non-native Gallium AMD driver is better than the proprietary Windows one.
 
People are left having to figure everything out for themselves because nVidia doesn't provide anything: no documentation, no white papers, no firmware. AMD provides firmware and documentation, which is a heck of a lot more than nVidia provides.
And yet Nvidia rules the GPU market and most games are better optimized for their chips. Not to mention CUDA, which is the de facto standard GPGPU API.

What's your problem with proprietary software? Have you ever considered the possibility that it might be the better way to go? :-)
I don't get this obsession with everything having to be open-source or free.
Thankfully, you're still allowing PC part manufacturers to make money! :-D
 
And yet Nvidia rules the GPU market and most games are better optimized for their chips. Not to mention CUDA, which is the de facto standard GPGPU API.

What's your problem with proprietary software? Have you ever considered the possibility that it might be the better way to go? :)
I don't get this obsession with everything having to be open-source or free.
Thankfully, you're still allowing PC part manufacturers to make money! :-D

It seems I only run into a handful of games where it would actually be better if I had Nvidia. Of course, I don't play everything out there either.

The vast majority work well either way. And AMD has their own handful of optimized games (mostly from console land.. like Forza).
 
What's your problem with proprietary software? Have you ever considered the possibility that it might be the better way to go? :)
I don't get this obsession with everything having to be open-source or free.

Open Source does not mean free.
 
Open Source does not mean free.

Not all of them at least.

FreeBSD is truly free imo. So free it'll let you commercialize and then close out your own source.

GNU demands you be a neckbearded freak in Stallman's cult.. and name your firstborn after him.
 
And yet Nvidia rules the GPU market and most games are better optimized for their chips. Not to mention CUDA, which is the de facto standard GPGPU API.
So yes, nVidia beefed up the compute on these cards, and back to my original point: if these were cards initially designed for the professional market, they're going to have a lot more die space dedicated to doing those kinds of jobs and less for typical rendering tasks. For that reason, I think we're being misled with respect to how well they're going to perform.
What's your problem with proprietary software? Have you ever considered the possibility that it might be the better way to go? :)
I don't have a problem with proprietary software most of the time, but even proprietary solutions tend to have parts that are open (consider AMD's firmware and documentation on how their GPUs operate); nVidia keeps that portion restrictively small. If you want to do something beyond just using CUDA (which has its own requirements, like particular GCC versions, mind you), then you're SOL. You're also locking yourself into a vendor that's going to squeeze every penny out of you. If you use OpenCL, it's going to run on a lot more devices, including nVidia's. Other than being a little faster, I see very little motivation for using CUDA other than already having hardware that can do it, which is rather myopic.

So, tl;dr: I don't have an issue with proprietary software so long as it is balanced by an appropriate amount of open documentation and code. That means not being completely closed and pulling the kind of shit that nVidia does.
It's the free as in freedom or free as in beer thing.
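
Back to the OpenCL point: the portability isn't hypothetical, since the same host code enumerates whatever GPUs are in the box, whoever made them. A minimal C sketch (error checking mostly stripped, link with -lOpenCL):
[CODE]
/* Minimal sketch of vendor-neutral GPU discovery with OpenCL.
 * Most error checking stripped for brevity; build with -lOpenCL. */
#include <stdio.h>
#include <CL/cl.h>

int main(void)
{
    cl_platform_id platforms[8];
    cl_uint nplat = 0;
    clGetPlatformIDs(8, platforms, &nplat);          /* NVIDIA, AMD, Intel... */
    if (nplat > 8) nplat = 8;                        /* only 8 were returned */

    for (cl_uint p = 0; p < nplat; p++) {
        char pname[128];
        clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME,
                          sizeof pname, pname, NULL);

        cl_device_id devs[8];
        cl_uint ndev = 0;
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU,
                           8, devs, &ndev) != CL_SUCCESS)
            continue;                                /* no GPUs on this platform */
        if (ndev > 8) ndev = 8;

        for (cl_uint d = 0; d < ndev; d++) {
            char dname[128];
            clGetDeviceInfo(devs[d], CL_DEVICE_NAME,
                            sizeof dname, dname, NULL);
            printf("%s: %s\n", pname, dname);        /* same code on any vendor */
        }
    }
    return 0;
}
[/CODE]
The same binary picks up an nVidia card through their ICD, an AMD card through theirs, and so on; CUDA only ever sees nVidia devices.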
 
So yes, nVidia beefed up the compute on these cards, and back to my original point: if these were cards initially designed for the professional market, they're going to have a lot more die space dedicated to doing those kinds of jobs and less for typical rendering tasks. For that reason, I think we're being misled with respect to how well they're going to perform.

I don't have a problem with proprietary software most of the time, but even proprietary solutions tend to have parts that are open (consider AMD's firmware and documentation on how their GPUs operate); nVidia keeps that portion restrictively small. If you want to do something beyond just using CUDA (which has its own requirements, like particular GCC versions, mind you), then you're SOL. You're also locking yourself into a vendor that's going to squeeze every penny out of you. If you use OpenCL, it's going to run on a lot more devices, including nVidia's. Other than being a little faster, I see very little motivation for using CUDA other than already having hardware that can do it, which is rather myopic.

So, tl;dr: I don't have an issue with proprietary software so long as it is balanced by an appropriate amount of open documentation and code. That means not being completely closed and pulling the kind of shit that nVidia does.

It's the free as in freedom or free as in beer thing.

It's not freedom when it comes to GNU. I just explained why.

I'd agree though that Nvidia is problematic. But honestly, not enough people give a shit.
 
It seems I only run into a handful of games where it would actually be better if I had Nvidia. Of course, I don't play everything out there either.

The vast majority work well either way. And AMD has their own handful of optimized games (mostly from console land.. like Forza).

The PC GPU market is actually getting kind of weird (some might call it interesting). In the old days you could expect the main difference between an Nvidia and AMD GPU to simply be price and performance. Sure, there might be extra VRAM and a handful of feature differences - but for the most part they did the same things.

Nowadays buyers really should pay attention to which games they play because it makes a MASSIVE difference in performance. For instance, I play a TON of Battlefield, and my other favorite games (on PC) of recent years were Wolfenstein II, Deus Ex, Far Cry, and Fallout 4. With my 20% overclock on Vega, I beat even aftermarket 1080 Ti's in almost every game I play! Even Fallout 4 typically runs very well on AMD if you simply turn godrays down (and it's an easy to run game now either way).

However, if someone played a lot of games like Frostpunk or PUBG... Even after overclocking, my Vega would likely lose to a 1080! That is an incredibly weird variance in performance...
 
So yes, nVidia beefed up the compute on these cards and back to my original point is that if these were cards initially designed for the professional market, they're going to have a lot more die space dedicated to doing those kinds of jobs and less for typical rendering tasks. For that reason, I think we're being mislead with respect to how well it's going to perform.
But the "die space" dedicated to rendering stuff is the workhorse of CUDA, i.e. also fundamental pro part. You're looking at GPUs in a very conservative way. :)
Tensor and RT cores are specialized on particular problems, but again - they can't be called "pro" just because they aren't designed for basic rendering.

The traditional approach was to make a homogeneous chip and push all instructions through the same core structure.
Nvidia is slowly switching to the more effective idea of building a GPU from different elements specialized for different tasks.

We knew from the start that tensor cores could help with rendering, and now look: with RTX some studios have started to implement AA this way (and it's way faster).
The RT cores are still a mystery to me. It will be interesting to see what else they can do. I mean: even while gaming, if you decide not to use RTRT, they could do something else and boost performance instead of slowing it down.
Looking at the general idea of how ray tracing works, IMO there is a chance that RT cores will turn out to be great for collision detection, which would mean taking over a huge chunk of the CPU gaming load.
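
To make that less hand-wavy: the primitive this hardware accelerates is basically "does this ray hit this box/triangle, and at what distance", traversed through a BVH. A swept collision test is the same bounded-ray query. A toy CPU sketch of the box part (all names made up, purely to show the shape of the problem):
[CODE]
/* Toy sketch only - made-up names, no claim about what RT cores actually
 * expose. The point: "did anything move through this box over the last
 * frame?" is the same bounded ray-vs-AABB query a BVH ray tracer runs
 * millions of times per frame. */
#include <stdbool.h>
#include <stdio.h>

typedef struct { float min[3], max[3]; } aabb;

/* Classic slab test: does origin + t*dir enter the box for some t in
 * [0, t_max]? Relies on IEEE floats (1/0 = infinity) for axes where the
 * direction component is zero. */
static bool ray_hits_box(const float origin[3], const float dir[3],
                         float t_max, const aabb *box)
{
    float t0 = 0.0f, t1 = t_max;
    for (int axis = 0; axis < 3; axis++) {
        float inv = 1.0f / dir[axis];
        float tn = (box->min[axis] - origin[axis]) * inv;
        float tf = (box->max[axis] - origin[axis]) * inv;
        if (tn > tf) { float tmp = tn; tn = tf; tf = tmp; }
        if (tn > t0) t0 = tn;
        if (tf < t1) t1 = tf;
        if (t0 > t1) return false;      /* slab intervals don't overlap: miss */
    }
    return true;
}

int main(void)
{
    aabb wall = { { 5, -1, -1 }, { 6, 1, 1 } };   /* a box 5 units ahead */
    float pos[3] = { 0, 0, 0 };
    float vel[3] = { 10, 0, 0 };                  /* fast object moving along +x */
    /* Sweep the point along one frame of motion (t in [0,1]). */
    printf("collision this frame: %s\n",
           ray_hits_box(pos, vel, 1.0f, &wall) ? "yes" : "no");
    return 0;
}
[/CODE]
A point moving 10 units per frame would tunnel straight through that wall with a naive overlap test; the ray formulation catches it, and that's exactly the kind of query you could imagine offloading.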
I don't have a problem with proprietary software most of the time, but even proprietary solutions tend to have parts that are open (consider AMD's firmware and documentation on how their GPUs operate); nVidia keeps that portion restrictively small.
But why is that bad? I could call that an issue if Nvidia's "closed" software were rubbish, but it isn't.
If you use OpenCL, it's going to run on a lot more devices, including nVidia's. Other than being a little faster, I see very little motivation for using CUDA other than already having hardware that can do it, which is rather myopic.
Well, I don't care about device compatibility, which makes these choices a lot easier. :)
And it's not about being faster for computation. CUDA is just way nicer to work with and, as a result, much faster to code.
From what I've seen, it's also much easier to learn for people without a straight programming background.
OpenCL ticks all the typical FOSS boxes - both good and bad. Generally speaking, I'm not a huge fan of the direction open source is going lately, to be honest.
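
And just to show what I mean by "much faster to code": below is an entire CUDA vector add, device side plus launch (illustrative sketch only - d_a, d_b, d_c are assumed to be device buffers you've already allocated with cudaMalloc). The OpenCL equivalent needs the same kernel as a source string plus clCreateProgramWithSource, clBuildProgram, clCreateKernel, four clSetKernelArg calls and clEnqueueNDRangeKernel before anything runs.
[CODE]
/* Illustrative CUDA sketch. d_a, d_b, d_c are device pointers the host
 * has already allocated with cudaMalloc; n is the element count. */
__global__ void vec_add(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;   /* global thread index */
    if (i < n)
        c[i] = a[i] + b[i];
}

void launch(const float *d_a, const float *d_b, float *d_c, int n)
{
    /* One line to launch: (n+255)/256 blocks of 256 threads each. */
    vec_add<<<(n + 255) / 256, 256>>>(d_a, d_b, d_c, n);
}
[/CODE]
Neither is hard, but one of them is an afternoon of boilerplate the first time you do it.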
 
But why is that bad? I could call that an issue if Nvidia's "closed" software were rubbish, but it isn't.
It IS bad on Linux, BSD, etc.
You can't update the system because the lazy Nvidia team still doesn't have a new driver for the most recent kernel, and since they don't provide any documentation, the kernel driver is shit.
Both AMD and Intel have great kernel drivers, sometimes even better than the Windows ones, while with Nvidia you are limited to geforce.com.

Of course this point is moot if you only use Windows, where the only option is the Nvidia driver, support is faster, and the kernel gets updated a lot slower.
 
It IS bad on Linux, BSD, etc.
No, it isn't. On Linux you use proprietary or closed software just like you would on Windows. The only problem here is psychological, i.e. an average Linux user wants everything to be open and free. :-)
Of course this point is moot if you only use Windows, where the only option is the Nvidia driver, support is faster, and the kernel gets updated a lot slower.
I've never experienced any issues with Nvidia GPUs on Linux, so I can't really comment on yours.
That said, I try not to use the latest kernels or software versions, so maybe I just miss these kinds of problems (that's the whole point of using "stable" releases, anyway).
 
No, it isn't. On Linux you use proprietary or closed software just like you would on Windows. The only problem here is psychological, i.e. an average Linux user wants everything to be open and free. :)

I've never experienced any issues with Nvidia GPUs on Linux, so I can't really comment on yours.
That said, I try not to use the latest kernels or software versions, so maybe I just miss these kinds of problems (that's the whole point of using "stable" releases, anyway).
Any new release is a stable release (that's why any new kernel version gets 7 or 8 release candidates), and it's not because of everything being FOSS, it's because the Nvidia driver breaks with newer kernels.
There is no excuse to stay put on old software because Nvidia takes its time updating the drivers. Not everyone uses Ubuntu.
 
The future is ray tracing... unless you can't afford a $1000 graphics card, or they aren't in stock due to yields, or there are only a few games that support it... then the future is not ray tracing, it's defective cores sold at high prices.
 
The future is ray tracing... unless you can't afford a $1000 graphics card, or they aren't in stock due to yields, or there are only a few games that support it... then the future is not ray tracing, it's defective cores sold at high prices.
Nah, you just can't see the glorious ray traced future all that well because you don't have enough GIGA RAYS. If you had more GIGA RAYS, the future would clear up and look prettier for you. More GIGA RAYS, people!
 
What if GTX starts with a 2070? A non-RTX version (with defective RT or Tensor cores and shaders), priced below 400 USD, and the lineup continues with a GTX 2060 down to a 2050.
 
The PC GPU market is actually getting kind of weird (some might call it interesting). In the old days you could expect the main difference between an Nvidia and AMD GPU to simply be price and performance.
You have rather rosy memories of the past. AMD/ATI, Nvidia and 3dfx cards all had some features that were different. Hell, even tessellation was a killer feature introduced by ATI back in 2001 on the R200 (Radeon 8500 and rebrands). There were filtering issues on both sides, 16/24/32-bit precision differences - Nvidia's FX series being the most recent example - color depth handling was different on 3dfx/Nvidia/ATI cards back in the day, and T&L was a new thing when the GeForce introduced it (followed by Radeon and Savage 2000). Shaders had differences for a while as new things were introduced - shaders, unified shaders and some intermediate steps.
:D
 