
Editorial NVIDIA's 20-series Could be Segregated via Lack of RTX Capabilities in Lower-tier Cards

Joined
Jun 10, 2014
Messages
2,987 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
Too bad that you're practically selling your soul by using nVidia's closed source drivers. I'm still waiting for them to release their firmwares so nouveau can suck a little less, but obviously nVidia has no intention of making the open source community happy.
Aah, that same old BS. First of all, AMD's "open" driver still relies on proprietary firmware, and it's a bloated non-native "Gallium" driver. Secondly, nearly everyone who complains about Nvidia's "closed" driver doesn't use the claimed "open" driver from AMD anyway.
 
Joined
Mar 10, 2010
Messages
11,878 (2.21/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
No...?
It's been 10 years since I stopped overclocking, modding, unlocking and so on. It was fun when I was a teenager, but it really seems to have been a waste of time. I should have just spent more time outside or learned German instead. :)

No. "Gaming" cards are not altered "pro" cards. It's just a chip which can do multiple things. Firmware and drivers tell him what to do.
The reason why gaming chips have ECC functionality included (but blocked) is fairly simple: there's nothing special about ECC. Pretty much all CPUs and GPUs sold today are ECC-compliant.
Manufacturers are using ECC as a factor in lineup segmentation, which really makes sense.
It's not like ECC is crucial for what most people do with their PCs and ECC RAM is more expensive.

Sure, we could suddenly make all computers use ECC.
But sooner or later someone would notice that consumers are paying a premium for a feature they don't need, so it must be a conspiracy by chip makers! Just look how people reacted to RTX. :)
And the overclocking crowd would soon decide that they miss the non-ECC times, because they could OC higher. :)
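To make the "nothing special" point concrete: single-error correction is just a few parity bits plus a syndrome check. Below is a toy Hamming(7,4) encoder/corrector in the spirit of the SECDED codes that ECC DRAM commonly builds on (real implementations protect 64-bit words with extra check bits); it is an illustration of the principle only, not any vendor's actual logic. Plain host code, so it builds with nvcc or any C++ compiler.

```cuda
// Toy Hamming(7,4): 4 data bits protected by 3 parity bits, enough to
// locate and flip back any single-bit error. Illustration only.
#include <cstdint>
#include <cstdio>

// Pack 4 data bits (d1..d4) into a 7-bit codeword (bit positions 1..7, bit 0 unused).
uint8_t encode(uint8_t data)
{
    uint8_t d1 = (data >> 0) & 1, d2 = (data >> 1) & 1;
    uint8_t d3 = (data >> 2) & 1, d4 = (data >> 3) & 1;
    uint8_t p1 = d1 ^ d2 ^ d4;   // covers codeword positions 1,3,5,7
    uint8_t p2 = d1 ^ d3 ^ d4;   // covers codeword positions 2,3,6,7
    uint8_t p3 = d2 ^ d3 ^ d4;   // covers codeword positions 4,5,6,7
    return (p1 << 1) | (p2 << 2) | (d1 << 3) | (p3 << 4) |
           (d2 << 5) | (d3 << 6) | (d4 << 7);
}

// Recompute the three parity checks; a non-zero syndrome is the position of the flipped bit.
uint8_t correct(uint8_t code)
{
    uint8_t b[8];
    for (int i = 1; i <= 7; ++i) b[i] = (code >> i) & 1;
    uint8_t s1 = b[1] ^ b[3] ^ b[5] ^ b[7];
    uint8_t s2 = b[2] ^ b[3] ^ b[6] ^ b[7];
    uint8_t s3 = b[4] ^ b[5] ^ b[6] ^ b[7];
    uint8_t syndrome = s1 | (s2 << 1) | (s3 << 2);
    if (syndrome) code ^= (1 << syndrome);   // flip the faulty bit back
    return code;
}

int main()
{
    uint8_t code      = encode(0b1011);      // encode 4 data bits
    uint8_t corrupted = code ^ (1 << 5);     // simulate a single-bit memory error
    printf("sent %#x, corrupted %#x, corrected %#x\n", code, corrupted, correct(corrupted));
    return 0;
}
```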
So you're happy with Nvidia moving their ethos towards Apple? I get the ECC thing, but then security happened, and at the minute CPU vendors are having a nightmare that is temporarily theirs to deal with. The general compute abilities of GPUs are being targeted in the same way and could develop vulnerabilities, especially without ECC.
That still wouldn't bother most people, but it needs more consideration going forward.

Oh, and that was 20 years ago, not 10. ;)
 
Joined
Sep 7, 2017
Messages
3,244 (1.23/day)
System Name Grunt
Processor Ryzen 5800x
Motherboard Gigabyte x570 Gaming X
Cooling Noctua NH-U12A
Memory Corsair LPX 3600 4x8GB
Video Card(s) Gigabyte 6800 XT (reference)
Storage Samsung 980 Pro 2TB
Display(s) Samsung CFG70, Samsung NU8000 TV
Case Corsair C70
Power Supply Corsair HX750
Software Win 10 Pro
No...?
It's been 10 years since I stopped overclocking, modding, unlocking and so on. It was fun when I was a teenager, but it really seems to have been a waste of time. I should have just spent more time outside or learned German instead. :)

To be fair, it's much easier than back then too.

English is the only German I'm willing to know.
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,995 (2.34/day)
Location
Louisiana
Processor Core i9-9900k
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax ETS-T50 Black CPU cooler
Memory 32GB (2x16) Mushkin Redline DDR-4 3200
Video Card(s) ASUS RTX 4070 Ti Super OC 16GB
Storage 1x 1TB MX500 (OS); 2x 6TB WD Black; 1x 2TB MX500; 1x 1TB BX500 SSD; 1x 6TB WD Blue storage (eSATA)
Display(s) Infievo 27" 165Hz @ 2560 x 1440
Case Fractal Design Define R4 Black -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic Focus GX-1000 Gold
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
English is the only German I'm willing to know.
Off topic, but it’s refreshing to see someone who knows that English is a Germanic language. :)
 
Joined
Sep 7, 2017
Messages
3,244 (1.23/day)
System Name Grunt
Processor Ryzen 5800x
Motherboard Gigabyte x570 Gaming X
Cooling Noctua NH-U12A
Memory Corsair LPX 3600 4x8GB
Video Card(s) Gigabyte 6800 XT (reference)
Storage Samsung 980 Pro 2TB
Display(s) Samsung CFG70, Samsung NU8000 TV
Case Corsair C70
Power Supply Corsair HX750
Software Win 10 Pro
Off topic, but it’s refreshing to see someone who knows that English is a Germanic language. :)

Ah, I only know because my other hobby is early literature. Although I suppose English is the least Germanic of the Germanic languages.

I should add that I'm not overclocking this 7820X I own yet. I'm happy with Intel's turbo boost these days. And I haven't touched the Vega. Tbh, I don't know a thing about it... I heard they should be undervolted.
 
Joined
Jun 28, 2016
Messages
3,595 (1.17/day)
To be fair, it's much easier than back then too.
IMO it's a lot more pointless as well.
I don't see the point of OC anymore (both CPU and GPU) with all the automatic stuff going on. Of course I'm talking about performance, not just having fun from the process.
As for unlocking GPUs and so on... I don't think there's literally a single purely software feature of Quadro that I miss.
But sure: if it were possible to buy a non-RTX card and unlock the tensor / RT cores, I would at least meditate on the idea.
English is the only German I'm willing to know.
I guess people in Europe have a different perspective. ;-)
And personally, since I've been working for two Austrian companies for the last 7 years and have now moved to a German one... knowing the language wouldn't hurt, for sure. :)
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,171 (2.81/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Aah, that same old BS. First of all, AMD's "open" driver still relies on proprietary firmware, and it's a bloated non-native "Gallium" driver. Secondly, nearly everyone who complains about Nvidia's "closed" driver doesn't use the claimed "open" driver from AMD anyway.
At least AMD provides their firmware so open source drivers can actually be developed, and why would you want to redo that work? You end up with nouveau, which runs like crap as a result and can't really do anything more useful than boot your machine so you can install drivers that are 100% proprietary. You can't even get nVidia's firmware to try to make a driver. People are left having to figure everything out for themselves because nVidia doesn't provide anything: no documentation, no white papers, no firmware. AMD provides firmware and documentation, which is a heck of a lot more than nVidia provides. You literally can't compare the two on this front.

I am using AMDGPU (not Pro), and the only bit of proprietary code is the firmware; if I have a problem I can crack open the kernel source and investigate. I can also use the latest kernel as a result, which counts for something too. There is a reason Linus said that nVidia is the single worst company they've ever had to deal with, and it's not because they're willing to help or share. So saying that this is "the same old BS" is itself BS, because you're basically equating AMD, which is, let's say, 10% proprietary (and that's generous, on top of the fact that AMD provides documentation), with nVidia, which is 100% proprietary.

...and I do use the open source AMDGPU (not Pro) driver.
 
Joined
Oct 2, 2015
Messages
3,144 (0.94/day)
Location
Argentina
System Name Ciel / Akane
Processor AMD Ryzen R5 5600X / Intel Core i3 12100F
Motherboard Asus Tuf Gaming B550 Plus / Biostar H610MHP
Cooling ID-Cooling 224-XT Basic / Stock
Memory 2x 16GB Kingston Fury 3600MHz / 2x 8GB Patriot 3200MHz
Video Card(s) Gainward Ghost RTX 3060 Ti / Dell GTX 1660 SUPER
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB / NVMe WD Blue SN550 512GB
Display(s) AOC Q27G3XMN / Samsung S22F350
Case Cougar MX410 Mesh-G / Generic
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W / Gigabyte P450B
Mouse EVGA X15 / Logitech G203
Keyboard VSG Alnilam / Dell
Software Windows 11
Aah, that same old BS. First of all, AMD's "open" driver still relies on proprietary firmware, and it's a bloated non-native "Gallium" driver. Secondly, nearly everyone who complains about Nvidia's "closed" driver doesn't use the claimed "open" driver from AMD anyway.
Have you ever used nouveau?
That bloated non-native Gallium AMD driver is better than the proprietary Windows one.
 
Joined
Jun 28, 2016
Messages
3,595 (1.17/day)
People are left having to figure everything out for themselves because nVidia doesn't provide anything: no documentation, no white papers, no firmware. AMD provides firmware and documentation, which is a heck of a lot more than nVidia provides.
And yet Nvidia rules the GPU market and most games are better optimized for their chips. Not to mention CUDA, which is the de facto standard GPGPU API.

What's your problem with proprietary software? Have you ever considered the possibility that it might be the better way to go? :)
I don't get this obsession with everything having to be open-source or free.
Thankfully, you're still allowing PC part manufacturers to make money! :-D
 
Joined
Sep 7, 2017
Messages
3,244 (1.23/day)
System Name Grunt
Processor Ryzen 5800x
Motherboard Gigabyte x570 Gaming X
Cooling Noctua NH-U12A
Memory Corsair LPX 3600 4x8GB
Video Card(s) Gigabyte 6800 XT (reference)
Storage Samsung 980 Pro 2TB
Display(s) Samsung CFG70, Samsung NU8000 TV
Case Corsair C70
Power Supply Corsair HX750
Software Win 10 Pro
And yet Nvidia rules the GPU market and most games are better optimized for their chips. Not to mention CUDA, which is the de facto standard GPGPU API.

What's your problem with proprietary software? Have you ever considered the possibility that it might be the better way to go? :)
I don't get this obsession with everything having to be open-source or free.
Thankfully, you're still allowing PC part manufacturers to make money! :-D

It seems I run into only a handful where it would actually be better if I had Nvidia. Of course, I don't play everything out there either.

The vast majority work well either way. And AMD has their own handful of optimized games (mostly from console land.. like Forza).
 
Joined
Mar 9, 2016
Messages
39 (0.01/day)
What's your problem with proprietary software? Have you ever considered the possibility that it might be the better way to go? :)
I don't get this obsession with everything having to be open-source or free.

Open Source does not mean free.
 
Joined
Sep 7, 2017
Messages
3,244 (1.23/day)
System Name Grunt
Processor Ryzen 5800x
Motherboard Gigabyte x570 Gaming X
Cooling Noctua NH-U12A
Memory Corsair LPX 3600 4x8GB
Video Card(s) Gigabyte 6800 XT (reference)
Storage Samsung 980 Pro 2TB
Display(s) Samsung CFG70, Samsung NU8000 TV
Case Corsair C70
Power Supply Corsair HX750
Software Win 10 Pro
Open Source does not mean free.

Not all of them at least.

FreeBSD is truly free imo. So free it'll let you commercialize and then close out your own source.

GNU demands you be a neckbearded freak in Stallman's cult.. and name your firstborn after him.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,171 (2.81/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
And yet Nvidia rules the GPU market and most games are better optimized for their chips. Not to mention CUDA, which is the de facto standard GPGPU API.
So yes, nVidia beefed up the compute on these cards, and back to my original point: if these were cards initially designed for the professional market, they're going to have a lot more die space dedicated to those kinds of jobs and less to typical rendering tasks. For that reason, I think we're being misled with respect to how well it's going to perform.
What's your problem with proprietary software? Have you ever considered the possibility that it might be the better way to go? :)
I don't have a problem with proprietary software most of the time, but even proprietary solutions tend to have parts that are open (consider AMD's firmware and documentation on how their GPUs operate), while nVidia keeps that portion restrictively small. If you want to do something beyond just using CUDA (which has its own requirements, like particular GCC versions, mind you), then you're SOL. You're also locking yourself into a vendor that's going to squeeze every penny out of you. If you use OpenCL, it's going to run on a lot more devices, including nVidia's. Other than it being a little faster, I see very little motivation for using CUDA beyond already having hardware that can do it, which is rather myopic.

So, tl;dr: I don't have an issue with proprietary software so long as it is balanced by an appropriate amount of open documentation and code. That means not being completely closed and pulling the kind of shit that nVidia does.
It's the free as in freedom or free as in beer thing.
 
Joined
Sep 7, 2017
Messages
3,244 (1.23/day)
System Name Grunt
Processor Ryzen 5800x
Motherboard Gigabyte x570 Gaming X
Cooling Noctua NH-U12A
Memory Corsair LPX 3600 4x8GB
Video Card(s) Gigabyte 6800 XT (reference)
Storage Samsung 980 Pro 2TB
Display(s) Samsung CFG70, Samsung NU8000 TV
Case Corsair C70
Power Supply Corsair HX750
Software Win 10 Pro
So yes, nVidia beefed up the compute on these cards, and back to my original point: if these were cards initially designed for the professional market, they're going to have a lot more die space dedicated to those kinds of jobs and less to typical rendering tasks. For that reason, I think we're being misled with respect to how well it's going to perform.

I don't have a problem with proprietary software most of the time, but even proprietary solutions tend to have parts that are open (consider AMD's firmware and documentation on how their GPUs operate), while nVidia keeps that portion restrictively small. If you want to do something beyond just using CUDA (which has its own requirements, like particular GCC versions, mind you), then you're SOL. You're also locking yourself into a vendor that's going to squeeze every penny out of you. If you use OpenCL, it's going to run on a lot more devices, including nVidia's. Other than it being a little faster, I see very little motivation for using CUDA beyond already having hardware that can do it, which is rather myopic.

So, tl;dr: I don't have an issue with proprietary software so long as it is balanced by an appropriate amount of open documentation and code. That means not being completely closed and pulling the kind of shit that nVidia does.

It's the free as in freedom or free as in beer thing.

It's not freedom when it comes to GNU. I just explained why.

I'd agree though that Nvidia is problematic. But honestly, not enough people give a shit.
 
Joined
Feb 12, 2015
Messages
1,104 (0.31/day)
It seems I run into only a handful where it would actually be better if I had Nvidia. Of course, I don't play everything out there either.

The vast majority work well either way. And AMD has their own handful of optimized games (mostly from console land.. like Forza).

The PC GPU market is actually getting kind of weird (some might call it interesting). In the old days you could expect the main difference between an Nvidia and an AMD GPU to simply be price and performance. Sure, there might be extra VRAM and a handful of feature differences - but for the most part they did the same things.

Nowadays buyers really should pay attention to which games they play because it makes a MASSIVE difference in performance. For instance, I play a TON of Battlefield, and my other favorite games (on PC) of recent years were Wolfenstein II, Deus Ex, Far Cry, and Fallout 4. With my 20% overclock on Vega, I beat even aftermarket 1080 Ti's in almost every game I play! Even Fallout 4 typically runs very well on AMD if you simply turn godrays down (and it's an easy-to-run game now either way).

However, if someone played a lot of games like Frostpunk or PUBG... even after overclocking, my Vega would likely lose to a 1080! That is an incredibly weird variance in performance...
 
Joined
Jun 28, 2016
Messages
3,595 (1.17/day)
So yes, nVidia beefed up the compute on these cards, and back to my original point: if these were cards initially designed for the professional market, they're going to have a lot more die space dedicated to those kinds of jobs and less to typical rendering tasks. For that reason, I think we're being misled with respect to how well it's going to perform.
But the "die space" dedicated to rendering stuff is the workhorse of CUDA, i.e. also fundamental pro part. You're looking at GPUs in a very conservative way. :)
Tensor and RT cores are specialized on particular problems, but again - they can't be called "pro" just because they aren't designed for basic rendering.

The traditional approach was to make a homogeneous chip and push all instructions through the same core structure.
Nvidia is slowly switching to a more effective approach: building a GPU from different elements specialized for different tasks.

We knew from the start that tensor cores could help with rendering and now look: with RTX some studios started to implement AA this way (and it's way faster).
The RT cores are still a mystery to me. It will be interesting to see what else they can do. I mean: even while gaming, if you decide not to use RTRT, they could do something else and boost performance instead of slowing it down.
Looking at the general idea of how ray tracing works, IMO there is a chance that RT cores turn out to be great for collision detection, which means taking over a huge chunk of CPU gaming load.
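For what it's worth, the primitive that RT hardware accelerates while traversing a BVH is ray-vs-axis-aligned-box testing, and that same "slab test" is a standard building block in collision queries, so the idea isn't far-fetched. A minimal sketch follows; the names and code are illustrative only, not any actual RTX/DXR API.

```cuda
// Slab test: does a ray hit an axis-aligned bounding box?
// Marked __host__ __device__ so the same helper could run on CPU or GPU (builds with nvcc).
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Ray is origin + t*dir for t >= 0; invDir holds 1/dir per component (precomputed by the caller).
__host__ __device__ bool rayHitsAABB(Vec3 origin, Vec3 invDir, Vec3 bmin, Vec3 bmax)
{
    // Entry/exit distances against the two planes of each slab (x, y, z).
    float tx1 = (bmin.x - origin.x) * invDir.x, tx2 = (bmax.x - origin.x) * invDir.x;
    float ty1 = (bmin.y - origin.y) * invDir.y, ty2 = (bmax.y - origin.y) * invDir.y;
    float tz1 = (bmin.z - origin.z) * invDir.z, tz2 = (bmax.z - origin.z) * invDir.z;

    float tmin = fmaxf(fmaxf(fminf(tx1, tx2), fminf(ty1, ty2)), fminf(tz1, tz2));
    float tmax = fminf(fminf(fmaxf(tx1, tx2), fmaxf(ty1, ty2)), fmaxf(tz1, tz2));

    // Hit if the latest entry comes before the earliest exit and isn't behind the origin.
    return tmax >= fmaxf(tmin, 0.0f);
}

int main()
{
    Vec3 origin = {0.0f, 0.0f, 0.0f};
    Vec3 dir    = {1.0f, 1.0f, 0.1f};
    Vec3 invDir = {1.0f / dir.x, 1.0f / dir.y, 1.0f / dir.z};
    Vec3 bmin   = {2.0f, 1.0f, -1.0f}, bmax = {4.0f, 3.0f, 1.0f};
    printf("hit: %d\n", rayHitsAABB(origin, invDir, bmin, bmax));   // prints 1
    return 0;
}
```

A collision query can reuse exactly this kind of test, e.g. by casting short rays along an object's motion, which is why offloading it to dedicated hardware at least looks plausible.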
I don't have a problem with proprietary software most of the time, but even proprietary solutions tend to have parts that are open (consider AMD's firmware and documentation on how their GPUs operate), while nVidia keeps that portion restrictively small.
But why is that bad? I could call that an issue if Nvidia's "closed" software were rubbish, but it isn't.
If you use OpenCL, it's going to run on a lot more devices, including nVidia's. Other than it being a little faster, I see very little motivation for using CUDA beyond already having hardware that can do it, which is rather myopic.
Well, I don't care about device compatibility, which makes these choices a lot easier. :)
And it's not about being faster for computation. CUDA is just way nicer to work with and, as a result, much faster to code.
From what I've seen, it's also much easier to learn for people without a formal programming background.
OpenCL ticks all the typical FOSS boxes - both good and bad. Generally speaking, I'm not a huge fan of the direction open-source is going lately, to be honest.
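As a rough illustration of the "faster to code" point: a complete CUDA SAXPY fits in a couple of dozen lines, because the kernel is ordinary C++ with one qualifier and the launch is a single statement, whereas an equivalent OpenCL program needs platform/device/context/queue/program setup on the host before the first kernel can run. A sketch only (unified memory, no error checking), not a benchmark.

```cuda
// Minimal CUDA SAXPY: y = a*x + y over a million floats.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float* x, float* y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // one thread per element
    if (i < n) y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));        // unified memory: no explicit copies
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);  // grid of 256-thread blocks
    cudaDeviceSynchronize();

    printf("y[0] = %f (expected 5.0)\n", y[0]);
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```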
 
Joined
Oct 2, 2015
Messages
3,144 (0.94/day)
Location
Argentina
System Name Ciel / Akane
Processor AMD Ryzen R5 5600X / Intel Core i3 12100F
Motherboard Asus Tuf Gaming B550 Plus / Biostar H610MHP
Cooling ID-Cooling 224-XT Basic / Stock
Memory 2x 16GB Kingston Fury 3600MHz / 2x 8GB Patriot 3200MHz
Video Card(s) Gainward Ghost RTX 3060 Ti / Dell GTX 1660 SUPER
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB / NVMe WD Blue SN550 512GB
Display(s) AOC Q27G3XMN / Samsung S22F350
Case Cougar MX410 Mesh-G / Generic
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W / Gigabyte P450B
Mouse EVGA X15 / Logitech G203
Keyboard VSG Alnilam / Dell
Software Windows 11
But why is that bad? I could call that an issue if Nvidia's "closed" software were rubbish, but it isn't.
It IS bad on Linux, BSD, etc.
You can't update the system because the lazy Nvidia team still doesn't have a new driver for the most recent kernel, and since they don't provide any documentation, the kernel driver is shit.
Both AMD and Intel have great kernel drivers, sometimes even better than the Windows ones, while with Nvidia you are limited to geforce.com.

Of course this point is moot if you only use Windows, where the only option is the Nvidia driver, support is faster, and the kernel gets updated a lot slower.
 
Joined
Jun 28, 2016
Messages
3,595 (1.17/day)
It IS bad on Linux, BSD, etc.
No, it isn't. On Linux you use proprietary or closed software just like you would on Windows. The only problem here is psychological, i.e. the average Linux user wants everything to be open and free. :)
Of course this point is moot if you only use Windows, where the only option is the Nvidia driver, support is faster, and the kernel gets updated a lot slower.
I've never experienced any issues with Nvidia GPUs on Linux, so I can't really comment on yours.
That said, I try not to use the latest kernels or the latest software versions, so maybe I just miss these kinds of problems (that's the whole point of using "stable" releases, anyway).
 
Joined
Oct 2, 2015
Messages
3,144 (0.94/day)
Location
Argentina
System Name Ciel / Akane
Processor AMD Ryzen R5 5600X / Intel Core i3 12100F
Motherboard Asus Tuf Gaming B550 Plus / Biostar H610MHP
Cooling ID-Cooling 224-XT Basic / Stock
Memory 2x 16GB Kingston Fury 3600MHz / 2x 8GB Patriot 3200MHz
Video Card(s) Gainward Ghost RTX 3060 Ti / Dell GTX 1660 SUPER
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB / NVMe WD Blue SN550 512GB
Display(s) AOC Q27G3XMN / Samsung S22F350
Case Cougar MX410 Mesh-G / Generic
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W / Gigabyte P450B
Mouse EVGA X15 / Logitech G203
Keyboard VSG Alnilam / Dell
Software Windows 11
No, it isn't. On Linux you use proprietary or closed software just like you would on Windows. The only problem here is psychological, i.e. the average Linux user wants everything to be open and free. :)

I've never experienced any issues with Nvidia GPUs on Linux, so I can't really comment on yours.
That said, I try not to use the latest kernels or the latest software versions, so maybe I just miss these kinds of problems (that's the whole point of using "stable" releases, anyway).
Any new release is a stable release (that's why every new kernel version gets 7 or 8 release candidates), and it's not because everything is FOSS, it's because the Nvidia driver breaks with newer kernels.
There is no excuse to stay on old software just because Nvidia takes its time updating the drivers. Not everyone uses Ubuntu.
 
Joined
Nov 4, 2005
Messages
11,984 (1.72/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
The future is ray tracing, unless you can't afford a $1000 graphics card, or they aren't in stock due to yields, or there are only a few games that support it..... then the future is not ray tracing, it's defective cores sold at high prices.
 

hat

Enthusiast
Joined
Nov 20, 2006
Messages
21,745 (3.30/day)
Location
Ohio
System Name Starlifter :: Dragonfly
Processor i7 2600k 4.4GHz :: i5 10400
Motherboard ASUS P8P67 Pro :: ASUS Prime H570-Plus
Cooling Cryorig M9 :: Stock
Memory 4x4GB DDR3 2133 :: 2x8GB DDR4 2400
Video Card(s) PNY GTX1070 :: Integrated UHD 630
Storage Crucial MX500 1TB, 2x1TB Seagate RAID 0 :: Mushkin Enhanced 60GB SSD, 3x4TB Seagate HDD RAID5
Display(s) Onn 165hz 1080p :: Acer 1080p
Case Antec SOHO 1030B :: Old White Full Tower
Audio Device(s) Creative X-Fi Titanium Fatal1ty Pro - Bose Companion 2 Series III :: None
Power Supply FSP Hydro GE 550w :: EVGA Supernova 550
Software Windows 10 Pro - Plex Server on Dragonfly
Benchmark Scores >9000
The future is ray tracing, unless you can't afford a $1000 graphics card, or they aren't in stock due to yields, or there are only a few games that support it..... then the future is not ray tracing, it's defective cores sold at high prices.
Nah, you just can't see the glorious ray traced future all that well because you don't have enough GIGA RAYS. If you had more GIGA RAYS, the future would clear up and look prettier for you. More GIGA RAYS, people!
 
Joined
Jan 31, 2011
Messages
2,211 (0.44/day)
System Name Ultima
Processor AMD Ryzen 7 5800X
Motherboard MSI Mag B550M Mortar
Cooling Arctic Liquid Freezer II 240 rev4 w/ Ryzen offset mount
Memory G.SKill Ripjaws V 2x16GB DDR4 3600
Video Card(s) Palit GeForce RTX 4070 12GB Dual
Storage WD Black SN850X 2TB Gen4, Samsung 970 Evo Plus 500GB , 1TB Crucial MX500 SSD sata,
Display(s) ASUS TUF VG249Q3A 24" 1080p 165-180Hz VRR
Case DarkFlash DLM21 Mesh
Audio Device(s) Onboard Realtek ALC1200 Audio/Nvidia HD Audio
Power Supply Corsair RM650
Mouse Rog Strix Impact 3 Wireless | Wacom Intuos CTH-480
Keyboard A4Tech B314 Keyboard
Software Windows 10 Pro
What if GTX starts with a 2070? A non-RTX version (with defective RT or Tensor cores and shaders), priced below 400 USD, and continues with a GTX 2060 down to a 2050.
 
Joined
Feb 3, 2017
Messages
3,757 (1.32/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
The PC GPU market is actually getting kind of weird (some might call it interesting). In the old days you could expect the main difference between an Nvidia and an AMD GPU to simply be price and performance.
You have rather rosy memories of the past. AMD/ATI, Nvidia and 3dfx cards all had some features that were different. Hell, even tessellation was a killer feature introduced by ATI back in 2001 on R200 (Radeon 8500 and its rebrands). There were filtering issues on both sides, 16/24/32-bit precision differences (Nvidia's FX series being the most recent example), color depth handling was different on 3dfx/Nvidia/ATI cards back in the day, and T&L was a new thing when GeForce introduced it (followed by Radeon and Savage 2000). Shaders had differences for a while as new things were introduced - shaders, unified shaders and some intermediate steps.
:D
 