
Editorial: NVIDIA's 20-series Could be Segregated via Lack of RTX Capabilities in Lower-tier Cards

Any new release is a stable release (that's why any new kernel version gets 7 or 8 release candidates)
Not exactly. I mean, sure, a kernel release is expected to be stable as in "no serious code issues". But:
1) a f*ckup happens from time to time
2) obviously there's some delay before packages get updated. You should be used to that.

Earlier I meant stable Linux distro releases. If you want everything to work, just stay with the main LTS releases of whatever you use. It's that simple.
LTS kernels get the important security updates anyway, so there's little reason to update.

My Debian is still on 4.9 and Mint on 4.15 (Ubuntu 18.04 LTS).
I mostly use Manjaro these days and, since it's Arch-based, it already has the latest kernel. But I use it in VMs only, so no GPU issues. :)
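If you're not sure which kernel series your install is on, a one-liner like this shows it (just an illustrative Python snippet - plain 'uname -r' in a terminal gives the same answer):

# Illustrative: print the running kernel release, same info as 'uname -r'
import platform
print(platform.release())  # e.g. "4.15.0-112-generic" on an Ubuntu 18.04 LTS base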

You have rather rosy memories of the past. AMD/ATI, Nvidia and 3dfx cards all had features that differed. Hell, even tessellation was a killer feature introduced by ATI back in 2001 on R200 (Radeon 8500 and its rebrands). There were filtering issues on both sides, and 16/24/32-bit precision differences - Nvidia's FX series being the most recent example. Color depth handling differed between 3dfx/Nvidia/ATI cards back in the day, and T&L was a new thing when GeForce introduced it (followed by Radeon and Savage 2000). Shaders had differences for a while whenever new things were introduced - shaders, unified shaders and some intermediate steps.
:D
Exactly. Back in the day GPUs differed massively. 3D was young and each company had its own approach. Every major GPU launch meant new features rather than just more fps - a bit like with RTX now. The APIs were also less high-level and were often updated to support new games and GPUs.

Honestly, it's a miracle that games actually ran across all that hardware in the 90s. :-D

Today you can literally swap the card for the other brand and launch some games on generic drivers. It's great how things have changed. I mean: installing a game 15-20 years ago was often a day-long process of rebooting, fighting with DirectX compatibility and editing config files (especially if you didn't have Internet access). Today you click "install" in Steam or whatever and go for a walk.
 
The PC GPU market is actually getting kind of weird (some might call it interesting). In the old days you could expect the main difference between an Nvidia and AMD GPU to simply be price and performance. Sure, there might be extra VRAM and a handful of feature differences - but for the most part they did the same things.

Nowadays buyers really should pay attention to which games they play, because it makes a MASSIVE difference in performance. For instance, I play a TON of Battlefield, and my other favorite PC games of recent years were Wolfenstein II, Deus Ex, Far Cry, and Fallout 4. With my 20% overclock on Vega, I beat even aftermarket 1080 Tis in almost every game I play! Even Fallout 4 typically runs very well on AMD if you simply turn godrays down (and it's an easy game to run now either way).

However, if someone played a lot of games like Frostpunk or PUBG... even after overclocking, my Vega would likely lose to a 1080! That is an incredibly weird variance in performance...

But that mostly just applies to Vega, which is clearly an 'odd one out' - just like Fury was, which also had performance that was all over the place, and that problem has only grown. In some games it just outright sucks, and in others it shines. This is something AMD needs to look at, and it has nothing to do with Nvidia's optimization, which is 'across the board' for most games - and *without* disabling features to get stellar FPS. Which again makes sense, because Nvidia essentially has one architecture with one VRAM setup in different speeds and sizes.

If you look at Polaris, the performance is far more consistent.

You can twist this 'performance' and explain it in AMD/Vega's favor, but the reality is that we're looking at a lack of consistency in performance, which is worrying rather than 'great', and it highlights that this was never primarily a gaming GPU.
 
You have rather rosy memories of the past. AMD/ATI, Nvidia and 3dfx cards all had features that differed. Hell, even tessellation was a killer feature introduced by ATI back in 2001 on R200 (Radeon 8500 and its rebrands). There were filtering issues on both sides, and 16/24/32-bit precision differences - Nvidia's FX series being the most recent example. Color depth handling differed between 3dfx/Nvidia/ATI cards back in the day, and T&L was a new thing when GeForce introduced it (followed by Radeon and Savage 2000). Shaders had differences for a while whenever new things were introduced - shaders, unified shaders and some intermediate steps.
:D

Well, keep in mind that my memory goes back to the HD 3870 at the absolute oldest, lol. But that was over a decade ago too - most people I talk to building PCs have no memory past Maxwell!

I am aware of the 3DFX days, and the choices you had to make back then though.

But that mostly just applies to Vega, which is clearly an 'odd one out' - just like Fury was, which also had performance that was all over the place, and that problem has only grown. In some games it just outright sucks, and in others it shines. This is something AMD needs to look at, and it has nothing to do with Nvidia's optimization, which is 'across the board' for most games - and *without* disabling features to get stellar FPS. Which again makes sense, because Nvidia essentially has one architecture with one VRAM setup in different speeds and sizes.

If you look at Polaris, the performance is far more consistent.

You can twist this 'performance' and explain it in AMD/Vega's favor, but the reality is that we're looking at a lack of consistency in performance, which is worrying rather than 'great', and it highlights that this was never primarily a gaming GPU.

Not trying to twist anything - just pointing out that you cannot definitively say certain cards are "stronger" as easily as some would think. From my point of view, Vega is clearly and firmly in between a 1080 and a 1080 Ti, and in the games I play that is 100% correct. But I am aware that isn't true in a handful of games someone else might play.

You make a fair point, though, that this could be argued to be "inconsistent" rather than "underutilized." But this shouldn't surprise anyone - it has fewer gaming components than the 1080 Ti, but almost as many compute components as Volta.
 
You do realise it's all built on the same chip right? They won't do that. It's not like disabling RTX would be an option either... in fact that would only add to the manufacturing cost.


No, but remember that yields can be shitty on a lot of dies. Maybe the RT portion of some GPUs is bad, so they could just disable all of it - the tensor cores or whatever does the ray tracing - when not enough of them are viable, and sell the resulting GTX 2080 as just that: a direct GTX 1080 replacement. I'd be down for that, but $700-800 for a non-Ti 2080? No thanks. I want RT, but it's not good enough yet, and the new Tomb Raider game's ray tracing isn't impressive enough for me to even consider an RTX 20-series GPU. It just doesn't seem worth it yet. I need a minimum of 75 FPS at 1080p in games, with adaptive refresh, since my monitor runs at 75 Hz and 1920x1080 is its NATIVE resolution. I can't run games lower than that - it looks like dog shit if I do, blurry and not smooth either.


Screw Nvidia - I think they fucked up this time.

I am aware of the 3DFX days, and the choices you had to make back then though.

3dfx... oh man, me too. I had the 3dfx cards back then, especially two Voodoo2 1000s in SLI (THIS IS WHERE NVIDIA GOT THE SLI TECH FROM IN THE FIRST PLACE, by buying out 3dfx Interactive).
 
Language please kiddies
 
3dfx... oh man, me too. I had the 3dfx cards back then, especially two Voodoo2 1000s in SLI (THIS IS WHERE NVIDIA GOT THE SLI TECH FROM IN THE FIRST PLACE, by buying out 3dfx Interactive).
No, they did not. That is where they took the acronym (not even the name) from, but that is it.
3dfx's SLI meant Scan-Line Interleave: two identical cards, each rendering alternate scan lines of the same frame. That did not prove to be a viable technical solution as graphics advanced.
Both SLI (Scalable Link Interface) and CrossFire use either AFR (Alternate Frame Rendering) or SFR (Split Frame Rendering). There are a couple of other minor modes, but those are the two big ones.
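To make the difference concrete, here's a minimal Python sketch of how each mode divides work between two GPUs (purely illustrative - the tiny 8-line "screen" and the function names are made up, not any real driver API):

SCREEN_HEIGHT = 8   # tiny "screen" for illustration
NUM_GPUS = 2

def scan_line_interleave():
    # 3dfx SLI: each GPU renders alternating scan lines of the SAME frame
    return {gpu: [row for row in range(SCREEN_HEIGHT) if row % NUM_GPUS == gpu]
            for gpu in range(NUM_GPUS)}

def alternate_frame_rendering(frames):
    # AFR: GPUs take turns rendering WHOLE frames
    return {gpu: [f for i, f in enumerate(frames) if i % NUM_GPUS == gpu]
            for gpu in range(NUM_GPUS)}

def split_frame_rendering():
    # SFR: one frame is split into contiguous regions, one region per GPU
    rows_per_gpu = SCREEN_HEIGHT // NUM_GPUS
    return {gpu: list(range(gpu * rows_per_gpu, (gpu + 1) * rows_per_gpu))
            for gpu in range(NUM_GPUS)}

print(scan_line_interleave())               # {0: [0, 2, 4, 6], 1: [1, 3, 5, 7]}
print(alternate_frame_rendering(range(4)))  # {0: [0, 2], 1: [1, 3]}
print(split_frame_rendering())              # {0: [0, 1, 2, 3], 1: [4, 5, 6, 7]}

Scan-line interleave splits work within a single frame line by line, AFR alternates whole frames (which scales fps but adds latency), and SFR carves one frame into regions.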
 
Well, that's not an issue... RTRT puts games at slideshow level at anything above 1080p on a 2080 Ti (a "1080.5 Ti"... sorry). Still wondering why RT cores are a thing on this gen (tensor cores, on the other hand, might be useful for something other than gaming). They should have focused on bringing more perf. Sure, in RT mode the 2080 Ti is above the 1080 Ti by a fair margin, but still under 60 fps at 1080p... errrrrrrrr, nope. And without RT it's not really stellar either.

Though thanks to the rage from the "faithful" (to be polite), it will probably still translate into lower prices on the previous gen and Vega (good for me... my 1070 is a bit meh, although still decent, at 1620p/75 Hz).

But I am sure of one thing: no 1070 Ti, 1080, 1080 Ti, nor 2070, 2080 or 2080 Ti for me.

The next upgrade path will be interesting...
 
Well, some expect it even if it runs at 10 fps in the end. Remember the times when 3DMark ran like that on top-of-the-line cards? These days I'm getting like 50+ fps in anything I throw at graphics cards... Not much of a benchmark if it doesn't bring a top-of-the-line graphics card to its knees.
That's also about what Stunt Race FX ran at on the SNES.
 