
Editorial NVIDIA's 20-series Could be Segregated via Lack of RTX Capabilities in Lower-tier Cards

Joined
Apr 19, 2012
Messages
122 (0.03/day)
Location
San Diego, California
System Name Mi Negra
Processor Intel Core i7-2600K Sandy Bridge 3.4GHz (3.8GHz Turbo Boost) LGA 1155 95W Quad-Core Desktop Processo
Motherboard Gigabyte GA-Z68XP-UD3-iSSD LGA 1155 Intel Z68 HDMI SATA 6Gb/s USB 3.0 ATX Intel Motherboard
Cooling Arctic Cooling Freezer 7 Pro Rev.2 with 92mm PWM Fan
Memory Patriot Viper Xtreme Series DDR3 8 GB (2 x 4 GB) PC3-12800 1600MHz
Video Card(s) Nvidia Founders Edition GeForce GTX 1080 8GB GDDR5X PCI Express 3.0 Graphics Card
Storage Samsung 750 EVO 250GB 2.5" 250G SATA III Internal SSD 3-D 3D Vertical Solid State Drive MZ-750250BW
Display(s) Samsung UN40JU6500 40" Class 4K Ultra HD Smart LED TV
Case In Win 303 Black SECC Steel/Tempered Glass Case ATX Mid Tower, Dual Chambered/High Air Computer Case
Audio Device(s) Creative Sound Blaster X-Fi Titanium Fatal1ty Professional 70SB088600002 7.1 Channels 24-bit 96KHz P
Power Supply Antec High Current Pro HCP-1200 1200W ATX12V / EPS12V SLI Ready 80 PLUS GOLD Certified Yes, High Cur
Mouse Logitech G700s Black 13 Buttons Tilt Wheel USB RF Wireless Laser 5700 dpi Gaming Mouse
Keyboard Logitech G810 Orion Spectrum RGB Mechanical Gaming Keyboard
Software Microsoft Windows 10 Professional 64-bit
maybe a 2080 RTX at $1,500 and then a 2080 GTX at $750 [with a $20 mail-in rebate]
It would probably just be 10 bucks of Fortnite credit.
 

hat

Enthusiast
Joined
Nov 20, 2006
Messages
21,745 (3.30/day)
Location
Ohio
System Name Starlifter :: Dragonfly
Processor i7 2600k 4.4GHz :: i5 10400
Motherboard ASUS P8P67 Pro :: ASUS Prime H570-Plus
Cooling Cryorig M9 :: Stock
Memory 4x4GB DDR3 2133 :: 2x8GB DDR4 2400
Video Card(s) PNY GTX1070 :: Integrated UHD 630
Storage Crucial MX500 1TB, 2x1TB Seagate RAID 0 :: Mushkin Enhanced 60GB SSD, 3x4TB Seagate HDD RAID5
Display(s) Onn 165hz 1080p :: Acer 1080p
Case Antec SOHO 1030B :: Old White Full Tower
Audio Device(s) Creative X-Fi Titanium Fatal1ty Pro - Bose Companion 2 Series III :: None
Power Supply FSP Hydro GE 550w :: EVGA Supernova 550
Software Windows 10 Pro - Plex Server on Dragonfly
Benchmark Scores >9000
As I understand it, process defects at the foundry are largely responsible for product segmentation. For instance (pulling numbers out of my ass here, because I don't know the chip names that well) they may be "trying" to make GP100 chips at the foundry, and the perfect chips get put on Quadros and such, and the "defective" chips (which they may call GP102 or something) may get slapped on a Titan or high end GTXwhatever, with some shader units disabled or something, wherever the defect was. This theory seems to be supported by that one Quadro in the current lineup that is faster than even the current Titan in games, by a margin that isn't insignificant.
 
Joined
Mar 10, 2010
Messages
11,878 (2.21/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
As I understand it, process defects at the foundry are largely responsible for product segmentation. For instance (pulling numbers out of my ass here, because I don't know the chip names that well) they may be "trying" to make GP100 chips at the foundry, and the perfect chips get put on Quadros and such, and the "defective" chips (which they may call GP102 or something) may get slapped on a Titan or high end GTXwhatever, with some shader units disabled or something, wherever the defect was. This theory seems to be supported by that one Quadro in the current lineup that is faster than even the current Titan in games, by a margin that isn't insignificant.
That's not just a theory, it's how the majority of chips are made; that, and with spare overprovisioned parts in the chip that step up when another part is faulty (mostly AMD's tactic).
Without such tactics, 20-80% of every wafer could be wasted and the cost of the working ones would be astronomical.
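The yield math behind that claim is easy to sketch. Here's a minimal Poisson yield model with entirely made-up defect-density and die-size numbers, just to show how fast the fraction of "perfect" dies drops as dies get bigger (and why harvesting partially defective dies into lower tiers makes financial sense):

```python
import math

def poisson_yield(defect_density, die_area):
    """Classic Poisson yield model: fraction of dies with zero defects.

    defect_density: defects per mm^2; die_area: mm^2.
    """
    return math.exp(-defect_density * die_area)

# Hypothetical numbers, just to show the shape of the curve:
# 0.1 defects/cm^2 = 0.001 defects/mm^2.
d = 0.001
for area in (100, 300, 600):  # small, mid, big die in mm^2
    y = poisson_yield(d, area)
    print(f"{area} mm^2 die: {y:.0%} perfect, {1 - y:.0%} harvesting candidates")
```

At the hypothetical 600 mm^2 end, nearly half the dies have at least one defect, which is exactly the pool that gets shader units fused off and sold a tier down.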
 
Joined
Sep 7, 2017
Messages
3,244 (1.23/day)
System Name Grunt
Processor Ryzen 5800x
Motherboard Gigabyte x570 Gaming X
Cooling Noctua NH-U12A
Memory Corsair LPX 3600 4x8GB
Video Card(s) Gigabyte 6800 XT (reference)
Storage Samsung 980 Pro 2TB
Display(s) Samsung CFG70, Samsung NU8000 TV
Case Corsair C70
Power Supply Corsair HX750
Software Win 10 Pro
Glad to see you figured it out too @Raevenlord

I knew this when I saw Jensen shout TEN GIGA RAYS like a fool. He tried to pull a Steve Jobs on people. Everything was amazing, fantastic, never seen before, first time you could get proper use out of this new tech... it was like watching an Apple keynote.

Except with Nvidia, what they showed were jerky tech demos at 30 FPS and a huge blurry mess of a dude dancing at the end. Oh yeah, it had reflections, too.

It's hilarious to see people on TPU echoing that RTRT is the next best thing. As if Nvidia re-invented the wheel. I guess this generation separates the fools from the realists.

Except unlike Steve Jobs, he replaced the turtleneck with a leather jacket. He probably thinks he's Tony Stark.
 
Joined
Feb 12, 2015
Messages
1,104 (0.31/day)
This.
It looks like TU102/4 was specifically made for Quadro cards, and making an RT and Tensor core-less GPU just for gaming cards wasn't financially viable if 7nm is just around the corner.

Exactly. They are launching both the 2080 and the 2080 Ti at the same time, and neither is the full die - this is a 10-month series at most, haha.
 
Joined
Sep 15, 2007
Messages
3,946 (0.63/day)
Location
Police/Nanny State of America
Processor OCed 5800X3D
Motherboard Asucks C6H
Cooling Air
Memory 32GB
Video Card(s) OCed 6800XT
Storage NVMees
Display(s) 32" Dull curved 1440
Case Freebie glass idk
Audio Device(s) Sennheiser
Power Supply Don't even remember
Except unlike Steve Jobs, he replaced the turtleneck with a leather jacket. He probably thinks he's Tony Stark.

Hahahaha, I forgot just how big of a douche Jobs was. It all makes sense, now. It's like requisite apparel to milk your dummy customers.
 
Joined
Aug 2, 2012
Messages
1,986 (0.44/day)
Location
Netherlands
System Name TheDeeGee's PC
Processor Intel Core i7-11700
Motherboard ASRock Z590 Steel Legend
Cooling Noctua NH-D15S
Memory Crucial Ballistix 3200/C16 32GB
Video Card(s) Nvidia RTX 4070 Ti 12GB
Storage Crucial P5 Plus 2TB / Crucial P3 Plus 2TB / Crucial P3 Plus 4TB
Display(s) EIZO CX240
Case Lian-Li O11 Dynamic Evo XL / Noctua NF-A12x25 fans
Audio Device(s) Creative Sound Blaster ZXR / AKG K601 Headphones
Power Supply Seasonic PRIME Fanless TX-700
Mouse Logitech G500S
Keyboard Keychron Q6
Software Windows 10 Pro 64-Bit
Benchmark Scores None, as long as my games runs smooth.
So 2060 should sell for $100 then?
 
Joined
Oct 14, 2017
Messages
210 (0.08/day)
System Name Lightning
Processor 4790K
Motherboard asrock z87 extreme 3
Cooling hwlabs black ice 20 fpi radiator, cpu mosfet blocks, MCW60 cpu block, full cover on 780Ti's
Memory corsair dominator platinum 2400C10, 32 giga, DDR3
Video Card(s) 2x780Ti
Storage intel S3700 400GB, samsung 850 pro 120 GB, a cheep intel MLC 120GB, an another even cheeper 120GB
Display(s) eizo foris fg2421
Case 700D
Audio Device(s) ESI Juli@
Power Supply seasonic platinum 1000
Mouse mx518
Software Lightning v2.0a
- GP100’s SM incorporates 64 single-precision (FP32) CUDA Cores. In contrast, the Maxwell and Kepler SMs had 128 and 192 FP32 CUDA Cores, respectively.
That! Also, on Linux with nouveau you get full-speed FP64 on the 780 Ti, without artificial crippling from the driver. What a card! It can run on all Windows versions and nouveau. Now I know why my 780 Ti can protect 1 TB of data with PAR2 in 1 hour!
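For anyone wondering what PAR2-style protection actually does: PAR2 uses Reed-Solomon codes with many recovery blocks, but the core idea is visible in the degenerate one-block case, plain XOR parity, where one extra block lets you rebuild any single missing data block. A minimal sketch (toy data, not the real PAR2 format):

```python
def make_parity(blocks):
    """Build one XOR parity block over equal-sized data blocks."""
    parity = bytes(len(blocks[0]))
    for b in blocks:
        parity = bytes(x ^ y for x, y in zip(parity, b))
    return parity

def recover(remaining_blocks, parity):
    """XOR the surviving blocks with the parity to rebuild the lost one."""
    lost = parity
    for b in remaining_blocks:
        lost = bytes(x ^ y for x, y in zip(lost, b))
    return lost

data = [b"ray ", b"trac", b"ing!"]
p = make_parity(data)
# Pretend the middle block was lost; the parity block reconstructs it.
assert recover([data[0], data[2]], p) == data[1]
```

Real PAR2 generalizes this with Reed-Solomon arithmetic over GF(2^16) so it can survive multiple lost blocks, which is where the GPU-friendly bulk math comes in.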
 
Joined
Aug 31, 2016
Messages
104 (0.03/day)
Do those new RTX cards support 10-bit colour, or still 8-bit like the rest of the Nvidia GeForce series?

Unlikely. There is a pretty good explanation for that on Nvidia website:

"NVIDIA Geforce graphics cards have offered 10-bit per color out to a full screen Direct X surface since the Geforce 200 series GPUs. Due to the way most applications use traditional Windows API functions to create the application UI and viewport display, this method is not used for professional applications such as Adobe Premiere Pro and Adobe Photoshop. These programs use OpenGL 10-bit per color buffers which require an NVIDIA Quadro GPU with DisplayPort connector."

I don't think there is any reason for this to change, they have no interest in bringing pro features into mainstream.
 
Joined
Sep 15, 2007
Messages
3,946 (0.63/day)
Location
Police/Nanny State of America
Unlikely. There is a pretty good explanation for that on Nvidia website:

"NVIDIA Geforce graphics cards have offered 10-bit per color out to a full screen Direct X surface since the Geforce 200 series GPUs. Due to the way most applications use traditional Windows API functions to create the application UI and viewport display, this method is not used for professional applications such as Adobe Premiere Pro and Adobe Photoshop. These programs use OpenGL 10-bit per color buffers which require an NVIDIA Quadro GPU with DisplayPort connector."

I don't think there is any reason for this to change, they have no interest in bringing pro features into mainstream.

I don't think this is nvidia or AMD, but rather adobe and their delusions of grandeur, unless it's collusion for segmentation (which sounds illegal...not that anyone has given two shits in 20 yrs about megacorp legality).
 
Joined
Mar 10, 2010
Messages
11,878 (2.21/day)
Location
Manchester uk
Unlikely. There is a pretty good explanation for that on Nvidia website:

"NVIDIA Geforce graphics cards have offered 10-bit per color out to a full screen Direct X surface since the Geforce 200 series GPUs. Due to the way most applications use traditional Windows API functions to create the application UI and viewport display, this method is not used for professional applications such as Adobe Premiere Pro and Adobe Photoshop. These programs use OpenGL 10-bit per color buffers which require an NVIDIA Quadro GPU with DisplayPort connector."

I don't think there is any reason for this to change, they have no interest in bringing pro features into mainstream.
Pro features? I hardly call 10-bit pro; it comes in handy for HDR, and every AMD card out now can do it easily with a compatible monitor over HDMI or DisplayPort.
Also, such segregation offends my sensibilities. Yeah, some pro features can be kept for Quadro, but that's not one I agree with.
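The practical difference between 8-bit and 10-bit output is banding: with only 256 shades per channel, a smooth gradient turns into visible steps, which is exactly what HDR content exposes. A quick sketch of the quantization, counting how many distinct shades survive in a smooth 1001-sample ramp:

```python
def quantize(value, bits):
    """Map a 0..1 channel value onto an integer grid with 2**bits levels."""
    levels = (1 << bits) - 1
    return round(value * levels) / levels

# A smooth ramp collapses to 256 shades at 8 bits, but 10 bits
# (4x the levels) preserves every one of the 1001 input samples.
ramp = [i / 1000 for i in range(1001)]
steps_8 = len({quantize(v, 8) for v in ramp})
steps_10 = len({quantize(v, 10) for v in ramp})
print(steps_8, steps_10)
```

Fewer distinct output shades for the same input gradient is what you see on screen as banding, and it is most obvious in dark, slowly varying areas, i.e. exactly where HDR lives.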
 
Joined
Jun 28, 2016
Messages
3,595 (1.17/day)
I remember the days when you used to be able to trick the driver into thinking a GeForce card was a Quadro by faking a different device ID.
Why do that?
I wouldn't be surprised if ECC is supported, but they just put non-ECC DRAM on it for the consumer.
Of course the chip itself is compatible with ECC. But why would you want gamers to pay for ECC RAM?
2x? 2x what? Price?
2x the performance in the best-case scenario (tensor cores taking over AA). There are a few games which already support this.
It seems the average improvement is around 50% (not bad, right?).
- GP100’s SM incorporates 64 single-precision (FP32) CUDA Cores. In contrast, the Maxwell and Kepler SMs had 128 and 192 FP32 CUDA Cores, respectively.
GP100 has 64 FP32 and 32 FP64 cores per SM. It's the same with GV100.
Also, why the drama if you don't care that much? You've missed today's medicine or what?
 
Joined
Oct 14, 2017
Messages
210 (0.08/day)
I think the idea of dedicated anti-aliasing tensors is very good; no need to bother with special anti-aliasing parts in the rasterization process
I have nothing against ECC for gamers, I think it's also a good idea. Why should gamers have shit hardware?
When you "quadrify" a GeForce card you unlock two things: you bypass the artificial FP64 limitation of the Nvidia driver (which caps you at 1/4 of the hardware's FP64 throughput), and you enable hardware virtualization PCI passthrough by bypassing code 43 in the Nvidia driver
192 FP32 vs 64 FP32 units? Is that really something that people shouldn't care about?
If Nvidia didn't limit the FP64 capacity of GeForce cards, and the speed was there, developers would be using more of it. High-precision computation translates to higher-quality graphics, it's as simple as that; you wouldn't even need RTRT to look good
 
Joined
Jun 28, 2016
Messages
3,595 (1.17/day)
I think the idea of dedicated anti-aliasing tensors is very good; no need to bother with special anti-aliasing parts in the rasterization process
These are not "dedicated anti-aliasing tensors". Tensor cores are simply good at a certain type of operation - just like the general "CUDA" cores.
I have nothing against ECC for gamers, I think it's also a good idea. Why should gamers have shit hardware?
Why do you think ECC is so important? :eek:
192 FP32 vs 64 FP32 units? Is that really something that people shouldn't care about?
Exactly, they shouldn't. It's a product. Nvidia tells you what it's good for. Then there are reviews which check whether those claims are true.
If the product does what you want, buy it. If not, buy something else.
You're simply overconcerned about some magic numbers on the spec sheet.
developers would be using more of it. High-precision computation translates to higher-quality graphics, it's as simple as that; you wouldn't even need RTRT to look good
Actually, no. High precision doesn't translate into higher-quality graphics. For graphics you need many low-precision operations. This is exactly the reason why GPUs exist.
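The numbers back this up: single-precision rounding error is already orders of magnitude smaller than anything a display can show. A quick sketch comparing float32's worst-case error against one step of a 10-bit display:

```python
import struct

def to_f32(x):
    """Round a Python float (IEEE-754 double) to the nearest float32."""
    return struct.unpack('f', struct.pack('f', x))[0]

# Float32 rounding error for values in [0, 1) is at most ~2**-25 (~3e-8).
# One step of even a 10-bit-per-channel display is 1/1023 (~1e-3),
# tens of thousands of times coarser.
x = 0.123456789
err = abs(to_f32(x) - x)
display_step_10bit = 1 / 1023
print(err, display_step_10bit, err < display_step_10bit / 1000)
```

So for shading a pixel, FP32 (or even FP16) precision is already far beyond what reaches the screen; FP64 matters for scientific workloads where errors accumulate over billions of iterations, not for making frames look better.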
 
Joined
Oct 1, 2013
Messages
250 (0.06/day)
Unlikely. There is a pretty good explanation for that on Nvidia website:

"NVIDIA Geforce graphics cards have offered 10-bit per color out to a full screen Direct X surface since the Geforce 200 series GPUs. Due to the way most applications use traditional Windows API functions to create the application UI and viewport display, this method is not used for professional applications such as Adobe Premiere Pro and Adobe Photoshop. These programs use OpenGL 10-bit per color buffers which require an NVIDIA Quadro GPU with DisplayPort connector."

I don't think there is any reason for this to change, they have no interest in bringing pro features into mainstream.

Quick reminder that AMD has these features in their Vega FE though.
 
Joined
Mar 10, 2010
Messages
11,878 (2.21/day)
Location
Manchester uk
Quick reminder that AMD has these features in their Vega FE though.
Polaris and Vega can both output up to 12-bit AFAIK, and I have tried it.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.88/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
@Raevenlord I'd love to see a tiddly RT 1030 type card with full ray tracing capabilities present and correct, performing at 1fps or less, just for the fun of it. And yes, I'd be that nerd that bought it just to watch that slide show. :laugh: After all, I did that with the GT 1030 to complement my GTX 1080, but was actually surprised that it performs as well as it does for such a little GPU.
 

hat

Enthusiast
Joined
Nov 20, 2006
Messages
21,745 (3.30/day)
Location
Ohio
GTX580 seems twice as fast... but it beats the much older GTX285. Still, I was rocking a GTX260 at one time... admirable that such low end "garbage" is actually able to perform somewhat.
 
Joined
Dec 5, 2017
Messages
157 (0.06/day)
As I understand it, process defects at the foundry are largely responsible for product segmentation. For instance (pulling numbers out of my ass here, because I don't know the chip names that well) they may be "trying" to make GP100 chips at the foundry, and the perfect chips get put on Quadros and such, and the "defective" chips (which they may call GP102 or something) may get slapped on a Titan or high end GTXwhatever, with some shader units disabled or something, wherever the defect was. This theory seems to be supported by that one Quadro in the current lineup that is faster than even the current Titan in games, by a margin that isn't insignificant.

While what you're describing definitely is a practice (a la x80 vs x70, or i7 vs i5, etc.), GP100 and GP102 are a different matter: NVIDIA made two different high-end chips because of the additional die size that FP64 and other niche professional features require. Additionally, GP100 uses HBM2 while GP102 uses GDDR5X. GP102 is significantly cheaper for NVIDIA to produce.
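The cost gap compounds, because a bigger die both fits fewer times on the wafer and yields worse. A back-of-the-envelope sketch, reusing the Poisson yield idea from earlier in the thread, with a hypothetical wafer cost and defect density (only the 471 mm^2 / 610 mm^2 die areas are real GP102/GP100 figures):

```python
import math

def gross_dies(wafer_diameter_mm, die_area_mm2):
    """Rough dies-per-wafer estimate: wafer area over die area, minus edge loss."""
    r = wafer_diameter_mm / 2
    side = math.sqrt(die_area_mm2)
    return int(math.pi * r**2 / die_area_mm2 - math.pi * wafer_diameter_mm / side)

def cost_per_good_die(wafer_cost, wafer_diameter_mm, die_area_mm2, defect_density):
    """Spread the wafer cost over only the defect-free dies (Poisson yield)."""
    yield_frac = math.exp(-defect_density * die_area_mm2)
    return wafer_cost / (gross_dies(wafer_diameter_mm, die_area_mm2) * yield_frac)

# Hypothetical $7000 wafer and 0.001 defects/mm^2, just to show the trend.
for name, area in (("GP102-sized (471 mm^2)", 471), ("GP100-sized (610 mm^2)", 610)):
    print(name, round(cost_per_good_die(7000, 300, area, 0.001), 2))
```

Even before HBM2 vs GDDR5X enters the picture, the bigger die costs meaningfully more per good chip, which is the whole economic argument for a separate gaming die.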
 

AsRock

TPU addict
Joined
Jun 23, 2007
Messages
19,083 (3.00/day)
Location
UK\USA
Why does this article act like this is new info? Everyone knows the 2060 is very unlikely to have RT cores - Nvidia wants its sheep to pay $600 for cards from now on.

Furthermore the x60 series is now basically made for laptops first, and the new RT cores are HORRIBLY inefficient. It makes substantially more sense to create a Turing die that lacks RT and Tensor cores so they can make it a tiny <200mm^2 die that only uses 50-100w.

Yes, but they are marketing the whole range as RTX when really it is not. They know full well the low-end ones will sell, since companies like Best Buy will sell them in a system telling buyers they have ray tracing, and by the time it becomes a thing it will be way too late for the people who got the cards to do anything about it.

Another nVidia rip-off.
 
Joined
Jun 10, 2014
Messages
2,987 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
"NVIDIA Geforce graphics cards have offered 10-bit per color out to a full screen Direct X surface since the Geforce 200 series GPUs. Due to the way most applications use traditional Windows API functions to create the application UI and viewport display, this method is not used for professional applications such as Adobe Premiere Pro and Adobe Photoshop. These programs use OpenGL 10-bit per color buffers which require an NVIDIA Quadro GPU with DisplayPort connector."

I don't think there is any reason for this to change, they have no interest in bringing pro features into mainstream.
Their Linux drivers do support 10-bit per channel for all recent GeForce cards, though.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,171 (2.81/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Why do that?
Because it unlocked some of the professional features that weren't a hardware limitation at the time; a lot of things were just blocked by the driver. Remember ever unlocking pixel or vertex pipelines via software?
Of course the chip itself is compatible with ECC. But why would you want gamers to pay for ECC RAM?
...but why does it support ECC if it's not being used by the consumer? Usually that's because it has an application where ECC is required. Hence me jumping to the conclusion that these are just professional cards that were slightly altered for the consumer because they had already been built. It could also be that demand for those cards wasn't as high as anticipated and they are repurposing the chips instead.

The big thing is: if these are professional cards with a lot more hardware in them, how is that going to impact clock speed and heat? Usually when this kind of thing happens, things don't live up to expectations, and that's usually because despite all the compute hardware, it isn't really giving the GPU an advantage when it comes to rendering games.

Their Linux drivers does support 10-bit per channel for all recent GeForce cards though.
Too bad you're practically selling your soul by using nVidia's closed-source drivers. I'm still waiting for them to release their firmware so nouveau can suck a little less, but obviously nVidia has no intent to make the open-source community happy.
 
Joined
Jun 28, 2016
Messages
3,595 (1.17/day)
Remember ever unlocking pixel or vertex pipelines via software?
No...?
It's been 10 years since I stopped overclocking, modding, unlocking and so on. It was fun when I was a teenager, but it really seems to have been a waste of time. I should have just spent more time outside or learned German instead. :)
...but why does it support ECC even if it's not being used for the consumer? Usually that's because it has an application where ECC is required. Hence me jumping to the conclusion that there are just professonal cards that were slightly altered for the consumer because they had already been built. It also could be the case that the demand for these cards wasn't as high as they had anticipated and are repurposing these chips instead.
No. "Gaming" cards are not altered "pro" cards. It's just a chip which can do multiple things; firmware and drivers tell it what to do.
The reason why gaming chips have ECC functionality included (but blocked) is fairly simple: there's nothing special about ECC. Pretty much all CPUs and GPUs sold today are ECC-compliant.
Manufacturers are using ECC as a factor in lineup segmentation, which really makes sense.
It's not like ECC is crucial for what most people do with their PCs and ECC RAM is more expensive.

Sure, we could suddenly make all computers use ECC.
But sooner or later someone would notice that consumers are paying a premium for a feature they don't need, so it must be a conspiracy by chip makers! Just look how people reacted to RTX. :)
And the overclocking crowd would soon decide that they miss the non-ECC times, because they could OC higher. :)
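To make "there's nothing special about ECC" concrete: the classic scheme behind ECC memory is a Hamming code, where a few parity bits pinpoint and fix any single flipped bit. A toy Hamming(7,4) sketch (real ECC DIMMs use a wider SECDED variant over 64-bit words, but the mechanism is the same):

```python
def hamming74_encode(nibble):
    """Encode 4 data bits as 7 bits with 3 parity bits (Hamming(7,4))."""
    d = [(nibble >> i) & 1 for i in range(4)]   # data bits d0..d3
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    # Codeword bit positions 1..7: p1 p2 d0 p3 d1 d2 d3
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def hamming74_correct(code):
    """Recompute the parities; the syndrome is the 1-based index of a flipped bit."""
    c = code[:]
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # covers positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # covers positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # covers positions 4,5,6,7
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        c[syndrome - 1] ^= 1          # flip the corrupted bit back
    return c

word = hamming74_encode(0b1011)
corrupted = word[:]
corrupted[5] ^= 1                     # simulate a single bit flip in memory
assert hamming74_correct(corrupted) == word
```

The extra bits are pure overhead when nothing flips, which is exactly the premium most consumers would be paying for a correction event they may never hit.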
 