
NVIDIA Builds Exotic RTX 4070 From Larger AD103 by Disabling Nearly Half its Shaders

Joined
Jun 22, 2014
Messages
446 (0.12/day)
System Name Desktop / "Console"
Processor Ryzen 5950X / Ryzen 5800X
Motherboard Asus X570 Hero / Asus X570-i
Cooling EK AIO Elite 280 / Cryorig C1
Memory 32GB Gskill Trident DDR4-3600 CL16 / 16GB Crucial Ballistix DDR4-3600 CL16
Video Card(s) RTX 4090 FE / RTX 2080ti FE
Storage 1TB Samsung 980 Pro, 1TB Sabrent Rocket 4 Plus NVME / 1TB Sabrent Rocket 4 NVME, 1TB Intel 660P
Display(s) Alienware AW3423DW / LG 65CX Oled
Case Lian Li O11 Mini / Sliger CL530 Conswole
Audio Device(s) Sony AVR, SVS speakers & subs / Marantz AVR, SVS speakers & subs
Power Supply ROG Loki 1000 / Silverstone SX800
VR HMD Quest 3
AMD did the same with the entire Navi 21 stack, from the 6950 XT down to the 6800 non-XT.
In any case, AMD made their octa-cores from scrapped dual-CCD counterparts and sold them to thousands of people. No one would have known until the problems started to pop up. And I doubt that these GPUs can be any worse than that.
We can go back much, much further. I had a Radeon 6950 that you could bios flash to unlock to a 6970 (read about it right here on TPU back in the day, thanks Wiz! lol), and an Athlon X3 that could potentially unlock to the quad core variant. That was before the days of physically fusing off portions of the silicon. 2010 to be exact....wow, I feel really old now.

All of this binning/fusing/repurposing has been going on for at least 14 years now; that is the earliest experience I can remember having with it. I do not even want to imagine the cost of CPUs/GPUs if this were not standard practice.
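That practice falls straight out of yield math. As a toy illustration (the defect density below is an assumed number, not NVIDIA's actual figure), the classic Poisson yield model shows how few large dies come out fully working:

```python
import math

def poisson_yield(area_cm2: float, defects_per_cm2: float) -> float:
    """Fraction of dies expected to have zero defects (Poisson yield model)."""
    return math.exp(-area_cm2 * defects_per_cm2)

# Illustrative inputs: AD103 is roughly 379 mm^2 = 3.79 cm^2;
# the defect density here is an assumption, not a published figure.
area_cm2 = 3.79
d0 = 0.1

perfect = poisson_yield(area_cm2, d0)
print(f"fully working dies: {perfect:.0%}")  # roughly two thirds
# The rest mostly have localized defects, and a die whose defect lands
# in a disableable shader block can still be sold as a cut-down SKU
# instead of being scrapped.
```

Harvesting those partial dies is exactly why heavily cut-down SKUs like this 4070 exist.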
 
Joined
Aug 2, 2012
Messages
1,986 (0.44/day)
Location
Netherlands
System Name TheDeeGee's PC
Processor Intel Core i7-11700
Motherboard ASRock Z590 Steel Legend
Cooling Noctua NH-D15S
Memory Crucial Ballistix 3200/C16 32GB
Video Card(s) Nvidia RTX 4070 Ti 12GB
Storage Crucial P5 Plus 2TB / Crucial P3 Plus 2TB / Crucial P3 Plus 4TB
Display(s) EIZO CX240
Case Lian-Li O11 Dynamic Evo XL / Noctua NF-A12x25 fans
Audio Device(s) Creative Sound Blaster ZXR / AKG K601 Headphones
Power Supply Seasonic PRIME Fanless TX-700
Mouse Logitech G500S
Keyboard Keychron Q6
Software Windows 10 Pro 64-Bit
Benchmark Scores None, as long as my games runs smooth.
Can't happen. It's all about pushing Ray Tracing with Nvidia for years now. Hence the RTX branding.
You know path tracing is the future right? Eventually an iGPU will be able to render it.
 
Joined
Nov 26, 2021
Messages
1,648 (1.50/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
You know path tracing is the future right? Eventually an iGPU will be able to render it.
Given how poorly even today's highest-end hardware performs in path-traced games, and taking into account that non-Apple iGPUs selling today aren't even as fast as a 290X or a 780 Ti from over ten years ago, that's a rather optimistic prediction.

[attached chart]
 

64K

Joined
Mar 13, 2014
Messages
6,773 (1.73/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) Temporary MSI RTX 4070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Temporary Viewsonic 4K 60 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
You know path tracing is the future right? Eventually an iGPU will be able to render it.
Yes, but bear in mind that games only get more and more demanding on hardware as the years go by. That is what has held back 4K adoption since 4K came on the market 12 years ago. The price of a 4K monitor has been down to reasonable levels for years now, but the adoption rate is still only around 3.5%.
 
Joined
May 13, 2022
Messages
70 (0.08/day)
Nvidia is anxious to sell us all the latest and greatest expensive hardware so we can REALLY get excited about playing:.........Fortnite?.....GTA V Online?.....Witcher III? Sorry, I'm neglecting the big games.....Starfield.....errr....Cyberpunk 2077's benchmark mode.....uhhhhh. Helldivers 2? No wait, that runs even on AMD APUs......

Release all the hardware you want; there comes a point where nobody cares anymore when the software is garbage, and software is garbage because instead of making good games based on previous sales figures, all they are doing is forcing devs to focus on the buzzwords. "Our game NEEDS AI and it needs PATH TRACING......and NFTs!!!....huh?...Oh..wait....<side conversation>...ok, the NFTs can be patched in later!".....

Smells like 83/84 again......"so all the new software is shovelware but they still want us to buy new systems every 2 years to re-play the old games but with upgraded graphics, you say?" :)
 
Joined
May 18, 2005
Messages
72 (0.01/day)
Given how poorly even today's highest-end hardware performs in path-traced games, and taking into account that non-Apple iGPUs selling today aren't even as fast as a 290X or a 780 Ti from over ten years ago, that's a rather optimistic prediction.

View attachment 345628
... and then you turn on frame generation and it's nice and smooth with an imperceptible latency difference. That "useless" AI stuff people love to whine about. Also DLSS, but I don't know how it looks at 1080p DLSS Quality, as I haven't run that low a resolution since 2004. I've been on 1600p since 2008, and 4K since 2014. DLSS Quality at those resolutions is so close to flawless in every game I've tried it in that it's basically a free performance boost.
 
Last edited:
Joined
Jan 7, 2022
Messages
131 (0.12/day)
Processor Intel i5 9400f
Motherboard MSI Z390 Gaming Plus
Cooling SilentiumPC Fera 3
Memory 2x 8GB Corsair Vengeance LPX 3200 16-18-18-36
Video Card(s) MSI GTX 1660 Super Ventus XS OC
Storage 500 GB Kingston A2000; 1 TB Kingston A2000; 1 TB HGST Travelstar
Display(s) AOC 24G2U
Case SilentiumPC Signum SG1
Audio Device(s) Creative Pebble Plus; Logitech G533
Power Supply beQuiet Systempower 9 500W
Mouse Logitech G403 Hero
Keyboard HyperX Alloy Origins (Red)
Software Windows 10 Pro
So about 8-10% more power-hungry in the case of the 2060 KO. Not a disaster, but not great either.
Keep in mind that the power limits are different for the cards in that graph:

So it's less than 5%
 
Joined
Mar 10, 2024
Messages
12 (0.05/day)
Location
Hungary
System Name Main rig
Processor Intel Core i5-14600k
Motherboard TUF GAMING B760M-PLUS
Cooling Be Quiet! DARK ROCK PRO 5
Memory 32 GB DDR5 6000 MHz
Video Card(s) RTX 3060 Ti GDDR6X
Storage Kingston KC3000 1TB, Samsung 970 evo plus, Kingmax 480 GB SSD, Western Digital WD Red Plus 3.5 3TB
Display(s) 1080p
Case Fractal Design Focus 2
Power Supply Seasonic FOCUS GX Series 750W
If I remember correctly, the 2060 KO was inexplicably a small bit faster than the regular 2060 in certain workloads (beyond the margin of error), despite having been cut down to the same specs. It would be interesting to see if this is also the case with these.
 
Joined
Apr 19, 2018
Messages
1,227 (0.51/day)
Processor AMD Ryzen 9 5950X
Motherboard Asus ROG Crosshair VIII Hero WiFi
Cooling Arctic Liquid Freezer II 420
Memory 32Gb G-Skill Trident Z Neo @3806MHz C14
Video Card(s) MSI GeForce RTX2070
Storage Seagate FireCuda 530 1TB
Display(s) Samsung G9 49" Curved Ultrawide
Case Cooler Master Cosmos
Audio Device(s) O2 USB Headphone AMP
Power Supply Corsair HX850i
Mouse Logitech G502
Keyboard Cherry MX
Software Windows 11
Have nGreedia stopped manufacturing the smaller 40x0 chips to get ready for an early 50x0 launch? Probably not, but fun to speculate.
 
Joined
Feb 20, 2019
Messages
8,284 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Given how poorly even today's highest-end hardware performs in path-traced games, and taking into account that non-Apple iGPUs selling today aren't even as fast as a 290X or a 780 Ti from over ten years ago, that's a rather optimistic prediction.

View attachment 345628
Yeah, I've been on the RTX train since launch and threw a 3090 at path tracing when it first popped up in CP2077, which was a total disaster.
So I tried it on a 4070 (non-super) and it was pretty lousy even with balanced DLSS, ray-reconstruction and frame-gen. Even if I was getting a reasonable framerate thanks to frame-gen, the number of actual raytracing samples it was working with at a reduced resolution every other frame made the resulting image quality absolute shit-tier garbage in motion.

Sure, path-tracing looks great in still screenshots once you've had a couple of seconds of static image to fill in all the missing data over the course of 100+ frames, but the in-motion, per-frame level of noise and distortion is abysmal on the 4070. With the 4090 having roughly double the raytracing performance in CP2077 according to this chart, I can't imagine the 4090 experience is much better. Ray reconstruction makes the reflections look pretty decent but everything else is a laggy, shadow-crawling mess of delayed lighting and out-of-date temporal information in the scene.

[attached chart]

Honestly, I liked the raytraced reflections in CP2077, but the shadow and GI stuff isn't great even without path-tracing; it simply doesn't work in motion, only in static screenshots (by which I mean screenshots taken when the camera is static). With path tracing and ray-reconstruction, the GI and reflections are fantastic, but the low framerate means that the already-dubious RT shadow and ambient-occlusion delay is an even worse, unmissable eyesore that sticks out so badly it completely ruins the entire image in motion.

There's simply not enough raw information per frame to handle shadows in realtime. "Slow light" simply doesn't work in an FPS game where, even when you're stood still, the shadows crawl like a Lovecraftian ooze and every moving object has a bright trail absent of shadow where shadow should be at its darkest - as well as any new part of the image coming into view as you glance sideways being fullbright for several frames while the temporal filter slowly fills in shadows and occlusion. It's hard to ignore when it's often half of your entire screen lit incorrectly in something as simple as turning your head to look left at a street junction or the turn of a corridor.
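For anyone curious, that "temporal filter slowly fills in" behaviour is easy to sketch. This is a toy exponential-moving-average accumulator, not the actual denoiser CDPR/Nvidia use:

```python
import random

def accumulate(history, new_sample, alpha=0.05, history_valid=True):
    """Toy temporal accumulator: blend a new noisy sample into the history.
    On disocclusion (no valid history, e.g. the pixel just came into view)
    we restart from a single raw sample -> maximum noise for that pixel."""
    if not history_valid:
        return new_sample
    return history * (1 - alpha) + new_sample * alpha

random.seed(0)
true_value = 0.5   # the radiance this pixel should converge to
pixel = None
for frame in range(100):
    sample = random.uniform(0.0, 1.0)   # one noisy path-traced sample
    pixel = accumulate(pixel, sample, history_valid=pixel is not None)

print(f"after 100 static frames: {pixel:.2f}")  # settles near 0.5
# Turn the camera: history is invalidated and quality collapses to 1 sample.
pixel = accumulate(pixel, random.uniform(0.0, 1.0), history_valid=False)
```

The key point is the last line: as soon as a pixel's history is invalidated (camera turn, disocclusion), you're back to one noisy sample per frame and the splotches return.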

I'm at work, but if I remember I'll take a screenshot at home of CP2077 at 4K with DLSS 3.5, frame-gen, ray-reconstruction and path tracing as I turn my camera to check a street junction. From memory, these screenshots look like total ass for around 1/3rd of the image that's just come into view. Anyone accepting that as superior image quality to baked-and-faked is crazy, in my opinion.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.64/day)
Location
Ex-usa | slava the trolls
Yes, but bear in mind that games only get more and more demanding on hardware as the years go by. That is what has held back 4K adoption since 4K came on the market 12 years ago. The price of a 4K monitor has been down to reasonable levels for years now, but the adoption rate is still only around 3.5%.

No. There is very strong anti-4K propaganda. You actually get higher image quality if you switch from 1K to 4K than if you stay at 1K with some glossy effects aka ray-tracing...
 
Joined
Feb 20, 2019
Messages
8,284 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
So I said I'd post two path-traced screenshots - this is raytracing in motion vs a static scene. I took a screenshot during a camera move that represents actual gameplay, and then, after pasting the image into Photoshop, I took a second screenshot without movement, framed as close to the first screenshot as possible.

In-motion screenshot as I turn to face the street on the stairs to V's Megacomplex apartment:
[attached screenshot]
This screenshot is 100% authentic gameplay and highlights the shortcomings of both DLSS and raytracing, with its jaggies and godawful splotchy lighting. There's obvious visible crawl of any and all lighting in the scene, and you can repeat this experiment yourself by simply enabling Overdrive and hitting PrintScreen in typical gameplay motion. The uneven shadow/lighting mess in the screenshot above isn't static; it crawls like a tentacle-endowed Lovecraftian horror, so an animated .gif of this in slow-mo would look orders of magnitude worse!

What an RT Overdrive screenshot looks like if you give it a second or so (40-50 frames of temporal data):
[attached screenshot]
This is what RT frames should look like, but this is the temporal average of several hundred frames and doesn't represent real gameplay! In other words, it's a hoax. The lighting is a total mess in motion and it looks like ass - nothing like this second static screenshot. Yes, I said 'ass' because I lack the vocabulary to better describe this splotchy, irregular mess, but if the 4070 isn't remotely close to a reasonable RT lighting output, I'm going to guess that the 4090 is only half as bad, and "semi-ass" is no compliment for $2000 of cutting-edge hardware that's beyond the reach of 99% of all gamers.

Meanwhile, here is what "raster" looks like (in motion) to compare against the first screenshot in this post:
[attached screenshot]
This "fake" lighting will run at 160+ FPS without framegen lag on my 4070. YouTube proves that you can get decent, true 60+ fps results from entry-level GPUs costing $200, which is a miles better experience than the roughly 30 fps plus framegen input lag of path-tracing on my far more expensive $600 4070. Sure, the lighting is different, but it isn't anywhere near as bad as the splotchy, blurry RTX mess. Anyone with half-decent hardware can make their own decision, but I'm not personally keen on the inaccurate, Lovecraftian nightmare of RTX realtime* lighting. It's low-quality, inconsistent garbage that requires you to spend quadruple on hardware for the low-quality result.

Realistically, human vision isn't a 1:1 translation of screenshots, but you'd need to be practically blind to miss the artifacts and problems with realtime RT. We're so far off the 'realtime' RT reality that Nvidia aspires to that I don't think we'll ever get there. Game developers increase scene complexity to match GPU capabilities. Unless a GPU magically appears with 20x more horsepower than current expectations, the in-game scenes will always be too complex to render on any given GPU of the same generation.

* - intentional obvious sarcasm.

Let us not forget that this example I'm using is the definitive Nvidia-sponsored, Nvidia-funded, total RTX experience in its best possible light. No other game comes close to this level of DLSS and raytracing development effort.
 
Last edited:
Joined
Apr 19, 2018
Messages
1,227 (0.51/day)
Processor AMD Ryzen 9 5950X
Motherboard Asus ROG Crosshair VIII Hero WiFi
Cooling Arctic Liquid Freezer II 420
Memory 32Gb G-Skill Trident Z Neo @3806MHz C14
Video Card(s) MSI GeForce RTX2070
Storage Seagate FireCuda 530 1TB
Display(s) Samsung G9 49" Curved Ultrawide
Case Cooler Master Cosmos
Audio Device(s) O2 USB Headphone AMP
Power Supply Corsair HX850i
Mouse Logitech G502
Keyboard Cherry MX
Software Windows 11
So I said I'd post two path-traced screenshots - this is raytracing in motion vs a static scene. I took a screenshot during a camera move that represents actual gameplay, and then, after pasting the image into Photoshop, I took a second screenshot without movement, framed as close to the first screenshot as possible.

In-motion screenshot as I turn to face the street on the stairs to V's Megacomplex apartment:
View attachment 345792
This screenshot is 100% authentic gameplay and highlights the shortcomings of both DLSS and raytracing, with its jaggies and godawful splotchy lighting. There's obvious visible crawl of any and all lighting in the scene, and you can repeat this experiment yourself by simply enabling Overdrive and hitting PrintScreen in typical gameplay motion. The uneven shadow/lighting mess in the screenshot above isn't static; it crawls like a tentacle-endowed Lovecraftian horror, so an animated .gif of this in slow-mo would look orders of magnitude worse!

What an RT Overdrive screenshot looks like if you give it a second or so (40-50 frames of temporal data):
View attachment 345793
This is what RT frames should look like, but this is the temporal average of several hundred frames and doesn't represent real gameplay! In other words, it's a hoax. The lighting is a total mess in motion and it looks like ass - nothing like this second static screenshot. Yes, I said 'ass' because I lack the vocabulary to better describe this splotchy, irregular mess, but if the 4070 isn't remotely close to a reasonable RT lighting output, I'm going to guess that the 4090 is only half as bad, and "semi-ass" is no compliment for $2000 of cutting-edge hardware that's beyond the reach of 99% of all gamers.

Meanwhile, here is what "raster" looks like (in motion) to compare against the first screenshot in this post:
View attachment 345794
This "fake" lighting will run at 160+ FPS without framegen lag on my 4070. YouTube proves that you can get decent, true 60+ fps results from entry-level GPUs costing $200, which is a miles better experience than the roughly 30 fps plus framegen input lag of path-tracing on my far more expensive $600 4070. Sure, the lighting is different, but it isn't anywhere near as bad as the splotchy, blurry RTX mess. Anyone with half-decent hardware can make their own decision, but I'm not personally keen on the inaccurate, Lovecraftian nightmare of RTX realtime* lighting. It's low-quality, inconsistent garbage that requires you to spend quadruple on hardware for the low-quality result.

Realistically, human vision isn't a 1:1 translation of screenshots, but you'd need to be practically blind to miss the artifacts and problems with realtime RT. We're so far off the 'realtime' RT reality that Nvidia aspires to that I don't think we'll ever get there. Game developers increase scene complexity to match GPU capabilities. Unless a GPU magically appears with 20x more horsepower than current expectations, the in-game scenes will always be too complex to render on any given GPU of the same generation.

* - intentional obvious sarcasm.

Let us not forget that this example I'm using is the definitive Nvidia-sponsored, Nvidia-funded, total RTX experience in its best possible light. No other game comes close to this level of DLSS and raytracing development effort.
Absolutely correct. True RT is about 12-15 years away, at least, maybe. Everything so far is tricks, and not very good ones at that.

I would guess the higher end cards of the RTX 8000 series might be the first to do real-time RT at 720p 30fps.

TBH we need a new way of rendering games; I'm not sure current-style RT rendering can ever be truly realtime.
 
Joined
Feb 20, 2019
Messages
8,284 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
I would guess the higher end cards of the RTX 8000 series might be the first to do real-time RT at 720p 30fps.
In a 2020 game like CP2077? Sure I can believe that. They'll still need to use temporal dithering but perhaps they have enough samples in just 2 frames to make a decent image, rather than requiring 20 frames to get anywhere close to a decent output.
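That frames-vs-quality trade is just Monte Carlo statistics: noise falls as 1/√N, so 10x the accumulated samples only buys about 3.2x less noise. A quick toy sketch (illustrative numbers, nothing measured from a real renderer):

```python
import random
import statistics

def estimate(n_samples):
    """Monte Carlo estimate of a pixel's radiance (true value here is 0.5)."""
    return sum(random.uniform(0.0, 1.0) for _ in range(n_samples)) / n_samples

def noise(n_samples, trials=2000):
    """Standard deviation of the estimate across many independent trials."""
    return statistics.stdev(estimate(n_samples) for _ in range(trials))

random.seed(1)
n2, n20 = noise(2), noise(20)
print(f"2 samples: +/-{n2:.3f}, 20 samples: +/-{n20:.3f}, ratio = {n2/n20:.1f}")
# Theory says the ratio should be sqrt(20/2) = 3.16: ten times the
# samples for only ~3x less noise, which is why brute-forcing path
# tracing gets expensive so quickly.
```

That square-root scaling is why denoisers and temporal reuse matter far more than raw ray throughput.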

The problem is that game graphics keep getting pushed forward to match the capabilities of current GPUs, so once the RTX 8000 series arrives, we will probably have games that push an order of magnitude more polygons...
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
Unless a GPU magically appears with 20x more horsepower than the current expectations, the in-game scenes will always be too complex to render on any given GPU of the same generation.
The same claim is made literally every time a new GPU generation is released, and it's always disproven a few generations later. Especially given that this is only the third generation of GPUs with dedicated ray-tracing hardware, it seems rather premature to claim that real-time ray-tracing will never be possible.
 
Joined
Feb 20, 2019
Messages
8,284 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
The same claim is made literally every time a new GPU generation is released, and it's always disproven a few generations later. Especially given that this is only the third generation of GPUs with dedicated ray-tracing hardware, it seems rather premature to claim that real-time ray-tracing will never be possible.
My point is that game requirements increase to keep pace with hardware, so it may be a loooong time until RT is the default render path.
When the RTX 11070TiSuper comes out in 2034, you can be sure there will be game developers making games that will bring it to its knees and max out its VRAM.
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
My point is that game requirements increase to keep pace with hardware, so it may be a loooong time until RT is the default render path.
When the RTX 11070TiSuper comes out in 2034, you can be sure there will be game developers making games that will bring it to its knees and max out its VRAM.
And Cyberpunk is one such game, yet you are using this outlier to justify your argument that real-time RT/PT will in general never be the default renderer. This basic fallacy entirely invalidates the claim being made.
 
Joined
Feb 20, 2019
Messages
8,284 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
And Cyberpunk is one such game, yet you are using this outlier to justify your argument that real-time RT/PT will in general never be the default renderer. This basic fallacy entirely invalidates the claim being made.
I picked Cyberpunk because it's clearly the game that's had the most work and effort put into it by Nvidia for RTX features. It's definitely an outlier, but it's the BEST CASE SCENARIO for current raytracing, not a poor example that doesn't represent raytracing fairly.

More realistic examples of fully-raytraced games are Portal RTX and Quake II RTX. For Quake II, it took a 2080 Ti to get 4K60 path-traced - in a game that was 22 years old at that point.
 
Joined
Apr 19, 2018
Messages
1,227 (0.51/day)
Processor AMD Ryzen 9 5950X
Motherboard Asus ROG Crosshair VIII Hero WiFi
Cooling Arctic Liquid Freezer II 420
Memory 32Gb G-Skill Trident Z Neo @3806MHz C14
Video Card(s) MSI GeForce RTX2070
Storage Seagate FireCuda 530 1TB
Display(s) Samsung G9 49" Curved Ultrawide
Case Cooler Master Cosmos
Audio Device(s) O2 USB Headphone AMP
Power Supply Corsair HX850i
Mouse Logitech G502
Keyboard Cherry MX
Software Windows 11
Another thing that doesn't help is nGreedia's glacial pace at improving RT performance. It's really only the current-gen series that has moved the needle substantially on RT performance. I sincerely hope that the Blackwell series also has a marked uptick in RT performance. RT is almost useless on anything less than a 4090. That level of RT performance needs to be available on the 5070-series cards to help bring playable RT to the "mid range" - obviously I know we are not talking of full RT here.
 
Joined
Feb 20, 2019
Messages
8,284 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Another thing that doesn't help is nGreedia's glacial pace at improving RT performance. It's really only the current-gen series that has moved the needle substantially on RT performance.
What? 2080Ti > 3080Ti was a bigger jump in RT performance than 3090 to 4090. Maybe what you say is true for other parts of the product stack but nobody is doing the latest detailed benchmarks with mid-tier 20-series by the looks of it.

[attached chart]


I'm not sure whether the 2080Ti can path trace. I'm seeing no benchmarks anywhere, so maybe it's a "30-series-and-above" feature in CP2077, the single heaviest RT title I can think of.
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
What? 2080Ti > 3080Ti was a bigger jump in RT performance than 3090 to 4090. Maybe what you say is true for other parts of the product stack but nobody is doing the latest detailed benchmarks with mid-tier 20-series by the looks of it.

View attachment 345957

I'm not sure whether the 2080Ti can path trace. I'm seeing no benchmarks anywhere, so maybe it's a "30-series-and-above" feature in CP2077, the single heaviest RT title I can think of.
You're responding to someone who unironically uses the phrase "ngreedia", don't expect much in the way of capability to absorb facts.
 
Joined
Dec 25, 2020
Messages
6,778 (4.73/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~

mubarak ahmed

New Member
Joined
May 4, 2024
Messages
4 (0.02/day)
Nice find. Are there any clues regarding the approximate manufacturing date of your card in its box, BIOS or documentation?
I tried searching the box for a manufacturing date but didn't find anything, and the BIOS is not available in the database.
 