
NVIDIA GeForce RTX 4070 Founders Edition

Joined
Jul 1, 2022
Messages
46 (0.06/day)
Processor 12th Gen i7-12700KF Stock
Motherboard MSI B660M Pro M-A WIFI
Memory 4x16GB DDR4 3200 CL16 @ 3466 CL16
Video Card(s) NVIDIA RTX 4070 SUPER
Storage 4x HDDs, 2x SATA SSDs, 2x NVME SSDs.
Display(s) 1920x1080 180Hz AOC, 1920x1080 60 Hz BenQ.
That's why the best time to buy a GPU is two years after a console launch, then keep said GPU for 6+ years.

This has been true for a while. Two years after the Xbox came out, we got the Radeon 9800 Pro/XT, which worked well for years afterwards. Two years after the PS3, we got the GeForce 9800 series / Radeon HD 3000 series, again working for most of those consoles' lifecycles. Two years after the PS4, we got the GTX 900 series. And two years after the PS5, we are getting Ada/RDNA 3.

Agreed with your 1st statement.

But with regard to the NVIDIA 9800/ATI HD 3000 series, those DX10 cards had really poor longevity. DX10 (launched with Vista in 2006) was a transitional phase that most game devs ignored; they just kept making DX9 games, with some providing DX10 executables, and no one made DX10-exclusive titles that didn't also have a DX9 fallback renderer. The industry then moved rapidly to DX11 only three years later, in 2009, when Windows 7 released, and once that happened, those DX10-only cards became obsolete quickly.

DX11 would become the longest-running API and is STILL used in games to this day, over 14 years after its introduction, so those first-gen Radeon HD 5800/NVIDIA GTX 400 series cards arguably had the longest lifespan of them all in terms of feature-set compatibility.
 
Joined
Oct 31, 2022
Messages
170 (0.27/day)
Still $50-100 too expensive; it should have been a $499 card. But prices might come down, especially after AMD shows their cards...
Other than that, it's exactly what I expected: performance close to the 3080 with a smaller price tag (locally, at least) and 40% lower power usage. Although I thought RT might have been a bit stronger.

Many people still complain; they wouldn't if it were $499 OR had 16 GB of VRAM...
 
Joined
Oct 28, 2012
Messages
1,159 (0.27/day)
Processor AMD Ryzen 3700x
Motherboard asus ROG Strix B-350I Gaming
Cooling Deepcool LS520 SE
Memory crucial ballistix 32Gb DDR4
Video Card(s) RTX 3070 FE
Storage WD SN550 1TB / WD SATA SSD 1TB / WD Black SN750 1TB / Seagate 2TB / WD Book 4TB backup
Display(s) LG GL850
Case Dan A4 H2O
Audio Device(s) sennheiser HD58X
Power Supply Corsair SF600
Mouse MX master 3
Keyboard Master Key Mx
Software win 11 pro
No plans for that. The goal is actually to make short ~5-minute summary videos. I can't stand those 20+ minute videos that dance around the topic just to get preferred by the YT algorithm because they are longer than 20 minutes.
:D:D
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
4,718 (1.96/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans removed
Cooling Optimus Block, HWLABS Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 2x A4x10, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front panel pump/res combo
Audio Device(s) Audeze Maxwell Ultraviolet, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper Pro V2 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape
Keyboard Wooting 60HE+ module, TOFU Redux Burgundy w/brass weight, Prismcaps White, Jellykey, lubed/modded
Software Windows 10 IoT Enterprise LTSC 19044.4046
Benchmark Scores Legendary
W1zzard, who refused to put the RX 6950 XT on the charts? Or, for that matter, the RTX 2070 Super? Doesn't even bother retesting older cards with newer drivers to see the improvements from both camps.
FYI the entire GPU test bench was updated in January, including updated drivers.

 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,283 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
W1zzard, who refused to put the RX 6950 XT on the charts?
Send me a 6950 XT ref, I'll add it to the reviews. Just not seeing much of a reason to buy one for the charts

Or, for that matter, the RTX 2070 Super?
Not sure how 2070S is relevant to this review?

retesting older cards with newer drivers
Huh? I've retested every single card in this review in January on the newest drivers
 
Joined
Mar 28, 2023
Messages
22 (0.05/day)
Location
Argentina
Processor Ryzen 5 5600 PBO 4.6GHz
Motherboard Asrock B550 Phantom Gaming 4
Cooling CoolerMaster Hyper 212 Turbo
Memory 32GB Patriot Viper Steel DDR4 3600MHz
Video Card(s) Sapphire 6800XT Nitro+
Storage Patriot Viper VPN110 500GB + XPG Gammix S11 Pro 1TB
Display(s) LG 27MP48HQ-P
Case Lancool II Mesh Performance
Audio Device(s) ASUS Xonar DX 7.1 + Edifier R1600T
Power Supply Corsair RM750x
Mouse Logitech G403 Hero Gaming
Keyboard Patriot Viper V765
You want RT performance: nVidia
You want DLSS2/3: nVidia
You want nvenc: nVidia
You want CUDA for ANYTHING OUT THERE apart from gaming: nVidia
etc. etc.

And you expect AMD to price their GPUs competitively with nVidia? On what grounds?
Anyway, the 6000-series cards do not exist for me. The whole package is problematic.
The 7000 series is good but very badly priced. I would purchase one if they were cheaper.

If I could afford a 16 GB nVidia GPU, I would have gotten nVidia, but my options were the 3080 10GB or the 6800 XT, and it was a no-brainer for me. No way in hell am I getting a 10 GB GPU in 2022/23 after having 8 GB for 6 years.
 
Joined
Oct 31, 2022
Messages
170 (0.27/day)
If I could afford a 16 GB nVidia GPU, I would have gotten nVidia, but my options were the 3080 10GB or the 6800 XT, and it was a no-brainer for me. No way in hell am I getting a 10 GB GPU in 2022/23 after having 8 GB for 6 years.
Would do the same.
Right now the best deal in Germany might be the 6800 XT at 569€ or the 6950 XT at 649€ (same model). The 3080 10GB starts at 740€... the 3080 12GB at 867€...
The next tier, with the 7900 XT (829€) or 4070 Ti (889€), is too high. I'd say 1% higher price per 1% more performance is alright. Coming from the 6800 XT, that works out to ~700€ for the 7900 XT...
The 4070 should also be 500-600€, but it will start at 659€ with tax.
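To put a number on that rule of thumb, here's a quick sketch in Python; the 23% performance gap is an illustrative assumption, not a measured result:

# "1% higher price per 1% more performance" rule of thumb.
# relative_performance is an illustrative placeholder, not a benchmark result.
base_price_eur = 569          # 6800 XT street price mentioned above
relative_performance = 1.23   # assume the 7900 XT is ~23% faster

justified_price = base_price_eur * relative_performance
print(f"Price justified by performance: ~{justified_price:.0f} EUR (asking price: 829 EUR)")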
 
Joined
Nov 18, 2010
Messages
7,291 (1.46/day)
Location
Rīga, Latvia
System Name HELLSTAR
Processor AMD RYZEN 9 5950X
Motherboard ASUS Strix X570-E
Cooling 2x 360 + 280 rads. 3x Gentle Typhoons, 3x Phanteks T30, 2x TT T140 . EK-Quantum Momentum Monoblock.
Memory 4x8GB G.SKILL Trident Z RGB F4-4133C19D-16GTZR 14-16-12-30-44
Video Card(s) Sapphire Pulse RX 7900XTX. Water block. Crossflashed.
Storage Optane 900P[Fedora] + WD BLACK SN850X 4TB + 750 EVO 500GB + 1TB 980PRO+SN560 1TB(W11)
Display(s) Philips PHL BDM3270 + Acer XV242Y
Case Lian Li O11 Dynamic EVO
Audio Device(s) SMSL RAW-MDA1 DAC
Power Supply Fractal Design Newton R3 1000W
Mouse Razer Basilisk
Keyboard Razer BlackWidow V3 - Yellow Switch
Software FEDORA 40
I am just not comfortable with the statement that the RTX 4070 Ti maxes out all the available hardware. It obviously doesn't, as both boards have two vacant pads for memory ICs, which would allow a wider memory bus and therefore more capacity.
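A rough sketch of why those vacant pads matter, assuming 32-bit, 2 GB GDDR6X packages like the ones already on the board:

# Each GDDR6X package contributes a 32-bit channel and (here) 2 GB of capacity,
# so populating the two vacant pads widens the bus and raises capacity together.
BITS_PER_IC = 32
GB_PER_IC = 2

for ics in (6, 8):  # 6 packages as shipped vs. 8 with the vacant pads populated
    print(f"{ics} ICs -> {ics * BITS_PER_IC}-bit bus, {ics * GB_PER_IC} GB")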
 
Joined
Oct 28, 2012
Messages
1,159 (0.27/day)
"classic raster"? Who are you trying to fool, 99.9% of the market is still driven by raster.

The 6800 XT is AMD's last gen card, I'd expect Nvidia's newer cards to take the lead. That it's even debatable just goes to show you how disappointing it is. Fake frames? Is that something people should care about? According to most reviews, no, no you should not.
Bleeding edge tech and AI seems to be Nvidia core business strategy, since Turing, and even with all the complains about it...Nvidia is still winning business wise, and the whole competition is following their leads with AMD own fake frame coming soon. It's not like Phys x where Nvidia was alone doing it, and very few games made use of it... everybody is on board this time...so we might just be looking at the future of PC gaming with what ADA is doing...we'll have to wait for the sales figure of the 4070 to see where the people's wallet really is
 
Joined
Feb 20, 2019
Messages
7,697 (3.88/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Everyone complaining about the price is still stuck in 2011, when we were in the throes of the recession and top-tier silicon was $500. Forgetting, of course, that in 2007 the top dog was an $830 card, and that was a SHATLOAD back in 2007.

The fact that we have seen rampant inflation in the last 3 years just doesn't seem to connect mentally. You point out to people that Nvidia's margins have gone from 54% to 59% since 2011 and they get real quiet, because the harsh reality is that it isn't just Nvidia raising prices. It's everything behind them. Is Nvidia scalping? A bit, yeah, especially on the xx8x series. Is it realistic for them to offer GPUs like this for 1xxx-series pricing? Oh hell no.

Yes, it is an alright price.

You're not in 2018 anymore, let alone 2010. Costs have gone up. $600 is the new $300.
Yes, you're right, and no, you're wrong.
Let's get the wrong out of the way first: it's not inflation. A $299 GPU from 2010 would only cost $414 today if inflation were the reason. Inflation doesn't even begin to cover the price doubling - it's a minor factor at best.
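As a rough cross-check, assuming ~2.5% average annual US CPI inflation over 2010-2023 (exact CPI data will shift the result slightly):

# Inflation-adjusting a $299 GPU from 2010, assuming ~2.5% average annual US CPI
# inflation over 2010-2023 (an approximation; real CPI data differs slightly).
price_2010 = 299.0
years = 2023 - 2010
avg_inflation = 0.025

adjusted = price_2010 * (1 + avg_inflation) ** years
needed = (600 / price_2010) ** (1 / years) - 1
print(f"${price_2010:.0f} in 2010 is roughly ${adjusted:.0f} today")   # ~$412
print(f"Doubling to $600 would need ~{needed * 100:.1f}% per year")    # ~5.5%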

What you're right about is that people do need to accept GPUs are more expensive now; there's more to the cost of a card than die size. The GTX 660 Ti from over 10 years ago cost $300, and I picked it because it has the same ~300 mm² die size as this RTX 4070. The other differences, which are what drive the price up, are as follows:
  • More VRM phases using more capable (and more expensive) MOSFETs that require more cooling
  • Premium GDDR6/GDDR6X instead of regular GDDR5
  • VRAM cooling, if you even got any in 2012, was usually just a stamped plate. GDDR6 and GDDR6X need proper cooling.
  • A multi-heatpipe, CNC-machined, all-metal, multi-layer heatsink with baseplate, backplate, and 2+ fans, rather than 2012's plastic shroud over a simple skived copper block, or, on higher-end cards, a heatsink that is small by today's standards with around one-quarter the surface area.
  • PCIe 4.0 in the same physical slot as PCIe 2.0 requires higher-quality PCBs: more layers, more copper, and more surface-mount components. We saw that issue even in something as simple as a PCIe 4.0 ribbon cable, which was quadruple the cost, and half of them were still borderline unstable at PCIe 4.0!
  • Physical rigidity and size have to increase with the weight and size of today's cooling requirements. Backplates, support arms, and additional bolt-throughs for baseplates all add to the cost of manufacture, raw materials, and the shipping/handling costs for every single part of the chain from Taiwan to your door. A tier-equivalent GPU box now weighs a good 50% more than it did a decade ago, and you can probably only fit half as many of them in a shipping container, so shipping costs have doubled right there.
If we didn't want to pay any more than inflation-adjusted costs, we'd be getting the power delivery, cooling, PCB quality, and build quality of cards from the bad old days - and it's likely that a modern GPU would simply refuse to work, and be impossible to build, with the limitations of yesteryear's cheaper, simpler components and cooling.
 
Joined
Aug 25, 2015
Messages
188 (0.06/day)
Location
Denmark
System Name Red Bandit
Processor AMD Ryzen 7 7800X3D
Motherboard ASUS PRIME X670E-PRO WIFI
Cooling Mo-Ra3 420 W/4x Noctua NF-A20S - 2xD5's/1xDDC 4.2
Memory G.SKILL Trident Z5 NEO EXPO 6000CL30/3000/2000
Video Card(s) GIGABYTE RX 6900 XT Ultimate Xtreme WaterForce WB 16GB
Storage Adata SX8200 PRO 2TB x 2
Display(s) Samsung Odyssey G7 32" 240HZ
Case Jonsbo D41 Mesh/Screen
Audio Device(s) Logitech Pro X Wireless
Power Supply Corsair RM1000e v2 ATX 3.0
Mouse Logitech G502 Hero
Keyboard Corsair K70 MX RED Low profile
Software W11 Pro
Send me a 6950 XT ref, I'll add it to the reviews. Just not seeing much of a reason to buy one for the charts


Not sure how 2070S is relevant to this review?


Huh? I've retested every single card in this review in January on the newest drivers
The fact that both the 6000 series and the 7000 series are on different drivers. The 7000 series is still on beta drivers :) You added the RTX 3090 Ti but not the 3080 12GB?
 
Joined
Apr 14, 2022
Messages
688 (0.82/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600Mhz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Asus XG35VQ
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Software Windows 11 64bit
If I could afford a 16 GB nVidia GPU, I would have gotten nVidia, but my options were the 3080 10GB or the 6800 XT, and it was a no-brainer for me. No way in hell am I getting a 10 GB GPU in 2022/23 after having 8 GB for 6 years.

I prefer 10 GB of VRAM and the 3080's level of RT performance rather than 16 GB and barely Turing-level RT.
Apart from some AMD-sponsored console ports, it's highly unlikely you'll find a game that requires more than 10 GB of VRAM.
Even if you find some, like Doom Eternal, the reason for the requirement is a joke (pointlessly large textures).

For me, the lighting, the physics, and the animation make a beautiful game. Not the 8K textures.
The console ports require VRAM that isn't justified by how they look. It's totally unreasonable.
 
Joined
Sep 10, 2018
Messages
6,095 (2.84/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
I prefer 10 GB of VRAM and the 3080's level of RT performance rather than 16 GB and barely Turing-level RT.
Apart from some AMD-sponsored console ports, it's highly unlikely you'll find a game that requires more than 10 GB of VRAM.
Even if you find some, like Doom Eternal, the reason for the requirement is a joke (pointlessly large textures).

For me, the lighting, the physics, and the animation make a beautiful game. Not the 8K textures.
The console ports require VRAM that isn't justified by how they look. It's totally unreasonable.

While I don't disagree, textures are one of the easiest ways to improve a game's visuals without any performance loss, assuming the card has adequate VRAM. I feel like games as a whole would benefit from higher-quality textures; a lot of games still have really terrible low-resolution textures in places, even if the overall quality level is fine, likely because many game engines still have to factor in last-generation consoles and their 6.5-7 GB-ish VRAM pools.
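For a sense of scale, here's a rough estimate of what a single colour texture costs in VRAM, assuming BC7 block compression (1 byte per texel) plus a full mip chain; real engines and formats will vary:

# Rough VRAM footprint of one colour texture, assuming BC7 block compression
# (16 bytes per 4x4 block = 1 byte per texel) plus ~33% overhead for mipmaps.
def texture_vram_mb(width, height, bytes_per_texel=1.0, mip_overhead=1.33):
    return width * height * bytes_per_texel * mip_overhead / (1024 ** 2)

for size in (2048, 4096, 8192):
    print(f"{size} x {size}: ~{texture_vram_mb(size, size):.0f} MB")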
 
Joined
Apr 14, 2018
Messages
551 (0.24/day)
I prefer 10 GB of VRAM and the 3080's level of RT performance rather than 16 GB and barely Turing-level RT.
Apart from some AMD-sponsored console ports, it's highly unlikely you'll find a game that requires more than 10 GB of VRAM.
Even if you find some, like Doom Eternal, the reason for the requirement is a joke (pointlessly large textures).

For me, the lighting, the physics, and the animation make a beautiful game. Not the 8K textures.
The console ports require VRAM that isn't justified by how they look. It's totally unreasonable.

So you want everything to look like path-traced Quake? Yikes.

Your bias aside, and regardless of performance, name games whose RT implementations go beyond just shadows, reflections, and AO, are actually worth a damn, and visually improve the game.

Textures are equally important to a game looking “beautiful”. If I'm not mistaken, the higher the resolution and quality of RT features (especially shadows), the larger the VRAM requirements become. Not only that, but as consoles dictate which features get implemented in games, RT will remain a gimmick, or true path tracing will remain unreachable by existing hardware and by hardware to come for the next few years.

People have some weird hang-up on ray tracing, given it has had almost no impact on popular games or the majority of the market, going on three generations after Nvidia introduced the RTX feature set.
 

W1zzard

Administrator
Staff member
The fact that both the 6000 series and the 7000 series are on different drivers. The 7000 series is still on beta drivers :)
These were the latest drivers for these cards in January. AMD didn't provide RX 6000 driver updates for months, while they kept pushing out betas for the 7900 series. I'll retest everything again soon anyway.
 

izy

Joined
Jun 30, 2022
Messages
950 (1.25/day)
Still $50-100 too expensive; it should have been a $499 card. But prices might come down, especially after AMD shows their cards...
Other than that, it's exactly what I expected: performance close to the 3080 with a smaller price tag (locally, at least) and 40% lower power usage. Although I thought RT might have been a bit stronger.

Many people still complain; they wouldn't if it were $499 OR had 16 GB of VRAM...
I agree with you, but I think the Ti version should have 16 GB of VRAM and cost $100 less, and this one should be ~$499 with 12 GB of VRAM. I was considering getting an RTX 3070 before all of this talk about 8 GB not being enough anymore (I tend to keep cards for a good period of time; I rarely game, but I like to set graphics as high as possible when I do), and I was looking at the 4070. At $500 I would have just grabbed it, but at $600 I'm thinking twice; maybe something nice from AMD will show up. (Edit: the problem is you don't find them at MSRP, mostly $100 more I think.)
 
Joined
Sep 10, 2018
Messages
6,095 (2.84/day)
Location
California
I agree with you, but I think the Ti version should have 16 GB of VRAM and cost $100 less, and this one should be ~$499 with 12 GB of VRAM. I was considering getting an RTX 3070 before all of this talk about 8 GB not being enough anymore (I tend to keep cards for a good period of time; I rarely game, but I like to set graphics as high as possible when I do), and I was looking at the 4070. At $500 I would have just grabbed it, but at $600 I'm thinking twice; maybe something nice from AMD will show up.


Yeah, if the RDNA 3 option costs the same, has 16 GB, and has similar or better raster performance, it's probably the safer option if you keep your GPU longer than two years.

If not, you could definitely do worse; just don't pay a hair over MSRP for this card. Whether it's worth $600 is already a stretch to begin with.
 
Joined
Sep 12, 2022
Messages
51 (0.07/day)
Just did a search and learned about the ZLUDA project as an alternative to CUDA. Apparently nobody has forked the project yet, which is amazing to me as trillions of dollars are being pushed into AI projects. For everyone complaining about the prices, the path to better competition is an alternative to CUDA, yes?
 

W1zzard

Administrator
Staff member
Just did a search and learned about the ZLUDA project as an alternative to CUDA. Apparently nobody has forked the project yet, which is amazing to me as trillions of dollars are being pushed into AI projects. For everyone complaining about the prices, the path to better competition is an alternative to CUDA, yes?
Also HIP by AMD, yet nobody is buying
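Worth noting that at the framework level the lock-in is already thinner than it looks: PyTorch's ROCm builds expose the same torch.cuda device API, so Python code written against CUDA often runs unchanged on AMD cards. A minimal sketch, assuming a CUDA or ROCm build of PyTorch is installed:

import torch

# On NVIDIA this targets CUDA; on a ROCm build of PyTorch the same "cuda" device
# name maps to the AMD GPU, so this snippet runs unmodified on either vendor.
device = "cuda" if torch.cuda.is_available() else "cpu"

a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)
c = a @ b  # matrix multiply on whatever GPU backend is present
print(device, c.shape)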
 
Joined
Sep 15, 2013
Messages
50 (0.01/day)
Processor i5-4670k @ 4.2 GHz
Motherboard ASUS Z87 Pro
Cooling Corsair H105
Memory G.SKILL RipjawsX 16GB @ 2133 MHz
Video Card(s) Gigabyte GTX 780 GHz Edition
Storage Samsung 840 Evo 500GB
Case Thermaltake MK-1
Power Supply Seasonic X 750w
Mouse Razer DeathAdder
Go buy a 6800 XT then, and enjoy AMD driver heaven :rolleyes:

Really, I can't see why people here keep talking about the 6800 XT, a card that in the latest Steam survey sat in THE VERY LAST position, just above the “others” category…
 
Joined
Apr 14, 2018
Messages
551 (0.24/day)
Just did a search and learned about the ZLUDA project as an alternative to CUDA. Apparently nobody has forked the project yet, which is amazing to me as trillions of dollars are being pushed into AI projects. For everyone complaining about the prices, the path to better competition is an alternative to CUDA, yes?

Everyone will generally take the path of least resistance. Kudos to Nvidia for introducing hardware and software advances over the years, but proprietary solutions will always create stagnation in multiple forms. Thankfully, the majority of consumer-facing instances of this (physics, upscalers, VRR) have since gone the way of the dodo, and prices based on these features now have parity.

To some degree it's unfortunate AMD acquired ATI when they did; splitting their focus didn't help either department. Nvidia is deeply entrenched from a commercial standpoint, and aside from the strong-arming they more than likely do, commercial deployments of hardware rarely get changed unless there's a massive shift in performance.
 
Joined
Feb 8, 2022
Messages
268 (0.30/day)
Location
Georgia, United States
System Name LMDESKTOPv2
Processor Intel i9 10850K
Motherboard ASRock Z590 PG Velocita
Cooling Arctic Liquid Freezer II 240 w/ Maintenance Kit
Memory Corsair Vengeance DDR4 3600 CL18 2x16
Video Card(s) RTX 3080 Ti FE
Storage Intel Optane 900p 280GB, 1TB WD Blue SSD, 2TB Team Vulkan SSD, 2TB Seagate HDD, 4TB Team MP34 SSD
Display(s) HP Omen 27q, HP 25er
Case Fractal Design Meshify C Steel Panel
Audio Device(s) Sennheiser GSX 1000, Schiit Magni Heresy, Sennheiser HD560S
Power Supply Corsair HX850 V2
Mouse Logitech MX518 Legendary Edition
Keyboard Logitech G413 Carbon
VR HMD Oculus Quest 2 (w/ BOBO VR battery strap)
Software Win 10 Professional
Congratulations nVidia, you have made an efficient RTX 3080 for $100 less. Rejoice! This is pathetic. Remember the gains the 3070 had over the 2080? Pepperidge Farm remembers. And they remember them being at least noticeable, unlike this one.

Also, @W1zzard, this is a 1440p card, and in your own graph the 3080 shows as IDENTICAL. Any small differences at other resolutions don't matter, because an extra 2% at 1080p (a resolution I seriously doubt people would be buying this card for) is impossible to notice without a framerate counter on.
 
Joined
Dec 28, 2012
Messages
3,630 (0.86/day)
System Name Skunkworks
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software openSUSE tumbleweed/Mint 21.2
Agreed with your 1st statement.

But with regard to the NVIDIA 9800/ATI HD 3000 series, those DX10 cards had really poor longevity. DX10 (launched with Vista in 2006) was a transitional phase that most game devs ignored; they just kept making DX9 games, with some providing DX10 executables, and no one made DX10-exclusive titles that didn't also have a DX9 fallback renderer. The industry then moved rapidly to DX11 only three years later, in 2009, when Windows 7 released, and once that happened, those DX10-only cards became obsolete quickly.

DX11 would become the longest-running API and is STILL used in games to this day, over 14 years after its introduction, so those first-gen Radeon HD 5800/NVIDIA GTX 400 series cards arguably had the longest lifespan of them all in terms of feature-set compatibility.
That's pretty revisionist. DX11 didn't get major mainstream traction until 2010/2011, at which point said cards were 4-5 years old, and those games STILL had DX9 fallbacks.

The 9800 GTX was useful well into 2012, at which point the PS4 was coming out. The GTX 200 series were Goldilocks cards.
"You're being ripped off, you should be happy about it!"
"I know the cost of everything went up by 50% and our wages went up since 2018, but everything should cost the same as it did 10 years ago!"
 