
Official NVIDIA RTX 4070 Performance Claims Leak Online

Joined
Apr 14, 2018
Messages
701 (0.29/day)
I might get downvoted for this, but they should probably include some lube in the boxes, for sure...

The problem with the 7900 XT/7900 XTX is that AMD is not at a point where they are very appealing to most gamers, even though both are slightly better overall than the 4070 Ti given the pricing. I've personally never had any major issues with AMD drivers, but I know many who have, to the point that AMD GPUs don't even exist to them anymore.

I own plenty of AMD and Nvidia products, and I will still bash the crap out of the stuff they release that isn't very good due to pricing. At the same time, I won't do it blindly; even when a card isn't a good one, you can still make a case for all the current-gen GPUs if you try hard enough...

Let's be real, though: all these products are pretty good, from the 4070 Ti to the 4090 and both AMD offerings; the issue is their price. This isn't an RX 6500 scenario, where the product is just bad regardless of price. But even then, people who own them defend and enjoy them, so my opinion doesn't matter all that much to begin with, at least to the people actually buying these products rather than blindly drinking the Kool-Aid from either company.

Not directed at you, but that line is a blanket excuse for being an ill-informed buyer. On hand I have a 2070 Super, a 3080 12GB, and a 7900 XTX between the three rigs at home, and the 3080 has had more driver-related crashes and issues in the span of a week than my 7900 XTX has had since release.

While for some the initial change between Nvidia and AMD drivers may be jarring, modern UI aside, AMD currently offers more and better control through its driver than Nvidia does from a feature standpoint.

I can't say for sure, but I like to blame tech tubers for a lot of this. They burn through so much free hardware pumping out reviews and clickbait videos that, more often than not, the discrepancies across reviews that other reviewers point out are likely software/setup configuration issues, sometimes not even related to the hardware they're testing. With so little time before moving on to the next review, it gets passed off as "yeah, it's definitely their (AMD's/Nvidia's/whoever's) problem" and not something they did. So everyone draws conclusions from half-baked data and clickbait snide bullet points, then chooses to argue with whatever "influencer's" data suits their argument.

Write me off as a fanboy, but both Nvidia and AMD have had major issues in the past, which is entirely different from now. With the exception of some buggy game releases, both my Nvidia and AMD rigs have been exceptionally stable. I laugh every time someone uses the driver excuse, though. It may be my opinion, but AMD drivers are objectively better.

TLDR

Good hardware, bad BAD prices, especially Nvidia.
 
Joined
May 10, 2020
Messages
738 (0.44/day)
Processor Intel i7 13900K
Motherboard Asus ROG Strix Z690-E Gaming
Cooling Arctic Freezer II 360
Memory 32 Gb Kingston Fury Renegade 6400 C32
Video Card(s) PNY RTX 4080 XLR8 OC
Storage 1 TB Samsung 970 EVO + 1 TB Samsung 970 EVO Plus + 2 TB Samsung 870
Display(s) Asus TUF Gaming VG27AQL1A + Samsung C24RG50
Case Corsair 5000D Airflow
Power Supply EVGA G6 850W
Mouse Razer Basilisk
Keyboard Razer Huntsman Elite
Benchmark Scores 3dMark TimeSpy - 26698 Cinebench R23 2258/40751
Marketing BS including DLSS3 makes little sense, but considering the specs, the 4070 will be very close to the 3080: maybe faster at 1080p, and slightly slower at 4K.

Not directed at you, but that line is a blanket excuse for being an ill-informed buyer. On hand I have a 2070 Super, a 3080 12GB, and a 7900 XTX between the three rigs at home, and the 3080 has had more driver-related crashes and issues in the span of a week than my 7900 XTX has had since release.

While for some the initial change between Nvidia and AMD drivers may be jarring, modern UI aside, AMD currently offers more and better control through its driver than Nvidia does from a feature standpoint.

I can't say for sure, but I like to blame tech tubers for a lot of this. They burn through so much free hardware pumping out reviews and clickbait videos that, more often than not, the discrepancies across reviews that other reviewers point out are likely software/setup configuration issues, sometimes not even related to the hardware they're testing. With so little time before moving on to the next review, it gets passed off as "yeah, it's definitely their (AMD's/Nvidia's/whoever's) problem" and not something they did. So everyone draws conclusions from half-baked data and clickbait snide bullet points, then chooses to argue with whatever "influencer's" data suits their argument.

Write me off as a fanboy, but both Nvidia and AMD have had major issues in the past, which is entirely different from now. With the exception of some buggy game releases, both my Nvidia and AMD rigs have been exceptionally stable. I laugh every time someone uses the driver excuse, though. It may be my opinion, but AMD drivers are objectively better.

TLDR

Good hardware, bad BAD prices, especially Nvidia.
AMD drivers better … Is this some kind of a joke?
 
Joined
Sep 10, 2018
Messages
6,971 (3.03/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
Marketing BS including DLSS3 makes little sense, but considering the specs, the 4070 will be very close to the 3080: maybe faster at 1080p, and slightly slower at 4K.


AMD drivers better … Is this some kind of a joke?
Yeah, DLSS3 comparisons are only semi-useful when comparing 40-series vs 40-series cards, and even then it's not a feature everyone will use; it's even less of a selling point than DLSS.

Idk, the last time I had issues with AMD drivers was with the 7970, and only in Crossfire. I even purchased a 5700 XT to try and replicate the issues people were having and couldn't; but that was a real issue, and something AMD should have been faster to fix. Them ignoring the 6000 series to try to fix the 7000 series (and, going by Hardware Unboxed, not really improving anything overall) was stupid, though, and just gave Nvidia fanboys more ammunition against them.
 
Sure. And the next one in 2027 will be even better…

You see this every generation: Pascal sucks, I'm going to buy the next-generation cards... Turing sucks, I'm going to buy the next-generation cards... Ampere sucks, but I can't buy it anyway, so I'm going to buy the next-generation cards... Ada sucks, prices are too high... and it never ends.
 
Joined
Jan 29, 2021
Messages
1,878 (1.32/day)
Location
Alaska USA
You're conveniently ignoring that most 3000-series cards have been offered at massively discounted prices while available alongside the 4080/4070 Ti for months; at this point stock looks to have dried up. People had plenty of time to purchase and make a value comparison at those prices.

Almost every AIB model except a handful is also around the $900 mark for the 4070 Ti, and it will be the same for the 4070. So you get 3080 and 3080 Ti performance at almost exactly the same price and/or performance per dollar.


I also specifically said price to performance and said nothing about MSRP, as that is pretty irrelevant in most cases. Nvidia did nothing to move that needle; they shifted cards down yet another tier. Inflation, cost of materials, yada yada: a midrange card (60/70 series) could be had anywhere from $250-400 just a few years ago. Now we get a 4060/4070 at 1.5-2x the cost with no value increase whatsoever, and people come here and defend Nvidia on the matter?
RTX 4070 Ti 12GB: $799.99, $814.99, $829.99, $839.99
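The price-to-performance argument above can be sketched with a little arithmetic. This is only an illustration: the performance indices and the discounted 3080 price below are placeholder assumptions, not measured data; only the $799.99 MSRP comes from the listing above.

```python
# Hedged value comparison. The performance indices are assumptions
# (discounted RTX 3080 = 100), not benchmark data; only the $799.99
# 4070 Ti MSRP comes from the listing above.
def perf_per_dollar(perf_index: float, price: float) -> float:
    """Performance points delivered per dollar spent."""
    return perf_index / price

candidates = {
    "RTX 3080 (assumed discounted to $700)": (100, 700.00),
    "RTX 4070 Ti (MSRP $799.99)":            (102, 799.99),
    "RTX 4070 Ti (AIB, assumed ~$900)":      (102, 900.00),
}

for name, (perf, price) in candidates.items():
    print(f"{name}: {perf_per_dollar(perf, price):.3f} perf/$")
```

On these assumed numbers, a $900 AIB card erodes whatever value edge the MSRP model had over a discounted last-gen card, which is the point being argued.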
 
You see this every generation: Pascal sucks, I'm going to buy the next-generation cards... Turing sucks, I'm going to buy the next-generation cards... Ampere sucks, but I can't buy it anyway, so I'm going to buy the next-generation cards... Ada sucks, prices are too high... and it never ends.
The only constant is: no one wants an AMD card. It doesn't matter how badly priced and low-specced Nvidia cards are; Radeon is still irrelevant.
 
Joined
Jan 5, 2006
Messages
18,584 (2.68/day)
System Name AlderLake
Processor Intel i7 12700K P-Cores @ 5Ghz
Motherboard Gigabyte Z690 Aorus Master
Cooling Noctua NH-U12A 2 fans + Thermal Grizzly Kryonaut Extreme + 5 case fans
Memory 32GB DDR5 Corsair Dominator Platinum RGB 6000MT/s CL36
Video Card(s) MSI RTX 2070 Super Gaming X Trio
Storage Samsung 980 Pro 1TB + 970 Evo 500GB + 850 Pro 512GB + 860 Evo 1TB x2
Display(s) 23.8" Dell S2417DG 165Hz G-Sync 1440p
Case Be quiet! Silent Base 600 - Window
Audio Device(s) Panasonic SA-PMX94 / Realtek onboard + B&O speaker system / Harman Kardon Go + Play / Logitech G533
Power Supply Seasonic Focus Plus Gold 750W
Mouse Logitech MX Anywhere 2 Laser wireless
Keyboard RAPOO E9270P Black 5GHz wireless
Software Windows 11
Benchmark Scores Cinebench R23 (Single Core) 1936 @ stock Cinebench R23 (Multi Core) 23006 @ stock
The only constant is: no one wants an AMD card. It doesn't matter how badly priced and low-specced Nvidia cards are; Radeon is still irrelevant.
I used to buy ATi only in the past but times have changed.
Also I have a G-Sync only monitor.

BTW, there are many people here who own an AMD GPU/CPU...
 
The only constant is: no one wants an AMD card. It doesn't matter how badly priced and low-specced Nvidia cards are; Radeon is still irrelevant.

Eh, a decent amount of people on this forum own them, and if I only had 900 bucks in my pocket at Microcenter and needed a new GPU, it would not be a 4070 Ti.
 
Yes, that is exactly what I said; not sure what you're trying to prove. There are a "handful" around MSRP; the majority are not. Thanks for proving my point.
There's plenty at and around MSRP. Two cards by two different manufacturers are at MSRP, and three more are slightly above it. If someone is ignorant enough to blow $900 on a 4070 Ti, then that's on them.
 
There's plenty at and around MSRP. Two cards by two different manufacturers are at MSRP, and three more are slightly above it. If someone is ignorant enough to blow $900 on a 4070 Ti, then that's on them.

Unless you have some burning desire to play with ray tracing enabled (butcher fps, degrade visual quality with DLSS and/or FSR, and get a good experience in maybe 10 games where it's worthwhile), I'd say the ignorant person is the one buying a 4070 Ti at all, so keep defending it all you want. The 4070 Ti and 7900 XT are terrible choices for value and performance.
 
Unless you have some burning desire to play with ray tracing enabled (butcher fps, degrade visual quality with DLSS and/or FSR, and get a good experience in maybe 10 games where it's worthwhile), I'd say the ignorant person is the one buying a 4070 Ti at all, so keep defending it all you want. The 4070 Ti and 7900 XT are terrible choices for value and performance.
Did W1zzard use ray tracing in this benchmark?

 
Joined
Sep 26, 2022
Messages
236 (0.29/day)
Location
Portugal
System Name Main
Processor 5700X
Motherboard MSI B450M Mortar
Cooling Corsair H80i v2
Memory G.SKILL Ripjaws V 32GB (2x16GB) DDR4-3600MHz CL16
Video Card(s) MSI RTX 3060 Ti VENTUS 2X OC 8GB GDDR6X
Display(s) LG 32GK850G
Case NOX HUMMER ZN
Power Supply Seasonic GX-750
So let me get this straight; WITHOUT DLSS3 @1440p:

3070 > 3070 Ti > 3080/4070 > 3080 Ti > 3090 > 3090 Ti/4070 Ti > 4080

That is a massive gap... For NVIDIA to admit that the 4070 = 3080, it must be a best-case/cherry-picked scenario.

if I only had 900 bucks in my pocket at Microcenter and needed a new GPU, it would not be a 4070 Ti
This one made me think back... I believed the 4070 Ti was a great deal when it came out: 3090/Ti performance for 900€. Now, after seeing what happened with the 8GB cards in the RE4 Remake and Hogwarts Legacy, my mind has changed...
 
Joined
May 11, 2018
Messages
1,292 (0.53/day)
Official Benchmarks For NVIDIA RTX 4070 Leak Online – Matches RTX 3080 Performance Without Frame Generation!

"Considering these are first-party benchmarks, a grain of salt never hurt anyone, but they are incredibly exciting, as NVIDIA is stating that the upcoming RTX 4070 will be able to match the NVIDIA RTX 3080 GPU in DLSS performance without Frame Generation. This is a huge deal because Frame Generation is quite the controversial technology, and so-called 'fake frames' have divided gamers. With the RTX 4070, however, even without frame generation, you are looking at RTX 3080 performance levels with standard DLSS."

" RTX 4070 appears to have a far greater value proposition once you tie in DLSS 3.0 and/or Frame Generation. Without Frame Generation and just good old DLSS3, it performs more or less identical to an RTX 3080. If you include Frame Generation however, it suddenly performs up to 40% faster than an RTX 3080 or 80% faster than an RTX 3070 - which is what generational upgrades should always be like. "

But then:

"If you are someone that believes in raster performance only (although in today's era, I would add that it is a highly obsolete metric), then the RTX 4070 performs 15% faster than an RTX 3070."

:p

So, how much does Nvidia pay reviewers and news sites to write such exciting articles about sub-par performance?
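Taking the quoted percentages at face value (a simplification; marketing figures are rarely exact), the gap between a 3080 and a 3070 that they imply can be back-calculated:

```python
# The article claims the 4070 with Frame Generation is up to 40% faster
# than a 3080 and up to 80% faster than a 3070. If both held exactly,
# the implied 3080-over-3070 ratio would be:
ratio = 1.80 / 1.40
print(f"implied 3080 vs 3070: +{(ratio - 1) * 100:.0f}%")  # prints "+29%"
```

A ~29% lead is roughly plausible for a 3080 over a 3070, yet the same article credits the 4070 with only +15% over the 3070 in pure raster; the arithmetic shows how much of the headline number is Frame Generation rather than raw performance.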
 
I used to buy ATi only in the past but times have changed.
Also I have a G-Sync only monitor.

BTW, there are many people here who own an AMD GPU/CPU...
"Many people here" doesn't mean much.
AMD's graphics card market share dropped below 6% worldwide.
Having a very vocal fan base doesn't change things.

Eh, a decent amount of people on this forum own them, and if I only had 900 bucks in my pocket at Microcenter and needed a new GPU, it would not be a 4070 Ti.
Again, as I said above, having a very vocal fan base doesn’t change facts about market share.
Radeon cards are cheaper for a single reason: no one wants one.
Lisa Su has already proved to the market that the moment they gain market share, they immediately drop the "nice price" policy.

BTW, I don't care which card you (or I) prefer. I don't have any brand loyalty.
I've installed dozens of Radeon cards in the last four years, and on a good chunk of them (I would say 40%) I had to fix some issue sooner or later. Nvidia cards/drivers weren't perfect, but the percentage with issues drops to around 10%.
This just means one thing for me: angry customers and more workload for me.

Yes, I know: when you enter a forum like this, AMD supporters are very vocal in saying "never had a problem with my Radeon". Happy for them. My experience is vastly different.
 
Joined
Aug 25, 2021
Messages
1,183 (0.97/day)
The 4070 and 4070 Ti have VRAM crippled to only 12GB instead of 16GB, so this will become the same problem in two years as it is now with the 3070 and 3070 Ti and their 8GB. Even ray tracing performance will be crippled in an increasing number of games.

In their most recent video, Hardware Unboxed's Steve clearly shows how Nvidia's planned-obsolescence VRAM strategy works with the 3070 and 3070 Ti. More and more games are a stuttering mess with 8GB of VRAM in both 1080p and 1440p, and RT performance is dismal, to the point that RT on a 6800 is better because it has 16GB of VRAM. The same will happen with the 4070 and 4070 Ti in two years. Nvidia is literally forcing people to upgrade gen-to-gen through its small VRAM offerings. Well done to them for convincing people that those cards are fine today.
 
The 4070 and 4070 Ti have VRAM crippled to only 12GB instead of 16GB, so this will become the same problem in two years as it is now with the 3070 and 3070 Ti and their 8GB. Even ray tracing performance will be crippled in an increasing number of games.

In their most recent video, Hardware Unboxed's Steve clearly shows how Nvidia's planned-obsolescence VRAM strategy works with the 3070 and 3070 Ti. More and more games are a stuttering mess with 8GB of VRAM in both 1080p and 1440p, and RT performance is dismal, to the point that RT on a 6800 is better because it has 16GB of VRAM. The same will happen with the 4070 and 4070 Ti in two years. Nvidia is literally forcing people to upgrade gen-to-gen through its small VRAM offerings. Well done to them for convincing people that those cards are fine today.
That video is flawed in many ways… By the way, from 8GB to 12GB there is a huge difference. 8GB was a poor choice from the beginning, but 12GB for cards intended for 1440p is OK in my opinion. The problem would be an 8GB 4060 Ti, if Nvidia dares to…
 
Joined
Mar 10, 2010
Messages
11,878 (2.20/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
£599 notes, my arse; 1440p, my arse.

I can see the e-waste potential from here.

16GB should be the minimum on this class of card, and for 1440p.

HUB showed what you can expect: two years of viable use, then fit only for scrap. And the video wasn't flawed; only someone who hasn't watched it, and so can't back that claim up with an actual reason, would say so. Or a fanboi, as demonstrated: no watching, just opinions.

And £599, my arse: a fake MSRP that will be valid for days only, then AIBs will have to raise their prices to stop losing money.

And after this, I will be surprised if Nvidia doesn't lose more AIB partners; Nvidia basically shat on them again.
 
Joined
Sep 6, 2013
Messages
3,405 (0.82/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 7600 / Ryzen 5 4600G / Ryzen 5 5500
Motherboard X670E Gaming Plus WiFi / MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2)
Cooling Aigo ICE 400SE / Segotep T4 / Νoctua U12S
Memory Kingston FURY Beast 32GB DDR5 6000 / 16GB JUHOR / 32GB G.Skill RIPJAWS 3600 + Aegis 3200
Video Card(s) ASRock RX 6600 + GT 710 (PhysX) / Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes / NVMes, SATA Storage / NVMe, SATA, external storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) / 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
Steve clearly shows how Nvidia's planned-obsolescence VRAM strategy works with the 3070 and 3070 Ti
I think what we can see in that video, and what should really worry us, is that we won't be able to trust benchmarks in the future. I mean, if the driver or the game can keep the framerate smooth by butchering the visual quality, by a little or by a lot, can we really trust reviews and framerates? When it's by a lot, we can see the results in Hogwarts easily. But what happens if we set, for example, ultra settings and the game silently changes them to high without informing us? What if the driver can do that as an "optimization"? Visually it would be difficult to spot, and benchmarks would be painting a false picture.
 
The 4070 and 4070 Ti have VRAM crippled to only 12GB instead of 16GB, so this will become the same problem in two years as it is now with the 3070 and 3070 Ti and their 8GB. Even ray tracing performance will be crippled in an increasing number of games.

In their most recent video, Hardware Unboxed's Steve clearly shows how Nvidia's planned-obsolescence VRAM strategy works with the 3070 and 3070 Ti. More and more games are a stuttering mess with 8GB of VRAM in both 1080p and 1440p, and RT performance is dismal, to the point that RT on a 6800 is better because it has 16GB of VRAM. The same will happen with the 4070 and 4070 Ti in two years. Nvidia is literally forcing people to upgrade gen-to-gen through its small VRAM offerings. Well done to them for convincing people that those cards are fine today.

Well, serves them right; the real PC Master Race card is of course at least $1200!
 
I think what we can see in that video, and what should really worry us, is that we won't be able to trust benchmarks in the future. I mean, if the driver or the game can keep the framerate smooth by butchering the visual quality, by a little or by a lot, can we really trust reviews and framerates? When it's by a lot, we can see the results in Hogwarts easily. But what happens if we set, for example, ultra settings and the game silently changes them to high without informing us? What if the driver can do that as an "optimization"? Visually it would be difficult to spot, and benchmarks would be painting a false picture.
This is exactly what Nvidia's driver "optimisation" has been for years.

It isn't by accident that Nvidia has things like an LOD bias setting in its driver where Intel and AMD don't.

It's also not for no reason that Nvidia leads the fake-frames tech drive.

They have always cheated in benchmarks and always will; they just have the market on their side now.
 
That video is flawed in many ways… By the way, from 8GB to 12GB there is a huge difference. 8GB was a poor choice from the beginning, but 12GB for cards intended for 1440p is OK in my opinion. The problem would be an 8GB 4060 Ti, if Nvidia dares to…
It all depends on which games people play. My 7900 XTX can use up to 21GB of VRAM in some dense urban 3D renderings of buildings in Flight Simulator. If someone plays VRAM-intense games (the ones Steve measured, plus MFS), 12GB on the 4070 and 4070 Ti will quickly become troublesome in terms of stuttering, lower RT performance, and unloaded textures, just like the 3070 and 3070 Ti. 12GB vs 8GB is NOT a "huge difference", I'm afraid. Just try the games tested and you will find that you might play well now, on the edge of 10-11GB, but next year and later there will be mounting problems. The history of the 3070 and 3070 Ti will repeat itself. This is also what Unreal Engine 5 game developers say in interviews; have a watch online.

Well, serves them right; the real PC Master Race card is of course at least $1200!
The 7900 XTX is already available for $960 in some markets, from ASRock. It's faster than the 4080 and has 24GB of VRAM; not to be underestimated at all.

I think what we can see in that video, and what should really worry us, is that we won't be able to trust benchmarks in the future. I mean, if the driver or the game can keep the framerate smooth by butchering the visual quality, by a little or by a lot, can we really trust reviews and framerates? When it's by a lot, we can see the results in Hogwarts easily. But what happens if we set, for example, ultra settings and the game silently changes them to high without informing us? What if the driver can do that as an "optimization"? Visually it would be difficult to spot, and benchmarks would be painting a false picture.
We can trust them, as several reviewers test image quality, both in pure raster and with upscalers. I wouldn't go into a conspiracy theory of games silently changing texture settings. Even if they did, it could still be checked and uncovered, and that would be embarrassing for GPU vendors to explain to the public.
 
Joined
Dec 6, 2016
Messages
155 (0.05/day)
System Name The cube
Processor AMD Ryzen 5700g
Motherboard Gigabyte B550M Aorus Elite
Cooling Thermalright ARO-M14
Memory 16GB Corsair Vengeance LPX 3800mhz
Video Card(s) Powercolor Radeon RX 6900XT Red Devil
Storage Kingston 1TB NV2| 2x 1TB 2.5" Hitachi 7200rpm | 2TB 2.5" Toshiba USB 3.0
Display(s) Samsung Odyssey G5 32" + LG 24MP59G 24"
Case Chieftec CI-02B-OP
Audio Device(s) Creative X-Fi Extreme Audio PCI-E (SB1040)
Power Supply Corsair HX1200
Mouse Razer Basilisk X Hyperspeed
Keyboard Razer Ornata Chroma
Software Win10 x64 PRO
Benchmark Scores Mobile: Asus Strix Advantage G713QY | Ryzen 7 5900HX | 16GB Micron 3200MHz CL21 | RX 6800M 12GB |
So, Nvidia is marketing Frame Generation as a performance improvement.
I remember when Nvidia's FX series (5xxx) could not compete with ATi's 9000 series and was caught cheating in benchmarks, particularly 3DMark 2001 if memory serves. Everyone was up in arms about it. Now they come up with DLSS and try to sell it as a "performance increase", and fans rejoice... What a clown world we live in.
 
The 7900 XTX is already available for $960 in some markets, from ASRock. It's faster than the 4080 and has 24GB of VRAM; not to be underestimated at all.

But it's made by a company that's losing what small market share it still has to broken Intel cards. :-D

I remember when Nvidia's FX series (5xxx) could not compete with ATi's 9000 series and was caught cheating in benchmarks, particularly 3DMark 2001 if memory serves. Everyone was up in arms about it. Now they come up with DLSS and try to sell it as a "performance increase", and fans rejoice...
They don't even hide it; there's no conspiracy. People just aren't reading the finer details of reviews. I'm quite sure that default benchmarks with no DLSS and no frame generation will sooner or later be relegated to the closing pages of reviews as a footnote, or the reviewers won't be getting their shiny new cards for free, and then they can just close down.

As WCCFTech wrote:

"If you are someone that believes in raster performance only (although in today's era, I would add that it is a highly obsolete metric), then the RTX 4070 performs 15% faster than an RTX 3070."
 