
NVIDIA GeForce RTX 4070 Ti Drops Down to $699, Matches Radeon RX 7900 XT Price

Joined
Jan 27, 2024
Messages
298 (0.90/day)
Processor Ryzen AI
Motherboard MSI
Cooling Cool
Memory Fast
Video Card(s) Matrox Ultra high quality | Radeon
Storage Chinese
Display(s) 4K
Case Transparent left side window
Audio Device(s) Yes
Power Supply Chinese
Mouse Chinese
Keyboard Chinese
VR HMD No
Software Android | Yandex
Benchmark Scores Yes
Still, everyone talks shit about the card and it's actually not too bad.

I haven't seen anyone talking about the RX 7900 XT.
Actually, out of the RX 6000 and RX 7000 lineups, it is the only 7000-series variant worth buying today, along with (maybe) the RX 6650 XT, RX 6700 XT and RX 6800 (XT).

Of course, it could have been better as a monolithic GPU: more performance than the chiplet approach allows, plus the missing ~10% needed to reach the original performance target for Navi 31.
I wonder why AMD doesn't release an improved or fixed card.
 

freeagent

Moderator
Staff member
Joined
Sep 16, 2018
Messages
8,858 (3.87/day)
Location
Winnipeg, Canada
Processor AMD R7 5800X3D
Motherboard Asus Crosshair VIII Dark Hero
Cooling Thermalright Frozen Edge 360, 3x TL-B12 V2, 2x TL-B12 V1
Memory 2x8 G.Skill Trident Z Royal 3200C14, 2x8GB G.Skill Trident Z Black and White 3200 C14
Video Card(s) Zotac 4070 Ti Trinity OC
Storage WD SN850 1TB, SN850X 2TB, SN770 1TB
Display(s) LG 50UP7100
Case Fractal Torrent Compact
Audio Device(s) JBL Bar 700
Power Supply Seasonic Vertex GX-1000, Monster HDP1800
Mouse Logitech G502 Hero
Keyboard Logitech G213
VR HMD Oculus 3
Software Yes
Benchmark Scores Yes
I haven't seen anyone talking about the RX 7900 XT.
It's the 4070 Ti that draws so much hate.

Edit:

That is with vsync enabled... the only game I have to cheat with is CP2077.
 
Last edited:
Joined
Jun 2, 2017
Messages
9,371 (3.39/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
My tv is 60Hz so I aim low :)

Still, everyone talks shit about the card and it's actually not too bad.
At 4K 60 Hz I am sure that is fine. My 6800 XT handled 4K 60 Hz (at the time) fine too.

Still too damned expensive....regardless of mfgr, model, version etc....

What we need ATM are well-rounded, well-spec'd cards that can do 99% of what we need them to, AND are affordable for the average everyday user, including gamers, CAD folk, the Blender crowd etc....
Blame Nvidia. If AMD had kept their pricing, you would not be able to buy any GPUs from them. The narrative is not that powerful.
 
Last edited:
Joined
Apr 6, 2021
Messages
1,131 (0.83/day)
Location
Bavaria ⌬ Germany
System Name ✨ Lenovo M700 [Tiny]
Cooling ⚠️ 78,08% N² ⌬ 20,95% O² ⌬ 0,93% Ar ⌬ 0,04% CO²
Audio Device(s) ◐◑ AKG K702 ⌬ FiiO E10K Olympus 2
Mouse ✌️ Corsair M65 RGB Elite [Black] ⌬ Endgame Gear MPC-890 Cordura
Keyboard ⌨ Turtle Beach Impact 500
Over here only one 4070 Ti dropped down to 759€ (from an eBay dealer). The rest all start at 800€ and up. :laugh: The 7900 XT, on the other hand, starts at 759€ (from credible sellers).

Considering Gamers Nexus includes Blender as one of their tests? A lot more than you believe; again, facts trump your feelings.
It's not like you can't use Blender with an AMD card; it just takes a little bit longer. ;) Which only matters if you have to render regularly; if you render a video from time to time it shouldn't be a big deal. Blender is also a pretty cherry-picked example, since AMD does pretty well in other productivity tasks.

Also you can't blame AMD entirely for the bad Blender performance, as the following article shows:
Quote: "For years, AMD users have eagerly waited for Blender to tap into their hardware’s ray-tracing capabilities fully. Blender officially added AMD HIP support with the 3.0 release in December 2021. However, this did not take advantage of the dedicated ray tracing cores available in the Radeon 6000 and 7000 series GPUs. The wait is finally over, as the latest Blender 3.6 update officially enables support for AMD ray tracing cores. This enhancement promises to significantly accelerate the rendering process, showcasing the potent synergy between AMD hardware and Blender’s advanced rendering algorithms. We’ll delve into the impact of this update and how it promises to improve rendering workflows.

Blender’s decision to enable AMD ray tracing cores marks a pivotal moment in the world of 3D rendering. This follows Maxon’s recent inclusion of HIP in their Redshift renderer. We are increasingly seeing AMD looking to professional workflows with their video cards. They still aren’t entirely competitive with NVIDIA, but this comes as a warning shot. AMD is taking GPU rendering seriously, and if they are able to make the same sort of improvements as they did with CPUs when they introduced the Ryzen line, 3D artists stand to win. We are excited to see what the future holds for GPU rendering."
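For anyone who wants to try this themselves, here is a minimal sketch of pointing Cycles at an AMD card through Blender's Python API (assuming Blender 3.6+ with a HIP-capable Radeon; the exact hardware-RT toggle varies by Blender version):

```python
# Minimal sketch: select HIP as the Cycles compute backend on an AMD GPU.
# Assumes Blender 3.6+ and a supported Radeon; run in Blender's Python console.
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "HIP"   # AMD backend; the RT-core option is exposed separately per version
prefs.get_devices()                 # refresh the detected device list

for dev in prefs.devices:
    dev.use = (dev.type != "CPU")   # enable the GPU(s), leave the CPU unticked

bpy.context.scene.cycles.device = "GPU"
print([(d.name, d.type, d.use) for d in prefs.devices])
```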
 
Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
On low-end parts; you're not gonna see them doubling the VRAM on a 4090 unless they slap a Quadro sticker on it and charge some $2,000 more.
Are you trying your hardest to avoid the point I'm making or are you reading past it?

There is a 4060ti with 16GB.
There are midrange Ampere cards with 16GB.
These cards have no business in gaming whatsoever. There's nil advantage to them over their half VRAM counterparts (perhaps situationally, but that won't last), especially the 4060ti.

That's them catering to that exact demand right there, but on a much more 'democratic' price level. Still overpriced for what it really is. But. An Nvidia card with 16GB on the newest architecture. It can do RT. It has AI. It has creator tools. It has everything your little gamur heart wants. Yadayada. You get the gist?

I'm not telling you this makes sense in any kind of realistic economy or for real pro markets. But it makes sense in the hearts and minds of prospective buyers. Young people with little knowledge of what they might do or can do with that GPU perhaps. Somewhat more knowledgeable people that know how to put VRAM to use. Etc. There's a market here. Niche? I'm not so sure. I think a lot of people are sensitive to this.

I can't even claim I was totally insensitive to this, take for example a feature like Ansel. It's not like I would have picked Pascal over any other GPU at the time for it. But still. It's yet another neat little tool you can use, and I've pulled some pretty nifty screens for my desktop from it. All these little things really do matter. If you buy an Nvidia GPU, you get a package of added value that AMD simply cannot match. I'm now past the point of caring too much about all of that, but it's a package nonetheless.
 
Last edited:
D

Deleted member 237813

Guest
It's pretty simple: Nvidia is ripping you off enormously.

AMD tried to as well, but they got their shit together.

500 for the 7800 XT
700 for the 7900 XT

is almost a normal price. It should have been the release price, but we can't get everything.

Whoever thinks any 70 or 70 Ti card is worth 700-900€ has lost their damn mind, especially with that VRAM greed and the ridiculous 60-class bus width.

Meanwhile a 7900 XT gives you anywhere from 4070 Ti Super to 4080 performance, depending on the game.
 
Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
It's pretty simple: Nvidia is ripping you off enormously.

AMD tried to as well, but they got their shit together.

500 for the 7800 XT
700 for the 7900 XT

is almost a normal price. It should have been the release price, but we can't get everything.

Whoever thinks any 70 or 70 Ti card is worth 700-900€ has lost their damn mind, especially with that VRAM greed and the ridiculous 60-class bus width.

Meanwhile a 7900 XT gives you anywhere from 4070 Ti Super to 4080 performance, depending on the game.
I'll be honest, all of the choices we've really had this gen, still, are less than ideal. Palatable, at this point, is the furthest I would go, and that only counts for a small selection of GPUs on either side of the line. And you're right... almost normal price. Almost. But it's also nearly Q2 2024.
 
D

Deleted member 237813

Guest
True that, but it's getting better by the month. We are still feeling the aftermath of mining and scalping; this will take years. But consumers are also stupid as shit: accepting a 1600€ gaming card and calling it a good deal while it objectively has the worst value is insanity.

The funny thing is the 4090 is so cut down it would barely make an actual 80 Ti card.

But hey, people bought the freaking 3090 for double the price of a 3080 while it was only about 10% faster. Stupidity has no bounds, especially among gamers, as you can see from the gaming industry.

Nvidia lied when they said the 90-class card would replace the Titans. The Titan RTX burns the 3090 in some productivity tasks because it was a real Titan, not a wannabe created so they could double the prices. And people are eating it up.
 

freeagent

Moderator
Staff member
Joined
Sep 16, 2018
Messages
8,858 (3.87/day)
Location
Winnipeg, Canada
Processor AMD R7 5800X3D
Motherboard Asus Crosshair VIII Dark Hero
Cooling Thermalright Frozen Edge 360, 3x TL-B12 V2, 2x TL-B12 V1
Memory 2x8 G.Skill Trident Z Royal 3200C14, 2x8GB G.Skill Trident Z Black and White 3200 C14
Video Card(s) Zotac 4070 Ti Trinity OC
Storage WD SN850 1TB, SN850X 2TB, SN770 1TB
Display(s) LG 50UP7100
Case Fractal Torrent Compact
Audio Device(s) JBL Bar 700
Power Supply Seasonic Vertex GX-1000, Monster HDP1800
Mouse Logitech G502 Hero
Keyboard Logitech G213
VR HMD Oculus 3
Software Yes
Benchmark Scores Yes
I paid 679 USD for my Ti last July, good deal :)
 
Joined
Jun 1, 2010
Messages
392 (0.07/day)
System Name Very old, but all I've got ®
Processor So old, you don't wanna know... Really!
Sweet. It seems AMD is too slow and lags behind in the game Nvidia is running. How long will it take until Nvidia throws another punch? Note that Nvidia didn't even have to lower prices; they could simply have kept gouging the market, or even raised MSRPs. AMD is being beaten with its own methods.

However, the availability of the 4070 Ti and other Nvidia cards is another question.

That’s a pretty big “unless”. I love how a lot of people just downplay that Radeon are absolute cheeks in anything apart from gaming. If you do, for example, any amount of work in Blender you basically have no actual choice. NV is the only game in town. The fact that AMD still hasn’t even tried to compete with OptiX is pathetic, to be blunt. How they think that they can just keep their back turned while the competition reaps the rewards is baffling to me.
You're right here. However, some time ago the same things were said about the GCN and Vega architectures, which, even as low-end gaming parts, were compute monsters, contrary to their Nvidia counterparts, which, with the exception of the very high end, were completely anemic for those tasks. Now the tables have turned, but the narrative has stayed the same.

You don't understand how this works: neither Nvidia nor AMD cares about regular consumers and professional workflows; it's not relevant for that segment. Nvidia cards perform better because of the Quadros, since they share the architecture, but Nvidia wants people to buy Quadro, not GeForce; that's why they have more than once gone out of their way to cripple productivity performance on those cards and limit the VRAM. Those market segments are distinct, regular consumers do not care about professional workflows, and you are just wrong.

Not to mention Nvidia is using CUDA to gatekeep this market; AMD could make a GPU a billion times faster and it still wouldn't matter. They'd be absolute idiots to try and focus on this one thing that simply does not matter, as it's not going to get them more market share.
I'm not trying to attack you; just a few points to note.

Everyone knows that Nvidia gatekeeps the market with CUDA and does really dirty things to their fans and consumers, heck, even to their precious clients and partners. They shit on absolutely everyone. That's a fact. Nvidia is an anti-consumer, pro-investor, trillion-dollar corporation, and it keeps growing like mushrooms after rain.
But at the same time, what prevents AMD from turning the situation around and providing a "morally" correct, open-source alternative to CUDA, along with countless other comfortable tool sets? From breaking this vicious circle and disrupting the monopoly? AMD doesn't fight that; instead it joins the same game, and might even be in collusion with Nvidia. Anybody can point out the disgusting tactics Nvidia wages and how locked-down its proprietary ecosystem is, but Nvidia does deserve credit for its many endeavours and the many SDKs it opens up to developers, direct "incentives" aside. There's no need to bribe game developers, as most already build their games around the consoles, which carry Zen 2 and RDNA 2. What is needed is to help and support developers and make the process as easy as possible, so the devs won't even care about Nvidia's fat suitcases.

Again, why can't AMD invest in its own viable, effective, comfortable and high-quality ecosystem, proprietary or not? What prevents AMD from doing so, other than greed? At this point AMD looks like the laziest company: it sits on the laurels of EPYC/Ryzen, milks them as much as possible, and only occasionally responds to its rivals. And it waves the open-source banner over its stuff mostly to offload development onto the shoulders of clients and the community.

The first couple of months of every release since Zen 2 have been spent milking trusting consumers at almost double the MSRP. Only when Intel or Nvidia try to undercut them do they bring prices down to a more sane level (still not sane). And that's despite AMD's chiplet approach being miles cheaper than Intel's and Nvidia's big monolithic dies; the products still cost the same or more regardless. Even accounting for Nvidia's premium tax, this looks like a scam. For years AMD was pouring into everyone's ears that the chiplet strategy would be both energy efficient and cost effective, and would bring their product prices down by a lot.

It also took AMD almost two decades to roll out ROCm, and then only to accompany the MI200/MI300. That shows AMD invested in it only to compete in Nvidia's AI race and grab a piece of that profitable pie. And it still isn't a complete alternative to CUDA.
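To be fair on the "alternative to CUDA" point, the consumer-visible part of ROCm mostly surfaces through frameworks rather than through a CUDA-style API of its own. A rough sketch of what that looks like in practice, assuming a ROCm build of PyTorch and an officially supported Radeon/Instinct card:

```python
# Rough sketch: the same PyTorch code path on CUDA and ROCm builds.
# Assumes a ROCm build of PyTorch on a supported AMD GPU (official support is limited).
import torch

print(torch.cuda.is_available())             # True on ROCm builds too; HIP is mapped onto the cuda namespace
print(getattr(torch.version, "hip", None))   # a HIP version string on ROCm builds, None on CUDA builds

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
x = torch.randn(4096, 4096, device=device)
y = x @ x                                    # dispatched to rocBLAS/hipBLAS on AMD, cuBLAS on Nvidia
print(y.device, y.shape)
```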

And make no mistake, the consumer cards that Nvidia sells in far greater numbers to their "dumb" fans are still backed by both gimmicks and genuinely strong features. The gimmicks are impotent real-time ray tracing that doesn't work without "deblurrers" and fake frames, plus PhysX, HairWorks, G-Sync, etc. The strong point is their encoding. As was mentioned, everyone can become a streamer or YouTuber now, and even mid-range cards from both Nvidia and Intel deliver vastly better encoding performance than even the top AMD cards can reach. AMD simply has no alternative. CUDA is just a small bonus from the perspective of the regular consumer.

Now again, what is the reason AMD has made no moves toward raising market share and market penetration for its own products? Why doesn't AMD fight for its share the way Nvidia does with its somewhat overestimated and, to some extent, BS products? AMD is not the underdog it was years ago. It has very big profit margins and tons of cash to fund any software and hardware development; nobody can say AMD is a poor company. Yet at the same time it behaves as if it were the market leader or a monopoly on every front and didn't have to do anything anymore.

Look, there was a time when Nvidia had less than 50% of the market share while having inferior products. They still invested in marketing and R&D, even while using anti-consumer tactics. Why does AMD just sit and wait for the market to come to it, without even trying?

Yes, APUs are great. This is how the absolute majority of desktops should be built, and that power-efficient; the exceptions are demanding CAD/rendering and scientific use cases. However, as you've said yourself, those are not tasks for ordinary users, who don't need an ultra-high-end dGPU to run games. The majority of gamers already use mid-range or even lower-end GPUs. What is needed is a more powerful iGPU capable of High settings at 1440p, and at this pace that's not far from reality.
But that merit is not down to AMD's sheer generosity. It is the result of AMD having a stock of unsold, lower-binned mobile chips that are not fit for mobile use.

The same goes for Ryzen. As amazing as it is, it's literally the bottom of the binning: the absolute leftovers that weren't suitable first for EPYC and then for Threadripper. And even then AMD went further and cut many features for desktop users that came with the Zen chips for free, since the silicon was already there (people mocked Intel for the same stinky tricks for years).
Even worse, they pulled a "worst of Intel" move and started to artificially limit and fragment chipset and motherboard capabilities. They let partners go rogue with their BIOS settings and damage Ryzen's image as a result. They had RAM problems during the AM5 launch, and recently the STAPM failure. The QC is nonexistent. Their motherboards cost more than Intel ones while missing many absolutely necessary and basic features, again because partners are left loose. All of this within a single socket/platform launch. This is a disaster. Intel was burned for this crap for decades; now the tables have turned, but it seems AMD isn't drawing any conclusions.

Why this matters: such incompetent behavior is dangerous not only for AMD itself but for the entire market. Lose one participant to its own reckless moves and the market collapses; the next RTX xx50 would cost a grand, if it comes out at all. Every consumer and buyer needs competition, and that's impossible when one of the participants has already given up.
And all it took for price competition to restart was for NVIDIA to release products that compete with its own products, while AMD sits in the corner and sucks its thumb.

Uh yeah, do you know why companies optimise their software for NVIDIA and not AMD? Because NVIDIA pays them to and AMD does not. Because NVIDIA understands that the ROI on that tiny expense is going to be many times greater. This is business 101, yet AMD perpetually fails to understand this.
Indeed. This is almost the Bulldozer vs Sandy Bridge drama all over again, when Intel competed only with itself for almost eight years. AMD needs to roll out its "Zen moment" for GPUs, or it will lose the consumer gaming market completely. Intel, with just a couple of years on the market and a botched Xe launch behind it, has already reached the kind of market share AMD spent a decade gaining. What is AMD going to do when Battlemage arrives? I bet Intel isn't sitting there idling on its arse.
 
Last edited:
Joined
May 3, 2018
Messages
2,881 (1.19/day)
Who cares about run-out pricing? It's being discontinued and is a crap card even at $699. The Ti Super is only worth $599 at most.
 
Joined
Jun 1, 2010
Messages
392 (0.07/day)
System Name Very old, but all I've got ®
Processor So old, you don't wanna know... Really!
The problem is that even the very limited data from the Steam HW survey shows the 4060 still has about the same share as the HD 5450. What that means is that the GPU's pricing and positioning are utter sh*t. If they were reasonable, its share would surpass the 1060 6GB in no time, even though it isn't really a worthwhile upgrade over the 3060 Ti.

Like what they did with PhysX, where they offered software support to non-Nvidia systems that was hilariously slow in order to force people to buy Nvidia GPUs. Hardware PhysX died in the end, but CUDA is a different beast.
Nvidia knows how to create the illusion of being open while driving consumers to its proprietary options.
AFAIK, PhysX still relied on CUDA, and it still used the CPU heavily, much like encoding/decoding does because of VRAM compression operations. It was still a CPU technology, just artificially locked behind a proprietary GPU.

Another question: was it ever possible to trick games into using Radeons, since there's no way they couldn't handle such a basic task?

I have a question on the Steam Hardware chart. In the GPU section it shows that the 3060 laptop GPU increased in January. The only issue with that is that the 3060 laptop GPU has not been available since 2022. How could that be?
Those might be the poor laptops that sweatshops and cafés were running Ethereum and other crypto garbage on. And what nobody talks about is where all the storage used for Chia mining has gone :rolleyes:

Are you trying your hardest to avoid the point I'm making or are you reading past it?

There is a 4060ti with 16GB.
There are midrange Ampere cards with 16GB.
These cards have no business in gaming whatsoever. There's nil advantage to them over their half VRAM counterparts (perhaps situationally, but that won't last), especially the 4060ti.

That's them catering to that exact demand right there, but on a much more 'democratic' price level. Still overpriced for what it really is. But. An Nvidia card with 16GB on the newest architecture. It can do RT. It has AI. It has creator tools. It has everything your little gamur heart wants. Yadayada. You get the gist?

I'm not telling you this makes sense in any kind of realistic economy or for real pro markets. But it makes sense in the hearts and minds of prospective buyers. Young people with little knowledge of what they might do or can do with that GPU perhaps. Somewhat more knowledgeable people that know how to put VRAM to use. Etc. There's a market here. Niche? I'm not so sure. I think a lot of people are sensitive to this.

I can't even claim I was totally insensitive to this, take for example a feature like Ansel. It's not like I would have picked Pascal over any other GPU at the time for it. But still. It's yet another neat little tool you can use, and I've pulled some pretty nifty screens for my desktop from it. All these little things really do matter. If you buy an Nvidia GPU, you get a package of added value that AMD simply cannot match. I'm now past the point of caring too much about all of that, but it's a package nonetheless.
Even if the 4060 Ti had a wider bus, the GPU still couldn't make use of all that VRAM fast enough. Maybe 10-12 GB would be better, but even that is doubtful.
 
Last edited:
Joined
Feb 20, 2019
Messages
8,339 (3.91/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
What nobody talks about is where all the storage used for Chia mining has gone :rolleyes:
Landfill!

Chia mining didn't use lots of storage, it used up lots of storage. The expected lifespan of a TLC SSD was 90 days of Chia mining per TB of capacity. I assume QLC drives didn't even last long enough to be worth bothering with. For a mechanical drive of any capacity to survive more than 6 months was apparently also an outlier, with death usually coming at the 3-5 month mark.
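That 90-day rule of thumb checks out as simple endurance arithmetic; the numbers below are assumptions for illustration (a typical ~600 TBW rating per TB of TLC and continuous plotting in the mid-single-digit TB of writes per day), not measurements:

```python
# Back-of-the-envelope SSD endurance under continuous Chia plotting.
# All inputs are illustrative assumptions, not measured values.
def days_to_rated_endurance(tbw_rating_tb: float, writes_tb_per_day: float) -> float:
    """Days until the drive's rated TBW is consumed at a constant write rate."""
    return tbw_rating_tb / writes_tb_per_day

# Assumed: a 1 TB TLC drive rated ~600 TBW, plotting writes of 5-8 TB/day.
for rate in (5.0, 6.5, 8.0):
    print(f"{rate} TB written/day -> ~{days_to_rated_endurance(600, rate):.0f} days to rated endurance")
# ~600 TBW at ~6.5 TB/day works out to roughly 90 days, matching the rule of thumb above.
```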

Even as someone who mined and holds crypto, I couldn't see the point of Chia, and I'm not really sure I see the point of Bitcoin mining. Digital, nation-independent, DeFi is the future, and Bitcoin started that, but we don't need to mine it wastefully. A successful independent DeFi doesn't have to generate 90 Mt of CO2 a year for no justifiable reason.

Even if the 4060 Ti had a wider bus, the GPU still couldn't make use of all that VRAM fast enough. Maybe 10-12 GB would be better, but even that is doubtful.
The narrow bus is exactly the 4060 Ti's problem. My own personal 4060 Ti is undervolted and underclocked to a 125 W power draw, but even hamstrung like that and rendering at just 1080p I'll run into situations where the overlay says it's not fully loaded and neither is any single CPU core. That's either a memory bandwidth or a game engine bottleneck, and I know it's not the game engine because the same scene runs at 100% GPU usage on the 4070 or 7800 XT.

It's also ROP-limited, so resolution scaling on the 4060 Ti is pathetic compared to the 3060 Ti, but since the bandwidth bottleneck is so severe we don't really get to see the ROP limitation. For the 4060 Ti to be a better 1440p card it would mostly have needed more bandwidth, but that would also just have revealed the ROP deficiency, which is more situational but still an issue holding it back.

Sadly, if you head to Wikipedia and look at the one-spec-sheet-to-rule-them-all, you can see how the 4060 Ti is really a successor to the 3060 8GB in terms of bandwidth, SM and GPC counts, and the relative position of that silicon in Nvidia's range of GPU dies. It's a long way off the 3060 Ti, and the only reason it gets close is that TSMC 4N lets Nvidia clock it 55% higher than the 3060 Ti on Samsung's underwhelming 8nm node.
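The bandwidth comparison is easy to sanity-check yourself, since peak bandwidth is just bus width times effective data rate; the figures below are the commonly listed specs, so verify them against the actual spec sheets:

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * effective data rate in Gbps.
# The specs below are the commonly listed figures; treat them as assumptions to verify.
def bandwidth_gbs(bus_bits: int, data_rate_gbps: float) -> float:
    return bus_bits / 8 * data_rate_gbps

cards = {
    "RTX 3060 8GB": (128, 15.0),   # ~240 GB/s
    "RTX 3060 Ti":  (256, 14.0),   # ~448 GB/s
    "RTX 4060 Ti":  (128, 18.0),   # ~288 GB/s, leaning on a much larger L2 cache instead
}
for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gbs(bus, rate):.0f} GB/s")
```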

That’s all I play at, usually at maxed settings, RT on. 4K/60 @ 55”
And you've enjoyed 15 months of 2022-2023 titles at 4K, I presume?

TLoU Part 1, CP2077, Hogwarts Legacy and MS Flight Sim all exceed 12GB at 4K on max settings. You'll notice it when uncapped, because it manifests initially as microstuttering and frame-pacing issues, but realistically at those settings (especially Overdrive in CP2077) you're unlikely to get much more than 60 fps in heavy scenes anyway, so masking the stuttering/pacing problems with a 60 Hz cap means it's less of an issue in those older 2022/2023 titles. Realistically, the issue with the 4070 Ti isn't its performance over the past 15 months; it's how it's going to perform in the next 15 months, now that so many more games in the development pipeline are moving to UE5 and ditching any semblance of PS4 and XB1 compatibility now that those consoles have been dropped for good.

The 4070 Ti isn't a bad card. It's objectively better than the 4070 and 4070S, both of which are considered "decent" cards. Everyone talks shit about the 4070Ti because of the asking price, and the sheer hubris/cheek/greed of Nvidia in trying to launch it at $900 as the 4080 12GB.

If you fall into the trap of comparing it to other 40-series cards on pricing, you'll end up drinking the Nvidia kool-aid and justifying the cost relative to the 4080, which was just ridiculously poor value. The reality at launch was that the $800 4070 Ti brought 3080 12GB / 3080 Ti levels of performance and VRAM to the table at the exact same performance/$ point as new retail cards at that time. It wasn't dramatically faster than the 6950 XT (another chart in the same HUB Jan '23 update), which was selling for $649, not that it has the same feature set.
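The perf/$ comparison itself is trivial to reproduce once you have review numbers; the prices and frame rates below are placeholders, not the HUB data:

```python
# Relative performance-per-dollar; placeholder numbers for illustration only,
# not Hardware Unboxed's measurements.
def perf_per_dollar(avg_fps: float, price_usd: float) -> float:
    return avg_fps / price_usd

lineup = {
    "New $800 card":      (100.0, 800.0),
    "Outgoing $800 card": (100.0, 800.0),
    "Rival card at $649": (93.0, 649.0),
}
baseline = perf_per_dollar(*lineup["New $800 card"])
for name, (fps, price) in lineup.items():
    print(f"{name}: {perf_per_dollar(fps, price) / baseline:.2f}x baseline perf/$")
```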

[Attached image: HUB GPU pricing chart]

Hardware Unboxed has been doing these monthly GPU pricing updates for a few years now, and around the 4070 Ti's launch it's easy to see why there was so much hate for the card: it's all because of the price. The only reason you can call it a decent card is that you bought it at a deep discount, which means you weren't actually price-scalped by Nvidia.
 
Last edited:
Joined
Sep 6, 2013
Messages
3,391 (0.82/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 7600 / Ryzen 5 4600G / Ryzen 5 5500
Motherboard X670E Gaming Plus WiFi / MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2)
Cooling Aigo ICE 400SE / Segotep T4 / Νoctua U12S
Memory Kingston FURY Beast 32GB DDR5 6000 / 16GB JUHOR / 32GB G.Skill RIPJAWS 3600 + Aegis 3200
Video Card(s) ASRock RX 6600 + GT 710 (PhysX) / Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes / NVMes, SATA Storage / NVMe, SATA, external storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) / 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
Another question: was it ever possible to trick games into using Radeons, since there's no way they couldn't handle such a basic task?
There wasn't. What people were doing was running patches that defeated Nvidia's lock. There was also an Nvidia driver (I think 256.something) where they had "forgotten" to implement the lock; that driver worked fine with Radeons as primary cards without needing any patches. Nvidia's excuse was that they could not guarantee CUDA and PhysX stability and performance if the main GPU was not an Nvidia one, which was a HUGE pile of BS. So people had to use software PhysX, which was totally unoptimised. Somewhere around 2014, I think, when hardware PhysX was already dead, they removed that lock. In my second system I have a GT 710 as a second card for PhysX and CUDA (more accurately, "for wasting power and occupying one PCIe slot") and it runs CUDA and PhysX fine without needing any kind of patch to activate them.

Nvidia's lock was so anti-consumer that even someone who had paid full price for an Nvidia card to use it for CUDA or PhysX, while also using a higher-end/newer AMD card as the main GPU, couldn't, because Nvidia forced the Nvidia card to be the primary one. So if I had, for example, a GeForce 9800 GT and later bought an HD 4870 to use as my primary 3D card, Nvidia punished me for not being loyal by disabling CUDA and PhysX. That was a very sh__y business practice from a company that back then had 60% of the market, less than a billion in income per quarter, and less support from the public and press. Imagine how they move in the background today, with 80%+ of the market, billions of income every quarter, and total acceptance from the public and support from the tech press. And people expect AMD to offer super-competitive options and not lose money in the process for no results.

AFAIK, PhysX still relied on CUDA, and it still used the CPU heavily, much like encoding/decoding does because of VRAM compression operations. It was still a CPU technology, just artificially locked behind a proprietary GPU.
When running on the GPU, the CPU wasn't doing much. When running on the CPU, everything went bad: the CPU maxed out and the FPS turned into a slideshow.
 
Joined
Jan 14, 2019
Messages
12,577 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Still more than what I'd consider reasonable for a 12 GB card, but it's good to see that the price wars have started. :)
 
Joined
Jan 27, 2024
Messages
298 (0.90/day)
Processor Ryzen AI
Motherboard MSI
Cooling Cool
Memory Fast
Video Card(s) Matrox Ultra high quality | Radeon
Storage Chinese
Display(s) 4K
Case Transparent left side window
Audio Device(s) Yes
Power Supply Chinese
Mouse Chinese
Keyboard Chinese
VR HMD No
Software Android | Yandex
Benchmark Scores Yes
Still more than what I'd consider reasonable for a 12 GB card, but it's good to see that the price wars have started. :)

There are at least 3 games which need more than 12 GB to avoid VRAM-induced performance drops and stutter.

How does Nvidia even define how much VRAM to put on their cards in order to meet the games' hardware requirements? :rolleyes:

Game                        | Average VRAM use (GB) | Peak VRAM use (GB)
The Last of Us Part 1       | 11.8                  | 12.4
Cyberpunk 2077 (Overdrive)  | 12.0                  | 13.6
Hogwarts Legacy             | 12.1                  | 13.9

[Attached image: VRAM usage chart]
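As for the rhetorical question above, the capacity options mostly fall out of the bus width: each 32-bit memory channel carries one GDDR6/6X chip, currently 1 GB or 2 GB, and clamshell mode doubles that. A small sketch (chip densities assumed):

```python
# VRAM capacity options implied by bus width: one GDDR6/6X chip per 32-bit channel,
# 1 GB or 2 GB per chip today, doubled in clamshell mode. Densities are assumptions.
def vram_options_gb(bus_bits: int, densities_gb=(1, 2), clamshell=False):
    channels = bus_bits // 32
    factor = 2 if clamshell else 1
    return [channels * d * factor for d in densities_gb]

for bus in (128, 192, 256, 384):
    print(f"{bus}-bit: {vram_options_gb(bus)} GB, clamshell: {vram_options_gb(bus, clamshell=True)} GB")
# A 192-bit card like the 4070 Ti lands on 12 GB with 2 GB chips; 16 GB needs either a
# 256-bit bus or a 128-bit clamshell layout (the 4060 Ti 16GB route).
```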


 
Joined
Jan 14, 2019
Messages
12,577 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
There are at least 3 games which need more than 12 GB to avoid VRAM-induced performance drops and stutter.

How does Nvidia even define how much VRAM to put on their cards in order to meet the games' hardware requirements? :rolleyes:

Game                        | Average VRAM use (GB) | Peak VRAM use (GB)
The Last of Us Part 1       | 11.8                  | 12.4
Cyberpunk 2077 (Overdrive)  | 12.0                  | 13.6
Hogwarts Legacy             | 12.1                  | 13.9

View attachment 334003

1. VRAM usage and VRAM allocation aren't the same thing. A lot of games allocate more than 12 GB VRAM if available, but run fine on an 8 GB card. Some of them have texture and asset loading issues if there isn't enough VRAM, but the FPS looks fine. There is no blanket statement here, unfortunately.

2. Not everyone plays in 4K. Up to 1440p, 12 GB is still fine, in my opinion. How fine it will be in the near future when the PS5 Pro is out and new games get developed for it, we'll see.
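If you want to see what your own card reports, here is a small sketch using the NVML Python bindings; note that the "used" figure NVML returns counts allocations, which is exactly why overlay numbers tend to overstate what a game strictly needs:

```python
# Query reported VRAM via NVML (pip install nvidia-ml-py).
# "used" counts allocations across all processes, not the active working set,
# which is why allocation-based overlays overstate real requirements.
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                    nvmlDeviceGetMemoryInfo, nvmlDeviceGetName)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)
    mem = nvmlDeviceGetMemoryInfo(handle)
    print(nvmlDeviceGetName(handle))
    print(f"total: {mem.total / 2**30:.1f} GiB, "
          f"allocated: {mem.used / 2**30:.1f} GiB, "
          f"free: {mem.free / 2**30:.1f} GiB")
finally:
    nvmlShutdown()
```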
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
There are at least 3 games which need more than 12 GB to avoid VRAM-induced performance drops and stutter.

How does Nvidia even define how much VRAM to put on their cards in order to meet the games' hardware requirements? :rolleyes:

Game                        | Average VRAM use (GB) | Peak VRAM use (GB)
The Last of Us Part 1       | 11.8                  | 12.4
Cyberpunk 2077 (Overdrive)  | 12.0                  | 13.6
Hogwarts Legacy             | 12.1                  | 13.9

View attachment 334003

1. The Last of Us and Hogwarts Legacy are shitty console ports that make no effort to manage resources. That's not NVIDIA's fault, yet you're blaming NVIDIA. No logic.
2. NVIDIA has never positioned the 4070 series or anything lower as 4K cards. Therefore, if you run 4K on anything below a 4080 (which has perfectly sufficient memory for that resolution because it's designed for it) and complain about the experience, that's on you. It's really simple: if you want to run games at 4K, buy the GPU designed for 4K. Can't believe I have to explain this, but here we are.
VRAM usage and VRAM allocation aren't the same thing. A lot of games allocate more than 12 GB VRAM if available, but run fine on an 8 GB card. Some of them have texture and asset loading issues if there isn't enough VRAM, but the FPS looks fine. There is no blanket statement here, unfortunately.
Don't waste your time explaining that; people whose only agenda is to parrot "NVIDIA doesn't have enough VRAM REEEEEEEEE" don't care about facts.
 
Last edited by a moderator:
Joined
Jun 2, 2017
Messages
9,371 (3.39/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
There wasn't. What people were doing was running patches that defeated Nvidia's lock. There was also an Nvidia driver (I think 256.something) where they had "forgotten" to implement the lock; that driver worked fine with Radeons as primary cards without needing any patches. Nvidia's excuse was that they could not guarantee CUDA and PhysX stability and performance if the main GPU was not an Nvidia one, which was a HUGE pile of BS. So people had to use software PhysX, which was totally unoptimised. Somewhere around 2014, I think, when hardware PhysX was already dead, they removed that lock. In my second system I have a GT 710 as a second card for PhysX and CUDA (more accurately, "for wasting power and occupying one PCIe slot") and it runs CUDA and PhysX fine without needing any kind of patch to activate them.

Nvidia's lock was so anti-consumer that even someone who had paid full price for an Nvidia card to use it for CUDA or PhysX, while also using a higher-end/newer AMD card as the main GPU, couldn't, because Nvidia forced the Nvidia card to be the primary one. So if I had, for example, a GeForce 9800 GT and later bought an HD 4870 to use as my primary 3D card, Nvidia punished me for not being loyal by disabling CUDA and PhysX. That was a very sh__y business practice from a company that back then had 60% of the market, less than a billion in income per quarter, and less support from the public and press. Imagine how they move in the background today, with 80%+ of the market, billions of income every quarter, and total acceptance from the public and support from the tech press. And people expect AMD to offer super-competitive options and not lose money in the process for no results.


When running on the GPU, the CPU wasn't doing much. When running on the CPU, everything went bad: the CPU maxed out and the FPS turned into a slideshow.
I was one of the "fools" who bought an Nvidia card for PhysX, only to have the card not even be detected by Windows, of all things. It was dastardly, because the card would work fine as long as no AMD card was present.
 
Joined
Jan 27, 2024
Messages
298 (0.90/day)
Processor Ryzen AI
Motherboard MSI
Cooling Cool
Memory Fast
Video Card(s) Matrox Ultra high quality | Radeon
Storage Chinese
Display(s) 4K
Case Transparent left side window
Audio Device(s) Yes
Power Supply Chinese
Mouse Chinese
Keyboard Chinese
VR HMD No
Software Android | Yandex
Benchmark Scores Yes
Not everyone plays in 4K. Up to 1440p, 12 GB is still fine, in my opinion.

You don't buy an $800-850 card only to be told: no 4K for you, you'll be stuck at 1080p and 1440p only! :roll:
 
Joined
Jan 14, 2019
Messages
12,577 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
You don't buy an $800-850 card only to be told: no 4K for you, you'll be stuck at 1080p and 1440p only! :roll:
As someone who played at 1080p with a 2070, then a 6750 XT, then a 7800 XT for a while before I upgraded to UW 1440p, I strongly disagree.

If you only want to play current games and swap out your GPU with every new generation, then go ahead, but I do think that having some reserve potential in your system for future games isn't a bad idea.
 
Joined
Jan 27, 2024
Messages
298 (0.90/day)
Processor Ryzen AI
Motherboard MSI
Cooling Cool
Memory Fast
Video Card(s) Matrox Ultra high quality | Radeon
Storage Chinese
Display(s) 4K
Case Transparent left side window
Audio Device(s) Yes
Power Supply Chinese
Mouse Chinese
Keyboard Chinese
VR HMD No
Software Android | Yandex
Benchmark Scores Yes
Still more than what I'd consider reasonable for a 12 GB card, but it's good to see that the price wars have started. :)

Especially when you've earlier said that ^^^^
So, how much would you be willing to spend on that 12 GB card and what are you going to use it for if you did spend?

As someone who played at 1080p with a 2070, then a 6750 XT, then a 7800 XT for a while before I upgraded to UW 1440p, I strongly disagree.

But that is a 4070 Ti, a tier higher. And the 2070 was a 2018 thing; it's 2024 now, and you'll be using the new card for at least a few more years.
 
Joined
Jan 14, 2019
Messages
12,577 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Especially when you've earlier said that ^^^^
So, how much would you be willing to spend on that 12 GB card and what are you going to use it for if you did spend?
400-450 GBP max.

But that is a 4070 Ti, a tier higher. And the 2070 was a 2018 thing; it's 2024 now, and you'll be using the new card for at least a few more years.
What's your point? All I'm saying is, if someone (myself included) was happy with mid-range GPUs at 1080p for a long time, then I don't see why buying a high-end one for anything less than 4K would be a bad idea.
 
Joined
Sep 19, 2014
Messages
73 (0.02/day)
It's pretty simple: Nvidia is ripping you off enormously.

AMD tried to as well, but they got their shit together.

500 for the 7800 XT
700 for the 7900 XT

is almost a normal price. It should have been the release price, but we can't get everything.

Whoever thinks any 70 or 70 Ti card is worth 700-900€ has lost their damn mind, especially with that VRAM greed and the ridiculous 60-class bus width.

Meanwhile a 7900 XT gives you anywhere from 4070 Ti Super to 4080 performance, depending on the game.

Bit bus... hmm.

Who cares about the memory bus if the GPU is fast?
My last car was a 3.7L V6 Mustang.
Now I've got an MB AMG CLA 2.0 and that thing is fast, with only a 2.0L 4-cylinder engine.

So I don't care how they build a GPU as long as it's fast and runs all my games.

True that, but it's getting better by the month. We are still feeling the aftermath of mining and scalping; this will take years. But consumers are also stupid as shit: accepting a 1600€ gaming card and calling it a good deal while it objectively has the worst value is insanity.

The funny thing is the 4090 is so cut down it would barely make an actual 80 Ti card.

But hey, people bought the freaking 3090 for double the price of a 3080 while it was only about 10% faster. Stupidity has no bounds, especially among gamers, as you can see from the gaming industry.

Nvidia lied when they said the 90-class card would replace the Titans. The Titan RTX burns the 3090 in some productivity tasks because it was a real Titan, not a wannabe created so they could double the prices. And people are eating it up.
But maybe those of us who bought the 3090 are so smart we can make money almost out of thin air... or lucky enough to have the money to spend on a 3090.

You can buy a Lambo or a Toyota,
but even Toyota owners can buy a 3090.
I just want to say that the 3090 is not a smart buy, but many people can buy one without selling a kidney.

I wish I could buy better English writing talent... but I can't, so I buy the best GPU money can buy.
 
D

Deleted member 237813

Guest
I could buy 1,000 4090s instantly; it's not about that, but you don't get it.
 