
NVIDIA GeForce RTX 4070 Ti to Launch $100 Cheaper Than RTX 4080 12 GB

IQ tastes?
Preferred amount of sharpening, tolerance for ghosting, shimmer, etc. can vary wildly from person to person. Personally, shimmer is my biggest bugbear, and that will not come across in screenshots.
You know that you can just prove me wrong by showing me some screenshots or something, right? Lord Jensen has been kind enough to allow screenshots of games running DLSS to be viewable by peasants on anything that has a screen, in order to do comparisons on them.
Why would I want to send you a screenshot of something that should absolutely be seen in motion, rendered live before your eyes, not compressed on YouTube, on the display of your choice, to be accurately judged? I have no interest in nitpicking zoomed stills from a third party when I can pause a game, toggle either on or off, and immediately see what jumps out at me, which would absolutely not come across in a screenshot.
 
So still around $1100 CAD. Plus another $100+ for the AIB markup. How generous of Nvidia...

I thought paying $800 for my 2070 Super was bad.
It is bad for the greenies.
 
Why would I want to send you a screenshot of something that should absolutely be seen in motion, rendered live before your eyes, not compressed on YouTube, on the display of your choice, to be accurately judged? I have no interest in nitpicking zoomed stills from a third party when I can pause a game, toggle either on or off, and immediately see what jumps out at me, which would absolutely not come across in a screenshot.

Sorry, but that's just straight-up cope; you just don't want to admit the reality here, which is that these things look indistinguishable for the most part. If DLSS really did look noticeably better, that would be visible in motion as well as in screenshots. Unless, of course, you think the image you end up seeing on the screen will somehow be magically different from the screenshot you take.

when I can pause a game, toggle either on or off and immediately see what jumps out at me
What exactly jumps out at you? Isn't the point of DLSS to not notice any difference, actually? That when you turn DLSS on you see no difference between it and native? :kookoo:
 
It's still high, but better than I expected. Hopefully it helps nudge the market in my favour, though I can't see it mattering.
 
The problem with today's dGPU market is that there is nothing to get hyped about anymore. No value GPUs, like the 1060, 1080 Ti, or 480... It's all a shameless money grab: lifting profit margins through the roof, maximizing short-term gains over the long-run health of the market. The way things are going, AA(A) PC gaming will no longer be a thing in a decade or even less, because publishers and devs just won't bother porting big titles to PC anymore. Nvidia and AMD are effectively killing the PC gaming hobby. Would it have killed them to offer 3070-level performance for 350€? Hell no, but why would they when they can still charge 600€ for it and get away with it? They can still sell these GPUs to Chinese companies if we don't budge. It's all F..ed up.
 
Sorry, but that's just straight-up cope; you just don't want to admit the reality here, which is that these things look indistinguishable for the most part. If DLSS really did look noticeably better, that would be visible in motion as well as in screenshots. Unless, of course, you think the image you end up seeing on the screen will somehow be magically different from the screenshot you take.
Yet another simple concept lost on you: it turns out that something that is viewed in motion should also be judged in motion, and that taking a sample of one frame from said motion does not and cannot convey motion artefacts such as shimmer. Read that and reread it till you understand it.
 
Yet another simple concept lost on you: it turns out that something that is viewed in motion should also be judged in motion, and that taking a sample of one frame from said motion does not and cannot convey motion artefacts such as shimmer. Read that and reread it till you understand it.

OK, so the frames you see on the monitor are in fact magically different from the ones you screenshot. Glad you cleared that up for us, because there is no other way to interpret that.

By the way, the reason for shimmering is that thin visual elements are discontinuous in the image and shift their location from frame to frame. Meaning that if you see small details breaking up in a screenshot, you're also gonna see it in motion. By the way, both FSR 2 and DLSS are known for plenty of shimmering and also ghosting; it's not like DLSS is immune to that. Nothing noteworthy here.

Matter of fact, any kind of image reconstruction/upscaling with a temporal component will have shimmering and ghosting; it's unavoidable.
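The mechanism described above can be sketched with a toy example (purely illustrative, not how DLSS or FSR actually sample anything): point-sampling a sub-pixel-wide line that drifts slowly across a low-resolution row makes its pixel coverage flip on and off from frame to frame, which is exactly the flicker perceived as shimmer. Any single frame from the sequence shows only "detail present" or "detail missing"; the flicker itself only exists across frames.

```python
# Toy shimmer demo: a pixel is lit only if its centre falls inside a
# thin vertical line, so a line narrower than a pixel that drifts by a
# fraction of a pixel per frame pops in and out of existence.

def rasterize(line_x, width, n_pixels):
    """Point-sample a thin vertical line over one row of pixels."""
    row = []
    for p in range(n_pixels):
        centre = p + 0.5  # pixel centres sit at half-integer positions
        row.append(1 if line_x <= centre < line_x + width else 0)
    return row

# A line 0.4 px wide drifting 0.3 px per frame across a 6-pixel row.
frames = [rasterize(1.0 + 0.3 * f, 0.4, 6) for f in range(5)]
for f, row in enumerate(frames):
    print(f, row)

# Per-frame coverage is [0, 1, 0, 0, 1]: the line blinks on and off,
# even though it moves smoothly in the underlying scene.
```

Temporal upscalers jitter the sample positions and blend previous frames precisely to fill in these gaps, which is also why they trade shimmer against ghosting.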
 
OK, so the frames you see on the monitor are in fact magically different from the ones you screenshot. Glad you cleared that up for us, because there is no other way to interpret that.

By the way, the reason for shimmering is that thin visual elements are discontinuous in the image and shift their location from frame to frame. Meaning that if you see small details breaking up in a screenshot, you're also gonna see it in motion. By the way, both FSR 2 and DLSS are known for plenty of shimmering and also ghosting; it's not like DLSS is immune to that. Nothing noteworthy here.

Matter of fact, any kind of image reconstruction/upscaling with a temporal component will have shimmering and ghosting; it's unavoidable.
Tell me you haven't seen DLSS in motion without telling me.

You could help yourself to some in-depth analysis of both techniques; it's evident you require it. Digital Foundry can help you there, should you choose to want to learn. I can't help you interpret things with more intelligence and less purposeful intellectual dishonesty; only you can. I'd hope by now that, as someone with access to test both back to back, I've made it clear you can't convince me with third-party cherry-picked stills that they look the same in motion, so for now I put your opinion of DLSS squarely in the worthless bucket.

Enjoy your stills mate.
 
Tell me you haven't seen DLSS in motion without telling me.

You could help yourself to some in-depth analysis of both techniques; it's evident you require it. Digital Foundry can help you there, should you choose to want to learn. I can't help you interpret things with more intelligence and less purposeful intellectual dishonesty; only you can. I'd hope by now that, as someone with access to test both back to back, I've made it clear you can't convince me with third-party cherry-picked stills that they look the same in motion, so for now I put your opinion of DLSS squarely in the worthless bucket.

Enjoy your stills mate.

Surely at some point you'll realize the absurdity of your argument here, which is the following:

"My anecdotal experience of what I see proves that I am right, as opposed to the objective evidence you and everyone else can take a look at, which is wrong".

Cool, talk about worthless opinions. I actually think FSR looks like 8K native, because that's what I am seeing on my monitor in motion and no screenshots can prove me wrong.
 
Wasn't that during the mining boom?
Yep, and that's why the 3080 12 GB and Ti models came in priced so high: all the other previously released models were priced so high at the retailers.

All that overpricing and selling of cards back then has impacted the pricing we see now, and I don't know if things will ever return to a more normal pricing scheme.
 
Yep, and that's why the 3080 12 GB and Ti models came in priced so high: all the other previously released models were priced so high at the retailers.

All that overpricing and selling of cards back then has impacted the pricing we see now, and I don't know if things will ever return to a more normal pricing scheme.
They have to; dGPU sales are down 42% year over year. Something's got to give. I have a feeling we will see lots of AIB bankruptcies in 2023.
 
So still around $1100 CAD. Plus another $100+ for the AIB markup. How generous of Nvidia...

I thought paying $800 for my 2070 Super was bad.
I hear similar complaints from a lot of Europeans. Don't put that on Nvidia when it's your country's weak currency and outrageous taxes. That so-called free medical care isn't as free as some would like to think.
 
A card with an MSRP of $800 that may well sit between the 3090 and 3090 Ti at 1440P, the resolution this card is meant for. :wtf:
Yes, a 4070 Ti, a mid-range card that costs $800 with literally no improvement in performance per dollar. That's terrible value and total stagnation, which in my book is bad, so meh it is.
 
It will end up selling closer to $1000 than to its fake MSRP, just like the 3070 is still being sold for much more than its fake MSRP. A tradition of fake MSRPs, it seems...
 
A 192-bit bus and 12 GB of VRAM don't seem attractive for 4K-8K resolution in future Unreal Engine 5 games...
 
A 192-bit bus and 12 GB of VRAM don't seem attractive for 4K-8K resolution in future Unreal Engine 5 games...
That card is meant for gaming at 1440P. If you want a card for gaming at 4K then look at the likes of the 4080, 4090 or 7900 XTX.
 
A 192-bit bus and 12 GB of VRAM don't seem attractive for 4K-8K resolution in future Unreal Engine 5 games...
Hi,
Yep, the 16 GB 4080 is $1,200 though :laugh:
 
A 192-bit bus and 12 GB of VRAM don't seem attractive for 4K-8K resolution in future Unreal Engine 5 games...
It feels like Nvidia wants to arbitrarily segment the market, hence the 192-bit memory bus. The 4070 (Ti) is now a 1440p GPU. Midrange starting at 800 bucks. You'll have to dig even deeper in your pockets for a 4080, or better a 4090, if you want to game at 4K and above. That's Jensen's greedy logic. I really hope people don't buy into this shit.
 
That card is meant for gaming at 1440P. If you want a card for gaming at 4K then look at the likes of the 4080, 4090 or 7900 XTX.
The 4070 (Ti) is now a 1440p GPU. Midrange starting at 800 bucks. You'll have to dig even deeper in your pockets for a 4080, or better a 4090, if you want to game at 4K and above

Previous leaked benchmarks point towards the 4070 Ti being about on par with the 3090 Ti. The much smaller bus will make a difference in practice, but so will the new architecture.

Let's not be ridiculous; this card will be more than fine at 4K. Even the 3070 Ti already did more than 60 fps easily at 4K, and the 4070 Ti will as well.

[Chart: average FPS at 3840×2160]
 
Previous leaked benchmarks point towards the 4070 Ti being about on par with the 3090 Ti. The much smaller bus will make a difference in practice, but so will the new architecture.

Let's not be ridiculous; this card will be more than fine at 4K. Even the 3070 Ti already did more than 60 fps easily at 4K, and the 4070 Ti will as well.

[Chart: average FPS at 3840×2160]
60 FPS is fine if you don't play first-person shooters or you've been hooked on a console for years. Otherwise it's all about frames per second, as in smoothness of gameplay.
 
I'm sure limited quantities will be available to help maintain the illusion that cards are selling like hotcakes...

Or maybe I'll be surprised and inventory will sit on the shelves like the 4080s did for weeks at my local Micro Center before finally selling out.
There is no illusion; the market is rejecting badly priced GPUs. Multiple Micro Centers have hundreds of 4080 and 7900 XT cards in stock. The 7900 XT is showing a slight price reduction to $879.99:
XFX Radeon RX 7900XT Gaming Graphics Card with 20GB GDDR6, AMD RDNA 3 RX-79TMBABF9 https://a.co/d/7pC3ouZ
The 4090s get restocked at about 25 units twice per week per store and do sell out almost instantly. The only ploy is that Nvidia and AMD priced the current gen to deplete last gen, and now last gen is almost entirely depleted in terms of high-end flagships. Naturally, prices should come down in one or two quarters; it's a matter of who blinks first.
 
60 FPS is fine if you don't play first-person shooters or you've been hooked on a console for years. Otherwise it's all about frames per second, as in smoothness of gameplay.

I mentioned 60 fps as a lowball; feel free to look at the actual chart with an average of 84, and more than 90 in shooters like Doom Eternal or Battlefield (certainly a fuck ton in competitive low-resource games). And this is only the 3070 Ti, with values taken at highest settings (which is not very reasonable).

Saying the 4070 Ti will be a 1440p card is pure nonsense.
 
If people would stop paying so much for these damn cards...
I don't know how accurate this graph is.

But it says two things:
There are more Nvidia fanboys than AMD fanboys
and
Nvidia fanboys have more money
;)

And if I were Jensen and saw this graph, I would double the price of all cards and laugh all the way to the bank.

[Attached graph]
 
I mentioned 60 fps as a lowball; feel free to look at the actual chart with an average of 84, and more than 90 in shooters like Doom Eternal or Battlefield (certainly a fuck ton in competitive low-resource games). And this is only the 3070 Ti, with values taken at highest settings (which is not very reasonable).

Saying the 4070 Ti will be a 1440p card is pure nonsense.
There's a reason they sell 170 Hz and 240 Hz gaming monitors. No first-person shooter gamer wants to see an average of less than 150 FPS.
 