
NVIDIA: Turing Adoption Rate 45% Higher Than Pascal, 90% of Users Buying Upwards in Graphics Product Tiers

I kinda get what you're trying to say, but the fact is this apparently "mid-range" Turing GPU smashes everything but the big daddy Turing GPU. What that says about the competition is up for debate, clearly, and Nvidia hasn't even gone to 7 nm yet.

In the real world people look at price and performance.
It does smash everything before it, as it is supposed to. Whether Nvidia classifies something as high, mid or low range depends on the chip. All x04 chips are mid-range. You used to be the one who understood this best and promoted it. It's rated "within each family".
 
I kinda get what you're trying to say, but the fact is this apparently "mid-range" Turing GPU smashes everything but the big daddy Turing GPU. What that says about the competition is up for debate, clearly, and Nvidia hasn't even gone to 7 nm yet.

In the real world people look at price and performance.

Well, the midrange Pascal beat the high-end Maxwell. The midrange Maxwell beat the high-end Kepler. The midrange Kepler beat the high-end Fermi.

If anyone today is looking at buying a new 1080 Ti, they can expect to pay as much for it as the 2080 Ti sells for, which is silly.
 
Seems we are arguing semantics at this point. Of course the 104 chips are mid-tier, but that doesn't mean the performance is... relative to the pathetic competition out there, and that is what dictates the price.
 
1. Of course no one knows, but look at the price drops we have seen already: the 2060, 2070, and 2080 are down by hundreds of dollars.

I don't recall seeing any MSRP drops. What I see is price gouging ending.
 
A 980 Ti costs $420 with import and shipping; that's all you need to know :)
 
Seems we are arguing semantics at this point. Of course the 104 chips are mid-tier, but that doesn't mean the performance is... relative to the pathetic competition out there, and that is what dictates the price.

Every new generation, gamers will struggle with the idea that their expensive last-generation GPU has been passed by due to new tech. Performance is the most important factor at the end of the day, as you said. Knowing what you're buying is important too.

Nvidia has been overcharging for midrange GPUs for 7 years now. It's gotten ridiculous with the 2080 (non-Ti), and if gamers keep going along with their shenanigans, it will just get worse unless Intel enters the ring with reasonable prices next year. I think Intel will offer competition with Nvidia, but their prices may be ridiculous too.
 
Nvidia didn't do a flawless launch, but at least now people should stop calling Turing a failure.
Well, Turing is not a failure, of course - except in terms of prices and common sense. Other than that, ray tracing is a nice feature - though not well executed by Nvidia - and there are also some performance gains - but not for $1500, come on.
When the 2080 is beating out the 1080 Ti by a fair margin, that's not midrange. That's top tier. The 2060 and 2070 are mid-range.
Yeah, we all know you love your 2080, NV fanboy. TU104 is a midrange chip, and the pity is that Nvidia charges a premium for it. Especially considering that the TU104 in the 2080 is slightly cut down, as is the TU102, and then there's the Titan RTX price - all this raises the issue even more, technically making, like you said, a "troll" of anybody who justifies Ngreedia at this moment.

P.S. Open your eyes wider and you'll see there's a much smaller difference between the 1080 Ti and the 2080 - smaller than TPU states in their review, and definitely not "smashing".
 
It does smash everything before it, as it is supposed to. Whether Nvidia classifies something as high, mid or low range depends on the chip. All x04 chips are mid-range. You used to be the one who understood this best and promoted it. It's rated "within each family".
The x00/x02, x04, x06, x07/x08 names do not classify the target market for the chip, but the size and/or feature set of the chip within that chip family.

Back when the GTX 680 launched, it featured a GK104 chip. In the consumer GPU market back then, it was clearly a high-end product. The GTX 1080 launched similarly with a GP104 chip, and it was a high-end product for a few months. What is classified as high-end, mid-range and low-end at any time depends on whatever is present in the market at that time. The product maker's intended market placement is irrelevant, and so is pricing. If, e.g., AMD decided to price a Polaris product at $2000, it would still not be a high-end product, and no one in their right mind would call it so.

The only sensible way to segment the market is by making a scale between the highest and lowest performing products on the market, dividing that scale into three, and grouping products accordingly. We can argue about where exactly to draw the line between mid-range and high-end, but regardless of whether you use a linear or slightly exponential scale, you will end up with close to the same thing: in the current market the RTX 2080 Ti is high-end, the RTX 2080 and GTX 1080 Ti are in the grey area between upper mid-range and high-end, the Radeon VII, RTX 2070, RTX 2060, GTX 1660 Ti, Vega 56 and Vega 64 are all mid-range, and the GTX 1660 is in the transition between mid-range and low-end.
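
To make that concrete, here is a minimal Python sketch of the linear three-way split described above. The performance scores are hypothetical placeholders, not real benchmark data:

```python
# A minimal sketch of the linear three-way split described above.
# Performance scores are hypothetical placeholders, NOT real benchmark data.
scores = {
    "RTX 2080 Ti": 100,
    "RTX 2080": 72,
    "GTX 1080 Ti": 71,
    "Radeon VII": 65,
    "RTX 2070": 62,
    "RTX 2060": 55,
    "GTX 1660 Ti": 50,
    "GTX 1660": 42,
    "GT 1030": 12,
}

def tier(score: float, lowest: float, highest: float) -> str:
    """Classify a score by splitting the full performance range into thirds."""
    third = (highest - lowest) / 3
    if score >= highest - third:
        return "high-end"
    if score >= lowest + third:
        return "mid-range"
    return "low-end"

lowest, highest = min(scores.values()), max(scores.values())
for card, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{card:12} {tier(score, lowest, highest)}")
```

An exponential variant would just transform the scores before splitting; with either mapping, cards sitting near a cutoff (like the 2080 and 1080 Ti in these placeholder numbers) land in the grey area described above.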
 
Well, Turing is not a failure, of course - except in terms of prices and common sense. Other than that, ray tracing is a nice feature - though not well executed by Nvidia - and there are also some performance gains - but not for $1500, come on.

Yeah, we all know you love your 2080, NV fanboy. TU104 is a midrange chip, and the pity is that Nvidia charges a premium for it. Especially considering that the TU104 in the 2080 is slightly cut down, as is the TU102, and then there's the Titan RTX price - all this raises the issue even more, technically making, like you said, a "troll" of anybody who justifies Ngreedia at this moment.

P.S. Open your eyes wider and you'll see there's a much smaller difference between the 1080 Ti and the 2080 - smaller than TPU states in their review, and definitely not "smashing".

I've been called an Nvidia fanboy as well, because for the last 12 years I have bought Nvidia GPUs for my rigs and laptops, except for an AMD APU in one of them. I've been called an AMD fanboy for telling the truth about the GTX 680 and for speaking out against the shenanigans Nvidia pulled with the 970. I bought both of them and was very pleased with the performance of both, and I was still very pleased with the 970's performance after the news broke about it. It didn't change the benchmarks from before.

The x00/x02, x04, x06, x07/x08 names do not classify the target market for the chip, but the size and/or feature set of the chip within that chip family.

Back when the GTX 680 launched, it featured a GK104 chip. In the consumer GPU market back then, it was clearly a high-end product. The GTX 1080 launched similarly with a GP104 chip, and it was a high-end product for a few months. What is classified as high-end, mid-range and low-end at any time depends on whatever is present in the market at that time. The product maker's intended market placement is irrelevant, and so is pricing. If, e.g., AMD decided to price a Polaris product at $2000, it would still not be a high-end product, and no one in their right mind would call it so.

The only sensible way to segment the market is by making a scale between the highest and lowest performing products on the market, dividing that scale into three, and grouping products accordingly. We can argue about where exactly to draw the line between mid-range and high-end, but regardless of whether you use a linear or slightly exponential scale, you will end up with close to the same thing: in the current market the RTX 2080 Ti is high-end, the RTX 2080 and GTX 1080 Ti are in the grey area between upper mid-range and high-end, the Radeon VII, RTX 2070, RTX 2060, GTX 1660 Ti, Vega 56 and Vega 64 are all mid-range, and the GTX 1660 is in the transition between mid-range and low-end.

The problem comes in with the terminology used. Most people I have come across on tech sites rank an architecture as entry-level, midrange, and high-end. When the GTX 1080 launched 11 months before the 1080 Ti, it was the fastest Pascal, but it wasn't the high-end gaming Pascal. Why does the terminology we use matter? Because this is a tech site, and people who don't know very much about GPUs land here from a Google link seeking advice.

There are still people around who consider the GTX 680 to be a Kepler flagship because of that terminology.

If Nvidia had launched the RTX 2060 first, like they launched the first Maxwell (750 and 750 Ti), would anyone call that GPU a high-end Turing?
 
For gaming, there is obviously no point in upgrading from 10xx to 20xx. For rendering, yeah, maybe in the long run, as most renderers don't fully support RTX yet. I'm certainly skipping the 2xxx series. Too power hungry vs Pascal, very little incentive to upgrade, even considering tangible gains in rendering times. Twice the price for 15-20% more? I'm not that daft, and nobody really should be.

Next-gen RTX, yeah, maybe. When professional engines mature and embrace the technology. Right now, not a chance. :shadedshu: Nvidia's CEO feels the heat and his underwear is probably already on fire...
 
Too power hungry vs Pascal

I'd have to say this is the first time I have heard someone call Turing power hungry. Perhaps compared to Pascal, like you say, but it's still good compared to the rest of the market.
 
Consumerism at its finest.
I hope we get a good <75 W card someday.
 
For gaming, there is obviously no point in upgrading from 10xx to 20xx.
When has there ever been any point in upgrading every generation, historically speaking?

I usually recommend skipping at least one generation, and buying something that is not the absolute minimum, so it will at least be satisfying for a while…
 
When has there ever been any point in upgrading every generation, historically speaking?
There have been times when it was justified. For example, back in the ATI days, upgrading from the Radeon 8000 to the 9000 series was a very good move, because the performance jump was big and there were new must-have features. Same with the GeForce 7000 to 8000: a big jump in performance and new feature sets.

This generation, the RTX cards have given a justifiable jump in performance and a new feature set.

I usually recommend skipping at least one generation, and buying something that is not the absolute minimum, so it will at least be satisfying for a while…
I'm normally with you on that one, but there are exceptions to that rule, and this generational jump is one of them.
 
So, here we have a fresh review from TechSpot covering RTX in SoTR, including the entire RTX line. So what do we have in the end?
1. The first thing to note is that the Medium ray tracing setting doesn't affect the quality of shadows in most situations, because it only ray traces point light shadows... when outdoors, the medium mode uses standard Ultra-quality shadow maps... shadows from the sun are not ray traced... there is no appreciable difference between DXR off and DXR medium... the only way to notice a difference is with the High and Ultra settings.

And then - the results. Mostly poor. If you really want the quality improvements that ray tracing is supposed to provide, you should prefer High and Ultra, of course. The entire RTX line fails to maintain decent minimum FPS - only the 2080 Ti is almost playable, but 40 FPS even at 1080p is bad, 4K is unplayable altogether, and 1440p is only for 60 Hz gaming, with poor minimum FPS and - I guess - frametimes. The RTX 2060 looks like a fifth wheel, even at 1080p.

lexluthermiester said:
This generation, the RTX cards have given a justifiable jump in performance and a new feature set.
Yeah, performance for the same price, and not even close to justifiable when RTX is "ON".
 
Nvidia prices are too high, their practices are awful, they only care about money... then goes and buys an RTX 2080. I get the impression there are quite a few hypocrites among PC users.
 
Remember when the new gen of cards beat the old gen of cards in every category/price point yet were still about the same price?
Nvidia is trying to pretend that old generations of cards don't depreciate just because they overproduced due to the mining craze, and now they expect consumers to either buy all their old stock at non-depreciated prices or pay the new-gen tax for them tacking on what equates to PhysX 2.0 shenanigans.

Yeah, performance for the same price, and not even close to justifiable when RTX is "ON".
I have to agree; justifiable would be running at least at 60 FPS, which doesn't happen in most cases.
You also have the shenanigans of forcing DLSS to enable ray tracing. That's about as bad as Unreal Engine games having ultra settings but using less than 100% resolution scaling to make up for the performance hit, which is exactly what DLSS is doing.

I was seriously considering getting a 1660, but once again Nvidia is forcing market segmentation by being cheap and not giving it 8 GB of VRAM like every other modern GPU, so I guess I will just keep my money and wait.
 
Remember when the new gen of cards beat the old gen of cards in every category/price point yet were still about the same price?
I think it might be your mind playing tricks on you.
To refresh your memory:
[chart: Nvidia flagship GPU launch prices by generation, with a column adjusted to 2017 dollars]

Prices have varied a lot from generation to generation.
 
I think it might be your mind playing tricks on you.
To refresh your memory:
[chart: Nvidia flagship GPU launch prices by generation, with a column adjusted to 2017 dollars]

Prices have varied a lot from generation to generation.
Your chart simply proves my point: the average cost of the top Nvidia GPU over the last 18 generations, in 2017 dollars, was $635. The only tricks being played are by Nvidia, not my mind.
Nvidia's own chart highlights the problem: the $500-$700 range is now the $1,000+ range.
[chart: Nvidia GPU launch price ranges by generation]
 
Your chart simply proves my point: the average cost of the top Nvidia GPU over the last 18 generations, in 2017 dollars, was $635. The only tricks being played are by Nvidia, not my mind.
Try looking at the column on the right; that's corrected for inflation as of 2017.
 
Try looking at the column on the right; that's corrected for inflation as of 2017.
Try doing the math yourself and get back to me. I would really hate to have to post a picture of the Windows Calculator.
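
For reference, the adjustment itself is just a CPI ratio. Here is a minimal Python sketch of that math; the CPI figures and launch prices below are illustrative placeholders, not official BLS numbers or real launch prices:

```python
# Convert historical launch prices into 2017 dollars via a CPI ratio.
# CPI values and prices are illustrative placeholders, not BLS data.
CPI_2017 = 245.1  # placeholder reference CPI for 2017

def to_2017_dollars(price: float, cpi_at_launch: float) -> float:
    """Scale a nominal price by the ratio of the 2017 CPI to the launch-year CPI."""
    return price * (CPI_2017 / cpi_at_launch)

# Hypothetical flagship launches: (nominal launch price, CPI in launch year)
launches = [(299, 172.2), (499, 201.6), (649, 218.1), (699, 236.7)]

adjusted = [to_2017_dollars(price, cpi) for price, cpi in launches]
print([round(a) for a in adjusted])          # each price in 2017 dollars
print(round(sum(adjusted) / len(adjusted)))  # the average in 2017 dollars
```

Averaging the inflation-adjusted column rather than the nominal prices is what produces a figure like the $635 quoted above.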
 
Remember when the new gen of cards beat the old gen of cards in every category/price point yet were still about the same price?

Yeah, but that was also a time when technological innovations were 'easy'. Process nodes aren't being deployed every year, and the ones being worked on have been problematic. I have no doubt that manufacturing costs have increased since those days, and you can rest assured that everyone is passing those costs along to the consumer. Is it enough to explain the increase in RTX prices? Probably not. But it is one contributor.

You can also expect AMD to stay as close as they feel they can to NV prices, because they need money. The days of AMD giving out a free lunch are gone (hopefully). If they can't start banking some cash from GPU sales, then we are always going to be in this mess.
 
Try doing the math yourself and get back to me. I would really hate to have to post a picture of the Windows Calculator.
I was actually serious, but I'm sorry, I don't have time to educate you on inflation.
 