
NVIDIA to Launch Mid-range GP106 Based Graphics Cards in Autumn 2016

I said 390/390X for a reason: the 390 at 4K smokes the 970. Price-wise the 390X is, like you said, about the same as a fancy 970, but it's rare to see a 390X with an OEM-style cooler, so it's really no different at that point.

Smokes as in it gets f-cking hot. :) It ain't gonna smoke it frame-rate-wise at 4K. Let's be honest, the 970 is a good 1080p card and that's about it; talking 4K doesn't make a lot of sense with such cards.

trog
 
I said 390/390X for a reason: the 390 at 4K smokes the 970. Price-wise the 390X is, like you said, about the same as a fancy 970, but it's rare to see a 390X with an OEM-style cooler, so it's really no different at that point.
What the R9 390(X) does in 4K does not matter at all, since none of these cards are really powerful enough for 4K. The R9 390X is just a slightly overclocked R9 290X with more memory, memory it has no use for. It's kind of ironic that AMD fans claim 8 GB of GPU memory is a benefit for the R9 390X, yet Fiji can make do with 4 GB because memory capacity is suddenly not that important after all. In reality both the GTX 970 and R9 390X will run into GPU bottlenecks as they approach ~3 GB of memory usage; if you actually try to use more than 4 GB you'll have extremely low FPS anyway.
 
The GTX 970 can't touch the GTX 980 or R9 390X. The latter are way ahead. Also, dat 4 GB thing. Nope.
 
A GTX 970 can match a reference 980 when highly overclocked. Yes, you can overclock the 980 too, but that seemingly wasn't the point.
 
1080p is still the de facto standard, and that is where the GTX 970 really shines. Throw in its excellent performance per watt and it's no wonder it was, and still is, so popular.
 
1080p is still the de facto standard, and that is where the GTX 970 really shines. Throw in its excellent performance per watt and it's no wonder it was, and still is, so popular.
1080p is the resolution the majority of people use; the number of people using higher-resolution displays falls off sharply above 1080p. I got a 970 after that whole memory fiasco came to light, for half of what it was selling for in India. At the end of this year I plan to get a 1440p display (Dell U2515H), so I'm hoping the next generation brings some good 1440p GPUs to the table.
 
What the R9 390(X) does in 4K does not matter at all, since none of these cards are really powerful enough for 4K. The R9 390X is just a slightly overclocked R9 290X with more memory, memory it has no use for. It's kind of ironic that AMD fans claim 8 GB of GPU memory is a benefit for the R9 390X, yet Fiji can make do with 4 GB because memory capacity is suddenly not that important after all. In reality both the GTX 970 and R9 390X will run into GPU bottlenecks as they approach ~3 GB of memory usage; if you actually try to use more than 4 GB you'll have extremely low FPS anyway.


Not to mention that AMD video cards have no HDMI 2.0 support. So if someone wanted to drive a 4K UHD TV, even just for desktop use, they would likely have little choice but to go with an nVidia GTX 900 series card. Kludge solutions using DisplayPort-to-HDMI conversion to get 4K at 60 Hz were questionable at best until recently, and they become less and less relevant as this older generation of cards is on its way out.

It's easy to say it's a niche market, which it is. Still, if you have one or more 4K UHD TVs, it's not so easily dismissed that AMD kept recycling their parts for so long that the lack of HDMI 2.0 became just one of the omissions giving away their considerable vintage.

And you're right, since there really aren't any single cards that perform really well at 4K resolution, what does it matter!?!
 
You can play at 4K with medium settings using a 390 without issue. Actually, most games will average 40-50 FPS with high settings all around.
 
They'll have to pry my 970 from my cold dead hands. Though I'm curious to see if this will be a worthy upgrade from my 760 on my Ubuntu open air system.
 
You can play at 4K with medium settings using a 390 without issue. Actually, most games will average 40-50 FPS with high settings all around.
You didn't really think this through, did you? You'll need a stable 60 FPS for most games to be playable, which means you'll be running the most demanding games at low settings without AA; that's not what I call a gaming experience. Just face it: the R9 390/390X, GTX 970 and GTX 980 are pretty much 1080p cards.
 
You didn't really think this through, did you? You'll need a stable 60 FPS for most games to be playable, which means you'll be running the most demanding games at low settings without AA; that's not what I call a gaming experience. Just face it: the R9 390/390X, GTX 970 and GTX 980 are pretty much 1080p cards.

That is completely untrue. Most people can't see the difference between 30 and 60 FPS.

And if that is your argument, the Fury X and 980 Ti/Titan are all 1080p cards too, because they are all within 3-5 FPS of the 390/390X.
 
You didn't really think this through, did you? You'll need a stable 60 FPS for most games to be playable,
Funny, I'd ask the same question of you... In many titles, you don't need 60 FPS for a smooth gaming experience...

I'd also happily run the 290X/390X/980 at 1440p...

the fury x and 980 ti/titan are all 1080P cards because they are all within 3-5FPS of the 390/390x
They are????!!!!!!

The 980 Ti/Titan X is 20%+ faster than a 980, and the 980 is a few percent faster than a 390X...

https://www.techpowerup.com/reviews/AMD/R9_Fury_X/31.html
 
That is completely untrue. Most people can't see the difference between 30 and 60 FPS.

That must be why manufacturers are offering 100Hz+ panels now.
 
They are????!!!!!!

The 980 Ti/Titan X is 20%+ faster than a 980, and the 980 is a few percent faster than a 390X...

https://www.techpowerup.com/reviews/AMD/R9_Fury_X/31.html

I was looking at specific games, but you are correct, 20% makes a huge difference, I guess. There are still a lot of games that stay under 40 FPS with those cards, which was really all I was trying to get at.

[Chart: Battlefield 4, 3840×2160]


I would also prefer to use a little newer of a review that has newer drivers.

[Charts: Crysis 3 and Call of Duty: Black Ops III, 3840×2160]


[Charts: Battlefield 4 and GTA V, 3840×2160]


This is one of the most recent reviews w1z has done.

https://www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_XtremeGaming/1.html


That must be why manufacturers are offering 100Hz+ panels now.

Can you see the difference, and in what specific game?
 
It seems the GK106 and GM206 have always been stuck in the wake of AMD's Pitcairn and Tonga, and of inept competition. Let's hope that what appears, after what looks like a 3-4 month lag and with folks expecting a lavish "top-shelf" $250 price point, brings more than its prior relatives did. For such a price it had better provide prodigious 1440p performance, way better than what the 970 mustered.

Consider that Nvidia should be able to deliver the 950/960 at much better prices, given a die area 38% smaller than Tonga's, a 128-bit bus, and their sales volumes. I suppose they don't want to show their hand, as that would undercut justifying the next GP106 as a $220-250 part. For this next generation, great 1440p performance is a requisite for a $200 card!
 
Consider that Nvidia should be able to deliver the 950/960 at much better prices, given a die area 38% smaller than Tonga's, a 128-bit bus, and their sales volumes. I suppose they don't want to show their hand, as that would undercut justifying the next GP106 as a $220-250 part. For this next generation, great 1440p performance is a requisite for a $200 card!

I like the idea that ~$200 will yield good 1440p performance, so let's hope nVidia agrees.
 
You didn't really think this through, did you? You'll need a stable 60 FPS for most games to be playable, which means you'll be running the most demanding games at low settings without AA; that's not what I call a gaming experience. Just face it: the R9 390/390X, GTX 970 and GTX 980 are pretty much 1080p cards.

Actually, 60 FPS may be your preference, but most games are perfectly playable at 30 FPS and higher.

@bug manufacturers offer 100Hz(+) monitors because people are willing to buy them.
 
Can you see the difference and in what specific game?


I can see the difference, and it doesn't have to do with any specific game; it has to do with the monitor and how closely the FPS "meshes" with the refresh intervals in specific ratios. This is why some people accept 30 FPS, but at 38 FPS it's a bit jarring. The real problems with this aspect of graphics rendering are all the more evident with VR. The "refresh" of external light sources also plays a role, and that's why it has a greater impact in VR, since there is only one real source of light. It's also why, for some people, flashing lights can cause seizures. And truly understanding what's going on there, beyond knowing that there are effects, kind of requires some time spent in medical studies, or having the ear of someone who specializes in such topics.

So unless we've all become doctors, delving into this topic is a waste of time.

This might play a bit into the timing of why we get "entry-level" graphics before high-end ones. It's easier for these companies to have "stable" solutions on a smaller scale, and not all of it has to do with the silicon itself. Higher-end graphics have to do specific tasks at specific FPS, with monitors of a specific resolution and a specific refresh.
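The "meshing" described above can be sketched in a few lines. Under a simple model (my assumption, not anything stated in the thread): with vsync on a fixed 60 Hz display, a rendered frame can only go on screen at a refresh boundary, so each frame is shown for a whole number of refresh intervals. A frame rate that divides the refresh evenly (30 into 60) gives a perfectly steady cadence; one that doesn't (38 into 60) alternates frame durations, which is the judder people notice. The function name here is hypothetical:

```python
# Model (assumption): vsync on a fixed-refresh display, so a rendered
# frame can only be swapped onto the screen at a refresh boundary.

def display_ticks(fps, refresh_hz=60, frames=10):
    """How many refresh intervals each rendered frame stays on screen."""
    # Frame i is ready at i/fps seconds; it appears at the first refresh
    # tick at or after that instant: ceil(refresh_hz * i / fps).
    up = [-(-refresh_hz * i // fps) for i in range(frames + 1)]
    return [up[i + 1] - up[i] for i in range(frames)]

# 30 FPS on 60 Hz: every frame is shown for exactly 2 refreshes (33.3 ms)
print(display_ticks(30))   # [2, 2, 2, 2, 2, 2, 2, 2, 2, 2]

# 38 FPS on 60 Hz: frames alternate between 1 and 2 refreshes
# (16.7 ms vs 33.3 ms) -- uneven pacing that reads as judder
print(display_ticks(38))   # [2, 2, 1, 2, 1, 2, 2, 1, 2, 1]
```

This is also the mismatch that variable-refresh schemes like FreeSync/G-Sync sidestep, by letting the display wait for the frame instead of the other way around.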
 
Actually, 60 FPS may be your preference, but most games are perfectly playable at 30 FPS and higher.

@bug manufacturers offer 100Hz(+) monitors because people are willing to buy them.

This is simply not the case, or at least it isn't for the types of games that are sensitive to, or necessitate, fast movement, like first-person shooters for example. FWIW, I have heard that 40 FPS at 4K with something like FreeSync can feel very much like 60 FPS.
 
I can see the difference, and it doesn't have to do with any specific game; it has to do with the monitor and how closely the FPS "meshes" with the refresh intervals in specific ratios. This is why some people accept 30 FPS, but at 38 FPS it's a bit jarring. The real problems with this aspect of graphics rendering are all the more evident with VR. The "refresh" of external light sources also plays a role, and that's why it has a greater impact in VR, since there is only one real source of light. It's also why, for some people, flashing lights can cause seizures. And truly understanding what's going on there, beyond knowing that there are effects, kind of requires some time spent in medical studies, or having the ear of someone who specializes in such topics.

So unless we've all become doctors, delving into this topic is a waste of time.

This might play a bit into the timing of why we get "entry-level" graphics before high-end ones. It's easier for these companies to have "stable" solutions on a smaller scale, and not all of it has to do with the silicon itself. Higher-end graphics have to do specific tasks at specific FPS, with monitors of a specific resolution and a specific refresh.

Maybe that's why I don't notice it. In most games with an option to smooth the frame rate, that's what I choose; if not, old-school vsync at 85 Hz. I can't personally tell the difference; I have more issues with color reproduction on monitors than with refresh rate.
 
Maybe that's why I don't notice it. In most games with an option to smooth the frame rate, that's what I choose; if not, old-school vsync at 85 Hz. I can't personally tell the difference; I have more issues with color reproduction on monitors than with refresh rate.

Yeah, everyone's eyes are different, so there's no standard on this stuff; they just try to get it as good as they can, AFAIK.
 
Yeah, everyone's eyes are different, so there's no standard on this stuff; they just try to get it as good as they can, AFAIK.

And then impose marketing gimmicks to create the rest ;)
 