Tuesday, April 19th 2016

NVIDIA to Launch Mid-range GP106 Based Graphics Cards in Autumn 2016

NVIDIA is expected to launch the first consumer graphics cards based on the GP106 silicon some time in Autumn 2016 (late Q3-early Q4). Based on the company's next-generation "Pascal" architecture, the GP106 will drive several key mid-range and performance-segment (price/performance sweet-spot) SKUs, including the cards that succeed the current GeForce GTX 960 and GTX 950. Going by the way NVIDIA's big GP100 silicon is structured, and assuming the GP106 features two graphics processing clusters (GPCs), as the current GM206 silicon does, one can expect a CUDA core count in the neighborhood of 1,280. NVIDIA could use this chip to capture several key sub-$250 price points.
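The arithmetic behind that estimate can be sketched as a back-of-the-envelope calculation. It assumes GP106 inherits GP100's published ratios (10 streaming multiprocessors per GPC, 64 CUDA cores per SM) and the GM206-like two-GPC layout; NVIDIA has not confirmed GP106's configuration, so these are extrapolations:

```python
# Back-of-the-envelope CUDA core estimate for GP106, extrapolated from the
# published GP100 configuration. The GPC count for GP106 is an assumption.
SM_PER_GPC = 10      # GP100 packs 10 streaming multiprocessors per GPC
CORES_PER_SM = 64    # Pascal's GP100 uses 64 FP32 CUDA cores per SM
gpcs = 2             # assumed GPC count for GP106, mirroring GM206

cuda_cores = gpcs * SM_PER_GPC * CORES_PER_SM
print(cuda_cores)  # 1280, matching the figure in the article
```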
Source: SweClockers
Add your own comment

55 Comments on NVIDIA to Launch Mid-range GP106 Based Graphics Cards in Autumn 2016

#26
efikkan
cdawallI said 390/390X for a reason: the 390 at 4K smokes the 970. The 390X price-wise is, like you said, about the same as a fancy 970, but it is rare to see a 390X with an OEM-style cooler, so it's really no different at that point.
What the R9 390(X) does in 4K does not matter at all, since none of the cards are really powerful enough for 4K. The R9 390X is just a slightly overclocked R9 290X with more memory, memory which it has no use for. It's kind of ironic that AMD fans claim 8 GB of GPU memory is a benefit for the R9 390X, yet Fiji can do with 4 GB because memory capacity is suddenly not that important after all. In reality, both the GTX 970 and R9 390X will run into GPU bottlenecks as they approach ~3 GB of memory usage; if you actually try to use more than 4 GB you'll have extremely low FPS anyway.
Posted on Reply
#27
RejZoR
The GTX 970 can't touch the GTX 980 or R9 390X. The latter are way ahead. Also, dat 4GB thing. Nope.
Posted on Reply
#28
EarthDog
GTX 970 can match a reference 980 when highly overclocked. Yes, you can overclock the 980 too, but that seemingly wasn't the point.
Posted on Reply
#29
Fluffmeister
1080p is still the de facto standard, and that is where the GTX 970 really shines; throw in its excellent performance per watt and it's no wonder it was and still is so popular.
Posted on Reply
#30
Chaitanya
Fluffmeister1080p is still the de facto standard, and that is where the GTX 970 really shines; throw in its excellent performance per watt and it's no wonder it was and still is so popular.
1080p is the resolution the majority of people use; the number of people using higher-resolution displays falls off sharply beyond 1080p. I got a 970 after that whole memory fiasco came to light, for half of what it was selling for in India. At the end of this year I do plan to get a 1440p display (Dell U2515H), so I am hoping that the next gen will bring some good 1440p GPUs to the table.
Posted on Reply
#32
Octavean
efikkanWhat R9 390(X) does in 4K does not matter at all since none of the cards are really powerful enough for 4K. R9 390X is just a slightly overclocked R9 290X with more memory, memory which it has no use for. It's kind of ironic that AMD fans claim 8 GB GPU memory is a benefit for R9 390X yet Fiji can do with 4 GB because memory capacity is suddenly not that important after all. In reality both GTX 970 and R9 390X will run in to GPU bottlenecks when they approach ~3 GB of memory usage, if you actually try to use more than 4 GB you'll have an extremely low FPS anyway.
Not to mention that AMD video cards have no HDMI 2.0 support. Therefore, if someone wanted to drive a 4K UHD TV, even just for desktop use, they would likely have little choice but to go with an nVidia GTX 900 series video card. Any kind of kludge solution using DisplayPort-to-HDMI conversion to yield 4K at 60 Hz was questionable at best until recently, and it becomes less and less relevant as this older generation of cards goes on its way out.

It's easy to say it's a niche market, which it is. Still, if you have one or more 4K UHD TVs, then it's not so easily dismissed that AMD kept recycling their parts so much that the lack of HDMI 2.0 became just one of the omissions that gave away their considerable vintage.

And you're right, since there really aren't any single cards that can perform really well at 4K resolution, what does it matter!?!
Posted on Reply
#33
cdawall
where the hell are my stars
You can play 4k with medium settings using a 390 without issue. Actually most games will average 40-50fps with high settings all around.
Posted on Reply
#34
InhaleOblivion
They'll have to pry my 970 from my cold dead hands. Though I'm curious to see if this will be a worthy upgrade from my 760 on my Ubuntu open air system.
Posted on Reply
#35
efikkan
cdawallYou can play 4k with medium settings using a 390 without issue. Actually most games will average 40-50fps with high settings all around.
You didn't really think this through now, did you? You'll need a stable 60 FPS for most games to be playable, which means you'll be running the most demanding games at low settings without AA; that's not what I call a gaming experience. Just face it: the R9 390/390X, GTX 970 and GTX 980 are pretty much 1080p cards.
Posted on Reply
#36
cdawall
where the hell are my stars
efikkanYou didn't really think this through now, did you? You'll need a stable 60 FPS for most games to be playable, which means you'll be running the most demanding games at low settings without AA; that's not what I call a gaming experience. Just face it: the R9 390/390X, GTX 970 and GTX 980 are pretty much 1080p cards.
That is completely untrue. Most people can't see the difference between 30 and 60 FPS.

And if that is your argument, the Fury X and 980 Ti/Titan are all 1080p cards too, because they are all within 3-5 FPS of the 390/390X.
Posted on Reply
#37
EarthDog
efikkanYou didn't really think this through now, did you? You'll need a stable 60 FPS for most games to be playable,
Funny, I'd ask the same question of you... In many titles, you don't need 60 FPS for a smooth gaming experience...

I'd also happily run the 290X/390X/980 at 1440p...
cdawallthe Fury X and 980 Ti/Titan are all 1080p cards because they are all within 3-5 FPS of the 390/390X
They are????!!!!!!

980 Ti/Titan X is like 20%+ faster than a 980 and the 980 is a few % faster than a 390x...

www.techpowerup.com/reviews/AMD/R9_Fury_X/31.html
Posted on Reply
#38
bug
cdawallThat is untrue completely. Most people can't see the difference between 30 and 60FPS.
That must be why manufacturers are offering 100Hz+ panels now.
Posted on Reply
#39
cdawall
where the hell are my stars
EarthDogThey are????!!!!!!

980 Ti/Titan X is like 20%+ faster than a 980 and the 980 is a few % faster than a 390x...

www.techpowerup.com/reviews/AMD/R9_Fury_X/31.html
I was looking at specific games, but you are correct, 20% makes a huge difference I guess. All I was really trying to get at is that there are still a lot of games that run under 40 FPS with those cards.

I would also prefer to use a slightly newer review with newer drivers.

This is one of the most recent reviews w1z has done.

www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_XtremeGaming/1.html
bugThat must be why manufacturers are offering 100Hz+ panels now.
Can you see the difference and in what specific game?
Posted on Reply
#40
Casecutter
It seems the GK106 and GM206 have always been stuck in the wake of AMD's Pitcairn and Tonga, and inept competition. Let's hope that what appears, after what's looking like a 3-4 month trailing launch at a lavish "top shelf" $250 price point, is going to bring more than its prior relatives did. For such a price it had better provide prodigious 1440p performance, way better than the 970 mustered.

Reflect on this: Nvidia should be able to deliver the 950/960 at much better prices, given their die area is 38% smaller than Tonga's, the 128-bit bus, and their sales volumes. I suppose they don't want to show their hand, as that would undercut this next GP106 being justified as a $220-250 part. For this next generation, great 1440p performance is a requisite from a $200 card!
Posted on Reply
#41
Octavean
CasecutterReflect on this: Nvidia should be able to deliver the 950/960 at much better prices, given their die area is 38% smaller than Tonga's, the 128-bit bus, and their sales volumes. I suppose they don't want to show their hand, as that would undercut this next GP106 being justified as a $220-250 part. For this next generation, great 1440p performance is a requisite from a $200 card!
I like the idea that ~$200 will yield good 1440p performance, so let's hope nVidia agrees.
Posted on Reply
#42
rtwjunkie
PC Gaming Enthusiast
efikkanYou didn't really thing this through now, did you? You'll need a stable 60 FPS for most games to be playable, which means you'll be running the most demanding games at low setting without AA, that's not what I call a gaming experience. Just fase it, R9 390/390X, GTX 970 and GTX 980 are pretty much 1080p cards.
Actually, 60fps may be your preference, but most games are perfectly playable at 30fps and higher.

@bug manufacturers offer 100Hz(+) monitors because people are willing to buy them.
Posted on Reply
#43
rruff
OctaveanI like the idea that ~$200 will yield good 1440p performance so let's hope nVidia agrees
The only way Nvidia will "agree" is if AMD does...
Posted on Reply
#44
cadaveca
My name is Dave
cdawallCan you see the difference and in what specific game?
I can see the difference, and it doesn't have to do with any specific game; it has to do with the monitor and how closely FPS "meshes" with refresh intervals in specific ratios. This is why some people accept 30 FPS, but at 38 FPS it's a bit jarring. The true problems with this aspect of graphics rendering are all the more evident with VR. The "refresh" of external light sources also plays a role, and that's why it has a greater impact with VR, since there is only one real source of light. It's also why, for some people, flashing lights can cause seizures. And to truly understand what's going on there, other than knowing that there are effects, kind of requires some time spent in medical studies, or having the ear of someone who specializes in such topics.

So unless we've all become doctors, delving into this topic is a waste of time.

This might play a bit into the timing of why we get "entry-level" graphics before high-end ones. It's easier for these companies to have "stable" solutions on a smaller scale, and not all of it has to do with silicon. Higher-end graphics have to do specific tasks at specific FPS, with monitors of a specific resolution and a specific refresh.
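The "meshing" of FPS with refresh intervals can be illustrated with a quick calculation. This is a deliberately simplified toy model (vsync on a fixed 60 Hz panel, no variable refresh, no driver-side frame pacing), not any specific hardware's behavior:

```python
# Simplified vsync model on a fixed 60 Hz panel: each rendered frame is held
# on screen until the next refresh tick, so frame pacing is only even when
# the frame rate divides the refresh rate cleanly.
import math

REFRESH_HZ = 60

def display_times_ms(fps, frames=12):
    """How long each of the first `frames` frames stays on screen (ms)."""
    refresh_ms = 1000 / REFRESH_HZ
    # the refresh tick on which each frame first appears
    ticks = [math.ceil(i * REFRESH_HZ / fps) for i in range(frames + 1)]
    return [(ticks[i + 1] - ticks[i]) * refresh_ms for i in range(frames)]

# 30 FPS on 60 Hz: every frame is held exactly two refreshes -> even 33.3 ms pacing
print(display_times_ms(30))
# 38 FPS on 60 Hz: frames alternate irregularly between one and two refreshes
# (16.7 ms vs 33.3 ms) -> the jarring effect described above
print(display_times_ms(38))
```

The 2:1 ratio at 30 FPS is why it can look acceptable, while 38 FPS, despite being nominally faster, produces uneven frame hold times.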
Posted on Reply
#45
Octavean
rruffThe only way Nvidia will "agree" is if AMD does...
That's a two-way street, isn't it?
Posted on Reply
#46
Octavean
rtwjunkieActually, 60fps may be your preference, but most games are perfectly playable at 30fps and higher.

@bug manufacturers offer 100Hz(+) monitors because people are willing to buy them.
This is simply not the case, or at least it isn't for the types of games that are sensitive to, or necessitate, fast movement, like first-person shooters for example. FWIW, I have heard that 40 FPS at 4K with something like FreeSync can feel very much like 60 FPS.
Posted on Reply
#47
cdawall
where the hell are my stars
cadavecaI can see the difference, and it doesn't have to do with any specific game; it has to do with the monitor and how closely FPS "meshes" with refresh intervals in specific ratios. This is why some people accept 30 FPS, but at 38 FPS it's a bit jarring. The true problems with this aspect of graphics rendering are all the more evident with VR. The "refresh" of external light sources also plays a role, and that's why it has a greater impact with VR, since there is only one real source of light. It's also why, for some people, flashing lights can cause seizures. And to truly understand what's going on there, other than knowing that there are effects, kind of requires some time spent in medical studies, or having the ear of someone who specializes in such topics.

So unless we've all become doctors, delving into this topic is a waste of time.

This might play a bit in the timing of why we get "entry-level" graphics before high-end ones. Its easier for these companies to have "stable" solutions on a smaller scale, and not all of it has to do with silicon stuff. Higher-end graphics have to do specific tasks at specific FPS, with monitors of a specific resolution, and a specific refresh.
Maybe that's why I don't notice it. In most games with an option to smooth the frame rate, that's what I choose; if not, old-school vsync at 85 Hz. I can't personally tell the difference; I have more issues with color reproduction on monitors than with refresh rate.
Posted on Reply
#48
cadaveca
My name is Dave
cdawallMaybe that's why I don't notice it. In most games with an option to smooth the frame rate, that's what I choose; if not, old-school vsync at 85 Hz. I can't personally tell the difference; I have more issues with color reproduction on monitors than with refresh rate.
Yeah, everyone's eyes are different, so there's no standard on this stuff; they just try to get it the best they can, AFAIK.
Posted on Reply
#49
cdawall
where the hell are my stars
cadavecaYeah, everyone's eyes are different, so there's no standard on this stuff; they just try to get it the best they can, AFAIK.
And then impose marketing gimmicks to create the rest ;)
Posted on Reply
#50
bug
rtwjunkieActually, 60fps may be your preference, but most games are perfectly playable at 30fps and higher.

@bug manufacturers offer 100Hz(+) monitors because people are willing to buy them.
It makes a difference in fast-paced games (first/third person, racing). It probably goes unnoticed when playing a TBS or maybe a MOBA.
I, for one, am not an FPS junkie. But when trying to tune The Witcher 3, it felt stuttery with enough detail enabled. When I looked at the FPS count, it was usually in the mid 40s.

Another aspect is that if the panel can do more than 60Hz, you get the luxury of enabling ULMB, for example.
Posted on Reply