
Are These GeForce GTX 780 and GeForce GTX 770?

btarunr

Editor & Senior Moderator
NVIDIA's next-generation GPU family is not far away. We're hearing that new product launches from the GPU giant could be just weeks away, possibly within this month. It's only natural that some of these cards could pass through leaky pipes, much to our benefit. One such source in China posted pictures of what he claims are NVIDIA reference-design GeForce GTX 780 and GeForce GTX 770 cards. Both feature a design not unlike the $1000 GeForce GTX TITAN; in fact, they look identical. It's not the questionable embossing on the cooler shrouds that caught our attention, but the subtle differences near the PCI-Express interface (location of the PCB number, arrangement of termination resistors, etc.), and so we're rating this leak highly plausible.

We know from a previous report that the GeForce GTX 780 will be positioned a notch below the GeForce GTX TITAN in NVIDIA's product stack. It could be based on the same GK110 silicon, and could feature 2,496 CUDA cores and a 320-bit wide GDDR5 memory interface holding 5 GB of memory. It won't surprise us if NVIDIA completely recycles the GTX TITAN PCB, as it doesn't have a particularly over-the-top selection of components apart from the GPU. The GeForce GTX 770 is a different beast altogether. It is based on a GPU not unlike the GK104, with 1,536 CUDA cores, and a 256-bit wide GDDR5 memory interface, holding 4 GB of memory. To sweeten the prospect of upgrading to these new cards, NVIDIA is dropping in the same sexy magnesium alloy-based cooling solution it used on $1000 cards such as the GTX TITAN and GTX 690.
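As a rough sanity check, the rumored capacity and bandwidth figures fall out of simple arithmetic. The sketch below is purely illustrative; the 6 GHz effective GDDR5 data rate is our assumption, not part of the leak.

```python
# Rough sanity check of the rumored GTX 780 memory config (illustrative only;
# the 6 GHz effective GDDR5 data rate is an assumption, not part of the leak).

BUS_WIDTH_BITS = 320        # rumored memory interface width
CHIP_IO_WIDTH_BITS = 32     # one GDDR5 chip per 32-bit channel
CHIP_CAPACITY_GB = 0.5      # 4 Gb (512 MB) chips

chips = BUS_WIDTH_BITS // CHIP_IO_WIDTH_BITS
capacity_gb = chips * CHIP_CAPACITY_GB
bandwidth_gbps = BUS_WIDTH_BITS / 8 * 6.0   # bytes per transfer * 6 GT/s

print(f"{chips} chips -> {capacity_gb} GB of memory")   # 10 chips -> 5.0 GB
print(f"~{bandwidth_gbps:.0f} GB/s peak bandwidth")     # ~240 GB/s at 6 GHz effective
```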



View at TechPowerUp Main Site
 
It is based on a GPU not unlike the GK104, with 1,536 CUDA cores, and a 256-bit wide GDDR5 memory interface, holding 4 GB of memory.

so a 770 is pretty much a 680? they better have it at a cheaper price point then...
 
so a 770 is pretty much a 680? they better have it at a cheaper price point then...

Higher clocks, 4 GB standard issue, Titan-like cooler.
 
That escalated quickly! From "It's a FAKEEEEEE!" to "Highly plausible" :toast:

so a 770 is pretty much a 680? they better have it at a cheaper price point then...

:laugh: Good luck! This Nvidia we iz talkin' 'boot ;)
 
Higher clocks, 4 GB standard issue, Titan-like cooler.

Make money money make money money!!!

I love it that they are sticking with that cooler design.
 
Sunuva...
I now have minor buyer's remorse, having recently purchased a GTX 660 DirectCU II.:nutkick:

It's a good little card though, so the regret is very small.
 
Admittedly the Titan cooler probably isn't a bad 'stock' cooler, but give me an aftermarket cooler any day of the week. I will be waiting for an MSI Twin Frozr III, ASUS DirectCU II or Inno3D iChill if I go with nVidia, but tbh, with AMD winning all the consoles, I might choose an AMD card instead, as all future games will be tailored for AMD hardware.
 
Admittedly the Titan cooler probably isn't a bad 'stock' cooler, but give me an aftermarket cooler any day of the week. I will be waiting for an MSI Twin Frozr III, ASUS DirectCU II or Inno3D iChill if I go with nVidia, but tbh, with AMD winning all the consoles, I might choose an AMD card instead, as all future games will be tailored for AMD hardware.

The current Xbox uses an ATI architecture and that didn't pay dividends (in this very specific sense) at all - these things change too much over time.

And as we've seen with recent "Gaming Evolved" titles, games optimised for AMD architectures often run better on Nvidia anyway.
 
Sunuva...
I now have minor buyer's remorse, having recently purchased a GTX 660 DirectCU II.:nutkick:

It's a good little card though, so the regret is very small.

Should've gone with eVGA for their Step-Up program, that's what I plan on doing :D
 
so a 770 is pretty much a 680? they better have it at a cheaper pricing point then...

It is rumored it could be priced at $399, around where the current 670 is priced, but with the 680 chip, higher clocks, more memory as standard, and the sexy Titan cooler like bta mentioned! These coolers are enough to get me to buy one of them, honestly.
 
has there been any more evidence of 4 GB standard on the 770? the site that leaked those pictures suggested 2 GB, didn't they?
 
has there been any more evidence of 4 GB standard on the 770? the site that leaked those pictures suggested 2 GB, didn't they?

Seems there will be 2/4gb versions available, as with the 670 and 680.
 
Sucks. I wanted to wait for this but one of my GTX 470s crapped the bed, so I "had" to "upgrade" to a GTX 680...

Makes me want to try to sell my 680 and fly with one 470 until these release.
 
The current Xbox uses an ATI architecture and that didn't pay dividends (in this very specific sense) at all - these things change too much over time.

And as we've seen with recent "Gaming Evolved" titles, games optimised for AMD architectures often run better on Nvidia anyway.

I am not trying to knock nVidia's cards, as they have some great cards atm and more just over the horizon, but I do think that with both consoles using very similar architecture (AMD GCN), game developers will devote 100% of their time to getting everything out of the AMD hardware, instead of splitting their time between the overly complicated CELL architecture (paired with nVidia) and PowerPC (paired with ATi).

But I agree, if nVidia has better/faster cards at the time then it won't matter whether the game was developed with AMD's consoles in mind or not. Plus this will only be of consequence for a generation or two, as after that the PC graphics world will have left the current consoles in its dust.

The way I am looking at it is, if I am going to buy a card to last me the next 3 or 4 years, then it may as well be GCN-based, as I know that almost all games for the next 6 to 10 years will be designed with this architecture in mind.
 
Admittedly the Titan cooler probably isn't a bad 'stock' cooler, but give me an aftermarket cooler any day of the week. I will be waiting for an MSI Twin Frozr III, ASUS DirectCU II or Inno3D iChill if I go with nVidia, but tbh, with AMD winning all the consoles, I might choose an AMD card instead, as all future games will be tailored for AMD hardware.

Don't forget Gigabyte's Windforce 3x/2x gpu cooler designs. But the stock cooler does look beefy nonetheless.
 
And as we've seen with recent "Gaming Evolved" titles, games optimised for AMD architectures often run better on Nvidia anyway.

that's why I would consider the red camp for my next GPU upgrade.
anyway, those pics are surely showing us a sexy card. :)
 
let's just hope the pricing is right
 
you mean "nvidia right" right
 
The current Xbox uses an ATI architecture and that didn't pay dividends (in this very specific sense) at all - these things change too much over time.

And as we've seen with recent "Gaming Evolved" titles, games optimised for AMD architectures often run better on Nvidia anyway.

While the current Xbox does use an old R500 chip that took similar design approaches to the rest of the VLIW GPUs (software scheduling and instruction-level parallelism), it offered little to no advantage when it comes to the programming interface of the GPU (the Direct 3D software runtime), as most games written for the Xbox 360 don't use a software runtime and run directly on the hardware (native code).

The problem with frame-time latencies on current AMD GPUs is strictly software related, and is exclusively down to the way their new driver model interfaces with the Direct 3D runtime. The latency problem is actually visible mostly on Windows 8, as it uses WDDM 1.2, which is quite different from WDDM 1.1 in Windows 7. Moreover, Nvidia, a year or so ago, had similar latency problems with their Kepler line of graphics cards. Back then, only The Tech Report used frame-time tests in their reviews, so the problem didn't become as well known as AMD's problem is today, and they had all the time they needed to "fix" the latency problems in their drivers.

Current AMD graphics products are overall better, both value- and performance-wise, than their Nvidia counterparts. The 7970 GHz Edition, which is a bit cheaper than the 680, scores better than the GTX 680 with the latest beta driver in both average frame-rate and frame-time latency tests, but CrossFire/the 7990 does exhibit a lot of frame-rendering consistency problems. But hey, who gives a damn, it's a $1000 graphics product, and all multi-GPU products are bound to perform worse when it comes to smooth animation frame rendering.
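For anyone unfamiliar with the frame-time testing mentioned above: the point is that average FPS can hide stutter, so Tech Report-style reviews look at the distribution of individual frame times (e.g. the 99th percentile) instead. Here's a minimal Python sketch with made-up numbers, just to illustrate the metric:

```python
# Minimal sketch of frame-time analysis as referenced above (Tech Report-style
# percentile metrics). The two sample runs below are made up for illustration.

def frame_time_metrics(frame_times_ms):
    """Return (average FPS, 99th-percentile frame time in ms)."""
    times = sorted(frame_times_ms)
    avg_fps = 1000.0 * len(times) / sum(times)
    p99 = times[min(len(times) - 1, round(0.99 * (len(times) - 1)))]
    return avg_fps, p99

smooth = [16.7] * 100                  # every frame ~16.7 ms -> consistent pacing
spiky  = [12.0] * 90 + [60.0] * 10     # similar average, but visible stutter

for name, run in (("smooth", smooth), ("spiky", spiky)):
    fps, p99 = frame_time_metrics(run)
    print(f"{name}: {fps:.1f} avg FPS, {p99:.1f} ms 99th-percentile frame time")
```

Both runs land around 60 average FPS, but the 99th-percentile frame time tells you which one actually feels smooth, which is exactly the consistency problem being discussed with CrossFire/the 7990.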
 
let's just hope the pricing is right
GTX 770 will MSRP at $500, same as the 680 did. And while the cooler might appear Titan-like, something tells me at least the GTX 770 won't be of the same construction; it could be the same GTX 680 with a faux-painted plastic shroud. No vapor chamber, a lower-grade fan, a lot of slight differences that make it cost-effective but cool-looking enough to make folk go wow!
 
I've been curious about the 770 with respect to the (4 GB) RAM. Conjecture follows, but I think it's a possibility to consider:

Hynix, the less-expensive-than-but-pretty-close-in-quality-to-Samsung GDDR5 producer, currently has 4 Gb (512 MB) 1.5 V 7 GHz GDDR5 chips available. 2 Gb chips (256 MB, like what is used on most consumer cards) will not be available until at least Q3, according to their marketing materials.

Yes, that is kind of odd, but think about that for a minute.

It may explain the 4 GB push. I think it's possible the 770 may launch with 4 GB because it has to, unless they plan to use Samsung exclusively or opt for 1.6 V. That said, it leaves room for a cheaper product later to compete with a Q4 AMD product also using 2 Gb RAM of the same type, as prices on said RAM should fall because of competition/availability.

I've long thought the availability of newer GDDR5 (perhaps including the 1.35 V 5 GHz spec, to keep lower-end products in check) has played a role in the release of the new (/refresh) generation. That said, the 680, all things considered, is on the cusp of 4 GB being feasible (to keep with current standards of bandwidth/compute, a 680 could scale to almost 1300 MHz with 7 GHz RAM). Certainly the speed will be useful compared to where current 6 GHz chips scale, even if barely, as the 680 is clocked as high as possible at stock for current 6 GHz RAM. To make a better product, the higher spec has to be used; plus, 4 GB makes sense when you consider SLI 770s being a very realistic option compared to the pricing/size of GK110 options.
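If it helps, the chip-density argument above boils down to a couple of lines of arithmetic. A quick illustrative check (the densities and data rates are the ones discussed in this thread, not confirmed specs):

```python
# Quick check of the chip-density reasoning above (illustrative only; the
# densities and data rates are the ones discussed in this thread, not confirmed).

CHIP_IO_WIDTH_BITS = 32  # each GDDR5 chip sits on its own 32-bit channel

def capacity_gb(bus_width_bits, chip_density_gbit):
    chips = bus_width_bits // CHIP_IO_WIDTH_BITS
    return chips, chips * chip_density_gbit / 8   # Gbit per chip -> GB total

print(capacity_gb(256, 2))   # (8, 2.0): eight 2 Gb chips = 2 GB
print(capacity_gb(256, 4))   # (8, 4.0): eight 4 Gb chips = 4 GB, hence the 4 GB push

# Bandwidth headroom from moving 6 GHz -> 7 GHz effective GDDR5 on the same 256-bit bus:
print(256 / 8 * 6, "GB/s ->", 256 / 8 * 7, "GB/s")   # 192.0 GB/s -> 224.0 GB/s
```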
 
I've been curious about the 770 with respect to the (4 GB) RAM. Conjecture follows, but I think it's a possibility to consider:

Hynix, the less-expensive-than-but-pretty-close-in-quality-to-Samsung GDDR5 producer, currently has 4 Gb (512 MB) 1.5 V 7 GHz GDDR5 chips available. 2 Gb chips (256 MB, like what is used on most consumer cards) will not be available until at least Q3, according to their marketing materials.

Yes, that is kind of odd, but think about that for a minute.

It may explain the 4 GB push. I think it's possible the 770 may launch with 4 GB because it has to, unless they plan to use Samsung exclusively or opt for 1.6 V. That said, it leaves room for a cheaper product later to compete with a Q4 AMD product also using 2 Gb RAM of the same type, as prices on said RAM should fall because of competition/availability.

I've long thought the availability of newer GDDR5 (perhaps including the 1.35 V 5 GHz spec, to keep lower-end products in check) has played a role in the release of the new (/refresh) generation. That said, the 680, all things considered, is on the cusp of 4 GB being feasible (to keep with current standards of bandwidth/compute, a 680 could scale to almost 1300 MHz with 7 GHz RAM). Certainly the speed will be useful compared to where current 6 GHz chips scale, even if barely, as the 680 is clocked as high as possible at stock for current 6 GHz RAM. To make a better product, the higher spec has to be used; plus, 4 GB makes sense when you consider SLI 770s being a very realistic option compared to the pricing/size of GK110 options.

If the GTX 780 is equipped with Samsung GDDR5, adopters will be in for a treat.

Those chips are awesome, my cards do +700/800 on the memory, that's insane.
 