
GeForce GTX TITAN Black Pictured, Isn't Strictly Black

btarunr

Editor & Senior Moderator
Staff member
Here's the first alleged picture of NVIDIA's GeForce GTX TITAN Black, a high-end SKU NVIDIA is working on to restore the competitiveness of the $999 price-point it commands. Although referred to as "TITAN Black," the card is nowhere close to looking like the CGI renders that surfaced last November. The board looks identical to the original GTX TITAN, except that the "TITAN" etching on the cooler shroud is painted black. The GTX TITAN Black maxes out the 28 nm GK110 silicon, featuring 2,880 CUDA cores, 240 TMUs, 48 ROPs, and a 384-bit wide GDDR5 memory interface holding 6 GB of memory. It also offers the GK110 silicon's full double-precision (DPFP) performance, which, within the GeForce range, is exclusive to the GTX TITAN. NVIDIA is expected to give the GTX TITAN Black a low-key launch sometime next week.



View at TechPowerUp Main Site
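To put the specs in the post above into rough numbers, here's a quick back-of-the-envelope sketch. Only the CUDA core count and the 384-bit memory bus come from the post; the core clock and memory data rate below are assumed placeholder values, so treat the output as illustrative rather than official figures.

```python
# Back-of-the-envelope throughput for a fully enabled GK110 (GTX TITAN Black).
# Core count and bus width are from the news post; the clocks below are
# assumed placeholder values for illustration only.

cuda_cores = 2880          # from the post
core_clock_ghz = 0.9       # ASSUMED core clock in GHz
mem_bus_bits = 384         # from the post
mem_clock_gbps = 7.0       # ASSUMED effective GDDR5 data rate per pin (Gb/s)

# Single-precision peak: each CUDA core can issue one fused multiply-add
# (2 floating-point ops) per clock.
sp_gflops = cuda_cores * 2 * core_clock_ghz

# Memory bandwidth: bus width (bits) * data rate (Gb/s) / 8 bits per byte.
mem_bw_gbs = mem_bus_bits * mem_clock_gbps / 8

print(f"Peak SP compute: {sp_gflops:.0f} GFLOPS")   # ~5184 GFLOPS at the assumed clock
print(f"Memory bandwidth: {mem_bw_gbs:.0f} GB/s")   # 336 GB/s at the assumed data rate
```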
 
Lame, but I'd be more disappointed if it weren't for the fact that I'd be ripping the stock cooler off anyways.
 
waiting for some bad azz benches!
 
Three words: "NOT BUDGET FRIENDLY"
 
For an industry that used to churn out a 'proper' new top-end card every 6-9 months, with each new card usually doubling the performance of the last, it's amazing to see how much things have slowed down.

The 'Titan' chip was released over 14 months ago in the form of the Tesla K20, and yet we're still talking about another 6-12 months before we see anything new at the high end!

I miss having Matrox, 3DFX, PowerVR, S3, 3DLabs and the others around!
 
I miss having Matrox, 3DFX, PowerVR, S3, 3DLabs and the others around!
I hear you.
I also want to see a video card like the Voodoo 5 again, with 4 GPUs on it and an external power brick. And for $599 =)))
 
It's not so much the lack of competition from the other brands, but physics that's causing this lack of performance.

We're constantly up against power and thermal limits now, which has massively reduced the extra performance one can get out of a commercial GPU. Same goes for CPUs.

It's no longer possible just to produce a chip at a certain performance level and deal with the power and heat issues, because there's just too much. The massive number of transistors required (7 billion or so) is also an issue, since it seriously impacts yields.
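To put the yield concern in rough numbers, here's a minimal sketch using the classic Poisson defect model, assuming an illustrative defect density and a GK110-class die area; neither figure comes from this thread.

```python
import math

# Rough yield estimate using the simple Poisson model: yield = exp(-D * A).
# Both numbers below are illustrative assumptions, not published figures.
defect_density_per_cm2 = 0.25   # ASSUMED defects per cm^2 for a mature 28 nm process
die_area_cm2 = 5.5              # ASSUMED GK110-class die area (~550 mm^2)

yield_big_die = math.exp(-defect_density_per_cm2 * die_area_cm2)

# Compare with a mid-range die one third of the size.
yield_small_die = math.exp(-defect_density_per_cm2 * die_area_cm2 / 3)

print(f"Big die yield estimate:   {yield_big_die:.0%}")   # ~25%
print(f"Small die yield estimate: {yield_small_die:.0%}") # ~63%
```

The exact numbers don't matter much; the point is that yield falls off exponentially with die area, which is why a near-maximum-size chip like GK110 is so expensive to make.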
 
Thick! A 2-slot card just for showing off. This fat card has outdated outputs in view of the upcoming 4K, and the looks don't do a Black edition justice either (bad cooler). A GTX 780 Ti with 6 GB of RAM is all I see, not a Black Edition, not to mention the price, which once again exceeds all sense! Previous generations of Black Edition cards looked beautiful and had much better coolers; here all we get is an extra 3 GB of RAM on the back of the card, and no backplate.
 
Thick! A 2-slot card just for showing off. This fat card
Dual slot qualifies as a "fat card"? If that's the case, then every air-cooled performance and enthusiast board in existence qualifies as a "fat card"...along with more than a fair percentage of mainstream cards.
Previous generations of Black Edition cards looked beautiful and had much better coolers
What the f___ are you smoking? "Previous Black Edition cards" - it's a badge that XFX (kings of shitty support) sticks on reference cards. Here's one of my old 5850s...I can tell you from first-hand experience that the card isn't overly "beautiful", and the cooler is so great that two (of the three I had) didn't handle the factory overclock at all well.
[Image: XFX HD 5850 Black Edition]
 
Dude, relax. He's using Google Translate to post, cut him some slack =)))))
Besides, you're comparing an ATI design with an nVidia design....
And yes, AMD cards' default cooler has been pure garbage since...forever, including current cards.
 
It's not so much the lack of competition from the other brands, but physics that's causing this lack of performance.

We're constantly up against power and thermal limits now, which has massively reduced the extra performance one can get out of a commercial GPU. Same goes for CPUs.

It's no longer possible just to produce a chip at a certain performance level and deal with the power and heat issues, because there's just too much. The massive number of transistors required (7 billion or so) is also an issue, since it seriously impacts yields.

Exactly. This is also true for the CPU industry. I'm curious what will happen once they go below 10 nm and push the transistor count even higher. What will the percentage of good yields be? And the performance gains? I'm starting to question Moore's law....
 
It's not so much the lack of competition from the other brands, but physics that's causing this lack of performance.

We're constantly up against power and thermal limits now, which has massively reduced the extra performance one can get out of a commercial GPU. Same goes for CPUs.

It's no longer possible just to produce a chip at a certain performance level and deal with the power and heat issues, because there's just too much. The massive number of transistors required (7 billion or so) is also an issue, since it seriously impacts yields.


You are thinking too linearly. Yes, using current hardware we have packed as many transistors into a set space as we're going to fit, but they have to innovate; they have to come up with a new architecture that changes the game.
They used to bring out chips each generation that really were new, and performance increases came from that; now all they do is either overclock it, make the same architecture bigger, or rebrand and price-cut, then just wait for a die shrink.
 
Exactly. This is also true for the CPU industry. I'm curious what will happen once they go below 10 nm and push the transistor count even higher. What will the percentage of good yields be? And the performance gains? I'm starting to question Moore's law....

3D chip stacking, water channels inside the chip, new materials, physically bigger chips that just spread the heat densities out better...
 
" to restore the competitiveness of the $999 price-point it commands"

In other words, Nvidia is giving you another chance to bend over the table in case you missed the Titan.
 
8800 Ultra Part Deux
 
It's not so much the lack of competition from the other brands, but physics that's causing this lack of performance.

We're constantly up against power and thermal limits now, which has massively reduced the extra performance one can get out of a commercial GPU. Same goes for CPUs.

It's no longer possible just to produce a chip at a certain performance level and deal with the power and heat issues, because there's just too much. The massive number of transistors required (7 billion or so) is also an issue, since it seriously impacts yields.
We are in serious need of more exotic cooling to keep this performance race going, that's for sure; we're hitting the maximum power output per square mm of die area that most coolers can handle before the silicon starts to show heat stress.

IBM was playing with the idea of using embedded tubes to let coolant run through the actual chip itself, and there are some interesting papers out on that. Or perhaps sealing the die in a liquid-filled chamber.

https://ssd-rd.web.cern.ch/ssd-rd/seminar/transparencies/03-01-20-ssd-HGardeniers-pt1.pdf
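Rough numbers on the "power per square mm of die" point: a quick sketch assuming a ~250 W board power and a ~550 mm² die; both values are illustrative assumptions, not figures from this thread.

```python
# Rough power-density arithmetic for the "W per mm^2 of die" argument.
# Both inputs are illustrative assumptions, not figures from this thread.
board_power_w = 250.0      # ASSUMED typical board power for a GK110-class card
die_area_mm2 = 550.0       # ASSUMED GK110-class die area

# Not all board power is dissipated in the GPU die (memory, VRM losses, fan),
# so assume roughly 80% of it ends up in the silicon.
die_power_w = board_power_w * 0.8

power_density = die_power_w / die_area_mm2
print(f"Approximate die power density: {power_density:.2f} W/mm^2")  # ~0.36 W/mm^2

# The same power in a die half the size doubles the density the cooler must handle.
print(f"Half-size die at same power:   {die_power_w / (die_area_mm2 / 2):.2f} W/mm^2")
```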
 
" to restore the competitiveness of the $999 price-point it commands"

In other words, Nvidia is giving you another chance to bend over the table in case you missed the Titan.

LOL!! I love this!
 
" to restore the competitiveness of the $999 price-point it commands"

In other words, Nvidia is giving you another chance to bend over the table in case you missed the Titan.

Has anyone noticed me bending over the table yet with my pants down? But seriously, this is not a gamer's card. As much as I'll enjoy the reviews, having bought the Titan initially, Nvidia are not going to spank me again.

Maybe... Kidding!
 
You are thinking too linearly. Yes, using current hardware we have packed as many transistors into a set space as we're going to fit, but they have to innovate; they have to come up with a new architecture that changes the game.
They used to bring out chips each generation that really were new, and performance increases came from that; now all they do is either overclock it, make the same architecture bigger, or rebrand and price-cut, then just wait for a die shrink.

Maxwell is in the cards for the near future; in fact, the 750 Ti is going to be using that architecture. There's your new architecture. Sure, full Maxwell won't be out until late this year or early next year, but it's not as if nVidia is sitting around doing nothing.

Plus, nVidia is just getting its lineup complete again. It doesn't hurt you that they're making this $1000 card, as you're not going to buy it anyway. In reality, the Titan cards were more for professionals who didn't want to pay $3000 for a video card but also game on the same PC. For those who bought it for that purpose, it has done well for them. For the others... well, they probably mostly wasted $1000.
 
Maxwell is in the cards for the near future; in fact, the 750 Ti is going to be using that architecture. There's your new architecture. Sure, full Maxwell won't be out until late this year or early next year, but it's not as if nVidia is sitting around doing nothing.

Plus, nVidia is just getting its lineup complete again. It doesn't hurt you that they're making this $1000 card, as you're not going to buy it anyway. In reality, the Titan cards were more for professionals who didn't want to pay $3000 for a video card but also game on the same PC. For those who bought it for that purpose, it has done well for them. For the others... well, they probably mostly wasted $1000.

From what I have seen, Maxwell is making my point for me: it's a tiny tweak to a stale 3-4 year old architecture. NVidia is totally reliant on 20 nm to bring anything faster to the table.

The whole industry has lost the art of thinking outside the box and bringing something revolutionary to the table, and it's purely down to competition: there is none!

AMD and nVidia normally only seem to fight on price, and recently on power usage.

Intel has no competition in the CPU market anymore, and you can see that from how tiny the performance jumps are each generation. If you remember back when AMD had the lead for a year or two, Intel suddenly managed to miraculously find a 40% performance increase in one generation; once they regained the lead, it went back to 3% here, 5% there.

ARM is competitive, with many big players, and the performance jumps are massive each year. Yes, it's a young architecture, but companies still need to invest to attain these performance leaps, and that investment is driven by competition.
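To put the "3% here, 5% there" versus 40% comparison above into perspective, here's a quick compounding sketch; the percentages are simply the figures quoted above, used illustratively.

```python
# How many small generational gains does it take to match one big jump?
# The percentages are just the figures quoted above, used illustratively.
small_gain = 1.05   # 5% per generation
big_jump = 1.40     # the one-off 40% jump mentioned above

gens = 0
total = 1.0
while total < big_jump:
    total *= small_gain
    gens += 1

# Prints: 7 generations of 5% gains ≈ 41% total improvement
print(f"{gens} generations of 5% gains ≈ {total - 1:.0%} total improvement")
```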
 
NVIDIA will most likely make only one exclusive TITAN Black, just for the Jen-Hsun presentation! Then it's just a matter of waiting for the full Maxwell lineup.
 