
NVIDIA Announces the GeForce GTX TITAN-Z

Is the GTX Titan-Z made with Super Alloy Z?

Might be the same material they made these Titan Zs out of.

[images: $_75.JPG, ttn_z4800k_34l_74816_2279.jpeg]


This is a serious card built for serious gamers. TITAN Z is designed with the highest-grade components to deliver the best experience – incredible speed and cool, quiet performance—all in a stunningly crafted aluminum case.
 
Nvidia price trolling AMD - like they don't care?
 
...better than the R9 295X2.

[image: 2_Chips.jpg]
 
The PCIe 3.0 bandwidth is damn near saturated with a single GPU 680.
You mean PCIe 2.0... ;) PCIe 3.0 is not even saturated by a dual GPU card. Just check and compare the specs if you don't believe me. Google is your friend ;)
 
I am so tired of seeing dual GPU cards. This trick only works on people that do not understand how computers work. The PCIe 3.0 bandwidth is damn near saturated with a single GPU 680. I'm no engineer, but I'm pretty sure that a tractor trailer can't fit through the eye of a sewing needle...

I was under the impression that a GTX780 still doesn't saturate the bandwidth of PCIe 2.0? If so, we're pretty safe with dual video cards on PCIe 3.0.
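
For anyone wanting to sanity-check the bandwidth argument above, here's a rough back-of-envelope sketch, assuming x16 slots and the published per-lane transfer rates (actual game traffic over the bus is far below either figure):

```python
# Rough theoretical per-direction bandwidth of a x16 slot.
# PCIe 2.0: 5 GT/s per lane with 8b/10b encoding   -> 500 MB/s usable per lane
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding -> ~985 MB/s usable per lane
LANES = 16

pcie2_per_lane = 5e9 * (8 / 10) / 8      # bytes/s per lane
pcie3_per_lane = 8e9 * (128 / 130) / 8   # bytes/s per lane

print(f"PCIe 2.0 x16: {pcie2_per_lane * LANES / 1e9:.1f} GB/s per direction")  # ~8.0 GB/s
print(f"PCIe 3.0 x16: {pcie3_per_lane * LANES / 1e9:.1f} GB/s per direction")  # ~15.8 GB/s
```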
 
I'm still disappointed in Jen-Hsun.

First he shows Pascal, which is irrelevant for another 2-3 years, then a few moments later he shows this meh Titan-Z GK110 GPU...


[image: KCeOwiQ.jpg]

:D
 
I know this sounds stupid, but I don't see anything about GPU clocks or memory clocks.
 
I know this sounds stupid, but I don't see anything about GPU clocks or memory clocks.

288 GB/s means 6 GHz memory, just like on the old Titan; 8 TFLOPS means ~700 MHz per GPU.
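
The working behind that estimate, as a quick sketch assuming two full GK110s at 2880 CUDA cores each, a 384-bit memory bus per GPU, and 2 FP32 FLOPs per core per clock:

```python
# Back-calculate the Titan Z clocks from NVIDIA's headline numbers.
BUS_WIDTH_BITS = 384          # per GPU, same as the original Titan
CORES_PER_GPU = 2880          # full GK110
GPUS = 2
FLOPS_PER_CORE_PER_CLOCK = 2  # one FP32 FMA per clock

# 288 GB/s per GPU -> effective memory data rate
mem_rate_gbps = 288e9 * 8 / BUS_WIDTH_BITS / 1e9
print(f"Effective memory clock: {mem_rate_gbps:.0f} Gbps (i.e. '6 GHz' GDDR5)")

# 8 TFLOPS total across both GPUs -> implied core clock
core_clock_mhz = 8e12 / (GPUS * CORES_PER_GPU * FLOPS_PER_CORE_PER_CLOCK) / 1e6
print(f"Implied core clock: ~{core_clock_mhz:.0f} MHz")  # ~694 MHz
```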
 
I'd be lying if I said I wouldn't want that lol... I'd take one in a heartbeat if I were financially capable of spending this much on something I use for gaming, but to me this card seems to be focused on pros. I would love to have one, or why stick with one when two of them is better lol.

Now if someone could show me how I can cram this into my iMac, that would be great lol.
 
288 GB/s means 6 GHz memory, just like on the old Titan; 8 TFLOPS means ~700 MHz per GPU.

Well, since this thing is half gaming graphics card and half pro graphics card - just curiosity, but that so-called "5K", is it true? AMD announced the R9 290 series as ready for 4K, and now Nvidia says this card is ready for 5K :eek::confused::confused:
 
I don't know very much about compute cards, but does the Titan Z even make sense for someone who needs a card for compute and gaming? Wouldn't 2 Titans for $1,000 less be a better deal? I don't think Nvidia is going to release the Titan Z with anywhere close to a 500-watt TDP, so it won't be as powerful as 2 Titans.
 
For $3000, this card better come with sharks with frickin' laser beams attached to their heads.
 
After listening to all this, I have an atypical interpretation (or as "Serpent of Darkness" covers in point #5). Nvidia enjoys (nigh on demands) such PR to keep selling GK110s as gaming offerings - hear me out.

Nvidia knows the tipping point at which they can recoup the engineering, tooling, and manufacturing costs to deliver such a card, and they have a good idea of the number they can expect to sell. Even if those units go mainly to enterprise purchasers, the PR frenzy it whips up among "gamers" just adds to the business case for releasing it. It's a perfectly good plan, and it returns revenue - actually better profit than selling "X" amount of chips individually (at lower margins) as offerings that counteract AMD's Hawaii products. Nvidia gets to elevate the brand even higher, use up chips on extreme product offerings, and that actually adds "cred" to themselves as not directly vying with AMD.

We know 20nm Maxwell is some time off; Nvidia needs to maintain GK110 production, but can't hold margins selling a bunch in some price war, especially the good full-compute parts. I think Nvidia realized the dual-chip board lends itself more to low-end enterprise, as the package provides substantial punch in more non-traditional chassis arrangements. Two of them give 4x the compute without the need for risers on a more traditional (cost-effective) motherboard, though honestly I don't know how/what is entailed in that today. So they release this as it devours two full chips, pays the overhead, and returns profit, all the while shoring up the legitimacy of the pricing on the GTX 780/780 Ti. Best of all, it keeps the gamers in the side-bar discussing it. It's a very shrewd move to the extent that they use 2x the product (GK110) in an exceptionally high-margin offering.

It will find buyers mostly from enterprise/compute customers who hadn't justified Tesla/Quadro pricing. I see it as a product that 85% of even "bleeding edge gamers" won't step up to purchase... that's fine. It's a card that folks can use for experimentation, pushing huge resolutions and multiple monitor configurations, and if some deep-pocketed gamer finds it worthwhile... all the more merry.
 
No problem
[image: xzibit-pimp-my-ride.jpg]

Before
[image: $_75.JPG]


Pimp'd out
[image: wood2.jpg]


Here's some more Titan Zoolander

[image: 1966029_10152066116823253_398909557_o.jpg]


2x8-pin
[image: 1559260_10152066117613253_471115848_o.jpg]

3-slot, so very limited for SFF like ITX, where there are only 2 slots and depth can be an issue
[image: 1052237_10152066116968253_1377179069_o.jpg]
Disappointed in the lack of a bubble gum dispenser, bowling ball shiner, and a fish tank
 
Silly card... gaming? A workstation card with half the RAM... just silly...
 
1. +1
2. Depending on in-game settings and the title, 6GB will NOT be enough for 3x 4K (see the rough pixel math after this list)... FFS, BF4 at default Ultra uses almost 3GB at 2560x1440...
3. +1
4. +1
5. Prices do not always have to go up when sales are good... basic business principle there...*blows dust off degree* LOL!
6. Who knows...
7. I had no idea the (vaporware?) 2x R9 290X was scaling 2x consistently. Like all dual card setups, it depends on several factors to determine scaling... drivers, the title, resolution, in-game settings, etc.
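
For point 2, a rough sketch of why 3x 4K is so much heavier than 1440p, purely on pixel count (VRAM use doesn't scale linearly with resolution, so treat this as lower-bound intuition, not a prediction):

```python
# Compare total pixels pushed at 2560x1440 vs a triple-4K surround setup.
base = 2560 * 1440            # ~3.7 MPixels
triple_4k = 3 * 3840 * 2160   # ~24.9 MPixels

print(f"1440p: {base / 1e6:.1f} MPixels")
print(f"3x 4K: {triple_4k / 1e6:.1f} MPixels")
print(f"Ratio: {triple_4k / base:.1f}x the pixels")  # ~6.8x
```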
 
I can't imagine even the hardest core gamer buying one of these. Maybe some rich sheik addicted to WoW. A handful of people with the biggest and baddest flight sim setups that are already megabuck cockpits.

The sale to compute users isn't entirely clear either: a Titan Z may use 3/4 the room of a pair of Titan Blacks, but 2x Titan Black is 2/3 the cost, and probably faster, and far less costly to replace if one fails.

(Disclaimer: I have an original firmware-flashed Titan, which has been quite good.)
 
I can't imagine even the hardest core gamer buying one of these. Maybe some rich sheik addicted to WoW
They'd have to be a rich sheik with brain damage.
By all accounts, EVGA will sell the 6GB 780 (including the Kingpin) and GTX 780 Ti for around the same money as the current 3GB versions. A couple of 780 Tis for around half the money of a Titan Z, and I'm pretty sure which of the two will overclock better.
The sale to compute users isn't entirely clear either: a Titan Z may use 3/4 the room of a pair of Titan Blacks, but 2x Titan Black is 2/3 the cost, and probably faster, and far less costly to replace if one fails
Not necessarily. Tyan sells a custom barebones especially for rendering (the first board below is actually by Trenton, the second a Supermicro), and there is a reason why they are fully stocked with boards. If the 4U (in this case) can accommodate 10 GPUs (5x Titan Z), why would they stick with 8 Titan/Titan Blacks? You are potentially losing 20% of the possible performance per unit. It isn't really that much different from server CPU economics: the initial cost may be somewhat less than needing extra hardware to achieve the same throughput. There's a reason that multi-PCI-E-slot boards are available, and they generally revolve around putting as much processing power as possible into a single unit.
[images: bpg8032_pci_express_backplane_ovh1.png, FT77A_8x_Titans.png, X9DRX_-F_spec.jpg]
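
A quick sketch of that density argument using only the numbers from this thread (5x Titan Z vs 8x Titan Black per 4U box, and the $3000 / $1000 list prices mentioned above; chassis, CPU and power costs ignored):

```python
# GPU dies and card cost per 4U chassis, per the figures in this thread.
TITAN_Z_PRICE, TITAN_BLACK_PRICE = 3000, 1000

z_gpus,     z_cost     = 5 * 2, 5 * TITAN_Z_PRICE      # 5 dual-GPU cards
black_gpus, black_cost = 8 * 1, 8 * TITAN_BLACK_PRICE  # 8 single-GPU cards

print(f"Titan Z box:     {z_gpus} GPUs for ${z_cost} (${z_cost // z_gpus}/GPU)")
print(f"Titan Black box: {black_gpus} GPUs for ${black_cost} (${black_cost // black_gpus}/GPU)")
print(f"Extra GPUs per box: {z_gpus - black_gpus} (+{(z_gpus - black_gpus) / black_gpus:.0%})")
```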
 