
New GTX TITAN-Z Launch Details Emerge

I'm sorry, but ATI has always been superior; look at the "overkillness" they slapped onto the 295X2.

Be more mature...

Nevertheless, the design looks underpowered. Both the Titan Z ("too many zeroes") and the 295X2 ("Celsius") are design failures: utterly useless for the given price, the R&D cost and everything else. They're just boxes to tick, like the ones we've had before...
 
Be more mature...

Nevertheless, the design looks underpowered. Both the Titan Z ("too many zeroes") and the 295X2 ("Celsius") are design failures: utterly useless for the given price, the R&D cost and everything else. They're just boxes to tick, like the ones we've had before...

I can't see any childish remark in my post; I think it's a genuinely positive thing for a customer to ask for a decent power-delivery section on a graphics card of this caliber.

For 3K USD I expect NOTHING less than overkill.

It's not entirely about the number of phases, but also the capacity each phase is rated for.

I expect that, having dropped one phase per GPU, the remaining ones are rated a bit higher to compensate, but who knows.

I honestly think they won't be rated any higher than what Nvidia has been using on the reference 780/Titan/780 Ti; they are almost all the same.

I bet this GPU will blow when pushed past 110% TDP, but hey, we shouldn't overclock our GPUs, right? :)
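As a back-of-the-envelope check, the worry here is the current the core rail must supply at a raised power limit versus the combined rating of the phases. A minimal Python sketch, with purely hypothetical phase counts, per-phase ratings, rail voltage and board power (none of these are confirmed Titan Z figures):

# Rough sanity check of a GPU power-delivery (VRM) budget: total phase capacity
# versus the current the core rail has to supply at a given power limit.
# All figures below are hypothetical placeholders, not Titan Z specifications.

def vrm_headroom(phases, amps_per_phase, board_power_w, core_voltage_v, power_limit_pct):
    """Return (required_amps, available_amps, headroom_ratio)."""
    drawn_watts = board_power_w * power_limit_pct / 100.0
    required_amps = drawn_watts / core_voltage_v   # I = P / V for the core rail
    available_amps = phases * amps_per_phase       # combined rating of all phases
    return required_amps, available_amps, available_amps / required_amps

# Example: 6 phases rated 40 A each, a 250 W core rail at 1.1 V, pushed to 110% TDP.
req, avail, ratio = vrm_headroom(phases=6, amps_per_phase=40,
                                 board_power_w=250, core_voltage_v=1.1,
                                 power_limit_pct=110)
print(f"required ~{req:.0f} A, available {avail:.0f} A, headroom x{ratio:.2f}")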
 
For 3K USD I expect NOTHING less than overkill.

Well mate... it reminds me of this :D. And another thing... it's just the way things work... they do it because they CAN.

GTX 295 X4 Black Edition Sonic Ultra.jpg
 
I can't see any childish remark in my post; I think it's a genuinely positive thing for a customer to ask for a decent power-delivery section on a graphics card of this caliber.

I do. I don't know if you haven't noticed or are pretending not to notice, but nVidia has been smacking some pretty impressive power-efficiency numbers in AMD's face. Nothing personal; just pick any Maxwell-based card (the 750 Ti, for example: 4 W idle, 5 W multi-monitor, etc.). I think they know what they are doing with power. At least they have some pretty serious testimony for it. Of course, nobody is "bullet-proof" against mistakes, but I personally trust these guys.
 
I do. I don't know if you haven't noticed or are pretending not to notice, but nVidia has been smacking some pretty impressive power-efficiency numbers in AMD's face. Nothing personal; just pick any Maxwell-based card (the 750 Ti, for example: 4 W idle, 5 W multi-monitor, etc.). I think they know what they are doing with power. At least they have some pretty serious testimony for it. Of course, nobody is "bullet-proof" against mistakes, but I personally trust these guys.

Do you realize that we are talking about the power-delivery section and not power consumption? Those are two completely different things.
 
I do. I don't know if you haven't noticed or are pretending not to notice, but nVidia has been smacking some pretty impressive power-efficiency numbers in AMD's face. Nothing personal; just pick any Maxwell-based card (the 750 Ti, for example: 4 W idle, 5 W multi-monitor, etc.). I think they know what they are doing with power. At least they have some pretty serious testimony for it. Of course, nobody is "bullet-proof" against mistakes, but I personally trust these guys.
Unfortunately this board (the Titan Z) isn't Maxwell...and people looking at the top of the performance hierarchy tend to be happy for efficiency to play second fiddle to outright performance.

The Titan Z seems to fall into the chasm between usability and outright performance. Nvidia obviously tried to squeeze as much into a conventional air-cooled card as possible, but it falls short against the competition. AMD have shown in the past that they have no qualms about ignoring the PCI-SIG (the HD 6990 and 7990), but it's unlikely that Nvidia expected AMD to put out the first 500-watt reference card, or the first water-cooled reference card for that matter. In this instance (the top of the model line) brute force trumps efficiency, and Nvidia will be pilloried for being too conservative even if they relaunch the card sans FP64 as a GTX 790. Having said that, I fully expect both cards to enjoy the short, intermittent production runs and free-falling depreciation enjoyed by their dual-GPU predecessors.

The sad thing is that one camp has a $3,000 card, and the other camp has a 500-watt card. I'm not entirely sure we're heading in the right direction. ;)
 
Can I ask a stupid question? Isn't it just ten times better to buy two GTX 780 Ti cards for $1,500 and have four slots taken, instead of buying one card for $3,000, taking three slots, and getting 20% LESS performance?!? I mean, seriously, what's the deal with this card???? For professional use there are better cards for the same price. I mean, is it only me, or does this card seem like an abomination!?
 
To be fair, your question is not stupid at all; I feel the same about it.

Two Titan Blacks for DP make it obsolete, and two 780 Tis for gaming make it obsolete.

It's just a hype/halo/whatever product.
 

64K

Can I ask a stupid question? Isn't it just ten times better to buy two GTX 780 Ti cards for $1,500 and have four slots taken, instead of buying one card for $3,000, taking three slots, and getting 20% LESS performance?!? I mean, seriously, what's the deal with this card???? For professional use there are better cards for the same price. I mean, is it only me, or does this card seem like an abomination!?

No, your question is not in any way stupid. Gamers are trying to figure out why Nvidia is calling this a GeForce card and aiming it at gamers, and though I have no experience with professional cards, I have to wonder why 2x Titan Black, for less money and more performance, wouldn't be better. As a gamer, the Titan Z would never have come across my radar if Nvidia hadn't labeled it GeForce and aimed it at gamers.

http://www.geforce.com/whats-new/articles/announcing-the-geforce-gtx-titan-z

http://blogs.nvidia.com/blog/2014/03/25/titan-z/
 
I mean, seriously, what's the deal with this card???? For professional use there are better cards for the same price. I mean, is it only me, or does this card seem like an abomination!?

Having 2 GPUs on a single PCB is actually a good thing for compute professionals who need double-precision performance but want a cheaper alternative to an array of Tesla cards.
There is no requirement for SLI with compute work, so with PCIe risers one can build a massive GPU array while reducing the overall number of systems (fewer CPUs, motherboards and PSUs needed) on site.
This card is for companies that are building their own supercomputer.
Additionally, marketing it as a GeForce product because it runs games beautifully? Nvidia would be crazy not to. Promote synergy like a boss and all that.
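To illustrate the no-SLI point: compute jobs typically just pin one worker process to each GPU and let them run independently. A minimal sketch, assuming a hypothetical worker.py CUDA script and an example GPU count; CUDA_VISIBLE_DEVICES is the standard environment variable for restricting which device each process sees:

# Each worker process is pinned to one GPU and crunches its own chunk of data.
# No SLI, no peer-to-peer: the GPUs never need to talk to each other.
import os
import subprocess

NUM_GPUS = 4   # e.g. two dual-GPU cards in one box (example value)
jobs = []
for gpu in range(NUM_GPUS):
    env = os.environ.copy()
    env["CUDA_VISIBLE_DEVICES"] = str(gpu)   # this worker only sees one GPU
    jobs.append(subprocess.Popen(
        ["python", "worker.py", "--chunk", str(gpu)],   # hypothetical CUDA worker script
        env=env))

for job in jobs:
    job.wait()

Scaling out is then just a matter of adding more GPUs (or more boxes), which is why fewer host systems per GPU matters for this kind of build.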
 
Having 2 GPUs on a single PCB is actually a good thing for compute professionals who need double-precision performance but want a cheaper alternative to an array of Tesla cards.
There is no requirement for SLI with compute work, so with PCIe risers one can build a massive GPU array while reducing the overall number of systems (fewer CPUs, motherboards and PSUs needed) on site.
This card is for companies that are building their own supercomputer.
Additionally, marketing it as a GeForce product because it runs games beautifully? Nvidia would be crazy not to. Promote synergy like a boss and all that.

Not a very smart company if they're building a supercomputer from a gaming product with no ECC and half the memory per GPU. Kind of takes the "super" out of it altogether.
 
Not a very smart company if they're building a supercomputer from a gaming product with no ECC and half the memory per GPU.

Well, you do get what you pay for... a single Tesla K40 is 5.5K USD... and that's one Kepler GPU.
 

DrunkMonk74

Anyone think that when nVidia's partners get their hands on this card, they'll be able to squeeze some more juice out of it? Maybe along the lines of EVGA's ACX cooler that you find on EVGA's version of the 780 Ti?

If you look at EVGA's K|NGP|N edition of the 780 Ti, its base clock is 1072 MHz!! Rumour is that EVGA is also going to be bringing out a 6 GB version of that card. The current 3 GB version sells for 859.99 USD from EVGA themselves, so a 6 GB version might be closer to 1000 USD. Buy two of those, SLI them, and you'll have all the power you need for quite some time, and probably save yourself 1000 USD along the way.
 
Anyone think that when nVidia's partners get their hands on this card, they'll be able to squeeze some more juice out of it? Maybe along the lines of EVGA's ACX cooler that you find on EVGA's version of the 780 Ti?
If it gets any kind of treatment then it should be a HydroCopper Classified solution. A custom air-cooled card along the lines of the KPE would certainly be possible, but would the sales and PR justify the development expenditure?
Judging by the initial testing, the card has some headroom. A 1050 MHz boost on an unheated board isn't too bad, so either a waterblock or a reworked air cooler with larger fans, such as the ACX, would be a better bet for maintaining that kind of level without having to ramp the fan speed to max.
Not a very smart company if they're building a supercomputer from a gaming product with no ECC and half the memory per GPU
ECC for GDDR5 isn't really needed unless the workload is of critical importance. GDDR5 already has EDC (Error Detection Code) built in, which detects errors across the memory bus. The only errors it can't check for are memory IC faults and GPU memory-controller errors, both of which (along with GPU runtime validation) are stringently binned for to produce pro cards. It's why a K40 (as BiggieShady mentioned) costs 4-5 times the price of a Titan Black, and a W9100 costs 6 times as much as a 290X.
 
ECC for GDDR5 isn't really needed unless the workload is of critical importance. GDDR5 already has EDC (Error Detection Code) built in, which detects errors across the memory bus. The only errors it can't check for are memory IC faults and GPU memory-controller errors, both of which (along with GPU runtime validation) are stringently binned for to produce pro cards. It's why a K40 (as BiggieShady mentioned) costs 4-5 times the price of a Titan Black, and a W9100 costs 6 times as much as a 290X.

I believe this is what he said

Having 2 GPUs on a single PCB is actually a good thing for compute professionals who need double-precision performance but want a cheaper alternative to an array of Tesla cards.
There is no requirement for SLI with compute work, so with PCIe risers one can build a massive GPU array while reducing the overall number of systems (fewer CPUs, motherboards and PSUs needed) on site.
This card is for companies that are building their own supercomputer.
Additionally, marketing it as a GeForce product because it runs games beautifully? Nvidia would be crazy not to. Promote synergy like a boss and all that.

Nvidia Titan Z (12 GB, 6 GB per GPU)
single precision = 8.0 TFLOPS (2.6 TFLOPS per slot)
double precision = 2.6 TFLOPS (0.86 TFLOPS per slot)
375 W TDP, $2,999

I'll save you some money on Nvidia at the same site:

Nvidia Tesla K40 (12 GB)
single precision = 4.29 TFLOPS (2.14 TFLOPS per slot)
double precision = 1.43 TFLOPS (0.71 TFLOPS per slot)
235 W TDP, $4,245

Nvidia Quadro K6000 (12 GB)
single precision = 5.2 TFLOPS (2.6 TFLOPS per slot)
double precision = 1.7 TFLOPS (0.85 TFLOPS per slot)
225 W TDP, $4,235

AMD FirePro W9100 (16 GB)
single precision = 5.24 TFLOPS (2.62 TFLOPS per slot)
double precision = 2.62 TFLOPS (1.31 TFLOPS per slot)
275 W TDP, $3,499

You don't build supercomputers with a card from a gaming stack. Unless uptime, errors and stability aren't a concern.
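The per-slot and per-dollar figures can be re-derived from the specs listed above; a small Python sketch, assuming the usual 3-slot width for the Titan Z and 2 slots for the pro boards (as the per-slot values imply), with small differences from the hand-rounded numbers in the posts and from the $3800 street price used for the K6000 later in the thread:

# Re-derive double-precision throughput per slot and per dollar from the list above.
cards = [
    # name,            FP64 TFLOPS, slots, list price (USD)
    ("Titan Z",        2.60,        3,     2999),
    ("Tesla K40",      1.43,        2,     4245),
    ("Quadro K6000",   1.70,        2,     4235),
    ("FirePro W9100",  2.62,        2,     3499),
]

for name, fp64_tf, slots, price in cards:
    per_slot = fp64_tf / slots            # TFLOPS per occupied slot
    per_dollar = fp64_tf * 1000 / price   # GFLOPS per dollar at list price
    print(f"{name:14s}  {per_slot:.2f} TFLOPS/slot  {per_dollar:.2f} GFLOPS/$")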
 
I believe this is what he said
I was answering your post, not anyone else's. The fact that I quoted your post should have been a dead giveaway. Why you answered a post concerning double precision with a supposed need for ECC I have no idea; they aren't inextricably linked.
As for whatever vague point you're making, there are plenty of instances where FP64 could be useful to a prosumer (mixed single + double precision workloads such as 3D modelling).
Nvidia Titan Z 12 GB
single precision = 8.0 Tflops
double precision = 2.6 Tflops
$2,999

I'll save you some money on Nvidia at the same site

Nvidia Tesla K40 12 GB
single precision = 4.29 Tflops
double precision = 1.43 Tflops
$4,245

Nvidia Quadro K6000 12 GB
single precision = 5.2 Tflops
double precision = 1.7 Tflops
$4,235

AMD FirePro W9100 16 GB
single precision = 5.24 Tflops
double precision = 2.62 Tflops
$3,499
So, judging by the bolding and price inclusion, you're saying double precision:
Titan Z...0.87 GFLOPS/$
W9100..0.75 GFLOPS/$
K6000...0.45 GFLOPS/$ (the card is available for $3800)
K40.......0.34 GFLOPS/$

Not sure how the Tesla, Quadro, or FirePro are supposed to be "saving some money".

Of course, it's still an apples vs oranges scenario. Professional drivers, software (Nvidia's OptiX, SceniX, CompleX etc.), support, binning, and a larger frame buffer (the Titan Z isn't a 12GB card, it's a 2 x 6GB card) should all add value to the pro boards regardless of vendor.

A further point to note is that Nvidia's FLOPS numbers are calculated at base clock, not boost (which is correct for double precision, since boost is disabled, but understates single precision whether you take the guaranteed minimum boost or the maximum sustained boost). The FLOPS for AMD's cards are calculated at maximum boost, whether that is attainable/sustainable or not.
Case in point: the GTX 780 is quoted as having a 3977 GFLOPS FP32 rate (863 MHz base clock * 2304 cores * 2 ops/clock). But GPGPU apps can be as intensive as games. For the GTX 780 I have here at the moment, that same calculation gives 967 * 2304 * 2 = 4456 GFLOPS. In reality the card sustains a boost of 1085 MHz at stock settings (no extra voltage, no OC above factory, no change to the stock fan profile), so the actual FP32 rate would be 1085 * 2304 * 2 = 5000 GFLOPS.
A quick Heaven run shows how meaningless the base clock (and its associated numbers) is, and why it generally isn't worth the time to record.
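That arithmetic is just clock x cores x 2 (each CUDA core retires one fused multiply-add per clock, counted as two floating-point operations); a quick sketch reproducing the three GTX 780 numbers above:

# Theoretical FP32 throughput: clock (MHz) * CUDA cores * 2 FLOPs per core per clock.
# Core count and clocks are the GTX 780 figures quoted in the post above.
def fp32_gflops(clock_mhz, cuda_cores):
    return clock_mhz * cuda_cores * 2 / 1000.0

CORES = 2304  # GTX 780 (GK110)
for label, mhz in [("reference base clock", 863),
                   ("this card's quoted clock", 967),
                   ("sustained boost observed", 1085)]:
    print(f"{label:26s} {mhz} MHz -> {fp32_gflops(mhz, CORES):.0f} GFLOPS")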


You don't build supercomputers with a card from a gaming stack.
Jesus, how many times are you going to edit a post?

It probably depends upon your definition of a supercomputer. If it's an HPC cluster, then no, you wouldn't... but that's a very narrow association used by people with little technical knowledge of the range of compute solutions.
Other examples:
The Fastra II is a desktop supercomputer designed for tomography.
Rackmount GPU servers also generally come under the same heading, since big iron generally tends to be made up of the same hardware... just add more racks to a cabinet, and more cabinets to a cluster, etc.
I'd also note that they aren't "one-offs" as you opined once before, as explained here: "We build and ship at least a few like this every month".
Go nuts, configure away.
 
The GTX Titan also shines at CUDA; untapped CUDA power is nothing to overlook.

I've been abusing my two graphics cards for rendering, work and fun, especially now that you can purchase multiple render platforms that support CUDA rendering.

I personally use the Octane CUDA render plugins (3ds Max and Poser) for my own enjoyment when I'm free, and V-Ray's CUDA renderer for work.

You don't need a Tesla/Quadro card for CUDA rendering :)

I would still purchase two separate GTX Titans (not the Black, because the difference is minimal) over this one. Save one slot for what?
 
You don't build supercomputers with a card from a gaming stack. Unless uptime, errors and stability aren't a concern.

Look at it from the standpoint of a testing environment versus a production environment. If you own a software company and want to offer a solution for CUDA-based supercomputers, it is economically more feasible to do development on Titans and deploy on the customer's Tesla array.
 
I was answering your post, not anyone else's. The fact that I quoted your post should have been a dead giveaway. Why you answered a post concerning double precision with a supposed need for ECC I have no idea; they aren't inextricably linked.
As for whatever vague point you're making, there are plenty of instances where FP64 could be useful to a prosumer (mixed single + double precision workloads such as 3D modelling).

So, judging by the bolding and price inclusion, you're saying double precision:
Titan Z...0.87 GFLOPS/$
W9100..0.75 GFLOPS/$
K6000...0.45 GFLOPS/$ (the card is available for $3800)
K40.......0.34 GFLOPS/$
Stop defending the Titan's price with DP. A 7970 is $200 on eBay nowadays, and it has 947 GFLOPS of double precision. Sooooo....
7970... 4.735 GFLOPS/$
It makes all your numbers look like a rip-off. Not to mention the 7970 can easily OC well beyond its default 925 MHz.

Few people bought the first Titan for their work with CUDA. The others simply bought it because it was the best of its time, and they were f*cked really hard by nVidia with the release of the 780 and 780 Ti. Hopefully a smart gamer can learn from their demise and stay away from these stupidly overpriced cards.
 
Stop defending the Titan's price with DP. A 7970 is $200 on eBay nowadays, and it has 947 GFLOPS of double precision. Sooooo....
7970... 4.735 GFLOPS/$
It makes all your numbers look like a rip-off. Not to mention the 7970 can easily OC well beyond its default 925 MHz.
Newsflash, genius: I used the models and numbers provided by Xzibit in his comment to me. Do I care that you can get a 7970 on eBay for $200? Not really, since it isn't relevant; it wasn't part of the original data set provided by Xzibit, and if he'd included a bunch of other cards I'd have extrapolated their numbers too. All it proves is that the 7970 is a good secondhand deal (if it hasn't been half fried by a miner) and suffers from horrendous depreciation (a loss of 64% of its initial value in 28 months). Delve into the secondhand market and you can find comparable deals everywhere, since it seems to come as a shock to you that new card prices don't compare particularly well with pre-owned. How about a Quadro at 8.68 GFLOPS/$ with pro driver support thrown in, an HD 5970 at 16.87 GFLOPS/$, or a $20 desktop 8800 GTS that works out at 31.2 GFLOPS/$?
Few people bought the first Titan for their work with CUDA.
You got a link for that? Maybe some sales numbers? Even some anecdotal evidence would suffice... really.
 
Newsflash, genius: I used the models and numbers provided by Xzibit in his comment to me. Do I care that you can get a 7970 on eBay for $200? Not really, since it isn't relevant; it wasn't part of the original data set provided by Xzibit, and if he'd included a bunch of other cards I'd have extrapolated their numbers too. All it proves is that the 7970 is a good secondhand deal (if it hasn't been half fried by a miner) and suffers from horrendous depreciation (a loss of 64% of its initial value in 28 months). Delve into the secondhand market and you can find comparable deals everywhere, since it seems to come as a shock to you that new card prices don't compare particularly well with pre-owned. How about a Quadro at 8.68 GFLOPS/$ with pro driver support thrown in, an HD 5970 at 16.87 GFLOPS/$, or a $20 desktop 8800 GTS that works out at 31.2 GFLOPS/$?

You got a link for that? Maybe some sales numbers? Even some anecdotal evidence would suffice... really.
I can remind you that the 7970's release price was $549. But that's not my main point here.

Most people bought the Titan for GAMES, and nVidia's official drivers for this card have always been optimized for GAMES.

And that price for a gaming card is stupidly high.

Sales numbers cannot determine the buying purpose, unfortunately. However, you can go to any tech forum and see how those Titan buyers bragged about their FPS, just like AMD buyers have recently been talking about the hash rates of their cards.
 
Most people bought the Titan for GAMES, and nVidia's official drivers for this card have always been optimized for GAMES.

The graphics driver should be optimized for all applications (including games), but this is about the CUDA libraries and the CUDA driver (they are part of the standard GeForce driver package but are also independent). Also, the whole point is that the TITAN is not marketed only as a gaming card: https://developer.nvidia.com/ultimate-cuda-development-gpu
 
The graphics driver should be optimized for all applications (including games), but this is about the CUDA libraries and the CUDA driver (they are part of the standard GeForce driver package but are also independent). Also, the whole point is that the TITAN is not marketed only as a gaming card: https://developer.nvidia.com/ultimate-cuda-development-gpu

Nvidia just let us know that the Titan supports Dynamic Parallelism and Hyper-Q for CUDA streams, but does not support ECC, the RDMA feature of GPUDirect, or Hyper-Q for MPI connections.

The Titan brand is there to suck you into the CUDA ecosystem. Once they've got you there, it's not like you can buy Intel or AMD to use it.
 
^ Kinda what my point has always been.

Fingers crossed for Maxwell's Titan to have 12 GB of VRAM; fully GPU-rendered scenes can easily fill a 6 GB framebuffer, since you have to load all the textures onto the GPU.
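For a rough sense of scale, uncompressed textures alone can blow past a 6 GB framebuffer; a tiny sketch with made-up texture sizes and counts, not taken from any real scene:

# Rough illustration of why fully GPU-rendered scenes chew through VRAM: every
# texture has to reside on the card. Texture size and count are made-up examples.
def texture_mib(width, height, bytes_per_texel=4, mipmapped=True):
    base = width * height * bytes_per_texel            # uncompressed RGBA8
    return base * (4 / 3 if mipmapped else 1) / 2**20  # full mip chain adds ~1/3

NUM_TEXTURES = 150                   # hypothetical scene
per_tex = texture_mib(4096, 4096)    # ~85 MiB per 4K RGBA texture with mips
print(f"~{NUM_TEXTURES * per_tex / 1024:.1f} GiB of VRAM for textures alone")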
 