ZOTAC Announces GeForce GTX 550 Ti Multiview Graphics Card

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,300 (7.53/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
ZOTAC International, a leading innovator and the world's largest channel manufacturer of graphics cards, motherboards and mini-PCs, today expands the GeForce Multiview line-up with the addition of the ZOTAC GeForce GTX 550 Ti Multiview. The ZOTAC GeForce GTX 550 Ti Multiview is the latest mainstream graphics card capable of delivering a seamless triple-display computing experience.

"ZOTAC believes multi-monitor computing is the future of desktop computing. Triple displays are the sweet spot where productivity and the gaming experience drastically improves without overwhelming users with too many monitors," said Carsten Berger, marketing director, ZOTAC International. "With our latest ZOTAC GeForce GTX 550 Ti Multiview, we are able to deliver a quality visual experience that combines Microsoft DirectX 11 compatibility, NVIDIA CUDA technology and triple simultaneous displays at a mainstream price point."



The ZOTAC GeForce GTX 550 Ti Multiview is powered by the latest NVIDIA GeForce GTX 550 Ti graphics processor with 192 lightning-fast unified shaders paired with 1 GB of GDDR5 memory. The combination enables users to experience stunning visual quality while maintaining class-leading performance-per-watt.

It's time to play with triple displays and the ZOTAC GeForce GTX 550 Ti Multiview.



View at TechPowerUp Main Site
 

Rebelstar

New Member
Joined
Sep 3, 2010
Messages
71 (0.01/day)
Location
Minsk, Belarus
Display(s) custom thin bezel eyefinity 1x3 portrait
Case Cooler Master HAF 932
Power Supply CoolerMaster SilentPro M850
Software Windows 7 Enterprise x64
Will somebody explain to me why Zotac makes triple-display cards only at the low end? Why not for the 570/580? This is so A N N O Y I N G ! I don't want to buy two cards for triple-display gaming, I want to buy a single card like an AMD 5XXX or 6XXX.
 
Joined
May 4, 2009
Messages
1,972 (0.35/day)
Location
Bulgaria
System Name penguin
Processor R7 5700G
Motherboard Asrock B450M Pro4
Cooling Some CM tower cooler that will fit my case
Memory 4 x 8GB Kingston HyperX Fury 2666MHz
Video Card(s) IGP
Storage ADATA SU800 512GB
Display(s) 27' LG
Case Zalman
Audio Device(s) stock
Power Supply Seasonic SS-620GM
Software win10
Will somebody explain to me why Zotac makes triple-display cards only at the low end? Why not for the 570/580? This is so A N N O Y I N G ! I don't want to buy two cards for triple-display gaming, I want to buy a single card like an AMD 5XXX or 6XXX.

I'd like to find out as well :confused:
Also, if Zotac can do it, why can't NVIDIA offer it as an option by default?
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.09/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Also, if Zotac can do it, why can't NVIDIA offer it as an option by default?

Zotac does it by adding a separate display logic chip to the card, think G80 and G200. Everyone complained about how NVIDIA had to separate the display logic on those GPUs, so they didn't want to do it again.
 