Friday, October 10th 2014

Galaxy Intros Single-slot GeForce GTX 750 Ti Razor Graphics Card

Galaxy launched its single-slot GeForce GTX 750 Ti Razor graphics card in Europe, under its Galax brand and the KFA2 brand it is phasing out. This is the second single-slot GTX 750 Ti card, after ELSA launched one in certain APAC markets. Galaxy's card uses a typical nickel-plated copper-channel heatsink, which draws heat from a copper plate that makes contact with the GPU, memory, and VRM. The card relies entirely on the PCI-Express 3.0 x16 bus for its power. Display outputs include one each of D-Sub, dual-link DVI, and HDMI. The card ships with clock speeds of 1020 MHz core, 1080 MHz GPU Boost, and 5.40 GHz (GDDR5-effective) memory. Available now, the card is priced at €139.90, including taxes.

32 Comments on Galaxy Intros Single-slot GeForce GTX 750 Ti Razor Graphics Card

#1
RCoon
The hell is that DVI port? Why not a full DVI port? This card is what a lot of people want, but that DVI port is a bit of a bummer.
#2
EzioAs
RCoon: The hell is that DVI port? Why not a full DVI port? This card is what a lot of people want, but that DVI port is a bit of a bummer.
What's wrong with DVI-D?
#3
PLAfiller
Just flashing the pansies, I see. Awesome card, if there's wide availability or an easy way to get it. Why D-Sub? How about DP + 2x HDMI? Or mini DP + 2x mini HDMI? OK, dual-link DVI is alright, but that D-Sub, come on, 2015 is knocking on the door.

Edit: while writing, 2 other people commented on the ports...wow :)
#4
RCoon
EzioAs: What's wrong with DVI-D?
Doesn't work with converter cables. In the event somebody wants to run a VGA>DVI converter with this card, they can't because it's missing the 4 pin holes.

#5
Shou Miko
Why not use the name on the card, which is labelled KFA2, not Galaxy? Even if it's the same company, KFA2 is just one of Galaxy's brands ;)
#6
Shou Miko
RCoon: Doesn't work with converter cables. In the event somebody wants to run a VGA>DVI converter with this card, they can't because it's missing the 4 pin holes.
You've got a dedicated VGA port, so there's no need for adapters, and if you only have old VGA screens, maybe it's time for an upgrade on your screens too? ;)
#7
EzioAs
RCoon: Doesn't work with converter cables. In the event somebody wants to run a VGA>DVI converter with this card, they can't because it's missing the 4 pin holes.
If they want VGA, there's a port right there...I still don't see what the problem is.

Ninja'd by puma I see...
#8
RCoon
puma99dk|: You've got a dedicated VGA port, so there's no need for adapters, and if you only have old VGA screens, maybe it's time for an upgrade on your screens too? ;)
Valid point, but the reality is some people will need that ability. At my workplace we buy low-power basic GPUs for multi-screen setups. For us, that DVI-D instead of DVI-I would be a huge deal breaker. Those requirements still exist, regardless of how ancient they are.
EzioAs: If they want VGA, there's a port right there...I still don't see what the problem is.
People will run more than just one VGA monitor. Most basic corporate businesses buy cheap monitors, most of which still only have VGA.

I agree with everyone, VGA should die, but the fact is it hasn't, so it is still a requirement until panel makers stop making VGA panels.
#9
EzioAs
RCoon: Valid point, but the reality is some people will need that ability. At my workplace we buy low-power basic GPUs for multi-screen setups. For us, that DVI-D instead of DVI-I would be a huge deal breaker. Those requirements still exist, regardless of how ancient they are.

People will run more than just one VGA monitor. Most basic corporate businesses buy cheap monitors, most of which still only have VGA.
I see your point...I use two monitors as well, one using DVI and one using VGA (an old monitor) with an adapter, but I can see the problem if there is someone out there who uses two VGA-only monitors. I'm glad my card came with at least one DVI-I so I can use an adapter.
#10
Shou Miko
RCoon: Valid point, but the reality is some people will need that ability. At my workplace we buy low-power basic GPUs for multi-screen setups. For us, that DVI-D instead of DVI-I would be a huge deal breaker. Those requirements still exist, regardless of how ancient they are.

People will run more than just one VGA monitor. Most basic corporate businesses buy cheap monitors, most of which still only have VGA.

I agree with everyone, VGA should die, but the fact is it hasn't, so it is still a requirement until panel makers stop making VGA panels.
I know this will still exist, RCoon, but more and more manufacturers of SFF computers now use one DisplayPort and one VGA, with no DVI, which is annoying sometimes. But maybe in the near future they will start using only DisplayPort and no analog connection...
#11
Danieldee
750ti? Why? It's time for 950(ti).
#12
SaltyFish
Oh, that thing from Computex this year. Nice to see it finally rolling out. Wonder if that fancy-pants Darbee card is right behind.

As for the ports, dual link DVI-I would've been better for usage versatility. I can see people still using dual CRTs and/or el-cheapo LCDs that only have VGA inputs. But for business use? Isn't a GTX 750 Ti a little over-the-top for most business applications? And the few that aren't generally require specialized hardware.
#13
64K
Danieldee: 750ti? Why? It's time for 950(ti).
If they made a GTX 950Ti right now it would be the same as the GTX 750Ti. They would both be 28nm Maxwell. Possibly they will do this with the die shrink to 20nm next year.
#14
GhostRyder
I love it!!!

Though there are two things I would love to see still

1: DP
2: a low profile variant

Bit of a big request, but I would think a low-profile variant is possible, based on other cards I have seen like that. Either way, I love single-slot cards like this because they make me want to build a tiny computer.
#15
Shou Miko
GhostRyder: I love it!!!

Though there are two things I would love to see still

1: DP
2: a low profile variant

Bit of a big request, but I would think a low-profile variant is possible, based on other cards I have seen like that. Either way, I love single-slot cards like this because they make me want to build a tiny computer.
maybe that will come with a GTX 960 or 950/Ti ;)
#16
Easy Rhino
Linux Advocate
At 1080p with a modern CPU, this little card will run ALL games above 40 fps on high settings. Don't let the hardware snobs talk down to you. This card, with that small footprint and power usage, is great for 95% of PC gamers out there.
#17
RadeonProVega
Galaxy cards for me always get too hot and seem to take way too much power. I had to take two back in the past, so I stick with PNY. Nice-looking card though.
#18
ypsylon
Fine idea, but... the cooling system probably sounds like a jumbo jet during takeoff. That's the biggest downer.

Nice addition for a media PC or something along those lines, but the noise of a tiny fan? Thanks, but no thanks.
#19
Tatsu
u2konline: Galaxy cards for me always get too hot and seem to take way too much power. I had to take two back in the past, so I stick with PNY. Nice-looking card though.
I can't see how it can take way too much power when there is no 6-pin connector; it draws all power through the slot.
#20
BorisDG
It's interesting that they only released it now. You can see it written on the PCB - 1350 - which means the 50th week of 2013, so this is a pretty old card being released a year later.
#22
newtekie1
Semi-Retired Folder
RCoon: Valid point, but the reality is some people will need that ability. At my workplace we buy low-power basic GPUs for multi-screen setups. For us, that DVI-D instead of DVI-I would be a huge deal breaker. Those requirements still exist, regardless of how ancient they are.
The 750 Ti, like most other recent cards, only has one analog output. Since this card already has a dedicated VGA port, it would be impossible to make the DVI port a DVI-I port; it has to be DVI-D. There is physically no second analog output from the GPU to connect to the DVI port.
#23
Shou Miko
newtekie1: The 750 Ti, like most other recent cards, only has one analog output. Since this card already has a dedicated VGA port, it would be impossible to make the DVI port a DVI-I port; it has to be DVI-D. There is physically no second analog output from the GPU to connect to the DVI port.
So buy a DVI-D to VGA adapter:

www.amazon.com/dp/B007EA48MA/?tag=tec06d-20
#24
de.das.dude
Pro Indian Modder
RCoon: Valid point, but the reality is some people will need that ability. At my workplace we buy low-power basic GPUs for multi-screen setups. For us, that DVI-D instead of DVI-I would be a huge deal breaker. Those requirements still exist, regardless of how ancient they are.

People will run more than just one VGA monitor. Most basic corporate businesses buy cheap monitors, most of which still only have VGA.

I agree with everyone, VGA should die, but the fact is it hasn't, so it is still a requirement until panel makers stop making VGA panels.
There are HDMI to VGA converters!
#25
Assimilator
We have a lot of VGA monitors at my workplace, but all the PCs have dual DisplayPort outputs, so we just buy a lot of DP-to-VGA converters. Not cheap, but cheaper than replacing perfectly serviceable monitors.