
ASUS Announces GeForce GTX 750 Ti Strix 4GB

btarunr

Editor & Senior Moderator
In a bid to woo those who choose graphics cards by memory amount (and cars by engine displacement), ASUS rolled out a 4 GB variant of its GeForce GTX 750 Ti Strix lower-mid-range graphics card. Pictured below, the card is built identically to the standard 2 GB model, but with double the GDDR5 memory. It features out-of-the-box clock speeds of 1124 MHz core, 1202 MHz GPU Boost, and 5.40 GHz (GDDR5-effective) memory. Based on the 28 nm GM107 silicon, the GTX 750 Ti features 640 CUDA cores based on the "Maxwell" architecture, and a 128-bit wide GDDR5 memory interface. ASUS didn't disclose pricing.
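Worth noting: doubling the capacity doesn't touch bandwidth, which is fixed by the bus width and data rate. A quick back-of-the-envelope check, using only the figures from the specs above:

```python
# Back-of-the-envelope memory bandwidth for the GTX 750 Ti Strix 4GB.
# Doubling capacity (2 GB -> 4 GB) does not change bandwidth; that is
# set by the bus width and the effective data rate.

bus_width_bits = 128          # 128-bit GDDR5 interface
effective_rate_gbps = 5.40    # 5.40 GHz GDDR5-effective (Gb/s per pin)

bandwidth_gb_s = bus_width_bits * effective_rate_gbps / 8  # bits -> bytes

print(f"{bandwidth_gb_s:.1f} GB/s")  # 86.4 GB/s, same as the 2 GB card
```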



View at TechPowerUp Main Site
 
Ehh, what is the point of this card? The GTX 750 Ti does not support SLI, and I don't think VRAM is this card's biggest bottleneck...

Yeah, not everyone is as educated about hardware as people who frequent tech sites. A lot of people will probably see this as a faster 750 Ti because it has double the VRAM.
 
They'll load a 750 Ti with 4 GB, but you can't get a 760 X2 with an effective 4 GB :shadedshu:
 
Does this even have the processing power to use 4 GB of VRAM? About all it would be good for is FNV or Skyrim with lots of texture mods.
 
I don't think it has the balls to heavily mod Skyrim like that.
 
It's for selling! It makes zero practical sense, but they can sell them to people who don't know the difference. The 750 Ti doesn't even need 2 GB.

I've seen lower-end NVIDIA cards with 4 GB (like some GT 640s, for instance), but I don't recall any AMD cards being that silly.
 
I don't see this making much sense; I can't really fathom a situation where a card at this performance point would need the extra 2 GB of VRAM, or even be able to use it properly. If this card supported SLI, then maybe I could say it's for people running two cards; otherwise, in the situations where more VRAM would be needed, the card isn't going to keep up anyway.
 
It's just a card for chasing fools, nothing else.

Just like the GT 640 4 GB.
 
:twitch: They would never... NVIDIA are as honest as nuns. Besides, they clearly have been working on more than color compression.

http://www.tomshardware.com/reviews/sapphire-vapor-x-r9-290x-8gb,3977.html

The test is meant to stress VRAM, though you can't tell whether any gimping is going on.

It's no surprise to see the 290X pulling out into the lead at 4K, but I'm not sure the reviewer summed it up correctly.

His last page is titled "4K Gains and 1080 Parity". I see the 5% gain at 4K over the 970, but how is the 13% gain by the 970 over the 290X 8 GB "parity"?
 
Maybe it's parity with Thief throwing off the average, or just how well all three do at 1080p. I didn't really understand the test, or why there was no SLI/CrossFire, when I read it months ago, so I dismissed it; but I found it again yesterday and it says a bit more to me.
 
The slower the card, the more likely it is to carry a lot of RAM... they make 8 GB lower-end OEM cards from time to time.

In other news, a 4 GB 750 Ti has more usable memory than a 970... :roll:

The 750 Ti is around 50 W average, and the 960 about 110 W average. They need a 950 at around 75 W average.
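For anyone who missed the 970 saga behind that jab: the card advertises 4 GB, but only 3.5 GB sits on the full-speed partition, with the last 0.5 GB on a slower segment. A minimal sketch of the comparison (the capacities are the only real data here):

```python
# Why the joke lands: the GTX 970 advertises 4 GB, but only 3.5 GB
# sits on the full-speed partition; the last 0.5 GB is a slow segment.
# A plain 4 GB card keeps all of its memory at full speed.

cards = {
    "GTX 750 Ti Strix 4GB": {"total_gb": 4.0, "fast_gb": 4.0},
    "GTX 970":              {"total_gb": 4.0, "fast_gb": 3.5},
}

for name, mem in cards.items():
    slow = mem["total_gb"] - mem["fast_gb"]
    print(f"{name}: {mem['fast_gb']} GB full-speed, {slow} GB slow segment")
```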
 
I do see 1080p parity, especially at 60 Hz. I mean, if you've got a 780 (Ti), 290 (X), or 980/970, you're not really going to tell the difference without a benchmark, and even a 770 or 280X is going to hit home most of the time.

Apparently 660s and possibly even Titans have similar memory systems, so if that's true, of course there's no specific update for the 970; it's business as usual. That would also mean Microsoft has been approving the way the driver handles it for some time. It doesn't really mean Microsoft knew, but it means the driver gets the job done, most of the time anyway, and that's what they do: figure out when the driver could do a better job and fix it.

That's what it has boiled down to for me, and it's of course just opinion, so feel enlightened or hate it... whatever.

Meanwhile, things are looking up for AMD shareholders.
 
You know what would be a better card to upgrade to 4 GB of VRAM... a GTX 970 ;)
 
What for? :confused:
Why don't they make another MARS (960) with a true 4 GB of VRAM... that would be nice.
 
That's crazy talk, all that cost for a single-card SLI setup using 128-bit chips? Just get a single 980, which will never have any SLI problems.
 
Nice card; being Maxwell, its DX12 performance should give it some longevity too.
 
What is the RAM going to put the price up to? Near the price of a 960? Surely an AMD GPU with more horsepower could be had, and it's not like having a full DX12 GPU is going to matter for a long-ass time.
Guess someone has to look at it blindly and try to keep NV's rep up, since they don't seem to be too worried about it themselves.
 
There is also another issue that might present itself with DX12 and NVIDIA cards:

Toms Hardware said:
Note that the GeForce GTX 980's stress test power consumption is actually a few watts lower than the gaming result. This is likely due to throttling that kicks in when we hit the thermal ceiling.

TechPowerUp said:
Temperatures are good, but they are capped by NVIDIA's Boost 2.0 algorithm which is set to a 80°C temperature limit. Clocks will be reduced slightly if the card gets any hotter than that, although they will never drop below the base clock. Our graph further down on the page details this card's clock distribution.

Reference models are reaching 80-81°C on an open bench. DX12 is supposed to keep the GPU fed more consistently with work, so GPU Boost will have less opportunity to down-clock it in games, leading to higher voltage/temps and possibly throttling.
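To make the quoted behavior concrete, here's a toy model of a Boost 2.0-style clamp (not NVIDIA's actual algorithm; the 13 MHz step size is an assumption, while the clocks and temperature limit come from the article and the TPU quote above):

```python
# Toy model of the Boost 2.0 behavior described above (not NVIDIA's
# actual algorithm): above the 80 C limit the clock steps down,
# but never drops below the base clock.

BASE_CLOCK_MHZ = 1124   # Strix core (base) clock from the article
BOOST_CLOCK_MHZ = 1202  # rated GPU Boost clock from the article
TEMP_LIMIT_C = 80       # Boost 2.0 temperature limit per the TPU quote
STEP_MHZ = 13           # assumed size of one throttle step

def boosted_clock(temp_c: float) -> int:
    """Return the clock after thermal throttling, clamped to base."""
    if temp_c <= TEMP_LIMIT_C:
        return BOOST_CLOCK_MHZ
    steps_over = int(temp_c - TEMP_LIMIT_C)           # one step per degree over
    throttled = BOOST_CLOCK_MHZ - steps_over * STEP_MHZ
    return max(throttled, BASE_CLOCK_MHZ)             # never below base clock

for t in (75, 80, 81, 85, 95):
    print(f"{t} C -> {boosted_clock(t)} MHz")
```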
 