It's not that easy. Over the past few years there were so many variations of the 730 that you could find a 1GB DDR5 Fermi or a 1GB DDR3 Kepler, or a Fermi card with up to a 4GB framebuffer in both DDR3 and DDR5 variants.

Again, nVidia made the specs clear enough that you can use just the memory amount and type to determine the card. Just look at your own screenshot. Notice how nVidia lists only one memory size for each of the three versions? DDR3 1GB = Fermi; that is the only one of the three versions with 1GB of DDR3. DDR3 2GB = Kepler; that is the only one of the three with 2GB of DDR3. And finally, DDR5 1GB = the better Kepler; that is the only one of the three with 1GB of DDR5. Those are the specs nVidia put out, and they make it possible to tell which version you are getting just by looking at the front of the box.

The problem is that the AIBs didn't stick to nVidia's specs and released cards with different memory amounts. That is an issue, and it's why releasing three versions of the GT 730 was bad. But AMD's RX 560 was a little worse, because AMD made no effort to provide any way to distinguish which version of the card you are getting just from the front of the box.
Like this puppy:
https://www.techpowerup.com/vgabios/188957/188957
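For what it's worth, the memory-to-variant rule from nVidia's reference chart is simple enough to write out as a lookup. This is just a sketch to make the point concrete; the function name, the table, and the "off-spec" fallback are mine, not any official tool, and the architecture labels follow the reference specs discussed above:

```python
# Hypothetical lookup based on nVidia's reference GT 730 specs discussed above.
# Keys are (memory type, size in GB) as printed on the box; values are the GPU variant.
GT730_VARIANTS = {
    ("DDR3", 1): "Fermi (GF108)",
    ("DDR3", 2): "Kepler (GK208)",
    ("GDDR5", 1): "Kepler (GK208, GDDR5)",
}

def identify_gt730(mem_type: str, size_gb: int) -> str:
    """Return the reference variant, or flag an off-spec AIB configuration."""
    return GT730_VARIANTS.get(
        (mem_type.upper(), size_gb),
        "off-spec AIB configuration -- check the BIOS or GPU-Z",
    )

print(identify_gt730("DDR3", 2))   # Kepler (GK208)
print(identify_gt730("DDR3", 4))   # off-spec AIB configuration (like the card linked above)
```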