eidairaman1
The Exiled Airman
I see you just like being argumentative.
Anyway, this is a thread about NVIDIA pulling 32-bit support, and their GeForce GPUs are indeed used mostly for gaming, so my comment is squarely in context; it seems you hadn't noticed.
My "can create" argument isn't weak at all. Sorry. How about I rephrase it: it does create compatibility problems in some instances, hence my sentence. Do let me know if I didn't dot an I or cross a T somewhere in this post, won't you?
Just put the troll on ignore.