AshenSugar
Ok, to start: I'm posting this to clear up some misconceptions/FUD I've seen people posting about the FX cards "emulating DX9" and why they sucked.
http://techreport.com/news_reply.x/4782/4/
it's got 2 useful links near the top of the comments.
from Wikipedia: http://en.wikipedia.org/wiki/GeForce_FX
haxxxx!!!!
see the thumb for the specs of these cards; I'll also link the specs of ATI's R300 core
http://en.wikipedia.org/wiki/Radeon_R300
the thumb is of the R300 range cards' specs, cards that totally stomp the NVIDIA equivalents!!!
Specifications
NVIDIA's GeForce FX series is the fifth generation in the GeForce line. With the GeForce 3, NVIDIA introduced programmable shader units into their 3D rendering capabilities, in line with Microsoft's release of DirectX 8.0, and the GeForce 4 Ti was an optimized version of the GeForce 3. With real-time 3D graphics technology continually advancing, the release of DirectX 9.0 ushered in a further refinement of programmable pipeline technology with the arrival of Shader Model 2.0. The GeForce FX series brings to the table NVIDIA's first generation of Shader Model 2 hardware support.
The FX features DDR, DDR2 or GDDR-3 memory, a 130 nm fabrication process, and Shader Model 2.0/2.0A compliant vertex and pixel shaders. The FX series is fully compliant and compatible with DirectX 9.0b. The GeForce FX also included an improved VPE (Video Processing Engine), which was first deployed in the GeForce4 MX. Its main upgrade was per-pixel video deinterlacing, a feature first offered in ATI's Radeon but seeing little use until the maturation of Microsoft's DirectX-VA and VMR (video mixing renderer) APIs. Among other features was an improved anisotropic filtering algorithm which was not angle-dependent (unlike its competitor, the Radeon 9700/9800 series) and offered better quality, but affected performance somewhat. Though NVIDIA reduced the filtering quality in the drivers for a while, the company eventually restored it, and this feature remains one of the high points of the GeForce FX family to date (however, this method of anisotropic filtering was dropped by NVIDIA with the GeForce 6 series for performance reasons).
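As an aside, here's a tiny Python toy of what "angle-dependent" anisotropic filtering means. This is a schematic model I made up purely to illustrate the idea, not either vendor's actual formula; the function names and the linear 45-degree falloff are assumptions:

```python
# Schematic toy of "angle-dependent" anisotropic filtering (the approach the
# text attributes to the Radeon 9700/9800) versus the angle-independent
# approach it attributes to the GeForce FX. NOT either vendor's real formula:
# the idea is simply that full anisotropy is only applied to surfaces whose
# orientation is near the horizontal/vertical axes, which saves texture
# samples at some quality cost on diagonal surfaces.

def angle_dependent_aniso(angle_deg, max_aniso=16):
    # how far the surface orientation is from a 45-degree diagonal
    dist_from_45 = abs(((angle_deg % 90) + 90) % 90 - 45)
    scale = dist_from_45 / 45.0          # 1.0 when axis-aligned, 0.0 at 45 deg
    return max(2, round(max_aniso * scale))

def angle_independent_aniso(angle_deg, max_aniso=16):
    return max_aniso                     # same degree regardless of angle

for a in (0, 22.5, 45, 67.5, 90):
    print(a, angle_dependent_aniso(a), angle_independent_aniso(a))
```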
Disappointment
Analysis of the hardware
Hardware enthusiasts saw the GeForce FX series as a disappointment as it did not live up to expectations. NVIDIA had aggressively hyped the card up throughout the summer and autumn of 2002, to combat ATI Technologies' autumn release of the powerful Radeon 9700. ATI's very successful Shader Model 2 card had arrived several months earlier than NVIDIA's first NV30 board, the GeForce FX 5800.
When the FX 5800 finally launched, hardware analysts' testing and research revealed that the NV30 was no match for the Radeon 9700's R300 core, especially where pixel shading was involved. Additionally, the 5800 had roughly a 30% memory bandwidth deficit caused by its comparatively narrow 128-bit memory bus (ATI and other companies had moved to 256-bit). NVIDIA had planned to compensate with the new, state-of-the-art GDDR-2, chosen for its support of much higher clock rates, but it couldn't clock high enough to make up for the bandwidth of a 256-bit bus.
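To put a rough number on that bandwidth gap, here's a quick back-of-the-envelope calculation in Python. The clocks below are the commonly cited launch clocks for the top parts and should be treated as assumptions; the exact size of the deficit depends on which SKUs you compare.

```python
# Rough peak-memory-bandwidth comparison. Clocks are the commonly cited
# launch figures (assumptions), expressed as effective data rate in MT/s.
def bandwidth_gbs(bus_width_bits, effective_mts):
    return bus_width_bits / 8 * effective_mts * 1e6 / 1e9   # bytes/s -> GB/s

fx5800_ultra   = bandwidth_gbs(128, 1000)  # 128-bit bus, 500 MHz GDDR-2 (1000 MT/s)
radeon_9700pro = bandwidth_gbs(256, 620)   # 256-bit bus, 310 MHz DDR (620 MT/s)

print(f"FX 5800 Ultra  : {fx5800_ultra:.1f} GB/s")    # ~16.0 GB/s
print(f"Radeon 9700 Pro: {radeon_9700pro:.1f} GB/s")  # ~19.8 GB/s
print(f"deficit: {(1 - fx5800_ultra / radeon_9700pro) * 100:.0f}%")
```

With those assumed clocks the Ultra parts come out roughly 20% apart; compare the slower non-Ultra 5800 instead and the gap widens toward the "roughly 30%" figure above.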
While the NV30's direct competitor, the R300 core, was capable of 8 pixels per clock with its 8 pipelines, the NV30 was an 8 pipeline chip that could only render 4 color pixels per clock. This dramatically limited its pixel fillrate in the majority of game titles. However, in games with large use of stencil shadows, such as Doom3, NV30 could perform 8 pixels per clock on the shadow rendering pass. This did help its performance in this relatively rare rendering situation. Fortunately NVIDIA's use of 130 nm manufacturing technology allowed them to clock the GPU rather highly compared to ATI's 150 nm R300. This allowed NVIDIA to close the gap somewhat. Still, the fact that ATI's solution was more architecturally effective across the board caused the FX 5800 to remain well behind the older Radeon 9700 in many situations.
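Same idea for pixel fillrate: peak colour fillrate is just pixels-per-clock times core clock, so the 4-pixel limit hurt even though the NV30 was clocked higher (again, the clocks are assumptions based on commonly cited launch values):

```python
# Peak fillrate = pixels written per clock * core clock (MHz).
# Clocks are commonly cited launch values (assumptions).
def fillrate_mpix(pixels_per_clock, core_mhz):
    return pixels_per_clock * core_mhz

print("FX 5800 Ultra colour pass  :", fillrate_mpix(4, 500), "Mpixels/s")  # 2000
print("Radeon 9700 Pro colour pass:", fillrate_mpix(8, 325), "Mpixels/s")  # 2600
print("FX 5800 Ultra stencil-only :", fillrate_mpix(8, 500), "Mops/s")     # 4000 (the Doom3 case)
```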
The initial version of the GeForce FX (the 5800) was one of the first cards to come equipped with a large dual-slot cooling solution. Called "Flow FX", the cooler was strikingly large compared to ATI's small cooler on the 9700 series. It was also very loud and drew complaints from gamers and developers alike; it was even jokingly nicknamed the 'Dustbuster', and loud graphics cards are still often compared to the GeForce FX 5800 for this reason.
With regard to the much-touted Shader Model 2 capabilities of the NV3x series and the related marketing point of the chip's "cinematic effects" capabilities, the actual performance was shockingly poor. A combination of unfortunate factors hampered how well the NV3x could perform these calculations.
Firstly, the chips were designed for a mixed-precision fragment (pixel) programming methodology: 48-bit fixed-point ("FX12") precision and, to a lesser extent, 64-bit "FP16" for situations where high-precision math was unnecessary to maintain image quality, with the 128-bit "FP32" mode used only when absolutely necessary. The R300-based cards from ATI did not benefit from partial precision in any way because these chips were designed purely around Direct3D 9's required minimum of 96-bit FP24 for full-precision pixel shaders. For a game title to use FP16, the programmer had to specify which pixel shader instructions used the lower precision by placing "hints" in the shader code. Because ATI didn't benefit from the lower precision and the R300 performed far better on shaders overall, and because it took significant effort to set up pixel shaders to work well with the lower-precision calculations, the NV3x hardware usually ended up running full-precision pixel shaders all the time, which crippled its performance.
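A quick NumPy sketch (not actual shader code) of why partial precision was only safe in some places: for colour-range math the FP16 rounding error is basically invisible, but anything involving large values plus small offsets (texture coordinates, long chains of math) falls apart in FP16, which is exactly why the programmer had to hint it per instruction:

```python
import numpy as np

# Colour-style math, values in [0, 1]: FP16 rounding error is tiny here.
a, b = np.float32(0.5137), np.float32(0.2745)
blend32 = a * np.float32(0.7) + b * np.float32(0.3)
blend16 = np.float16(a) * np.float16(0.7) + np.float16(b) * np.float16(0.3)
print(float(blend32), float(blend16))   # differ by only a few 1e-4, invisible as a colour

# Coordinate-style math: FP16 has a 10-bit mantissa, so around 2048 its
# spacing is 2.0 and a 0.25 offset simply vanishes.
print(float(np.float32(2048.0) + np.float32(0.25)))   # 2048.25
print(float(np.float16(2048.0) + np.float16(0.25)))   # 2048.0
```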
Additionally, the NV30, NV31, and NV34 were handicapped because they contained a mixture of DirectX 7 fixed-function T&L units, DirectX 8 integer pixel shaders, and DirectX 9 floating-point pixel shaders. The R300 chips instead emulated the older functionality on their pure Shader Model 2 hardware, allowing ATI to devote far more of the same transistor budget to SM2 performance. For NVIDIA, with its mixture of hardware, only a portion of the chip could perform SM2 math, resulting in sub-optimal performance for pure SM2 code.
The NV3x chips used a processor architecture that relied heavily on the effectiveness of the video card driver's shader compiler. Proper ordering of shader code could dramatically boost the chip's shader computational efficiency. Compiler development is a long and difficult task, and this was a major challenge that NVIDIA tried to overcome during most of the NV3x's lifetime. NVIDIA released several guidelines for creating GeForce FX-optimized code and worked with Microsoft to create a special shader model called "Shader Model 2.0A", which generated optimal code for the GeForce FX. NVIDIA also, controversially, would rewrite game shader code and force the game to use their shader code instead. NVIDIA engineers could tailor the arithmetic structure of the code and adjust its precision so it would run optimally on their hardware, but such code often brought with it lower final image quality.
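To make the "ordering matters" point concrete, here's a toy Python sketch of the kind of thing a driver shader compiler does: interleaving texture fetches with independent arithmetic so the fetch latency gets hidden. It's my own throwaway illustration (it even ignores data dependencies), not anything resembling NVIDIA's actual compiler:

```python
# Toy reordering pass: given a straight-line shader as (kind, text) tuples,
# interleave texture fetches ('tex') with arithmetic ('alu') so math can
# execute while a fetch is in flight. Real compilers must respect data
# dependencies and register limits; this sketch deliberately ignores both.
def interleave(ops):
    tex = [op for op in ops if op[0] == "tex"]
    alu = [op for op in ops if op[0] == "alu"]
    scheduled = []
    while tex or alu:
        if tex:
            scheduled.append(tex.pop(0))   # start a fetch early
        if alu:
            scheduled.append(alu.pop(0))   # cover its latency with math
    return scheduled

naive_order = [("tex", "t0"), ("tex", "t1"), ("alu", "mul r0"), ("alu", "mad r1")]
print(interleave(naive_order))
# [('tex', 't0'), ('alu', 'mul r0'), ('tex', 't1'), ('alu', 'mad r1')]
```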
Still, it was later found that even with the use of partial precision, Shader Model 2.0A, and shader code replacements, the GeForce FX's performance in shader-heavy applications trailed behind the competition. The GeForce FX remained competitive in OpenGL applications, which can be attributed both to most OpenGL applications using manufacturer-specific extensions and to NVIDIA's OpenGL drivers still being generally superior to the competition's at that point in time.
The FX series was a moderate success, but because of its delayed introduction and its flaws, NVIDIA ceded market leadership to ATI's Radeon 9700. Due to market demand and the FX's failure to serve as a worthy successor, NVIDIA extended the production life of the aging GeForce 4, keeping both the FX and the 4 series in production for some time.
Valve's presentation
Soon after the introduction of the GeForce FX, synthetic benchmarks (such as 3DMark03) revealed potential weak points in its PS 2.0 shaders. But outside of the developer community and tech-savvy computer gamers, few mainstream users were aware of such issues. Then in late 2003, Valve Corporation (developer of the popular Half-Life PC game series) presented a series of in-house benchmarks pitting the GeForce FX against the Radeon R300. Based on a pre-release build of the highly anticipated Half-Life 2, Valve's game-engine benchmarks placed NVIDIA's FX product line a full generation behind ATI's R300 product line. In Shader Model 2.0-enabled game levels, NVIDIA's top-of-the-line FX 5900 Ultra performed about as fast as ATI's mainstream Radeon 9600, which cost a third as much as the NVIDIA card.
Valve had initially planned to pursue partial floating-point precision (FP16) optimizations specifically for the FX family. The optimizations required detailed case-by-case analysis of each of many visual-effect shader routines, yet offered limited benefit; only the small fraction of gamers with GeForce FX 5700/5900 cards could reasonably expect playable frame rates. ATI's R300 cards did not need and did not benefit at all from the optimizations, and the substantial majority of gamers (with DirectX 8 hardware) could not use any DirectX 9 game effects regardless. Based on this assessment, Valve programmed Half-Life 2 to default to DirectX 8 shaders on all GeForce FX hardware, thereby sidestepping the GeForce FX's poor PS 2.0 performance.
Players could tweak a game configuration file to force Half-Life 2 to run in DirectX 9 mode, but doing so on NV3x cards resulted in a significant loss of performance, with the top-of-the-line models (FX 5900 and FX 5950) performing comparably to ATI's entry-level Radeon 9600. An unofficial fan patch later allowed GeForce FX owners to comfortably play the game in DirectX 9 mode by selectively replacing the original shaders with optimized routines, giving improved visuals at the cost of some speed.
haxxxx!!!!
more haxxxx!!!!!!!
Questionable tactics
NVIDIA's GeForce FX era was one of great controversy for the company. The competition had soundly beaten them on the technological front and the only way to get the FX chips competitive with the Radeon R300 chips was to optimize the drivers to the extreme.
This took several forms. NVIDIA has historically been known for impressive OpenGL driver performance and quality, and the FX series certainly maintained this. However, when it came to image quality in both Direct3D and OpenGL, they aggressively began applying questionable optimization techniques not seen before. They started with filtering optimizations, changing how trilinear filtering operated on game textures and visibly reducing its accuracy, and thus its quality. Anisotropic filtering also saw dramatic tweaks to limit its use on as many textures as possible to save memory bandwidth and fillrate. Tweaks to these types of texture filtering can often be spotted in games as a shimmering phenomenon on floor textures as the player moves through the environment (often signifying poor transitions between mip-maps). Changing the driver settings to "High Quality" can alleviate this at the cost of performance.
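Here's a little Python model of the trilinear tweak being described (people called it "brilinear" at the time). Real trilinear blends two mip levels across the whole fractional-LOD range; the optimized version only blends inside a narrow window around the transition and does plain bilinear everywhere else, which saves samples but makes mip boundaries visible. The 0.25 window size is an arbitrary number I picked for illustration, not anything NVIDIA published:

```python
# Blend weight between two adjacent mip levels as a function of the
# fractional LOD. Full trilinear blends smoothly across the whole range;
# the "optimized" version clamps to a single mip outside a narrow window.
def trilinear_weight(lod_frac):
    return lod_frac                              # smooth blend everywhere

def brilinear_weight(lod_frac, window=0.25):     # window size is illustrative
    lo, hi = 0.5 - window / 2, 0.5 + window / 2
    if lod_frac <= lo:
        return 0.0                               # pure bilinear, lower mip
    if lod_frac >= hi:
        return 1.0                               # pure bilinear, upper mip
    return (lod_frac - lo) / (hi - lo)           # blend only near the boundary

for f in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(f, trilinear_weight(f), round(brilinear_weight(f), 2))
```

That abrupt cutover between mip levels is what shows up as the shimmering and banding on floor textures mentioned above.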
NVIDIA also began to clandestinely replace pixel shader code in software with hand-coded, lower-accuracy optimized versions by detecting which program was being run. These "tweaks" were especially noticed in benchmark software from Futuremark. In 3DMark03 it was found that NVIDIA had gone to extremes to limit the complexity of the scenes through driver shader swaps and aggressive hacks that prevented parts of the scene from rendering at all, artificially boosting the scores the FX series received. Side-by-side analysis of screenshots in games and 3DMark03 showed vast differences between what a Radeon 9800/9700 displayed and what the FX series was doing. NVIDIA also publicly attacked the usefulness of these programs and the techniques used within them in order to undermine their influence on consumers.
In essence, NVIDIA programmed their driver to look for specific software and apply aggressive optimizations tailored to the limitations of the NV3x hardware. Upon discovery of these tweaks there was a very vocal uproar from the enthusiast community and from several popular hardware analysis websites. Unfortunately, disabling most of these optimizations showed that NVIDIA's hardware was simply incapable of rendering the scenes at a level of detail similar to what ATI's hardware was displaying. So most of the optimizations stayed, except in 3DMark, where Futuremark began updating its software and screening driver releases for hacks.
Both NVIDIA and ATI have historically been guilty of optimizing drivers like this. However, NVIDIA went to a new extreme with the FX series. Both companies still optimize their drivers for specific applications today (2006), but a tight rein is kept on the results of these optimizations by a now more educated and aware user community.
The 9600 256MB I had stomped the 5800 Ultra I had at EVERY SINGLE THING, and it was around 1/3-1/4 the price!!!!
Competitive response
By early 2003, ATI had captured a considerable chunk of the high-end graphics market and their popular Radeon 9600 was dominating the mid-high performance segment as well. In the meantime, NVIDIA introduced the mid-range 5600 and low-end 5200 models to address the mainstream market. With conventional single-slot cooling and a more affordable price-tag, the 5600 had respectable performance but failed to measure up to its direct competitor, Radeon 9600. As a matter of fact, the mid-range GeForce FX parts did not even advance performance over the chips they were designed to replace, the GeForce 4 Ti and MX440. In DirectX 8 applications, the 5600 lost to or matched the Ti 4200. Likewise, the entry-level FX 5200 did not perform as well as the DirectX 7.0 generation GeForce 4 MX440, despite the FX 5200 possessing a far better 'checkbox' feature-set. FX 5200 was easily matched in value by ATI's older R200-based Radeon 9000-9250 series and outperformed by the even older Radeon 8500.
With the launch of the GeForce FX 5900, NVIDIA fixed many of the problems of the 5800. While the 5800 used fast but hot and expensive GDDR-2 and had a 128-bit memory bus, the 5900 reverted to the slower and cheaper DDR, but it more than made up for it with a wider 256-bit memory bus. The 5900 performed somewhat better than the Radeon 9800 in everything not heavily using shaders, and had a quieter cooling system than the 5800, but most cards based on the 5900 still occupied two slots (the Radeon 9700 and 9800 were both single-slot cards). By mid-2003, ATI's top product (Radeon 9800) was outselling NVIDIA's top-line FX 5900, perhaps the first time that ATI had been able to displace NVIDIA's position as market leader.
NVIDIA later attacked ATI's mid-range card, the Radeon 9600, with the GeForce FX 5700 and 5900XT. The 5700 was a new chip sharing the architectural improvements found in the 5900's NV35 core. The FX 5700's use of GDDR-2 memory kept product prices expensive, leading NVIDIA to introduce the FX 5900XT. The 5900XT was identical to the 5900, but was clocked slower, and used slower memory.
The final GeForce FX model released was the 5950 Ultra, which was a 5900 Ultra with higher clock speeds. This model did not prove particularly popular, as it was not much faster than the 5900 Ultra yet commanded a considerable price premium over it. The board was fairly competitive with the Radeon 9800XT, again as long as pixel shaders were lightly used.
pwned again!!!!
Windows Vista and GeForce FX PCI cards
Although ATI's competing cards clearly surpassed the GeForce FX series in the eyes of many gamers, NVIDIA may regain some market share with the release of Windows Vista, which requires DirectX 9 for its signature Windows Aero interface. Many users with systems that have an integrated graphics processor (IGP) but no AGP or PCIe slots, and that are otherwise powerful enough for Vista, may demand DirectX 9 PCI video cards for Vista upgrades, though the size of this niche market is unknown.
To date, the most common such cards use GeForce FX-series chips; most use the FX 5200, but some use the FX 5500 (a slightly overclocked 5200) or the FX 5700 LE (which has similar speeds to the 5200 but a few more pixel pipelines). For some time, the only other PCI cards that were Aero-capable were two GeForce 6200 PCI cards made by BFG Technologies and its 3D Fuzion division. The XGI Technology Volari V3XT was also DirectX 9 on PCI, but with XGI's exit from the graphics card business in early 2006, it is apparently not being supported in Vista as of RTM. [1]
For a long time, ATI's PCI line-up was limited to the Radeon R200-based Radeon 9000, 9200, and 9250 cards, which are not capable of running Aero because of their DirectX 8.1 lineage. Indeed, ATI may have helped assure NVIDIA's initial dominance of the DirectX 9-on-PCI niche by buying XGI's graphics card assets [2]. However, in June 2006 a Radeon X1300-based PCI card was spotted in Japan [3], so it now appears ATI will try to contest the GeForce FX's dominance of this niche. Nonetheless, ATI's deployment of a later-generation GPU in what is likely to be a low-end, non-gamer niche may still leave NVIDIA with the majority of units sold.