Nvidia (I call them nShitia; *sarcastically*: "I have an nShitia Crapforce GTCrapX 970si") LOL
F*cking Nvidia. I call them nShitia. I've hated Nvidia ever since they bought out and took over 3Dfx Interactive, Inc.
I loved 3Dfx graphics accelerators. I've seen people argue over Nvidia or ATI/AMD, or even older S3 graphics chips like the S3 ViRGE DX 4MB accelerator and the S3 Savage 3D; so Nvidia, ATI/AMD, SiS, Matrox, PowerVR, etc.
From Wikipedia, below:
In the PC world, notable failed first tries for low-cost 3D graphics chips were the S3 ViRGE, ATI Rage, and Matrox Mystique. These chips were essentially previous-generation 2D accelerators with 3D features bolted on. Many were even pin-compatible with the earlier-generation chips for ease of implementation and minimal cost. Initially, performance 3D graphics were possible only with discrete boards dedicated to accelerating 3D functions (and lacking 2D GUI acceleration entirely) such as the PowerVR and the 3Dfx Voodoo. However, as manufacturing technology continued to progress, video, 2D GUI acceleration and 3D functionality were all integrated into one chip. Rendition's Verite chipsets were among the first to do this well enough to be worthy of note. In 1997, Rendition went a step further by collaborating with Hercules and Fujitsu on a "Thriller Conspiracy" project which combined a Fujitsu FXG-1 Pinolite geometry processor with a Vérité V2200 core to create a graphics card with a full T&L engine years before Nvidia's GeForce 256. This card, designed to reduce the load placed upon the system's CPU, never made it to market.
OpenGL appeared in the early '90s as a professional graphics API, but originally suffered from performance issues which allowed the Glide API to step in and become a dominant force on the PC in the late '90s.[30] However, these issues were quickly overcome and the Glide API fell by the wayside. Software implementations of OpenGL were common during this time, although the influence of OpenGL eventually led to widespread hardware support. Over time, a parity emerged between features offered in hardware and those offered in OpenGL. DirectX became popular among Windows game developers during the late '90s. Unlike OpenGL, Microsoft insisted on providing strict one-to-one support of hardware. The approach made DirectX less popular as a standalone graphics API initially, since many GPUs provided their own specific features, which existing OpenGL applications were already able to benefit from, leaving DirectX often one generation behind. (See: Comparison of OpenGL and Direct3D.)
Over time, Microsoft began to work more closely with hardware developers, and started to target the releases of DirectX to coincide with those of the supporting graphics hardware. Direct3D 5.0 was the first version of the burgeoning API to gain widespread adoption in the gaming market, and it competed directly with many more-hardware-specific, often proprietary graphics libraries, while OpenGL maintained a strong following. Direct3D 7.0 introduced support for hardware-accelerated transform and lighting (T&L) for Direct3D, while OpenGL had this capability already exposed from its inception. 3D accelerator cards moved beyond being just simple rasterizers to add another significant hardware stage to the 3D rendering pipeline. The Nvidia GeForce 256 (also known as NV10) was the first consumer-level card released on the market with hardware-accelerated T&L, while professional 3D cards already had this capability. Hardware transform and lighting, both already existing features of OpenGL, came to consumer-level hardware in the '90s and set the precedent for later pixel shader and vertex shader units which were far more flexible and programmable.
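Since the wiki excerpt is talking about what a hardware T&L stage actually does, here's a minimal, self-contained C sketch of the per-vertex math involved: a 4x4 matrix transform followed by a simple directional (Lambertian) lighting term. Everything here (Vec3, transform, the identity matrix) is illustrative only, not any real driver or API code.

```c
#include <math.h>
#include <stdio.h>

typedef struct { float x, y, z; } Vec3;

static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

static Vec3 normalize(Vec3 v) {
    float len = sqrtf(dot(v, v));
    Vec3 n = { v.x/len, v.y/len, v.z/len };
    return n;
}

/* "Transform": multiply a point by a row-major 4x4 matrix (w assumed 1). */
static Vec3 transform(const float m[4][4], Vec3 p) {
    Vec3 r = {
        m[0][0]*p.x + m[0][1]*p.y + m[0][2]*p.z + m[0][3],
        m[1][0]*p.x + m[1][1]*p.y + m[1][2]*p.z + m[1][3],
        m[2][0]*p.x + m[2][1]*p.y + m[2][2]*p.z + m[2][3],
    };
    return r;
}

int main(void) {
    const float identity[4][4] = {
        {1,0,0,0},{0,1,0,0},{0,0,1,0},{0,0,0,1}
    };
    Vec3 pos    = { 1.0f, 2.0f, 3.0f };
    Vec3 normal = { 0.0f, 1.0f, 0.0f };   /* assumed already in world space */
    Vec3 light  = normalize((Vec3){ 0.0f, 1.0f, 1.0f });

    Vec3 world = transform(identity, pos);           /* the "T" */
    float diffuse = fmaxf(0.0f, dot(normal, light)); /* the "L" */

    printf("world=(%.2f, %.2f, %.2f) diffuse=%.2f\n",
           world.x, world.y, world.z, diffuse);
    return 0;
}
```

Before the GeForce 256 (and Direct3D 7.0), the CPU had to grind through that math for every vertex of every frame; hardware T&L moved it onto the card.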
I'm even building an old AMD K7 Athlon Thunderbird (T-Bird) 750MHz Slot A system with a 3Dfx Voodoo 5 5500 64MB AGP 2X and two (2x) 3Dfx Voodoo2 2000 PCI cards in SLI.
If I'm wrong about some of this, I'm at least right that 3Dfx and ATI Technologies were the ones who invented the dual-GPU / dual-card setup. ATI's Rage Fury MAXX was the first ATI graphics card with multiple (2x) GPU chips on a single card, while 3Dfx Interactive, Inc. was the first 3D graphics accelerator company to design a dual-card system for increasing 3D acceleration performance: SLI technology, invented and developed by 3Dfx Interactive. SLI stands for **Scan Line Interleave**, and it connected two 3Dfx Voodoo2s (either the 8MB or 12MB Voodoo2 3D accelerators) to increase 3D performance.
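To show what Scan Line Interleave actually means, here's a rough C sketch of the idea: with two cards, each one handles every other scan line of the frame, roughly halving each card's rasterization work. The render_scanline function is hypothetical; real Voodoo2 SLI did this in hardware, synchronized over a ribbon cable between the cards.

```c
#include <stdio.h>

#define FRAME_HEIGHT 480
#define NUM_CARDS    2

/* Hypothetical stand-in for one card rasterizing one scan line. */
static void render_scanline(int card, int y) {
    printf("card %d renders scan line %d\n", card, y);
}

int main(void) {
    /* Interleave: card 0 takes the even lines, card 1 the odd
     * lines, so each card fills only half the frame. */
    for (int y = 0; y < FRAME_HEIGHT; y++) {
        render_scanline(y % NUM_CARDS, y);
    }
    return 0;
}
```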
3Dfx then eventually produced the Voodoo 5 5500 64MB (32MB of SGRAM or DDR per VSA-100 GPU), which came in a PCI version but ultimately got an AGP version as well; I own the AGP version. 3Dfx Interactive's last 3D accelerator was, or would have been, the Voodoo 5 6000 AGP, which would have had...
...four 3Dfx VSA-100 GPU chips on a single card. But due to needing an external power supply, and (I believe) a lot of other problems 3Dfx was having with the Voodoo 5 6000, it was never released.
Not only would it have had 4x VSA-100 GPUs, it would have had 32MB of SGRAM or first-gen DDR VRAM per GPU: the Voodoo 5 5500 had 64MB for its two VSA-100 GPUs, so the Voodoo 5 6000 would have had 128MB of VRAM for all four GPUs, and would probably have been the first 3D accelerator with 128MB of VRAM, even though the four GPUs ran in SLI. I'd still love to have a Voodoo 5 6000, just for the hell of it, whether the card worked or not. But the underlying SLI / multi-GPU idea was not invented by ATI/AMD or Nvidia.
Yeah, they invented their own versions: ATI/AMD called their form of SLI CrossFire / CrossFire X, and nShitia's (Nvidia's) SLI is just called SLI, which they got when they bought out 3Dfx Interactive. That's what I despised Nvidia for in the first place, years ago, back in the early 2000s (1999-2002). And all Nvidia used was the SLI tech; they completely discarded 3Dfx's Glide API (glide2x/glide3x), which had its problems, but I still loved the quality and performance I got from the Glide API and 3Dfx Voodoo cards. My first 3Dfx card was a Creative Labs Voodoo Banshee, which was the first 3Dfx 3D hardware accelerator that also had 2D hardware acceleration on a single GPU die. The Voodoo Banshee was pretty much a Voodoo2: the Voodoo2's single pixel-pipeline chip plus one of its two texel-unit chips, all built into a single 3Dfx GPU chip with full 2D hardware acceleration, clocked higher than the Voodoo2's original GPU clocks. The Banshee, I believe, ran at 100MHz (I can't remember exactly, maybe 166MHz, but I think it was 100MHz), while the Voodoo2's pixel/texel chips ran at 90MHz, so the Banshee was an excellently improved Voodoo2 without the need for a separate 2D accelerator and a pass-through VGA cable.
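For anyone who never got to use Glide, a bare-bones Glide 2.x program looked roughly like this. This is a sketch from memory of the Glide 2.x SDK; treat the exact signatures and constants as assumptions and check the SDK's glide.h if you actually want to build against it.

```c
#include <glide.h>

int main(void)
{
    /* Initialize Glide and select the first Voodoo board. */
    grGlideInit();
    grSstSelect(0);

    /* Open a 640x480 fullscreen context: double-buffered color
     * plus one aux (depth) buffer. (Signature per the Glide 2.x
     * SDK, from memory.) */
    grSstWinOpen(0, GR_RESOLUTION_640x480, GR_REFRESH_60Hz,
                 GR_COLORFORMAT_ARGB, GR_ORIGIN_UPPER_LEFT, 2, 1);

    /* Clear to black and present; triangle drawing would go
     * between the clear and the swap via grDrawTriangle(). */
    grBufferClear(0x00000000, 0, GR_WDEPTHVALUE_FARTHEST);
    grBufferSwap(1);

    grGlideShutdown();
    return 0;
}
```

Proprietary and Voodoo-only, sure, but that directness is exactly why Glide games of the era ran so well.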
But my point is, I f*cking despise Nvidia, and I'll always call them nShitia. Any other Nvidia haters out there like my code name for Nvidia, nShitia?
And I'm not an AMD/ATI fanboy; AMD products are just the only hardware I can afford. They're way cheaper than Intel or nShitia hardware and provide performance that's good enough for the games I play. If I need more performance, I do a slight overclock on my FX-8150 CPU and my Radeon R9 290 GPU. Right now my FX-8150 is clocked at 4.0GHz, with no Turbo Core and all four core modules enabled for eight cores (two cores per module), which gives good performance. AMD just needs to finally outperform Intel CPUs with its Zen cores; I can't wait to try out an AMD Zen FX processor. Can't we buy engineering samples of AMD Zen, damnit... OK, and peace out.