Wednesday, January 27th 2016

NVIDIA Unveils GeForce 361.75 WHQL Game Ready Drivers

Following a month without any driver releases (the last one being dated December 21), NVIDIA released its latest GeForce drivers. The version 361.75 WHQL drivers are "Game Ready" for Rise of the Tomb Raider and Tom Clancy's The Division (beta), which includes performance optimizations, SLI profiles, and GeForce Experience optimal settings. The drivers also come with early (beta) support for GeForce GTX GPUs installed in external graphics solutions over the Thunderbolt 3 (40 Gb/s) interface. Grab them from the links below.
DOWNLOAD: NVIDIA GeForce 361.75 WHQL for Windows 10 64-bit | Windows 10 32-bit | Windows 8.1/7 64-bit | Windows 8.1/7 32-bit

50 Comments on NVIDIA Unveils GeForce 361.75 WHQL Game Ready Drivers

#26
erocker
*
Ferrum MasterOkay... who's the first one :nutkick:

Reporting in while I'm at work.
Everything installed normally with no issue. The few games I play are working as they should.
#27
BoyGenius
EroticusPureHair is TressFX, right?
Yes, it uses the TressFX 3.0 engine.
okidnaGPU hot swap, what a nice feature. "Need a quick FPS boost? Time to swap this puny 970 with dual 980 Tis" :D

On topic, I only found the "eject" issue under Windows 10; no problem at all with my Windows 7.
Eject it. :D

Everything seems to be working fine here.
#28
RejZoR
DivergeThere's no need to eject USB drives since Windows 7 and beyond (maybe it was XP SP2, I forget); delayed writes are disabled on removable drives.
You're missing the point entirely. And yes, I do prefer to safely remove some devices, like a portable HDD, because it spins down before I unplug it; it doesn't like being unplugged while it's spinning. Not to mention the USB icon was an indication that I had something plugged into a USB port. Now this shit is there the entire time, so I either have to look at the USB ports to see if anything is in them or click the damn thing to see the list. Stupid.
#29
semantics
RejZoR"Eject GTX 980". WHY!? WHY NVIDIA!? I don't want an option to eject the goddamn ONLY and DEDICATED graphics card in my system. Not only will I have to watch this goddamn icon down there the entire time, I'll also eject my only graphics card by mistake when I want to eject a goddamn USB drive, and lose the image. This is just lazy and stupid.
This is part of the new BETA support for external GPUs; however, this message will be removed in the next driver.
forums.geforce.com/default/topic/912985/geforce-drivers/official-361-75-game-ready-whql-display-driver-feedback-thread-1-27-16-/post/4791013/#4791013
#30
looniam
Never had an idle problem, I'm not getting either game, and I'm back on W7, so is there any reason to grab this?

Like a performance increase in MGSV:TPP... oh wait, that's fine already...
#31
Fluffmeister
looniamNever had an idle problem, I'm not getting either game, and I'm back on W7, so is there any reason to grab this?

Like a performance increase in MGSV:TPP... oh wait, that's fine already...
You appear to have answered your own question.

Pesky nVidia and their superb day-one support! *shakes fist*
#32
Mistral
RejZoRNVIDIA, are you fucking kidding me?

"Eject GTX 980". WHY!? WHY NVIDIA!?
They want you to buy a Titan X.
#33
Xzibit

Stick with Win7 and a Titan X

PCWorld - Rise of the Tomb Raider (PC) review impressions: Gorgeous game, ugly stutter
PCWorldUPDATE, 2:00 PM Pacific: I’ve installed Nvidia’s Game Ready Drivers and it helped a bit, but didn’t completely eliminate the stuttering. The Geothermal Valley continues to display precipitous drops in frame rate, falling from around 100 down to 55. Tweaking the Level of Detail down a notch gained me back five frames (bringing it to a slightly-stuttery 60), but be aware that even a high-powered rig might show quite a bit of slowdown in these larger areas regardless of how the rest of the game runs.
PCGamer - Rise of the Tomb Raider review
PCGamerWhile my 970 GTX couldn't keep up with the demands of running every option at maximum—dropping to a stutter during cutscenes and set pieces—a few sensible reductions had it running smoothly and consistently at 60 frames per second.
#35
Ferrum Master
RCoonIt runs beautifully on AMD GPUs. Doesn't have CrossFire support, but a 290X is bang on for max settings at 1080p.
Is it just me, or does the hair look like a sponge?

And is it true that AMD cards have snow here and NVIDIA cards don't?
#36
RCoon
Ferrum MasterIs it just me, or does the hair look like a sponge?

And is it true that AMD cards have snow here and NVIDIA cards don't?
Hair looks great for me; I have the hair effects on, but not set to NVIDIA processing. As for snow, I have no idea, I have snow :D
#37
NightOfChrist
I have a GTX 980 and I can report no problems.
I only installed the driver and PhysX, without 3D, audio, and Experience.
I do not have Rise of the Tomb Raider or The Division, but Witcher 3 and Fallout 4 run well.

But I have not purchased Rise of the Tomb Raider yet. Should I buy it now or wait until a later date? I am interested in playing the game, but I have doubts. @RCoon? @rtwjunkie? Can anybody give me some advice?
#38
rtwjunkie
PC Gaming Enthusiast
NightOfChristBut I have not purchased Rise of the Tomb Raider yet. Should I buy it now or wait until a later date?
I think @RCoon's review of it is today in a few hours, or Friday. You could wait till then.
#39
RejZoR
If that hot-swapping nonsense is a BETA feature, what the hell is it doing in the WHQL driver!?
#40
RCoon
NightOfChristBut I have not purchased Rise of the Tomb Raider yet. Should I buy it now or wait until a later date? I am interested in playing the game, but I have doubts. @RCoon? @rtwjunkie? Can anybody give me some advice?
rtwjunkieI think @RCoon's review of it is today in a few hours, or Friday. You could wait till then.
Review goes up tomorrow at 9AM PST for both Bombshell and Tomb Raider. That said, if you cba to wait for that, Tomb Raider gets a definitive thumbs up from me. A bit heavy on QTE's and scripted events, but even so, much better than the previous title, and even that was relatively good.
#41
rtwjunkie
PC Gaming Enthusiast
RCoonTomb Raider gets a definitive thumbs up from me
Good to hear! Even if it has more QTE's than the last one, that puts it more in line with the previous 3 (LAU), which is fine with me.
#42
RCoon
rtwjunkieGood to hear! Even if it has more QTE's than the last one, that puts it more in line with the previous 3 (LAU), which is fine with me.
The main story is about 10 hours' worth of content. Within that there are around 5-10 QTE's, so they're fairly spread out. I've completed the game and it says I've got just under 40% of the whole content remaining.
#43
Adam Krazispeed
Nvidia (I call them nShitia, as in *sarcastic* "I have an nShitia Crapforce GTCrapX 970si") LOL

F*cking Nvidia. I've hated them ever since they bought out 3Dfx Interactive Inc. I loved 3Dfx graphics accelerators, and I've seen people mention all the early chips: Nvidia, ATI/AMD, SiS, S3 (like the S3 ViRGE DX 4MB accelerator or the S3 Savage 3D), Matrox, PowerVR, etc.

From Wikipedia:

In the PC world, notable failed first tries for low-cost 3D graphics chips were the S3 ViRGE, ATI Rage, and Matrox Mystique. These chips were essentially previous-generation 2D accelerators with 3D features bolted on. Many were even pin-compatible with the earlier-generation chips for ease of implementation and minimal cost. Initially, performance 3D graphics were possible only with discrete boards dedicated to accelerating 3D functions (and lacking 2D GUI acceleration entirely) such as the PowerVR and the 3Dfx Voodoo. However, as manufacturing technology continued to progress, video, 2D GUI acceleration and 3D functionality were all integrated into one chip. Rendition's Verite chipsets were among the first to do this well enough to be worthy of note. In 1997, Rendition went a step further by collaborating with Hercules and Fujitsu on a "Thriller Conspiracy" project which combined a Fujitsu FXG-1 Pinolite geometry processor with a Vérité V2200 core to create a graphics card with a full T&L engine years before Nvidia's GeForce 256. This card, designed to reduce the load placed upon the system's CPU, never made it to market.
OpenGL appeared in the early '90s as a professional graphics API, but originally suffered from performance issues which allowed the Glide API to step in and become a dominant force on the PC in the late '90s.[30] However, these issues were quickly overcome and the Glide API fell by the wayside. Software implementations of OpenGL were common during this time, although the influence of OpenGL eventually led to widespread hardware support. Over time, a parity emerged between features offered in hardware and those offered in OpenGL. Direct X became popular among Windows game developers during the late 90s. Unlike OpenGL, Microsoft insisted on providing strict one-to-one support of hardware. The approach made Direct X less popular as a standalone graphics API initially, since many GPUs provided their own specific features, which existing OpenGL applications were already able to benefit from, leaving Direct X often one generation behind. (See: Comparison of OpenGL and Direct3D.)
Over time, Microsoft began to work more closely with hardware developers, and started to target the releases of Direct X to coincide with those of the supporting graphics hardware. Direct3D 5.0 was the first version of the burgeoning API to gain widespread adoption in the gaming market, and it competed directly with many more-hardware-specific, often proprietary graphics libraries, while OpenGL maintained a strong following. Direct3D 7.0 introduced support for hardware-accelerated transform and lighting (T&L) for Direct3D, while OpenGL had this capability already exposed from its inception. 3D accelerator cards moved beyond being just simple rasterizers to add another significant hardware stage to the 3D rendering pipeline. The Nvidia GeForce 256 (also known as NV10) was the first consumer-level card released on the market with hardware-accelerated T&L, while professional 3D cards already had this capability. Hardware transform and lighting, both already existing features of OpenGL, came to consumer-level hardware in the '90s and set the precedent for later pixel shader and vertex shader units which were far more flexible and programmable.

I'm even building an old AMD K7 system: an Athlon "Thunderbird" 750MHz Slot A CPU with a 3Dfx Voodoo 5 5500 64MB AGP 2X and two 3Dfx Voodoo2 2000 PCI cards in SLI.

Even if I'm wrong about some of this, I'm at least right that the dual-GPU/dual-card setup came from 3Dfx and ATI, not Nvidia. ATI's Rage Fury MAXX was the first ATI graphics card with two GPU chips on a single board, and 3Dfx Interactive was the first 3D accelerator company to design a dual-card system for increasing 3D performance: SLI, which originally stood for Scan Line Interleave and connected two Voodoo2s (8MB or 12MB) together. 3Dfx eventually produced the Voodoo 5 5500 with 64MB total (32MB of SGRAM or DDR per VSA-100 GPU), first on PCI and ultimately on AGP as well; I own the AGP version. 3Dfx's last accelerator was, or would have been, the Voodoo 5 6000 AGP with four VSA-100 chips on a single card, but it needed an external power supply and I believe 3Dfx had a lot of problems with it. It would have had 32MB per GPU, 128MB in total (where the 5500 had 64MB for its two VSA-100s), and it probably would have been the first 3D accelerator with 128MB of VRAM, even if the four GPUs ran in SLI. I'd still love to have a Voodoo 5 6000 just for the hell of it, whether the card worked or not. My point: neither SLI nor CrossFire X was originally invented by ATI/AMD or Nvidia.

Yeah, they invented their own versions. ATI/AMD called their take CrossFire / CrossFire X, and nShitia's (Nvidia's) SLI is just the SLI they got when they bought out 3Dfx Interactive, which is why I despised Nvidia in the first place, back around 1999-2002. All Nvidia used was the SLI tech; they completely discarded Glide (glide2x/glide3x), 3Dfx's Glide API. It had its problems, but I still loved the quality and performance I got from Glide and the Voodoo cards. My first 3Dfx card was a Creative Labs Voodoo Banshee, the first 3Dfx accelerator with 2D hardware acceleration on the same GPU die. The Banshee was pretty much a Voodoo2's pixel pipeline and texel unit (I may have the exact chip layout backwards) plus full 2D acceleration built into a single chip, clocked higher than the original Voodoo2; I can't remember exactly, but I think the Banshee ran at 100MHz while the Voodoo2's pixel/texel chips ran at 90MHz. So it was an excellent, improved Voodoo2 without the need for a separate 2D accelerator and a pass-through VGA cable.

But my point is, I f*cking despise Nvidia, and I'll always call them nShitia. Any other Nvidia haters like my nickname for them?

And I'm not an AMD/ATI fanboy; AMD products are just the only hardware I can afford. They're way cheaper than Intel or nShitia hardware and give performance that's good enough for the games I play. If I need more, I do a slight overclock on my FX-8150 CPU and my Radeon R9 290. Right now the FX-8150 is clocked at 4.0GHz with no Turbo Core and all four modules enabled for eight cores (two cores per module), which performs well. AMD just needs to finally outperform Intel CPUs with its Zen cores; I can't wait to try an AMD Zen FX processor. Can't we buy Zen engineering samples, damnit? OK, peace out.
#44
rtwjunkie
PC Gaming Enthusiast
Adam KrazispeedNvidia (I call them nShitia, as in *sarcastic* "I have an nShitia Crapforce GTCrapX 970si") LOL ...
Wow, that's awesome: you joined up and appear to have jumped in just to write an entire rambling book about how much you hate Nvidia. Good job. :rolleyes:

Normally I'd welcome someone to TPU on their first post, but a hate-filled piece of sewage with barely any punctuation, and hard as hell to read, kind of took that welcoming incentive away.
#45
Fluffmeister
I think Nvidia must have stolen his first born child or something. :laugh:
#46
64K
Adam KrazispeedNvidia (I call them nShitia, as in *sarcastic* "I have an nShitia Crapforce GTCrapX 970si") LOL ...
Nvidia and AMD both do crappy things sometimes, but I would like to point out that Nvidia makes a nice profit and its shareholders benefit from that, whereas AMD goes into the red by hundreds of millions of dollars and owes more than it is worth. Its share price has dropped from about $40 to about $2 over the last 10 years, and it could go bankrupt in 2019 if it can't come up with $600 million to repay just one of its big debts.

Also, one more thing.....

#47
R-T-B
Adam Krazispeed, that was almost artful in how bad it was. Should I... thank it?
#48
BiggieShady
R-T-BAdam Krazispeed, that was almost artful in how bad it was. Should I... thank it?
You just did without actually doing it ... your charisma has increased, rest to level up.
#49
Serpent of Darkness
In other news, it seems like the new driver screwed up V-Ray RT for me in some manner. Any rendering with light cache GI just produces blotches of black on the 3D models. I thought it was an issue with the diffuse maps, but there's nothing wrong with them, and it works perfectly fine in Active Shader mode. Gee NVidia, why'd you have to break my GPU rendering, you know, one of those things that supposedly made you better than AMDerps? Now I have to go back to rendering on the CPU. Smooth move, bro-dumbo! The prior driver was working just fine till you had to f*** it up...

I don't really play many PC games, so I don't have much to gripe about there. The driver gave me zero issues during installation...