News Posts matching #NVIDIA

NVIDIA's Keita Iida at IGN PC

Bennett Ring and Patch Kolan from IGN Australia had Keita Iida, Director of Content Management at NVIDIA, as their guest yesterday, and he answered a couple of interesting questions. Iida is not a salesperson and was quite forthcoming throughout the interview. The subjects discussed range from DirectX 10 and rough performance expectations for Crysis and Microsoft's Flight Simulator X, to crippled Vista drivers, to game development for the PS3 and PC from a performance perspective.

Get to the interview right here.

NVIDIA will create clever market name for CUDA

AMD's ATI has a fancy name for their graphics card computing solution (Stream computing), so why doesn't NVIDIA? Since the CUDA name isn't exactly appealing to the consumer market, NVIDIA is planning to rename the CUDA initiative once it has something more ready for public release (such as a Folding@Home client). Rumor has it that NVIDIA will call their GPU-accelerated computing solution "GPU Computing", but we will see when NVIDIA actually releases clients to the average user.

For those of you who don't know, CUDA is a C compiler and toolkit that builds software so it can run on, and be accelerated by, an NVIDIA GPU acting as a co-processor.
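
For illustration, here is a minimal sketch of what CUDA code looks like (the kernel and variable names are our own, made up for this example): essentially C with a few extensions, compiled with NVIDIA's nvcc instead of a regular C compiler.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// __global__ marks a "kernel": a C function that nvcc compiles to
// run on the GPU, with one thread handling each array element.
__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] *= factor;
}

int main()
{
    const int n = 1024;
    float host[n];
    for (int i = 0; i < n; ++i)
        host[i] = (float)i;

    // Copy input to GPU memory, launch the kernel, copy results back.
    float *dev = NULL;
    cudaMalloc((void **)&dev, n * sizeof(float));
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);
    scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);
    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(dev);

    printf("host[10] = %.1f\n", host[10]); // prints 20.0
    return 0;
}
```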

NVIDIA nForce 650i Ultra Reference Board Details

Though the NVIDIA nForce 650i Ultra MCP was launched at the end of last year, it took NVIDIA some time to come up with an actual reference design. Once again, ChileHardware was the first in the business to show some facts. The 650i Ultra is the least feature-packed chipset of the 600i family, with the absence of SLI being the biggest drawback. But if you don't intend to go for an SLI setup, the board will please you with full Core 2 Extreme, Core 2 Duo and Core 2 Quad support. Furthermore, we are talking about dual-channel DDR2-800 (four DIMM sockets), a single PCIe x16, two PCIe x1 and three PCI (32-bit/33MHz) slots, four SATA II ports and a single PATA port (HDD or optical drive).
The ATX I/O panel is not that crowded either: besides the obligatory PS/2 keyboard and mouse ports, there are four USB 2.0 ports, the Gigabit Ethernet connector and the eight-channel audio connectors.
The mentioned Gigabit port supports NVIDIA's FirstPacket prioritization technology in order to deliver the lowest pings possible while gaming. The SATA ports support RAID arrays (RAID 0, 1, 0+1 and 5), and onboard you will find four additional USB 2.0 headers to be used with a USB bracket. Its price will be in the US$50-100 range, and companies like EVGA, XFX, ECS and Biostar will soon adapt this design into their product families. All in all, a very nice budget mainboard that has its strengths.

New NVIDIA S.T.A.L.K.E.R. Optimized Drivers Coming Soon

According to FiringSquad, NVIDIA will release a new S.T.A.L.K.E.R.-optimized graphics driver for the GeForce 8 series next week. The new driver should increase performance especially on GeForce 8800GTX cards, because with the currently available drivers these cards aren't much faster in S.T.A.L.K.E.R. than their "little brother", the 8800GTS. Additionally, the new driver should provide the game's missing SLI functionality.

NVIDIA Folding@Home GPU client: where is it?

On February 16th of this year, NVIDIA went ahead and announced their new GPU computing platform, CUDA. Over seven weeks later, we have yet to see so much as a beta of CUDA. And so, similar to the G80 Vista driver fiasco, NVIDIA has yet to deliver a product that they promised we'd see. The part NVIDIA users probably hate the most is the lack of a GPU-based Folding@Home client. It seems that these days everything except an NVIDIA system (even the PS3) can run a Folding@Home GPU client. This stands in stark contrast to ATI's GPU computing solution, called "Stream", which already has a large list of clients that use the GPU to accelerate programs.

G100 supports CUDA 2

G100, an upcoming graphics core from NVIDIA, supposedly supports CUDA 2. CUDA, an acronym for "compute unified device architecture", allows programmers to offload complex mathematical functions onto the GPU for processing, using the unified shaders for the calculations. This is a new way to do demanding scientific computations, as a graphics chip can be much faster than a CPU at this kind of highly parallel work.
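
To make the offloading idea concrete, here is a hedged sketch (the function and all names are illustrative, not taken from NVIDIA's documentation): the same mathematical expression is evaluated for every element of a large array, with each element assigned to one of the thousands of threads that the unified shaders execute in parallel.

```cuda
#include <cmath>
#include <cuda_runtime.h>

// Each GPU thread evaluates the expensive expression for one element;
// the unified shader cores run thousands of these threads at once.
__global__ void evaluate(const float *x, float *y, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = expf(-x[i] * x[i]) * sinf(10.0f * x[i]);
}

// Host-side wrapper: move data to the GPU, run the kernel, fetch results.
void evaluate_on_gpu(const float *x, float *y, int n)
{
    float *dx, *dy;
    cudaMalloc((void **)&dx, n * sizeof(float));
    cudaMalloc((void **)&dy, n * sizeof(float));
    cudaMemcpy(dx, x, n * sizeof(float), cudaMemcpyHostToDevice);

    // 256 threads per block; enough blocks to cover all n elements.
    evaluate<<<(n + 255) / 256, 256>>>(dx, dy, n);

    cudaMemcpy(y, dy, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(dx);
    cudaFree(dy);
}
```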

The differences between CUDA and CUDA 2 remain to be seen. The G100 core is scheduled for Q1 2008 and might be built on a process even smaller than 65 nanometres.

G84 and G86 NVIDIA chips late

NVIDIA's mainstream chips should be launched on April 17th; we already reported about the G86 and G84 here. Now The Inquirer says NVIDIA has problems with the latest chip revision and therefore needs to respin it to get rid of the bugs. That will most likely mean that their launch together with the 8800 Ultra in mid-April will be postponed. But I am quite sure NVIDIA will still be first with its direct R600 rival; AMD/ATI is once again later than they admit.

NVIDIA Launches nForce 680i LT SLI Edition

NVIDIA has updated its line of motherboard chipsets with the release of the nForce 680i LT SLI Edition chipset. The 680i LT is a cut-down version of the nForce 680i flagship. The main difference between the two chipsets is the third PCI-E physics slot, which is not present on the 680i LT. Other specs remain the same. Motherboards equipped with the 680i LT chipset should cost under $200. EVGA was the first NVIDIA partner to release a motherboard following NVIDIA's reference nForce 680i LT design. Reviews of the EVGA motherboard have also surfaced all over the big hardware sites.

Hot Hardware | [H]ard|OCP | Legit Reviews | Guru3D | nvnews.net | bit-tech | iXBT | The Tech Report

NVIDIA's 8950GX2 gets a block

Even before the 8950GX2 card hits the stores, water blocks are already starting to appear. The Inquirer managed to snap a few shots of Alphacool's block, ready to cool the beast:

The block covers everything: GPU, RAM, NVIO1 chip, PWM...

The block looks very thin, which brings up concerns over its restriction/performance. Alphacool has produced some highly restrictive blocks in the past.

NVIDIA Terminates ULi Chipset Supplies

As many of you may know, ULi was acquired by NVIDIA in February of last year. Now, in order to push its own chipset series into the lower segment of the mainboard market, NVIDIA has decided to completely terminate supplies of ULi chipsets to third parties. At the CeBIT 2007 show NVIDIA demonstrated solutions such as the MCP68 and MCP73, so the ULi parts need to be cleared out of the way. Engineers who were involved in ULi chipset development will now join the NVIDIA team working in the same field. So it looks like ULi-branded products are now history.

PNY Releases NVIDIA Quadro FX 4600 and Quadro FX 5600 Graphics

PNY Technologies, the supplier and marketer of NVIDIA Quadro by PNY professional graphics boards, announced today the immediate availability of next-generation NVIDIA Quadro by PNY solutions based on the new NVIDIA Quadro FX 4600 and FX 5600 graphics, along with NVIDIA GSync genlock/frame lock and HD SDI options. These new ultra high-end professional graphics solutions meet the challenges of the most complex 3D design, DCC, visualization, scientific, and broadcast applications with a new unified architecture, Shader Model 4.0 technology, large frame buffers, and GPU computing for visualization technology. The new NVIDIA Quadro by PNY lineup includes boards based on the Quadro FX 4600 and Quadro FX 5600.

NVIDIA's New Software Development Kit Supports Shader Model 5.0

Even though not all game developers have adopted shader model (SM) 3.0, introduced three years ago, and some claim that transitioning to DirectX 10's shader model 4.0 right now hardly makes sense, NVIDIA's new software development kit (SDK) already features profiles for shader model 5.0, which is believed to be an improved version of SM4.0.

NVIDIA's new SDK 10, which was released just last week, apparently contains macro invocations that define the supported Cg profiles, including Fragment50, Vertex50 and Geometry50, meaning that the current SDK supports architecture micro-code profiles for pixel shaders 5.0, vertex shaders 5.0 and geometry shaders 5.0.

While hardly anybody knows what shader model 5.0 actually is and how much it differs from shader model 4.0, the inclusion of the architecture micro-code in a compiler indicates that NVIDIA foresees the arrival of shader model 5.0-capable hardware soon enough to let game developers compile their titles for it.

NVIDIA GeForce 8500GT Pictured


VR-Zone has revealed some information about the GeForce 8500GT video cards to be released on April 17. Built on TSMC's 80nm process with a 128-bit memory interface, the GeForce 8500GT cards will be clocked at 450MHz core with 256MB of DDR2 memory at 400MHz (800MHz effective). The cards are expected to outperform any current GeForce 7600GS, scoring around 22xx and 42xx marks in 3DMark06 (1280x1024) and 3DMark05 (1024x768) respectively. DX10 performance is unknown at this stage. The NVIDIA GeForce 8500GT series will be priced between US$79 and $99.

AGP solutions from NVIDIA: 7900GS and 7950GT

As it turns out, AGP is not quite as dead as many people (obviously PCI-E users) claim. If you check our news since the 1st of February, we have reported five times on the never-ending AGP story. Three of the reports were related to new AGP cards from ATI released to the market. Now it's NVIDIA's turn, and they hit the undying AGP crowd with the successor of the 7800GS and another, even faster card.
The Germans at 3DCenter.de found out that the latest NVIDIA ForceWare 101.41 beta driver not only causes problems when trying to run Doom 3 and Battlefield 2142 under Vista, but also discloses the upcoming GeForce 7900GS and 7950GT cards.
The best part of the story is that at least one 7950GT card, from XFX, is already listed at a price-comparison site for a mere 226 Euros. Of course it's not in stock yet, but it surely isn't far from actually being available.

NVIDIA 8600GT and 8600GTS Pictured

OCWorkbench has found some pictures and specs of NVIDIA's mainstream DirectX 10 cards, the 8600GT and the 8600GTS. The GT (pictured below on the left) is the replacement for the 7600GT and will feature the G84-300 GPU running at 540MHz and either 128MB or 256MB of 128-bit GDDR3 RAM at 1400MHz, priced between $150 and $180. The 8600GTS (below on the right) is a step up from the GT and will replace the 7900GS, using the G84-400 GPU running at 675MHz and either 256MB or 512MB of 128-bit GDDR3 RAM at 2000MHz, priced between $200 and $250. As reported before on techPowerUp!, these cards should be released (along with the 8500GT) on April 17th.

NVIDIA releases G80 Quadro cards

NVIDIA Corporation, the worldwide leader in programmable graphics processor technologies, yesterday unveiled a new line of professional graphics solutions: NVIDIA Quadro FX 4600, Quadro FX 5600, and NVIDIA Quadro Plex VCS Model IV. Armed with the largest increase in GPU power and functionality to date, these solutions are designed to help solve the world's most complex professional graphics challenges.

Tackling the extreme visualization challenges of the automotive styling and design, oil and gas exploration, medical imaging, visual simulation and training, scientific research, and advanced visual effects industries, these new Quadro solutions offer:
  • Next-Generation Vertex and Pixel Programmability: Shader Model 4.0 enables a higher level of performance and ultra-realistic effects for OpenGL and DirectX 10 professional applications
  • Largest Frame Buffers: Up to 1.5 GB frame buffers deliver the throughput needed for interactive visualization and real-time processing of large textures and frames, enabling superior quality and resolution for full-scene antialiasing (FSAA)
  • New Unified Architecture: Industry-first unified architecture capable of dynamically allocating compute, geometry, shading and pixel processing power for optimized GPU performance
  • GPU Computing for Visualization: Featuring NVIDIA CUDA technology, developers are, for the first time, able to tap into the high-performance computing power of Quadro to solve complex visualization problems

NVIDIA isn't the only graphics company short of Vista drivers

While NVIDIA nearly got sued over their lack of Vista-ready drivers for the G80, ATI isn't exactly innocent either. The Inquirer did a quick experiment to see if it was possible to configure a Vista workstation with an ATI FireGL graphics card. To their surprise, it wasn't, because ATI does not have any FireGL drivers compatible with Windows Vista. And unlike NVIDIA, they do not even have beta drivers out. While most of the gaming community is more likely to use an NVIDIA G80 than an ATI FireGL, this is still a major problem for anyone relying on a FireGL-based workstation.

NVIDIA to Launch GeForce 8600 Series on April 17th

NVIDIA is set to launch the mainstream 8600GTS (G84-400) and 8600GT (G84-300), as well as the 8500GT (G86-300), on the 17th of April. The GeForce 8600GTS and 8600GT will each have 256MB of GDDR3 memory onboard, both sporting a 128-bit memory interface but no HDMI yet. The GeForce 8600GTS is meant to replace the 7950GT and 7900GS, while the 8600GT will replace the 7600GT. The 8500GT aims to replace the 7600GS.

The 8600GTS will be clocked at 700MHz core / 2GHz memory and comes with dual DVI, HDTV and HDCP support, but requires external power. Its price is estimated at US$199-249. The other mainstream model, the 8600GT, will be clocked at 600MHz core / 1.4GHz memory and has two variants: one with HDCP (G84-300) and one without (G84-305). This model doesn't require any external power. It will be priced between US$149 and $169.
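
As a rough back-of-the-envelope check (our own arithmetic, not an official spec), the 128-bit bus and the effective memory clocks quoted above pin down the peak memory bandwidth of the two cards:

$$\frac{128\,\text{bit}}{8} \times 2.0\,\text{GHz} = 32\,\text{GB/s (8600GTS)}, \qquad \frac{128\,\text{bit}}{8} \times 1.4\,\text{GHz} = 22.4\,\text{GB/s (8600GT)}$$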

The last model, meant for the budget segment, is actually a G84 core downgraded to meet the value segment's pricing structure. The 8500GT will be clocked at 450MHz core with 256MB of DDR2 memory at 800MHz, and comes in two variants: one with HDCP (G86-300) and one without (G86-305). The 8500GT should see a retail price between US$79 and US$99. The 8300GS, which will be released towards the end of April, is expected to replace the current 7300 series.

The NVIDIA 80nm G84 and G86 line-up will meet ATI's 65nm DX10 offerings head-on: the mainstream RV630 is slated to arrive in May, and the value RV610 earlier, in April.

nForce 680i LT SLI for Hardcore Gamers on March 12

As CeBIT approaches, more and more new products are being finalized for launch. Along with the MCP68 chipset, NVIDIA is going to launch the nForce 680i LT SLI chipset at the event, primarily targeted at hardcore gamers, whereas the current 680i SLI chipset is targeted at hardcore enthusiasts. nForce 680i LT SLI boards will be some US$50 cheaper than the 680i SLI boards, with an MSRP of around US$199 compared to US$249+.

The main changes: compared to the 680i SLI reference board, nForce 680i LT SLI reference boards will come with:
  • active cooling instead of the current heat-pipe design
  • a green PCB instead of a black one
  • DDR2-800 support instead of DDR2-1200 SLI memory
  • 8 USB 2.0 ports instead of 10
  • one Gigabit Ethernet port instead of two
  • two PCIe x16 slots instead of three
  • none of the neat extras like LED POST codes, Power/Reset buttons and an onboard speaker

NVIDIA mentioned that overclocking on the 680i LT SLI won't be as good as on the 680i SLI, but there are strong reasons to believe the chipset is basically the same, unless the company has done some sort of sorting/binning of the chips. Will this be the budget OC king?

Zotac - New Brand of Graphics Cards


Zotac is one of the latest graphics card makers to enter the video market. It will appear at this year's CeBIT in Germany, at Booth B27 in Hall 20. In fact, Zotac is a subdivision of PC Partner and will produce only NVIDIA GPU-based graphics cards. All our readers interested in the new Zotac brand can click here to read the first Zotac GeForce 8800GTS 320MB review on the net.

No G8x AGP chip in the end?

Sad news for all AGP motherboard owners (myself included), if it turns out to be true. VR-Zone is reporting that, based on its findings, NVIDIA won't develop a new AGP chip. The reason is that the G80 'simply' can't support it (though a bridge chip should be able to provide the required compatibility, if you ask me). It seems that leaves us all hoping for an R600-based solution.

G90 will be a 65nm G80 with 512-bit GDDR4

Or at least, that's the current rumor. While we debate the current R600 rumors, The Inquirer is claiming that their "senior industry sources" have let loose the first G90 details. The G90 will undertake the monumental task of putting the G80 through a die shrink. If all goes well, this will allow for very high clocks, much lower power consumption, and a lower production cost. NVIDIA also hopes to get hold of some GDDR4 for the G90, and will put it on a 512-bit bus.