
NVIDIA tells us the truth about CrossFire

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,817 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
ATI's competitor NVIDIA has put together a presentation detailing the shortcomings of CrossFire. Problems such as limited resolution support and game compatibility issues are listed. The slides cover not only the CrossFire video cards but also ATI's dual-VGA motherboard chipsets.

 

kRaZeD

New Member
Joined
Aug 20, 2004
Messages
101 (0.01/day)
Location
Melbourne, Australia
Processor E6600
Motherboard ASUS P5W DH Deluxe
Cooling Zalman CNPS-7700Cu
Memory 2gb Corsair XMS2-6400
Video Card(s) 7900GT
Storage 700gb across 4 drives.
Display(s) CMV 221D
Case Antec P180
Audio Device(s) X-fi Moosic
Looks like a lot of propaganda to me. nVidia have had a lot of time to get things right on their motherboard chipsets, but as I haven't heard much about ATi boards (hell, nothing about them except for the Xpress 200) I think ATi are doing pretty well for themselves!

And the second-last slide: nTune, nVidia has it and ATi doesn't... Guess why? Because it's nVidia Tune...

Propaganda!

Anyone else like to comment?
 

wazzledoozle

New Member
Joined
Aug 30, 2004
Messages
5,358 (0.73/day)
Location
Seattle
Processor X2 3800+ @ 2.3 GHz
Motherboard DFI Lanparty SLI-DR
Cooling Zalman CNPS 9500 LED
Memory 2x1 Gb OCZ Plat. @ 3-3-2-8-1t 460 MHz
Video Card(s) HIS IceQ 4670 512Mb
Storage 640Gb & 160Gb western digital sata drives
Display(s) Hanns G 19" widescreen LCD w/ DVI 5ms
Case Thermaltake Soprano
Audio Device(s) Audigy 2 softmod@Audigy 4, Logitech X-530 5.1
Power Supply Coolermaster eXtreme Power Plus 500w
Software XP Pro
Propaganda. It's funny that they list "nTune" as a feature to be supported or not supported by each side, lol.

Everyone says HDR won't work on 98xx/X8xx cards, yet in that preview of Source's Lost Coast posted here:
http://arstechnica.com/articles/culture/lostcoast.ars/1

The guy runs HDR on a 9600. So what's the real deal with SM 3.0?

On NVIDIA cards, it stores these values as 16-bit floating point data (what is known as a "linear color space," as opposed to separate RGB values), and on ATI cards stores it as a 4.12 integer value (again a linear space). The advantages of this technique are many. All DirectX9 hardware is supported, and because it does not require that much extra texture memory the performance hit is small even on an ATI 9600. Fog effects work, and so does FSAA. The only real downside is that HDR effects are not supported with refraction, so while light will still stream in nicely through stained-glass windows, you won't see the HDR effects when looking directly through the window at the sun. It is a small price to pay, however.
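To make the quoted comparison concrete, here is a rough Python sketch (not Valve's actual code, just an illustration; the clamp range and rounding are my own assumptions) of the two 16-bit-per-channel encodings described above: an FP16 half float versus a 4.12 fixed-point value.

import struct

# FP16 path (the NVIDIA case in the quote): a 16-bit half-precision float.
def encode_fp16(value):
    """Pack a linear-space intensity as a 16-bit half float."""
    return struct.pack('<e', value)

def decode_fp16(raw):
    return struct.unpack('<e', raw)[0]

# 4.12 fixed-point path (the ATI case): 4 integer bits + 12 fraction bits,
# so representable values run from 0 up to just under 16 in steps of 1/4096.
def encode_fixed_4_12(value):
    clamped = max(0.0, min(value, 16.0 - 1.0 / 4096))
    return round(clamped * 4096)

def decode_fixed_4_12(raw):
    return raw / 4096.0

if __name__ == "__main__":
    # Over-bright intensities above 1.0 are the whole point of HDR.
    for intensity in (0.5, 1.0, 4.75, 12.3):
        half = decode_fp16(encode_fp16(intensity))
        fixed = decode_fixed_4_12(encode_fixed_4_12(intensity))
        print(f"{intensity:6.3f} -> fp16 {half:8.5f} | 4.12 fixed {fixed:8.5f}")

Both formats spend 16 bits per channel; the float trades uniform precision for more headroom at the top of the range, while the fixed-point format keeps even spacing but caps out below 16, which is why either one can cover HDR on DirectX 9 hardware without SM 3.0.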

And if there was a problem with the USB controller a year ago, why the hell should they assume it still exists now? Nvidia is slinging mud.
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.95/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
The USB problem was easily solved: they used an external USB 2.0 chip.
 
Joined
Nov 30, 2004
Messages
181 (0.02/day)
Location
UK, Lincoln
System Name Omicron
Processor Athlon X2 3800 @5000
Motherboard Asrock 939DualSATA2
Cooling Zalman
Memory 2x 1gb Corsair XMS DDR3500PRO 2-3-2-6 1t
Video Card(s) Radeon 3870
Storage Seagate 320gb
Display(s) BenQ FP241w
Case X-Blade Deluxe
Audio Device(s) Creative X-Fi Music
Power Supply Corsair HX620
Software WinXP SP2
I'm waiting for the good part where ATI sues NVIDIA for damages due to the lies spread in this public presentation ;).

Wouldn't be hard to hit back at NVIDIA though. Total lack of software support from NVIDIA to take advantage of their own hardware, and they break support for their previous generation in doing so :| They're the Creative (Sound Blaster) of the GPU industry.
 
Joined
Apr 11, 2005
Messages
2,722 (0.38/day)
Location
Canada
Processor AMD 5800x
Motherboard ASUS ROG Crosshair VIII Dark Hero
Cooling Custom Water Loop
Memory 32GB G.Skill Trident Z neo
Video Card(s) Sapphire Vega 64
Storage Sabrent Rocket 4 NVMEs / SSDs
Display(s) 27" AOC AG271QX
Case Phanteks Enthoo Evolve
Audio Device(s) SoundBlaster AE-7
Power Supply EVGA SuperNova 750
Software Windows 10
The thing that gets me is that they are comparing a 7800GTX to an X850XTPE... then they say the X850 is out of date. Well, yes it is, but the 7800GTX is a brand new card. Just wait till the X1800 comes out and see what Nvidia has to say about ATI then.

-Dan
 

ShinyG

New Member
Joined
Sep 17, 2005
Messages
185 (0.03/day)
Location
Romania
System Name My Computer
Processor Intel 2600K@4.4Ghz
Motherboard ASUS Z68
Cooling Scythe Mugen
Memory 2x4Gb
Video Card(s) Sapphire HD7850
Storage 1x A-Data SSD 64GB; 1x WD RE3 500GB; 1x Maxtor 750GB
Display(s) Dell Ultrasharp 2005FPW
Case Antec 900 + cable management + Zalman fan ctrl
Audio Device(s) Asus XONAR DS
Power Supply Corsair 520HX
Software Win 7 Pro
Benchmark Scores I've fried plenty o' parts when not OCing. Need to start doing it again to avoid problems...
My first post on this forum:

That article right there is a lot of doo-doo. Even my 9550GU-Extreme supports HDR, so an X800 "probably" supports it too...

I don't know about "paperlaunch-ing", but GeCube just released their first X800GTO with Crossfire.

How dare they accuse ATi of a lack of flexibility when their technology is the one that requires both cards to have SLI enabled for that same SLI to work. How dare they say they have "two, full band 16x PCI-Xpress slots" when everybody knows that if you have 2 cards in SLI, each will only use its PCI-Xpress at 8X!!! If they accuse ATi of "paperlaunch-ing", how the heck do they know that ATi's USB and PCI-Xpress are not running at full capacity!?! I've seen a test in the Romanian edition of Chip magazine where an ATi Xpress 200 trashed an Abit NF4 at default settings, even though it was a preview version with very few BIOS settings...

What is that "Nvidia nTune not present on ATi cards" poo: it's like saying that GeForce cards don't support ATi HydraVision or something....

And last, but not least: what the heck are "Mainstream Price Points"!?!
 

Polaris573

Senior Moderator
Joined
Feb 26, 2005
Messages
4,268 (0.59/day)
Location
Little Rock, USA
Processor LGA 775 Intel Q9550 2.8 Ghz
Motherboard MSI P7N Diamond - 780i Chipset
Cooling Arctic Freezer
Memory 6GB G.Skill DDRII 800 4-4-3-5
Video Card(s) Sapphire HD 7850 2 GB PCI-E
Storage 1 TB Seagate 32MB Cache, 250 GB Seagate 16MB Cache
Display(s) Acer X203w
Case Coolermaster Centurion 5
Audio Device(s) Creative Sound Blaster X-Fi Xtreme Music
Power Supply OCZ StealthXStream 600 Watt
Software Windows 7 Ultimate x64
You would have to be pretty foolish to accept information, at face value, from a corporation demeaning a competitor's product. Show me third-party research showing SLI is better than Crossfire, then I may believe it. However, nothing is certain until the product is actually released. Shame on you Nvidia, you just made yourselves look bad.
 
Joined
Apr 11, 2005
Messages
2,722 (0.38/day)
Location
Canada
Processor AMD 5800x
Motherboard ASUS ROG Crosshair VIII Dark Hero
Cooling Custom Water Loop
Memory 32GB G.Skill Trident Z neo
Video Card(s) Sapphire Vega 64
Storage Sabrent Rocket 4 NVMEs / SSDs
Display(s) 27" AOC AG271QX
Case Phanteks Enthoo Evolve
Audio Device(s) SoundBlaster AE-7
Power Supply EVGA SuperNova 750
Software Windows 10
In the 9th picture, it says that if you have a master card but you're running a 12-pipe card, it will make the master use only 12 pipes. In SLI you must have the identical card, so what's the difference? Plus it's almost better, because you will be able to buy the master card and run it with, say, your X800 Pro, and then if you want to upgrade you can buy a faster card without needing to get a new master card. With Nvidia you would have to buy two new cards. Not to put Nvidia down or anything, but they are just pulling everything they can think of to deter people from Crossfire. As Polaris573 just said, "Shame on you Nvidia, you just made yourselves look bad."

-Dan
 
Joined
Mar 16, 2005
Messages
7 (0.00/day)
Location
Atlantic Canada
Wow, NVIDIA is whack. I love their cards, but to spread lies and false rumors about ATI is ridiculous.

Page 4.. You mean I can *only* have 14x Anti-Aliasing with my ATI's?? There is no way I could settle with less than 16x :mad:

"Enthusiasts don't buy old technology." Of course not, but they may just upgrade their 7800GTX to an X1800XT when it's released. We'll see.

Page 9.. They accuse Crossfire of *not* being more flexible. At least people DO have a choice to mix models/brands, and not be forced to use the exact same SKU.

The only pages I found to have any merit are maybe 11 and 12. If people have different display resolutions, then they may run into problems. But that's only if it's actually true.

Page 16.. They devote an entire page to a rant about PCI-E 1x lanes? Because we all have so many 1x cards lying around???

Page 18 was quite hilarious as well. nTune and Price Points. It's hard to compare prices with nF4 when CF is not even out yet :confused:

Cool slide though. Opened my eyes more about how foolish NVIDIA can be.
 

Unregistered

Guest
Well, can't say they're lying 100%.

Very biased, quite unprofessional, but then again I already knew how upset NVIDIA could get. Personally I'm unhappy because the R520 turned out to be 16 pipes and I still can't find a Crossfire board....

You have to admit though, I read the review of some ATI Xpress chipsets and only the Turion chipsets were impressive to me..... Crossfire seemed fairly rigid... but yeah, it's unfair because they're still working on Crossfire, and I'm sure they're going to fix the problems as fast as, if not faster than, NVIDIA... personally? I think both companies are acting like PMS'ing schoolgirls...
 
Joined
Jul 2, 2005
Messages
514 (0.07/day)
Location
USA
System Name G-REX
Processor AMD Ryzen 5 5600X
Motherboard Gigabyte Aorus Master X570
Cooling Corsair H100i RGB Platinum SE + 4 x Corsair LL120 (White)
Memory 32GB Corsair Vengeance RGB Pro DDR4-3600 (White)
Video Card(s) Gigabyte Aorus Master RTX 3080 10GB
Storage 1TB Samsung 980 Pro NVME | 2TB Sabrent Rocket Q NVME
Display(s) 34.0" LG 34GP83A-B
Case Lian-Li Lancool 2 Mesh (White)
Audio Device(s) Logitech G560 | Corsair Virtuoso RGB Wireless
Power Supply Corsair RM850x
Mouse Corsair Dark Core Pro SE
Keyboard Corsair K70 MK2 (Cherry MX Brown) + White PBT Key Caps
Software Windows 10 Home
Having 2 video cards is simply dumb.
 

Unregistered

Guest
I think that you have to understand that both ATI and NVIDIA have people sitting around just thinking up things about the competition. I would bet that both companies do it.
Anyway, that being said, I am here for some NVIDIA support… objectively.

Go ahead and get a cup of coffee and something to eat; this is long.

Slide 2 - Seems like this could be the case; however, not knowing when the ATi cards will be released, it is hard to judge. To the person who believes they can buy a card that is Crossfire capable… yes you can, however you still need a master card. The ATI version will work with all previous X8xx cards, just with a master card.

Slide 3 - This is a hard case to make for either company. True, the 7800GTX in SLI will undoubtedly beat the current generation of ATI cards in Crossfire (based on current single-card benchmarks where the 7800GTX beats all). However, ATI could have simply meant that if you own an ATI card, you will have the best performance with Crossfire and an ATI-chipset mobo.

Slide 4 - One reason I know many people do not buy ATI cards is that they SUCK in OGL applications. This doesn't matter much if you like Windows, but if you use Linux you don't want to be using ATI. Technically 16x is better than 14x. It is also unclear whether or not ATI will have a competing solution to transparency AA.

Slide 5 – HDR lighting only works on Shader Model 3 cards
http://www.microsoft.com/whdc/winhec/partners/shadermodel30_NVIDIA.mspx

Slide 6 – Nvidia has changed this recently with a driver update. I don't know when that page was written on ATI's site, but it is still there; I'm sure they wanted people to know that. What you don't see is the rest of the bulleted points: • CrossFire is an open platform that supports multiple components and graphics cards that can be mixed and matched in a single system. Competitive multi-GPU solutions are constrained to supporting identical graphics cards.
This is true; however, if you have a better card it gets dumbed down to the lower version of card you have. Also, you can mix and match brands of NVIDIA cards as long as they are the same card.

Slide 7 – Ahh, who cares. Nvidia is running out of things to argue about; it's all done automatically, so I don't see why anyone should care.

Slide 8 – Seems to me Nvidia is just pointing out the obvious. One thing however is that Nvidia can do all 4 modes in OGL and on all cards that support SLI. Looks like ATI will not beat Nvidia at Doom 3 at least, lol.

Slide 9 – yeah the dongle is more flexible. This one is most definitely FUD by Nvidia, put out there to get people away from the fact that Nvidia can’t dumb down a card. (Well I know you can with certain programs but that’s not the point)

Slide 10 – Well I believe that this slide is spot on.

Slide 11 and 12 – I don't know if this is confirmed or not, but if it is true, that is a very, very big limitation. People with large LCDs and high-end CRTs should probably shy away from Crossfire if they like playing at 1600x1200 or higher.

Slide 13 and on – I don't care much about a mobo's features as long as it is stable, offers good performance in its class, and can OC like no other. I am very sorry that their Gigabit NIC only allows them to download pr0n at the same speed as me, because you're not on a gigabit network and your cable modem only works at 5 Mbit.
 

Unregistered

Guest
This presentation was made for the press as things to look for when reviewing Crossfire parts. It was never meant to be presented to readers like this. I was in on one of these presentations, and that is what they stated this part of the briefing was for.
 

Unregistered

Guest
I agree, readers need to remember this isn't targeted towards customers. Of course it's biased, it's from nVidia. Would you expect an unbiased response from Ati? It's the marketing people that are creating these slides, after all.
As for the mud-slinging, it's no different than the mud-slinging by fanboys, of which there are plenty here. Recalling old problems either company had when talking about new hardware (driver cheats, poor performance, etc.) is nothing new.
 
Joined
Apr 11, 2005
Messages
2,722 (0.38/day)
Location
Canada
Processor AMD 5800x
Motherboard ASUS ROG Crosshair VIII Dark Hero
Cooling Custom Water Loop
Memory 32GB G.Skill Trident Z neo
Video Card(s) Sapphire Vega 64
Storage Sabrent Rocket 4 NVMEs / SSDs
Display(s) 27" AOC AG271QX
Case Phanteks Enthoo Evolve
Audio Device(s) SoundBlaster AE-7
Power Supply EVGA SuperNova 750
Software Windows 10
Dark Ride said:
Having 2 video cards is simply dumb.

I agree with you in some ways. Like if you have a 7800GTX, getting another one is not worth the money you pay for such a little performance increase. But with two cheaper cards, say a 6600GT, you can get close to the same performance as one single high-end card that costs more than the two cheaper ones. It's great if you want a performance increase without buying a newer, more expensive card later on. You can just get the same card again, and it will be quite a bit cheaper. But in most cases it's just not worth it...

-Dan
 

Drenmi

Guest
ShinyG said:
How dare they say they have "two, full band 16x PCI-Xpress slots" when everybody knows that if you have 2 cards in SLI, each will only use its PCI-Xpress at 8X!!!


Actually, there are motherboards using 2 chipsets, thus achieving 2 PCIe 16x slots.
 

ShinyG

New Member
Joined
Sep 17, 2005
Messages
185 (0.03/day)
Location
Romania
System Name My Computer
Processor Intel 2600K@4.4Ghz
Motherboard ASUS Z68
Cooling Scythe Mugen
Memory 2x4Gb
Video Card(s) Sapphire HD7850
Storage 1x A-Data SSD 64GB; 1x WD RE3 500GB; 1x Maxtor 750GB
Display(s) Dell Ultrasharp 2005FPW
Case Antec 900 + cable management + Zalman fan ctrl
Audio Device(s) Asus XONAR DS
Power Supply Corsair 520HX
Software Win 7 Pro
Benchmark Scores I've fried plenty o' parts when not OCing. Need to start doing it again to avoid problems...
Drenmi said:
Actually, there are motherboards using 2 chipsets, thus achieving 2 PCIe 16x slots.

I knew that, but if I'm not mistaken there are no graphics cards that work in dual 16x (maybe Quadros). And using two chipsets on one mobo just to overcome that is not exactly a thing to brag about...
 

Unregistered

Guest
One area where 2 cards may be good is 3D rendering. I'm really curious to see what kind of an impact 2 cards will have on rendering times compared to 1 card. For people who use 3ds Max or something similar, 2 cards may be well worth the money if it really cuts down on render times.
 

IsaacS

New Member
Joined
Aug 21, 2005
Messages
9 (0.00/day)
Yes, it may be Nvidia propaganda, but Nvidia is right. Crossfire is going to sink without a trace, and so, it seems, is the R520, or X1800XT as it is now known. ATI is in serious trouble.

Crossfire was originally launched in June or July this year - and now ATI have had to do a relaunch, and there is still no availability. There are many issues with Crossfire, such as the master cards, which dumb down the performance of your cards, and incredibly, Crossfire doesn't support 1920x1200 at 60Hz or greater, which is just absurd.

Secondly, the X1800XT is beaten by Nvidia's 7800gtx in nearly all high resolution benchmarks and yet the x1800xt is going to cost $150 more at launch. Who, apart from ATI fan boys, is going to buy it?

Thirdly, to the idiot who said he thought ATi is doing well - maybe he should go back to school. ATI's share price has slumped by almost 50% this year, in comparison to NVIDIA's, which has more than doubled. ATI is also being sued in multiple class action suits - by its OWN shareholders - for misleading statements by its directors.

Conclusion - ATi has serious problems.
 

nutop

New Member
Joined
May 6, 2005
Messages
2 (0.00/day)
IsaacS said:
Secondly, the X1800XT is beaten by Nvidia's 7800gtx in nearly all high resolution benchmarks and yet the x1800xt is going to cost $150 more at launch. Who, apart from ATI fan boys, is going to buy it?

Oh wow, because you know... X1800 benches are everywhere now that the NDA has lifted and the card is out in stores.
 

Tera

New Member
Joined
Sep 19, 2005
Messages
2 (0.00/day)
It’s nice to see that nothing ever changes… ATI fanboys vs. nVidia fanatics, Intel vs. AMD and so on, it never ends.

And if you wonder what camp I’m in, I’m in the “best performance for decent money” camp. And which brand is better right now? I don’t care! I can’t afford the premium ones anyway :(

What makes you angry is that a document meant for journalists, made by NVIDIA’s sales department to discredit ATI, leaked out. We all know that sales departments are fishy. The reason ATI isn’t suing them is because they of course do the very same thing. Everybody knows that, so why are you upset?

As a comparison, I’ve had to deal with the sales department at the company I work for, regarding rolling out some advertising now and then. Being a technician, I know how hard it is to teach a salesperson what is good/bad about a product, especially compared to the competitors. It really doesn’t matter how well you say it; they won’t understand and don’t care. Some of the info folders turned out okay, but some….. Man, I feel ashamed knowing that I’ve been involved in the process of making them.

My guess is that both nVidia’s and ATI’s tech teams regularly wonder when their sales departments sold their souls to the devil :rolleyes:
 

Unregistered

Guest
I agree, meant for the press

Beating a dead horse is what came to mind when I first read this, but without question this was meant for another audience. I am currently an nVidia owner, actually 3dfx prior (many moons ago), because I had several bad experiences with early ATI cards, the Xpert99 and such, with horrible driver support and performance. "HOWEVER" I have a few friends with 9800 Pros that made me drool back in the day and wiped nVidia all over the place, a la the 5900 series. Really, I find it refreshing that both of these companies can catch up to one another year after year, forcing each one to top the other, only benefiting the customer in the end. Now if AMD and Intel were on equal marketing grounds, I bet you would find that 4 GHz might already be here. Marketing and competition has its obvious ugly side, but it causes roadblocks to open unexpectedly.
 

Unregistered

Guest
unregistered said:
ATI ==> The best
NVidia ==> Wrong propaganda


ATI ==> Top Dog sometimes

Nvidia ==> Top Dog sometimes

I could give a rat's hairy ballbag about branding. And I'm sure as hell not going to take the manufacturer's word for who has the best product.
 