Wednesday, December 10th 2008

NVIDIA GeForce GTX 295 Spotted

VR-Zone scored the first photo of the upcoming GeForce GTX 295, a card that's reported to make its first appearance at next year's Consumer Electronics Show (CES). Contrary to previous reports, the card features a sandwich design, like most dual-GPU cards released by NVIDIA. Two 55nm GT200 GPUs are incorporated on this card. From the picture we can also see two DVI ports and one DisplayPort. The source also reports the card uses an 8-pin + 6-pin connector combo to deliver external power. Pricing is yet to be disclosed, but card makers are speculating that NVIDIA will price it competitively against AMD's Radeon HD 4870 X2.
Source: VR-Zone

96 Comments on NVIDIA GeForce GTX 295 Spotted

#51
SystemViper
cdawall: Look at his picture, is that a 3rd PCB? And I found the quad SLI connector; they just don't have it as a plug on the PCB yet.
If that is anything like the 9800GX2, there are two of those flat ribbon cables connecting both boards, so it looks like someone just forgot to connect the second cable to the PCB.
#52
MrMilli
Solaris17: I think they will; the cost of making it won't be too much, as they know that wouldn't be cost-effective. No, I think everything is already in place. It's basically going to be the same design as the GX2; I can tell you that just from looking at that pic and remembering what mine looks like taken apart. The only thing they're going to do is move a couple of capacitors to fit the bigger core.
This thing will be twice as expensive to produce compared to a GX2, but they will sell it at the same price, I guess. NVIDIA knows it won't be cost-effective, but they just want to be the fastest again.
#53
cdawall
where the hell are my stars
SystemViper: If that is anything like the 9800GX2, there are two of those flat ribbon cables connecting both boards, so it looks like someone just forgot to connect the second cable to the PCB.
That's what I thought, but looking at their design there is no spot for it.
#54
DarkMatter
lemonadesoda: Where's the debate? Did you declare it over? I hadn't realised you had been promoted to the discussion police.

This 295 monster isn't an evolutionary feat. It's a Frankenstein monster. From its design it's clear it will suffer from heat problems (the GPUs are trapped between the PCBs), limiting aftermarket coolers and OCability.

It's also going to be power hungry.

IMO, two GTX 260s (55nm refresh) in SLI are a superior combination. They allow greater flexibility, and the heat distribution will probably allow for higher OCs and the use of larger, quieter coolers.

Woot! This is the fastest card? It will rock? Meh. This is mutton dressed as lamb :banghead:
Someone has to fight the increasing load of BS in these forums. Like everything in your post. BS. The same BS that was said again and again about the 9800GX2. The dual-PCB solution is BY FAR a better solution than the one in the X2 in terms of cooling. More expensive, but for what a cooling solution costs ($15 cheap, $25 expensive), that makes a card sell for $510 instead of $500; woohoo, big deal! FAR BETTER. It's not only better in theory: the actual cooling performance of the GX2 compared to the X2 proved it to be much better.

Trapped between PCBs... Yeah, because the chips in the X2 aren't trapped between the PCB and the plastic shroud that sits at exactly the same distance, right?? What's worse is that in the X2 the second GPU receives hot and dirty* air, and because of that it runs much, much hotter than the first one. With two PCBs, both GPUs receive fresh, clean air and are cooled much better. The fan keeps moving the air, so it doesn't matter how hot the air gets just because there are two GPUs in the same space. The pace at which a fan moves the air does not depend on how hot the air is in one place.

* I don't know the right word. Dirty air = convoluted, turbulent air. It's the worst enemy of air cooling; that's why cable management is so important.

The 9800GX2 overclocked wonderfully, better than the X2s in fact. Assuming the GTX 295 will be hot and won't overclock well because it follows the same foundations as the GX2, when the GX2 was cooler and OCed better than ATI's X2 cards, is stupid.

Power consumption: the GTX 260 consumes much less than the HD 4870, so why in hell would a 55nm GT200 consume more than the X2? It simply won't, and that's the card you have to compare it against. It will be power hungry, of course, but considering what it will compete against, even mentioning the fact means you're trying to say it will consume more, which is FALSE, or in the best case for your defense, UNKNOWN.
#55
SystemViper
I loved my 9800GX2; that thing cranked out over 20k in 3DMark06 without much coaxing.
#56
phanbuey
TRIPTEX_MTL: Also, that was then... lol

If the claims ATI has made in the release notes for their latest drivers hold true, the lead the GTX 260 216 has just got much smaller, if not removed. This is all speculation, but they are claiming up to a 57% increase in FC2 on CrossFire systems, among other things.
:roll: It was... it was then? I don't doubt it... I think a lot of CrossFire issues are caused by lag in AMD's driver department (which is located in the basement of the building, and their boss only goes down there to steal their red stapler).
#57
Binge
Overclocking Surrealism
DarkMatter: Someone has to fight the increasing load of BS in these forums. Like everything in your post. BS. The same BS that was said again and again about the 9800GX2. The dual-PCB solution is BY FAR a better solution than the one in the X2 in terms of cooling. More expensive, but for what a cooling solution costs ($15 cheap, $25 expensive), that makes a card sell for $510 instead of $500; woohoo, big deal! FAR BETTER. It's not only better in theory: the actual cooling performance of the GX2 compared to the X2 proved it to be much better.

Trapped between PCBs... Yeah, because the chips in the X2 aren't trapped between the PCB and the plastic shroud that sits at exactly the same distance, right?? What's worse is that in the X2 the second GPU receives hot and dirty* air, and because of that it runs much, much hotter than the first one. With two PCBs, both GPUs receive fresh, clean air and are cooled much better. The fan keeps moving the air, so it doesn't matter how hot the air gets just because there are two GPUs in the same space. The pace at which a fan moves the air does not depend on how hot the air is in one place.

* I don't know the right word. Dirty air = convoluted, turbulent air. It's the worst enemy of air cooling; that's why cable management is so important.

The 9800GX2 overclocked wonderfully, better than the X2s in fact. Assuming the GTX 295 will be hot and won't overclock well because it follows the same foundations as the GX2, when the GX2 was cooler and OCed better than ATI's X2 cards, is stupid.

Power consumption: the GTX 260 consumes much less than the HD 4870, so why in hell would a 55nm GT200 consume more than the X2? It simply won't, and that's the card you have to compare it against. It will be power hungry, of course, but considering what it will compete against, even mentioning the fact means you're trying to say it will consume more, which is FALSE, or in the best case for your defense, UNKNOWN.
1. You've owned a 4870 X2? Mine, ON AIR with the STOCK COOLER, got better temps than my GTX 260.
2. Whoever said this won't overclock is a moron, because NVIDIA will cripple the stock speeds to allow massive headroom. It makes them look better.
3. The GTX 260 does not consume less power than a correctly BIOS-modded 4870.
4. This is all speculation, and so is your retort.
5. Nobody interested in benchmarking/performance cares about the environment.
6. The thing that got NVIDIA into trouble with their sales was sticking to old designs and not pushing into the new era. Why not really turn the tables and put a cooler on both sides of the card? Wait, that'd kill slot SLI... lame. Still, at least I'm THINKING.
7. A lot of what you've said about the 260 implies you have actual knowledge of the subject. Have you owned one?
8. Don't point fingers and call BS when I see a ton of flaws in a number of your statements.
9. I'm 100% behind the statement "Don't knock it until you've tried it," and I honestly can't see dual-GPU cards beating SLI/CrossFire these days, when the synthetic and real-world tests prove that two single cards with a bridge clip are less energy-efficient but produce better results. Go ahead, ask Fitseries, dark2099, and a bunch of other people here.
#60
DarkMatter
Binge: 1. You've owned a 4870 X2? Mine, ON AIR with the STOCK COOLER, got better temps than my GTX 260.
2. Whoever said this won't overclock is a moron, because NVIDIA will cripple the stock speeds to allow massive headroom. It makes them look better.
3. The GTX 260 does not consume less power than a correctly BIOS-modded 4870.
4. This is all speculation, and so is your retort.
5. Nobody interested in benchmarking/performance cares about the environment.
6. The thing that got NVIDIA into trouble with their sales was sticking to old designs and not pushing into the new era. Why not really turn the tables and put a cooler on both sides of the card? Wait, that'd kill slot SLI... lame. Still, at least I'm THINKING.
7. A lot of what you've said about the 260 implies you have actual knowledge of the subject. Have you owned one?
8. Don't point fingers and call BS when I see a ton of flaws in a number of your statements.
9. I'm 100% behind the statement "Don't knock it until you've tried it," and I honestly can't see dual-GPU cards beating SLI/CrossFire these days, when the synthetic and real-world tests prove that two single cards with a bridge clip are less energy-efficient but produce better results. Go ahead, ask Fitseries, dark2099, and a bunch of other people here.
1- Well, that goes against nature; every benchmark out there proves the contrary. Ah, only on one GPU, as I said, but that's enough to cripple OCability.
3- Irrelevant. Stock is what matters; very few people will mess with BIOSes. Anyway, if it's so easy, why hasn't ATI solved that in ANY of their newer cards?
5- Graphics cards are for gaming. I don't care what people like doing with them. I'm very interested in performance, but I'm also very concerned about the environment, AND especially the price. 100 W more on a card means about $100 more per year of operation (see the quick sanity check at the end of this post).
6- BS. Unless you are strictly talking about them sticking to 65nm.
7- I don't have to own every single card on the market. I have friends; they even live in the same city and everything!! Incredible!! One of them owns a small store 50 m from my home. He doesn't always have a card available that he doesn't have to sell, but many times he lets me play with some of the builds he has to assemble.
Whenever I can't test things myself or ask friends I can see and touch (not by phone, email, forums, etc.), I rely on reviews and benchmarks, mostly W1zzard's. With the increased load of BS in forums in general, I don't believe anything a forum member says unless he has something to back up the info; I don't care who he is. If it's not someone with some kind of legal or public responsibility (like reviewers, who are exposed), I don't care what he has to say.
8- Be specific.
9- The GX2 was faster than 9800GTX SLI considering its clocks are 30% lower (EDIT: sorry, 23%; I compared it against my brother's OC'd 9800). Faster clock for clock, that is. I don't have to ask to know SLI/CrossFire is usually faster than dual-GPU cards; I know. I didn't say the contrary.
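A quick sanity check of the arithmetic in point 5 above. This is a sketch under my own assumptions (a card running 24/7 and electricity at roughly $0.11/kWh; neither figure comes from the thread):

# Rough check of the "100 W more = ~$100/year" claim in point 5.
# Assumptions (mine, not from the thread): 24/7 operation, ~$0.11/kWh.
extra_watts = 100
hours_per_year = 24 * 365                          # 8760 h
rate_usd_per_kwh = 0.11                            # assumed electricity price

extra_kwh = extra_watts * hours_per_year / 1000    # 876 kWh per year
annual_cost = extra_kwh * rate_usd_per_kwh         # ~ $96 per year

print(f"{extra_kwh:.0f} kWh/year -> ~${annual_cost:.0f}/year")

So the round $100/year figure only holds for a machine that runs flat out around the clock; at a few hours of gaming per day the extra cost would be proportionally lower.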
#61
Binge
Overclocking Surrealism
1. On review sites they do not tweak the fan controls of either card, making them both suck.
3. I don't care what the average person does with their card, because -I- have the ability to search on Google and find a number of like-minded people who are willing to put 20 minutes into solving a problem before sitting down and spanking off to Crysis.
5. See 3.
6. You're still living in the past along with NVIDIA's process... I'm talking about 65nm, using an IHS to keep noobs from crushing their GPUs, and their design.
7. Running a few tests with borrowed cards, without the real time to play around with them to maximize performance, is a really silly way to respond to me when I've stated that I believe in thorough experience with all of these new cards.
8. Give me a break... I recognized and responded to your whole statement in a point-by-point analysis/breakdown. I was very specific.
9. 9800s may not SLI as well as the GTX 260. In fact, I'm pretty sure they don't.
#62
DarkMatter
Binge: 1. On review sites they do not tweak the fan controls of either card, making them both suck.
3. I don't care what the average person does with their card, because -I- have the ability to search on Google and find a number of like-minded people who are willing to put 20 minutes into solving a problem before sitting down and spanking off to Crysis.
5. See 3.
6. You're still living in the past along with NVIDIA's process... I'm talking about 65nm, using an IHS to keep noobs from crushing their GPUs, and their design.
7. Running a few tests with borrowed cards, without the real time to play around with them to maximize performance, is a really silly way to respond to me when I've stated that I believe in thorough experience with all of these new cards.
8. Give me a break... I recognized and responded to your whole statement in a point-by-point analysis/breakdown. I was very specific.
9. 9800s may not SLI as well as the GTX 260. In fact, I'm pretty sure they don't.
I find this way of posting funny, haha:

1. 3. 5. When speaking about how a card IS, how it ACTUALLY is for 95% of people is what matters. I don't care if you managed to make your Honda Civic faster than that other guy's Ferrari; the Ferrari IS faster.

6. As I said, 65nm, yes. Everything else, BS. Are you a GPU engineer? How do you know those things are REALLY of the past? You know, we still use wheels, because they are still the best solution. CHANGE != evolution. E.g., RV670's ring bus was the OMFGL33T revolution, until copying NVIDIA in RV770 became the revolution now. Funny.

7. Running a few tests (4-6 hours with each piece of hardware, full review runs: 3DMark06, Crysis, COD4, BioShock, UT3, TF2...) that bring results similar to the ones in reviews, yes, I think that's enough to get an idea. I don't do this when the cards are brand new, so I already use the settings that worked best for others. I only make one run per bench, just to check, rather than investigating everything myself. It has always been consistent with reviews so far, so I think I'll keep believing in my "method," thank you very much. If something needs more than 5 hours to fix or to find a solution for, see point 1.

9. AND because of that the GTX 295 can be much faster too. The point is: who knows? It's not me saying the card WILL perform; I've been speculating all along that it CAN perform, in response to the guy who assured us it won't. There's a VERY BIG difference. And it is you arguing it won't, again.
#63
Binge
Overclocking Surrealism
DarkMatter: I find this way of posting funny, haha:

1. 3. 5. When speaking about how a card IS, how it ACTUALLY is for 95% of people is what matters. I don't care if you managed to make your Honda Civic faster than that other guy's Ferrari; the Ferrari IS faster.

6. As I said, 65nm, yes. Everything else, BS. Are you a GPU engineer? How do you know those things are REALLY of the past? You know, we still use wheels, because they are still the best solution. CHANGE != evolution. E.g., RV670's ring bus was the OMFGL33T revolution, until copying NVIDIA in RV770 became the revolution now. Funny.

7. Running a few tests (4-6 hours with each piece of hardware, full review runs: 3DMark06, Crysis, COD4, BioShock, UT3, TF2...) that bring results similar to the ones in reviews, yes, I think that's enough to get an idea. I don't do this when the cards are brand new, so I already use the settings that worked best for others. I only make one run per bench, just to check, rather than investigating everything myself. It has always been consistent with reviews so far, so I think I'll keep believing in my "method," thank you very much. If something needs more than 5 hours to fix or to find a solution for, see point 1.

9. AND because of that the GTX 295 can be much faster too. The point is: who knows? It's not me saying the card WILL perform; I've been speculating all along that it CAN perform, in response to the guy who assured us it won't. There's a VERY BIG difference. And it is you arguing it won't, again.
1 & 7. Just to be fair, I'm comparing cars like Mustangs to Ferraris here... not an everyday four-banger to a high-end horsepower machine. Be fair. In terms of graphics, both are high-end cards. That is a poor analogy, as I don't need to gut my graphics card to make it run faster. Please be fair to the enthusiast here as well. Some of us live, breathe, and eat tech for breakfast, and if it came completely bundled with all the answers and no mystery, then we would be out of a hobby. Just because you're unwilling to make something work doesn't mean someone else won't, and do it better than your lazy solution.

6. My father has worked for ATI. I go to him with a lot of questions and random info, and we both love to look at new tech. That aside, I don't need to be an engineer to know that the IHS on the GTX 200 series of cards is causing ALL of the heat issues. If they didn't have it, they would be pwning ATI's little arse in terms of heat. Their design is last-generation~ The difference between the core architectures of the RV6xx and RV7xx is insane.

9. I addressed that in point 4 a few posts back. This is all speculation, but recent history leads me to believe that the power/architecture restrictions on dual-GPU cards keep them from reaching as high a potential as two single cards in SLI/CrossFire.
#64
DarkMatter
Binge: 1 & 7. Just to be fair, I'm comparing cars like Mustangs to Ferraris here... not an everyday four-banger to a high-end horsepower machine. Be fair. In terms of graphics, both are high-end cards. That is a poor analogy, as I don't need to gut my graphics card to make it run faster. Please be fair to the enthusiast here as well. Some of us live, breathe, and eat tech for breakfast, and if it came completely bundled with all the answers and no mystery, then we would be out of a hobby. Just because you're unwilling to make something work doesn't mean someone else won't, and do it better than your lazy solution.

6. My father has worked for ATI. I go to him with a lot of questions and random info, and we both love to look at new tech. That aside, I don't need to be an engineer to know that the IHS on the GTX 200 series of cards is causing ALL of the heat issues. If they didn't have it, they would be pwning ATI's little arse in terms of heat. Their design is last-generation~ The difference between the core architectures of the RV6xx and RV7xx is insane.

9. I addressed that in point 4 a few posts back. This is all speculation, but recent history leads me to believe that the power/architecture restrictions on dual-GPU cards keep them from reaching as high a potential as two single cards in SLI/CrossFire.
1. I just wanted to be sure the point was caught. But I emphasize again that it's important, if you really want to be fair, to say things as they are and as they will be for most people (you can always add "but when..." after that). Enthusiasts don't need to be told what is faster than what and when; they learn it themselves. Newbies who come to forums like this every time they need to buy their next card with their hard-earned money (or 18 months of mommy's pay) do. For every enthusiast who posts here, 100 visitors come in and read what we post. Those guys won't mod, probably won't even OC, so saying xxxx is better based on anything but what they can buy in stores is misleading, and because they know nothing, they will follow what's said here like the Bible. You want to be fair? BE fair then.

6. I studied two years of engineering; that doesn't make me an expert, but I do read a lot too. So I guess it just boils down to who has the bigger penis? There's no need, IMHO. When I keep saying that I'm not sure of anything, since nothing is certain, I think that's the right point of view. On the other hand, it is you and lemonadesoda saying HOW things ARE GOING to happen. Unless you have a crystal ball, you shouldn't have replied in the first place, because I just pointed out that we know nothing yet.

On the other hand, given that your father worked for ATI, it's no surprise that you think what you think. It wouldn't matter if he was the lead engineer; you'd still have been told only half the story. That's where the key is. ATI believes in one way of doing things, NVIDIA in the completely opposite one. Both have some of the best experts in the world. So the ABSOLUTE truth is that NVIDIA is wrong? Again, you were only told half the story, or better said, you just decided to believe half the story. I read both and I believe in both.

About the IHS: as you said, it's there so noobs don't break the card, more than anything, IMO. And it's not something to joke about; many friends of mine broke their chips when changing the cooler. What are you going to say? That they were noobs and shouldn't have touched anything? They were noobs, in fact, but they did take a lot of care. If you can't see the value an IHS has for the masses, then your point 1 doesn't make sense at all; you are reducing even more the installed base of people who would run something other than stock. Oh, and what heat issues????
#65
PCpraiser100
OMG, it's the 7900GX2's successor!!!!!! ROFL!!!
#66
SteelSix
AsRock: Yeah, that's a 3rd PCB, but it's only about 1 inch long lol.
Looks like aluminium to me, part of the cooler..
#67
Binge
Overclocking Surrealism
DarkMatter: 1. I just wanted to be sure the point was caught. But I emphasize again that it's important, if you really want to be fair, to say things as they are and as they will be for most people (you can always add "but when..." after that). Enthusiasts don't need to be told what is faster than what and when; they learn it themselves. Newbies who come to forums like this every time they need to buy their next card with their hard-earned money (or 18 months of mommy's pay) do. For every enthusiast who posts here, 100 visitors come in and read what we post. Those guys won't mod, probably won't even OC, so saying xxxx is better based on anything but what they can buy in stores is misleading, and because they know nothing, they will follow what's said here like the Bible. You want to be fair? BE fair then.

6. I studied two years of engineering; that doesn't make me an expert, but I do read a lot too. So I guess it just boils down to who has the bigger penis? There's no need, IMHO. When I keep saying that I'm not sure of anything, since nothing is certain, I think that's the right point of view. On the other hand, it is you and lemonadesoda saying HOW things ARE GOING to happen. Unless you have a crystal ball, you shouldn't have replied in the first place, because I just pointed out that we know nothing yet.

On the other hand, given that your father worked for ATI, it's no surprise that you think what you think. It wouldn't matter if he was the lead engineer; you'd still have been told only half the story. That's where the key is. ATI believes in one way of doing things, NVIDIA in the completely opposite one. Both have some of the best experts in the world. So the ABSOLUTE truth is that NVIDIA is wrong? Again, you were only told half the story, or better said, you just decided to believe half the story. I read both and I believe in both.

About the IHS: as you said, it's there so noobs don't break the card, more than anything, IMO. And it's not something to joke about; many friends of mine broke their chips when changing the cooler. What are you going to say? That they were noobs and shouldn't have touched anything? They were noobs, in fact, but they did take a lot of care. If you can't see the value an IHS has for the masses, then your point 1 doesn't make sense at all; you are reducing even more the installed base of people who would run something other than stock. Oh, and what heat issues????
Whoa whoa whoa... My father has worked for a bunch of companies; don't get the wrong impression. He worked with them back when the Rage 128 was being put together. I'm using that as a reference to the sources I have for information. If you're saying you want a completely fair comparison, then a 4870 will wax and stomp the floor with the 260s on the market running stock speeds. Give the general population of gamers a bit more respect. Not all of them, myself included, suck on mom's teat and buy a card only to learn nothing about it.

What heat issues??? Just Google "GTX280 IHS". There you'll find people in strange discussions talking about how their GTX 280 is overheating to 113°C because of incorrect contact with the IHS, causing an insulating effect on the GPU. ATI does not have an IHS on their new chips, and they are sold to the masses. Get a grip... I'm sorry your friends broke their chips. There are unfortunate accidents every day, but that's no reason to put another thermal barrier between the die and the cooler.
#68
Bjorn_Of_Iceland
SteelSix: Looks like aluminium to me, part of the cooler..
Yep, it's part of the cooler. Check the other side.
#69
DarkMatter
Binge: Whoa whoa whoa... My father has worked for a bunch of companies; don't get the wrong impression. He worked with them back when the Rage 128 was being put together. I'm using that as a reference to the sources I have for information. If you're saying you want a completely fair comparison, then a 4870 will wax and stomp the floor with the 260s on the market running stock speeds. Give the general population of gamers a bit more respect. Not all of them, myself included, suck on mom's teat and buy a card only to learn nothing about it.

What heat issues??? Just Google "GTX280 IHS". There you'll find people in strange discussions talking about how their GTX 280 is overheating to 113°C because of incorrect contact with the IHS, causing an insulating effect on the GPU. ATI does not have an IHS on their new chips, and they are sold to the masses. Get a grip... I'm sorry your friends broke their chips. There are unfortunate accidents every day, but that's no reason to put another thermal barrier between the die and the cooler.
Ha! Don't try to fool anyone; you won't. Search for HD 4870 or HD 4850 overheating issues, because there are many more of those. So why the problems? There's no IHS there. LOL. What a valid point you made, my friend. Anyway, which performance card is free of overheating samples nowadays?

I didn't understand the first paragraph well. Are you saying the HD 4870 stomps the floor with GTX 260s?? :roll: I wouldn't call a 2% difference stomping... www.techpowerup.com/reviews/Leadtek/GeForce_GTX_260_Extreme_Plus/26.html

DON'T look at the Leadtek card and come back with the typical claim that it's overclocked, etc. Look at the OTHER GTX 260, and at higher resolutions if you want to see the HD 4870 ahead. Aaaand... that's right, 2%, woohoo, ATI OWNS. WOOOAAaaaaHHhhHH!!

And the thing is that noobs can buy factory-overclocked NVIDIA cards for the same price as those with stock clocks (that Leadtek, for example). Same with ATI, don't get me wrong, but the usual OCs on NVIDIA cards are 15%, 20%, etc., while ATI cards rarely surpass the 10% mark.

Man, you just lost my respect. We were having a good discussion, but you ruined it for me with that one. Stomp. :shadedshu

Anyway, a little more on topic:

www.guru3d.com/article/core-i7-multigpu-sli-crossfire-game-performance-review/6

I was curious how the GTX 260 and HD 4870 do in dual-card setups, so I made a table in Excel with those results for a better/easier comparison. It will surprise you and many others, I'm sure.

In short, I was right. It's 19% faster overall in the newest games and up to 57% (:eek:) faster at 2560x1600. I'm sure the 8.12 drivers improve performance a bit, but until I see facts, that's what matters to me. Especially with ATI, which always "improves performance in newer driver releases," though I've seen driver comparisons throughout the year on many sites that proved that false as an overall improvement.
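For anyone who wants to redo that table themselves, here is a minimal sketch of the arithmetic behind a scaling comparison like the one above. The FPS values are made-up placeholders, not Guru3D's numbers; substitute the review's figures per game and resolution:

# Sketch of a dual-GPU scaling comparison (placeholder FPS values,
# NOT the Guru3D results referenced above).
def scaling_pct(single_fps, dual_fps):
    """Percentage gain from adding the second GPU."""
    return (dual_fps / single_fps - 1) * 100

gtx260_single, gtx260_sli = 45.0, 81.0   # hypothetical FPS
hd4870_single, hd4870_cf = 47.0, 78.0    # hypothetical FPS

print(f"GTX 260 SLI scaling: {scaling_pct(gtx260_single, gtx260_sli):.0f}%")
print(f"HD 4870 CF scaling:  {scaling_pct(hd4870_single, hd4870_cf):.0f}%")
print(f"SLI vs CF (dual):    {(gtx260_sli / hd4870_cf - 1) * 100:.0f}% faster")

Averaging those per-game, per-resolution percentages is how an overall figure like the 19% above would be obtained.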
#70
Binge
Overclocking Surrealism
Alright. You've hit a magic button, and I'm actually hurt. You've taken the point and twisted it so much that you're even agreeing with me on a number of points! Why are you attacking me in a way that makes you SEEM right in a non-existent argument?

I've never heard of anyone on these forums (not the schmucks you seem to cling to as your comparison for gamers/people who would use these cards) having an ATI card overheat on them at stock. Hell! One time, while I had my water-cooling setup on my graphics cards (4870s in CrossFire), the motor in my pump melted and the cards hit TJMax and shut my PC down, but it never killed the cards. After all was said and done, I reattached the air cooling and they ran perfectly until I could RMA the pump. No failure. Why attack me about ATI cards overheating when you KNOW it's a fan-speed fix away from good operating temperatures?

What I've been saying the WHOLE TIME is that scaling between CrossFire and SLI has become so good that single-slot dual-GPU solutions are less powerful than two single cards on their own. You won't focus on the point I'm making.

While I am dispelling some of the crap you're spewing about these cards without ever tweaking and maximizing a card yourself, I am not here to argue about reviews! I couldn't care less, because every review I've come across has been different. Guru3D.com has a ton of good reviews and charts showing which card is the biggest and baddest. The 4870 512 compares neck and neck with a GTX 260 192, but the GTX 260s always need a bit of an increase, as their stock clocks are 576/1242/2000. You would change the clocks and the fan speeds to get the most out of your card, so stop being a hypocrite. If someone can crank the crap out of two single cards and get higher numbers in frames or benches, that makes the dual-GPU card appealing only for restrictive PCI-E configurations. Give up the attitude and face facts. You're showing that SLI/CrossFire is improving because the motherboard is improving, and I'm seeing through forum members' results that it's gone beyond the bridge chip!

Do me a huge favor and stick to the point. I believe the GTX 295 will not hold up against 2x GTX 260 216 in clocks/temps/results.
#71
Solaris17
Super Dainty Moderator
Easy, boys... facts are facts, and you can dispute those freely, but all of this sarcasm and slighting is getting a little too edgy, and as such you're both getting closer to severing your necks... so let's try to keep an amazing discussion a little more civil, kk?
#72
DarkMatter
Binge: Alright. You've hit a magic button, and I'm actually hurt. You've taken the point and twisted it so much that you're even agreeing with me on a number of points! Why are you attacking me in a way that makes you SEEM right in a non-existent argument?

I've never heard of anyone on these forums (not the schmucks you seem to cling to as your comparison for gamers/people who would use these cards) having an ATI card overheat on them at stock. Hell! One time, while I had my water-cooling setup on my graphics cards (4870s in CrossFire), the motor in my pump melted and the cards hit TJMax and shut my PC down, but it never killed the cards. After all was said and done, I reattached the air cooling and they ran perfectly until I could RMA the pump. No failure. Why attack me about ATI cards overheating when you KNOW it's a fan-speed fix away from good operating temperatures?

What I've been saying the WHOLE TIME is that scaling between CrossFire and SLI has become so good that single-slot dual-GPU solutions are less powerful than two single cards on their own. You won't focus on the point I'm making.

While I am dispelling some of the crap you're spewing about these cards without ever tweaking and maximizing a card yourself, I am not here to argue about reviews! I couldn't care less, because every review I've come across has been different. Guru3D.com has a ton of good reviews and charts showing which card is the biggest and baddest. The 4870 512 compares neck and neck with a GTX 260 192, but the GTX 260s always need a bit of an increase, as their stock clocks are 576/1242/2000. You would change the clocks and the fan speeds to get the most out of your card, so stop being a hypocrite. If someone can crank the crap out of two single cards and get higher numbers in frames or benches, that makes the dual-GPU card appealing only for restrictive PCI-E configurations. Give up the attitude and face facts. You're showing that SLI/CrossFire is improving because the motherboard is improving, and I'm seeing through forum members' results that it's gone beyond the bridge chip!

Do me a huge favor and stick to the point. I believe the GTX 295 will not hold up against 2x GTX 260 216 in clocks/temps/results.
And as I told you in my first reply, I KNOW it probably won't be faster!! It's a shame you're unable to read, if that's all you're arguing. It doesn't matter if it isn't; this is not a thread about GTX 260 SLI, it's about the GTX 295. 2x HD 4870 are faster than a single X2 (not always, though) and way cheaper right now, but that doesn't mean the X2 isn't a worthy card, does it?? As I showed above, the GTX 260 scales better, which goes directly against the opinions that, first, CrossFire scales better and, second, that the GTX 295 WILL be slower. Maybe yes, maybe no, but with the facts we already have, the GTX 295 has every chance of being the much better card. Assuming the dual card won't scale as well relative to what the X2 is capable of is stupid. It doesn't need to be faster than GTX 260 SLI, just the X2; that's what I've been arguing all along. So far we know:

1- The chip in its 65nm incarnation already scales better in dual-card configs, beyond what the X2 and RV770 CrossFire can do, while consuming less.
2- News/rumors from respectable sources say that a 55nm, 240 SP, GT200-based Quadro card with 4 GB of RAM running at 650 MHz has a 160 W TDP, compared to 234 W for its 65nm daddy or 183 W for the GTX 260.
3- In the past generation, both vendors' dual cards were "close" to their paired single cards, but the GX2 was underclocked (600 vs. 675) and the X2 was clocked above its single-card cousin (825 vs. 775), which suggests the G92 already scaled better in the GX2 than the X2's chip did, AT A TIME when CrossFire by itself scaled much better than SLI.

With those precedents, the GTX 295 has every chance of being a fast, cool, and not-so-power-hungry card. Deliberately ignoring those facts, lemonade made his BS comments and you followed suit. I responded.

It doesn't matter if a multi-card solution is faster; it always has been and always will be (until they make a dual-GPU card viewable as a single card with a single frame buffer), but a single-card solution, even a dual-GPU one, has a lot of benefits and definitely deserves to exist. No more, no less: CrossFired HD 4850s are as fast as the X2 and far cheaper, but many people here, who must know that fact, chose to go with the X2. 2x GTX 260 are probably going to be faster, but the GTX 295 has a place in the market, right next to the X2, and it will probably be better on every front, as I explained. So why in hell does the X2 have the right to exist but not the 295? That's what I've been arguing. You should have read more carefully, if everything you were replying about was that 2x GTX 260 will still be faster; I said as much in my first reply after the one calling BS on lemonade. What a waste of time. Given the facts and the rumors, though, I have no doubt it will be both cooler and consume less than 2x GTX 260.

About the overheating issues: I don't care what you've heard or not; there have been a lot of them in these forums, and you will find plenty just by Googling. Prior to your post I had never heard of that problem with the GTX 280s, and God knows there are far fewer hits when you Google it than with the RV770. I guess that means we're tied. I know you seem to think you're the keeper of the ABSOLUTE TRUTH, but that's not the case. Just as with the opinion that NVIDIA is stuck in the past, you are not right here. The RV770 has as many overheating issues as the GT200 and has no IHS, so couldn't it be that, because of the increasing processing power, EVERY new generation of cards suffers from some overheating samples, and we don't need to find a culprit???? I ask.

EDIT: en.expreview.com/2008/12/09/geforce-gtx295-with-480sp-surely-to-come-in-ces-09.html

Expreview may not be the best source, but they've been pretty accurate lately. A 289 W TDP, so much less than 2x GTX 260 or the HD 4870 X2.
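A quick back-of-the-envelope check of the first half of that comparison, using only the TDP figures already quoted in this thread (183 W per GTX 260 from point 2 above, plus the rumored 289 W); treat it as a sketch, since the 289 W figure is itself just a rumor:

# TDP comparison using only figures quoted in this thread.
gtx260_tdp = 183             # W per card, quoted in point 2 above
gtx295_tdp_rumored = 289     # W, Expreview rumor

sli_tdp = 2 * gtx260_tdp     # 366 W for two discrete GTX 260s
savings = sli_tdp - gtx295_tdp_rumored

print(f"2x GTX 260: {sli_tdp} W vs GTX 295: {gtx295_tdp_rumored} W "
      f"(~{savings} W less, if the rumor holds)")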
#73
Crazy Buddhist
Two PCBs, why?
GFC: How can it have less power? It's built on a 55nm process; unless they down-clock it because of the heat, it's going to be faster.
The main concern for me is: why the hell did they use a dual-PCB setup like last time? Because of that, there's barely any place for the fan, which means it's going to be a really hot card.
Maybe the dual PCBs act as two sides of a tunnel; if you consider the fan placement and venting on these cards, the air is ducted past all the components on the two boards, and the casing provides the rest of the tunnel enclosure. It seems to be a design that runs cool enough.

CB
#74
Solaris17
Super Dainty Moderator
AsRock: Yeah, that's a 3rd PCB, but it's only about 1 inch long lol.
Haha, yes, the 1-inch-long DVI connector side.
cdawall: That's what I thought, but looking at their design there is no spot for it.
There is; they just didn't connect it. The connector for that ribbon is under the PCB.
Crazy Buddhist: Maybe the dual PCBs act as two sides of a tunnel; if you consider the fan placement and venting on these cards, the air is ducted past all the components on the two boards, and the casing provides the rest of the tunnel enclosure. It seems to be a design that runs cool enough.

CB
That's exactly what it does. However, as for "cool," I have some bad news: my cores load at ~91°C. But I have two GX2s side by side, so I suppose that might have a lot to do with it.
#75
Solaris17
Super Dainty Moderator
I drew up how they're supposed to connect, for you guys wondering or not getting it.
