Wednesday, December 10th 2008
NVIDIA GeForce GTX 295 Spotted
VR-Zone has scored the first photo of the upcoming GeForce GTX 295, a card that's reported to make its first appearance at next year's Consumer Electronics Show (CES). Contrary to previous reports, the card features a sandwich design, like most dual-GPU cards released by NVIDIA. Two 55nm GT200 GPUs are incorporated in this card. From the picture we can also see two DVI ports and one DisplayPort. The source also reports the card uses an 8+6 pin connector combo to deliver external power. The pricing is yet to be disclosed, but card makers are speculating that NVIDIA will price it competitively against AMD's HD 4870 X2.
Source:
VR-Zone
96 Comments on NVIDIA GeForce GTX 295 Spotted
Trapped between PCBs... Yeah, because the chips in the X2 aren't trapped between the PCB and a plastic shroud sitting at exactly the same distance, right?? What's worse is that in the X2 the second GPU receives hot and dirty* air, and because of that it runs much, much hotter than the first one. With two PCBs both GPUs receive fresh, clean air and are cooled much better. The fan keeps the air moving, so it doesn't matter that there are two GPUs in the same space; the rate at which a fan moves air doesn't depend on how hot the air is in any one spot.
* I don't know the exact term. Dirty air = turbulent, swirling air. It's the worst enemy of air cooling; that's why cable management is so important.
The 9800GX2 overclocked wonderfully, better than the X2 in fact. Assuming the GTX295 will run hot and won't overclock well just because it follows the same foundation as the GX2, when the GX2 was actually cooler and overclocked better than ATI's X2 cards, is stupid.
Power consumption: the GTX260 consumes much less than the HD4870, so why in hell would a 55nm GT200 consume more than the X2? It simply won't, and the X2 is the card you have to compare it to. It will be power hungry, of course, but considering what it will compete against, even mentioning the issue means you're trying to say it will consume more, which is FALSE, or in the best case for your defense, UNKNOWN.
2. Whoever said this won't overclock is a moron, because nVidia will hold back the stock speeds to leave massive headroom. It makes them look better.
3. The GTX260 does not consume less power than a correctly BIOS-modded 4870.
4. This is all speculation, and so is your retort.
5. Nobody interested in benchmarking/performance cares about the environment.
6. The thing that got nVidia into trouble with their sales was sticking to old designs and not pushing into the new era. Why not really turn the tables and put a cooler on both sides of the card? Wait, that'd kill slot SLI... lame. Still, at least I'm THINKING.
7. A lot of what you've said about the 260 means that you have actual knowledge on the subject. Have you owned one?
8. Don't point fingers and call BS when I see a ton of flaws in a number of your statements.
9. I'm going 100% behind the statement "Don't knock it until you've tried it," and I honestly can't see dual card solutions beating SLI/Crossfire these days when the synthetic and real world tests prove having two single cards with a bridge clip is less energy efficient but produces better results. Go ahead, ask Fitseries, dark2099, and a bunch of other people here.
3- Irrelevant. It's stock performance that matters; very few people will mess with BIOSes. Anyway, if it's so easy, why hasn't ATI fixed that in ANY of their newer cards????
5- Graphics cards are for gaming; I don't care what else people like doing with them. I'm very interested in performance, but I'm also very concerned about the environment AND especially the price. 100 W more on a card means roughly $100 more per year of operation (see the rough calculation at the end of this post).
6- BS. Unless you are strictly talking about them sticking to 65nm.
7- I don't have to own every single card on the market. I have friends, and they even live in the same city and everything!! Incredible!! One of them owns a small store 50 m away from my home. He doesn't always have a spare card on hand that he doesn't need to sell, but he often lets me play with some of the builds he has to assemble.
Whenever I can't test things myself or ask friends I can see and touch (not by phone, email, forums, etc.) I rely on reviews and benchmarks, mostly Wizzard's. With the increased load of BS in forums in general, I don't believe anything a forum member says unless he has something to back the info up, and I don't care who he is. If it's not someone with some kind of legal or public responsibility (like reviewers, who are exposed), I don't care what he has to say.
8- Be specific.
9- The GX2 was faster than 9800GTX SLI considering its clocks are 30% lower (EDIT: sorry, 23%; I compared it to my brother's OC'd 9800). Faster clock for clock, that is. I don't have to ask anyone to know SLI/Crossfire is usually faster than dual cards; I know, and I didn't say otherwise.
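To put a rough number behind point 5: the "100 W ≈ $100/year" figure only works under assumptions I'm adding here, namely the card running 24/7 and electricity at about $0.11 per kWh; adjust both for your own situation. A quick back-of-the-envelope sketch in Python:

```python
# Rough check of the "100 W more = ~$100/year" claim.
# ASSUMPTIONS (mine, not hard numbers): 24/7 operation, ~$0.11 per kWh.
extra_watts = 100
hours_per_year = 24 * 365                         # 8760 h
price_per_kwh = 0.11                              # USD, assumed

extra_kwh = extra_watts / 1000 * hours_per_year   # 876 kWh
extra_cost = extra_kwh * price_per_kwh            # ~96 USD
print(f"{extra_kwh:.0f} kWh/year -> roughly ${extra_cost:.0f}/year extra")
```

With fewer hours of use per day or cheaper electricity the figure drops proportionally, so treat it as an upper-bound estimate rather than a fixed cost.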
3. I don't care about what the average person does with their card because -I- have the ability to search on Google and find a number of like-minded people who are willing to put 20 minutes into solving a problem before sitting down and spanking off to Crysis.
5. See 3.
6. You're still living in the past along with nVidia's process... I'm talking about 65nm, using an IHS to keep noobs from crushing their GPUs, and their overall design.
7. Running a few tests with borrowed cards, without the real time to play around and maximize performance, is a really silly way to respond to me when I've stated that I believe in thorough experience with all of these new cards.
8. Give me a break... I recognized and responded to your whole statement in a point by point analysis/breakdown. I was very specific.
9. 9800s may not SLI as well as GTX260. In fact I'm pretty sure they don't SLI as well as GTX260.
1. 3. 5. When talking about how a card IS, what ACTUALLY matters is how it is for 95% of people. I don't care if you managed to make your Honda Civic faster than that other guy's Ferrari; the Ferrari IS faster.
6. As I said, 65nm yes; everything else is BS. Are you a GPU engineer? How do you know those things are REALLY of the past? You know, we still use wheels, because they're still the best solution. CHANGE != evolution. e.g. RV670's ring bus was the OMFGL33T revolution, until copying Nvidia with RV770 became the revolution. Funny.
7. Running a few tests (4-6 hours with each piece of hardware, full review runs: 3DMark06, Crysis, COD4, Bioshock, UT3, TF2...) that give results similar to those in the reviews, yes, I think that's enough to get an idea. I don't do this when the cards are brand new, so I already use the settings that worked best for others. I only make one run per bench, just as a sanity check, rather than exploring everything myself. It has been consistent with the reviews so far, so I think I'll keep believing in my "method", thank you very much. If something needs more than 5 hours to fix or find a solution for, see point 1.
9. AND because of that the GTX295 can be much faster too. The point is: who knows? It's not me saying the card WILL perform; I've been speculating all along that it CAN perform, in response to the guy who assured us it won't. There's a VERY BIG difference. And here you are again arguing that it won't.
6. My father has worked for ATi. I go to him with a lot of questions and random info, and we both love to look at new tech. That aside, I don't need to be an engineer to know that the IHS on the GTX200 series of cards is causing ALL of the heat issues. If they didn't have it, they would be pwning ATi's little arse in terms of heat. Their design is last generation~ The difference between the core architectures of RV6xx and RV7xx is insane.
9. I addressed that in point 4 a few posts back. This is all speculation, but recent history leads me to believe that the power/architecture restrictions on dual cards are keeping them from reaching as high a potential as two singular cards in SLI/Crossfire.
6. I studied two years of engineering; that doesn't make me an expert, but I do read a lot too. So I guess it just boils down to who has the bigger penis? There's no need, IMHO. I'm always saying that I'm not sure of anything, because nothing is certain, and I think that's the right point of view. It is you and lemonadesoda, on the other hand, saying HOW things ARE GOING to happen. Unless you have a crystal ball you shouldn't have replied in the first place, because I was just pointing out that we don't know anything.
On the other hand, given that your father worked for ATI, it's no surprise you think what you think. It wouldn't matter if he was the lead engineer; you'd still only have been told half the story. That's where the key is. ATI believes in one way of doing things, Nvidia in the completely opposite one. Both have some of the best experts in the world. So the ABSOLUTE truth is that Nvidia is wrong? Again, you were only told half the story, or better said, you just decided to believe half the story. I read both and I believe in both.
About the IHS: as you said, it's there more than anything so noobs don't break the card, IMO. And it's not something to joke about; several friends of mine broke their chips when changing the cooler. What are you going to say? That they were noobs and shouldn't have touched anything? They were noobs, in fact, but they did take a lot of care. If you can't see the value an IHS has for the masses, then your point 1 doesn't make sense at all; you are shrinking even further the installed base of people who would run anything other than stock. OH, and what heat issues????
What heat issues??? Just search Google for "GTX280 IHS". There you'll find people in long discussions about how their GTX280 is overheating to 113 °C because of incorrect contact with the IHS, causing an insulating effect on the GPU. ATi does not have an IHS on their new chips and they are sold to the masses. Get a grip... I'm sorry your friends broke their chips. There are unfortunate accidents every day, but that's no reason to put another thermal barrier between the die and the cooler.
I didn't understand the first paragraph well; are you saying the HD4870 stomps all over the GTX260?? :roll: I wouldn't call a 2% difference a stomping... www.techpowerup.com/reviews/Leadtek/GeForce_GTX_260_Extreme_Plus/26.html
DON'T look at the Leadtek card and come back with the typical claim that it's overclocked, etc. Look at the OTHER GTX260, and at higher resolutions if you want to see the HD4870 ahead. Aaaand... that's right, 2%, wohhooo ATI OWNS. WOOOAAaaaaHHhhHH!!
And the thing is that noobs can buy factory-overclocked Nvidia cards for the same price as those with stock clocks (that Leadtek, for example). Same with ATI, don't get me wrong, but typical factory overclocks on Nvidia cards are 15%, 20%, etc., while ATI cards rarely surpass the 10% mark.
Man you just lost my respect, we were having a good discussion, but you just ruined it for me with that one. Stomp. :shadedshu
Anyway, a bit more focused on the topic:
www.guru3d.com/article/core-i7-multigpu-sli-crossfire-game-performance-review/6
I was curious about how the GTX260 and HD4870 did in dual-card setups, so I made a table in Excel from those results for an easier comparison (roughly the kind of calculation sketched below). It will surprise you and many others, I'm sure.
In short, I was right. It's 19% faster overall in the newest games and up to 57% (:eek:) at 2560x1600. I'm sure the 8.12s improve performance a bit, but until I see facts, that's what matters to me. Especially with ATI, which always "improves performance in newer driver releases"; I've been seeing driver comparisons throughout the year on many sites that proved that false, at least as an overall improvement.
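For transparency, here is roughly the kind of calculation that Excel table boils down to, sketched in Python. The FPS values below are made-up placeholders, NOT the actual Guru3D numbers; the real figures from the review linked above would have to be filled in:

```python
# Sketch of the comparison table: average the per-game FPS ratio of
# setup A vs setup B at each resolution.
# NOTE: these FPS values are illustrative placeholders only.
results = {
    # game: {resolution: (setup_a_fps, setup_b_fps)}
    "Game 1": {"1920x1200": (60.0, 55.0), "2560x1600": (40.0, 30.0)},
    "Game 2": {"1920x1200": (90.0, 80.0), "2560x1600": (55.0, 38.0)},
}

for res in ("1920x1200", "2560x1600"):
    ratios = [fps[res][0] / fps[res][1] for fps in results.values()]
    avg = sum(ratios) / len(ratios)
    print(f"{res}: setup A is on average {100 * (avg - 1):+.0f}% faster than setup B")
```

Averaging per-game ratios like this weights every game equally, which is the same thing a simple Excel AVERAGE over percentage columns does.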
I've never heard of anyone on these forums (not the schmucks you seem to cling to as your comparison for gamers/people who would use these cards) having an ATI card overheat on them at stock. Hell! One time, while I had my water cooling setup on my graphics cards (4870s in Crossfire), the motor in my pump melted and the cards hit TJMax and shut my PC down, but it never killed the cards. After all was said and done I reattached the air cooling and they ran perfectly until I could RMA the pump. No failure. Why attack me about ATI cards overheating when you KNOW it's a fan-speed fix away from good operating temperatures?
What I've been saying the WHOLE TIME has been that scaling between crossfire and SLI has become so good that single PCI-E slot solutions with dual GPU are less powerful than two single cards on their own. You won't focus on the point I'm making.
While I am dispelling some of the crap you're spewing about these cards without ever tweaking and maximizing a card yourself, I am not here to argue about reviews! I couldn't care less, because every review I've come across has been different. Among the reviews I've seen, Guru3D.com has a ton of good ones, with sheets showing which card is the biggest and baddest. The 4870 512s compare neck and neck with a GTX260 192, but the GTX260s always need a bit of an increase, since their stock clocks are 576/1242/2000. You would change those values and you'd change the fan speeds to get the most out of your card, so stop being a hypocrite. If someone can crank the crap out of two single cards and get higher numbers in frames or benches, that makes the dual-card solution appealing only for restrictive PCI-E configurations. Give up the attitude and face facts. You're showing that SLI/Crossfire is improving because the motherboards are improving, and I'm seeing through results with forum members that it's gone beyond the bridge chip!
Do me a huge favor and stick to the point. I believe the GTX295 will not hold up against 2x GTX260 216 in clocks/temps/results.
1- The chip, in its 65nm incarnation, already scales better in dual-card configs, beyond what the X2 and RV770 Crossfire can do, while consuming less.
2- News/rumors from respectable sources say that a 55nm, 240 SP, GT200-based Quadro card with 4 GB of RAM running at 650 MHz has a 160 W TDP, compared to 234 W for its 65nm daddy or 183 W for the GTX260.
3- In the past generation both vendors' dual cards were "close" to their Crossfired/SLI'd single cards, but the GX2 was underclocked (600 vs 675 MHz) and the X2 was clocked above its single-card cousin (825 vs 775 MHz), which suggests G92 already scaled better in the GX2 than the X2's chip did, AT A TIME when Crossfire by itself scaled much better than SLI (the quick arithmetic below spells out those deltas).
With those precedents, the GTX295 has every chance of being a fast, cool, and not-so-power-hungry card. Deliberately forgetting those facts, lemonade made his BS comments and you followed suit. I responded.
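Just to make the deltas in points 2 and 3 concrete, here is the quick arithmetic, using only the figures already quoted above (no new data):

```python
# Clock deltas of the dual cards vs their single-card cousins, and the
# rumored 55nm TDP vs the 65nm parts, using the numbers quoted above.
gx2_mhz, gtx_mhz = 600, 675        # GX2 vs its single-card cousin
x2_mhz, single_mhz = 825, 775      # X2 vs its single-card cousin
print(f"GX2 clocked {100 * (1 - gx2_mhz / gtx_mhz):.0f}% lower")     # ~11%
print(f"X2 clocked {100 * (x2_mhz / single_mhz - 1):.1f}% higher")   # ~6.5%

tdp_55nm, tdp_65nm, tdp_gtx260 = 160, 234, 183   # watts, as quoted
print(f"55nm part vs 65nm daddy: {100 * (1 - tdp_55nm / tdp_65nm):.0f}% lower TDP")    # ~32%
print(f"55nm part vs GTX260:     {100 * (1 - tdp_55nm / tdp_gtx260):.0f}% lower TDP")  # ~13%
```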
It doesn't matter if a multi-card solution is faster; it always has been and always will be (until they make one that presents itself as a single card with a single frame buffer), but a single-card solution, even a dual-GPU one, has a lot of benefits and definitely deserves to exist. No more, no less. Crossfired HD4850s are as fast as the X2 and far cheaper, yet many people here, who must know that fact, chose to go with the X2. 2x GTX260 is probably going to be faster, but the GTX295 has a place in the market, right next to the X2, and it will probably be better on every front, as I explained. So why in the hell does the X2 have the right to exist but not the 295? That's what I've been discussing. You should have read more carefully: if all you were replying was that 2x GTX260 will still be faster, I said so myself in my first reply after the one calling BS on lemonade. What a waste of time. Given the facts and the rumors, on the other hand, I have no doubt it will both run cooler and consume less than 2x GTX260.
About the overheating issues, I don't care what you've heard or not; there have been plenty of those reports on these forums and you will find a lot just by Googling. Before your post, on the other hand, I had never heard of that problem with the GTX280, and God knows there are far fewer entries when you Google it than with the RV770. I guess that means we are tied. I know you seem to think you are the holder of the ABSOLUTE TRUTH, but that's not the case. Just as with the opinion that Nvidia is stuck in the past, you are not right here. The RV770 has as many overheating issues as the GT200 and has no IHS, so couldn't it be that, with the ever-increasing processing power, EVERY new generation has some samples that overheat, and that we don't need to find someone to blame???? I ask.
EDIT: en.expreview.com/2008/12/09/geforce-gtx295-with-480sp-surely-to-come-in-ces-09.html
Expreview may not be the best source, but they've been pretty right lately. 289 W TDP, so much less than 2x GTX260 or the HD4870 X2.
CB