
NVIDIA Plans to Nuke R700

AsRock

TPU addict
Joined
Jun 23, 2007
Messages
19,114 (2.99/day)
Location
UK\USA
That's why I admire them as a company. Of course, it makes them susceptible to being overtaken, but that's the thing: they completely welcomed it in this case. NV isn't stupid. They know this stupid war will never be over until a 3rd party joins the fray. They can afford to get beat up like this, and it will make them stronger for it. It also means that we will be seeing some terribly kickass stuff from both companies in the next year or so :toast:

AMEN to that :rolleyes:
 
Joined
Feb 21, 2008
Messages
5,004 (0.81/day)
Location
NC, USA
System Name Cosmos F1000
Processor Ryzen 9 7950X3D
Motherboard MSI PRO B650-S WIFI AM5
Cooling Corsair H100x, Panaflo's on case
Memory G.Skill DDR5 Trident 64GB (32GBx2)
Video Card(s) MSI Gaming Radeon RX 7900 XTX 24GB GDDR6
Storage 4TB Firecuda M.2 2280
Display(s) 32" OLED 4k 240Hz ASUS ROG Swift PG32UCD
Case CM Cosmos 1000
Audio Device(s) logitech 5.1 system (midrange quality)
Power Supply CORSAIR RM1000e 1000watt
Mouse G400s Logitech, white Razor Mamba
Keyboard K65 RGB Corsair Tenkeyless Cherry Red MX
VR HMD Steam Valve Index
Software Win10 Pro, Win11
Nobody has an ATI bias, not me at least. Speculation and bias are unrelated anyway.

It's a play on words. It's saying that an ATi bias is created through rumors, not facts.


But I was joking. I prefer Nvidia, but I feel the best card for the money I ever bought was the 9800 XT back in the day. I miss that groundbreaking engineering :( (it was a big jump in technology)
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,314 (7.52/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
with what shall we nuke it dear nvidia dear nvidia?

It's that "Hole in my bucket" poem, right?

With what shall we nuke it, dear NV, dear NV

With what shall we nuke it, dear NV, dear NV, with what?
.
.
.
.
.
G200b
.
.
.
.
R780 (Super RV700 X2)
.
.
.
D12U
The saga continues. Whoops!
 
V

v-zero

Guest
I think nVidia may have to face facts that they simply cannot win with the GT200 based parts, be they 55nm or otherwise. It is a doomed architecture, the second coming of the FX series.
 
I think nVidia may have to face facts that they simply cannot win with the GT200 based parts, be they 55nm or otherwise. It is a doomed architecture, the second coming of the FX series.

That's a little premature.
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.27/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
That's a little premature.

Too premature, indeed. Especially considering that GT200 is quite a bit better in performance per watt. They could very easily clock the GT200b higher, or add some extra SPs/TMUs, and still maintain the lead in power consumption. They could cut down some ROPs and clock the card higher... There are so many things they can do to get back into the competition...

It will be tougher for them to regain the price/performance lead without losing profits, though. But it is very soon to speculate on this too. They are struggling on this front right now because of low yields, and it's impossible to predict whether they will have much better yields with GT200b.
 

Jansku07

New Member
Joined
May 24, 2008
Messages
171 (0.03/day)
Location
Finland
Processor 4000X2 @ 2,1Ghz w/ AC Freezer
Motherboard MSI K9AGM3-FD
Memory 2GB DDR2
Video Card(s) 3650 w/ 256MB GDDR3
Storage WD 320GB - 7200RPM
Display(s) DELL m991 /1024x768/ 85hz
Case resonating metal case
Power Supply FSP 350W -> 16A@12V
Software Vista Home Premium x86 SP2 FIN
Benchmark Scores oh noeees.
Especially considering how GT200 is quite better in performance-per-watt
How's that? G200 consumes ~10% more power and is ~10% faster than RV770. I wouldn't call that superior to RV770. Sure, RV770 idles higher, but that is a PowerPlay problem that can be fixed via drivers, not a flaw in the architecture.
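Taking the post's rough figures at face value, the arithmetic works out like this (a back-of-the-envelope sketch with normalized, illustrative numbers, not measured data):

```python
# Illustrative: if GT200 is ~10% faster while drawing ~10% more power
# than RV770, the two factors cancel in the perf/watt ratio.
rv770_perf, rv770_power = 1.00, 1.00   # baseline (normalized)
gt200_perf = rv770_perf * 1.10          # +10% performance
gt200_power = rv770_power * 1.10        # +10% power draw

perf_per_watt_rv770 = rv770_perf / rv770_power
perf_per_watt_gt200 = gt200_perf / gt200_power

print(perf_per_watt_rv770)  # 1.0
print(perf_per_watt_gt200)  # 1.0 -- the +10%/+10% figures cancel out
```

So under these particular numbers the two chips would be dead even in perf/watt, which is the point being argued.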
 

DarkMatter

New Member
http://www.techpowerup.com/reviews/Point_Of_View/GeForce_GTX_260/23.html

You have to compare the GTX 260 to the HD 4870. The higher-performance model will always draw more power, just as the HD 4850 beats the HD 4870 in perf/watt, and the 8800 GT beats the cards above it, despite G92 not being as efficient as newer chips. An HD 4870 sped up to compete with the GTX 280 would not increase its power consumption linearly but faster than linearly, not by much, nothing exaggerated, but enough to make it clearly lose to the GTX 280 in perf/watt.
 
V

v-zero

Guest
Too premature, indeed. Especially considering that GT200 is quite a bit better in performance per watt. They could very easily clock the GT200b higher, or add some extra SPs/TMUs, and still maintain the lead in power consumption. They could cut down some ROPs and clock the card higher... There are so many things they can do to get back into the competition...

It will be tougher for them to regain the price/performance lead without losing profits, though. But it is very soon to speculate on this too. They are struggling on this front right now because of low yields, and it's impossible to predict whether they will have much better yields with GT200b.

Let me put it another way: GT200 will not be economically viable within its lifespan.
 

Jansku07

New Member
The higher performance model will always draw more power
You're comparing the highest model (ATI) against the second-highest model (NVIDIA). If we look at the difference between the HD 4850 and the GTX 260, it's 3%. That fits well inside the margin of error, so I wouldn't call that "quite better". The difference between the HD 4870 and the GTX 260 is 7%, not exactly ground-shattering IMO. There might be a small perf/watt advantage for NVIDIA, but it isn't big enough to turn the tables for G200 (especially considering the size of the die -> perf/price).
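For concreteness, the quoted 3% and 7% gaps correspond to perf/watt ratios like these (hypothetical normalized values, only meant to show the size of the gaps being discussed):

```python
# Hypothetical normalized perf/watt values implied by the quoted gaps:
# the GTX 260 leads the HD 4850 by 3% and the HD 4870 by 7%.
gtx260 = 1.00
hd4850 = gtx260 / 1.03
hd4870 = gtx260 / 1.07

def gap_percent(a, b):
    """Relative lead of a over b, in percent."""
    return (a - b) / b * 100

print(round(gap_percent(gtx260, hd4850), 1))  # 3.0
print(round(gap_percent(gtx260, hd4870), 1))  # 7.0
```

Whether a single-digit lead like this clears typical review measurement noise is exactly the disagreement in the thread.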
 

DarkMatter

New Member
Let me put it another way: GT200 will not be economically viable within its lifespan.

We are not talking about GT200; GT200b will be more than viable. Most of the things people are saying now about GT200 were said about G80 and R600 in the past, especially the latter. Then came G92 and RV670, which had almost the same architecture but big improvements from the fab process (many things in the architecture are tied to the process too) and were more "optimized": a narrower memory controller, improved ALUs, etc.

Very little has been said about GT200b (confirmed by NV, that is; I couldn't care less about the rumors). We only know it will be 55 nm, a die-shrunk GT200 as they call it. I remember a time when G92 and RV670 were also called just die-shrinks. That was true, to an extent, but they were a lot more. This "news" tells us that GT200b might have GDDR5 and DX10.1, and that would mean it is a lot more than just a die-shrink, much more than what G92 was to G80.

As to why they didn't make G92 in the first place, and why they have repeated the "mistake" with GT200, I do have my theory. If you want my opinion, G80 was as it was and GT200 is as it is (same concept, more extreme) not because they were looking for gaming performance alone, but for other applications' performance too: CUDA and (IMO to a greater extent than most may think) Tesla. I'd bet Tesla is as important to Nvidia as the workstation market is to Intel and AMD. Enthusiasts here (and elsewhere) have a tendency to overestimate the desktop market and underestimate the business market. They will base their next Tesla on GT200 as they did with G80 (there was no G92 Tesla), and it requires some things that graphics don't.

Those things (the common claim: "You don't need a 512-bit interface, it doesn't affect gaming performance") are what made G80, R600 (also designed with GPGPU in mind) and GT200 very big and expensive. A chip better balanced for gaming is coming soon to fix that, just as G92 did in the past. Until then you have what you have, and you can buy it or not.

They don't have to please you all the time; they sell their product so you can buy it, but they don't owe you anything. Sincerely, people need to understand that Nvidia and ATI (and Intel, AMD, etc.) are companies doing business; they don't owe us anything. In Nvidia's case, GT200 is the product, made for the GTX cards as well as the Quadro and Tesla lines, while GT200b will be desktop-only and fit that (and only that) role better. But GT200 on its own is a good product, better than the competition in many ways except price; if you don't like it, don't buy it, but in no way is it comparable to the FX series. In fact, contrary to the FX, the GTX cards are faster than the competition while being better in perf/watt, heat output and overclocking. Even just by die-shrinking it (without the aforementioned optimization, which IMO is inevitable and was on Nvidia's mind from the start) they could fix the perf/price, because the shrink will probably allow both lower prices and higher clocks.

In the end, all I said is speculation, since we don't know anything, and I don't necessarily believe it myself. I just wrote it to counter your speculation. That way we stay neutral: your message is that it will never be viable, mine is that it could kind of own again. The truth will be somewhere in the middle.

You're comparing the highest model (ATI) against the second-highest model (NVIDIA). If we look at the difference between the HD 4850 and the GTX 260, it's 3%. That fits well inside the margin of error, so I wouldn't call that "quite better". The difference between the HD 4870 and the GTX 260 is 7%, not exactly ground-shattering IMO. There might be a small perf/watt advantage for NVIDIA, but it isn't big enough to turn the tables for G200 (especially considering the size of the die -> perf/price).

Unlike you, I'm comparing the right cards. You have to compare cards at the same performance level, no matter which brand they come from or where they sit in the lineup. Comparing the GTX 260 to the HD 4850 is like comparing a sports car to a utility vehicle, and the GTX 280 to the HD 4870 like comparing a Ferrari Modena to a Maranello. Of course the faster one consumes more; it costs a lot more (in dollars and in watts) to increase performance the higher you go. Same with GPUs: there are physical limits and constraints on perf/watt, so the higher you aim, the worse it gets. Nvidia aimed higher AND used a bigger fab process, so it would be natural for them to have much worse perf/watt, yet in reality theirs is better. 55 nm will only increase that advantage. BTW, that advantage seems to be architectural, as it was present in G92/RV670 too. Look at the charts: G92 owns.
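The "power rises faster than performance at the high end" point both posters lean on can be sketched with the usual first-order model for dynamic power (roughly proportional to frequency times voltage squared, with big clock bumps usually needing a voltage bump too); the numbers below are illustrative, not measurements of any real card:

```python
# Illustrative sketch: dynamic power scales roughly with f * V^2,
# so chasing clocks at the top of the stack costs more than it pays.
def relative_power(freq_scale, volt_scale):
    """Dynamic power relative to baseline for given frequency/voltage scaling."""
    return freq_scale * volt_scale ** 2

# +20% clock at the same voltage: power up ~20%, roughly linear with performance.
print(relative_power(1.20, 1.00))  # 1.2

# +20% clock with a +10% voltage bump: power up ~45% for ~20% more performance.
print(relative_power(1.20, 1.10))  # ~1.452
```

That is why a hypothetical HD 4870 pushed up to GTX 280 performance, or any chip clocked toward its limit, would see its perf/watt degrade faster than its performance improves.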
 