
GT300 A Leap Forward for NVIDIA GPU Architecture

G-wiz

New Member
Joined
Dec 12, 2008
Messages
65 (0.01/day)
Location
Florida
System Name G-wiz
Processor Phenom II X3 720BE @ 3.5GHz
Motherboard MSI 790FX GD70
Cooling Cooler Master V8
Memory 4GB Patriot Viper DDR3 1333 @ 1400MHz
Video Card(s) 2 XFX 4850 512MB Crossfire X
Storage 2 sata 160 GB
Display(s) NEC MultiSync 1760VM running at resolution of 1280X1024
Case COOLER MASTER Centurion 590
Audio Device(s) OnBoard
Power Supply Corsair TX 850Watt PSU
Software Windows 7 64Bit
I hate what Nvidia and ATi are doing. They can't let users enjoy their video cards for even half a year before coming out with new ones, whether it's a die shrink or an entirely new card.

This just left me pondering whether I should buy a GTX 260 216 or wait a whole year for the GT300. If this could be described in one word, fuckery would be the term for both companies.
 
Joined
Jan 21, 2008
Messages
237 (0.04/day)
System Name PC2.1
Processor Intel i7 3770k @4.6GHZ
Motherboard MSI Z68A-GD80
Cooling Corsair H100i
Memory 16GB Corsair XMS 1866MHz
Video Card(s) SLI EVGA 780 Classifieds
Storage Samsung 830 250gb /Samsung EVO 840 120GB
Display(s) 3x Dell 27" IPS screens
Case Thermaltake T81 Urban
Power Supply Cooler Master V1000
Software Windows 8.1 64bit
Sexy, this looks like the card to get.
GT212 will be a placeholder used only to get to 40nm, but there are already rumors that it's being scrapped.
 
Joined
Sep 22, 2007
Messages
3,527 (0.56/day)
Location
UK
System Name Dream Weaver
Processor Ryzen 5600
Motherboard B450 MSI Gaming Pro Carbon AC
Cooling NH-D14
Memory 16GB Flare X @ 3200MHz CL14
Video Card(s) Sapphire Vega 64 Nitro+
Storage 500GB evo 970 m.2 + 4TB HDDs
Display(s) ASUS 27" 1440p 144hz FreeSync
Case Fractal Design Define R3
Power Supply Corsair 850W
Software Windows 10 x64
I hate what Nvidia and ATi are doing. They can't let users enjoy their video cards for even half a year before coming out with new ones, whether it's a die shrink or an entirely new card.

This just left me pondering whether I should buy a GTX 260 216 or wait a whole year for the GT300. If this could be described in one word, fuckery would be the term for both companies.

I kind of agree with you; it makes one hesitant to buy new hardware when there's always something new and better around the corner. But this is the computer industry, and change is always a good thing. If they didn't bother bringing out cards all the time, we'd still be on 90nm X850 XTs lol.

I hope this new architecture brings some good performance gains, and I also hope AMD has something good to counter it.
 

Woody112

New Member
Joined
Jan 18, 2008
Messages
562 (0.09/day)
Location
Florida
System Name Woody's MBP
Processor 2.4 C2D
Cooling Frikin Air
Memory Mushkin 2x2GB 1066MHz
Video Card(s) 9400gt/9600gt W/512mb
Storage 1TB WD scorpio blue
Display(s) 15.4" on lapie/ 24" Acer P241w on the wall/ 52" LCD TV
Software OSX/ Win 7
Let's see, the next gen from ATI should have:
-shader clock
-512 bit bus
-MIMD
-Physics
Here's to my wishful thinking, because I seriously doubt it's all going to happen.
I never liked the small bus bandwidth of ATI cards. And the fact that they have never implemented a shader clock sucks. If they would keep pace with Nvidia, use the MIMD architecture, and implement physics, that would make one hell of a 5870X2. But yeah, wishful thinking.:rolleyes:
 

leonard_222003

New Member
Joined
Jan 29, 2006
Messages
241 (0.04/day)
System Name Home
Processor Q6600 @ 3300
Motherboard Gigabyte p31 ds3l
Cooling TRUE Intel Edition
Memory 4 gb x 800 mhz
Video Card(s) Asus GTX 560
Storage WD 1x250 gb Seagate 2x 1tb
Display(s) samsung T220
Case no name
Audio Device(s) onboard
Power Supply chieftec 550w
Software Windows 7 64
This bullshit Nvidia serves us is to lure away some ATI customers: yeah, we're going to give you lifelike graphics and a billion shader operations per pixel... whatever sounds good enough for idiots to take the bait.
The end result is some jerky-looking games and very few that look good.
Except for the Crysis series we have some ugly games, and if we went by what ATI/Nvidia tell us about their graphics cards, we should expect an orgasm or some life-changing experience.
Probably the next GT300 will run Crysis with all details even in full HD. Wow, what an achievement.
Don't expect more than this, people. It's more likely we'll see graphics that wow everyone on the next Xbox or PlayStation 4 than on the PC, because we lack games and people who bother to make anything in a land where a game is pirated within hours of release.
These Nvidia people amaze me with how stupid they think we are. Well, some are: the ones who cheer CUDA but never use it in anything they do, or the people who cheer PhysX but have never played a game with PhysX in their life. The brainwashed people.
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,233 (7.55/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
This bullshit Nvidia serves us is to lure away some ATI customers: yeah, we're going to give you lifelike graphics and a billion shader operations per pixel... whatever sounds good enough for idiots to take the bait.

Probably the next GT300 will run Crysis with all details even in full HD. Wow, what an achievement.

If you look carefully, "spicing up" the specs is what AMD has resorted to the most in the recent past: "320 stream processors", "GDDR4" (in reference to the Radeon HD 2900 XT, which, despite specs that looked like it would eat the 8800 GTX before its release, fell flat and eventually lost to even the $230 8800 GT), etc. And yes, that's what usually happens with releases like this: the upcoming GPU makes mincemeat out of the game that made the previous-generation GPUs sweat. We saw that with Doom 3, which a single 8800 GTX was able to max out at any resolution.
 
Joined
Aug 30, 2006
Messages
7,221 (1.08/day)
System Name ICE-QUAD // ICE-CRUNCH
Processor Q6600 // 2x Xeon 5472
Memory 2GB DDR // 8GB FB-DIMM
Video Card(s) HD3850-AGP // FireGL 3400
Display(s) 2 x Samsung 204Ts = 3200x1200
Audio Device(s) Audigy 2
Software Windows Server 2003 R2 as a Workstation now migrated to W10 with regrets.
nVidia HAVE to do this as a forward-planning thing. With Intel's Larrabee coming out in late 2009 (IIRC), Intel will have a superscalar math device, much more flexible than nVidia's or ATI's offerings.

With this, nVidia will get a lot more flexibility in math, making CUDA much more powerful for GENERAL math rather than the very specific SIMD math it does now.

I'm not so sure how MIMD will help GPU rendering, though. The "graphics pipeline" remains the same. However, it would allow CUDA AND graphics rendering to happen at the same time. (At the moment, IIRC, it can't; it can only do ONE thing at a time, so if you mix graphics and CUDA it needs to "swap" between math and graphics processing, which is incredibly inefficient.)

If someone can explain how MIMD helps GRAPHICS performance, please post.
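The SIMD-vs-MIMD distinction above can be sketched with a toy cost model. This is only an illustration of the general branch-divergence idea, not of any actual NVIDIA scheduler; the function names and cycle counts are made up for the example:

```python
# Toy model: cost of a divergent branch on SIMD lanes vs. MIMD units.
# In SIMD, all lanes step in lockstep through every instruction of both
# branch paths (inactive lanes are masked off); in a MIMD design, each
# unit executes only the path it actually takes.

def simd_cost(lane_takes_branch, then_len, else_len):
    # If any lane takes a path, every lane spends cycles walking it.
    cost = 0
    if any(lane_takes_branch):
        cost += then_len
    if not all(lane_takes_branch):
        cost += else_len
    return cost  # cycles, identical for all lanes (lockstep)

def mimd_cost(lane_takes_branch, then_len, else_len):
    # Each unit pays only for its own path; total time = slowest unit.
    return max(then_len if t else else_len for t in lane_takes_branch)

lanes = [True, False, True, True]           # mixed outcomes -> divergence
print(simd_cost(lanes, 10, 10))             # 20: both paths serialized
print(mimd_cost(lanes, 10, 10))             # 10: paths overlap in time
```

When every lane takes the same branch the two models cost the same; the gap only opens up on divergent control flow, which is one reason MIMD flexibility matters more for general-purpose (CUDA) workloads than for the mostly uniform work in a graphics pipeline.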
 

AsRock

TPU addict
Joined
Jun 23, 2007
Messages
19,082 (3.00/day)
Location
UK\USA
Dayum, gonna be hard to save money for 9+ months lol....
 
Joined
Jan 2, 2008
Messages
3,296 (0.53/day)
System Name Thakk
Processor i7 6700k @ 4.5Ghz
Motherboard Gigabyte G1 Z170N ITX
Cooling H55 AIO
Memory 32GB DDR4 3100 c16
Video Card(s) Zotac RTX3080 Trinity
Storage Corsair Force GT 120GB SSD / Intel 250GB SSD / Samsung Pro 512 SSD / 3TB Seagate SV32
Display(s) Acer Predator X34 100hz IPS Gsync / HTC Vive
Case QBX
Audio Device(s) Realtek ALC1150 > Creative Gigaworks T40 > AKG Q701
Power Supply Corsair SF600
Mouse Logitech G900
Keyboard Ducky Shine TKL MX Blue + Vortex PBT Doubleshots
Software Windows 10 64bit
Benchmark Scores http://www.3dmark.com/fs/12108888
nVidia engineers are evil geniuses. They state a concept, ATi tries to emulate it, and ATi fails. After two gens ATi gets it right, and then nVidia warps space-time again. lol
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
42,115 (6.63/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2x BitFenix 230mm Spectre Pro LED (Blue, Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
Cannon fodder to me. As long as it works well I don't care anymore, but if it comes with a $500 price tag, screw that.

Let's put it this way: if one or both companies fall out of existence, I'll probably stop following computer tech, become a hermit on that front, and go back to playing console games.:roll::banghead::shadedshu
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,233 (7.55/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
nVidia engineers are evil geniuses. They state a concept, ATi tries to emulate it, and ATi fails. After two gens ATi gets it right, and then nVidia warps space-time again. lol

If you look at the way R600 was built, it's far from emulating NVIDIA's DX10 GPU architecture. The same design, with a simple step-up in transistor count, yielded RV770, which still haunts NVIDIA.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
42,115 (6.63/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2x BitFenix 230mm Spectre Pro LED (Blue, Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
Isn't that the truth. AMD loaded theirs up with SPs, whereas NV went for a clock trick with theirs; different techniques.
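The "lots of SPs" versus "clock trick" contrast is easy to put rough numbers on, using the commonly cited launch-era specs (quoted from memory, so treat them as illustrative rather than authoritative): NVIDIA ran fewer scalar SPs on a separate shader clock at roughly double the core clock, while AMD packed in far more VLIW lanes running at the core clock.

```python
# Theoretical shader throughput of the two approaches.
#   GTX 280:  240 SPs @ 1296 MHz shader clock, dual-issue MAD+MUL (3 flops)
#   HD 4870:  800 SPs @  750 MHz core clock,  MAD (2 flops)

def gflops(sps, clock_ghz, flops_per_sp_per_clock):
    return sps * clock_ghz * flops_per_sp_per_clock

gtx280 = gflops(240, 1.296, 3)
hd4870 = gflops(800, 0.750, 2)
print(round(gtx280), round(hd4870))   # 933 1200
```

Peak numbers favored AMD's approach on paper; how much of that peak each architecture actually delivered is exactly the load-balancing question debated further down the thread.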
 
Joined
Apr 21, 2008
Messages
5,250 (0.87/day)
Location
IRAQ-Baghdad
System Name MASTER
Processor Core i7 3930k run at 4.4ghz
Motherboard Asus Rampage IV extreme
Cooling Corsair H100i
Memory 4x4G kingston hyperx beast 2400mhz
Video Card(s) 2X EVGA GTX680
Storage 2X Crucial M4 256GB RAID 0, 1Tb WD g, 2x500 WD B
Display(s) Samsung 27' 1080P LED 3D monitor 2ms
Case Cooler Master Cosmos II
Audio Device(s) Creative Sound Blaster X-Fi Titanium Champion, Creative 7.1 speakers T7900
Power Supply Corsair 1200i, Logitech G500 mouse, Corsair Vengeance 1500 headset
Software Win7 64bit Ultimate
Benchmark Scores 3d mark 2011: testing
Looks like a new generation of GPUs, and we'll wait until they release it, since it seems to be coming in 2010, not 2009.
 
Joined
Jun 27, 2008
Messages
96 (0.02/day)
These Nvidia people amaze me with how stupid they think we are. Well, some are: the ones who cheer CUDA but never use it in anything they do, or the people who cheer PhysX but have never played a game with PhysX in their life. The brainwashed people.

CUDA is needed for PhysX acceleration. :roll:

As for PhysX, 3DMark Vantage? :banghead: :banghead:
 

v-zero

Guest
I'm getting whiffs of the FX architecture... All the chickens have been counted.
 

v-zero

Guest
I hate what Nvidia and ATi are doing. They can't let users enjoy their video cards for even half a year before coming out with new ones, whether it's a die shrink or an entirely new card.
How dare they push the frontiers of consumer technology, the thoughtless bastards!

:roll:
 

Binge

Overclocking Surrealism
Joined
Sep 15, 2008
Messages
6,979 (1.18/day)
Location
PA, USA
System Name Molly
Processor i5 3570K
Motherboard Z77 ASRock
Cooling CooliT Eco
Memory 2x4GB Mushkin Redline Ridgebacks
Video Card(s) Gigabyte GTX 680
Case Coolermaster CM690 II Advanced
Power Supply Corsair HX-1000
CUDA is needed for PhysX acceleration. :roll:

As for PhysX, 3DMark Vantage? :banghead: :banghead:

I've also used CUDA to assist in 3D rendering. When there are extra features, you can't call it anything but a luxury. :p
 
Joined
Nov 1, 2008
Messages
4,213 (0.72/day)
Location
Vietnam
System Name Gaming System / HTPC-Server
Processor i7 8700K (@4.8 Ghz All-Core) / R7 5900X
Motherboard Z370 Aorus Ultra Gaming / MSI B450 Mortar Max
Cooling CM ML360 / CM ML240L
Memory 16Gb Hynix @3200 MHz / 16Gb Hynix @3000Mhz
Video Card(s) Zotac 3080 / Colorful 1060
Storage 750G MX300 + 2x500G NVMe / 40Tb Reds + 1Tb WD Blue NVMe
Display(s) LG 27GN800-B 27'' 2K 144Hz / Sony TV
Case Xigmatek Aquarius Plus / Corsair Air 240
Audio Device(s) On Board Realtek
Power Supply Super Flower Leadex III Gold 750W / Andyson TX-700 Platinum
Mouse Logitech G502 Hero / K400+
Keyboard Wooting Two / K400+
Software Windows 10 x64
Benchmark Scores Cinebench R15 = 1542 3D Mark Timespy = 9758
Well... just another upgrade. We won't see any difference between this and current cards using DX9 and 10. We've got to wait till DX11 before games start to look better, and by the time that's out, Nvidia will be on its 2nd gen and ATI will have brought out something better.
 
Joined
Jan 21, 2008
Messages
237 (0.04/day)
System Name PC2.1
Processor Intel i7 3770k @4.6GHZ
Motherboard MSI Z68A-GD80
Cooling Corsair H100i
Memory 16GB Corsair XMS 1866MHz
Video Card(s) SLI EVGA 780 Classifieds
Storage Samsung 830 250gb /Samsung EVO 840 120GB
Display(s) 3x Dell 27" IPS screens
Case Thermaltake T81 Urban
Power Supply Cooler Master V1000
Software Windows 8.1 64bit
So much wishful thinking.
LOL, some of you are right: we won't even have a need for DX11 when this hits, and when the first WOW-factor DX11 games arrive, this card will be like the 8800GTX on Crysis, with lots of bitching going on.
So as cool as it will be to have the first top-end DX11 GPU, it will be just like all the rest: plays DX9/10 games maxed, but hard pressed to cope with DX11 killers like "Crysis 2: The Other Island" and its expansion, "Crysis: The Cave We Forgot About on the Other Side of the Island".
 
Joined
Jun 20, 2008
Messages
2,873 (0.48/day)
Location
Northants. UK
System Name Bad Moon Ryzen
Processor Ryzen 5 5600X
Motherboard Asrock B450M Pro4-F
Cooling Vetroo V5
Memory Crucial Ballistix 32Gb (8gb x 4) 3200 MHz DDR 4
Video Card(s) 6700 XT
Storage Samsung 860 Evo 1Tb, Samsung 860 Evo 500Gb,WD Black 8Tb, WD Blue 2Tb
Display(s) Gigabyte G24F-2 (180Hz Freesync) & 4K Samsung TV
Case Fractal Design Meshify 2 Compact w/Dark Tempered Glass
Audio Device(s) Onboard
Power Supply MSI MPG A850GF (850w)
VR HMD Rift S
I hate what Nvidia and ATi are doing. They can't let users enjoy their video cards for even half a year before coming out with new ones, whether it's a die shrink or an entirely new card.

This just left me pondering whether I should buy a GTX 260 216 or wait a whole year for the GT300. If this could be described in one word, fuckery would be the term for both companies.

I hope you're planning to step up your resolution, because as it is, your two 4870s in CrossFire are drastic overkill for 1280x1024, especially with that processor.
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.27/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
While everything that's been said about DX11 is true, remember the news isn't about that. It's about this: "the GT300 will introduce the GPU to the MIMD (multiple instructions, multiple data) mechanism. This is expected to boost the computational efficiency of the GPU many-fold. The ALU cluster organization will be dynamic, pooled, and driven by a crossbar switch."

That's not related to DX11 or any other API; it's about how the GPU works internally, and it's a HUGE improvement over SIMD.

I agree with lemonadesoda that this might mostly affect GPGPU and do very little for graphics processing, but that assumes today's load balancing is fairly efficient, which we really don't know. We probably don't know enough about exactly how these chips work on that front. Most people assume Nvidia's SPs are very efficient, because they are certainly much more efficient than ATI's when load balancing ("scalar" versus VLIW and all), and that makes us believe Nvidia's must be above 90-95% efficiency. But it may be that Nvidia's are still below 75%, and if MIMD can raise that to around 90-95%, that's already a 15-20% increase for free. Add in what lemonade said about graphics+CUDA running at the same time, plus the card probably being able to perform a context change (from vertex to pixel, for example) in the same clock, and we might really be getting somewhere.

Maybe this helps answer your last question, lemonade? It's funny, because I thought of this and almost convinced myself of the possibility as I was writing... :laugh:
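The "free increase" arithmetic in the post above can be made explicit. The 75% and 90-95% utilization figures are hypotheticals from the post, not measured numbers; note that the relative throughput gain is slightly larger than the raw percentage-point difference:

```python
# Same ALUs, utilization rising from a hypothetical 75% to 90-95%.
# Effective throughput scales with utilization, so the relative gain
# is the ratio of the two efficiencies minus one.

def relative_gain(eff_before, eff_after):
    return eff_after / eff_before - 1.0

low  = relative_gain(0.75, 0.90)   # 90% utilization:  20% faster for free
high = relative_gain(0.75, 0.95)   # 95% utilization: ~27% faster for free
print(f"{low:.0%} to {high:.0%}")
```

In other words, if MIMD scheduling really did lift utilization that much, the speedup would come with no extra ALUs at all, which is what makes the claim interesting.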
 

vampire622003

New Member
Joined
Mar 6, 2007
Messages
135 (0.02/day)
Location
Austin
Processor AMD Phenom X4 @ 2.5GHZ
Motherboard AsRock
Memory 5GB DDR2-800MHz @ 891MHz
Video Card(s) ATI HD 4850 GDDR3 512MB w/ ZEROTherm GX815 Cooler
Storage 2X Western Digital SATAII 320GB (RAID 0)
Display(s) Acer 17" LCD Model AL1716
Audio Device(s) Sound Blaster Audigy 2 ZS
Power Supply ORION 585Watt PSU
Software Windows 7 x64
There aren't even that many games out for DX10 yet, lol, nor have we perfected it. :laugh:
 