
Intel Sues NVIDIA Over Chipset License, NVIDIA Responds

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (3.68/day)
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
Funny, because mine (8800GT) is MUCH, MUCH faster than my quad. The more filters I enable in TMPGEnc 4.6, the bigger the difference is.

And once again, I'm not saying the CPU is not an essential part, and I have never interpreted Nvidia's words as saying that either. But think about this: an Atom can't do much on its own, but pair it up with a more-than-modest GPU and it does wonders. What Nvidia said is that the days of $1000 PCs with a $500 CPU and a $50 GPU are gone. A 50/50 split ($250 each) gives much better results, and not only for gamers.
My quad runs at 3.6 or 4 GHz for encoding, and it walks away from my 8800GT the higher I set the H.264 profile.
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.27/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
My quad runs at 3.6 or 4 GHz for encoding, and it walks away from my 8800GT the higher I set the H.264 profile.

I wasn't really talking about the profile, but the filters, de-interlacing, color correction...

But well, you are comparing an OCed $1000 CPU to a $100 GPU on TMPGEnc's first attempt at GPU transcoding. If that doesn't tell you something about my point, I don't know what you want to hear...
 

Wile E

Power User
I wasn't really talking about the profile, but the filters, de-interlacing, color correction...
I'm talking about it all. Very High profile with deinterlacing and detelecine.
 

DarkMatter

New Member
I'm talking about it all. Very High profile with deinterlacing and detelecine.

I must admit that color correction and noise reduction are the things that take most advantage of my GT, and that they are slow on the CPU. I always use them, because I'm usually not happy with how my videos look on my phone. It might be that.

I have performed some tests on a small file, with and without CUDA 2.0 enabled and with different filters, and the results have been around the same: 7 min 30 s with CUDA disabled and 5 min with CUDA enabled, at about 25% max CUDA usage. I usually get more usage on longer videos, and a better performance ratio in favor of the GT.
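For reference, the speedup in that test works out like this (a quick back-of-the-envelope calculation using the numbers from the post; nothing here is TMPGEnc or CUDA code):

```python
# Speedup implied by the transcoding test above.
cpu_seconds = 7 * 60 + 30   # 7 min 30 s with CUDA disabled
gpu_seconds = 5 * 60        # 5 min with CUDA enabled
speedup = cpu_seconds / gpu_seconds
print(f"CUDA speedup: {speedup:.2f}x")  # CUDA speedup: 1.50x
```

So even at only ~25% CUDA utilization, the GPU path comes out 1.5x faster on this clip.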
 
Joined
Dec 5, 2006
Messages
7,704 (1.18/day)
System Name Back to Blue
Processor i9 14900k
Motherboard Asrock Z790 Nova
Cooling Corsair H150i Elite
Memory 64GB Corsair Dominator DDR5-6400 @ 6600
Video Card(s) EVGA RTX 3090 Ultra FTW3
Storage 4TB WD 850x NVME, 4TB WD Black, 10TB Seagate Barracuda Pro
Display(s) 1x Samsung Odyssey G7 Neo and 1x Dell u2518d
Case Lian Li o11 DXL w/custom vented front panel
Audio Device(s) Focusrite Saffire PRO 14 -> DBX DriveRack PA+ -> Mackie MR8 and MR10 / Senn PX38X -> SB AE-5 Plus
Power Supply Corsair RM1000i
Mouse Logitech G502x
Keyboard Corsair K95 Platinum
Software Windows 11 x64 Pro
Benchmark Scores 31k multicore Cinebench - CPU limited 125w
I wouldn't count on it; code had to be forced to work on DX-encoded hardware. Nvidia is in a pretty tough spot right now; I didn't know the agreement was so thin-skinned. Well, to answer your question: NV didn't have enough time to release a chipset at launch, so they decided to use the 200 SPP/MCP for SLI on the X58, à la Skulltrail. But with this suit, NV may not be able to create a chipset for Intel platforms again, just the slave chip for SLI functions (if Intel will allow them to continue with that). All Intel is doing is forcing others out so they can increase their overall revenue, despite already getting a partial share from board makers due to socket designs etc. I see rough waters for both companies, and probably a countersuit by NV against Intel for anti-competitive practices. TBH, Intel may build a good board, but they always get supplanted by third parties, whether it's Intel, AMD or NV chipsets in use.

It would be impossible for Intel to do that, realistically... The slave chip could simply be an extra part; it wouldn't have to have anything to do with Intel themselves.

Personally, I think it's pretty silly that Intel "can" even file a suit over Nvidia making hardware that makes use of Intel's hardware...

Why doesn't GM just say nobody can make tires for their cars... :slap:
 

Wile E

Power User
It would be impossible for Intel to do that, realistically... The slave chip could simply be an extra part; it wouldn't have to have anything to do with Intel themselves.

Personally, I think it's pretty silly that Intel "can" even file a suit over Nvidia making hardware that makes use of Intel's hardware...

Why doesn't GM just say nobody can make tires for their cars... :slap:
Not the same principle. It's more like somebody coming up with a new communication bus and ECU for a car, and then a different company coming along and using that bus and ECU design in their own car.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.48/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
You can't make a chip for a processor unless you have licenses to the buses used. For example, NVIDIA needed a license from AMD in order to start making chips that work on the HyperTransport bus (correct me if I'm wrong). Intel is making the same transition, so NVIDIA needs to update its licenses for access to QuickPath.


The CPU is like the engine in a car; without it, you won't get anywhere. The GPU is like the radio in the dash; without it, using your car just got a little more boring (it still does what it has to, though). A CPU can run a console interface without trouble, but a GPU is dead in the water without a CPU to spoon-feed it information. It was very stupid of NVIDIA to tell Intel that their product is useless. NVIDIA is in hot water, and all they accomplished was to make it boil. Stupid NVIDIA.
 
Joined
Jun 20, 2007
Messages
3,942 (0.62/day)
System Name Widow
Processor Ryzen 7600x
Motherboard AsRock B650 HDVM.2
Cooling CPU : Corsair Hydro XC7 }{ GPU: EK FC 1080 via Magicool 360 III PRO > Photon 170 (D5)
Memory 32GB Gskill Flare X5
Video Card(s) GTX 1080 TI
Storage Samsung 9series NVM 2TB and Rust
Display(s) Predator X34P/Tempest X270OC @ 120hz / LG W3000h
Case Fractal Define S [Antec Skeleton hanging in hall of fame]
Audio Device(s) Asus Xonar Xense with AKG K612 cans on Monacor SA-100
Power Supply Seasonic X-850
Mouse Razer Naga 2014
Software Windows 11 Pro
Benchmark Scores FFXIV ARR Benchmark 12,883 on i7 2600k 15,098 on AM5 7600x
You are ignorant and take your information as it is served; you don't actually analyze it.
The video shows a stupid thing that can be done by the CPU too, just as fast or faster; there are just a few dots there. Can you actually contradict me and claim the CPU couldn't draw a picture like the Mona Lisa that fast?
You are missing the fact that all those presentations are run on a CPU; anything involved there needs a CPU at its base. The GPU is a simple, powerful raw calculator, but it is dumb, and anything complex makes it useless.
You are talking like some fanboy who doesn't know anything beyond what he reads on forums and the internet. The GPU is not miracle computing power. They don't even have a fab to build these things in. The CPU business has fabs, and if they wanted, they could build a CPU made from 4 billion transistors organized into 1000-2000 threads, with the usual tuning only a CPU maker with fabs can do (lots of MHz); then it would encode a movie so fast your head would spin. But they can't do that, because they must build a CPU which works well on all kinds of software, and most software doesn't need this kind of parallelization or doesn't take any advantage from it.
I'm disappointed to be talking with people who defend whatever hype is going around, and a few years later discover how blown out of proportion some things were, and how manipulative some companies are about any stupid feature they have and how much it can supposedly help people.
Why do you think it's so hard to make anything (CPU software, I mean) work on a GPU? Because it's so basic in what it can do, and the limitations are hit so often, that developers always end up doing more of the work on the CPU, or the limitations are so big there is no advantage in using the GPU for a particular piece of software.
What the GPU is as a piece of hardware is very simple for CPU engineers, and its raw power could be matched and far overtaken, but they can't do it just for the purpose of being the best at number crunching; they would build it as a video card, and then they would have a reason to build a big dumb calculator.
The reason Nvidia fears Intel so much is that they know the GPU is not anything groundbreaking. It's not fine-tuned the way a CPU is, and the old rumor that Nvidia would want to build a CPU is hilarious; they could never compete with Intel or AMD, considering they don't have fabs and couldn't fine-tune a CPU to that level even if they did. Plus, many companies have made CPUs with extraordinary numbers (teraflops), but as always they can't be made profitable for the desktop, or even just as powerful in practice (IBM made lots of so-called powerful CPUs, the latest being the ones in the Xbox 360 and PS3).
They are cocky before the fall; they want to die with pride (Nvidia).


I'm confused. Is this about morals and ethics, or what?

Now you're off on a tangent, trying to support your rash comments about Intel by trying to prove that a GPU succeeding the CPU as the primary core component is essentially hogwash.
You state that GPUs are too simplistic; well, so was a Pentium 133 compared to Nehalem. What's your point, really? GPUs have evolved, and will continue to do so, to the point where all they need is the proper interfacing capabilities and we're good to go.

Why so much love for the CPU? It's just a damned piece of computing hardware.
 

alexp999

Staff
Joined
Jul 28, 2007
Messages
8,012 (1.27/day)
Location
Dorset, UK
System Name Gaming Rig | Uni Laptop
Processor Intel Q6600 G0 (2007) @ 3.6Ghz @ 1.45625v (LLC) / 4 GHz Bench @ 1.63v | AMD Turion 64 X2 TL-62 2 GHz
Motherboard ASUS P5Q Deluxe (Intel P45) | HP 6715b
Cooling Xigmatek Dark Knight w/AC MX2 ~ Case Fans: 2 x 180mm + 1 x 120mm Silverstone Fans
Memory 4GB OCZ Platinum PC2-8000 @ 1000Mhz 5-5-5-15 2.1v | 2 x 1GB DDR2 667 MHz
Video Card(s) XFX GTX 285 1GB, Modded FTW BIOS @ 725/1512/1350 w/Accelero Xtreme GTX 280 + Scythe sinks| ATI X1250
Storage 2x WD6400AAKS 1 TB Raid 0, 140GB Raid 1 & 80GB Maxtor Basics External HDD (storage) | 160GB 2.5"
Display(s) Samsung SyncMaster SM2433BW @ 1920 x 1200 via DVI-D | 15.4" WSXGA+ (1680 x 1050 resolution)
Case Silverstone Fortress FT01B-W ~ Logitech G15 R1 / Microsoft Laser Mouse 6000
Audio Device(s) Soundmax AD2000BX Onboard Sound, via Logitech X-230 2.1 | ADI SoundMAX HD Audio
Power Supply Corsair TX650W | HP 90W
Software Windows 7 Ultimate Build 7100 x64 | Windows 7 Ultimate Build 7100 x64
Benchmark Scores 3DM06: 19519, Vantage: P16170 ~ Win7: -CPU 7.5 -MEM 7.5 -AERO 7.9 -GFX 6.0 -HDD 6.0
I'm confused. Is this about morals and ethics, or what?

Now you're off on a tangent, trying to support your rash comments about Intel by trying to prove that a GPU succeeding the CPU as the primary core component is essentially hogwash.
You state that GPUs are too simplistic; well, so was a Pentium 133 compared to Nehalem. What's your point, really? GPUs have evolved, and will continue to do so, to the point where all they need is the proper interfacing capabilities and we're good to go.

Why so much love for the CPU? It's just a damned piece of computing hardware.

I think that's enough of this argument now. I don't want to have to delete any more people's posts. And don't make things personal either, please.
 

DarkMatter

New Member
I think it's enough with the argument. People aren't very clear on what each of the companies actually said. Here's what really happened:

"Nvidia" (you'll understand why it's quoted when you read the link) said: http://news.softpedia.com/news/Nvidia-Trumpets-Death-of-the-CPU-84407.shtml

On the other hand Intel said: http://news.softpedia.com/newsImage...less-You-039-d-better-Upgrade-Your-CPU-2.jpg/

Now look at the list Intel themselves prepared: who really has the edge, now that CUDA and Stream are more mainstream?

- Convert your music: the CPU, definitely, or a sound card anyway. CPU 1 - 0 GPU
- Edit and publish photos: when it comes to editing, PS CS4 shows how much better the GPU is at handling big photos. CPU 1 - 1 GPU
- Render pictures and animations: if we are talking about speed, Gelato is so much faster than anything else... CPU 1 - 2 GPU
- Play games: depends on the game, but come on, Intel... CPU 1 - 3 GPU
- Edit and encode videos: after repeating the test I described above with my quad running @ 1200 MHz and CUDA enabled, and after seeing the result... CPU 1 - 4 GPU

So IMHO Intel just got owned by their own slides. And when it comes to who said what they shouldn't have: we have a personal e-mail that should never have come to light, set against a public slide. Who talked too much, really? By what the links tell, it was not Intel's fault and it was not Nvidia's fault; it was The Inquirer's fault, messing with Nvidia as always. Pff. What is sad is that both Intel and Nvidia fell into the game.
 
I think that's enough of this argument now. I don't want to have to delete any more people's posts. And don't make things personal either, please.

Cry?

Seriously?


I think it's enough with the argument. People aren't very clear on what each of the companies actually said. Here's what really happened:

Hence the confusion over whether this is about business ethics or not, because none of us is a stranger to greed in the supposed "free market."
If we're trying to keep it about technology and computing architecture, then there really isn't an argument. GPUs are lapping CPUs. That doesn't mean the CPU is dead; it just means its priority might change.


But how that equates to whether Nvidia or Intel is "in the wrong" makes no sense.
 

FordGT90Concept

"I go fast!1!11!1!"
Benchmark Scores Faster than the tortoise; slower than the hare.
- Convert your music: the CPU, definitely, or a sound card anyway. CPU 1 - 0 GPU
- Edit and publish photos: when it comes to editing, PS CS4 shows how much better the GPU is at handling big photos. CPU 1 - 1 GPU
- Render pictures and animations: if we are talking about speed, Gelato is so much faster than anything else... CPU 1 - 2 GPU
- Play games: depends on the game, but come on, Intel... CPU 1 - 3 GPU
- Edit and encode videos: after repeating the test I described above with my quad running @ 1200 MHz and CUDA enabled, and after seeing the result... CPU 1 - 4 GPU
TIE: Music conversion could actually be done very well on a GPU because it is mostly floating-point data; however, SSE has enhanced the CPU's ability to handle audio streams tremendously over the years. Something still has to convert binary -> audio, though.

CPU: The CPU does all the work (ALU) for rendering pictures. All the GPU does is convert binary -> image. That's something a CPU could easily do, but it hasn't done it for two decades.

CPU: 2D animations (lots of ALU work) are almost always CPU-derived, just like pictures; the GPU only takes care of the menial task of binary -> display.

TIE: Games are a two-way street. CPUs handle caching of map information, physics, audio, AI, etc. The GPU handles the work it is handed by the CPU (everything needed to create a 3D scene). Both are stressed and both are necessary (unless you're talking 2D games).

GPU: Encoding video is also a heavy FPU task, so GPUs have the advantage there (still needing assistance from the CPU, though, for communication with the HDD and system RAM).

I figure CPU 2, GPU 1. Even in the cases where the GPU has the advantage, it still couldn't do anything without the aid of the CPU. The CPU could be made to do everything a GPU can do, with the only penalty being time.
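The pattern in the breakdown above (per-sample, FPU-heavy work suits the GPU's many units; work with serial dependencies suits a fast CPU core) can be sketched with two toy audio filters. This is purely illustrative Python of my own, not code from any encoder:

```python
def scale_samples(samples, gain):
    # Volume scaling: each output depends on exactly one input,
    # so the work is "embarrassingly parallel" -- GPU-friendly.
    return [s * gain for s in samples]

def running_average(samples, alpha=0.5):
    # Smoothing filter: each output depends on the previous one,
    # a serial chain that favors one fast CPU core over many
    # slow parallel threads.
    out, acc = [], 0.0
    for s in samples:
        acc = alpha * s + (1 - alpha) * acc  # needs the prior step
        out.append(acc)
    return out

print(scale_samples([0.1, 0.2, 0.3], 2.0))  # [0.2, 0.4, 0.6]
```

The first function could be split across thousands of shader units; the second cannot, no matter how many units are available.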
 

Wile E

Power User
TIE: Music conversion could actually be done very well on a GPU because it is mostly floating-point data; however, SSE has enhanced the CPU's ability to handle audio streams tremendously over the years. Something still has to convert binary -> audio, though.

CPU: The CPU does all the work (ALU) for rendering pictures. All the GPU does is convert binary -> image. That's something a CPU could easily do, but it hasn't done it for two decades.

CPU: 2D animations (lots of ALU work) are almost always CPU-derived, just like pictures; the GPU only takes care of the menial task of binary -> display.

TIE: Games are a two-way street. CPUs handle caching of map information, physics, audio, AI, etc. The GPU handles the work it is handed by the CPU (everything needed to create a 3D scene). Both are stressed and both are necessary (unless you're talking 2D games).

GPU: Encoding video is also a heavy FPU task, so GPUs have the advantage there (still needing assistance from the CPU, though, for communication with the HDD and system RAM).

I figure CPU 2, GPU 1. Even in the cases where the GPU has the advantage, it still couldn't do anything without the aid of the CPU. The CPU could be made to do everything a GPU can do, with the only penalty being time.
You could even eliminate the GPU's time advantage by integrating more floating-point units into the CPU.
 

DrPepper

The Doctor is in the house
Joined
Jan 16, 2008
Messages
7,482 (1.22/day)
Location
Scotland (It rains alot)
System Name Rusky
Processor Intel Core i7 D0 3.8Ghz
Motherboard Asus P6T
Cooling Thermaltake Dark Knight
Memory 12GB Patriot Viper's 1866mhz 9-9-9-24
Video Card(s) GTX470 1280MB
Storage OCZ Summit 60GB + Samsung 1TB + Samsung 2TB
Display(s) Sharp Aquos L32X20E 1920 x 1080
Case Silverstone Raven RV01
Power Supply Corsair 650 Watt
Software Windows 7 x64
Benchmark Scores 3DMark06 - 18064 http://img.techpowerup.org/090720/Capture002.jpg
What happened to the good old days, when GPUs did games and CPUs did, erm, other stuff?
 

FordGT90Concept

"I go fast!1!11!1!"
Believe you me, I don't like it either. NVIDIA especially is under the impression that high FLOPS performance is important. AMD and Intel are looking at putting a GPU on the CPU so the CPU can "borrow" some FLOPS from it. It's a train wreck in theory (the CPU is going to want to use the GPU while the GPU is busy with graphics work); we'll see about the application. It just sounds a whole lot more complex than what is needed.
 

DarkMatter

New Member
Ford, you can't overlook the fact that Photoshop CS4 runs some filters much better with a cheap CPU plus a cheap GPU than with the fastest quad alone. It's also much better at handling big photos. Believe me on this.

About rendering: GPUs can do the work very well and much faster. Nvidia's Gelato is an ultra-fast renderer. Sure, it's pretty much worthless for professionals as it is now (I haven't tried the last two versions anyway), but it demonstrates the potential is there. Once GPUs get better 64-bit support (G300 will, and I suppose RV870 too), they will handle those kinds of tasks even better.

And about games: imagine you have an E6600 and an HD 3850. You want to upgrade for better gaming and have $200-300. What do you buy?

The slide, and my comments about it, were not about who does what (again: the CPU is necessary; a very fast $500+ one is not), but about which upgrade will give the better results in the near future, when GPGPU is better handled and GPUs are MIMD instead of SIMD (G300 again).

Of course, CPUs can evolve too, but unless they are able to do the same work that GPUs do today (comparatively), you will always need a discrete GPU for demanding software and games. In that scenario (which gets wider every day) you would have a lot of FPUs wasting die area most of the time. GPGPU solves this better: when that highly demanding work is not being performed, the shaders can simply shut down and let the CPU do the ALU work.

Even for those who don't game, it's a better solution. For them it's the same whether they pay for the die area in the CPU or in the GPU, and there are no miracles in silicon; performance would be similar. A GPU can always be replaced, while replacing the CPU is much more difficult, and it would also be much more costly if it had to be >500 mm^2 to contain the CPU+GPU substitute.
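The SIMD-vs-MIMD distinction mentioned above can be sketched with a toy cost model. This is entirely my own illustration of why branchy code hurts SIMD hardware, not how any real GPU schedules work:

```python
def simd_cost(lane_branches, cost_a=1, cost_b=1):
    # SIMD: all lanes step together. If any lane takes path A and
    # any takes path B, the whole group executes both paths in
    # sequence (the lanes not on that path are simply masked off).
    cost = 0
    if any(lane_branches):        # someone needs path A
        cost += cost_a
    if not all(lane_branches):    # someone needs path B
        cost += cost_b
    return cost

def mimd_cost(lane_branches, cost_a=1, cost_b=1):
    # MIMD: independent units each pay only for their own path;
    # total time is just the slowest unit.
    return max(cost_a if b else cost_b for b in lane_branches)

uniform   = [True, True, True, True]
divergent = [True, False, True, False]
print(simd_cost(uniform), simd_cost(divergent))  # 1 2
print(mimd_cost(uniform), mimd_cost(divergent))  # 1 1
```

With uniform branches the two models cost the same; once the lanes diverge, the SIMD group pays for both paths, which is exactly the kind of workload a MIMD design handles without penalty.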
 