Monday, July 18th 2011
AMD Radeon HD 7000 Series to be PCI-Express 3.0 Compliant
AMD's next generation of graphics processors (GPUs), to be branded under the HD 7000 series, is reported to be PCI-Express Generation 3 compliant. The desktop discrete graphics cards will feature PCI-Express 3.0 x16 bus interfaces and will be fully backwards-compatible with older versions of the bus, including Gen 1 and Gen 2. Most motherboards sold today feature Gen 2 PCI-E slots, although some of the very latest boards launched by major vendors already feature PCI-Express 3.0 slots.
The new bus doubles the bandwidth of PCI-E 2.0, offering 1 GB/s per lane, per direction. A PCI-Express 3.0 x16 link therefore has 32 GB/s (256 Gbit/s) of bandwidth at its disposal, 16 GB/s in each direction. AMD's next generation of GPUs, codenamed "Southern Islands", will be built on TSMC's new 28 nm process and will scale up the VLIW4 stream-processor architecture. Some of the first PC platforms to fully support PCI-Express 3.0 will be based on Intel's Sandy Bridge-E. Whether AMD's GPUs have hit a bandwidth bottleneck with PCI-E Gen 2, or AMD is simply aiming for standards compliance, is a different question altogether.
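As a back-of-the-envelope check on those bandwidth figures (a sketch, not from the source article): PCIe 3.0 reaches roughly double the effective bandwidth of PCIe 2.0 by combining a faster 8 GT/s signaling rate with more efficient 128b/130b encoding, versus 5 GT/s with 8b/10b encoding on Gen 2.

```python
# Back-of-the-envelope PCI-Express bandwidth math (illustrative sketch).
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding.
# PCIe 2.0: 5 GT/s per lane with 8b/10b encoding.

def pcie_bandwidth_gbs(gen: int, lanes: int) -> float:
    """Approximate usable bandwidth in GB/s, per direction."""
    if gen == 2:
        gt_per_s, efficiency = 5.0, 8 / 10      # 8b/10b encoding
    elif gen == 3:
        gt_per_s, efficiency = 8.0, 128 / 130   # 128b/130b encoding
    else:
        raise ValueError("only Gen 2 and Gen 3 handled here")
    # transfers/s * usable fraction * lanes, divided by 8 bits per byte
    return gt_per_s * efficiency * lanes / 8

print(pcie_bandwidth_gbs(3, 16))  # ~15.75 GB/s per direction (~16 GB/s)
print(pcie_bandwidth_gbs(2, 16))  # 8.0 GB/s per direction
```

The x16 Gen 3 figure comes out just under the quoted 16 GB/s per direction because of the small 128b/130b encoding overhead; doubling it for both directions gives the 32 GB/s total the article cites.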
Source:
Donanim Haber
85 Comments on AMD Radeon HD 7000 Series to be PCI-Express 3.0 Compliant
Innovation? I never say it's not welcome.
That was very funny, TheMailMan78.
CUDA will be dead when this rolls out, or it will become massively less important and more specialized. And since AMD is going to put this design into APUs in the next two years, yep, CUDA is a dead man walking.
Now, if NVIDIA were smart, they would move to standards compliance.
As it stands, I don't see CUDA going anywhere for a long time, especially for distributed computing (F@H, anyone?) and supercomputing applications. I just don't see the point in offering a competing standard to PhysX, as it's a dead market. The problem is that game developers outside of the RTS and MMO genres have turned their back on the PC platform, and the hardware they develop for is the better part of half a decade old and can't handle good graphics, let alone PhysX.
It's an upgrade, trust me, especially when you take into account that M$ is going to stop releasing security updates for XP.
I like XP but would never go back to it. X64 XP was buggy as shit
That and no DX10/11.
I count 22 games
I wouldn't be worrying about DX11 anytime soon; I'd worry about when we peeps will get 64-bit games.
64-bit is the only thing that can maximize DX11 graphical/gameplay output.
-----------
PCI-e 2.0 -> PCI-e 3.0
Expect that SLI or Crossfire jitter/stutter to disappear.
I also find it funny that you call XP Bloated, while at the same time push Windows Vista SP3.
If you mean Windows 7, that's as far from bloated as you could get; I had it running on Riva TNT and MX440-based rigs last week.
After having to use both XP and Windows 7 on about 25 different machines each in the past two weeks, I can tell you Windows 7 is in fact not very bloated at all.
XP starts off quicker, but then the updates come and slow it right down.
How is this On topic at all?
I'm just hoping PCI-E v3 is a sign the next-gen cards will actually be able to use that bandwidth.
Well, let me see... When XP was first released, you could run it on a machine with 64MB of RAM, now with all the bloat it's accrued over the years you need at least 1GB.
I really was sad to move away from XP x64, as it was truly my favorite of the Microsoft Operating Systems (low resources, 64bit, familiar interface, Server 2003 kernel, very fast).
But will the GPU cards perform better?
Yes.
How and where they will perform better, we will find out.