
Noises About Radeon HD 7900 Series with XDR2 Memory Grow

btarunr

Editor & Senior Moderator
As early as September, we heard reports of AMD toying with Rambus XDR2 memory on its next generation of high-performance GPUs. Apart from our own community's response, that news was met with a wall of skepticism, as it was deficient in plausibility. New reports from Chinese websites have raised the topic again, with fresh rumors that AMD will attempt to implement XDR2 on some of its next-generation ultra-high-end products after all. XDR2, according to Rambus, can transport twice the amount of data per clock as GDDR5.

Apparently AMD and Rambus have had much more cordial relations than the other companies the latter has engaged in patent disputes with. In 2006, AMD settled outstanding disputes with Rambus by agreeing to pay licensing costs for certain technologies claimed by Rambus, turning over a new leaf in relations between the two. What Chinese sources are suggesting now is that AMD will design its high-end GPU (codename: "Tahiti") in a way that lets it support both GDDR5 and XDR2. Certain higher-end SKUs based on Tahiti will use XDR2, while the slightly more cost-effective SKUs will use GDDR5.

In related news, other sources told TechPowerUp that AMD could adopt a "top-to-bottom" strategy with the high-end portion of its next generation of products. This means that AMD could launch the dual-GPU "New Zealand" graphics card first, followed by single-GPU SKUs.

View at TechPowerUp Main Site
 
Sounds like a PR stunt by Rambus to try and get its share price up...
 
I have my sights on the second version of the GTX 560 Ti, but would it be wise to wait and see what AMD is cooking? I'm in no hurry to upgrade my GPU.
 
It's all about performance; does anyone have any idea about XDR2?
But looking at Rambus's financial situation, they have to start selling their work at a reasonable cost.
 
It's all about performance; does anyone have any idea about XDR2?

XDR2 has never been used on consumer PC GPUs, but Rambus makes a lot of song and dance about its "superiority over GDDR5".

XDR is used in PlayStation 3 consoles.
 
What is the difference between the two here?
 
What is the difference between the two here?

XDR2 apparently pushes twice the amount of data per clock compared to GDDR5. There is no comparison between the two from a neutral source, but here's Rambus PR video:
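For a back-of-the-envelope sense of what "twice the data per clock" could mean in practice, here is a quick sketch comparing theoretical peak bandwidth. The clock rates and bus widths below are illustrative assumptions, not confirmed specs for any Tahiti SKU:

```python
# Theoretical peak memory bandwidth (GB/s) =
#   bus width (bits) / 8 * effective data rate (GT/s).
# All figures below are illustrative assumptions, not confirmed specs.

def peak_bandwidth_gbps(bus_width_bits: int, effective_rate_gtps: float) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * effective_rate_gtps

# GDDR5 at an effective 6 GT/s on a 384-bit bus (typical high-end figures)
gddr5 = peak_bandwidth_gbps(384, 6.0)

# If XDR2 really moves twice the data per clock at a comparable interface
# clock, it could hit the same bandwidth on half the bus width.
xdr2 = peak_bandwidth_gbps(192, 12.0)

print(f"GDDR5, 384-bit @ 6 GT/s:  {gddr5:.0f} GB/s")
print(f"XDR2,  192-bit @ 12 GT/s: {xdr2:.0f} GB/s")
```

The practical appeal, if the claim holds, would be matching bandwidth with a narrower (cheaper, simpler) PCB and memory bus.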

 
XDR2 apparently pushes twice the amount of data per clock compared to GDDR5. There is no comparison between the two from a neutral source, but here's Rambus PR video:


I see, perhaps add that to the OP.

So AMD are going for that "difference" to make their products more appealing than just another generic GPU.

Either that or this stuff really does work.
 
What is the difference between the two here?

From the little I have researched: a massive performance improvement with lower wattage than GDDR5...

Just to put it simply...
 
Yeah, this is one of the things I've been waiting for. I had a pair of 128 MB Samsung PC800 RDRAM sticks 10 years ago... on a Willamette 1.6 with a 32 MB TNT2. lol. And it rocked. DDR has been the norm ever since its launch - it killed both SD and RD on the desktop side.
 
Yeah, this is one of the things I've been waiting for. I had a pair of 128 MB Samsung PC800 RDRAM 10 years ago... on a Willamette 1.6 with a TNT2 32mb. lol. And it rocked. DDR has became a norm ever since it's launch - it killed both SD and RD on the desktop side.

You are currently running DDR3-SDRAM.
 
massive performance improvement with lower wattage than gdr5...
That's how I see it: with 384-bit and 1 GB (or maybe less), AMD could achieve more bandwidth (throughput) while keeping efficiency in check and cost down.

Beats a costly 512-bit PCB crammed with 2 GB of RAM... That looks to be the trend for the competition, and we know how AMD likes to buck the status quo. If it works, it's good for the consumer.

wall of skepticism as it was deficient in plausibility...:shadedshu
That wasn't so much from the readers and respondents...
 
Yeah, this is one of the things I've been waiting for. I had a pair of 128 MB Samsung PC800 RDRAM 10 years ago... on a Willamette 1.6 with a TNT2 32mb. lol. And it rocked. DDR has became a norm ever since it's launch - it killed both SD and RD on the desktop side.

lol, you so funny. The type of RAM we use is always SDRAM :D

We should actually be saying DDR2 SDRAM or DDR3 SDRAM.
 
I think in that case it's just DDR. I've got a Willamette 1.6 myself, but the mobo broke... And my daughter's PC is still running a single-core AMD with DDR (because it's not DDR2 like my PC, much less DDR3 ;))
 
XDR2 apparently pushes twice the amount of data per clock compared to GDDR5. There is no comparison between the two from a neutral source, but here's Rambus PR video:

Rambus has an employee called Rob Dhat? Why doesn't that surprise me?
 
You are currently running DDR3-SDRAM.

Obviously. :rolleyes: I was referring to them as a whole. Yes, GDDR and DDR are different. Everybody knows that. This would be the first change in the memory type used in desktops (not just on your mobo).

lol you so funny. the type of RAM we use is always SDram :D

we should actually be saying DDR2 SDRAM or DDR3 SDRAM.

Technically, yes, but we're currently using DDR RAM. 10 years ago, there was also SDR RAM for GPUs. So this is similar, as in an alternative to GDDR (XDR). :rolleyes:
 
XDR2 has a reference clock rate of only 800 MHz, while GDDR5 has a reference clock rate of 1.5 GHz. With that, XDR2 can provide 16 bits of data per clock cycle, while GDDR5 can only provide 2 bits of data per clock cycle (which is why you see it at 1.5 GHz). XDR2 is generally more efficient and requires less power. Along with its use of FDMA, FlexLink, micro-threading, differential I/Os, etc., I have to wonder if XDR2 is OC'ing-friendly. From the initial information so far, I would think that it's not. But if that is true, I wouldn't think it's a negative thing; it already offers enough bandwidth. However, I also have to wonder if this can be achieved using current PCIe standards, or would we need a next-gen PCIe?

I would love to see a video tested using GTA V or SR3 with XDR2.
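Taking the figures quoted above at face value (they come from the post, not from verified specs), the per-pin data rates work out like this:

```python
# Per-pin data rate (Gb/s) = reference clock (GHz) * bits per clock.
# The input figures are the ones quoted in the post above, taken at
# face value; they are not verified spec-sheet numbers.

def per_pin_rate_gbps(ref_clock_ghz: float, bits_per_clock: int) -> float:
    """Per-pin data rate in Gb/s."""
    return ref_clock_ghz * bits_per_clock

xdr2 = per_pin_rate_gbps(0.8, 16)   # 800 MHz reference, 16 bits/clock
gddr5 = per_pin_rate_gbps(1.5, 2)   # 1.5 GHz reference, 2 bits/clock

print(f"XDR2 : {xdr2:.1f} Gb/s per pin")
print(f"GDDR5: {gddr5:.1f} Gb/s per pin")
```

Note that by these numbers the gap would be more than the "twice per clock" Rambus claims, which suggests the quoted bits-per-clock or clock figures aren't directly comparable.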
 
XDR 2 has a reference clock rate of only 800MHz. While GDDR5 has a reference clock rate of 1.5GHz. With that XDR2 can provide 16 bits of data per clock cycle. While GDDR5 can only provide 2 bits of data per clock cycle (which is why you see it at 1.5GHz). XDR2 is generally more efficient and requires less power. Along with it's use of FDMA, FlexLink, Micro-threading, differntial i/o's, etc I have to wonder if XDR2 is oc'ing friendly? From the initial information so far I would think that it's not. But if that is true I wouldn't think that is a negative thing. It already offers enough bandwidth. However, I also have to wonder if this can be achieved using current PCIe standards? Or would we need a next gen PCIe?

The type of memory is unrelated to PCI-E lanes. It makes little difference to performance, nowhere near what a new GPU offers over an older one.
 
Type of memory is unrelated to PCI-E lanes. It makes little difference on performance, nowhere more than what a new GPU offers over an older one.
I'm referring to the video card's overall performance (i.e., the 7970 or whatever they call it), not the IC.
 
Obviously. :rolleyes: I was referring them as a whole. Yes, GDDR and DDR are different. Everybody knows that. This is the first change on memory used under desktops (not just on your mobo).

No, you weren't referring to them as a whole. You were clearly ignorant of the fact that we use SDRAM even today. GDDR and DDR are both kinds of SDRAM. As a matter of fact, even RDRAM was a kind of SDRAM.
 
I'm referring to the video card at that point not the IC.

And it doesn't, there's nothing limiting the card from using XDR on PCI-E 2.0 x16.

No, you weren't referring them as a whole. You were clearly ignorant to the fact that we use SDRAM even today. GDDR and DDR are both kinds of SDRAM. As a matter of fact, even RDRAM was a kind of SDRAM.

Really? I'm clearly ignorant, eh? Way to go, buddy. Yes, they ARE kinds of SDRAM. But they DIFFER from each other. Full stop.
 