
AMD "Jaguar" Micro-architecture Takes the Fight to Atom with AVX, SSE4, Quad-Core

Joined
Nov 19, 2012
Messages
753 (0.17/day)
System Name Chaos
Processor Intel Core i5 4590K @ 4.0 GHz
Motherboard MSI Z97 MPower MAX AC
Cooling Arctic Cooling Freezer i30 + MX4
Memory 4x4 GB Kingston HyperX Beast 2400 GT/s CL11
Video Card(s) Palit GTX 1070 Dual @ stock
Storage 256GB Samsung 840 Pro SSD + 1 TB WD Green (Idle timer off) + 320 GB WD Blue
Display(s) Dell U2515H
Case Fractal Design Define R3
Audio Device(s) Onboard
Power Supply Corsair HX750 Platinum
Mouse CM Storm Recon
Keyboard CM Storm Quickfire Pro (MX Red)
I believe that AIDA does round-trip latency, and Ikaruga (love that game, btw) is probably saying that the GDDR5 used has a CL of 32 ns. DDR3 at 1600 MT/s CL9 has a CL of ~11.25 ns, roughly a third of that.
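For anyone who wants to check that figure, here's a minimal sketch of the cycles-to-nanoseconds conversion (the 32 ns GDDR5 number is taken from the discussion above, not from a datasheet):

```python
def cas_ns(transfer_rate_mts: float, cas_cycles: int) -> float:
    """CAS latency in ns: DDR transfers twice per clock, so the I/O clock
    runs at half the MT/s rate and one cycle lasts 2000 / MT/s ns."""
    return cas_cycles * 2000.0 / transfer_rate_mts

print(cas_ns(1600, 9))  # DDR3-1600 CL9 -> 11.25 ns
```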

Still, with some intelligent queues and cache management, this won't be too much of a problem.


## EDIT ##
Have I ever mentioned how I hate it when I get distracted when replying, only to find out I made myself look like an idiot by posting the exact same thing as the person before me? Well, I do.
Sorry Ikaruga.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,171 (2.80/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
No, and I don't really understand why I would joke about RAM timings on my favorite enthusiast site. Do you understand that I was citing the actual latency of the chip itself, and not the latency the MC will have to deal with when accessing the memory?
For example, a typical DDR3@1600 module has about 12ns latency in a modern PC.

You mean the 32ns refresh? That's not access speed, my friend; that's how often a bit in a DRAM cell is refreshed. All DRAM needs to be refreshed, since data is stored in a capacitor and needs to be replenished, as caps leak when they're disconnected from active power. Other than that, I see no mention of 32ns there.

That "32ns" sounds a lot like tRFC on DDR3 chips, not access latency.
 
Joined
Feb 18, 2011
Messages
1,259 (0.25/day)
I believe that AIDA does round-trip latency, and Ikaruga (love that game, btw) is probably saying that the GDDR5 used has a CL of 32 ns. DDR3 at 1600 MT/s CL9 has a CL of ~11.25 ns, roughly a third of that.

Still, with some intelligent queues and cache management, this won't be too much of a problem.


## EDIT ##
Have I ever mentioned how I hate it when I get distracted when replying, only to find out I made myself look like an idiot by posting the exact same thing as the person before me? Well, I do.
Sorry Ikaruga.

Yes, I meant that speed, sorry for my English :shadedshu
 
Joined
Feb 20, 2011
Messages
124 (0.02/day)
Location
UK
So Sony will be accessing it with their own version of LibGCM, which along with OpenCL 1.2 means some awesome, finer control over the hardware - something they can't really do now in the PC world, as the hardware is variable. We could see `on the fly` changes to core usage depending on whether it's a high physics load or a cut-scene movie.
 

Aquinus

Resident Wat-man
claims that the GDDR5 used has a CL of 32 ns. DDR3 at 1600 MT/s CL9 has a CL of ~11.25 ns, roughly a third of that.

Isn't that kind of moot, since GDDR5 can run at clocks three times faster than DDR3-1600? It's the same thing that happened moving from DDR to DDR2 and then to DDR3: latencies in cycles increased, but access times in nanoseconds stayed about the same, because the higher memory frequency compensates for it while also providing more bandwidth.
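A quick back-of-the-envelope illustration of that point, using typical timings for each generation (exact numbers vary from kit to kit):

```python
# (transfer rate in MT/s, typical CAS latency in cycles)
generations = {
    "DDR-400":   (400, 3),
    "DDR2-800":  (800, 6),
    "DDR3-1600": (1600, 9),
}

for name, (mts, cl) in generations.items():
    ns = cl * 2000.0 / mts     # cycle time = 2000 / MT/s ns
    gbs = mts * 8 / 1000       # 64-bit module = 8 bytes per transfer
    print(f"{name}: CL{cl} = {ns:.2f} ns, ~{gbs:.1f} GB/s per module")

# CL in cycles triples, CL in ns barely moves, bandwidth quadruples.
```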

Yeah, there might be more latency, it's possible, but I don't think it will make that much of a difference. Also, with more bandwidth you can load more data into cache per clock than with DDR3. So I think the benefits will far outweigh the costs.
 
Joined
Feb 18, 2011
Messages
1,259 (0.25/day)
Isn't that kind of moot, since GDDR5 can run at clocks three times faster than DDR3-1600? It's the same thing that happened moving from DDR to DDR2 and then to DDR3: latencies in cycles increased, but access times in nanoseconds stayed about the same, because the higher memory frequency compensates for it while also providing more bandwidth.

Yeah, there might be more latency, it's possible, but I don't think it will make that much of a difference. Also, with more bandwidth you can load more data into cache per clock than with DDR3. So I think the benefits will far outweigh the costs.

I don't think price is the reason we still don't use GDDR5 as main memory in PCs; after all, they sell graphics cards for much more than a GDDR5 RAM kit or a supporting chipset/architecture would cost. I never really read anything about overcoming the GDDR5 latency issue in the past, so that's what made me curious.

.....and Ikaruga (love that game btw)
:toast:
 
Joined
Dec 1, 2011
Messages
343 (0.07/day)
Location
Ft Stewart
System Name Queen Bee
Processor 3570k @ 4.0GHz
Motherboard Gigabyte UD3 Z77
Cooling Water Loop by EK
Memory 8GB Corsair 1600 DDR3
Video Card(s) MSI GTX 970 Gaming WaterCooled
Storage 1x Western Digital 500GB Black 1x Intel 20GB 311 SSD
Display(s) BenQ XL2420G
Case CoolTek W2
Power Supply Corsair 650Watt
Software Windows 7 Pro
we also know it will have 18 GCN clusters = 1152 GCN cores rated at 800 MHz
and it was rated at 1.84 TFLOPS or something, actually

18 GCN clusters! Can that be right? That would mean Jaguar would get 576, which is more than a 7750, and that alone is 40 W of power. Something is amiss here for me.

So 45 + 45 + say 45 again (CPU) is 135 W+!! Something has to be amiss.
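For what it's worth, the numbers in the quote do check out arithmetically, assuming the usual 64 shaders per GCN compute unit and 2 FLOPs per shader per clock (fused multiply-add):

```python
clusters  = 18                  # GCN compute units from the leak
shaders   = clusters * 64      # 64 shaders per CU -> 1152
clock_ghz = 0.8
tflops = shaders * 2 * clock_ghz / 1000   # 2 FLOPs per shader per clock (FMA)
print(shaders, f"{tflops:.2f} TFLOPS")    # 1152 1.84 TFLOPS
```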
 

Aquinus

Resident Wat-man
18 GCN clusters! Can that be right? That would mean Jaguar would get 576, which is more than a 7750, and that alone is 40 W of power. Something is amiss here for me.

So 45 + 45 + say 45 again (CPU) is 135 W+!! Something has to be amiss.

They already said that the graphics power is going to be similar to a 7870, didn't they?
 
Joined
Feb 13, 2012
Messages
523 (0.11/day)
You mean the 32ns refresh? That's not access speed, my friend; that's how often a bit in a DRAM cell is refreshed. All DRAM needs to be refreshed, since data is stored in a capacitor and needs to be replenished, as caps leak when they're disconnected from active power. Other than that, I see no mention of 32ns there.

That "32ns" sounds a lot like tRFC on DDR3 chips, not access latency.

And that is exactly what latency is, though: the RAM issues data to the CPU, and after 32 ns it refreshes to send the next batch. GPUs are highly parallel, so they aren't as affected by latency; most GPUs just need a certain amount of data to render while the RAM sends the next batch. CPUs, on the other hand, are much more random and general-purpose. For example, if certain calculations are issued from RAM but the CPU needs a second batch of data to complete the process, it has to wait another 32 ns. This is a big issue with CPUs nowadays, but I think it can be easily masked with a large enough L2 cache for the Jaguar cores (I think having 8 of them means 4 MB of cache that can be shared, meaning one core can have all 4 MB if it needs to).

Another thing I wonder is whether all 8 GB refresh at once, or whether Sony will allow the RAM to work in turns to feed the CPU/GPU more dynamically rather than in big chunks of data. (Note that Bulldozer/Piledriver have relatively large pools of L3 and L2 cache to mask their higher latency, and Steamroller adding a larger L1 cache says something as well. Not to mention how much the L3 cache affects Piledriver in Trinity, which is quite a bit slower than FX Piledriver, while Phenom II vs. Athlon II barely showed any effect thanks to its lower latency.)

18 GCN clusters! Can that be right? That would mean Jaguar would get 576, which is more than a 7750, and that alone is 40 W of power. Something is amiss here for me.

So 45 + 45 + say 45 again (CPU) is 135 W+!! Something has to be amiss.

Jaguar gets 576 what?

The highest-end Jaguar APU with its graphics cores (128 of them?) is rated at 25 W and at a much higher clock speed than 1.6 GHz (AMD said in their presentation that Jaguar will clock 10-15% higher than what Bobcat would have clocked at 28 nm), so you're talking at least over 2 GHz.

And Llano fit 400 outdated Radeon cores and 4 K10.5 cores clocked at least at 1.6 GHz before turbo, so expect Jaguar to be much more efficient on a new node and a power-efficient architecture: say 25 W max for the CPU cores only, if not less. That leaves them 75-100 W of headroom to work with (think HD 7970M, rated at 100 W; that's 1280 GCN cores at 800 MHz, while this would have 1152 GCN cores at 800 MHz, and after a year of optimization it's easily at 75 W), adding up to 100-125 W, which is very reasonable. Since it's an APU, you just need one proper cooler; think of graphics cards rated at 250 W requiring only one blower fan and a dual-slot cooler to cool both the GDDR5 chips and the GPU. In other words, the motherboard and chip can be as big as an HD 7970 (but with 100-125 W you only need something the size of an HD 7850, which is rated at 110-130 W), and then of course add the BR drive and other goodies. The main point is that cooling is no problem unless multiple chips are involved, which would require cooling the case in general rather than the chip itself with a graphics-card-style cooler.
They already said that the graphics power is going to be similar to a 7870, didn't they?
More like between the HD 7850 and HD 7970M. It seems 800 MHz is the sweet spot in terms of performance/efficiency/die size, considering an HD 7970M with 1280 GCN cores at 800 MHz sits at 100 W versus 110 W measured / 130 W rated for the HD 7850 with 1024 cores at 860 MHz.
Not to mention the mobile Pitcairn loses 30-75 W measured/rated when clocked at 800 MHz (the advertised TDP on desktop Pitcairn is 175 W, but it measured at 130 W according to the link below).

http://www.guru3d.com/articles_pages/amd_radeon_hd_7850_and_7870_review,6.html
Here is a reference regarding the measured TDP; the TDP advertised by AMD is higher, but that also accounts for other parts on the board and overclocking headroom, or whatever the case is.
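Condensing the power-budget estimate above (every figure here is this post's own guess, not an official number):

```python
cpu_w = 25           # guess: 8 Jaguar cores, 25 W max
gpu_w = (75, 100)    # guess: 1152 GCN cores @ 800 MHz, cf. the 100 W HD 7970M
print(f"APU estimate: {cpu_w + gpu_w[0]}-{cpu_w + gpu_w[1]} W")  # 100-125 W
```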
 
Joined
Sep 15, 2011
Messages
6,759 (1.40/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
I don't think price is the reason we still don't use GDDR5 as main memory in PCs; after all, they sell graphics cards for much more than a GDDR5 RAM kit or a supporting chipset/architecture would cost. I never really read anything about overcoming the GDDR5 latency issue in the past, so that's what made me curious.

:toast:

Guys, you need to stop the confusion. You CANNOT use GDDR5 in your PC as a main memory, because Graphics DDR5 is special RAM meant only for graphics. There is a very big difference between how a GPU and a CPU use RAM. Also, GDDR5 is based on DDR3, so you have already been using it for a long time. I don't know the exact specifics, but you can google it... ;)
 

Aquinus

Resident Wat-man
Guys, you need to stop the confusion. You CANNOT use GDDR5 in your PC as a main memory, because Graphics DDR5 is special RAM meant only for graphics. There is a very big difference between how a GPU and a CPU use RAM. Also, GDDR5 is based on DDR3, so you have already been using it for a long time. I don't know the exact specifics, but you can google it... ;)

Read the entire thread before you jump to conclusions; this is mainly stemming from the PS4 discussion.

GDDR5 itself can do whatever it wants; there are no packages or CPU IMCs that will handle it yet, but that does not mean it cannot be used. The PS4 is lined up to use GDDR5 for both system and graphics memory, and I suspect Sony isn't just saying that for shits and giggles.

Also, it's not all that different: latencies differ, and performance is (somewhat, not a ton) optimized for bandwidth over latency, but other than that, communication is about the same, sans two control lines for reading and writing. It's a matter of how the data is transmitted, but your statement here is actually wrong.

Just because devices don't use a particular bit of hardware to do something doesn't mean you can't use that hardware for something else. For example, for the longest time low-voltage DDR2 was used in phones and mobile devices, not DDR3. Does that mean DDR3 will never get used in smartphones? Most of us know the answer to that, and it's a solid no. GDDR5 is no different: just because it works best on video cards doesn't mean it can't be used with a CPU built with a GDDR5 memory controller.
 
Joined
Feb 18, 2011
Messages
1,259 (0.25/day)
Guys, you need to stop the confusion. You CANNOT use GDDR5 in your PC as a main memory, because Graphics DDR5 is special RAM meant only for graphics. There is a very big difference between how a GPU and a CPU use RAM. Also, GDDR5 is based on DDR3, so you have already been using it for a long time. I don't know the exact specifics, but you can google it... ;)

Please consider switching out of "write-only mode" on the forum, and read my comments if you reply to me:

many thanks:toast:
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,291 (7.53/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
Guys, you need to stop the confusion. You CANNOT use GDDR5 in your PC as a main memory.

Oh but you can. PS4 uses GDDR5 as system memory.
 
Joined
Sep 15, 2011
Messages
6,759 (1.40/day)
Please consider switching out of "write-only mode" on the forum, and read my comments if you reply to me:

many thanks:toast:

Please don't tell me what to do, or what I am allowed to do or not.

many thanks:toast:

Oh but you can. PS4 uses GDDR5 as system memory.

The PS4 is NOT a PC... But if what you all say is true, then why has nobody introduced GDDR5 for the PC?? It has been on video cards for a long time. And why is it called Graphics DDR then?
 

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
19,661 (2.86/day)
Location
Piteå
System Name Black MC in Tokyo
Processor Ryzen 5 7600
Motherboard MSI X670E Gaming Plus Wifi
Cooling Be Quiet! Pure Rock 2
Memory 2 x 16GB Corsair Vengeance
Video Card(s) XFX 6950XT Speedster MERC 319
Storage Kingston KC3000 1TB | WD Black SN750 2TB |WD Blue 1TB x 2 | Toshiba P300 2TB | Seagate Expansion 8TB
Display(s) Samsung U32J590U 4K + BenQ GL2450HT 1080p
Case Fractal Design Define R4
Audio Device(s) Plantronics 5220, Nektar SE61 keyboard
Power Supply Corsair RM850x v3
Mouse Logitech G602
Keyboard Dell SK3205
Software Windows 11 Pro
Benchmark Scores Rimworld 4K ready!
PS4 is pretty much a custom PC.

EDIT: With a custom OS.
 

btarunr

Editor & Senior Moderator
Staff member
The PS4 is NOT a PC... But if what you all say is true, then why has nobody introduced GDDR5 for the PC?? It has been on video cards for a long time. And why is it called Graphics DDR then?

The CPU and software are completely oblivious to memory type. The only component that really needs to know how the memory works at the physical level is the integrated memory controller. To every other component, memory type is irrelevant. It's the same "load" "store" "fetch" everywhere else.

Just because GDDR5 isn't a PC memory standard doesn't mean it can't be used as system main memory. It would have comparatively higher latency than DDR3, but it still yields high bandwidth. GDDR5 stores data in the same ones and zeroes as DDR3, SDR, and EDO.
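To put numbers on the bandwidth side (the PS4 figures below are the widely reported specs: GDDR5 at 5500 MT/s effective on a 256-bit bus):

```python
def peak_bandwidth_gbs(mts: float, bus_bits: int) -> float:
    """Peak bandwidth = transfers per second * bytes per transfer."""
    return mts * 1e6 * (bus_bits / 8) / 1e9

print(peak_bandwidth_gbs(1600, 128))  # dual-channel DDR3-1600: 25.6 GB/s
print(peak_bandwidth_gbs(5500, 256))  # PS4 GDDR5: 176.0 GB/s
```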
 

Aquinus

Resident Wat-man
And why is it called Graphics DDR then?

Because it is optimized for graphics, not exclusively for graphics.

You're just digging yourself into a hole.
 
Joined
Jan 22, 2013
Messages
478 (0.11/day)
System Name Desktop
Processor i5 3570k
Motherboard Asrock Z77
Cooling Corsair H60
Memory G Skill 8gb 1600 mhz X 2
Video Card(s) Sapphire Radeon 7850 X 2
Storage 1 TB Velociraptor, 240GB 840 Samsung
Display(s) 27" Samsung LED X 2
Case Thermaltake V9
Power Supply Seasonic 620 W, CX600M on stand by
Software Win 8.1 64
Benchmark Scores Benches are silly
Strange that they do this before Sony's PS4 announcement tomorrow. Both new consoles from MS and Sony are going to have these new cores in their CPUs; I thought Sony would ask them for all the "flair" they can get. It's also strange that only four cores are allowed on the PC side while there will be more in the consoles (assuming all the leaks are correct, ofc).

Game consoles are not mobile. These are designed for mobile devices with a 5-25 W power envelope.

The CPU and software are completely oblivious to memory type. The only component that really needs to know how the memory works at the physical level is the integrated memory controller. To every other component, memory type is irrelevant. It's the same "load" "store" "fetch" everywhere else.

Just because GDDR5 isn't a PC memory standard doesn't mean it can't be used as system main memory. It would have comparatively higher latency than DDR3, but it still yields high bandwidth. GDDR5 stores data in the same ones and zeroes as DDR3, SDR, and EDO.

^^tru dat!

Because it is optimized for graphics, not exclusively for graphics.

You're just digging yourself into a hole.

By graphics you probably meant bandwidth, which is correct.

I'm guessing latency is not as much of an issue when it comes to a specific design for a console rather than a broad compatibility design for PC.
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.92/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Sasmsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
I'm guessing latency is not as much of an issue when it comes to a specific design for a console rather than a broad compatibility design for PC.

My thoughts as well. These aren't meant to be generic multipurpose machines; they're meant to be gaming consoles with pre-set roles, and there's time to code each game/program to run specifically on them.


This gives game devs the ability to split that 8 GB up at will between CPU and GPU. That could really extend the life of the console, and its capabilities.
 

Aquinus

Resident Wat-man
By graphics you probably meant bandwidth, which is correct.

That goes without saying. GDDR is optimized for graphics, which performs best in high-bandwidth, high(er)-latency situations.
I'm guessing latency is not as much of an issue when it comes to a specific design for a console rather than a broad compatibility design for PC.
I'm not willing to go that far, but I'm sure they will have things to mitigate any slowdown it may cause, such as intelligent caching and pre-fetching.
 
Joined
Feb 18, 2011
Messages
1,259 (0.25/day)
My thoughts as well. These aren't meant to be generic multipurpose machines; they're meant to be gaming consoles with pre-set roles, and there's time to code each game/program to run specifically on them.
Yes, that's one of the advantages of working on closed systems like consoles. It helps a lot in both development speed and efficiency, but the development procedure is still the same.

This gives game devs the ability to split that 8 GB up at will between CPU and GPU. That could really extend the life of the console, and its capabilities.
You can't do anything else but split unified memory (this is also the case with APUs and IGPs on the PC, ofc); that's why it's called unified.
The N64 was a console and developers actually released titles on it, but that doesn't change how horrid the memory latency really was on that system, or how much extra effort and work the programmers had to put in to get over that huge limitation (probably the main reason Nintendo introduced 1T-SRAM in the GameCube, which was basically eDRAM on die).
If you really want to split unified memory into CPU and GPU memory (you can't, btw, but let's assume you could), it's extremely unlikely that developers will use more than 1-2 GB as "video memory" in the PS4: not only would the bandwidth not be enough to use more, it's simply not needed. (OK, the PS4 is extremely powerful on the bandwidth side, and perhaps there will be some new rendering technique in the future that we don't know about yet, but current methods like deferred rendering, voxels, megatexturing, etc. will run just fine using only 1-2 GB for "rendering".)
 