
AMD Plays the VRAM Card Against NVIDIA

Joined
Sep 6, 2013
Messages
3,354 (0.82/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
Yeah, I don't get that part either. Aside from publishers' hit-and-run tactics, I don't understand why anyone would be "waiting for a game" and must play it within a month of release. This used to be a thing in the 80s and 90s, when you had to purchase a physical copy or you were out of luck. But today, games are available pretty much everywhere and can be played at any time.
People don't have patience. Forget optimizations and bugs. People would rather pay $70 for every game on day one than pay half the price six months later.
 
Joined
Apr 6, 2020
Messages
70 (0.04/day)
System Name Carnival of Glass
Processor Intel i9 14900K (previously 12900K/9900K, 8086K/Xeon X5670)
Motherboard ASRock Z790 PG SONIC (Gigabyte Z690 Aorus Master, Gigabyte Z370 Aorus Gaming 7/390 Des/X58A-UD7)
Cooling Corsair Hydro open loop, 480mm XR7, 360mm XR5!
Memory 32GB Corsair Dominator 6000MT DDR5 @6466 CL36-38-38-72-114-2
Video Card(s) Zotac RTX 3090 w/Corsair XG7 block (previously 1080Ti/970) +200 core +800 RAM +shunt mod
Storage 1x 2TB Samsung Evo 980 boot, 2TB Sabrent RQ, 2x2TB Crucial MX, 2x4TB WD SN850X, 16TB NAS!
Display(s) Acer Nitro 27" 4K, Koorui 27" 2K144Hz Acer 24" 1080p LED, 65" and 55" 4K TVs
Case Corsair 7000X (previously Corsair X570 Crystal SE)
Audio Device(s) Onboard + EVGA Nu Audio Pro 7.1, Yamaha Y2K AV Amp, Rotel RX-970B + 4x Kef Coda IIIs :D
Power Supply Corsair HX1500i Modular PSU
Mouse Corsair Darkstar/M65/Logitech G502 Lightspeed (previously G600 MMO)
Keyboard Corsair K70 Pro Optomech Axon, Logitech G910 Orion Spectrum (previously G19)
VR HMD HTC Vive Cosmos x2
Software Windows 11 x64 Enterprise (legal!)
Benchmark Scores https://www.3dmark.com/spy/18709841 https://valid.x86.fr/s9zmw1 https://valid.x86.fr/t0vrwy
That's just a reality, deal with it. The fact remains AMD has you covered regardless of how shit the port is, and Nvidia doesn't.

Consoles have been defining the baseline resource needs for decades now, the writing was always on the wall. That 4070ti you got there is going to run into trouble sooner rather than later.


Yep... that alone proves the point either way: the Fury X fell off much faster than the 6GB 980 Ti, while they were about equal at launch.

Now the tables are turned.
It's pretty ironic to see crappy VRAM caps on cards from the same company pushing RT, which requires additional VRAM over non-RT, don't you think?
Though you have to give up a lot of performance and put up with more bugs to get that extra VRAM. It's a fair point though, and the card they keep playing is at least valid now. Back when 8GB of VRAM was almost too much for the games of the time, it didn't future-proof the 290 series and onwards all that much - they still sucked, but they stayed cheap and popular. They simply weren't fast enough to take advantage of that 8GB; it only really makes sense now. Frankly I would rather AMD/ATi had put more into actually making their GPUs good, along with their drivers, instead of giving you bonus RAM you won't use until 10 years later... lol

Some of the RAM fudgery was due to shortages and the pandemic too, but I'd be lying if I said it didn't also annoy me - plus AMD's PCB quality has only just improved, and their silicon is generally of a lower-quality/higher-yield type. Despite all the amazing PR and advertising AMD do, they still haven't actually "won" anything yet, even if they had a few things first (which, again, only matter now, when those machines are no longer fast enough to take full advantage of them). I almost fell for the "native quad core" and "integrated memory controller" crap too; I was a total AMD sucker.. Haha! Pretty sure nVidia copied AMD with the amount of RAM on the 3060 vs the Ti though.. 12GB on a 3060 and a 2060 - why? And indeed the 4GB of their new HBM invention on the Fury helped kill it... that, and it was kind of shit, but that was their best early chance at actually countering nVidia if they had added more RAM or used GDDR, funnily enough! But thanks to HBM, you can't even really mod them to change the RAM! They certainly have plenty of income sources beyond PCs to stay competitive though - a monopoly on 2 of the 3 popular consoles, etc. AMD have become rich by being devious and doing deals, not by being the best at what they do. Not that nobody else does that, but AMD lie to their customers far more than most. How's your "eight core" FX?
 
Joined
Jan 14, 2019
Messages
12,353 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Though you have to give up a lot of performance and put up with more bugs to get that extra VRAM. It's a fair point though, and the card they keep playing is at least valid now. Back when 8GB of VRAM was almost too much for the games of the time, it didn't future-proof the 290 series and onwards all that much - they still sucked, but they stayed cheap and popular. They simply weren't fast enough to take advantage of that 8GB; it only really makes sense now. Frankly I would rather AMD/ATi had put more into actually making their GPUs good, along with their drivers, instead of giving you bonus RAM you won't use until 10 years later... lol

Some of the RAM fudgery was due to shortages and the pandemic too, but I'd be lying if I said it didn't also annoy me - plus AMD's PCB quality has only just improved, and their silicon is generally of a lower-quality/higher-yield type. Despite all the amazing PR and advertising AMD do, they still haven't actually "won" anything yet, even if they had a few things first (which, again, only matter now, when those machines are no longer fast enough to take full advantage of them). I almost fell for the "native quad core" and "integrated memory controller" crap too; I was a total AMD sucker.. Haha! Pretty sure nVidia copied AMD with the amount of RAM on the 3060 vs the Ti though.. 12GB on a 3060 and a 2060 - why? And indeed the 4GB of their new HBM invention on the Fury helped kill it... that, and it was kind of shit, but that was their best early chance at actually countering nVidia if they had added more RAM or used GDDR, funnily enough! But thanks to HBM, you can't even really mod them to change the RAM! They certainly have plenty of income sources beyond PCs to stay competitive though - a monopoly on 2 of the 3 popular consoles, etc. AMD have become rich by being devious and doing deals, not by being the best at what they do. Not that nobody else does that, but AMD lie to their customers far more than most. How's your "eight core" FX?
I don't feel lied to by AMD. I only feel lied to by reviews that basically all say Ryzen chiplets are super easy to cool, when my experience is the exact opposite. They're good CPUs, but still. They. Run. Hot.

RDNA 2, on the other hand, is a hands-down excellent line of products: on par with Nvidia in rasterization, stability, updates, etc., and cheaper, too. I also like the AMD driver way more than Nvidia's Windows 95-style retro control panel with its way too convoluted 3D Settings menu. The only thing that's lacking is RT performance, but that's not great on any current card except maybe the 4090, anyway.
 
Joined
Mar 10, 2010
Messages
11,878 (2.21/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
I don't feel lied to by AMD. I only feel lied to by reviews that basically all say Ryzen chiplets are super easy to cool, when my experience is the exact opposite. They're good CPUs, but still. They. Run. Hot.

RDNA 2, on the other hand, is a hands-down excellent line of products: on par with Nvidia in rasterization, stability, updates, etc., and cheaper, too. I also like the AMD driver way more than Nvidia's Windows 95-style retro control panel with its way too convoluted 3D Settings menu. The only thing that's lacking is RT performance, but that's not great on any current card except maybe the 4090, anyway.
Nope, most 4090 owners on here just go on about DLSS 3 because their GPU got crippled by Portal without it, same as every other card doing actual path tracing, because guess what - we don't actually have the tech to do AAA titles with anything like real ray tracing. We're replacing one fudge with another, to the benefit of Nvidia's choice of technology to back.
I.e. we helped Nvidia prep for AI.
But don't worry, Nvidia can give you back free frames inserted between the two in your VRAM buffer that will make it go vroom, at least until you 180 faster than a casual.

I never did like v-sync or triple buffering - they all added too much lag, and still do.
 
Joined
Nov 3, 2011
Messages
695 (0.15/day)
Location
Australia
System Name Eula
Processor AMD Ryzen 9 7900X PBO
Motherboard ASUS TUF Gaming X670E Plus Wifi
Cooling Corsair H150i Elite LCD XT White
Memory Trident Z5 Neo RGB DDR5-6000 64GB (4x16GB F5-6000J3038F16GX2-TZ5NR) EXPO II, OCCT Tested
Video Card(s) Gigabyte GeForce RTX 4080 GAMING OC
Storage Corsair MP600 XT NVMe 2TB, Samsung 980 Pro NVMe 2TB, Toshiba N300 10TB HDD, Seagate Ironwolf 4T HDD
Display(s) Acer Predator X32FP 32in 160Hz 4K FreeSync/GSync DP, LG 32UL950 32in 4K HDR FreeSync/G-Sync DP
Case Phanteks Eclipse P500A D-RGB White
Audio Device(s) Creative Sound Blaster Z
Power Supply Corsair HX1000 Platinum 1000W
Mouse SteelSeries Prime Pro Gaming Mouse
Keyboard SteelSeries Apex 5
Software MS Windows 11 Pro
If you let developers off the hook, hardware will never be able to keep up.

Also, what you're describing would only happen if developers were all noobs. That's not how textures are handled (look up mipmaps, if you're curious).
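On the mipmap point: here is a minimal C++ sketch (hypothetical texture size, uncompressed RGBA8, a simple halving chain) showing that a full mip chain only costs about a third more than the base level, which is why engines stream the mip levels a scene actually needs instead of keeping every full-resolution texture resident:

#include <algorithm>
#include <cstdint>
#include <cstdio>

// Rough VRAM footprint of a full mip chain for an uncompressed RGBA8 texture.
// Illustrative only: real engines use block-compressed formats and stream
// just the mip levels the current view needs.
uint64_t MipChainBytes(uint32_t width, uint32_t height, uint32_t bytesPerTexel)
{
    uint64_t total = 0;
    for (;;) {
        total += static_cast<uint64_t>(width) * height * bytesPerTexel;
        if (width == 1 && height == 1)
            break;
        width  = std::max(1u, width / 2);
        height = std::max(1u, height / 2);
    }
    return total;
}

int main()
{
    const uint64_t base  = 4096ull * 4096ull * 4;         // mip 0 alone (64 MiB)
    const uint64_t chain = MipChainBytes(4096, 4096, 4);  // mip 0 plus all smaller levels
    std::printf("base level: %llu MiB, full chain: %llu MiB (~%.0f%% extra)\n",
                static_cast<unsigned long long>(base >> 20),
                static_cast<unsigned long long>(chain >> 20),
                100.0 * static_cast<double>(chain - base) / static_cast<double>(base));
}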
Game consoles have been defining the minimum gaming PC specs for many decades.

An 8 GB VRAM-equipped PC GPU can't match the PS5's random-access VRAM range and scope. The VRAM problem is not an issue for NVIDIA owners with 16 GB of VRAM or more.

Read


Dr. Jon Peddie, president of JPR: "Some products like Nvidia's RTX 4090 did exceptionally well despite its high price, so almost everything we thought we knew about economics and market behavior seemed to be turned on its head in Q4."

NVIDIA's VRAM tactic worked, since the RTX 4090 has 24 GB of VRAM.
 
Joined
Apr 6, 2020
Messages
70 (0.04/day)
System Name Carnival of Glass
Processor Intel i9 14900K (previously 12900K/9900K, 8086K/Xeon X5670)
Motherboard ASRock Z790 PG SONIC (Gigabyte Z690 Aorus Master, Gigabyte Z370 Aorus Gaming 7/390 Des/X58A-UD7)
Cooling Corsair Hydro open loop, 480mm XR7, 360mm XR5!
Memory 32GB Corsair Dominator 6000MT DDR5 @6466 CL36-38-38-72-114-2
Video Card(s) Zotac RTX 3090 w/Corsair XG7 block (previously 1080Ti/970) +200 core +800 RAM +shunt mod
Storage 1x 2TB Samsung Evo 980 boot, 2TB Sabrent RQ, 2x2TB Crucial MX, 2x4TB WD SN850X, 16TB NAS!
Display(s) Acer Nitro 27" 4K, Koorui 27" 2K144Hz Acer 24" 1080p LED, 65" and 55" 4K TVs
Case Corsair 7000X (previously Corsair X570 Crystal SE)
Audio Device(s) Onboard + EVGA Nu Audio Pro 7.1, Yamaha Y2K AV Amp, Rotel RX-970B + 4x Kef Coda IIIs :D
Power Supply Corsair HX1500i Modular PSU
Mouse Corsair Darkstar/M65/Logitech G502 Lightspeed (previously G600 MMO)
Keyboard Corsair K70 Pro Optomech Axon, Logitech G910 Orion Spectrum (previously G19)
VR HMD HTC Vive Cosmos x2
Software Windows 11 x64 Enterprise (legal!)
Benchmark Scores https://www.3dmark.com/spy/18709841 https://valid.x86.fr/s9zmw1 https://valid.x86.fr/t0vrwy
Why the smeg would someone mention a shitty PS5 in this thread? 16GB of shared RAM means it's already doomed, plus an AMD APU means it's... trash! ;)

Anyway, I think the other reason is... ready for a slight conspiracy? Crypto. AMD put more RAM on their Radeon cards after the 290/X series, unlike nVidia, even though the cards weren't fast enough to actually use 8GB of VRAM and run above 1080p or at 4K etc. (I'm including the RX 4/5 series here, as they're basically all the same with some small revisions.) But they were more useful for crypto mining! The only reason I couldn't use my old GTX 970 for mining is that it didn't have enough VRAM - not that I'd bother mining - but it certainly made their cards more useful for the... wrong reasons. My 290X was of course faster than the 970 that replaced it, but the 970 is still working well, as is the SSD I got at the same time, whereas the 290X died after a couple of months and arrived with damaged fans - it was an MSi card, which says it all really.
 
Joined
Nov 3, 2011
Messages
695 (0.15/day)
Location
Australia
System Name Eula
Processor AMD Ryzen 9 7900X PBO
Motherboard ASUS TUF Gaming X670E Plus Wifi
Cooling Corsair H150i Elite LCD XT White
Memory Trident Z5 Neo RGB DDR5-6000 64GB (4x16GB F5-6000J3038F16GX2-TZ5NR) EXPO II, OCCT Tested
Video Card(s) Gigabyte GeForce RTX 4080 GAMING OC
Storage Corsair MP600 XT NVMe 2TB, Samsung 980 Pro NVMe 2TB, Toshiba N300 10TB HDD, Seagate Ironwolf 4T HDD
Display(s) Acer Predator X32FP 32in 160Hz 4K FreeSync/GSync DP, LG 32UL950 32in 4K HDR FreeSync/G-Sync DP
Case Phanteks Eclipse P500A D-RGB White
Audio Device(s) Creative Sound Blaster Z
Power Supply Corsair HX1000 Platinum 1000W
Mouse SteelSeries Prime Pro Gaming Mouse
Keyboard SteelSeries Apex 5
Software MS Windows 11 Pro
Why the smeg would someone mention a shitty PS5 in this thread? 16GB of shared RAM means it's already doomed, plus an AMD APU means it's... trash! ;)

Anyway, I think the other reason is... ready for a slight conspiracy? Crypto. AMD put more RAM on their Radeon cards after the 290/X series, unlike nVidia, even though the cards weren't fast enough to actually use 8GB of VRAM and run above 1080p or at 4K etc. (I'm including the RX 4/5 series here, as they're basically all the same with some small revisions.) But they were more useful for crypto mining! The only reason I couldn't use my old GTX 970 for mining is that it didn't have enough VRAM - not that I'd bother mining - but it certainly made their cards more useful for the... wrong reasons. My 290X was of course faster than the 970 that replaced it, but the 970 is still working well, as is the SSD I got at the same time, whereas the 290X died after a couple of months and arrived with damaged fans - it was an MSi card, which says it all really.
Your RTX 3090's 24 GB of VRAM wrecking NVIDIA's 8 GB VRAM cards is a good enough narrative. :laugh:

Heavy VRAM usage is useful for a game's fundamental artwork.

The PS4 / PS4 Pro era had a 4 GB to 5.5 GB VRAM game design budget, hence 6 GB to 8 GB of VRAM was good enough for PCs until the PS5 / XSX install base reached sizable numbers, e.g. 58 million.

The Xbox Series X was designed with a 10 GB fast memory range as its primary VRAM range, and it allocates 13.5 GB of memory to games.

Consoles don't have the PC's duplicated-data problem.

READ https://www.techpowerup.com/306713/...ultaneous-access-to-vram-for-cpu-and-gpu?cp=2

Microsoft has implemented two new features into its DirectX 12 API - GPU Upload Heaps and Non-Normalized sampling have been added via the latest Agility SDK 1.710.0 preview, and the former looks to be the more intriguing of the pair. The SDK preview is only accessible to developers at the present time, since its official introduction on Friday 31 March. Support has also been initiated via the latest graphics drivers issued by NVIDIA, Intel, and AMD. The Microsoft team has this to say about the preview version of GPU upload heaps feature in DirectX 12: "Historically a GPU's VRAM was inaccessible to the CPU, forcing programs to have to copy large amounts of data to the GPU via the PCI bus. Most modern GPUs have introduced VRAM resizable base address register (BAR) enabling Windows to manage the GPU VRAM in WDDM 2.0 or later."

They continue to describe how the update allows the CPU to gain access to the pool of VRAM on the connected graphics card: "With the VRAM being managed by Windows, D3D now exposes the heap memory access directly to the CPU! This allows both the CPU and GPU to directly access the memory simultaneously, removing the need to copy data from the CPU to the GPU increasing performance in certain scenarios." This GPU optimization could offer many benefits in the context of computer games, since memory requirements continue to grow in line with an increase in visual sophistication and complexity.

A shared pool of memory between the CPU and GPU will eliminate the need to keep duplicates of the game scenario data in both system memory and graphics card VRAM, therefore resulting in a reduced data stream between the two locations. Modern graphics cards have tended to feature very fast on-board memory standards (GDDR6) in contrast to main system memory (DDR5 at best). In theory the CPU could benefit greatly from exclusive access to a pool of ultra quick VRAM, perhaps giving an early preview of a time when DDR6 becomes the daily standard in main system memory.
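To make the GPU Upload Heaps description above concrete, here is a minimal C++ sketch, assuming the preview Agility SDK headers are available and resizable BAR is enabled. It checks the GPUUploadHeapSupported capability and, if present, creates a buffer in the new D3D12_HEAP_TYPE_GPU_UPLOAD heap so the CPU can write straight into VRAM without a staging copy; treat the exact resource setup as illustrative rather than production code:

#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Sketch: place a buffer in CPU-writable VRAM via the GPU Upload Heaps feature
// (Agility SDK 1.710.0 preview). Assumes 'device' is a valid ID3D12Device created
// with the preview SDK active.
ComPtr<ID3D12Resource> CreateGpuUploadBuffer(ID3D12Device* device, UINT64 sizeInBytes)
{
    // 1. Ask whether this driver/OS combination exposes GPU upload heaps.
    D3D12_FEATURE_DATA_D3D12_OPTIONS16 options16 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS16,
                                           &options16, sizeof(options16))) ||
        !options16.GPUUploadHeapSupported)
        return nullptr;  // fall back to the classic upload-heap + copy path

    // 2. Create the resource directly in VRAM, but CPU-visible.
    D3D12_HEAP_PROPERTIES heapProps = {};
    heapProps.Type = D3D12_HEAP_TYPE_GPU_UPLOAD;

    D3D12_RESOURCE_DESC desc = {};
    desc.Dimension        = D3D12_RESOURCE_DIMENSION_BUFFER;
    desc.Width            = sizeInBytes;
    desc.Height           = 1;
    desc.DepthOrArraySize = 1;
    desc.MipLevels        = 1;
    desc.SampleDesc.Count = 1;
    desc.Layout           = D3D12_TEXTURE_LAYOUT_ROW_MAJOR;

    ComPtr<ID3D12Resource> buffer;
    if (FAILED(device->CreateCommittedResource(&heapProps, D3D12_HEAP_FLAG_NONE, &desc,
                                               D3D12_RESOURCE_STATE_COMMON, nullptr,
                                               IID_PPV_ARGS(&buffer))))
        return nullptr;

    // 3. The CPU can now Map() the buffer and write; the GPU reads the same memory,
    //    so no separate staging buffer or CopyBufferRegion is needed.
    return buffer;
}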

---
Your argument is based on ignorance.

AMD's Fusion is coming for the gaming PC.
 
Joined
Apr 6, 2020
Messages
70 (0.04/day)
System Name Carnival of Glass
Processor Intel i9 14900K (previously 12900K/9900K, 8086K/Xeon X5670)
Motherboard ASRock Z790 PG SONIC (Gigabyte Z690 Aorus Master, Gigabyte Z370 Aorus Gaming 7/390 Des/X58A-UD7)
Cooling Corsair Hydro open loop, 480mm XR7, 360mm XR5!
Memory 32GB Corsair Dominator 6000MT DDR5 @6466 CL36-38-38-72-114-2
Video Card(s) Zotac RTX 3090 w/Corsair XG7 block (previously 1080Ti/970) +200 core +800 RAM +shunt mod
Storage 1x 2TB Samsung Evo 980 boot, 2TB Sabrent RQ, 2x2TB Crucial MX, 2x4TB WD SN850X, 16TB NAS!
Display(s) Acer Nitro 27" 4K, Koorui 27" 2K144Hz Acer 24" 1080p LED, 65" and 55" 4K TVs
Case Corsair 7000X (previously Corsair X570 Crystal SE)
Audio Device(s) Onboard + EVGA Nu Audio Pro 7.1, Yamaha Y2K AV Amp, Rotel RX-970B + 4x Kef Coda IIIs :D
Power Supply Corsair HX1500i Modular PSU
Mouse Corsair Darkstar/M65/Logitech G502 Lightspeed (previously G600 MMO)
Keyboard Corsair K70 Pro Optomech Axon, Logitech G910 Orion Spectrum (previously G19)
VR HMD HTC Vive Cosmos x2
Software Windows 11 x64 Enterprise (legal!)
Benchmark Scores https://www.3dmark.com/spy/18709841 https://valid.x86.fr/s9zmw1 https://valid.x86.fr/t0vrwy
Consoles are kids' toys! :p The PS4 Pro can allocate RAM between system and GPU depending on what the game asks for. Ask my PS4 Pro Dev/Test kit! Also, you seem to be prattling on about ReBAR for some reason... I don't have an XSX or PS5; they were outdated on release thanks to the RTX 3090 ;)

And I couldn't give a fuck about AMD. Go away, Lisa. Bad enough she fucked up chip designs in the 80s!

You're ignorant if you think I got a 3090 just for gaming! :roll:
 
Joined
Dec 29, 2022
Messages
222 (0.31/day)
Well, what about that Neural Texture Compression tech? Seemed rather... innovative.
 
Joined
Nov 3, 2011
Messages
695 (0.15/day)
Location
Australia
System Name Eula
Processor AMD Ryzen 9 7900X PBO
Motherboard ASUS TUF Gaming X670E Plus Wifi
Cooling Corsair H150i Elite LCD XT White
Memory Trident Z5 Neo RGB DDR5-6000 64GB (4x16GB F5-6000J3038F16GX2-TZ5NR) EXPO II, OCCT Tested
Video Card(s) Gigabyte GeForce RTX 4080 GAMING OC
Storage Corsair MP600 XT NVMe 2TB, Samsung 980 Pro NVMe 2TB, Toshiba N300 10TB HDD, Seagate Ironwolf 4T HDD
Display(s) Acer Predator X32FP 32in 160Hz 4K FreeSync/GSync DP, LG 32UL950 32in 4K HDR FreeSync/G-Sync DP
Case Phanteks Eclipse P500A D-RGB White
Audio Device(s) Creative Sound Blaster Z
Power Supply Corsair HX1000 Platinum 1000W
Mouse SteelSeries Prime Pro Gaming Mouse
Keyboard SteelSeries Apex 5
Software MS Windows 11 Pro
Consoles are kids' toys! :p The PS4 Pro can allocate RAM between system and GPU depending on what the game asks for. Ask my PS4 Pro Dev/Test kit! Also, you seem to be prattling on about ReBAR for some reason... I don't have an XSX or PS5; they were outdated on release thanks to the RTX 3090 ;)

And I couldn't give a fuck about AMD. Go away, Lisa. Bad enough she fucked up chip designs in the 80s!

You're ignorant if you think I got a 3090 just for gaming! :roll:
Consoles don't pretend to be workstations. ReBAR is only part of the solution for reducing the PC's duplicated-data issues; DirectX 12's GPU Upload Heaps is the other half.

The XSX and PS5 target a lower price point and the near-idiot-proof box-gamer market. The digital edition PS5 (with 16 GB of GDDR6-14000 and 512 MB of DDR4) has a $399 asking price, and the RTX 3090 can't match that. An RTX 3090 is not functional without a host x86 PC.

I didn't assume the RTX 3090 is just used for games; that's your assumption.

"8 GB VRAM" can't match the consoles' 10 to 12 GB VRAM random-access pattern.

The RX 480 / RX 580 were under Raja Koduri's administration. For raster, and before Raja Koduri, the R9 290X had already reached 64 ROPs. Under Raja "Mr TFLOPS" Koduri, raster stayed stalled at 64 ROPs through the Vega 64, Vega II, and RX 5700 XT generations.

Lisa Su wasn't responsible for AMD's failed Bulldozer.

Lisa Su's team was responsible for CELL, which is a flawed design, alongside NVIDIA's flawed, 32-bit-compute-incompetent GeForce 7-based RSX. Don't assume I'm not aware of the PowerPC debacle. Hint: Amiga PowerPC.
 
Joined
Apr 6, 2020
Messages
70 (0.04/day)
System Name Carnival of Glass
Processor Intel i9 14900K (previously 12900K/9900K, 8086K/Xeon X5670)
Motherboard ASRock Z790 PG SONIC (Gigabyte Z690 Aorus Master, Gigabyte Z370 Aorus Gaming 7/390 Des/X58A-UD7)
Cooling Corsair Hydro open loop, 480mm XR7, 360mm XR5!
Memory 32GB Corsair Dominator 6000MT DDR5 @6466 CL36-38-38-72-114-2
Video Card(s) Zotac RTX 3090 w/Corsair XG7 block (previously 1080Ti/970) +200 core +800 RAM +shunt mod
Storage 1x 2TB Samsung Evo 980 boot, 2TB Sabrent RQ, 2x2TB Crucial MX, 2x4TB WD SN850X, 16TB NAS!
Display(s) Acer Nitro 27" 4K, Koorui 27" 2K144Hz Acer 24" 1080p LED, 65" and 55" 4K TVs
Case Corsair 7000X (previously Corsair X570 Crystal SE)
Audio Device(s) Onboard + EVGA Nu Audio Pro 7.1, Yamaha Y2K AV Amp, Rotel RX-970B + 4x Kef Coda IIIs :D
Power Supply Corsair HX1500i Modular PSU
Mouse Corsair Darkstar/M65/Logitech G502 Lightspeed (previously G600 MMO)
Keyboard Corsair K70 Pro Optomech Axon, Logitech G910 Orion Spectrum (previously G19)
VR HMD HTC Vive Cosmos x2
Software Windows 11 x64 Enterprise (legal!)
Benchmark Scores https://www.3dmark.com/spy/18709841 https://valid.x86.fr/s9zmw1 https://valid.x86.fr/t0vrwy
Hint: I repair, re-build and service Amigas a lot in my spare time :)

Not reading much of the other crap you're typing, to be honest; my GPU has GDDR6X so I don't care, but.. At least my PS4 Pro Dev/Test kit can be made more useful with Linux.. Just a shame the GPU driver only really works properly on the Slim I have, due to different GPU archs, of course.

Bulldozer will go down in history as the biggest waste of silicon ever made. And your asking price is bollocks; PS5s were £850 for a long time, as was the XSX, lol.

Also, anyone have any idea how much my PS4 Pro is worth? :roll: I only paid £1,139 for my 3090, mate, as I sold my 1080Ti with custom cooling for £500 ;P

Oh yeah, that was it, CELL! God that was a piece of shit but my PS3s are only useful due to being jailbroken.. PS5 has sat in its box for years waiting for that one glorious day we can exploit it! I was patient with my PS4s and it paid off!

Did you forget about Lisa Su's earlier work on something like the Atari chips or some other early computer/console systems? I'd have to google which now, but she fucked it up. :p Pretty sure it was Jay Miner that did the Atari 7800 chips actually.. No wonder I like mine! Can't say I like the Amiga any more though; such poor system design for the time really, they were bound to fail, and good god, the OS and hardware are often about as stable as an Italian taxi driver who's got stuck behind two old priests in a Skoda!


RIP.. :love:
 
Joined
Nov 3, 2011
Messages
695 (0.15/day)
Location
Australia
System Name Eula
Processor AMD Ryzen 9 7900X PBO
Motherboard ASUS TUF Gaming X670E Plus Wifi
Cooling Corsair H150i Elite LCD XT White
Memory Trident Z5 Neo RGB DDR5-6000 64GB (4x16GB F5-6000J3038F16GX2-TZ5NR) EXPO II, OCCT Tested
Video Card(s) Gigabyte GeForce RTX 4080 GAMING OC
Storage Corsair MP600 XT NVMe 2TB, Samsung 980 Pro NVMe 2TB, Toshiba N300 10TB HDD, Seagate Ironwolf 4T HDD
Display(s) Acer Predator X32FP 32in 160Hz 4K FreeSync/GSync DP, LG 32UL950 32in 4K HDR FreeSync/G-Sync DP
Case Phanteks Eclipse P500A D-RGB White
Audio Device(s) Creative Sound Blaster Z
Power Supply Corsair HX1000 Platinum 1000W
Mouse SteelSeries Prime Pro Gaming Mouse
Keyboard SteelSeries Apex 5
Software MS Windows 11 Pro
Hint: I repair, re-build and service Amigas a lot in my spare time :)

Not reading much of the other crap you're typing to be honest,
You argued "And i couldn't give a fuck about AMD. Go away Lisa. Bad enough she fucked up chip designs in the 80s!" bullshit.

The UK is a small country and I don't care i.e. that's your problem.

PS4 / PS4 Pro with modified desktop Linux is slow due to netbook-class Jaguar CPUs.

Article date: November 24, 2022

The standard PlayStation 5 debuted at £449 / $499 / AU$749.

As of November 24, 2022: £479.99 / $499 / AU$799.95.

As for the disc-less PS5 Digital Edition, Sony's August 2022 price rises nudged that up to £389.99 / $399.99 / AU$649.95.

Paying more than £479.99 means you weren't paying the official price. You're the one talking bollocks.

Did you forget about Lisa Su's earlier work on something like the Atari chips or some other early computer/console systems? I'd have to google which now, but she fucked it up. :p Pretty sure it was Jay Miner that did the Atari 7800 chips actually.. No wonder I like mine! Can't say I like the Amiga any more though; such poor system design for the time really, they were bound to fail, and good god, the OS and hardware are often about as stable as an Italian taxi driver who's got stuck behind two old priests in a Skoda!


RIP.. :love:
Lisa Su wasn't at Atari, and she wasn't old enough to be developing 1980s game consoles. I paid very little attention to the Atari 7800 and its shovelware games; the best 8-bit game console was the Nintendo Entertainment System.

Lisa Su's summer jobs were at Analog Devices.

Lisa Su obtained her master's degree from MIT in 1991. From 1990 to 1994, she studied for her Ph.D. under MIT advisor Dimitri Antoniadis.

In June 1994, Su became a member of the technical staff at Texas Instruments.

Atari's downfall happened during Jack Tramiel's "Commodore Mk2", aka Atari Corporation.

CEO Jack Tramiel's cost-cutting was the downfall of Commodore Semiconductor Group (CSG)'s MOS Technology 65xx CPU family design, which pushed Acorn to develop the ARM CPU. LOL.

The main reason for the switch from the CSG/MOS 65xx CPU family to Motorola's 68000 family was CEO Jack Tramiel's cost-cutting of CPU R&D; the CSG/MOS 65xx couldn't keep up with Intel's x86 evolution.

Commodore International Limited and CEO Jack Tramiel (and his "Commodore Mk2" Atari Corporation) are the major factors, NOT Lisa Su!

Lisa Su should be judged on her time as CEO of a company!

Your narrative is a load of crap.


Oh yeah, that was it, CELL! God that was a piece of shit but my PS3s are only useful due to being jailbroken.. PS5 has sat in its box for years waiting for that one glorious day we can exploit it! I was patient with my PS4s and it paid off!

Read "Race for a New Game Machine: Creating the Chips Inside the Xbox 360 and the PlayStation 3" book which led design for IBM's PPE and SPE.

David Shippy was one of the lead architects for the POWER2*, G3 PowerPC, and POWER4* processor designs. He is the chief architect for the power processing unit for the Cell processor. https://www.goodreads.com/book/show/22121796-race-for-a-new-game-machine

Lisa Su's role is to represent IBM when they interface with Sony and Toshiba.

For a raster graphics workload, STI's CELL's SPU wasn't designed like ATI's Xenos GpGPU. IBM PPE's effective GFLOPS claims are rubbish i.e. they are worst than Jaguar CPUs. IBM lacks modern GPU design experience when compared to AMD(ATI) or NVIDIA or VIA S3.

From https://forum.beyond3d.com/threads/...-rsxs-lack-of-power.48995/page-6#post-1460125
On NVIDIA's GeForce 7 and the PS3 RSX GPU's design flaws
------------------------
Unmasking NVIDIA's "The Way It's Meant to Be Played" during the GeForce 7 series.

"I could go on for pages listing the types of things the spu's are used for to make up for the machines aging gpu, which may be 7 series NVidia but that's basically a tweaked 6 series NVidia for the most part. But I'll just type a few off the top of my head:"
1) Two ppu/vmx units
There are three ppu/vmx units on the 360, and just one on the PS3. So any load on the 360's remaining two ppu/vmx units must be moved to spu.
2) Vertex culling
You can look back a few years at my first post talking about this, but it's common knowledge now that you need to move as much vertex load as possible to spu otherwise it won't keep pace with the 360.
3) Vertex texture sampling
You can texture sample in vertex shaders on 360 just fine, but it's unusably slow on PS3. Most multi platform games simply won't use this feature on 360 to make keeping parity easier, but if a dev does make use of it then you will have no choice but to move all such functionality to spu.
4) Shader patching
Changing variables in shader programs is cake on the 360. Not so on the PS3 because they are embedded into the shader programs. So you have to use spu's to patch your shader programs.
5) Branching
You never want a lot of branching in general, but when you do really need it the 360 handles it fine, PS3 does not. If you are stuck needing branching in shaders then you will want to move all such functionality to spu.
6) Shader inputs
You can pass plenty of inputs to shaders on 360, but do it on PS3 and your game will grind to a halt. You will want to move all such functionality to spu to minimize the amount of inputs needed on the shader programs.
7) MSAA alternatives
Msaa runs full speed on 360 gpu needing just cpu tiling calculations. Msaa on PS3 gpu is very slow. You will want to move msaa to spu as soon as you can.
8) Post processing
360 is unified architecture meaning post process steps can often be slotted into gpu idle time. This is not as easily doable on PS3, so you will want to move as much post process to spu as possible.
9) Load balancing
360 gpu load balances itself just fine since it's unified. If the load on a given frame shifts to heavy vertex or heavy pixel load then you don't care. Not so on PS3 where such load shifts will cause frame drops. You will want to shift as much load as possible to spu to minimize your peak load on the gpu.
10) Half floats
You can use full floats just fine on the 360 gpu. On the PS3 gpu they cause performance slowdowns. If you really need/have to use shaders with many full floats then you will want to move such functionality over to the spu's.
11) Shader array indexing
You can index into arrays in shaders on the 360 gpu no problem. You can't do that on PS3. If you absolutely need this functionality then you will have to either rework your shaders or move it all to spu.
Etc, etc, etc...

NVIDIA's GeForce 8 (CUDA) series was a large improvement.

AmigaOS wasn't designed around an MMU since the 68000 did NOT have one. LOL. "Old school" Unix vendors with 68000-based workstations used custom MMUs until the 68010's slow 68451 and the 68020's 68851 add-on MMUs. The 68030 was released in 1987 with a built-in MMU; Motorola wasn't taking Unix seriously and the 68030 was late to the party. Many "old school" Unix vendors started their own RISC CPU R&D because of Motorola's inferior roadmap.

Commodore's toy mindset gave the Amiga 1200 an MMU-less 68EC020 CPU baseline and doubled down with the Amiga 4000's MMU-less 68EC030. Motorola shouldn't have offered the MMU-less 68EC030.

Microsoft Xenix on the Intel 80286 (with a built-in MMU since 1982) dominated UNIX shipments in the 1980s. The Intel 80386 had a built-in MMU for Xenix, Linux, and Windows NT; Linux originated on an 80386-based PC. Unlike Motorola, Intel has been consistent about built-in MMUs from the 286 and 386 to the current date.

Though you have to give up a lot of performance and put up with more bugs to get that extra VRAM. It's a fair point though, and the card they keep playing is at least valid now. Back when 8GB of VRAM was almost too much for the games of the time, it didn't future-proof the 290 series and onwards all that much - they still sucked, but they stayed cheap and popular. They simply weren't fast enough to take advantage of that 8GB; it only really makes sense now. Frankly I would rather AMD/ATi had put more into actually making their GPUs good, along with their drivers, instead of giving you bonus RAM you won't use until 10 years later... lol

Some of the RAM fudgery was due to shortages and the pandemic too, but I'd be lying if I said it didn't also annoy me - plus AMD's PCB quality has only just improved, and their silicon is generally of a lower-quality/higher-yield type. Despite all the amazing PR and advertising AMD do, they still haven't actually "won" anything yet, even if they had a few things first (which, again, only matter now, when those machines are no longer fast enough to take full advantage of them). I almost fell for the "native quad core" and "integrated memory controller" crap too; I was a total AMD sucker.. Haha! Pretty sure nVidia copied AMD with the amount of RAM on the 3060 vs the Ti though.. 12GB on a 3060 and a 2060 - why? And indeed the 4GB of their new HBM invention on the Fury helped kill it... that, and it was kind of shit, but that was their best early chance at actually countering nVidia if they had added more RAM or used GDDR, funnily enough! But thanks to HBM, you can't even really mod them to change the RAM! They certainly have plenty of income sources beyond PCs to stay competitive though - a monopoly on 2 of the 3 popular consoles, etc. AMD have become rich by being devious and doing deals, not by being the best at what they do. Not that nobody else does that, but AMD lie to their customers far more than most. How's your "eight core" FX?
Intel has E-cores... Look in the mirror, hypocrite.

Bulldozer Chief Architect removed from AMD, https://techreport.com/forums/viewtopic.php?t=85582 (Date: Dec 24, 2012)

Mike Butler, Chief Architect of the Bulldozer architecture, apparently doesn't work for AMD anymore.

Mike Butler is currently an Architect at Samsung.
 
Joined
Jun 6, 2022
Messages
622 (0.68/day)
Reviving a thread with an OT argument
AMD does have a point here, and Hardware Unboxed did offer us a good reason to avoid models that are VRAM-limited even today.
Good luck shopping!
Meanwhile, in the real world and beyond HU donuts...

AMD would not have pulled the donuts from the stall if they did not smell suspicious. I know they are vulnerable on this subject and act through intermediaries, such as HU.

Intel has E-cores... Look in the mirror, hypocrite.
I had an FX 8300 and I wanted to brag about multithreading. Eight cores - at that time, you could only find that in Intel processors 10 times more expensive.
Disaster!
The second disaster came when the little Pentium G4560 destroyed this FX in games with a GTX 960.
It certainly doesn't compare to E-cores; "8 cores" was one of AMD's big lies, much bigger than the current one about VRAM.

The E-cores in the 14700K outperform the 3700X in multicore tests. They are not really small. These efficiency cores have made AMD supporters avoid the subject of rendering, encoding and multitasking.
ecores.jpg


My 13500 competes with the 7700/X rather than the 7600/X thanks to these E-cores.
 
Joined
Jun 2, 2017
Messages
9,210 (3.36/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Good luck shopping!
Meanwhile, in the real world and beyond HU donuts...

AMD would not have pulled the donuts from the stall if they did not smell suspicious. I know they are vulnerable on this subject and act through intermediaries, such as HU.


I had an FX 8300 and I wanted to brag about multithreading. Eight cores - at that time, you could only find that in Intel processors 10 times more expensive.
Disaster!
The second disaster came when the little Pentium G4560 destroyed this FX in games with a GTX 960.
It certainly doesn't compare to E-cores; "8 cores" was one of AMD's big lies, much bigger than the current one about VRAM.

The E-cores in the 14700K outperform the 3700X in multicore tests. They are not really small. These efficiency cores have made AMD supporters avoid the subject of rendering, encoding and multitasking.
View attachment 331137

My 13500 competes with the 7700/X rather than the 7600/X thanks to these E-cores.
Calling HUB an AMD shill is so laughable I stopped reading after that.
 
Joined
May 11, 2020
Messages
238 (0.14/day)
The E-cores in the 14700K outperform the 3700X in multicore tests. They are not really small. These efficiency cores have made AMD supporters avoid the subject of rendering, encoding and multitasking.
Intel's E-cores are a really ugly piece of technology, and the worst part is that instead of adding more proper P-cores (making it 9 or 10 P-cores) or expanding the L3 cache by a few megabytes, they added more ugly E-cores, which have the energy efficiency of a ~5-year-old CPU, just to bump the core-count number on the box.
 
Joined
Jan 20, 2019
Messages
1,561 (0.73/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
Good luck shopping!
Meanwhile, in the real world and beyond HU donuts...

AMD would not have pulled the donuts from the stall if they did not smell suspicious. I know they are vulnerable on this subject and act through intermediaries, such as HU.


I had an FX 8300 and I wanted to brag about multithreading. Eight cores - at that time, you could only find that in Intel processors 10 times more expensive.
Disaster!
The second disaster came when the little Pentium G4560 destroyed this FX in games with a GTX 960.
It certainly doesn't compare to E-cores; "8 cores" was one of AMD's big lies, much bigger than the current one about VRAM.

The E-cores in the 14700K outperform the 3700X in multicore tests. They are not really small. These efficiency cores have made AMD supporters avoid the subject of rendering, encoding and multitasking.
View attachment 331137

My 13500 competes with the 7700/X rather than the 7600/X thanks to these E-cores.

Oh look, Gica the e-core Cornish pasty came out of the closet, looooooooool. I actually like having you around - always good comedy.

E-cores independently beating a 3700X... that's extremely impressive! Absolutely, if I were buying a system today with multithreading performance in mind, Intel's 13th gen would be right up there in the list of options. Either that or the 7950X with a cheaper cooler + forward-gen upgrade support on AM5. Can't go wrong with either option.

Are you still defending 8GB of VRAM for everyone? Honestly, I respect this kind of obstinate commitment to living by the 8GB sword. If it's perfectly in line with your personal gaming preferences and performance goals (now and going forward), that's an amazing place to be. Some of us demand more and end up paying the penalty of higher premiums. We need more Gicas! 8-G-B for life!
 
Joined
Jul 20, 2020
Messages
1,135 (0.71/day)
System Name Gamey #1 / #3
Processor Ryzen 7 5800X3D / Ryzen 7 5700X3D
Motherboard Asrock B450M P4 / MSi B450 ProVDH M
Cooling IDCool SE-226-XT / IDCool SE-224-XTS
Memory 32GB 3200 CL16 / 16GB 3200 CL16
Video Card(s) PColor 6800 XT / GByte RTX 3070
Storage 4TB Team MP34 / 2TB WD SN570
Display(s) LG 32GK650F 1440p 144Hz VA
Case Corsair 4000Air / TT Versa H18
Audio Device(s) Dragonfly Black
Power Supply EVGA 650 G3 / EVGA BQ 500
Mouse JSCO JNL-101k Noiseless
Keyboard Steelseries Apex 3 TKL
Software Win 10, Throttlestop
Good luck shopping!
Meanwhile, in the real world and beyond HU donuts...

AMD would not have pulled the donuts from the stall if they did not smell suspicious. I know they are vulnerable on this subject and act through intermediaries, such as HU.


I had an FX 8300 and I wanted to brag about multithreading. Eight cores - at that time, you could only find that in Intel processors 10 times more expensive.
Disaster!
The second disaster came when the little Pentium G4560 destroyed this FX in games with a GTX 960.
It certainly doesn't compare to E-cores; "8 cores" was one of AMD's big lies, much bigger than the current one about VRAM.

The E-cores in the 14700K outperform the 3700X in multicore tests. They are not really small. These efficiency cores have made AMD supporters avoid the subject of rendering, encoding and multitasking.
View attachment 331137

My 13500 competes with the 7700/X rather than the 7600/X thanks to these E-cores.

You necroed a thread to answer a 9-month-old post and compare 12 E-cores to a 4.5-year-old 8-core processor from two generations ago? What for? The comparison is meaningless.

And to post a video that shows that a $500 GPU is sometimes faster than a $400 GPU and sometimes not. Not really a good argument for the $500 GPU.

You seem confused.
 
Joined
Jan 14, 2019
Messages
12,353 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Good luck shopping!
Meanwhile, in the real world and beyond HU donuts...

AMD would not have pulled the donuts from the stall if they did not smell suspicious. I know they are vulnerable on this subject and act through intermediaries, such as HU.


I had an FX 8300 and I wanted to brag about multithreading. Eight cores - at that time, you could only find that in Intel processors 10 times more expensive.
Disaster!
The second disaster came when the little Pentium G4560 destroyed this FX in games with a GTX 960.
It certainly doesn't compare to E-cores; "8 cores" was one of AMD's big lies, much bigger than the current one about VRAM.

The E-cores in the 14700K outperform the 3700X in multicore tests. They are not really small. These efficiency cores have made AMD supporters avoid the subject of rendering, encoding and multitasking.
View attachment 331137

My 13500 competes with the 7700/X rather than the 7600/X thanks to these E-cores.
You seriously had to revive a 6-month-old thread to promote how good Intel's E-cores are against AMD's 5-year-old architecture in the CPU-Z benchmark, which is indicative of absolutely nothing? Dude, are you for real? :kookoo:
 