Hint: I repair, re-build and service Amigas a lot in my spare time
Not reading much of the other crap you're typing, to be honest.
You argued "And i couldn't give a fuck about AMD. Go away Lisa. Bad enough she fucked up chip designs in the 80s!" bullshit.
The UK is a small country and I don't care i.e. that's your problem.
PS4 / PS4 Pro with modified desktop Linux is slow due to netbook-class Jaguar CPUs.
"We compare the prices, specs and features of Sony's PlayStation 5 consoles"
www.whathifi.com, article date: November 24, 2022
The standard PlayStation 5 debuted at £449 / $499 / AU $749; as of November 24, 2022 it's £479.99 / $499 / AU $799.95.
As for the disc-less PS5 Digital Edition, Sony's August 2022 price rises have nudged that up to £389.99 / $399.99 / AU $649.95.
Paying more than £479.99 is not the official price. You're the real bollocks.
Did you forget about Lisa Su's earlier work on something like the Atari chips or some other early computer/console system? I'd have to google which now, but she fucked it up.
Pretty sure it was Jay Miner who did the Atari 7800 chips, actually... No wonder I like mine! Can't say I like the Amiga any more though; such poor system design for the time, really. They were bound to fail, and good god, the OS and hardware are often about as stable as an Italian taxi driver who's got stuck behind two old priests in a Skoda!
en.wikipedia.org
RIP..
Lisa Su wasn't at Atari, and she wasn't old enough to be developing 1980s game consoles. I paid very little attention to the Atari 7800 and its shovelware games; the best 8-bit game console was the Nintendo Entertainment System.
Lisa Su's summer jobs were at Analog Devices. She obtained her master's degree from MIT in 1991. From 1990 to 1994, she studied for her Ph.D. under MIT advisor Dimitri Antoniadis. In June 1994, Su became a member of the technical staff at Texas Instruments.
Atari's downfall came during Jack Tramiel's Commodore Mk2, aka Atari Corporation.
CEO Jack Tramiel's cost-cutting was the downfall of Commodore Semiconductor Group (CSG)'s MOS Technology 65xx CPU family design, which caused Acorn to develop the ARM CPU. LOL.
The main reason for switching from the CSG/MOS 65xx CPU family to Motorola's 68000 CPU family was CEO Jack Tramiel's cost-cutting of the CPU R&D; the CSG/MOS 65xx couldn't keep up with Intel's X86 evolution.
Commodore International Limited and CEO Jack Tramiel (and his Commodore Mk2, Atari Corporation) are the major factors, NOT Lisa Su!
Lisa Su should be judged on her time as the CEO of a company!
Your narrative is a load of crap.
Oh yeah, that was it, CELL! God, that was a piece of shit, but my PS3s are only useful due to being jailbroken... The PS5 has sat in its box for years waiting for that one glorious day we can exploit it! I was patient with my PS4s and it paid off!
Read the book "Race for a New Game Machine: Creating the Chips Inside the Xbox 360 and the PlayStation 3", written by the engineer who led the design of IBM's PPE and SPE.
David Shippy was one of the lead architects for the POWER2*, G3 PowerPC, and POWER4* processor designs. He was the chief architect for the power processing unit of the Cell processor. https://www.goodreads.com/book/show/22121796-race-for-a-new-game-machine
Lisa Su's role was to represent IBM when they interfaced with Sony and Toshiba.
For a raster graphics workload, STI CELL's SPUs weren't designed like ATI's Xenos GPGPU. IBM's effective-GFLOPS claims for the PPE are rubbish, i.e. they are worse than the Jaguar CPUs'.
IBM lacks modern GPU design experience when compared to AMD (ATI), NVIDIA, or VIA S3.
From https://forum.beyond3d.com/threads/...-rsxs-lack-of-power.48995/page-6#post-1460125
Against NVIDIA GeForce 7 and PS3's RSX GPU design flaws:
------------------------
Unmasking NVIDIA's "The Way It's Meant to be Played" during NVIDIA's GeForce 7 series.
"I could go on for pages listing the types of things the spu's are used for to make up for the machines aging gpu, which may be 7 series NVidia but that's basically a tweaked 6 series NVidia for the most part. But I'll just type a few off the top of my head:"
1) Two ppu/vmx units
There are three ppu/vmx units on the 360, and just one on the PS3. So any load on the 360's remaining two ppu/vmx units must be moved to spu.
2) Vertex culling
You can look back a few years at my first post talking about this, but it's common knowledge now that you need to move as much vertex load as possible to spu otherwise it won't keep pace with the 360.
3) Vertex texture sampling
You can texture sample in vertex shaders on 360 just fine, but it's unusably slow on PS3. Most multi platform games simply won't use this feature on 360 to make keeping parity easier, but if a dev does make use of it then you will have no choice but to move all such functionality to spu.
4) Shader patching
Changing variables in shader programs is cake on the 360. Not so on the PS3 because they are embedded into the shader programs. So you have to use spu's to patch your shader programs.
5) Branching
You never want a lot of branching in general, but when you do really need it the 360 handles it fine, PS3 does not. If you are stuck needing branching in shaders then you will want to move all such functionality to spu.
6) Shader inputs
You can pass plenty of inputs to shaders on 360, but do it on PS3 and your game will grind to a halt. You will want to move all such functionality to spu to minimize the amount of inputs needed on the shader programs.
7) MSAA alternatives
Msaa runs full speed on 360 gpu needing just cpu tiling calculations. Msaa on PS3 gpu is very slow. You will want to move msaa to spu as soon as you can.
8) Post processing
360 is unified architecture meaning post process steps can often be slotted into gpu idle time. This is not as easily doable on PS3, so you will want to move as much post process to spu as possible.
9) Load balancing
360 gpu load balances itself just fine since it's unified. If the load on a given frame shifts to heavy vertex or heavy pixel load then you don't care. Not so on PS3 where such load shifts will cause frame drops. You will want to shift as much load as possible to spu to minimize your peak load on the gpu.
10) Half floats
You can use full floats just fine on the 360 gpu. On the PS3 gpu they cause performance slowdowns. If you really need/have to use shaders with many full floats then you will want to move such functionality over to the spu's.
11) Shader array indexing
You can index into arrays in shaders on the 360 gpu no problem. You can't do that on PS3. If you absolutely need this functionality then you will have to either rework your shaders or move it all to spu.
Etc, etc, etc...
NVIDIA's GeForce 8 (CUDA) series was a large improvement.
AmigaOS wasn't designed around an MMU since the 68000 did NOT have one. LOL. "Old school" Unix vendors with 68000 CPU-based workstations used custom MMUs until the 68010's slow 68451 and the 68020's 68851 add-on MMUs. The 68030 was released in 1987 with a built-in MMU; Motorola wasn't taking Unix seriously and the 68030 was late to the party. Many "old school" Unix vendors started their RISC CPU R&D because of Motorola's inferior R&D roadmap.
Commodore's toy mindset made the MMU-less 68EC020 the Amiga 1200's CPU baseline, then doubled down with the Amiga 4000's MMU-less 68EC030. Motorola shouldn't have offered an MMU-less 68EC030 at all.
Microsoft Xenix on the Intel 80286 (with a built-in MMU since 1982) dominated UNIX shipments in the 1980s. The Intel 80386 had a built-in MMU for Xenix, Linux, and Windows NT; Linux originated on an 80386-based PC. Unlike Motorola, Intel has been consistent about built-in MMUs from the 286 and 386 to the current date.
Though you have to drop a lot of performance and put up with more bugs to get your extra VRAM... It is a point though; now that card they keep playing is at least valid. Back when 8GB of VRAM was almost too much for games of the time, it didn't future-proof the 290 series and onwards quite so much; they still sucked, but they remained cheap and popular. And they are not fast enough to take advantage of that 8GB... it only really makes sense now. Frankly, I would rather AMD/ATi had put more into actually making their GPUs good, along with their drivers, instead of giving you bonus RAM you won't use until 10 years later... lol
Some of the RAM fudgery was due to shortages and the pandemic too, but I'd be lying if I said it didn't also annoy me. Plus, AMD's PCB quality has only just improved, and their silicon is generally of a lower-quality / higher-yield type. Despite all the amazing PR and advertising AMD do, they still haven't actually "won" anything yet, even if they had a few things first (which, again, only matter now, when those machines are no longer fast enough to take full advantage of them). I almost fell for the "native quad core" and "integrated memory controller" crap too; I was a total AMD sucker... Haha!
Pretty sure nVidia copied AMD with the whole amount of RAM on the 3060 vs the Ti, though. 12GB on a 3060 and 2060, why? And indeed, 4GB of their new HBM invention on the Fury helped kill it... that, and it was kind of shit, but... that was their best early chance at actually countering nVidia, if they had added more RAM or used GDDR, funnily enough! But thanks to HBM, you can't really even mod them easily to change the RAM!
They certainly have plenty of income sources that aren't PCs to keep competitive, though: a monopoly on 2 of the 3 popular consoles, etc. AMD have become rich by being devious and doing deals, not by being the best at what they do. Not that nobody else does that, but AMD lie to their customers far more than most. How's your "eight core" FX?
Intel has E-Cores... Look in the mirror, hypocrite.
Bulldozer Chief Architect removed from AMD:
https://techreport.com/forums/viewtopic.php?t=85582 (Date: Dec 24, 2012)
Mike Butler, Chief Architect of the Bulldozer architecture, apparently doesn't work for AMD anymore. Mike Butler is currently an architect at Samsung.