That's true about the first part.
As for the second part, it was similar back in the NES days, when cartridges were way too expensive. But they found a way to reduce overall costs, and they can do it again - hence the concept of mass production. They just went with CDs because they held more at the time while being cheaper. They could find a way, but they won't, because it's expensive for them to do so. It's all about minimal costs. Same with the lack of originality in components for the most part, as they all use the same x86 processor, and even the GPUs are just clocked differently and/or have more CUs (well, the PlayStation has some kind of proprietary function on its AMD CPU, but I'm not entirely sure what it is, as I didn't read too far into it).
I'm like Luther - I'm an old duck too and just prefer the old days of consoles vs. now. But that's just my opinion.
There is a lot of wishful thinking in that statement. First off, "they found a way to reduce costs. They could do it again" - did they, though? Carts were always expensive, and discs have always been much cheaper. From what I remember, the biggest cost savings came from using less storage in the carts, which of course constrained development massively (especially for the N64, which was competing with the disc-based PS1). You will always be able to stamp a thin metal film and squeeze it between a few layers of plastic more cheaply than you can assemble a non-volatile silicon-based storage medium. That's just reality. I mean, I would love it if someone came up with an affordable non-volatile storage medium that performed on par with flash, but... the storage industry spends billions in R&D on that every year (not to mention universities around the world), yet flash is the best we've got, and any replacement is likely to be based on some exotic tech and thus very expensive, at least to begin with (such as 3D XPoint). Flash has a cost. A controller has a cost. A PCB has a cost. A casing has a cost. You can't just wish those away, and volume pricing only gets you so far.
As for it all being about minimal costs: have you seen how many game developers go bankrupt every year? How they struggle between projects? How beholden they are to publishers and platform holders? For developers, it's about survival, about still having a job. For publishers, it's (a lot) about profits - but even their margins wouldn't survive a $20 cost hike for physical games without also increasing game prices. And a $20 base cost for a high-performance, flash-based game cartridge of sufficient capacity is realistic.
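Just to put rough numbers on that (purely illustrative - the per-unit figures below are my own assumptions, not manufacturer quotes, with only the ~$20 cartridge total taken from the argument above), here's a back-of-envelope sketch of what that cost hike looks like against a $60 game:

```python
# Back-of-envelope comparison: pressed disc vs. flash cartridge.
# All per-unit costs are illustrative assumptions, not real quotes.

GAME_PRICE = 60.00          # typical retail price of a physical game

# Pressed Blu-ray: stamped metal film between a few layers of plastic
disc_cost = 1.50            # assumed media + case + printing

# Flash cartridge: every component adds cost
cartridge_cost = sum([
    15.00,  # assumed high-speed NAND flash of sufficient capacity
    2.50,   # assumed controller
    1.50,   # assumed PCB + assembly
    1.00,   # assumed casing + labeling
])  # roughly the $20 base cost mentioned above

extra_cost = cartridge_cost - disc_cost
print(f"Cartridge BOM: ${cartridge_cost:.2f} vs. disc: ${disc_cost:.2f}")
print(f"Extra cost per unit: ${extra_cost:.2f} "
      f"({extra_cost / GAME_PRICE:.0%} of the retail price)")
```

However you shuffle the assumed component numbers, the gap lands in the tens-of-percent range of the retail price, which is the point: that either comes out of the publisher's margin or gets added to the sticker price.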
And as for the "lack of originality in components"? Are there any alternatives you know of? The computing industry has matured and consolidated massively in the past two decades. In the early 2000s, a start-up could show up with some new tech and deliver revolutionary features or performance from seemingly nothing. That is not even remotely possible today, simply because things have developed far beyond that point. I guess they could have gone for an ARM-based CPU, but that would mean no backwards compatibility, and besides, ARM scales worse than x86 at high power. For GPUs, you have a handful of vendors making very similar designs with very similar feature sets - a necessity to support the standards put in place to allow for development. And only two vendors make high-performance GPUs, and only one of them has a CPU product and is open to semi-custom work.
These aren't the Xbox/PS2 days, when a new console could realistically bring with it new and revolutionary features, simply because coming up with new and revolutionary features gets harder as time goes by. Most vendors deliver the same feature sets, though mobile GPU vendors are a generation or two behind. Nvidia did something like this with RTX, but now AMD seems to have caught up within a single generation - and it's standardized too, through DXR. And while standards do bring conformity, they also bring ease of development, ultimately delivering better games, as developers can spend more time making games and less time learning how to use new and weird tools. The PS2 and PS3 were both excellent illustrations of how promising hardware can be undermined by being difficult to program for (well, the PS3 had a sub-par GPU, but its CPU was supposed to be revolutionary and mostly turned out to be terrible to write games for).
This is how technological development will always go. There is a finite number of possible hardware configurations that will do what is needed, and a finite number of possible new techniques to achieve it. The majority will be developed early, and as time goes on, the development of new tech/features will slow, simply because the list of possible new solutions is shorter and the things on that list are much more complex. There might come paradigm-shift-like developments that kick off a new wave of innovation (replacing silicon in transistors, for example), but we really haven't seen anything like that in computing since its inception. So expecting equally dramatic developments in a mature field of technology (current PCs and consoles) as in an immature one (say, NES- and SNES-era consoles) is naive and out of touch with reality. It simply isn't going to happen.