Monday, November 4th 2024

AMD Falling Behind: Radeon dGPUs Absent from Steam's Top 20

As we entered November, Valve finished processing October's data for its monthly Steam Hardware and Software Survey, showcasing trend changes in the largest gaming community. According to the October data, AMD's discrete GPUs are not exactly in the best place: among the top 20 most commonly used GPUs, not a single discrete SKU is based on AMD; all of them are NVIDIA parts. There is some change among AMD's entries, however, as the Radeon RX 580, previously the most popular AMD GPU, has been bested by the Radeon RX 6600 as the most common choice for AMD gamers. The AMD Radeon RX 6600 now holds 0.98% of the GPU market.

NVIDIA's situation paints a different picture, with its GPUs occupying all of the top 20 spots. The GeForce RTX 3060 remains the most popular GPU at 7.46% of the GPU market, but the number-two spot is now held by the GeForce RTX 4060 Laptop GPU at 5.61%. This is an interesting change, as the laptop chip previously sat in third place, right behind the regular desktop GeForce RTX 4060. Laptop gamers are in abundance, however, and they have pushed the desktop GeForce RTX 4060 down to third place at 5.25% usage.
Source: Steam Survey

222 Comments on AMD Falling Behind: Radeon dGPUs Absent from Steam's Top 20

#201
AusWolf
Onasi@fevgatos
You are technically correct (the best type, I suppose), but what Aus is driving at is that at sizes where hypothetically both 1440p and 4K would hit a certain ppi that is high enough for the task, like 150+ for typical monitor distance or 300+ for mobile usage, the PERCEIVED image quality will be very close, if not indistinguishable. I certainly can’t reliably tell the difference between, say, a recent iPhone with 460 ppi and a new Galaxy Ultra with 500+, not in a meaningful way.
Exactly. If you upgrade your 1440p monitor with 300 ppi to a bigger 4K one with the same 300 ppi, your picture in gaming (not in Photoshop) will look the same... Just bigger.

When I upgraded from a 1080p 24" to a 1440p UW 34", my ppi, and therefore my image quality, stayed roughly the same. I just got more FOV and desktop area.
fevgatosWell if your goal is just to make it appear better, then we might as well be using a 14" 1080p monitor at a distance of 2 meters; it will be so tiny you won't see any mishaps.
Yeah, might as well if you don't mind the size.
fevgatosThe photoshop was just an example to demonstrate the concept. Set it to 1:1 pixel view and just draw a horizontal line with each pixel having a unique color. On a 1080p monitor you can only get 1920 different colors, on a 4k screen you can get double that. This is literally what image detail is, how many unique colors you can get.
It was a bad example, an apples to oranges comparison. What you're talking about has nothing to do with what I'm talking about.
fevgatosIt's not different in games. What do you think happens in a game when you drop resolution from 4k to 1080p? What do you think happens to the image? There were 8.3m pixels at 4k, now we are down to 2m. Do you think those extra 6.3m pixels were doing nothing there?
Don't forget you're dropping resolution on the same screen size which lowers your ppi to the floor. That's why it'll look like crap.

Of course you have less detail on a smaller resolution screen, but if it's a smaller size, you won't notice it.
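As a side note, the PPI being debated here follows directly from resolution and diagonal size. A minimal sketch (plain Python, using the screen sizes mentioned in this thread) of that calculation:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Same 27" panel, different resolutions: dropping the resolution drops the PPI.
print(f'27" 4K:    {ppi(3840, 2160, 27):.0f} ppi')
print(f'27" 1080p: {ppi(1920, 1080, 27):.0f} ppi')

# The monitor upgrade mentioned above.
print(f'24" 1080p:        {ppi(1920, 1080, 24):.0f} ppi')
print(f'34" 3440x1440 UW: {ppi(3440, 1440, 34):.0f} ppi')
```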
Posted on Reply
#202
JustBenching
OnasiOkay, this is just a ridiculous maximalistic statement at this point. Static images and actual screen quality in daily usage aren’t the same thing. There is more to it than raw pixel count. You are arguing for something that nobody brought up. Do you actually, unironically think that there won’t be a point in desktop screens where resolution increases will no longer lead to perceivable improvements in image quality and the performance hit will just not be worth it? Hint - there absolutely will come such a time.
I know everyone likes clowning on Apple (for some reason), but their Retina concept isn’t actually just a marketing meme and there was thought put behind it. And most serious researchers, even those noting some imperfections with it, tend to agree on the principle.


Yes, it’s called a laptop screen. And yes, the WHOLE GOAL IS IMPROVING PERCEIVED DISPLAY QUALITY, that’s the whole point, not just jacking off to numbers.

Edit: Also, make a discussion or something guys, I just noticed the thread we are in and this is a derail if I ever saw one.
We are not anywhere near that point for desktop monitors though. 16k resolution at 42" might get to that diminishing point.
Posted on Reply
#203
Onasi
fevgatosWe are not anywhere near that point for desktop monitors though. 16k resolution at 42" might get to that diminishing point.
Wrong. 8K at 27" will already do the trick. At normal desktop seating distance, anything beyond 300 ppi would be much of a muchness for anyone with normal or near-normal vision, either naturally or corrected.
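For reference, a rough sketch of the angular-resolution math behind that kind of claim, assuming a simple small-angle model and the commonly cited ~60 pixels-per-degree threshold for 20/20 vision (the exact cutoff is debatable):

```python
import math

def pixels_per_degree(ppi: float, viewing_distance_in: float) -> float:
    """Pixels covered by one degree of visual angle at the given viewing distance."""
    # One degree of visual angle spans about 2 * d * tan(0.5 deg) inches at distance d.
    return ppi * 2 * viewing_distance_in * math.tan(math.radians(0.5))

# Hypothetical case: a 300 ppi panel viewed from a typical ~24" desktop distance.
print(f"{pixels_per_degree(300, 24):.0f} px/deg")  # well above the ~60 px/deg often cited for 20/20 acuity
```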

But, as always, you are picking a really weird hill to die on and argue for, so I am out to not further derail.
Posted on Reply
#204
JustBenching
AusWolfDon't forget you're dropping resolution on the same screen size which lowers your ppi to the floor. That's why it'll look like crap.
Forget the monitor, you have 2 monitors side by side. A 1080p and a 4k. What happens to those 6.3m pixels?
Posted on Reply
#205
AusWolf
fevgatosForget the monitor, you have 2 monitors side by side. A 1080p and a 4k. What happens to those 6.3m pixels?
Are PPI and viewing distance the same? If so, then I don't care.

4K can give you a much bigger screen area, or a much better image quality (ppi), or a little bit of both. That's what happens with the 6.3m pixels.
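For the record, the raw pixel counts being discussed are simple arithmetic:

```python
pixels_4k = 3840 * 2160          # 8,294,400 (~8.3 million)
pixels_1080p = 1920 * 1080       # 2,073,600 (~2.1 million)
print(pixels_4k - pixels_1080p)  # 6,220,800 extra pixels, the "6.3m" being rounded to
```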

It's not the pixels alone that give you a better image, but your perception of them, which is highly dependent on your ppi and viewing distance (not to mention your eyesight).

Edit: I have a question too. You have one image. You open it in Photoshop on your 4K screen and zoom in on it. Then, you open it on your phone and don't zoom in on it. Where will it look better?
Posted on Reply
#206
JustBenching
AusWolfAre PPI and viewing distance the same? If so, then I don't care.

4K can give you a much bigger screen area, or a much better image quality (ppi), or a little bit of both. That's what happens with the 6.3m pixels.

It's not the pixels alone that give you a better image, but your perception of them, which is highly dependent on your ppi and viewing distance (not to mention your eyesight).

Edit: I have a question too. You have one image. You open it in Photoshop on your 4K screen and zoom in on it. Then, you open it on your phone and don't zoom in on it. Where will it look better?
The question isn't whether you care, the question is what were those 6.3m pixels displaying if not extra details?
AusWolfEdit: I have a question too. You have one image. You open it in Photoshop on your 4K screen and zoom in on it. Then, you open it on your phone and don't zoom in on it. Where will it look better?
Depends on the resolution of the phone, doesn't it?
Posted on Reply
#207
AusWolf
fevgatosThe question isn't whether you care, the question is what were those 6.3m pixels displaying if not extra details?
No. The question is, would you notice those 6.3m extra pixels?
fevgatosDepends on the resolution of the phone, doesn't it?
No. It depends on the PPI of the phone compared to your monitor.

Edit: Okay, let's make it simple. Here's two pictures. Lean back in your chair and tell me, which one looks better to you? (Hint: they're the exact same picture)

Posted on Reply
#208
JustBenching
Currently on my phone, the top one looks worse, there is a lot of lost detail especially on the edges. Will try on my monitor in the afternoon.
Posted on Reply
#209
AusWolf
fevgatosCurrently on my phone, the top one looks worse, there is a lot of lost detail especially on the edges. Will try on my monitor in the afternoon.
Wait, I had to edit my post because my image editor didn't save properly. Look again (I'm not sure if you can see what I mean on your phone, though - I can on mine).

I don't mind if you PM me your observations. We've probably derailed this thread long enough.
Posted on Reply
#210
LittleBro
fevgatos4k DLSS Q looks disgustingly better than 1440p native. Period.
fevgatosIt's not about the PPI, ppi is irrelevant.
You clearly don't seem to understand how human vision relates to changes in a display's PPI. It's true that a 24" 1440p display has the same number of pixels as a 28" 1440p one, yet your eye will see a different picture: you can easily get by with a less sophisticated AA method on the 24" because aliasing won't be as noticeable as on the bigger display.

Take a screenshot while playing at a lower resolution, then display that same screenshot on a bigger monitor with the same resolution. If your vision really is okay, as you said, you should be able to notice a difference. I once had a discussion with a Mac laptop guy. He bought a 13" laptop with a shitload of pixels to work with photos. "Retina is like made for this, it's the best you can get." And yet professional video and photo editors use much bigger EIZO displays. Why? Because at 13" 2048x1536 retina they can't see sh*t with their own eyes; they can't see what their filters and other applied effects do.

While having more PPI at the same physical display size increases that display's resolution, it reduces the human eye's ability to perceive those extra details. This, of course, differs from person to person, just like the audible frequency range differs from person to person. And this eye or ear "resolution" degrades with time as a person gets older. I hate it when the AC/DC adapter near my bed emits that constant high-pitched tone while it's charging a phone. My girlfriend can't hear it, and she's even 4 years younger.

When a video, image, or sound recording gets downscaled, a portion of the information is permanently destroyed. You can't really re-create that information (unless it's stored in some form); you can only guess, or use methods that improve the guessing accuracy (e.g. interpolation). That's what DLSS/FSR/XeSS do. Of course, upscaling algorithms are getting better and better, but you still can't re-create the missing information. You can generate something similar, but it will never be the same as the original, meaning it will never be what the authors intended it to be. That's why I call everything but native fake.

Maybe try getting a 24-bit FLAC recording with rich tonal variety, especially rich in the lower frequencies, convert it to 320 kbps MP3, then convert it back to 24-bit FLAC and compare the two using $200 headphones and a standard DAC. You will notice that the lower tones in particular sound poor or are even missing. When you downscale 24-bit FLAC to MP3 with roughly 7-8 times less bitrate, you lose a lot of information which you can't really recreate by upscaling it back to the original resolution. Even with the application of so-called "AI", various filters, etc., it won't reach the quality of the original sound. It may end up as a similarly sized file as the original recording, but it won't be the same; it will be something different from what the author made it to be.
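A tiny sketch of that information-loss argument, using a 1-D signal as a stand-in for image rows or audio samples (assuming naive 2x decimation and nearest-neighbour upscaling; real codecs and upscalers are far more sophisticated, but the principle is the same):

```python
import numpy as np

rng = np.random.default_rng(0)
original = rng.random(1024)            # stand-in for one row of pixels or a chunk of audio samples

downscaled = original[::2]             # naive 2x downscale: every other sample is simply discarded
upscaled = np.repeat(downscaled, 2)    # naive 2x "upscale": guess the missing samples from neighbours

error = np.abs(original - upscaled).mean()
print(f"mean reconstruction error: {error:.3f}")   # non-zero: the discarded information is gone for good
```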
Posted on Reply
#211
the54thvoid
Super Intoxicated Moderator
Stop with the derail. Pretty sure there was an actual monitor post made recently - go find it, go there.

OT's will just get deleted, and posters reply banned.
Posted on Reply
#212
SPARSTE96
I'm not surprised at all by this news. AMD keeps having problems with Radeon every generation: high idle power consumption, stuttering and low utilization issues with old games (like 15-20 year old games), less frequent driver updates for older products, and the list goes on.

At the same time, it's really a shame that Nvidia is allowed to raise their prices as much as they want. We need competition, but AMD and Intel aren't capable of competing... the GPU market is SO bad right now.
Posted on Reply
#213
kapone32
AusWolfSo who is the end user? Enthusiasts? Or regular people who just want to run games?
There is the rub. In every scenario AMD is cheaper as well. You have people commenting on how middle-of-the-road a card like the 7900XT is without realizing that it is at the top of every chart that matters to gamers. There is even a poll to prove that.
SPARSTE96I'm not surprised at all by this news. AMD keeps having problems with Radeon every generation: high idle power consumption, stuttering and low utilization issues with old games (like 15-20 year old games), less frequent driver updates for older products, and the list goes on.

At the same time, it's really a shame that Nvidia is allowed to raise their prices as much as they want. We need competition, but AMD and Intel aren't capable of competing... the GPU market is SO bad right now.
The News vs the truth.

1. High Idle power? I have no idea what you mean. Do you mean more than 10 watts, 20 Watts? I would also ask you to show me a GPU that has better idle power than the 6500XT.
2. Give me a Game, I have plenty of old ones too. Do you mean DOS or Steam, like Kingdoms of Amalur or Sleeping Dogs? Maybe Just Cause 2? I have not seen what you describe. Maybe Praetorians?
3. Less frequent updates for older products. Yeah it kind of blows that Vega is no longer getting driver support. Except my 5600G has the latest driver.

The truth is that the narrative ignores the fact that China was openly buying as many 4090/4080 GPUs as they could, and Nvidia was allowed to count that as part of their numbers. It is like when TW3 Kingdoms launched on Steam: it instantly became the most popular TW title in terms of sales, but TWWH is the real driver of the Total War economy. Then you combine that with the tech media all using 4090s, to the point where you will see comments on TPU like you can't game at 4K unless you have a 4090/4080, and the 7900XTX/7900XT are not as good at RT so they are not worth the money. Then you look at prices and realize that sales are down across the board, with MBs and GPUs priced to the moon. Even storage, as volatile as it has been, has seen nothing like the price gouging that Nvidia started. As an example if the 7900XT was $450 US there would be no reason to buy anything else. Where I live that is about the cost of a 4060 Ti. Is a 4060 Ti better than a 7900XT at anything? Before you answer that, read some GPU reviews on TPU and focus on where the 7900XT is on the gaming charts.

These modern reviews also do not use the CPU-intensive games that make your PC cry, like City Skylines 2 or Factorio. It is the first game at 4K where I had to turn on Hyper RX once my population started reaching 1 million. Try that at 4K high and you will clearly see the separation between CPUs in cores and clock speed. In fact, most games at 4K high on these modern systems are a great way to gauge CPU performance. It let me know that a 7800X3D is not as fast as a 7900X3D in City Skylines 2 at 4K, and neither will it produce as many FPS in racing games like AMS2. Reviews use 4K Ultra, or whatever the highest setting the game allows, and that all but ensures that the GPU does all the work, as the frame buffer will always be on when you allow all the candy.

If you want pure raster AMD is a great choice.
Posted on Reply
#214
SPARSTE96
kapone32There is the rub. In every scenario AMD is cheaper as well. You have people commenting on how middle-of-the-road a card like the 7900XT is without realizing that it is at the top of every chart that matters to gamers. There is even a poll to prove that.


The News vs the truth.

1. High Idle power? I have no idea what you mean. Do you mean more than 10 watts, 20 Watts? I would also ask you to show me a GPU that has better idle power than the 6500XT.
2. Give me a Game, I have plenty of old ones too. Do you mean DOS or Steam, like Kingdoms of Amalur or Sleeping Dogs? Maybe Just Cause 2? I have not seen what you describe. Maybe Praetorians?
3. Less frequent updates for older products. Yeah it kind of blows that Vega is no longer getting driver support. Except my 5600G has the latest driver.

The truth is that the narrative ignores the fact that China was openly buying as many 4090/4080 GPUs as they could, and Nvidia was allowed to count that as part of their numbers. It is like when TW3 Kingdoms launched on Steam: it instantly became the most popular TW title in terms of sales, but TWWH is the real driver of the Total War economy. Then you combine that with the tech media all using 4090s, to the point where you will see comments on TPU like you can't game at 4K unless you have a 4090/4080, and the 7900XTX/7900XT are not as good at RT so they are not worth the money. Then you look at prices and realize that sales are down across the board, with MBs and GPUs priced to the moon. Even storage, as volatile as it has been, has seen nothing like the price gouging that Nvidia started. As an example if the 7900XT was $450 US there would be no reason to buy anything else. Where I live that is about the cost of a 4060 Ti. Is a 4060 Ti better than a 7900XT at anything? Before you answer that, read some GPU reviews on TPU and focus on where the 7900XT is on the gaming charts.

These modern reviews also do not use the CPU-intensive games that make your PC cry, like City Skylines 2 or Factorio. It is the first game at 4K where I had to turn on Hyper RX once my population started reaching 1 million. Try that at 4K high and you will clearly see the separation between CPUs in cores and clock speed. In fact, most games at 4K high on these modern systems are a great way to gauge CPU performance. It let me know that a 7800X3D is not as fast as a 7900X3D in City Skylines 2 at 4K, and neither will it produce as many FPS in racing games like AMS2. Reviews use 4K Ultra, or whatever the highest setting the game allows, and that all but ensures that the GPU does all the work, as the frame buffer will always be on when you allow all the candy.

If you want pure raster AMD is a great choice.
1) Read the TechPowerUp reviews about the high idle power. I think it has been fixed now, but still, it was there, even for previous RDNA generations.

2) Unfortunately for me, I haven't seen many examples of games where this happened. I read about it at least a year ago, but I've seen people complaining about it at least 3-4 times, like here www.overclock.net/threads/rx-6000-cards-are-disgustingly-bad-for-old-games.1805753/ and buildapc/comments/127kaqo and AMDHelp/comments/1bqmkbj
"Nvidia spents lots and lots on game-specific optimizations throughout the years that AMD didn't catch up, and right now it makes no financial sense for AMD to work on the old stuff. Plus higher market share of discrete Nvidia means many lesser-known games only optimized for Nvidia instead of AMD. Overall you're more likely to have a worse experience playing older/less popular games with AMD compared to Nvidia."

3) Not even talking about Vega: when RX 7000 released, many people were angry because updates for RX 6000 slowed so much... Also unrelated, but Destiny 2 had awful performance issues on AMD in 2020 when the next DLC dropped, and AMD took like MONTHS to fix the abysmally low performance. I had 2 friends with a 5700 XT who complained so much about this...

Also, Nvidia has Dynamic Super Resolution, which allows me to play older games at 5120x2880 instead of 1440p (and when I get a 4K monitor I can play older games in 8K). AMD's Virtual Super Resolution can go above 4K, but they don't even advertise it... in fact, until now I thought it couldn't go above 4K at all...
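For what it's worth, DSR/VSR-style downsampling conceptually amounts to rendering at a higher internal resolution and then filtering blocks of samples down to the display resolution. A rough sketch with a plain box filter (actual drivers use smarter filters such as Gaussian or Lanczos):

```python
import numpy as np

def box_downsample(frame: np.ndarray, factor: int) -> np.ndarray:
    """Average each factor x factor block of a (H, W) frame into one output pixel."""
    h, w = frame.shape
    return frame.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# Hypothetical 5120x2880 internal render downsampled 2x to a 2560x1440 output.
hires_render = np.random.default_rng(0).random((2880, 5120))
output = box_downsample(hires_render, 2)
print(output.shape)  # (1440, 2560): each displayed pixel blends four rendered samples
```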

"As an example if the 7900XT was $450 US there would be no reason to buy anything else" I agree but that will never happen
Posted on Reply
#215
kapone32
SPARSTE961) Read the TechPowerUp reviews about the high idle power. I think it has been fixed now, but still, it was there, even for previous RDNA generations.

2) Unfortunately for me, I haven't seen many examples of games where this happened. I read about it at least a year ago, but I've seen people complaining about it at least 3-4 times, like here www.overclock.net/threads/rx-6000-cards-are-disgustingly-bad-for-old-games.1805753/ and buildapc/comments/127kaqo and AMDHelp/comments/1bqmkbj
"Nvidia spents lots and lots on game-specific optimizations throughout the years that AMD didn't catch up, and right now it makes no financial sense for AMD to work on the old stuff. Plus higher market share of discrete Nvidia means many lesser-known games only optimized for Nvidia instead of AMD. Overall you're more likely to have a worse experience playing older/less popular games with AMD compared to Nvidia."

3) Not even talking about Vega: when RX 7000 released, many people were angry because updates for RX 6000 slowed so much... Also unrelated, but Destiny 2 had awful performance issues on AMD in 2020 when the next DLC dropped, and AMD took like MONTHS to fix the abysmally low performance. I had 2 friends with a 5700 XT who complained so much about this...

Also, Nvidia has Dynamic Super Resolution, which allows me to play older games at 5120x2880 instead of 1440p (and when I get a 4K monitor I can play older games in 8K). AMD's Virtual Super Resolution can go above 4K, but they don't even advertise it... in fact, until now I thought it couldn't go above 4K at all...

"As an example if the 7900XT was $450 US there would be no reason to buy anything else" I agree but that will never happen
1. Wait, you just said it is a nothing burger.

2. Anecdotal from 3 people, but I am not going to argue. I had a 5600 XT but got a 6800 XT at launch, and it has been golden; the same thing happened when I got a 7900XT. When you think about it, older games have fewer features, so the raw performance of modern PCs should be a huge advantage, not a disadvantage. Take TW Rome and see how fast a modern PC is. The max resolution is 1080P. I seriously have no idea where you get that. Do you mean Hairworks in The Witcher? Maybe you mean PhysX? That died as soon as Nvidia locked you out if the program detected an AMD card and would not work. I could give you the features in CP2077, but to be honest, the raw raster performance and what 4K looks like on a modern PC is fine for my eyes and everyone else in my circle. This is anecdotal as well, but my friends with 3080s are not as happy as those with 6800 XTs.

3. Yep, those 3 months to add the 7000 series to the universal driver stack were so long and had such a negative effect on performance that the entire community wet their panties, while the other 90% of users did not even know anything had happened. We can both use individual games to critique both AMD and Nvidia performance; Hogwarts and Starfield come to mind.

4. AMD is not Intel, and they have also added lots of things like FreeSync and FSR (as much as it is derided, it is universal), but even before that, AMD cards have always been cheaper than Nvidia's.

AMD software is that good and I know that AMD has advertised it but again you raised an issue that is a nothing burger.

I am not going to deny that most AMD cards do not sell as well as Nvidia cards, but when the entire narrative is against you and people who have not used AMD cards strongly opine on them, it shows. I will give you an example: how many of the main YouTube channels have done a deep dive on AMD software? How many tech media sites have done a deep dive on AMD software? How many know what AMD software is today if they only use Nvidia cards in their videos? That is sad, because AMD software today can easily give you whatever you want.

I have been playing a lot of City Skylines 2 lately, and my population on my latest build is over 800,000, with 200 km of train tracks, 35 subway stops and 6 interchanges. A third of the city is office buildings, so there is plenty of traffic from outside. Playing at 4K, it tanked my FPS from the 50s and 60s to the low 20s to high 30s. Well, I went into AMD software, clicked on the icon for CS2, and instantly Hyper RX was activated. Went back into City Skylines 2 and now we are in the 60s and back in FreeSync range for butter-smooth gameplay. That means fast vehicles on the highways and fast-moving foot traffic at train and subway stations. AMD are actually trying, and the fine wine is in full effect.

Just look at how many people are still happy with their 6800XT. If I was playing at 1440P, I would be happy with a 6800XT too, but I am an enthusiast (not a negative for those on 1440P), and 1440P was my Qnix monitor from like 14 years ago. I just happen to like AMD, as they align more with my thought process on what the PC should be. No AMD user paid for them to implement FreeSync, and the whole world has benefited, with TVs coming with (VRR) FreeSync technology for the masses to enjoy.
Posted on Reply
#216
Vayra86
fevgatosmonitors-
EDIT: nvm lol just saw mod post. You can add me to that PM convo if you still have it going :D
Posted on Reply
#217
lilhasselhoffer
So...after 9 pages this seems like a discussion about AMD versus Nvidia. I don't buy it.

What I do buy is that AMD is changing their position in the market and Steam is not a great indicator of their future. Steam doesn't track Playstation or Xbox consoles. It also skips the Switch...but that's an Nvidia product. I...have to acknowledge that the Switch has outsold the Playstation...but if you look at overall gaming PC sales versus console sales the point of contention is still that consoles outsell gaming PCs. That's...a lot of money.

I see AMD succeeding on the CPU side, and thriving with their APUs...and acknowledge that their next generation of video cards will not be high end. It seems like they are going the Nintendo route, where what they put out is not the best. It is not pushing new features. It is pushing value and profitability for a chunk of the market that they think they can milk, as long as they actually pay it attention. It's the same new logic that UserBenchmark has been called out on, in their ever amusing war against everything AMD. They have to acknowledge that AMD wins in some things, but follow it up by insulting their consumers and AMD's marketing team. They always cite that Intel is better...even when the numbers disagree. AMD evaluated the GPU race and has stepped back from the PC enthusiast market to carve out their niche in consoles...which based upon financials seems to have been a good step.

Before anyone calls it, Nvidia made a better one by selling their hardware as AI development tools. No questions there. I just look forward to the next two generations as Nvidia funnels all of their development there...and to the eventual collapse of the power-hungry, glorified LLMs that make up current AI models. AMD will have consumers in the $300 window, but if the 4060 is what Nvidia is willing to do for that consumer, hopefully there will be a reckoning.
Posted on Reply
#218
Shirley Marquez
AusWolfI think the point was that the $100 range of GPUs have been replaced with a gaping hole in recent years, which AMD could fill if they wanted to, but they don't for some wild reason.
The expectations for GPUs have risen. They're being asked to do more than they used to, including driving larger and higher resolution monitors. Plus there has been inflation since the days of $100 gaming GPUs. I don't think those are ever coming back. In addition, we now have competent integrated GPUs; those have taken over the niche that $100 cards used to occupy.
Posted on Reply
#219
LittleBro
Shirley MarquezThe expectations for GPUs have risen. They're being asked to do more than they used to, including driving larger and higher resolution monitors. Plus there has been inflation since the days of $100 gaming GPUs. I don't think those are ever coming back. In addition, we now have competent integrated GPUs; those have taken over the niche that $100 cards used to occupy.
Today's GPUs are too complicated - they are too versatile. On one side, it's good to be versatile; on the other, you can't really focus on everything really well (you lack the resources). Today, the GPU abbreviation stands more for General Processing Unit than Graphics Processing Unit. I don't see a reason to have an NPU included in both the GPU and the CPU. It's okay to have an iGPU and NPU included in a laptop SoC, because there is a lack of space, so putting it all together is convenient, but having an iGPU and/or NPU in a desktop CPU is a waste of silicon in my opinion, a waste of space, a waste of bandwidth. More cores, larger caches or better I/O capabilities would be much more useful. I mean, seriously, is there anyone who really buys an Intel desktop CPU with the intent to run it with that shitty iGPU? You can buy a GPU which will have much higher so-called "AI" performance than that tiny little NPU in a desktop CPU. Or, you can buy a PCI-E based "AI" accelerator add-in card, in case you don't want to use the NPU capabilities of your GPU or your GPU does not support "AI".
Posted on Reply
#220
Assimilator
LittleBroI mean, seriously, is there anyone who really buys an Intel desktop CPU with the intent to run it with that shitty iGPU?
Literally every IT department in the world, which together buy the vast majority of CPUs produced.
Posted on Reply
#221
Robin Seina
The Steam HW Survey is NOT a real statistic.
It lacks many statistical requisites: a sampling methodology (sample size, how the sample is chosen), deviations, error probability levels, etc.

Thus, its data cannot be taken seriously. After all: "I only believe in statistics that I doctored myself." (Joseph Goebbels)
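For perspective, here is a minimal sketch of the kind of error estimate the survey page omits: the margin of error for a reported share, assuming (hypothetically) a simple random sample of a given size, which Valve does not document:

```python
import math

def margin_of_error(share: float, sample_size: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion under simple random sampling."""
    return z * math.sqrt(share * (1 - share) / sample_size)

# Hypothetical inputs: a GPU reported at 0.98% share, with an assumed 1,000,000 survey responses.
moe = margin_of_error(0.0098, 1_000_000)
print(f"+/- {moe * 100:.3f} percentage points")
```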
Posted on Reply
#222
LittleBro
AssimilatorLiterally every IT department in the world, which together buy the vast majority of CPUs produced.
True, serves me right. I should have been more specific about my question being related to TPU.
Posted on Reply