Wednesday, May 29th 2024

Arm Also Announces Three New GPUs for Consumer Devices

In addition to its two new CPU cores, Arm has announced three new GPU cores: the Immortalis-G925, Mali-G725 and Mali-G625. Starting from the top, the Immortalis-G925 is said to deliver up to 37 percent better performance at 30 percent lower power than last year's Immortalis-G720, albeit with two additional GPU cores in the test scenario, along with up to 36 percent improved inference in AI/ML workloads. Ray tracing has also received a big overhaul, since the core is aimed at gaming phones, and Arm claims it can offer either up to 52 percent higher ray tracing performance by reducing accuracy in scenes with intricate objects, or 27 percent more performance with accuracy maintained.

The Immortalis-G925 supports 50 percent more shader cores than the Immortalis-G720, in configurations of up to 24 cores versus 16. The Mali-G725 will be available with between six and nine cores, whereas the Mali-G625 will sport between one and five cores. The Mali-G625 is intended for smartwatches and entry-level mobile devices, where a more complex GPU might not be suitable due to power draw. The Mali-G725, on the other hand, targets upper mid-range devices, and the Immortalis-G925 is aimed at flagship devices or gaming phones, as mentioned above. In related news, Arm said it's working with Epic Games to get its Unreal Engine 5 desktop renderer up and running on Android, which could lead to more complex games on mobile devices.
Source: Arm

8 Comments on Arm Also Announces Three New GPUs for Consumer Devices

#1
Denver
I swear I don't understand the idea of RT in smartphone GPUs that are limited to 10 W... If ray tracing (RT) is a poor idea for mid-range and low-end GPUs (100-300 W), it seems even less suitable for smartphones. This reinforces my belief that RT is purely a marketing gimmick. :p
Posted on Reply
#2
evernessince
DenverI swear I don't understand the idea of RT in smartphone GPUs that are limited to 10 W... If ray tracing (RT) is a poor idea for mid-range and low-end GPUs (100-300 W), it seems even less suitable for smartphones. This reinforces my belief that RT is purely a marketing gimmick. :p
Mobile games tend to be lower complexity and lower resolution so that might be a factor in making it possible. It might even be acceptable to just ray trace a single object on mobile. There are plenty of virtual Waifu games where you'd only need to RT one girl / guy at a time.

Of course, it could also be just a gimmick, but a needed stepping stone to get to the point where they eventually have a product, years down the line, with decent RT performance.
Posted on Reply
#3
wolf
Better Than Native
evernessinceOf course, it could also be just a gimmick, but a needed stepping stone to get to the point where they eventually have a product, years down the line, with decent RT performance.
But it's easier to just shrug it off as a gimmick, right! /s You make a good point; it's got to start somewhere in consumer devices to build up from there. Personally, I've been able to get half a dozen solid RT experiences on an RTX A2000 (~desktop 3050 perf) using 70 W, so I don't particularly buy the notion that low power alone is a deal breaker; it's usable at the resolutions the hardware is in its element at. The Solar Bay benchmark runs very smoothly on my Galaxy S23 too, but I'm not aware of any mobile games that have RT yet.
Posted on Reply
#5
JWNoctis
LabRat 8911 step closer to a Tricorder.
I think that was already attempted and C&D'd back in early Android days, notwithstanding Gene Roddenberry's statement that anyone capable of making a device with similar functionality is free to call it by that name. Apparently, it was not legally binding.

That being said, it is already possible to make something comparable these days, complete with an array of gas detectors and a miniaturized laser spectrometer, and all the MEMS sensors you can get. No handheld MRI yet though, the smallest non-prototype is a bedside unit.
wolfBut it's easier to just shrug it off as a gimmick, right! /s You make a good point; it's got to start somewhere in consumer devices to build up from there. Personally, I've been able to get half a dozen solid RT experiences on an RTX A2000 (~desktop 3050 perf) using 70 W, so I don't particularly buy the notion that low power alone is a deal breaker; it's usable at the resolutions the hardware is in its element at. The Solar Bay benchmark runs very smoothly on my Galaxy S23 too, but I'm not aware of any mobile games that have RT yet.
Can confirm this. I played RT games (mostly Control and MW5) fine on a 105 W mobile 3070, which is slightly more performant than a desktop 3060. You don't actually need 300 W for RT, at least not on RTX 30/40; 300 W is 3080/4080 territory and above.

RT on a mobile processor though... I'll believe it when I see it outside demos. Even a 3070 mobile is not immune to the occasional disco light artifact in its RT effect, in Control.

EDIT: Arguably, the lack of RT did not prevent games from showing up with exceptional dynamic lighting effects and looking gorgeous. The only example that comes to mind right now is Ori and the Will of the Wisps, which is admittedly a PC and console title that also ran on the Switch... An RT-enhanced edition of that game would probably do well, considering how the player character is a literal source of capital-L Light, now that I think of it.
Posted on Reply
#6
Dr. Dro
DenverI swear I don't understand the idea of RT in smartphone GPUs that are limited to 10 W... If ray tracing (RT) is a poor idea for mid-range and low-end GPUs (100-300 W), it seems even less suitable for smartphones. This reinforces my belief that RT is purely a marketing gimmick. :p
It's actually far, far more relevant than ML processing on a mobile iGPU, any SoC worth its salt will have an NPU nowadays.
JWNoctisRT on a mobile processor though... I'll believe it when I see it outside demos. Even a 3070 mobile is not immune to the occasional disco light artifact in its RT effect, in Control.

EDIT: Arguably, the lack of RT did not prevent games from showing up with exceptional dynamic lighting effects and looking gorgeous. The only example that comes to mind right now is Ori and the Will of the Wisps, which is admittedly a PC and console title that also ran on the Switch... An RT-enhanced edition of that game would probably do well, considering how the player character is a literal source of capital-L Light, now that I think of it.
I find it fascinating that people still haven't figured out the purpose of ray tracing, which is to simplify the development of titles with realistic graphics. It just had to be marketed as eye candy so people didn't go "well dev pal, that's your problem buddy, don't go pushing new products on me so you can get off your job easy".

For example, developers working on raster-based global illumination use pre-baked GI assets that account for a wide range of lighting conditions and interpolate between them to create a sense of realism. More often than not, these are also fixed light sources. To achieve this, they must carefully account for every variable in the scene: the assets, the art direction, and so on. This is costly, takes a significant amount of developer and artist time, and greatly increases storage requirements depending on the quality of the pre-baked assets.
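As a toy illustration of that pre-baked approach (a sketch, not any engine's actual code; all names and values are hypothetical): a renderer might store baked irradiance for a surface point under a few canonical lighting conditions and simply blend between them at runtime, so no light is ever actually simulated per frame.

```python
# Toy sketch of pre-baked GI: blend between lighting snapshots
# baked offline for canonical conditions (names hypothetical).

def lerp(a, b, t):
    """Linear interpolation between two baked values."""
    return a + (b - a) * t

# Baked irradiance (RGB) for one surface point at dawn and noon.
baked_dawn = (0.30, 0.20, 0.25)
baked_noon = (0.95, 0.90, 0.85)

def baked_gi(t):
    """Interpolate baked GI for a normalized time t in [0, 1],
    where 0 = dawn and 1 = noon. The light itself never moves;
    only pre-computed results are blended."""
    return tuple(lerp(a, b, t) for a, b in zip(baked_dawn, baked_noon))

print(baked_gi(0.5))  # a blend halfway between the two snapshots
```

The storage cost scales with the number of snapshots and surface points baked, which is exactly the trade-off described above.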

Ray tracing turns that into a mathematical problem: light casting is actually simulated, much as it works in real life. Excellent read on it:

www.cs.utexas.edu/users/fussell/courses/cs384g/lectures.old/Lecture4-Raytracing.pdf

It is an essential and integral component of photorealistic graphics simulation; graphics engines and current-generation hardware just aren't there yet. Current-generation games still largely use raster-based lighting and GI, with RT tacked on top for some extra effects at a (very) limited ray count and, more often than not, massive amounts of denoising applied.
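The "mathematical problem" framing can be made concrete with a minimal sketch (illustrative only, not production renderer code): cast a ray from the camera, solve a quadratic for where it hits a sphere, then shade by the cosine between the surface normal and the light direction. Everything is computed from geometry; nothing is pre-baked.

```python
import math

# Minimal ray tracing sketch: one ray, one sphere, one light.
# Real engines trace millions of rays per frame with acceleration
# structures and denoisers; the math per ray is the same idea.

def ray_sphere(origin, direction, center, radius):
    """Return distance t along the ray to the nearest hit, or None.
    Assumes `direction` is unit length, so the quadratic's a == 1."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None          # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def shade(point, center, light_dir):
    """Lambertian term: cosine between surface normal and light."""
    n = [p - c for p, c in zip(point, center)]
    norm = math.sqrt(sum(x * x for x in n))
    n = [x / norm for x in n]
    return max(0.0, sum(a, ) if False else sum(a * b for a, b in zip(n, light_dir)))

origin = (0.0, 0.0, 0.0)
direction = (0.0, 0.0, -1.0)           # camera looks down -z
center, radius = (0.0, 0.0, -5.0), 1.0
light_dir = (0.0, 0.0, 1.0)            # light behind the camera

t = ray_sphere(origin, direction, center, radius)
hit = tuple(o + t * d for o, d in zip(origin, direction))
print(t, shade(hit, center, light_dir))  # hit distance and brightness
```

Repeating this per pixel (and recursively per bounce for reflections and GI) is the whole algorithm; the cost is that every one of those intersections and shades must be solved every frame, which is why ray counts on current hardware stay so limited.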
Posted on Reply
#7
TumbleGeorge
Dr. DroIt's actually far, far more relevant than ML processing on a mobile iGPU, any SoC worth its salt will have an NPU nowadays.

I find it fascinating that people still haven't figured out the purpose of ray tracing, which is to simplify the development of titles with realistic graphics. It just had to be marketed as eye candy so people didn't go "well dev pal, that's your problem buddy, don't go pushing new products on me so you can get off your job easy".

For example, developers working on raster-based global illumination use pre-baked GI assets that account for a wide range of lighting conditions and interpolate between them to create a sense of realism. More often than not, these are also fixed light sources. To achieve this, they must carefully account for every variable in the scene: the assets, the art direction, and so on. This is costly, takes a significant amount of developer and artist time, and greatly increases storage requirements depending on the quality of the pre-baked assets.

Ray tracing turns that into a mathematical problem: light casting is actually simulated, much as it works in real life. Excellent read on it:

www.cs.utexas.edu/users/fussell/courses/cs384g/lectures.old/Lecture4-Raytracing.pdf

It is an essential and integral component of photorealistic graphics simulation; graphics engines and current-generation hardware just aren't there yet. Current-generation games still largely use raster-based lighting and GI, with RT tacked on top for some extra effects at a (very) limited ray count and, more often than not, massive amounts of denoising applied.
Cutting costs to game developers doesn't make those costs magically disappear. They are transferred at our expense.
Posted on Reply
#8
Dr. Dro
TumbleGeorgeCutting costs to game developers doesn't make those costs magically disappear. They are transferred at our expense.
Indeed, but that is our problem, no? ;)

It is a technology that is genuinely essential to photorealistic graphics. But it will take a long time for it to become fully viable, even with Nvidia's titanic (pun intended in more ways than one) efforts to make RT a thing.
Posted on Reply
