
Apple ARM Based MacBooks and iMacs to come in 2021

Joined
Mar 10, 2010
Messages
11,878 (2.21/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
He was wrong. Crysis will run on modern smartphones with 3-watt ARM SoCs on low just fine.
No it won't. That was not Crysis, it was a watered-down port of Crysis that was never released.


I could put together something showing Crysis working on an abacus and then never release it; would that make it real?
 
Joined
Mar 28, 2020
Messages
1,753 (1.03/day)
While there have been rumors about this switch for a long time, I feel Apple's A-series SoCs are indeed ready to take on the low-power Intel chips. I suspect the switch will happen in the MacBook Air space, since the existing A12X/Z is capable of taking on Intel's 15W chips. In terms of software, I am sure developers are more than happy to optimize apps for the mobile SoCs for better performance, rather than running an emulated version.

I feel Intel's real threat has always been the mobile SoCs, rather than AMD. Sure, the mobile chips are nowhere near as fast in, say, Windows 10, but as long as they perform well enough while offering significantly better battery life and slimmer form factors, they will start to chip away at Intel's dominance in the laptop space.

Honestly, I am not sure why Crysis got pulled into this discussion specifically, but I do feel mobile SoCs are up to the task of light gaming. Think of the Nintendo Switch, which uses an aged Tegra SoC and runs quite a number of graphics-intensive games. Sure, there are a lot of compromises and a lot of serious optimization involved, but it is a proof of concept that works. Also, if we are comparing integrated graphics, Intel's graphics are rubbish too and can barely play any modern games, yet they sell very well in laptops because most people just don't care about GPUs that can run Crysis or any modern game smoothly. Applying this to the MacBook Air, I feel most people who use it are not really gamers, since the Air was never meant to be a gaming laptop. Which is why, if Apple is going to transition to ARM, I feel this is a good place to start.
 

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
19,571 (2.86/day)
Location
Piteå
System Name White DJ in Detroit
Processor Ryzen 5 5600
Motherboard Asrock B450M-HDV
Cooling Be Quiet! Pure Rock 2
Memory 2 x 16GB Kingston Fury 3400mhz
Video Card(s) XFX 6950XT Speedster MERC 319
Storage Kingston A400 240GB | WD Black SN750 2TB |WD Blue 1TB x 2 | Toshiba P300 2TB | Seagate Expansion 8TB
Display(s) Samsung U32J590U 4K + BenQ GL2450HT 1080p
Case Fractal Design Define R4
Audio Device(s) Plantronics 5220, Nektar SE61 keyboard
Power Supply Corsair RM850x v3
Mouse Logitech G602
Keyboard Cherry MX Board 1.0 TKL Brown
Software Windows 10 Pro
Benchmark Scores Rimworld 4K ready!
It's not only Geekbench. I can compare the responsiveness of my phone, with its MediaTek MT6750T (an 8-core ARM phone CPU), against my desktop, and to be honest the desktop feels less responsive, yet consumes a heck of a lot more power.

This may have been addressed already, but there is much more to that discussion, especially when comparing against Windows. Android doesn't carry legacy baggage from several decades of operating systems, there is no such thing as a universal Android image (builds have to be compiled for your specific hardware), and you don't use the two in the same way (though that is getting muddier)... There is no direct comparison, and the same goes for ARM. Comparing performance directly between the two is very hard.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.65/day)
Location
Ex-usa | slava the trolls
This may have been addressed already, but there is much more to that discussion, especially when comparing against Windows. Android doesn't carry legacy baggage from several decades of operating systems, there is no such thing as a universal Android image (builds have to be compiled for your specific hardware), and you don't use the two in the same way (though that is getting muddier)... There is no direct comparison, and the same goes for ARM. Comparing performance directly between the two is very hard.

Having to compile for the specific chip is an advantage, an opportunity and a strength: it means there is a good chance you will get the maximum potential performance.
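For illustration, here is a minimal sketch of what building for the exact chip buys you, assuming a plain GCC toolchain on a desktop (the flags are standard GCC options, not anything Android-specific):

/* sum.c - the same source, built generic vs. tuned for the local CPU:
 *   gcc -O2 sum.c                -> generic x86-64 baseline code
 *   gcc -O2 -march=native sum.c  -> may emit AVX2/FMA etc. for this machine
 * Knowing the exact target lets the compiler auto-vectorize this loop
 * with the widest SIMD units the chip actually has, which is roughly
 * the advantage per-device compilation gives. */
#include <stddef.h>

float sum(const float *x, size_t n) {
    float s = 0.0f;
    for (size_t i = 0; i < n; i++)
        s += x[i];   /* candidate for SIMD once the target is known */
    return s;
}

Whether that headroom gets used in practice is another matter, but that is the mechanism the claim rests on.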
Smartphones are very responsive, fast and pleasant to use.
x86 PCs are not responsive; very often they are laggy, they stutter, and in general they are not pleasant to use at all.
This might be my subjective opinion, but it is my feedback and you have to consider it.

Software developers are among the highest-paid members of our society, and yet their products have bugs, backdoors and security vulnerabilities, along with being awfully optimised in the case of PC software, and the developers always seek to do the least work and take the shortest, easiest path to the goal.

Tell me, is it normal that in RDR2 an overclocked Core i9-9900K at 5 GHz with a Radeon RX 590 gets 37 FPS on average at 1080p? https://www.techpowerup.com/review/red-dead-redemption-2-benchmark-test-performance-analysis/4.html

Is it normal that, 14 years after Crysis' release, the game is still an example of a badly optimised title that is used as a benchmark for modern PC hardware?

I don't think it's normal.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,171 (2.81/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
x86 PCs are not responsive; very often they are laggy, they stutter, and in general they are not pleasant to use at all.
I wouldn't have been on skt2011 for 9 years if that were the case. Even mobile x86 machines are pretty fast today considering their TDP. I guess that dynamic changes if you're talking about the skt939 machine in your specs, which is 15 years old; if you're comparing something like that to a new iPhone 11 Pro, then yeah, it's going to feel slow, because it's a relic in comparison.

Software developers are among the highest-paid members of our society, and yet their products have bugs, backdoors and security vulnerabilities, along with being awfully optimised in the case of PC software, and the developers always seek to do the least work and take the shortest, easiest path to the goal.
You speak about this like you've never worked in the field of software development. As a software engineer, I take particular offense at this. No matter how good a developer you are, you're going to write software that has bugs at one point or another. You're also assuming that it's the developer's fault most of the time. There have been plenty of cases where I've written code, written tests for it, and was over 99% certain that it would do what it was supposed to, but what it was supposed to do had been inaccurately conveyed to me, so the "bug" was a misunderstanding of how a feature was supposed to operate. I'll admit I'm one of the first people to say "work smarter, not harder", but not all bugs are the developer's fault. Plenty of the time, what the application is supposed to do is communicated between people in something other than a programming language. Remember, when I'm given a task, I'm being told what to do. If someone can't tell me what they need or why they need it, and I have to fill in the gaps because I can't get a good answer, then this is going to happen. That isn't to say there aren't bad devs; it's just not as clear-cut as you imagine.
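To make that concrete, here is a hypothetical, invented example (the function and its spec are made up for illustration): the test passes, the code does exactly what was asked, and the "bug" exists only because the requester meant something else.

/* discount.c - hypothetical: the dev was told "apply a 10% discount to
 * orders over $100"; the requester actually meant a per-item discount.
 * Code and test are both "correct" against the spec as communicated. */
#include <assert.h>
#include <math.h>

double order_total(const double *prices, int n) {
    double total = 0.0;
    for (int i = 0; i < n; i++)
        total += prices[i];
    if (total > 100.0)      /* spec as stated: discount the whole order */
        total *= 0.90;
    return total;
}

int main(void) {
    double prices[] = {60.0, 60.0};
    /* Test written against the communicated spec: it passes ($108). */
    assert(fabs(order_total(prices, 2) - 108.0) < 1e-9);
    /* The requester expected $120 (no single item exceeds $100), so a
     * "bug" gets reported even though every test here is green. */
    return 0;
}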

With that said, if you don't like how developers do their jobs, I urge you to try to do it yourself, and to do it better. Software engineers get paid decently for a reason: not everyone can do it, and of those who can, not all do it well, and salary usually reflects that, if you don't get canned first.
 

Frick

Fishfaced Nincompoop
Having to compile for the specific chip is an advantage, an opportunity and a strength: it means there is a good chance you will get the maximum potential performance.

Sure. But it makes comparing hard, which was the point.

Smartphones are very responsive, fast and pleasant to use.

I don't think I have ever encountered a smartphone that is actually pleasant, not even the proper high-end ones. Everything has a very slight lag to it. But that could just be me.

x86 PCs are not responsive; very often they are laggy, they stutter, and in general they are not pleasant to use at all.

Disagree, which is fine.

This might be my subjective opinion, but it is my feedback and you have to consider it.

One can have opinions about facts, and I know the difference between facts and opinions is getting blurry to people (for many reasons), but one can still be wrong. The argument was that it is really hard to compare ARM directly to x86 (especially on Windows) because they are massively different. Me not thinking smartphones are pleasant is subjective and an opinion, the same as what you say about PCs, and that is fine. The problem is when Geekbench or whatever it is says "ARM is faster than x86, look at this graph", because it isn't that simple. Apple going for ARM in MacBooks and iMacs definitely tells us that ARM is screaming forward, which is a good thing... but it's still silly to compare phones to computers in any scenario. You can do it, of course, but it really should be challenged.

I won't even try to take on the thing about devs, partly because Aquinus already did, and also because wow.

Tell me, is it normal that in RDR2 an overclocked Core i9-9900K at 5 GHz with a Radeon RX 590 gets 37 FPS on average at 1080p? https://www.techpowerup.com/review/red-dead-redemption-2-benchmark-test-performance-analysis/4.html

I haven't played the game, so I have no idea how it looks and whether that performance makes sense, but on the whole, yes, it's normal. The downside of gaming on a PC with the need or want to play at maxed-out settings is that it's a constant game of catch-up. That's how it works.

Is it normal that, 14 years after Crysis' release, the game is still an example of a badly optimised title that is used as a benchmark for modern PC hardware?

Also normal, because it's a meme at this point and no other game has that joke attached to it, so Crysis is the baseline in a way. And Crysis was actually very well optimised once you started fiddling with the settings; you didn't need a beast of a machine for it to still look good. Maxed-out everything at 4K is probably handled weirdly, as it's still pretty hard to hit good FPS numbers there. But Crysis is an outlier. Some games are just weirdly made: the swamp levels in Jedi Outcast lagged pretty badly when I last played them, on an i3 machine, which is very bad for a game based on the Quake 3 engine.
 
Joined
Jan 8, 2017
Messages
9,436 (3.28/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Maxed-out everything at 4K is probably handled weirdly, as it's still pretty hard to hit good FPS numbers there. But Crysis is an outlier.

I played Crysis and Crysis Warhead at 4K60 maxed just fine.

Software developers are among the highest-paid members of our society, and yet their products have bugs, backdoors and security vulnerabilities, along with being awfully optimised in the case of PC software, and the developers always seek to do the least work and take the shortest, easiest path to the goal.

You're ignorant beyond any reasonable point. I changed my mind, I sincerely hope you're just trolling.
 
Joined
Jan 24, 2011
Messages
180 (0.04/day)
These results are pretty interesting. Intel Atoms have comparable TDP, but still, their multi-core performance is only 15% of the ARM-based counterpart's, and single-core performance is only 18% of an ARM-based counterpart's.

That means Intel violates the EU standards for energy efficiency.
Are you seriously comparing the Atom x5-Z8350, which was introduced in Q1 2016, against ARM-based SoCs that were introduced in Q3-Q4 2019? That's a 3.5-year difference!
BTW, 2W for the Atom x5-Z8350 is not TDP but SDP.
I don't know where you got a 2.5-3W TDP for those ARM SoCs.
 
Joined
Jan 8, 2017
Messages
9,436 (3.28/day)
I don't know where you got a 2.5-3W TDP for those ARM SoCs.

Most high-performance SoCs draw around 10W under burst loads.
 

ARF

Most high-performance SoCs draw around 10W under burst loads.

:kookoo: 10 watts in your smartphone will mean that you won't be able to touch your phone.

I don't know where you got a 2.5-3W TDP for those ARM SoCs.

Qualcomm aims at 2.5 to 3W TDP for phones

[attached screenshot of Snapdragon TDP figures]

 
Joined
Jun 28, 2016
Messages
3,595 (1.17/day)
:kookoo: 10 watts in your smartphone will mean that you won't be able to touch your phone.
Why? You're definitely overestimating what those 10W mean.

Mainstream smartphone SoCs are around 5W; those for tablets reach 7W. I bet they boost higher.
That's roughly the same power draw you see in ULV x86 chips, like Intel's -Y lineup.

Seriously, it's just transistors inside. You have to change the state of some number of them to perform a calculation. There's no magic.
Some optimizations are possible, but architectures made on a similar node won't differ in efficiency by 10 times, as you suggested earlier.
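The textbook first-order relation for dynamic power backs this up; as a rough sketch, with illustrative numbers only:

P_{\text{dyn}} \approx \alpha\, C\, V^{2} f,
\qquad
\frac{P(0.8\,\text{V})}{P(1.1\,\text{V})} = \left(\frac{0.8}{1.1}\right)^{2} \approx 0.53

Switching activity \alpha, capacitance C, voltage V and clock f all vary between designs, but on a similar node even an aggressive voltage drop buys roughly a factor of two, not a factor of ten.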
 
Joined
Jan 8, 2017
Messages
9,436 (3.28/day)
10 watts in your smartphone will mean that you won't be able to touch your phone.

Oh really? I guess some of us are superhuman when we pick up something like a 200W gaming laptop.

Qualcomm aims at 2.5 to 3W TDP for phones

View attachment 149712

You don't need to use colossal fonts; it won't make any of your claims any more true than they are.

It clearly says right there that it's a 5W TDP chip. Moreover, take a look at this:

[attached image: iPhone SoC power measurements]



How about that: the iPhone's average is well over 6 watts. Mind you, that is the average; common sense should make it obvious that these chips will boost well above it for short periods, like any other chip, mobile or desktop.
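A back-of-the-envelope duty-cycle sketch (numbers invented, but plausible) shows how a high burst figure coexists with a much lower average:

\bar{P} = d\, P_{\text{burst}} + (1 - d)\, P_{\text{idle}}
        = 0.2 \times 10\,\text{W} + 0.8 \times 1\,\text{W}
        = 2.8\,\text{W}

So a chip that bursts to 10 W for a fifth of the time can still report an average under 3 W; conversely, a sustained 6+ W average under load implies the chip spends plenty of time well above that figure.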
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
A lot of the speculation here seems pretty far out of left field.

Apple just released a MacBook Air refresh based on the 2018 design plus a new keyboard. The previous redesign was in 2010; that's a life cycle of 8 years.

MB Pro redesigns: 2012, 2016, 2019. 3-4 years for their most popular computer.

The Mac Pro just saw its first redesign since 2012...

They just released the new iPad Pro, an ARM-based tablet with keyboard and trackpad support.

There's no way they're going to trash the MBP, MP, and Air for ARM. A new product line to replace the MB and act as an AIO upgrade for the iPad Pro? Sure.

Ditching x86 entirely, two years after a bunch of content creators just spent $6k+ on a new Mac Pro? Not a chance.

(Imagine the likes of Pixar holding onto macOS while everyone else moved to Windows after the trashcan and Final Cut fiascos, breathing a sigh of relief when the new Mac Pro came out, only to find their purchase made legacy three years later. Imagine the developers of video editing software... Seriously lol-worthy stuff here.)

You're ignoring a few things here:

1. All of Apple's x86 offerings are built around Intel CPUs. You know, the Intel that's having massive problems delivering those CPUs right now? CPUs that Apple doesn't have are CPUs that they can't put into their shiny MacBooks and charge a 300% markup on.
2. Apple's end goal absolutely is replacing x86 with ARM, for many reasons: they have the best ARM CPUs, they wouldn't have to pay Intel for CPUs, they wouldn't suffer when Intel has supply issues, and they can unify their OS and applications.
3. You really think Apple gives a s**t about what anyone paid last year for their overpriced junk? They don't, because they know they have a captive market. People who are dumb enough to buy Apple machines over PCs for any sort of task are the same people who are going to buy Apple's latest and greatest every year, simply because Apple tells them to. Steve Jobs did a fantastic job of marketing to the "more money than sense" crowd. The same thing goes for the companies writing software for Apple machines.
 

Aquinus

Resident Wat-man
:kookoo: 10 watts in your smartphone will mean that you won't be able to touch your phone.
You do realize that these CPUs are designed to boost under thermally advantageous conditions and to throttle when conditions aren't, right? They're not going to run at full tilt until they explode.
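A toy sketch of that idea, not any vendor's actual governor (the thresholds and step sizes are invented):

/* thermal_governor.c - hypothetical temperature-driven clock governor.
 * Real SoC firmware is far more elaborate; this only shows the shape:
 * boost while cool, hold in the middle, step down when hot. */
#include <stdio.h>

#define T_BOOST_MAX 70.0   /* degC: below this, boosting is allowed   */
#define T_THROTTLE  85.0   /* degC: above this, clocks must come down */

double next_freq_mhz(double cur_mhz, double temp_c) {
    if (temp_c < T_BOOST_MAX)
        return cur_mhz + 100.0;   /* headroom: boost */
    if (temp_c > T_THROTTLE)
        return cur_mhz - 200.0;   /* too hot: throttle hard */
    return cur_mhz;               /* in between: hold steady */
}

int main(void) {
    double f = 1800.0;
    double temps[] = {55, 65, 78, 88, 92, 80, 68};
    for (int i = 0; i < 7; i++) {
        f = next_freq_mhz(f, temps[i]);
        printf("T=%.0f degC -> %.0f MHz\n", temps[i], f);
    }
    return 0;
}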
 
Joined
Mar 10, 2010
Messages
11,878 (2.21/day)
I played Crysis and Crysis Warhead at 4K60 maxed just fine.
Same, might go for a third full run through them all, I have the time.
Are you seriously comparing the Atom x5-Z8350, which was introduced in Q1 2016, against ARM-based SoCs that were introduced in Q3-Q4 2019? That's a 3.5-year difference!
BTW, 2W for the Atom x5-Z8350 is not TDP but SDP.
I don't know where you got a 2.5-3W TDP for those ARM SoCs.
Epic processors. I have a phone and a laptop with them in, and they totes annihilate my main listed rig in exactly one benchmark: making a phone call. My PC sucks at that. ;p:D So does that laptop, though.
 
Joined
Aug 14, 2013
Messages
2,373 (0.58/day)
System Name boomer--->zoomer not your typical millenial build
Processor i5-760 @ 3.8ghz + turbo ~goes wayyyyyyyyy fast cuz turboooooz~
Motherboard P55-GD80 ~best motherboard ever designed~
Cooling NH-D15 ~double stack thot twerk all day~
Memory 16GB Crucial Ballistix LP ~memory gone AWOL~
Video Card(s) MSI GTX 970 ~*~GOLDEN EDITION~*~ RAWRRRRRR
Storage 500GB Samsung 850 Evo (OS X, *nix), 128GB Samsung 840 Pro (W10 Pro), 1TB SpinPoint F3 ~best in class
Display(s) ASUS VW246H ~best 24" you've seen *FULL HD* *1O80PP* *SLAPS*~
Case FT02-W ~the W stands for white but it's brushed aluminum except for the disgusting ODD bays; *cries*
Audio Device(s) A LOT
Power Supply 850W EVGA SuperNova G2 ~hot fire like champagne~
Mouse CM Spawn ~cmcz R c00l seth mcfarlane darawss~
Keyboard CM QF Rapid - Browns ~fastrrr kees for fstr teens~
Software integrated into the chassis
Benchmark Scores 9999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999
You're ignoring a few things here:
You basically ignored my entire post and seem to be acting more out of hatred for Apple than genuinely furthering this discussion, but isolation is boring so I’ll take the bait.
1. All of Apple's x86 offerings are built around Intel CPUs. You know, the Intel that's having massive problems delivering those CPUs right now?
”Now,” lol, where have you been?
CPUs that Apple doesn't have are CPUs that they can't put into their shiny MacBooks
Except they do, and still maintain exclusivity agreements — I think you just haven’t been paying attention.
and charge a 300% markup on.
lol, classic, good one

2. Apple's end goal absolutely is replacing x86 with ARM,
Ah, you’re on the board then? What else are they up to? What’s changed since 2014, when these rumors first started popping up?
for many reasons: they have the best ARM CPUs,
This is not a reason, and it avoids the significant task of rewriting the OS, drivers, apps, etc., as well as the transition itself and the public and developer blowback. Do you know how many pro graphics apps stopped supporting Apple when the 2013 Mac Pro started showing its age, and how many professionals switched to Windows?
they wouldn't have to pay Intel for CPUs, they wouldn't suffer when Intel has supply issues,
This is a good and totally plausible reason that I agree with.
and they can unify their OS and applications.
Apple has repeatedly stated they have no interest in this. Could you imagine selling the "it just works" campaign while people are trying to run Premiere Pro on their iPad? There's a reason they didn't do this years ago...
3. You really think Apple gives a s**t about what anyone paid last year
Yes, I do, as indicated by their 2013 Mac Pro sales and the redesign...
for their overpriced junk?
Imagine rewriting the drivers for their Afterburner card.
They don't, because they know they have a captive market. People who are dumb enough to buy Apple machines over PCs for any sort of task are the same people who are going to buy Apple's latest and greatest every year, simply because Apple tells them to. Steve Jobs did a fantastic job of marketing to the "more money than sense" crowd. The same thing goes for the companies writing software for Apple machines.
In the case of consumers, sure, absolutely, and it makes total sense for them (thus my iPad Pro comments). Professionals? Definitely not, which was what my last three paragraphs were meant to emphasize.
 
Joined
Dec 13, 2019
Messages
47 (0.03/day)
Even though 7nm is awesome, Zen 2 still has some horrible power spikes, so I don't see Apple using them for now, or ever, if they want to ditch x86/AMD64.
OTOH, RIP Intel.

What power spikes?
I have never heard of Ryzen power spikes before.
 
Joined
Jun 28, 2016
Messages
3,595 (1.17/day)
they can unify their OS and applications.
That makes very little sense, as noted by @claes.
Apple's Mac lineup flourished after 2005 because x86 made it much easier to provide ports of popular Windows software.

It's a complete ecosystem for photo/video editing, for software development, for science, for engineering.
And the *nix roots make it even more interesting for many.
With virtualization (be it Docker, VirtualBox or VMware), pretty much every professional or scientific workflow can be easily migrated.
Even gaming becomes possible thanks to cloud platforms.

Going ARM without full and efficient x86 emulation will mean that software companies have to rewrite everything they want to sell to Apple customers. The cost of all that work would be enormous.
And whenever a large software company - e.g. Adobe, MathWorks, Microsoft, Autodesk, Oracle, VMware or Salesforce - decided not to provide an ARM version of what it offers for macOS today, Apple would sell a few hundred thousand fewer Macs (in the case of Adobe and Microsoft, probably millions).

That makes absolutely no sense - unless of course Apple's goal is to phase out Macs for whatever reason.


 
Joined
Dec 13, 2019
Messages
47 (0.03/day)
I mocked Apple when they released the iPhone. I thought it was such a stupid idea, like many of us did. How wrong we turned out to be! So I learned the hard way: do not judge before it comes out.

That is funny.
I was excited when I first saw the iPhone.
 
Joined
Mar 28, 2020
Messages
1,753 (1.03/day)
Having to compile for the specific chip is an advantage, an opportunity and a strength: it means there is a good chance you will get the maximum potential performance.
Smartphones are very responsive, fast and pleasant to use.
x86 PCs are not responsive; very often they are laggy, they stutter, and in general they are not pleasant to use at all.
This might be my subjective opinion, but it is my feedback and you have to consider it.

Software developers are among the highest-paid members of our society, and yet their products have bugs, backdoors and security vulnerabilities, along with being awfully optimised in the case of PC software, and the developers always seek to do the least work and take the shortest, easiest path to the goal.

Tell me, is it normal that in RDR2 an overclocked Core i9-9900K at 5 GHz with a Radeon RX 590 gets 37 FPS on average at 1080p? https://www.techpowerup.com/review/red-dead-redemption-2-benchmark-test-performance-analysis/4.html

Is it normal that, 14 years after Crysis' release, the game is still an example of a badly optimised title that is used as a benchmark for modern PC hardware?

I don't think it's normal.

I am no software developer, but I feel it is not an easy thing to optimize. I suspect there are many considerations (performance, UX, features, security, etc.) that a software developer needs to weigh. To add insult to injury, if you look at the state of Windows and Android devices, there may be 1.01 million different configurations, with some people still running ancient hardware. Optimizing software while trying to support as wide a range of hardware as possible will certainly cause issues. This is why software and games tend to be much better optimized on hardware and software (drivers) that are tightly controlled by one company; think game consoles, and perhaps even iOS devices. As for bugs, I certainly have not seen any software that is bug-free. When you add code to, say, introduce a new feature, you can very well break something or introduce a security issue; I believe this is inevitable. Also, with the different hardware in your computer introducing new drivers and features, bugs can appear in software that worked perfectly fine before an update. To sum up, there are too many moving parts for software to be perfectly optimized and bug- and security-free. However, I also don't deny that there may be developers who don't care about optimizing.

On the point of the decade-old Crysis releases performing poorly on modern hardware, I don't think that is unreasonable. Game makers may no longer support a game after a number of years. For example, with up to 16 cores available now in the retail space, some older games are still optimized for 2 or 4 cores, with no plans for future updates. The same may well be true on the GPU side. Game makers don't have infinite resources to keep supporting ageing games, especially when they are no longer making recurring revenue from them. So it doesn't make sense to bench a 14-year-old game in this case.
 
Joined
Jan 24, 2011
Messages
180 (0.04/day)
Qualcomm aims at 2.5 to 3W TDP for phones

View attachment 149712
That article from Fudzilla was published on 30 May 2013, and there is no mention of a 2.5-3W TDP for the Snapdragon 865 or the Apple A13 Bionic, which is not surprising considering they were introduced in 2H 2019.
I don't understand why you even linked it when, right under it, your next link is about a 5W TDP for the Snapdragon 865.
 

ARF

That article from Fudzilla was published on 30 May 2013, and there is no mention of a 2.5-3W TDP for the Snapdragon 865 or the Apple A13 Bionic, so I don't understand why you even linked it when your next link is about a 5W TDP for the Snapdragon 865.
So, as I said, your comparison was simply wrong.

5 watts is when the SoC works at maximum load on both its CPU and iGPU parts. Its CPU part alone works at 2.5-3 watts or lower.
 
Joined
Dec 13, 2019
Messages
47 (0.03/day)
I think it is Windows' fault that PCs are not very responsive.
Try Linux or macOS. They are much better designed.
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
That makes very little sense, as noted by @claes.
Apple's Mac lineup flourished after 2005 because x86 made it much easier to provide ports of popular Windows software.

It's a complete ecosystem for photo/video editing, for software development, for science, for engineering.
And the *nix roots make it even more interesting for many.
With virtualization (be it Docker, VirtualBox or VMware), pretty much every professional or scientific workflow can be easily migrated.
Even gaming becomes possible thanks to cloud platforms.

Going ARM without full and efficient x86 emulation will mean that software companies have to rewrite everything they want to sell to Apple customers. The cost of all that work would be enormous.
And whenever a large software company - e.g. Adobe, MathWorks, Microsoft, Autodesk, Oracle, VMware or Salesforce - decided not to provide an ARM version of what it offers for macOS today, Apple would sell a few hundred thousand fewer Macs (in the case of Adobe and Microsoft, probably millions).

That makes absolutely no sense - unless of course Apple's goal is to phase out Macs for whatever reason.

View attachment 149747

Perhaps I've been a little obtuse, so here's a clarification:

Apple's goal is not to phase out Macs. It's to phase out x86 MacBooks.

Its so-called high-end workstations are expensive enough, and low-volume enough, that Apple can continue to use Intel chips for them. They'll put the price up, though, since they won't be getting as big a volume discount from Intel.

Apple is confident that it already has ARM versions of enough of the software in its MacBook ecosystem to cover enough of its users. The holdouts are either insignificant or basically don't exist anywhere outside the Apple ecosystem, so the onus is on those devs to port their code or lose their revenue stream.

I think it is Windows' fault that PCs are not very responsive.
Try Linux or macOS. They are much better designed.

0/10 trolling effort.
 