
Apple Introduces 27-inch iMac with Retina 5K Display

Last I heard, Apple doesn't make monitors, so they ain't innovating anything. Whoever builds the panel (LG, Samsung, AU Optronics, etc.) is the innovator. Dell announced a monitor with this resolution a month ago.

Affordable 4K monitors have been available for PCs for over a year, so Apple is actually behind in the ultra-HD monitor space.
 
So 1440p is a lost cause? :(
Yeah, it will turn into what 1920x1200 is today... a dinosaur. It's all about the economics of mass production, and mass production is for TVs. 1080p, 4K, and 8K will be the cheapest per pixel.
 
I know that gamers like to think they're the most important people in the world and that any advance in the IT industry should be judged by how it affects them, but that's not actually how things work in reality.
But it only uses the mobile version of a 290X. Will it be powerful enough to drive games at 5K?
No graphics card on the market could handle serious gaming at 5K, so as long as you don't game on it, which isn't much of a problem anyway, you will be fine.

I want to point out something with these posts: If someone is buying a Mac for gaming, they're brain dead to begin with. Who the hell buys a Mac for gaming? Someone who doesn't know what they're doing, that's who. So complaining that it sucks for gaming is like complaining that your Ford F-350 isn't as fast as a Corvette even though the engine is bigger. Higher resolution alone doesn't imply "this is for gaming".

I would stop complaining about the gaming argument since this isn't intended to be a computer for gaming and if you think it is, you've already missed the point.

Price aside, it looks like a capable computer for almost any productivity task, including photo and video work. Can it play games? Sure, just not at 5K, much like most other PCs out there, which can't either.
 
2560x1440 is about the threshold for an affordable gaming PC that can drive games at max settings at 60 FPS. Anything above that, expect to reap the additional quality in the form of video.
 
2560x1600* DVI dual-link
 
The best part is seeing Crap Daddy return!!! Man, I missed you! Who cares about Apple.... :)

Quite rightly so. I'm typing this on a Windozer 8.1 touch panel, that is; something not too familiar in Cupertino, except of course for the smaller money-making toys they have over there.
 
The first few 32" 4K monitors were something like ~$3500 a year or so ago. If Apple ships a 27" 5K monitor at ~$2400 with a full computer and a video subsystem capable of driving it (for desktop use), that is impressive IMO. I wouldn't call it cheap, but in today's market it does seem comparatively cheap, since I am sure Dell was preparing to ship their 5K display (only a display) at or near that same ~$2400 price point.

I don't really care if Apple didn't actually make the panel. I don't really care if Dell requires dual DisplayPort 1.2 to drive their 5K option. I just care that prices are (seemingly/comparatively) coming down and hardware prowess is going up.

Having said all that, if I were to buy a 5K display I kind of would like for it to be a little bigger than 27". I'm thinking more like 32" to 40". I already have two 4K displays (28" and a 39") and I really like them. 4K and up can really benefit from larger displays IMO, which helps negate the need for scaling.

As for gaming, who doesn't love gaming? I get it, believe me I do. However, some people really do have to do real constructive "work" with a computer that can benefit from 4K / 5K, and if you can't see that then clearly you are not one of those people. It's not always about games or consuming UHD (4K / 5K) video.

Edit:

Now that I think about it, if Apple is committed to 5K, then every new piece of hardware from them running OS X should have adequate support to push a 5K external display, such as the upcoming Dell, which supposedly needs dual DisplayPort 1.2. It doesn't really matter how hackneyed or patched together the connection interface may be; it still needs to be supported.

So that new Mac Mini should be able to drive the same type of display.
 
Just so everyone is clear, the mobile GPUs used here are not comparable to desktop GPUs with similar model numbers.

Mobile R9 M290X ≈ Desktop R9 270X / HD 7870
Mobile R9 M295X ≈ Desktop R9 285

That said, for the people concerned about the GPU performance of the iMac, the Mac Pro is the device designed for people who need GPU performance, not the iMac. The iMac is focused on delivering the most performance possible in a given form factor, which means cutting down on hardware to fit that form factor. Granted, Apple is obsessed with making devices thinner and thinner, including the iMac, so there is less room for a high-power GPU in an iMac than there has been in the past, but the lack of higher GPU performance is also partially due to the lack of efficiency advances in AMD's line of mobile GPUs.
 
Last I heard, Apple doesn't make monitors, so they ain't innovating anything. Whoever builds the panel (LG, Samsung, AU Optronics, etc.) is the innovator.

While it's true that Apple does not manufacture their own panels, this doesn't mean that they are not a driving force for innovation. Apple was first to market with a high-DPI smartphone (the iPhone 4), first to market with a high-DPI tablet (the iPad 3), and first to market with a high-DPI laptop (rMBP). Do you think that's a coincidence? No: Apple has used its marketing power specifically to push the supply chain to create affordable high-DPI displays. This has been to the benefit of all users, including those of us who use Android and Windows. If Apple hadn't shown the world how much better a high-DPI display looks, people would have puttered along at 100 DPI for years and not known what they were missing.

Affordable 4K monitors have been available for PCs for over a year, so Apple is actually behind in the ultra-HD monitor space.

Actually, I'm still waiting for a suitable 4K monitor. Currently you have a choice between a bunch of 27" 4K monitors using the same mediocre TN panel (no thanks) and a couple of 32" 4K monitors using the Sharp IGZO panel, costing over $2000, and supporting 4K@60Hz input only via the buggy MST hack (again, no thanks). Then there's the Seiki TV, which is dirt-cheap and has a nice 39" VA panel, but is crippled by a 30Hz refresh rate.

Unless a good IPS or VA 4K monitor comes down to below $1500 (preferably below $1000), the Retina iMac looks like a better deal.

Having said all that, if I were to buy a 5K display I kind of would like for it to be a little bigger than 27". I'm thinking more like 32" to 40". I already have two 4K displays (28" and a 39") and I really like them. 4K and up can really benefit from larger displays IMO, which helps negate the need for scaling.

My dream display would be 39" with an 8K resolution (7680x4320) and 120Hz refresh rate. High DPI, tons of screen real estate, and fast refresh all in one package. That won't happen for many years, though.
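For anyone curious how these densities actually compare, pixel density is just the diagonal pixel count divided by the diagonal size in inches. A quick back-of-the-envelope sketch, using the sizes and resolutions mentioned in this thread:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Displays mentioned in this thread
print(f'27" 5K iMac:      {ppi(5120, 2880, 27):.0f} PPI')
print(f'28" 4K:           {ppi(3840, 2160, 28):.0f} PPI')
print(f'39" 4K (Seiki):   {ppi(3840, 2160, 39):.0f} PPI')
print(f'39" 8K (dream):   {ppi(7680, 4320, 39):.0f} PPI')
```

Interestingly, a 39" 8K panel would come out slightly denser (~226 PPI) than the 27" 5K iMac (~218 PPI), while a 39" 4K panel sits down around 113 PPI, which is why larger sizes help avoid the need for scaling at 4K.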

Now that I think about it, if Apple is committed to 5K, then every new piece of hardware from them running OS X should have adequate support to push a 5K external display, such as the upcoming Dell, which supposedly needs dual DisplayPort 1.2. It doesn't really matter how hackneyed or patched together the connection interface may be; it still needs to be supported.

No off-the-shelf video card supports 5K output over a single-tile connection. It has to treat the output as dual 2560x2880 displays. Many first-generation 4K monitors work this same way (even though DP 1.2 can handle 4K@60Hz, the older scaler chips can't), and this causes all kinds of problems: failure to wake up after sleep, only one half of the screen showing up, and so forth. I don't think Apple wants that kind of experience for their users.
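The single-link limitation comes down to raw bandwidth. A rough sketch of the arithmetic at 24-bit color, counting only active pixels (blanking intervals add more on top, which only widens the gap):

```python
def data_rate_gbps(width, height, hz, bpp=24):
    """Uncompressed video data rate in Gbit/s (active pixels only)."""
    return width * height * hz * bpp / 1e9

# DisplayPort 1.2: 4 lanes x 5.4 Gbit/s (HBR2), 8b/10b coding -> 80% payload
DP12_EFFECTIVE = 4 * 5.4 * 0.8  # ~17.28 Gbit/s usable

full_5k  = data_rate_gbps(5120, 2880, 60)  # ~21.2 Gbit/s: too much for one link
one_tile = data_rate_gbps(2560, 2880, 60)  # ~10.6 Gbit/s: fits comfortably

print(f"Full 5K@60Hz needs   {full_5k:.1f} Gbit/s (DP 1.2 carries {DP12_EFFECTIVE:.2f})")
print(f"One 2560x2880 tile:  {one_tile:.1f} Gbit/s")
```

Hence the dual 2560x2880 tiles: each half fits within a single DP 1.2 link, but the full frame does not, even before blanking overhead.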

On the other hand, since Apple controls the whole OS, maybe these problems won't be as severe as on Windows. They have a lot more ability to lean on AMD for better driver support than the average consumer does, and more ability to get their issues prioritized, especially since Apple is probably AMD's third-biggest customer now, after MS and Sony.
 
I want to point out something with these posts: If someone is buying a Mac for gaming, they're brain dead to begin with. Who the hell buys a Mac for gaming? Someone who doesn't know what they're doing, that's who. So complaining that it sucks for gaming is like complaining that your Ford F-350 isn't as fast as a Corvette even though the engine is bigger. Higher resolution alone doesn't imply "this is for gaming".

I would stop complaining about the gaming argument since this isn't intended to be a computer for gaming and if you think it is, you've already missed the point.

Price aside, it looks like a capable computer for almost any productivity task, including photo and video work. Can it play games? Sure, just not at 5K, much like most other PCs out there, which can't either.


Apparently you haven't been to or around the Steam and other forums, or seen the number of people who want to Boot Camp their almighty Mac, since it is a gift from God himself and it's better, so it should be able to play this game better.......... the same people who quote the hard drive size as the "memory size" and claim "it has the quad cores and was the best available".
 
Apparently you haven't been to or around the Steam and other forums, or seen the number of people who want to Boot Camp their almighty Mac, since it is a gift from God himself and it's better, so it should be able to play this game better.......... the same people who quote the hard drive size as the "memory size" and claim "it has the quad cores and was the best available".
You're right; that's because I never see posts from people who act as you describe. The simple fact is that Macs aren't meant for gaming, regardless of what people think they're meant for, and most people acknowledge that. I don't think the group of people is as big as you make it out to be.
 
What most people use it for could be accomplished with a $400 bargain-bin laptop, as is true for most computer users.

The primary reason why Apple puts most of the purchase price into the display is because, sitting at Apple Stores, the screen makes it easier to convince people to shell out four digits worth of cash.
 
What most people use it for could be accomplished with a $400 bargain-bin laptop, as is true for most computer users.

The primary reason why Apple puts most of the purchase price into the display is because, sitting at Apple Stores, the screen makes it easier to convince people to shell out four digits worth of cash.

The primary reason is because they make premium products and a good display is part of that. Good displays are always good. And yes, they are premium products, even if you don't want to buy them.
 
The primary reason is because they make premium products and a good display is part of that. Good displays are always good.
I've found the displays in Apple laptops to be pretty good and in all seriousness, OS X scales really well to the higher PPI displays.
And yes, they are premium products, even if you don't want to buy them.
That's the key right there. I don't buy a Mac because it's out of my budget. If I made twice as much as I do now, I might consider getting one, not because I have money to waste but because I could afford the investment in a product that seemingly works well. Now with that said, in the past I used to play a lot of video games, and that certainly isn't the case now. When gaming isn't your goal with your computer, Macs become a much more feasible option (price aside). It depends on what your goals are. Finally, I should add that Macs keep getting quieter. I practically never hear the fan on my Air (which belongs to the company I work for).

All in all, Apple is a premium product in the sense that you don't need to think about it for it to work. It just works, it looks nice, and generally speaking, their machines are very responsive. For any normal user, I don't see how there is really anything bad about Apple other than the price tag. Regardless of what any individual person may think, Apple tries to sell a premium product, whether we agree with that assessment or not. The two big parts of that are how they build their products and the software those products use. If I could easily and legally run OS X on my tower, I would. Hell, I would even pay money to do it easily and legally, but Apple doesn't want that market. In all seriousness though, I think it's 50% how it's built and 50% the software. That's what Apple has going for it.

Now I want to say this one more time, I never have to think about OS X on my Air for work. It just works. To me, that's worth money.
 
The primary reason is because they make premium products and a good display is part of that. Good displays are always good. And yes, they are premium products, even if you don't want to buy them.
Excepting the display, what is "premium" about it? If you separate the display from the other components, most people would never consider it. I would argue this is a hallmark of marketing and packaging, not whatever "premium" is. Then again, perhaps "premium" is marketing and packaging in a nutshell. It isn't a tangible thing. You know, like sticking "Oreos" on a cookie sandwich instead of just "cookie sandwich." But I guess that goes back to the original point that people buy it because it is Apple, not because of the merits of the product.

When gaming isn't your goal with your computer, Macs become a much more feasible option (price aside).
Alongside all of the flavors of *nix.

Finally, I should add that Macs keep getting quieter. I practically never hear the fan on my Air (which belongs to the company I work for.)
You should never hear the fan on any laptop. If you do, there's hardware/design problems.

All in all, Apple is a premium product in the sense that you don't need to think about it for it to work. It just works, it looks nice, and generally speaking, they're very responsive.
I can say the same for my December 4, 2011, install of Windows 7 on this machine. This is a feature of all well configured computers and not exclusive to any Apple product.
 
I can say the same for my December 4, 2011, install of Windows 7 on this machine. This is a feature of all well configured computers and not exclusive to any Apple product.
You are not the typical user, Ford. Most people don't even know what a service or a thread even is, let alone know how to keep it squeaky clean and running well.
You should never hear the fan on any laptop. If you do, there's hardware/design problems.
With most laptops under full load, you hear a fan. It takes a long time at full load before the fan even starts making any noise on my Air. The point is that you can do a lot with it before it makes the tiniest bit of noise. I don't find that to always be true of PC laptops. Maybe I've just experienced the wrong ones...
Alongside all of the flavors of *nix.
If you're so inclined to go that way, sure. I think your average user would have an easier time with OS X than a distro of Linux though.
It isn't a tangible thing. You know, like sticking "Oreos" on a cookie sandwich instead of just "cookie sandwich." But I guess that goes back to the original point that people buy it because it is Apple, not because of the merits of the product.
So OS X and the time they put into designing something aren't worth that? They're making a platform, not a laptop. They control it all the way through. You don't need to mess with things like drivers because they do it for you. That might be nothing to you, Ford, but your average person doesn't know what they're doing with a computer, and Apple tries to make it less painful. Sure, for you and me it's not worthwhile, but for a lot of people it is. Just because you don't need it doesn't mean others don't and that it's not useful.
 
Excepting the display, what is "premium" about it? If you separate the display from the other components, most people would never consider it. I would argue this is a hallmark of marketing and packaging, not whatever "premium" is. Then again, perhaps "premium" is marketing and packaging in a nutshell. It isn't a tangible thing. You know, like sticking "Oreos" on a cookie sandwich instead of just "cookie sandwich." But I guess that goes back to the original point that people buy it because it is Apple, not because of the merits of the product.

You can't quantify everything, that would be a terrible world. The design (both external and internal) is premium, and generally it's not tangible, which is sort of the point. The design is part of the merits people base the purchase on. A lot of Apple sure is about design rather than performance ... and there is nothing wrong with that.
 
You are not the typical user, Ford. Most people don't even know what a service or a thread even is, let alone know how to keep it squeaky clean and running well.
Only programmers really need to know that stuff because, beyond turning services on and off, there's not much a user can do about either. And you flatter me; my hard drives are filthy, especially the volume with the OS installed. In terms of running well, all I do there is keep malware off it.

Maybe I've just experienced the wrong ones...
Yes, yes you have. About the only laptops I've encountered that make a racket do so only when playing games (loading the GPU, CPU, and HDD).

You don't need to mess with things like drivers because they do it for you.
All OEMs do. Dells, HPs, Sagers, Apples, etc. all come with everything to run the operating system preinstalled. Most users never have to touch them. I've worked on a lot of computers, in fact, that are more stable with the original drivers than with updated drivers.

You can't quantify everything, that would be a terrible world. The design (both external and internal) is premium, and generally it's not tangible, which is sort of the point. The design is part of the merits people base the purchase on. A lot of Apple sure is about design rather than performance ... and there is nothing wrong with that.
I'd argue Gamma Tech and Sager laptops are "premium." Gamma Tech = semi- and fully ruggedized; Sager = portable workstations. In both instances, the "premium" is tangible because they have obvious features that set them in a league of their own.

Edit: Looks like Gamma Tech set their sights on the iMac. Well, not really, because iMacs aren't semi-rugged, but they are in the sense that they are now offering an all-in-one desktop system.
 
Not everyone has a choice as to what OS, what software and what hardware they use.

For example, I know someone who used Windows XP for quite some time at work on her office machine. A few years before Windows 8 came out (making Windows 8 a non-issue), this person was unceremoniously switched from a Windows XP PC to a new 27" iMac. She was also given a MacBook Pro for travel by her employer. She is a developer / programmer with a bachelor's degree in computer science.

That could be something of a pitfall of the industry, in a sense. One should be able to adapt seamlessly, or near seamlessly, to change, and even continue their education to keep up (often paid for by the employer). If you can't keep up, then you may unfortunately find yourself unemployed very quickly, and someone that future employers in the same field (making similar changes) won't want to employ moving forward.
 
A few years before Windows 8 came out (making Windows 8 a non-issue), this person was unceremoniously switched from a Windows XP PC to a new 27" iMac. She was also given a MacBook Pro for travel by her employer. She is a developer / programmer with a bachelor's degree in computer science.
IT department doesn't seem too bright unless she exclusively programs for iOS/Mac OS X. They wasted a ton of money and they're giving her poor tools to work with compared to what she likely came from (Visual Studio).

Most businesses around here are going from XP to Windows 7. I only know of one that is going from Windows 7 to Windows 8.1 and they do a ton of CAD work.
 
IT department doesn't seem too bright unless she exclusively programs for iOS/Mac OS X. They wasted a ton of money and they're giving her poor tools to work with compared to what she likely came from (Visual Studio).

Most businesses around here are going from XP to Windows 7. I only know of one that is going from Windows 7 to Windows 8.1 and they do a ton of CAD work.

People in education seem to like Apple. Also, I should note that not only do I work for an educational institution, but Apple gives discounts to such institutions for their products. I should also note that we almost never have someone come to us and say that their MacBook Air isn't working. In the last 6 years, I've witnessed maybe 8 RMAs out of over 150 laptops, 4 of which were due to solder issues on 8000- and 9000-series GeForce GPUs, which Apple repairs even outside warranty.

All I'm saying, Ford, is that Apple makes a solid product and that our sysadmin almost never has to think about machines already provisioned and distributed. Not only that, but our employees seem to be happy with them. In all seriousness, pair that with 11+ hours of battery life and you don't have much, other than the price tag, to complain about... unless you really dislike the UI, like a co-worker of mine.

There is wasted money in the sense of hardware, but then there is wasted time in the sense of man-power. If you need more time to manage a PC house than a Mac house, that's money you're paying your sysadmin(s) to do their job. Our sysadmin (singular) doesn't think about our machines often, and neither did I when I was sysadmin.

I understand the resentment towards Apple, but in all seriousness, they build a good product.
 
From what I've seen in a similar environment with 100+ Dells and HPs, most of the problems stem from printers and VoIP phones. If a Dell fails, it is replaced with an HP. If an HP fails, it is replaced with an HP and the dead HP goes to HP (they're still warrantied).

Your "sysadmin almost never has to think about machines already provisioned" because there's nothing he/she can do except send it back to Apple. They're purposely designed to not be repaired by anyone except Apple so what is there to think about?
 
Your "sysadmin almost never has to think about machines already provisioned" because there's nothing he/she can do except send it back to Apple. They're purposely designed to not be repaired by anyone except Apple so what is there to think about?

Troubleshooting, initiating the RMA, and re-imaging the new machine all take time. My point is that he doesn't need to think about machines often, and the RMAs don't happen often either. In fact, our last two sysadmins haven't had to do an RMA because no machines have failed in the last 2 years.
From what I've seen in a similar environment with 100+ Dells and HPs, most of the problems stem from printers and VoIP phones. If a Dell fails, it is replaced with an HP. If an HP fails, it is replaced with an HP and the dead HP goes to HP (they're still warrantied).
...and all of that takes a sysadmin's time which costs money.
 
But they're saving tons of cash by buying <$500 machines instead of >$1000 machines. You have to encounter a lot of problems across a lot of machines for Apple to catch up.
 