More thoughts on why we aren't seeing another 27-inch iMac

Ditto re the screen size. I would upgrade from my 27" iMac if there were a larger screen available, and I just don’t understand why Apple has abandoned the larger screen…so many people I know in the architectural community who have older 27" iMacs are also scratching their heads…

It’s pretty clear that Apple is not going to produce an iMac with a 27" screen. Now that the basic Mac mini is at least as powerful as an iMac, you should consider getting an appropriate Mac mini and 27" monitor. If 4K works for you, there are many affordable ones (in particular, the Dell UltraSharp gets great reviews). If you must have 5K, LG is quite affordable, Samsung is Samsung, and Apple is expensive but matches the look you’re familiar with. If you’ve been using the internal iMac speakers and are not getting the Apple display, you should check out speaker alternatives. Finally, you can repurpose your old keyboard and pointing device, although you might consider a new Magic Keyboard with Touch ID (quite useful for authentication).

@jdunham The solution for your architects is a Studio Display with a Mac mini or Studio. The screens will last for two or three system upgrades.

I ran that math for a law firm figuring on ~10-12 years for the displays. We replaced 2013 vintage 27" iMacs starting in 2021. If we could have upgraded macOS on those iMacs, we would still be running them as they were perfectly fine for our basic needs.

But separating the display from the CPU means that now I can, for the next replacement cycle, save significantly, just purchasing minis or Airs.

And now that the mini is smaller, I’m sure we’ll see a mount for the back of a Studio Display to better hide the cable clutter.

Thank you, Alan, I’ve been considering the Mac mini with the Dell UltraSharp but had been holding back just in case Apple surprised us! :-) I appreciate your comments!

Thank you, Jonathan, I appreciate your comments. I have considered the Studio Display but got scared off by the cost…perhaps I should run the numbers again! :-)

You might want to look at the LG UltraFine 27MD5KLB-B 27" 16:9 5K IPS Monitor (I know that’s a mouthful). It uses the same panel as the Studio Display, but at this moment, it’s only $850 at B&H. I’ve purchased a few and the quality seems on par with Apple’s.

Another option, which I utilize at every opportunity, is to purchase Apple refurbs. A standard Studio Display is $1,359. Pretty much everything I’ve purchased for my clients is from the refurb store. You just have to wait 3-6 months from introduction, but the refurbs offer immediate delivery of a higher-RAM or larger-SSD configuration, assuming it’s available at your time of need.

Just a few additional things to consider.


I wonder if Apple introduced the 27-inch iMac because that panel became available, and the company felt that the price would come down as 5K 27-inch panels became more broadly available throughout the display industry. When the world instead standardized on 4K 27-inch panels, perhaps the cost of the 5K panel stayed high, significantly cutting into Apple’s margins.

The only problem with this theory is that the 24-inch 4.5K screen used by the current iMac would seem just as unusual; I haven’t seen other displays using that panel at all (not that I’ve looked hard). But maybe it’s small enough, or sufficiently easier to build, that it’s still worthwhile to put into a low-cost Mac.

This is very important. The upscalers built into displays work very well for video, and might also be acceptable for text (of course, YMMV).

And if macOS recognizes the display as Retina (either automatically or via third-party software), you can use Apple’s scaling, which looks very good and shouldn’t affect system performance unless you’re running GPU-intensive software (and if you are, you can probably turn off scaling during your rendering sessions and turn it back on when the rendering is done).

And yes, I’ve posted several comments all about why 2x scaling is better than fractional scaling. 2x is better, but fractional still looks very good and won’t impact system performance if you’ve got lots of GPU cycles to spare.
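
For what it’s worth, here’s a rough sketch of where that GPU cost comes from, assuming the usual macOS behavior of rendering at twice the “looks like” resolution and then downsampling to the panel’s native resolution (the display sizes below are just examples):

    # Rough sketch: macOS renders at 2x the "looks like" size, then downsamples
    # to the panel's native resolution (example display sizes only).
    def backing_store(looks_like, native):
        lw, lh = looks_like
        nw, nh = native
        bw, bh = lw * 2, lh * 2                # framebuffer macOS actually renders
        overhead = (bw * bh) / (nw * nh)       # rendered pixels per native pixel
        return (bw, bh), overhead

    # 4K panel set to "looks like 2560x1440" (fractional scaling): ~1.78x the pixels,
    # plus a resampling pass, which is where the extra GPU work goes.
    print(backing_store((2560, 1440), (3840, 2160)))

    # 5K panel set to "looks like 2560x1440" (exact 2x, maps 1:1): no extra work.
    print(backing_store((2560, 1440), (5120, 2880)))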


I’m wondering why there are so few 27" 5K displays. I think there are just Apple’s plus one LG. And the BenQ PD2730S, which you can’t actually buy.

Windows supports high resolution displays, so that isn’t it.

There are so few because they are expensive and most people don’t want to spend that much.

1080p and 4K displays are cheap because the manufacturers of the panels can make them in massive quantities for the TV market. But other resolutions (1200p, 1440p, 5K, 6K and various ultrawide resolutions) can only be used for computer displays, making them more expensive and less common than displays that use TV-resolution panels.

5K and 6K panels are computer-only, are the highest resolutions available, and are marketed at users who want or need high-end displays, all of which makes them expensive and limits the size of the market. And when they are OLED with high color accuracy, that just makes them even more high-end and expensive. So I’m not surprised we don’t see a lot of them sold.


This has been discussed in other threads. While 4K displays are useful for video and gaming, 5K displays are not. Thus, the demand for 5K displays is comparatively negligible. By the way, Samsung also makes a 5K display, although even the reviews on its website are less than terrific.

Agreed! My professional work is in broadcast and production engineering, and in the early days I was very concerned with using displays whose native resolution matched the video being watched. But scaling has come so far that I don’t worry about it much, except where there is reason to evaluate an image at true pixel-for-pixel res. In some ways I was sorry to see the demise of CRTs because they have no “native” resolution; no fixed-pixel dimensions.

From a practical standpoint, pushing for more resolution on “computer” displays only goes so far, especially if the viewer is not 20 years old! It’s nice to have tons of desktop space, but teeny-tiny text and images are not useful in many situations. I’m not sure why someone would want a 6K display for computer use. Or, for that matter, watching movies that are 4K at best (and heavily compressed). Hype and numbers get the best of people.


It’s not all hype.

Yes, if you stick with fonts of a certain pixel size, then higher resolutions make everything smaller (or you’re forced to buy insanely large displays). But all modern operating systems have scalable UI elements, so more pixels per inch can also be used to produce same-size text and graphics, but with much smoother edges around all objects. Much like how a 1200 dpi printer produces better-looking text than a 300 dpi printer.

4K movies? Depends on the size of your screen. If you’re using a 30" screen, you probably won’t see any difference (unless you’re sitting very close to the screen). But if you have a home theater screen (e.g. 65" or larger), then you can definitely see the difference, even from 15’ away. I agree that the massive compression used by most streaming services makes it mostly pointless, but 4K Blu-Ray movies definitely look better on these large screens.

This is the same reason many movie theaters use 8K projectors: the screen is so large that people sitting in the front row can often see the pixels of a 4K image, even though people at the back probably won’t notice much difference.

Just about everyone who works with any of the vast multitude of professional print publishing businesses, including packaging, books, and magazines. Add in automotive, food, clothing, pharmaceutical, and other companies that are required by the US government to include printed product information, as well as the ultra teeny-tiny printed signage for millions of services and products.

Apple thrives in print and digital spaces. It’s one of the reasons Apple is one of the largest companies in the world.

I think you are neglecting that physical resolution does not need to match the rendered resolution the user sees.

In fact, the entire Retina revolution Apple kicked off with the iPhone 4 back in 2010 was based on exactly that. You build in lots of physical pixels but then you scale displayed content to end up with reasonable user-facing resolution. That way items stay the same size you’re used to, but they are much more sharp and crisp because the system can resolve their details better with the display’s higher physical resolution. And note that for this to deliver the very best results you want the scaling to be an integer multiple.

This is exactly why we Mac users love panels like Apple’s 5K 27", be it on former iMacs or now in the Studio Display. It offers 5120x2880, which can be scaled by exactly a factor of 2 to deliver a conventional 27" image (displayed at an effective 109 ppi). But it does so while resolving much more detail, because it can render with 4x as many pixels. And that is why it gives us such a nice crisp image.
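
To put quick numbers on that, here’s a back-of-the-envelope check of the figures above, assuming standard 16:9 panel geometry:

    # Quick check of the 5K 27" figures (16:9 panel geometry).
    from math import hypot

    width_in = 27 * 16 / hypot(16, 9)        # panel width, ~23.5"

    physical_ppi = 5120 / width_in           # ~218 ppi
    effective_ppi = (5120 / 2) / width_in    # ~109 ppi at exact 2x scaling

    print(round(physical_ppi), round(effective_ppi))   # 218 109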

So in that sense more pixels are better, not because you see more (which would require displayed items getting increasingly smaller), but because your stuff gets sharper, crisper, and the details better resolved while maintaining constant size.

The only question is where should that end? My old-school physicist attitude is to play a simple estimate game. IIRC the human eye has an angular resolution of about 0.1 mrad. A typical distance from my eyes to my large desktop screen is, say, about 24", which means at that distance my eye can resolve two points if they are separated by 61 microns. And those 61 microns correspond to 416 ppi. To me that indicates we have roughly another factor of 2 to go from where we are now (Apple’s 5K 27" has 218 ppi physical resolution). Just for fun: that would mean roughly 8K on a 22.5" display, or just shy of about 10K on a regular 27". Obviously, this is just a simple estimate, but it indicates that we haven’t yet reached the end of what makes sense to optimize in terms of raw resolution.
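
For anyone who wants to redo the estimate, here it is in a few lines, using only the 0.1 mrad and 24" figures from above (nothing new added):

    # Back-of-the-envelope eye-resolution estimate (0.1 mrad at 24").
    from math import hypot

    spot_in = 0.1e-3 * 24               # resolvable spot at 24": ~0.0024" (~61 microns)
    eye_limit_ppi = 1 / spot_in         # ~417 ppi

    def panel_width(diagonal_in):       # width of a 16:9 panel
        return diagonal_in * 16 / hypot(16, 9)

    print(round(eye_limit_ppi))                       # ~417 ppi
    print(round(eye_limit_ppi / 218, 1))              # ~1.9x today's 218 ppi
    print(round(eye_limit_ppi * panel_width(22.5)))   # ~8170 px across a 22.5" panel ("8K")
    print(round(eye_limit_ppi * panel_width(27)))     # ~9800 px across a 27" panel (~"10K")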


To be clear, the text on a Retina display looks better because it needs less anti-aliasing. The smaller pixels produce lines that are less jagged on their own, so the system doesn’t have to rely as heavily on anti-aliasing (changing pixel colors along an edge) to compensate.
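
If a concrete picture helps, here’s a toy sketch of what anti-aliasing does: pixels only partially covered by a diagonal edge get intermediate shades instead of a hard jagged step. The line and grid here are made up purely for illustration.

    # Toy illustration: shade each pixel by how much of it lies under the
    # line y = 0.5 * x, estimated by supersampling (made-up line and grid).
    def coverage(px, py, samples=4):
        hits = 0
        for i in range(samples):
            for j in range(samples):
                x = px + (i + 0.5) / samples
                y = py + (j + 0.5) / samples
                if y < 0.5 * x:
                    hits += 1
        return hits / (samples * samples)

    for py in reversed(range(4)):   # print top row first; darker character = more coverage
        print("".join(" .:*#"[int(coverage(px, py) * 4.999)] for px in range(8)))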


Fun fact: Did anyone use Safari on Windows (2007-2012)? It was very apparent that the same web page was much fuzzier in Safari than in Internet Explorer. It looked kind of blurred in comparison, even with ClearType (Microsoft’s name for its sub-pixel anti-aliasing) enabled.

The reason was a difference in philosophy for how to scale fonts to different sizes. macOS prioritizes faithfulness to the font design, while Windows tries to fit text to whole pixels. So the text on Windows looks sharper even though it is a distortion of the actual font face.
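
A much-simplified way to picture that difference (illustrative only; real hinting adjusts glyph outlines too, and these advance widths are made-up numbers):

    # Simplified sketch of the two philosophies, using made-up fractional
    # glyph advance widths (in pixels). Real hinting is far more involved.
    advances = [6.4, 5.2, 7.8, 6.4, 5.2]

    faithful = []                      # macOS-style: keep fractional pen positions
    x = 0.0
    for a in advances:
        faithful.append(round(x, 2))
        x += a

    snapped = []                       # Windows-style: snap each pen position to the pixel grid
    x = 0.0
    for a in advances:
        snapped.append(int(round(x)))
        x = round(x + a)

    print(faithful)   # [0.0, 6.4, 11.6, 19.4, 25.8] -> true spacing, softer edges
    print(snapped)    # [0, 6, 11, 19, 25]           -> whole pixels, crisper but distorted spacing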

Safari (and other Windows apps based on Apple’s port of the Cocoa frameworks) did not use Microsoft’s ClearType. These apps provided their own anti-aliasing code based on the algorithms used by macOS.

I don’t think the Apple algorithms performed sub-pixel anti-aliasing, but only used whole-pixel algorithms. This makes text slightly less sharp, but the algorithm doesn’t have to be tuned for your monitor, unlike ClearType, which does need to be tuned for best results.

The Microsoft approach probably made more sense in the days of low-resolution panels (e.g., all those early LCDs at 800x600 and 1024x768), where you could see individual pixels and sometimes even subpixels. But today, with 1080p and 4K panels, individual pixels are often not distinguishable, and subpixels even more so. So an algorithm that doesn’t require tuning makes a lot more sense today.
