non-Apple monitor -- the scaling problem

There’s been a lot of discussion about this topic elsewhere on this list. Three options for 27” 5K monitors have been mentioned: Apple, LG and Samsung. Right now the LG is available for ~$850 from B&H Photo. The Samsung has had pretty wide price fluctuations on Amazon over the last month. Apple’s price hasn’t changed, though a refurb might be an option.

No problem. But I’d like to underscore the point that monitors are a personal preference thing. You may really like the look of a 5K display (whether Apple, LG or someone else’s). Others may not. Some want a larger screen for that many pixels while some prefer Apple’s “Retina” scaling to produce extra-sharp images.

And some would prefer an ultra-wide display (I’m one of those, but I really don’t want to get rid of a working display in order to put one on my desk).

Ultimately, see if you can go to a store and see for yourself what a variety of different displays look like. I know Micro Center has a lot of models on display. Other stores (Best Buy, Staples, etc.) also have display models, but with a much smaller selection.

If the store will let you connect your own computer to a display unit (laptops are clearly preferable for this), then that would be even better.

Reviews and commentary will help you avoid a clearly bad choice, but if you can try it yourself before buying, that will give you a better chance of getting what you think is best, vs. what a consensus of others think is best.

3 Likes

As suggested in another discussion, 9to5Mac has a good review of Thunderbolt monitors:

It includes detailed specs of each device.
That page also has a link to a review of USB-C monitors.

3 Likes

I may stop by B&H to take a look at what’s available from them. They have a fairly large Apple department and often have other displays connected to Minis. As for the personal preference, yes. I prefer very sharp text but want to avoid really tiny text on a large display. The Apple Studio monitor I looked at in the Apple store seemed to provide a good balance between display type and clarity (as well as brightness and color rendering). When I was last at B&H, I think they had an LG display connected to a floor-model Mac Mini.

I understand completely. There are three ways to get larger text on a display:

  1. Buy a display with lower resolution. Clearly, if no scaling is happening, text on a 1080p display will be larger than on a 4K display of the same size.

  2. Buy a high resolution display but send a lower-than-native resolution signal to the display. For example, send a 1080p signal to a 4K display. The display’s built-in upscaler will make it fill the screen.

    The image quality may vary quite a bit from model to model, because some upscaler chips do a better job than others.

  3. Buy a high resolution display that Apple detects as Retina (or use a tool like SwitchResX to make macOS think it’s a Retina display) and use macOS’s built-in scaling to make the text the size you like.

    This creates a desktop with an effective resolution lower than the display’s native resolution, but macOS will render text and vector graphics at the display’s native resolution, so it should look sharper than either of the previous two options.
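To put rough numbers on why option 3 looks sharper, here’s a sketch (the helper function is mine, not anything macOS exposes) of how much real pixel detail the same on-screen element gets under each option:

```python
def rendered_pixels(points: float, render_scale: float) -> int:
    """Pixels of real detail rasterized for a given on-screen size in points."""
    return round(points * render_scale)

# Option 1: native 1080p display -- rendered and shown 1:1
print(rendered_pixels(100, 1.0))  # 100 px of real detail

# Option 2: 1080p signal to a 4K display -- still rendered at 1:1;
# the monitor's upscaler just stretches those pixels across the panel
print(rendered_pixels(100, 1.0))  # 100 px of real detail (then stretched)

# Option 3: HiDPI "looks like 1080p" on a 4K display -- macOS renders
# at 2x, so the same element carries twice the linear detail
print(rendered_pixels(100, 2.0))  # 200 px of real detail
```

Options 1 and 2 start from the same 1:1 rendering; only option 3 actually puts more rendered pixels behind each character.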

I’m not surprised, because it is a Retina display. So you’re getting the macOS GPU-based scaling, which is going to give you the best quality text for a given (effective) desktop size.

2 Likes

I’m using a 27" 4K Dell S2721QS monitor on a 2018 Mac Mini. When I booted the Mini up with the Dell, it took the default 1920 x 1080 resolution, just the same as the (dead) monitor it replaced. The monitor can handle resolutions up to 3840 x 2160, but my aging eyes can’t. I can read type at a step or two up in resolution, but going much higher would give me eyestrain. So far I’m happy with it, and I got a good deal at Micro Center, so I didn’t pay much extra. 4K prices have come down.

2 Likes

I’ll have a look at the monitor. Some of the Dell displays have gotten good reviews. So, if you were to use the monitor at 3840x2160, I assume the displayed type would be far too small for you to read comfortably. That’s a consideration for me as well. Next time I’m in the area, I’ll have another look at the Apple Studio Display to see what the display type looks like on screen. I’m assuming they are using the monitor’s highest resolution.

By the way, I’m curious: is anyone using the LG UltraFine 5K that was designed for the Mac a few years ago? Build quality was very good as I recall, and it’s about $500 less expensive than the Studio Display. It’s still an expensive display though, at around $1,100.

See also this discussion:

In particular, I recently bought an LG UHD 4K monitor that works well with an M2 MacBook Air. One USB-C cable charges the MacBook, sends video to the monitor and gives me access to external hard drives plugged into the monitor (it seems to have similar features to the LG 5K monitor mentioned above, but at much lower cost).

I’ve had one of the 4K LG UHD models (the 34" ultrawide curved) on my M1 Mini. I’m happy with it. Haven’t really been craving for Retina, but admittedly I’ve got nothing to compare it to.

Right. If there is no scaling (e.g. a non-retina screen), then a 4K display is going to render everything at half the size of a 1080p display. And it doesn’t surprise me if this is too small, even on a 27" screen.

But if macOS recognizes the display as retina, then it will do GPU-based scaling. You’ll get a full 4K resolution signal, but all the text and graphics will be larger.

So what’s the advantage if everything is displayed at the same size? The higher dots-per-inch display resolution. You will find that text and vector graphics look better because there are more pixels per unit-distance on the screen. Bitmap graphics might also look better, depending on the internal resolution of the source image data.

The downside of this GPU-based scaling is that your system will consume more video memory and the GPU will be doing a bit more work. But depending on your Mac’s configuration and normal workload, you may never notice the difference.
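The extra video memory is easy to estimate with back-of-the-envelope arithmetic. This is a sketch assuming 4 bytes per pixel and a single full-screen buffer; a real driver keeps additional swap-chain copies and may compress, so treat these as lower bounds:

```python
def framebuffer_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Approximate size of one full-screen buffer in MiB."""
    return width * height * bytes_per_pixel / 2**20

# Non-retina: one buffer at the panel's native 4K resolution
print(round(framebuffer_mib(3840, 2160)))  # ~32 MiB

# HiDPI "looks like 2560x1440": a 2x backing store of 5120x2880
print(round(framebuffer_mib(5120, 2880)))  # ~56 MiB
```

A few tens of megabytes is noise on any modern GPU, which is why most users never notice it.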

1 Like

This is the part I’m a bit confused about. What does it mean for a display to be recognized as a “Retina” screen if it’s not an Apple screen? I know it’s a trademark and the screens have certain characteristics, but I believe the specs on those screens have changed over the years. Or, are there non-Apple displays that are also “Retina” displays?

The new display seems to look better, but I have no way to compare because my old monitor died and I had to use an old 22-inch Samsung until I bought the Dell 27". I don’t notice any slowdown, but I don’t do much to strain the machine.

“Retina” is a marketing term, not a technical one. Modern displays will be recognized by macOS with their native resolution and then macOS offers various scaling resolutions you can choose from. It is quite common to not run these high-resolution displays at their native resolution.

If you go non-native, you can either choose an exact integer fraction (e.g. display 1920x1080 on a 4K display with its native 3840x2160) or something in between, like the 2560x1440 I have chosen on my 4K Dell U2720Q with its native 3840x2160. macOS shows you the exact resolution for various scalings if you hover the mouse over those selections in System Settings > Displays.

I have yet to find a single benchmark that shows me any difference whatsoever from my GPU accommodating a native 3840x2160 vs. non-integer fraction 2560x1440 vs. exact integer fraction 1920x1080. For sure the GPU has to do work here (and mem footprint is indeed measurably different) and for sure that work load changes depending on chosen setting (as does the mem footprint), but I have yet to see any sign that it actually affects measured performance using any benchmark I have so far run on this system (M1 Pro 14" MBP). I would expect that might change if I tried the same tests using a higher-res monitor like Apple’s XDR 6K or an 8K system. But on 4K, nothing. M1 is awesome. :slight_smile:
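For the curious, the arithmetic behind those settings looks like this. The 2x backing store is how macOS HiDPI rendering works; the function name is mine:

```python
def hidpi_pipeline(looks_like: tuple, native: tuple):
    """macOS renders the desktop at 2x the chosen 'looks like' size,
    then the GPU resamples that backing store to the panel's native pixels."""
    backing = (looks_like[0] * 2, looks_like[1] * 2)
    factor = native[0] / backing[0]  # downscale ratio applied by the GPU
    return backing, factor

# The non-integer setting mentioned above: 2560x1440 on a native 3840x2160 panel
print(hidpi_pipeline((2560, 1440), (3840, 2160)))  # ((5120, 2880), 0.75)

# The exact integer fraction: 1920x1080 on the same panel
print(hidpi_pipeline((1920, 1080), (3840, 2160)))  # ((3840, 2160), 1.0)
```

At "looks like 1920x1080" the 2x backing store matches the panel exactly (factor 1.0), so no resampling pass is needed; at 2560x1440 the GPU downscales by 0.75.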

5 Likes

Apple’s Display settings (at least in macOS Sonoma) can show what is happening. My home setup has an Apple Studio Display as the primary monitor and a Dell 4K U2723QE as a secondary monitor. I’ve attached 3 images below to show the possible resolution settings.

After accessing the Display settings, tap the Advanced button and turn on ‘Show Resolutions as list’ (1st image). Then tap ‘Show all resolutions’ in the monitor’s resolution list. The 2nd and 3rd images show the possible resolutions for both monitors. The top one is the native resolution. In both cases, the default resolution is precisely half the native resolution.

Note that there is also a low-resolution version of that, which is created by applying the color value to each of the 4 pixels representing the simulated pixel at that resolution. The sharp version uses the native resolution for graphics and a doubled font size for text at the native resolution. So, when you ask for a 12-point font at the default resolution, the display shows you a 24-point font at the native resolution. This, of course, sharpens the image.

So, you should be able to accomplish this magic using an appropriate graphics driver (which is applied for Mac-compatible monitors). Retina does not refer to this trick but to the pixel density.



So, a 4K monitor shows a sharper image, even when set to a 1920x1080 desktop. If you set the monitor to a low-resolution version of the desired view, you can see the difference a high-resolution monitor makes, even when used at less than full native resolution.
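The font-doubling trick described above is just multiplication. A minimal sketch (the function name is mine):

```python
def native_px_for_font(point_size: float, scale: int = 2) -> float:
    """At the default (half-native) setting, a requested font size is
    drawn at scale-times that size in native panel pixels."""
    return point_size * scale

print(native_px_for_font(12))  # 24.0 -- the 12-point / 24-point case above
```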

4 Likes

The video that concerned me was this (the part about performance comes around the 6:50 mark).

I’ve been using the term more generically, not just as an Apple brand.

In general, a “retina” display is one with a resolution high enough that you cannot distinguish pixels at a normal viewing distance. For a desktop display, this may be around 200 ppi. For a phone display, it may need to be higher (maybe >300 ppi), because these screens are typically held closer to your eyes.

FWIW, my desktop display is a 24" screen at 1920x1200 resolution. This is about 94ppi, and is NOT considered Retina by most people. A 4K (3840x2160) at 24" is 182ppi, which may or may not be considered “retina”. At 27", it’s 162ppi.
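That arithmetic is easy to check yourself; small differences from the figures above are expected, depending on how the diagonal is measured and rounded:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch from pixel dimensions and the diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1200, 24)))  # ~94  -- not "retina" at desk distance
print(round(ppi(3840, 2160, 24)))  # ~184 -- borderline for a desktop
print(round(ppi(3840, 2160, 27)))  # ~163
```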

Some apps call these displays “HiDPI”, probably in order to avoid treading on Apple trademarks.

But that’s irrelevant. My arguments are not based on whether or not someone will use the term, but how macOS treats the display. For a non-retina display (like mine), Apple presents a set of resolutions for you to choose from, and does not scale its output:

A display that macOS considers “retina” doesn’t present a set of resolutions (unless you jump through some hoops to get the list). Instead, it only outputs the display’s native resolution and presents settings to let you configure the scaling factor for the user interface:

Apple auto-detects their own monitors as HiDPI (“retina”) capable. I think (but don’t know for sure) that it can perform this detection for some third-party displays (I would assume those that can report their physical size and therefore allow macOS to compute its native DPI).

I did some web searching to see how to trick macOS into thinking you have a HiDPI display when it doesn’t auto-detect one. I found a lot of tips, tricks and apps which claim to do this, but it appears that most of them stopped working with the advent of Apple Silicon, which really sucks.

But here are a few articles that might help:

Sorry I can’t give you anything more concrete at this time. I was actually very surprised to learn that Apple killed off techniques that used to work. But I think it’s definitely worth giving some of these a try. If they don’t work, you haven’t lost more than a bit of time, and if they do work, it will probably look better than selecting a lower resolution.

Ah. He’s not wrong. Display scaling at a non-integer ratio does require your GPU to scale the screen. Yes, HiDPI scaling renders your desktop at 2x your configured “effective” resolution and then scales it to your display’s native resolution.

But note that he only noticed the GPU load problem because he’s doing video rendering, which is a massively GPU-intensive application.

If you are using GPU-intensive applications, then you may come to the conclusion that he came to. But if you’re not (e.g. doing more generic computing - office, web etc.) then you’ll probably have more than enough GPU power to spare and may not notice any problem.

3 Likes

No hoop jumping required. It’s one button away. See @aforkosh’s excellent post above.

This might have been different on ancient OS X versions or legacy Intel Macs, but on halfway recent macOS versions and Apple Silicon this has become quite simple and transparent.

Of course it can. In fact, I haven’t come across any 4K DP display in a couple years that wasn’t immediately recognized by macOS as scalable “retina”.

I’d argue you’d need to try to hook up a rather old or low-res monitor (or perhaps use some kind of obscure connection) to run into any kind of issue here. If OTOH you buy a screen today that you hook up via DP or USB alt mode, and it offers reasonably high res, macOS will offer scalings right away.

2 Likes

This discussion has gone all over the place, but as I suspected, what this guy is talking about doesn’t apply to most users. He’s doing high-end graphics work with demanding applications like Blender and really pushing the GPU. An average user doesn’t have to worry about GPU usage at all. In my work as a broadcast/production engineer I’ve connected all kinds of monitors to all kinds of Macs (and PCs) and never give a thought to this, even for editing systems.

I will add that I stay away from Apple’s displays as a rule because they are expensive, quirky and have limited connectivity. Apple’s attempts to make equipment “just work” are fine for non-technical users, but get in the way in some cases. And the marketing hype (Retina!) obscures the reality of what people actually need. I often spec Dell UltraSharp displays for general purpose use, and I generally prefer LG over Samsung. My wife is using a calibrated BenQ monitor for photo work.

Get the display that fits your budget for the size you want, and what your eyes can actually see (mine no longer benefit from ridiculously high resolutions because everything is too damn small).

– Eric

3 Likes

Oh, thanks David, that was all very informative. The fog is starting to clear on this issue. With respect to the video, I was more or less thinking the same, since I’m not doing video rendering (or even using Photoshop). Most of my work is text-based so I’m not putting a massive load on the GPU.

FWIW, the easy solution would be switching resolutions to native or exactly half native just before doing serious rendering. Generally, YouTube videos don’t get lots of views for keeping things in perspective.

1 Like