I currently use 2 Dell 25" monitors with my M1 Mac Mini. I am considering purchasing a 43" 4K TV to use as a monitor in place of the 2 existing monitors. (The width of a 43" TV is roughly the same as the combined width of the 2 existing monitors.)
Any major pitfalls or issues that I should be aware of? Especially looking for comments by anyone who has done something similar.
I was always told that this wouldn’t work, but…
I think the question is what effective resolution (pixel dimensions) you can use with the TV versus what you get with your two 25-inch monitors. The TV is physically big, for sure, but if you have to run it at a relatively low resolution, it won't be able to display as much content (think window size) as your two-monitor setup. Instead, you'll get less content at a larger size.
So check out the resolution number and the pixel density, usually specified in pixels per inch. The higher the pixel density, the crisper text will appear.
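As a rough sketch of that pixel-density comparison (the 2560x1440 resolution for the 25" Dells is my assumption, since the model wasn't given), the arithmetic looks like this:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Hypothetical 25" QHD monitor (2560x1440) vs. a 43" 4K TV (3840x2160)
print(round(ppi(2560, 1440, 25.0)))  # ~117 ppi
print(round(ppi(3840, 2160, 43.0)))  # ~102 ppi
```

So even though the 4K TV carries more total pixels than one QHD monitor, they're spread over a much larger area, and text will look somewhat less crisp than on the smaller screens.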
Thanks for that tip. I will include that in my research.
There was an earlier thread where the suggestion was made of going from dual large monitors to a single very large (34") monitor. This might be something for you to consider.
No reason you can’t use a 4K TV - as long as both computer and TV support HDMI 2.0 (and you’ve got a cable that can handle the bandwidth), then you should be able to display a 60Hz image without lossy data compression.
But that may not be a guarantee. So do your homework. Things to watch out for:
Not all TVs sold as 4K have a panel with native 4K resolution (3840x2160). Some, especially at smaller sizes and lower prices, may have lower-resolution panels. The TV will accept a 4K signal, but it will be downsampled/dithered to fit the screen.
Make sure you can disable overscan. A TV image (even a modern digital one, for some reason) is scaled slightly larger than the physical screen, so the edges get cut off when it is displayed.
Most TVs have a setting to disable this. If yours doesn't, macOS does let you shrink the image to compensate, but that comes at the cost of a softer, lower-resolution picture.
A TV (or computer) that isn't HDMI 2.0 compatible won't have enough bandwidth for a 4K 60 Hz signal. You may be able to drop down to a 30 Hz refresh rate, but you may find the result annoying. Alternatively, the connection can fall back to 4:2:0 chroma subsampling on a 60 Hz image, which keeps the higher frame rate but discards color detail, leaving text and fine edges slightly blurry.
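To see why 4:2:0 helps, here's a back-of-envelope sketch of the bandwidth arithmetic (using the standard CTA-861 4K60 timing of 4400x2250 total pixels including blanking, and the 10/8 TMDS encoding overhead; this is rough arithmetic, not a spec reference):

```python
# CTA-861 4K60 timing: 4400 x 2250 total pixels (incl. blanking) at 60 Hz
# gives the familiar 594 MHz pixel clock.
PIXEL_CLOCK_4K60 = 4400 * 2250 * 60  # pixels per second

def tmds_gbps(bits_per_pixel: int) -> float:
    """Raw TMDS rate in Gbit/s: pixel clock x bits/pixel x 10/8 encoding overhead."""
    return PIXEL_CLOCK_4K60 * bits_per_pixel * 10 / 8 / 1e9

rgb_444 = tmds_gbps(24)  # full-color 8-bit RGB: 24 bits per pixel
yuv_420 = tmds_gbps(12)  # 4:2:0 subsampling averages out to 12 bits per pixel

print(f"4K60 RGB:   {rgb_444:.2f} Gbit/s")  # 17.82 -- needs HDMI 2.0 (18 Gbit/s)
print(f"4K60 4:2:0: {yuv_420:.2f} Gbit/s")  # 8.91 -- fits HDMI 1.4 (10.2 Gbit/s)
```

Full-color 4K60 just barely fits inside HDMI 2.0's 18 Gbit/s ceiling, while halving the per-pixel bits with 4:2:0 brings it under even the older 10.2 Gbit/s limit, which is exactly the trade-off being made.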
Some TVs may only support computer-friendly image processing on one of several HDMI ports. So read your manual and use the correct port if you can’t seem to get everything working correctly.
Note that none of the above behaviors are problematic for a TV. When you’re watching a movie or a TV broadcast, you probably won’t notice things like downsampling (on a small screen) or color compression. But they will seriously compromise usability for a computer desktop.
None of these are deal-breakers. You can get very good picture quality from a TV. But it's not guaranteed. I frequently plug my laptop into a hotel TV when I'm traveling, and the image quality I get varies quite a bit from location to location. It's fine for streaming TV shows and movies, but I almost always stick to the laptop's own screen (which is fairly small on an 11" MacBook Air) for computing purposes.
And, of course, this is in addition to what @ace wrote above - larger screens translate to bigger pixels, not more of them. A 40" 4K TV is going to have the same number of pixels as a 75" 4K TV. If your intent is to sit right in front of the screen (and not across the room, as you would when watching a movie), then you’ll probably find that once you get larger than about 30-35", you won’t be happy with the results.