I need to replace my iMac but Apple's options aren't suitable

Waiting is best if you’re able. On the other hand, I have an iMac Pro that I brought home from work during the pandemic…and recently I got an M1 MacBook Air (16 GB memory). It beats the entry-level iMac Pro in several jobs I’ve run (video transcoding, for one)…

Good advice. I’m using an external SSD, a fast one from OWC, but my iMac is a 2017 and yours may not benefit from the fastest.

I should think you’d be able to migrate using Apple tools rather than CCC.

They’ve often announced Pro gear (MacBook Pro and Mac Pro iterations) at WWDC, as Marilyn suggests. I’d expect this year we’ll see the next tier of M-series Macs.

The M1 Mac mini is killing all the benchmarks for video processing and music production, even with its 16 GB memory limit.

And a 40" 4k tv at costco currently runs about $175. You won’t find a better “monitor” deal anywhere. Monitors cost more. Larger costs more. smaller costs more. 1k costs more. 8k costs more. I own several of these “monitors”. Looking forward to an M1 mini. I wouldn’t want the iMac precisely because it has a small, low rez monitor.

Someone please correct me if I am wrong, but I remember always being told that monitors and televisions are vastly different, and TVs are not designed for close-up viewing. A TV isn’t built to respond to much input beyond turning it on or off or changing channels, so typing, scrolling, processing images, cutting and pasting, etc. will feel significantly slower. Refresh rates are not the same; monitors are necessarily much faster. Resolutions are much different. Documents, spreadsheets, images, and web pages won’t look like they should. And color is managed differently.

Well, that was definitely the case back in the days of CRTs. Televisions then had relatively large dot pitches (the spacing between dots/lines in the shadow mask or aperture grille). They also had limited ranges of frequencies and typically only had composite/S-video/component inputs, which generally cannot support the high frequencies used by VGA-type interfaces.

CRT monitors, on the other hand, had much finer dot pitches (0.28mm was a common size for consumer monitors and some high end ones were even better) and included the interfaces/circuitry needed to support high resolution images.

Once CRTs went away and (almost) everything shipped started using LCD panels, most of the differences between TVs and computer monitors went away. The biggest remaining difference became the choice of interfaces (very few TVs have or ever had VGA, DVI, or DisplayPort inputs). But even this has become less of an issue because HDMI has become very popular for both TV/video equipment and computer equipment.

Today, the differences can be pretty subtle. TVs often apply filters to adjust image quality so movies look good. And TVs, by default, implement “overscan”, where the image is magnified slightly, cutting off the edges of the screen, but most also have settings to let you disable this. The names vary, but they are typically called things like “PC mode” or “Just scan”. Most TVs today also have “gaming” modes that remove most (or all) of the image filters in order to minimize latency.
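To put a rough number on what overscan costs you, here’s a minimal sketch (the ~5% overscan factor is a commonly cited typical value, not something from a spec):

```python
# Rough sketch of what a typical ~5% overscan does to a 1920x1080 desktop.
# The TV magnifies the image by the overscan factor, so the outer edges
# land outside the visible panel area.

def visible_area(width, height, overscan=0.05):
    scale = 1 + overscan
    return round(width / scale), round(height / scale)

w, h = visible_area(1920, 1080)
print(f"visible: {w} x {h}")                    # 1829 x 1029
print(f"cropped: {1920 - w} x {1080 - h} px")   # about 91 x 51 pixels lost
```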

There can be a difference in the HDMI signal. Video devices typically use Y/Cb/Cr encoding while computer devices typically use RGB encoding, and there are compressed encodings used for video which will make computer images look bad. But there’s massive overlap today. I’ve seen plenty of TVs that support RGB encoding and plenty of computers that can output Y/Cb/Cr, with or without compression.
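For anyone curious what that encoding difference actually looks like, here’s a minimal sketch of the full-range BT.709 RGB-to-Y′CbCr conversion (just one of several variants an HDMI link might use; real links may also use limited range, BT.601 coefficients, or chroma subsampling):

```python
# Minimal sketch: full-range BT.709 RGB -> Y'CbCr.
# Chroma subsampling (4:2:2 / 4:2:0), when used, is what makes fine
# computer text look soft, since color detail is stored at lower resolution.

def rgb_to_ycbcr_bt709(r: float, g: float, b: float):
    """r, g, b in [0, 1]; returns (Y', Cb, Cr) with Cb/Cr in [-0.5, 0.5]."""
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b   # luma
    cb = (b - y) / 1.8556                      # blue-difference chroma
    cr = (r - y) / 1.5748                      # red-difference chroma
    return y, cb, cr

print(rgb_to_ycbcr_bt709(1.0, 0.0, 0.0))  # pure red -> (0.2126, -0.115, 0.5)
```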

But technical discussion is just that. When push comes to shove, I haven’t really noticed any real difference for a long time. I often see businesses set up TVs in conference rooms in lieu of projectors because 65" 4K TVs are less expensive than HD projectors. Just plug the TV’s HDMI cable into a computer and everything just works. You might need to put the port into PC mode, but you only need to do that once, when setting up the TV.

I’ve never had a problem with images not looking right in these rooms.

The only potential gotcha I can think of is with smaller screens. Some smaller TVs (especially cheaper ones) may not have panels whose native resolution is the full 1920x1080 (HD) or 3840x2160 (4K). They will report support for those resolutions over the HDMI cable, but they will scale the image to the panel’s actual resolution. It’s not hard, though, to pull up the specs for most TVs to make sure the native resolution really is full HD or 4K.
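As a rough illustration of why the panel’s native resolution matters, here’s a small sketch (the panel numbers are hypothetical) showing the non-integer scale factor you end up with when a TV advertises 4K over HDMI but its panel is actually something smaller:

```python
# Sketch: compare the resolution a TV reports over HDMI with the panel's
# actual native resolution. A non-integer scale factor means every frame
# gets resampled, which softens fine text. Example numbers are hypothetical.

def scale_factors(reported, native):
    return native[0] / reported[0], native[1] / reported[1]

reported = (3840, 2160)   # what the TV tells the computer it accepts
native = (2560, 1440)     # what the panel actually has (hypothetical cheap TV)

sx, sy = scale_factors(reported, native)
print(f"scale: {sx:.3f} x {sy:.3f}")   # 0.667 x 0.667
print("integer scaling" if sx.is_integer() and sy.is_integer()
      else "non-integer scaling: expect soft text")
```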

IMO, the main reasons for getting a computer monitor are:

  • Support for VGA, DVI, DisplayPort, USB or Thunderbolt interfaces.
  • Aspect ratios other than 16:9 (e.g. 16:10 and various ultra-wide aspects)
  • Resolutions other than 1080p and 4K (e.g. the 5K and 6K screens that Apple likes to use these days).
  • High refresh rates. But many TVs today support high refresh rates as well.
  • Better color calibration. But you’ll find that the default calibration on non-professional monitors (that is, ones most of us are willing to pay for) isn’t necessarily going to be a whole lot better than a good quality TV.

The main reason for getting a TV is if you need a large screen (e.g. larger than 40"). Unless you’re prepared to pay insane prices, you won’t get more than 4K resolution anyway, and TVs are much less expensive. For example, I often see 4K TVs at 65-75" for sale in Costco for under $1000.

That used to be true, but it hasn’t been true in a number of years. What’s more, televisions benefit from the economies of mass production, so once they reached comparable resolution they quickly became superior in most applications.

Yes, color is rendered differently. I do some video work and honestly I can’t tell the difference, though I know that color pros can. I could see the difference between LED and plasma. The thing is… most of my work will be displayed on a 4K TV anyway, so any precision or preference beyond that will be lost in the conversion. Worse, most of it will be seen on YouTube or on 1080p displays or phones, which are even lower resolution.

For web pages, documents, images, etc, you won’t see any difference. You get more difference from the light in the room than you do from the monitor.

For speed… most 4K TVs today are 120 Hz, and 240 Hz is available. That’s faster than most gaming monitors. But we’re talking minute fractions of a second here that really don’t matter outside of pro gaming contexts. You seriously won’t see or feel the difference.
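To put numbers on those “minute fractions of a second”, the frame times work out like this:

```python
# Frame time (the interval between screen refreshes) at common refresh rates.
for hz in (60, 120, 240):
    print(f"{hz:3d} Hz -> {1000 / hz:.2f} ms per frame")
# 60 Hz -> 16.67 ms, 120 Hz -> 8.33 ms, 240 Hz -> 4.17 ms
```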

There is one downside, I’ll admit. Many TVs don’t have a hardware “off” switch; you have to pick up the remote and touch a button instead. Even when they do have a hardware “off” switch, it’s often located in an inconvenient or hard-to-reach place because nobody ever uses it anyway.

Oh, and power savers. I tend to turn mine off manually rather than letting them time out and power off by themselves. That works a bit differently with a TV than with a dedicated monitor. The TV will eventually power off if it isn’t receiving a signal, but it does a serious, hard power-off. Just touching your mouse won’t bring it back; you’ll have to power it on.

Anyone concerned about CMYK will be SOL with a TV.

What monitor supports CMYK? That’s an encoding that (I thought) is only used by printers.

CMYK is required by imagesetters, though it isn’t necessary if you’re never going to send anything out for printing or prepress.

CMYK is a subtractive colorspace. No monitor uses it. A monitor that accepts a CMYK signal (assuming such a device actually exists) is internally converting that data to RGB before sending it to the panel. There’s no technical reason why this must produce better results than a high quality RGB display and a computer that has been calibrated for it.
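For reference, here’s a minimal sketch of the naive, non-color-managed CMYK-to-RGB mapping; real prepress workflows go through ICC profiles rather than a simple formula like this:

```python
# Naive CMYK -> RGB conversion (no ICC profile, no gamut mapping).
# Color-managed workflows use device profiles instead of this formula.

def cmyk_to_rgb(c: float, m: float, y: float, k: float):
    """All inputs in [0, 1]; returns 8-bit RGB."""
    r = round(255 * (1 - c) * (1 - k))
    g = round(255 * (1 - m) * (1 - k))
    b = round(255 * (1 - y) * (1 - k))
    return r, g, b

print(cmyk_to_rgb(0.0, 1.0, 1.0, 0.0))  # magenta + yellow ink -> red (255, 0, 0)
```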

No conversion, whether internal or external, can ever do a perfect job. The color space of a printed page is not, and never can be, a perfect match for that of any display. So no matter how much you invest in your monitor, there is still no substitute for printing actual pages on a high quality printer that has been calibrated for the rest of your workflow.

But either way, you’re describing an edge case. Most users are not professional printers and no consumer device (whether sold as a TV or a monitor) will ever be able to provide the kind of calibration required for that application. Specialized applications always require specialized equipment.


I’m running the 8-core M1 MacBook Air to a VΛVΛ hub, and from the hub’s HDMI port an Amazon HDMI-to-DVI cable to an ancient 24" Samsung monitor.

Works beautifully – like having an iMac I can carry. :slight_smile:

And just so we’re clear, although a hub is great for other reasons, this simple $7 cable alone would accomplish the monitor part.

This confuses me. I thought iMac monitors were all 5K (Retina) these days, which is a higher resolution than 4K. This is one reason I’m not convinced that a Mac mini with a monitor/TV would be a satisfactory replacement for my aging iMac. (Though I do understand the argument that it’s a waste of an excellent monitor to have it permanently glued to the computer innards of a not-easily-upgradable iMac.)


Only the iMac 27" monitors are a full 5K (5120 by 2880 pixels). The Intel 21.5" version maxed out at a bit over 4K (4096 by 2304 pixels), while the new 24" Apple Silicon iMacs are in between: 4480 by 2520 pixels. Note that the pixel density on the new iMacs (218 pixels/inch) is the same as the pixel density on the 21.5" iMac. The latter note came from a review of the new iMacs published today on Ars Technica.
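For anyone who wants to check those pixel densities, the arithmetic is simple (the 23.5" figure below is the diagonal Apple quotes for the 24" iMac):

```python
# Pixel density (pixels per inch) from resolution and diagonal size.
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    return hypot(width_px, height_px) / diagonal_in

print(f'27" 5K iMac:   {ppi(5120, 2880, 27):.0f} ppi')    # ~218 ppi
print(f'24" iMac:      {ppi(4480, 2520, 23.5):.0f} ppi')  # ~219 ppi
print(f'21.5" 4K iMac: {ppi(4096, 2304, 21.5):.0f} ppi')  # ~219 ppi
print(f'40" 4K TV:     {ppi(3840, 2160, 40):.0f} ppi')    # ~110 ppi
```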

I’ve owned a 2014, a 2017, and a 2020 iMac 5K. The first two are rather anemic today, whereas the 2020 is pretty strong, though it’s still Intel, which you may not want to purchase new seeing as Apple Silicon will be replacing it shortly - unless you’re into gaming.

My 2020 iMac 5K has a 3.6 GHz Intel Core i9-10910, 128 GB of OWC RAM, a 4 TB SSD, an AMD Radeon 5700 XT with 16 GB, and 10 Gb Ethernet - the best Intel machine I’ve owned, but a machine whose single-core speed has since been eclipsed by the M1’s 3.2 GHz Firestorm cores. It fits my workflow just fine, especially as it can be booted into Windows and play games like Mass Effect Legendary Edition at 5120x2880 resolution with 4K assets.

The first M1x machines - the 14" and 16" MacBook Pros - are slated for introduction as soon as WWDC21 (tomorrow), but the 27" iMac replacement has apparently been delayed in favor of the 24" launch. Certainly if you have no interest in Windows you’ll probably want to wait for that model to appear.

Know, though, that we’re talking major bucks when it comes to the big iMacs, 2020 or later - my 2020 iMac came in around $4500 with AppleCare+, sans the RAM upgrade, on August 4th of last year - and I expect (at least initially) the Apple Silicon replacement to be around the same price.

Once the new model debuts, you can probably get the 2020 discounted, but if you’re going for the M1x these may be in short supply due to the chip shortage and pent-up demand. Also note that there’s a strong possibility that user-configurable RAM may no longer be an option.

The 14" and 16" MacBook replacements will probably blow anything Intel or AMD out of the water due to their performance/energy efficiency envelopes, but the iMac may still yet garner some competition from the Wintel alliance as the x86 vendors push up clocks, cooling, and core counts to astronomical levels trying to satisfy their users. You can bet the duo will be cautiously eyeing the AS competition which may attract OS agnostic users.

Verne, it looks like you maxed out almost every option. The base price for the 2020 27" Intel iMac is $1800. The price for my configuration (top i7, 2 TB SSD, mid-line graphics, 8 GB memory (replaced by 32 GB from OWC), 1 Gbit Ethernet) was about $3100 after a corporate discount.

I didn’t max out the SSD (4 TB) and left the RAM at minimum on the Apple order, so including AppleCare+ and sales tax it was closer to $5K.

I knew Apple Silicon was coming, but figured this was my best chance to do some Windows gaming using the same iMac (and wait out the transition on a fairly maxed out machine).

Meanwhile, its value may not plummet as far as some middle-of-the-road Intel Mac hardware: even if it becomes less desirable in the Apple world, it’s still a pretty powerful computer on the Wintel side and thus remains fairly unique as a machine straddling both worlds.

If that’s the only computer you own, that might make sense.

I own many more. Having a dedicated monitor is a liability in terms of convenience and desk space as well as cost. The one on my desk currently serves quadruple duty, via a KVM, between a work laptop, a personal mini, a personal laptop, and a personal Linux box. And I need at least one other for other computers, transient computers, etc. Having a small 5K iMac would be a nuisance.

The correct word is occasionally, and that seems to be happening less and less, even more so after this week’s non-announcements. Apple seems to have decided that WWDC is about developers, which means about software, and the only exception is if they are introducing software to complement new hardware. Otherwise they end up with a presentation to an audience of mainly geeks about how the new MacBook Pros will be available in six colours, so it’s better to leave that to a presentation aimed at consumers. Now, if they upgraded the Neural Engine and introduced a new API with lots of functionality, that would look good at WWDC.
