Two excellent resources about monitors are
in German, but their different Top 10 lists are well worth it; translate with your tool of choice
and
For everybody else reading, here’s a product-page link: ProArt Display PA278CV|Monitors|ASUS USA
Agreed. The specs look great, and ASUS is a brand known for quality. And $230 (discounted from $300) is a great price for a display promising a high degree of color accuracy.
8 posts were split to a new topic: Fuzzy text on a Studio Display in portrait orientation
On sale now at Amazon.
Speaking of Amazon sales.
The ASUS ProArt Display 27” 5K (PA27JCV) is now just $850.
Sometimes we forget how far we’ve come. Consider all the factors you didn’t need to mention:
I could go on and on…
Two relevant true-story anecdotes…
Mac vs. PC Plug-and-Play Displays
I said that displays may only be Mac vs. PC compatible, in relation to plug-and-play. Do you know what this is?
First issue is that long ago Mac and PCs used different connectors. Macs used a 15 pin DB-15, while the PC VGA connector was a 9 pin DE-9. (Please correct me if I get any of this wrong!)
And, long long ago, computer displays had fixed resolutions. Then came multisync displays, which could support a set of resolutions and frequencies. But the problem was, how did the computer know what the display supported, so that the computer wouldn't send a signal to the display that it couldn't handle?
Originally PCs couldn't. But Apple came up with a solution: they used sense codes on the DB-15 pins to signal information back to the Mac's graphics system. Later, PC monitors moved to a 15-pin VGA connector with a different VESA scheme for signaling.
OK, so you have a Mac, but you want to use a PC display. How do you do it? The answer was a little adapter dongle that converted the Mac's display connector to the PC monitor's. It also sent back the Mac's sense codes; I think they were set by switches on the dongle.
That’s what I got from Mac Zone when I ordered a NEC MultiSync XV17+ display to go with a Power Macintosh 8500 in 1996. The display was $800, the adapter was free.
But when I got the display, there was no green. Just red and blue. So Mac Zone cross-shipped me a replacement display, at their cost. That’s fantastic service!
…which made me feel real bad when I got the new display and it had the same problem: no green. That’s when I discovered the problem wasn’t the display, it was a defective adapter dongle!
Setting the Standard
When I first started in my job we had no PCs, just mainframe terminals. Eventually the company decided we should have PCs on our desks. This must have been around 1995, because as I recall, we were getting Dell OptiPlex with a 66 MHz i486. (The reason this took so long is another story).
My team was the first team to get PCs in our organization. So, we were setting the standards.
We received the Dell PCs along with 15" Dell displays. 15" seems small, but remember we were using terminals before, which I think had 14" or smaller displays.
When we got the shipment and set them up, we were aghast. The displays were awful! They were fuzzy.
We were so upset that we took a trip to a computer store in the city (Computer City?) and compared display quality. We then begged the management to spend more money to get NEC 3FGe MultiSync displays, which had much higher quality.
Then it was time for the other teams to get PCs. They said to the management: "We know that 15-inch NEC 3FGe MultiSync displays are the base-level standard, but can we get 17-inch?" And so they did.
Moral: don’t be the ones that set the low bar that everyone else gets to exceed.
I use 16:10 displays. I currently have an Eizo wide-gamut display (1920 x 1200) and an NEC EA242WU (NEC is now Sharp). I use them mostly for photo editing and calibrate them periodically. I mainly view what I shoot as high-quality desktop images, and 16:10 is a better fit than 16:9 for the aspect ratio of the dSLR images, though of course a worse fit for scanned images from 35mm film.
The NEC has USB-C, DisplayPort, and HDMI. It’s connected to an M1 Mini (USB-C), an M1 Studio (HDMI), and a MacBook Pro (DisplayPort to Mini DisplayPort). The Eizo is connected to several Macs via a KVM.
That’s $50 more than it had been until this month.
Personally, I miss vector displays.
Actually that does need a mention.
Colour Depth: "Millions" of colours (8 bits per colour) is the low-cost option and does show some banding. Better-quality displays have "billions" of colours (10 bits per colour). Photographers should be looking for 10 bits per colour.
I believe all Apple displays and all 5K 27" are 10-bit colour.
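The arithmetic behind the marketing terms is simple enough to sketch: levels per channel (two raised to the bit depth) cubed across the three RGB channels.

```python
# Rough arithmetic behind the marketing terms "millions" and "billions"
# of colours: levels per channel, raised to the power of three channels.

def total_colours(bits_per_channel: int) -> int:
    """Number of distinct colours for an RGB display."""
    levels = 2 ** bits_per_channel   # e.g. 256 levels at 8 bits
    return levels ** 3               # three channels: R, G, B

print(f"{total_colours(8):,}")    # 16,777,216 -> "millions"
print(f"{total_colours(10):,}")   # 1,073,741,824 -> "billions"
```

The jump from 8 to 10 bits also quadruples the number of steps in any single-channel gradient (256 to 1,024), which is what reduces visible banding.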
Chiming in again, because I was re-reading the thread and I always have something to say.
Most of the old Macs would use a DA-15 video connector. This is (coincidentally?) signal-compatible with VGA, but is clearly a different connector. The Apple IIGS used the same connector, but without any sense pins (the IIGS only supports one frequency).
See also: Macintosh Video to VGA adapter pinout and wiring @ old.pinouts.ru
(Trivia: The letter after the “D” is the size of the shell. Apple’s 15-pin video connector is in an “A” size shell, so it’s a “DA-15” connector. The smaller size shell used for 9-pin serial ports and old PC video standards is size “E”. So the 9-pin versions (serial, CGA and EGA video) are “DE-9” and the 15-pin high-density version used by VGA is “DE-15”. “DB” technically only refers to the 25-pin connector commonly used for serial, parallel and SCSI ports. See also Wikipedia: D-subminiature)
Unfortunately, you can't hook a normal VGA-type display to a IIGS. Even though there are adapters to make the connection, very few VGA displays will sync down to the 15 kHz horizontal scan frequency used by the IIGS (this, incidentally, is very close to the frequency used by composite video and analog TV broadcasts). Ironically, old multisync-type displays from the late '80s could, because they frequently connected to TV-type signals as well as computer signals.
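A quick back-of-envelope calculation shows where those frequencies come from, and why the gap is so hard for a VGA-era monitor to bridge (the exact refresh figures below are the standard NTSC and VGA values):

```python
# Horizontal scan frequency = total scan lines per frame (including
# blanking) x frames per second.

def h_scan_khz(total_lines: int, refresh_hz: float) -> float:
    return total_lines * refresh_hz / 1000.0

# NTSC television (and, roughly, the IIGS): 525 lines at ~29.97 Hz
print(round(h_scan_khz(525, 29.97), 2))   # ~15.73 kHz

# Standard VGA 640x480: 525 total lines at ~59.94 Hz progressive
print(round(h_scan_khz(525, 59.94), 2))   # ~31.47 kHz
```

A monitor designed to start at roughly 31 kHz would need to run its deflection circuits at half speed to show a TV-rate signal, which most simply couldn't do.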
PC video used a DE-9 connector for MDA, CGA and EGA displays. When they moved from digital to analog video, they packed 15 pins into the same size shell, hence the “DE-15” name for the connector used by VGA and its successors.
Correct. But most of the time, it didn’t matter. There were only a small number of standard PC frequencies (MDA, CGA, EGA, VGA and maybe one or two higher “super VGA” resolutions). Your computer would always boot to a text mode (720x400 on a standard VGA card), which every monitor can support. If you configure the computer for something that monitor can’t display, you won’t get a picture. In the worst case, you reboot to recover.
At one time, VGA had something resembling a sense pin. There was one pin that could tell a VGA card if the display is color or grayscale. IBM’s original display adapters supported this, but most others did not.
In the PC world, they adopted VESA standards for this. Specifically, DDC (repurposing VGA’s four “ID” pins) as a way for PCs to communicate with displays and EDID as a standard data format to represent display capabilities.
DDC/EDID still exists today, as a part of the DVI, HDMI and DisplayPort standards.
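For the curious, the start of an EDID block is easy to decode by hand. Per the VESA EDID specification, the block opens with a fixed 8-byte header, followed by a big-endian 16-bit manufacturer ID that packs three 5-bit letters (1 = "A"). A minimal sketch, using a fabricated sample block built around Dell's registered ID (0x10AC):

```python
# Decode the manufacturer ID from the first 10 bytes of an EDID block.
# Layout per the VESA EDID spec: 8-byte fixed header, then a big-endian
# 16-bit word holding three 5-bit letters (1 = 'A' .. 26 = 'Z').

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def manufacturer_id(edid: bytes) -> str:
    """Return the three-letter PNP manufacturer code from an EDID block."""
    if edid[:8] != EDID_HEADER:
        raise ValueError("not an EDID block")
    word = (edid[8] << 8) | edid[9]                      # big-endian 16-bit
    letters = [(word >> shift) & 0x1F for shift in (10, 5, 0)]
    return "".join(chr(ord("A") - 1 + n) for n in letters)

# Fabricated sample: header plus Dell's registered ID, 0x10AC
sample = EDID_HEADER + bytes([0x10, 0xAC])
print(manufacturer_id(sample))   # DEL
```

The rest of the 128-byte block carries checksummed structures for supported timings, which is exactly the information the old sense codes could only hint at.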
You found out (the hard way) about dot pitch.
Analog video is just that - analog. Although there are discrete scan lines, there is no such thing as a horizontal pixel or a discrete horizontal resolution. With a CRT, the electron beam sweeps from left to right in synchronization with the signal, and its intensity varies with the intensity of that signal. Horizontal “pixels” are an artifact of the fact that computers only change the intensity so many times per scan line, not an inherent property of the video signal.
With a monochrome display, this is great. The inner surface of the CRT is coated with a phosphor, and it glows when the beam hits it. So there is effectively unlimited horizontal resolution, subject only to the size of the spot produced by the beam (which is in turn a function of its intensity and its focusing circuitry).
But with a color display, that doesn’t work. There are three different color phosphors (red, green and blue) on the glass. There are three electron beams, to drive the three colors. And a physical object (a shadow mask or aperture grille) ensures that each beam can only hit its corresponding colored phosphors.
Note that the holes in the mask/grille are not actually pixels. The beam is still analog, varying the signal strength as it scans across a line, but it can only hit the phosphors that the mask/grille allows. When the beam isn’t aiming at a hole, it gets absorbed by the mask/grille and you don’t see anything.
The upshot of all this is that as the size of "pixels" (again, an artifact of the computer, not the video signal) gets smaller than the display's dot pitch, they start to get fuzzy.
A CRT with a dot-pitch of 0.25mm was considered very good for consumer equipment. If you wanted to spend a lot, you could get them as low as 0.21mm, but I don’t think anybody made CRTs with smaller dot pitches. I assume there were physical limits or manufacturing problems trying to make tubes with smaller dot-pitches than that.
0.25mm dot-pitch could cleanly render 1280x1024 on a 17" screen or 1600x1200 on a 20" screen. Smaller dot pitches (generally more expensive) would give you sharper images at the same resolutions. Larger dot-pitches cost less but would produce fuzzier images at the same resolutions.
Some of those early cheap displays had dot-pitches of 0.35mm or worse. These are effectively television-quality picture tubes mated with VGA interface circuitry. Those dot-pitches might be OK for something like a 25" screen at 1024x768 resolution (e.g. a conference room presentation display), but would be painful on a 15" desktop display.
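The geometry behind those rules of thumb is straightforward: divide the horizontal screen size by the horizontal resolution and compare against the dot pitch. A rough sketch for 4:3 CRTs (it assumes the quoted diagonal is fully usable, which real tubes never quite were):

```python
import math

# Horizontal size of one "pixel" on a 4:3 CRT, to compare with the
# tube's dot pitch.  Pixels narrower than the pitch come out fuzzy.

def pixel_size_mm(diagonal_in: float, h_pixels: int, aspect=(4, 3)) -> float:
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # horizontal screen size
    return width_in * 25.4 / h_pixels               # mm per pixel

print(round(pixel_size_mm(17, 1280), 3))  # ~0.270 mm: fits a 0.25 mm pitch
print(round(pixel_size_mm(20, 1600), 3))  # ~0.254 mm: just fits
print(round(pixel_size_mm(15, 1024), 3))  # ~0.298 mm: fuzzy on a 0.35 mm tube
```

This matches the examples above: 1280x1024 on a 17" tube and 1600x1200 on a 20" tube both leave pixels slightly wider than a 0.25 mm pitch, while a 0.35 mm television-grade tube can't resolve even 1024 pixels across a 15" screen.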