Here is an extremely interesting and I dare say a bit provocative paper from the University of Cambridge on the relationship between display resolutions and the resolving power of your eye. The link is to the phys.org summary of the paper. You may not need that 8K display…
Yes, I’d say it’s good to see some rigorous testing of what a lot of product reviewers claim about ever-increasing resolution levels. Any gamer who has fiddled with display options before the first level begins has experienced how little the top settings add to viewers’ enjoyment. I’m not sure I’d call it provocative, though: it’s been fairly obvious since 4K movie discs became available that 4K’s improvement over Blu-ray is highly dependent on screen size and viewing distance.
From the article:
“Our eyes are essentially sensors that aren’t all that great, but our brain processes that data into what it thinks we should be seeing.”
For eyes that are not “all that great”, the measured resolution is actually not far from the diffraction limit of a 5 mm pupil for 500 nm (0.5 micron) light. The diffraction limit for a pupil of diameter d at wavelength lambda is 1.22 lambda/d, which for these values evaluates to 0.12 milliradians. According to the article, for a monochrome image the eye resolves about 94 pixels per degree, i.e. an angular resolution of 1/(94*57.3) ≈ 0.19 milliradians (since there are ~57.3 degrees per radian).
I had always thought that a normal eye had a resolution of 1 milliradian. This result is about five times better. Not bad for a “jelly-based” optical system!
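The arithmetic above can be checked in a few lines. This is just a sketch of the comparison: the 5 mm pupil, 500 nm wavelength, and 94 pixels-per-degree figure are the values quoted in the discussion, not constants from the paper itself.

```python
import math

# Rayleigh diffraction limit: theta = 1.22 * lambda / d
wavelength = 500e-9   # 500 nm green light, in metres
pupil_d = 5e-3        # 5 mm pupil diameter, in metres
theta_diffraction = 1.22 * wavelength / pupil_d   # radians
print(f"Diffraction limit: {theta_diffraction * 1e3:.3f} mrad")   # ~0.122 mrad

# Angular size of one pixel at the ~94 pixels-per-degree figure
# reported for monochrome images.
ppd = 94
theta_eye = math.radians(1.0 / ppd)   # radians subtended by one pixel
print(f"Measured eye resolution: {theta_eye * 1e3:.3f} mrad")     # ~0.186 mrad

# Compare with the old 1-milliradian rule of thumb.
print(f"Better than 1 mrad rule of thumb by: {1e-3 / theta_eye:.1f}x")  # ~5.4x
```

So the measured eye is within a factor of ~1.5 of the diffraction limit, and roughly five times sharper than the 1 mrad rule of thumb.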