Solve Myopia in a Pinch with an iPhone

Originally published at: Solve Myopia in a Pinch with an iPhone - TidBITS

Faced with the need to find his glasses in a dark house, the nearsighted Adam Engst discovers that holding the iPhone in front of his face with the flash-enabled Camera app running lets him see perfectly.


Yes, I’ve done the same, but for reading tiny print. An iPhone can be very handy for reading microscopic Apple serial numbers printed in light gray on a silver background!


If only Apple would make some sort of an iPhone holder that would strap to your head so you could use this setup without arm strain, like iGoggles… :crazy_face:

Rather than Camera, I use the Magnifier app, which has appropriate options for aiding sight. For one thing, the zoom is much easier to manipulate than in the Camera app, and it’s straightforward to turn on the light. Moreover, you can set up a Control Center button for it (which is how I usually activate it). You can also activate it via the Accessibility Back Tap option (Settings > Accessibility > Touch > Back Tap) or the side-button triple-click (Settings > Accessibility > Accessibility Shortcut).


One of my first questions after the Vision Pro demos was about the need for snap-in prescription lenses.

I guess it’s not possible to fully correct the image in software? I was thinking of some sort of software that helps to auto-focus the AR image until you’re happy with it. Certainly it can magnify both the pass-through image and the interface as needed.

It would depend on the hardware. You can’t (obviously) alter the image on an OLED panel in a way that will correct the focus for someone with vision impairment.

But if there is some kind of system of active optics (e.g. lenses on a motorized mount, or other more advanced technologies), then the software can adjust the optics. But there will always be a limit to how much it can adjust. Beyond some point, additional (or replacement) lenses will be needed.

You don’t want to just magnify the image, since that would reduce the amount of content visible, and everything would still be fuzzy. You want optics that leave the image the same size while focusing it properly on your retina, which is a more difficult challenge.
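As a rough illustration of why magnification alone can’t fix the blur (an idealized thin-lens sketch, not a description of the Vision Pro’s actual optics): for a myope whose unaided far point is a distance $d$ (in meters) away, the correcting lens power is approximately

$$P_{\text{lens}} \approx -\frac{1}{d}$$

so a far point of 0.5 m calls for roughly a −2 diopter lens. Scaling the on-screen image changes only its angular size, not the apparent distance from which the light originates, so the blur on the retina stays the same. That’s why the correction has to happen in the optics, not in the pixels.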


This is one of the ways I use my iPhone every day…

My niece has low vision; she can only see things about two inches from her eyes. As she grew up, she became rebellious about having to use adaptive devices. I was talking to my sister about it when I suddenly recalled a photo workshop I saw in the 1970s: Photography for the Blind. They gave Polaroid instant cameras to people with low vision similar to my niece’s. The results were astonishing; the participants said it was the first time they were able to see the world, by taking a Polaroid photo and holding it close to their eyes. Until then, they could only see pictures in books, which hinted that the world contained more than they could see, but only through someone else’s vision.
I remembered that and suggested to my sister that she buy her daughter an iPhone so she could use the camera as an adaptive device. It was more successful than I could have imagined. She now takes hundreds of photos a day, and her Facebook page is a flood of them. She sees the world through her iPhone camera, and it has changed her outlook on life.


I never got how Apple, a supposedly WCAG accessibility-compliant company, produces power adapter information in light gray on white, or light gray on silver. I will grant that later models of MacBooks (Air/Pro) and iPads have darker laser etching, but it’s still at a point size illegible to anyone over 35 or optically challenged. When I worked for an Apple Authorized repair depot, I bought a Motorola USB barcode scanner that was actually a camera and could read ALL Apple barcodes and digital codes (it was a recommended Apple P/N as well). I still use it for entering retired Macs into a database, along with new Mac serials into Excel.
For viewing and submitting customer “accidental liquid damage” claims, I used a Carson “egg” USB microscope, which was practical at the time. Now I use a loupe (10x).
But mostly, the iPhone’s Magnifier feature works pretty well: three clicks on the SE’s side button, then a grab, then I can pinch to zoom in and read the screen printing. It seems like a lot of steps to read a serial number, when making it more legible can’t cost that much in black ink… :stuck_out_tongue:


Maybe 20 years ago, a friend bought a pair of eyeglasses with tunable focus; I think there was a transparent liquid inside that was reshaped as he changed the focal power. That made them very helpful for people with presbyopia, in which the eyes can’t change focus at different distances. He loved them, but the company went out of business, and for a while he was buying used spare parts to repair them. I wonder whether the iPhone effect could be redesigned to meet that need.

Like many people with high myopia, I eventually required cataract surgery after my natural lenses grew cloudy. There are a few options for that surgery, and my ophthalmologist recommended correcting my vision so that distant objects are clear, leaving me needing reading glasses for close-up work, including working on computers or reading on paper. I’m generally happy with that; it’s wonderful to be able to see the sky and distant objects clearly after decades of looking through a small lens. However, the surgery left me with floaters that bother me sometimes, particularly when working on screens with small type, and a need for different glasses for a desktop screen versus a book, because I view them from different distances. I work with a 27-inch desktop screen, which is fine as long as the type isn’t too small, but I can’t deal with a smartphone screen because its type is too small. Instead I use a “dumb” mobile phone, which I avoid as much as possible because I have to put on glasses to read anything on the thing, including the phone numbers. Frankly, I’m happier seeing the stars and the birds in the trees than a screen.

There are alternative lens options for cataract surgery that give you more depth of focus afterward, but they rely on lenses that split incoming light to focus at both near and far distances, which comes with some loss of clarity. I’m afraid I can’t describe it well, but Jerry Oltion wrote about the details for amateur astronomers in Sky & Telescope some years back; it’s well worth reading if you’re going in for cataract surgery.


This company (which I believe was bought by Apple) uses an iPhone to project an interface on the front transparent visor (a very cheap version of the Vision Pro :slight_smile: )


Interesting! Yes, it does look like Apple bought Mira.


The Magnifier app can be useful, but its image freezing doesn’t work properly. The image will be sharp before tapping the button but goes blurry when you tap it; whatever it’s doing to select a sharp frame is broken. If I need a capture I can read properly, just taking a photograph with Live Photos enabled works better. That lets me easily pick out a sharp frame and zoom in.