Impressions and Thoughts from Early Vision Pro Reviews

Weird. And complicated. What happens in a household with more than one person?

doug

At this point, like iOS and iPadOS, visionOS supports only a single user. So a household with more than one person would need to buy more than one Vision Pro. That said, I understand that the prescription inserts attach magnetically, so they're easily removable and could be swapped for other inserts if Apple ever sells the inserts separately.

2 Likes

It somehow seems strange that a physical insert is needed. Oh, one question: in my case I'm nearsighted and need glasses when reading my external monitor or walking outside. But if I'm reading a book on my iPhone, for example, I take the glasses off. In that case, do I need a prescription to use the device?

All of that info is here.

But I understand that the Vision Pro's optics are designed with a focal distance of something like 1.5 meters. I've heard a figure but can't recall the exact distance; regardless, you'd want correction for being able to see something that far away.
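
Treating 1.5 meters as an unconfirmed assumption, the arithmetic is simple, since optical power in diopters is just the reciprocal of the distance in meters:

```python
# Back-of-the-envelope check; the ~1.5 m figure is this thread's guess, not a
# confirmed Apple spec. Optical power in diopters is 1 / distance in meters.
focus_distance_m = 1.5
vergence_d = 1 / focus_distance_m
print(f"Virtual image at {focus_distance_m} m = {vergence_d:.2f} D of demand")
# A myope with more than ~0.67 D of uncorrected myopia has a far point closer
# than 1.5 m, so the displays would look blurry without corrective inserts.
```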

1 Like

I can't imagine myself adopting the Vision Pro anytime soon, but I do appreciate the direction it's going in. All the other VR stuff is either tied to a place (the Massachusetts Museum of Contemporary Art had two Laurie Anderson-designed virtual reality spaces that were truly wonderful, and I've had a VR Cardboard for a long time) or aimed at some weird entertainment arena (Meta goggles, etc.) that has no appeal to me. The potential of large workspaces, immersive collaboration, and AR/VR in situ looks promising technology-wise. Will Hollywood adopt the VP for production as well as content? I see a use for that. I wonder what these things will look like in 10 years! So, much thanks for the article wrap-up, and it's great that the new sponsor Art Authority is delivering art services for the VP. Although, do visit your local art gallery in person sometimes.

In software? I’m not sure how. People who need corrective lenses don’t need content to be bigger. They need the image to be distorted in a way that will produce a focused image on their retinas. Software can’t make an OLED screen do this. No amount of blurring the image can produce a sharper result after it passes through the eyes of someone needing corrective lenses.

To do this, you’d need to change the light path, which no flat panel screen can do. Maybe some future holographic system might be able to (or maybe not - I really don’t understand holograms), but for now, the only thing you can do is place corrective lenses between your eyes and the screen.

The AVP has (I think) some ability to mechanically adjust its optics to compensate for different face sizes and shapes, so the lenses align with your eyes. But it would take more than moving the existing lenses to compensate for people with poor vision.

You would need lenses with a variety of shapes. Correcting myopia (near-sightedness) requires concave lenses. Correcting hyperopia (far-sightedness) requires convex lenses. Astigmatism requires lenses with a non-uniform curve.

To do that in a single device without requiring you to swap lenses would require something akin to a phoropter, like what an optometrist uses to determine your prescription. Or maybe adaptive optics. Both would be very large, very expensive, and probably couldn't be used in a consumer device due to medical licensing regulations.

Hence the Apple design of allowing the insertion of corrective lenses, which would be used in addition to the rest of the lens assembly.

I suppose it would theoretically be possible to make the AVP larger so you can wear it over your regular glasses, but that would probably make it harder for the eye tracking and iris authentication systems to work reliably, since the image of your eyes would be distorted by the lenses. The presence of the frames in the field of the sensors might also create problems.

I am a bit surprised, however, that you can’t wear contact lenses while using it (at least Apple says you can’t). Maybe that would also mess up the eye tracking or iris-authentication system, but I’d love to understand how and why.

I suspect that over time, you’ll find the lens mounting rings available to opticians in much the same way they sell eyeglass frames. You’ll then buy the rings from an optician who will fit prescription lenses to them.

Probably a bit trickier than eyeglasses, because certain measurements (e.g., the position of your pupil relative to the frame) will likely be different, since the AVP adjusts the position of some internal parts to fit your face. But I see no technical reason why an optician shouldn't be able to fit their own lenses into the mounting rings where the Zeiss lenses go today.

2 Likes

That's all very interesting. Using the device sounds like an extreme nuisance for contact lens wearers, though. Imagine having to take out and put in your contacts multiple times a day, whenever you take off or put on the device!

I think Apple needs to find another solution. Do other VR headsets have the same issue?

The link to the Apple document that Doug Miller posted earlier (HT213965) has more detail about contact lenses. Some types are ok.

  • If you use single vision soft prescription contact lenses, you can use Apple Vision Pro without ZEISS Optical Inserts — Prescription.

  • If you use hard contact lenses, it might impact your experience with Apple Vision Pro. If you experience difficulty with eye tracking, and your eye care provider has indicated that eyeglasses with a comprehensive prescription are an option for you, then you may consider ordering ZEISS Optical Inserts. Otherwise, you may use an alternate form of input such as Pointer Control.
  • Cosmetic contact lenses are not compatible with Apple Vision Pro and should be removed before using the device.
  • If you typically use reading glasses in conjunction with your prescription contact lenses, you may benefit from using ZEISS Optical Inserts — Readers in conjunction with your contact lenses.
1 Like

For issues that may affect wearers of hard contact lenses, see Mark Z’s initial report:

I suspect some of the issues are caused by reflections off the surface of the lenses that do not occur for folks with soft contacts or no contacts at all.

BTW, here’s a quick update on my hard contact lens situation.

Over the weekend I switched to the thicker face pad (the AVP comes with two thicknesses) because the device kept warning me that my eyes were “too close,” which could result in injury if I fell down while wearing the AVP. At first I thought the thicker pad was much worse – everything seemed blurry – but later I tried it again and it was fine, and the warnings stopped. On a whim, I redid the eye setup several times, wondering if the thicker pad and moving my eyes farther away might make the system work better. But no: I still failed the eye test, and it didn't seem to improve the tracking. I have kept the thicker pad on, though.

Since then I've experimented with a lot of the accessibility options. I finally tried the “wrist” pointer just to see what it is, and it's bizarre: you get a sort of clear 3D light pole emanating from your hand that you can use to point at stuff (it reminds me of those clear plastic rods on window blinds, except infinitely long). The pointer makes it much easier to see what you're pointing at but is very distracting.

I switched to head tracking with “pointer control” on, which displays a dot that moves around the screen as you move your head. This works fairly well and is at least more noticeable, so you can see what you're doing. One of the biggest frustrations of the eye tracking is that it's really hard to tell what's going on when it's not working. With the head pointer, you can at least move your head until you find the pointer (the equivalent of jiggling the mouse until you locate the cursor on the screen) and then use it to point at what you want to control.

However, I discovered three key problems with the head tracking:

  • It's possible to end up with a window or control in a weird place where your head has trouble directing the cursor. For example, in VP the window close button is at the very bottom. If a tall window appears right in front of me, sometimes its close button is way down south. My chin hits my chest and I literally can't look any lower – yet I can't reach the button to close the window! Sometimes you can do some gyrations to make this work, or you can press the Digital Crown to reset the view (which re-centers all the windows) and may make them easier to reach. You can also use Siri to close the app. It's a bit of a chore.

  • Pointer control is buggy. Sometimes the pointer just vanishes and you're in the dark again, unsure what you're pointing at. It's like using a Mac with an invisible mouse cursor: you can click on things blindly, but it's really hard to tell what you're doing. And when you can't see the pointer, getting back into Settings to reset Pointer Control is a challenge.

  • When I put on the VP this morning, Pointer Control was turned off even though I left it on yesterday. I don't know if that's normal or a bug, but it's a problem. If you can't use eye tracking at all, there's no way to turn head tracking back on without using your eyes! (Fortunately, I can use some eye tracking; it's just not super-accurate or consistent.) And it's not like you can hand an AVP to someone else to have them set it up for you like you could an iPad.

All in all, the accessibility features on the AVP are awesome, and I'm glad and impressed that they're included, but they are definitely buggy. Since the device was such a secret even within Apple, I imagine the number of disabled testers was tiny, if there were any at all.

Finally, the best news of all: today I hooked up a Magic Trackpad from one of my Macs. The AVP supports only the newer version of the Magic Trackpad, the one that charges via Lightning and doesn't use replaceable batteries. Fortunately, I had one of those. It showed up right within Settings under Bluetooth, and I activated it.

It's wonderful! Not perfect, but it works great 99% of the time. It gives you a pointer you can use to move around the entire AVP interface, and it supports scrolling, long clicks (which are like Control-clicking on the Mac and bring up things like word definitions if you have a word highlighted), and other gestures.

The main issue I've found is that it seems to be a little too “window-based”: it doesn't give you a free-roaming cursor that goes over the entire display but magically jumps from window to window. In other words, it appears only within the active window. I was testing in one app, the PGA Tour golf app, which has a sidebar of icons you pick to enter different modes. That floating sidebar seems to be a separate window: sometimes when I got my cursor right on the edge of the main window, it wouldn't “jump” to the sidebar window, so I couldn't select those icons. Other times it jumped easily. Perhaps it needs momentum. It was just a little disconcerting, and perhaps it's something I simply need to learn how to use properly.

That same issue can be a problem for accessing Control Center. Normally you look up north and see the Control Center icon floating there. But again, you can't move your cursor there with the trackpad until that control is highlighted (active). I was able to do it with my gaze, and then the trackpad click worked, but it was a little strange. (It's so annoying that you can't use Siri to open Control Center. Siri just says, “You don't have an app named ‘Control Center.’ Do you want to search the App Store?” Grrr. There really needs to be another way to open it.)

Overall, though, trackpad control is terrific, and it's a great solution for someone having trouble with the eye tracking. It does mean you have another device you have to include as part of the AVP, though. The promise of needing no controller beyond your hands and eyes is defeated – though in my case that's really down to a physical defect, since it's my eyes that are the problem. I could see this being an issue if I wanted to sit on my deck with the AVP on and do some work outside this summer: I'd have to have my laptop nearby to pair with it and keep the trackpad with me. That might not be a problem. I won't know what a hassle it is until I try it.

I still have more things I want to test and try, but I'm learning and making progress. The hard contact lens issue definitely caught me by surprise (I didn't learn about it until pre-ordering), but it's not the end of the world.

3 Likes

Thank you for your very complete descriptions of using the Vision Pro with rigid gas-permeable contacts. I fully agree with the poster who said that hard contacts present multiple reflecting surfaces that confuse eye tracking: whenever there is an abrupt change in index of refraction, there is a reflection from the interface between the two materials. I guess soft lenses adhere better to the cornea and are less of a problem.

I am struck that Apple, having two very successful, time-tested pointing systems (mouse and trackpad) for manipulating objects on a relatively stationary and large canvas, seems to need to invent two more: a touch-oriented interface for iPads and a gaze-oriented interface for the Vision Pro. In my opinion, a mouse is a far superior pointing device to touch for an iPad of any size; touch, I grant, is necessary on the small screen of an iPhone. Similarly, I don't see why a trackpad (or mouse) couldn't be the primary user interface on the Vision Pro. It would be less portable, but we have already seen that using the device while moving is not a very good idea.

1 Like

The unspoken assumption here is that you're using an iPad while sitting in front of a flat surface a mouse can move on. I rarely use an iPad that way, and the places I do use one would be literally impossible for a mouse (an armchair, public transport, in the car, etc.). Given that, for me, the touch interface is not only superior, it makes the iPad possible in a way it wouldn't be with a mouse.

2 Likes

Sorry, I should have been clear that almost all productive work on an iPad is done with something like a Magic Keyboard, with the iPad mounted vertically: sort of a laptop replacement where one would use the trackpad (on the Magic Keyboard). Using the touch interface in this mode is very tiring and can lead to “gorilla arm,” something Jobs mentioned. Studies have shown that the ergonomics of touch are poor in this configuration.

In my experience, it is hard to do productive work with the iPad flat on a surface (or one's lap), though one can do certain things like access the web, read email, etc. in that configuration. For one thing, one needs to use the software keyboard, often a frustrating experience. Again, this is my experience; others might do better.

1 Like

iFixit’s second teardown video explains the Vision Pro’s astonishing resolution (3386 ppi!), and why it results in a virtual Mac screen that isn’t as good as a real one.

1 Like

It really depends on what kind of “productive work” you’re talking about.

My daughter used her iPad Pro extensively while in college, as an electronic sketchbook (she was a theater major) for costume, makeup, hair and set design work. She did most of her work with the iPad lying flat, doing her drawing with an Apple Pencil, mimicking what artists have traditionally used a pad of paper for.

But if you’re editing Office documents, well that’s a radically different use-case.

Yes, I should have excluded graphics work, where with the Pencil one can do very well placing the iPad on a flat surface. Of course, entering text in the drawings relies on the on-screen keyboard, and all other interactions use the touch interface, which can be difficult if the touch targets are too close together. When I do CAD, I much prefer to use a mouse on a Mac than a Pencil on an iPad. Tastes vary.

On another topic, kudos to iFixit for using the only useful measure of graphics resolution: pixels per degree. Angular resolution is what matters: if I use reading glasses, I can place my face 2 feet from my 24" 4K computer monitor and get the same angular resolution as my 55" 4K OLED TV at 4.6 feet. The apparent resolution and apparent screen size, as well as the image quality, will be the same, everything else being equal.
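
If you want to check that with actual numbers, here's a quick sketch (the ppd helper is just standard field-of-view trigonometry; the sizes and distances are from my example above):

```python
import math

def ppd(h_pixels, diag_in, dist_ft, aspect=(16, 9)):
    """Average horizontal pixels per degree for a flat screen."""
    width_in = diag_in * aspect[0] / math.hypot(*aspect)  # screen width
    fov_deg = 2 * math.degrees(math.atan(width_in / (2 * dist_ft * 12)))
    return h_pixels / fov_deg

print(ppd(3840, 24, 2.0))  # 24" 4K monitor at 2 feet   -> ~82 ppd
print(ppd(3840, 55, 4.6))  # 55" 4K OLED TV at 4.6 feet -> ~82 ppd
```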

I want to scream when I hear reviewers talk about having a 10-foot-wide Mac screen when the angular resolution is the same as that of a small screen viewed more closely. For similar apparent screen sizes, the angular resolution of the AVP is about one half that of a Studio Display, which has 5K pixels across compared to the ~2.5K on the virtual Mac screen used in the AVP. The Studio Display will look much sharper, like going from a non-Retina to a Retina display. (Features will have a similar size on both screens, since the Studio Display uses ~2.5K points, which determine the size of on-screen objects.) Again, the iFixit article explains this beautifully.
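
The “about one half” falls straight out of the pixel counts, since the apparent sizes, and hence the fields of view, are similar; the 60° figure below is only a placeholder to make the division concrete:

```python
# With similar apparent screen sizes (similar fields of view), pixels per
# degree scales directly with horizontal pixel count. Round numbers:
fov_deg = 60.0                   # placeholder shared apparent field of view
studio_ppd = 5120 / fov_deg      # ~85 ppd for the 5K Studio Display
virtual_ppd = 2560 / fov_deg     # ~43 ppd for the ~2.5K virtual Mac screen
print(virtual_ppd / studio_ppd)  # 0.5 -> about half the angular resolution
```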

1 Like

I don't disagree, but we have to acknowledge that it's a lot less ambiguous to specify angular rather than spatial resolution on a device like the AVP, where there's a standardized distance from eye to pixel. On a Mac or iPhone, that's a whole lot harder to get right, since every user has their own preference and perhaps even different uses.

I like to sit up real close to the 27" display hooked up to my main work Mac (reducing its angular resolution), but I know every ergonomist says that's a bad thing, and sure enough, plenty of Mac users sit much farther from their 27" displays (increasing the apparent angular resolution of that very same display). Similarly with the iPhone: in bed I'll have it 4 inches from my face to read a book, but watching a video on transit I'll hold it over a foot from my eyes. Now which angular resolution would I say is the iPhone's? Spatial resolution gets around that ambiguity.
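
To make that ambiguity concrete, here's a quick sketch; the 460 ppi is an assumed, typical recent-iPhone figure, not a quoted spec:

```python
import math

# Same iPhone panel, two of my viewing distances. 460 ppi is an assumed,
# typical recent-iPhone density, not a quoted spec.
ppi = 460
for dist_in in (4, 12):
    # One pixel (1/ppi inches wide) subtends atan(1 / (ppi * d)) at distance d.
    ppd = 1 / math.degrees(math.atan(1 / (ppi * dist_in)))
    print(f"{dist_in} inches away: {ppd:.0f} pixels per degree")
```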

Oh, and BTW, the Studio Display has 5K pixels horizontally, not across. Across (diagonally) it's almost 6K (5874). My understanding is that the AVP has a WUHD display for each eye, giving it the same horizontal pixel count per eye as 5K but only 2160 pixels vertically (basically like a wide 4K screen), so 5557 diagonally. My understanding is further that the largest “virtual Mac display” the AVP can show is 4K (3840×2160), which is 4405 across.
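
For anyone who wants to check those diagonals, they're just Pythagoras on the pixel counts (with the resolutions being my understanding, as I said):

```python
import math

# The diagonals quoted above are just the hypotenuse of the pixel counts.
# Panel resolutions are my understanding, not confirmed Apple specs.
panels = {
    "Studio Display (5K)": (5120, 2880),
    "AVP per eye (WUHD)": (5120, 2160),
    "4K virtual Mac display": (3840, 2160),
}
for name, (w, h) in panels.items():
    print(f"{name}: {math.hypot(w, h):.0f} pixels diagonally")
# -> 5874, 5557, and 4406 (the 4405 above, give or take rounding)
```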

Sorry about the ambiguity of the word “across”. I meant horizontally. I think I would have said “diagonally” if I meant that.

I still hold that there is no problem with using angular resolution in all situations. Two devices with the same pixel count and the same angular resolution will subtend exactly the same angle at your eyes and produce identical images on the retina if the content is the same. Neglecting my eyes' inability to focus close up, an iPhone 5 inches from my eyes could look exactly the same as a TV 5 feet away if the angular resolutions are the same. Angular resolution is the sole determinant of perceived resolution, no matter what device is used or how far away it is.
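
The geometry behind that is just similar triangles; here's a quick sketch with an arbitrary 4.3-inch screen width:

```python
import math

def subtended_deg(width_in, dist_in):
    """Angle in degrees that a screen of a given width subtends at the eye."""
    return 2 * math.degrees(math.atan(width_in / (2 * dist_in)))

# Scale size and distance by the same factor (12x here, from a made-up
# 4.3-inch-wide phone screen) and the subtended angle is identical, so equal
# pixel counts then put identical images on the retina:
print(subtended_deg(4.3, 5))            # phone-sized screen, 5 inches away
print(subtended_deg(4.3 * 12, 5 * 12))  # TV-sized screen, 5 feet away
```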

Maybe we are saying the same thing.

1 Like

Jason Snell has now posted his review, tying it all to the time when we bought computers not so much because they were useful but because “they were the future.”

To an extent, that’s a fair comparison—I remember my parents buying my first computer (a Franklin ACE 1000 that was an Apple ][ clone) without knowing exactly what we’d do with it.

But before I migrated to an Atari 1040ST to go to Cornell, I used that ACE 1000 for a bunch of word processing (every paper I wrote after that in high school), spreadsheet stuff (chemistry and physics lab work, mostly), rudimentary programming with BASIC, and of course games. I also learned a lot about hardware sysadmin tasks in terms of what you could get away with doing to computers, disks, and peripherals. My parents used it too for basic word processing and spreadsheet things—I can still hear the screech of the Epson LX-80 printer we got, which had replaced a daisywheel printer that let me down by breaking its S character when I had to print my college application essay.

So while I'm sure the comparison might have been apt for him and others, for me computers were both the future AND had immediate real-world utility that couldn't be achieved in any other way. And that experience gave me background that informed everything I did in college and professionally after that.

In that sense, the Vision Pro might be a taste of the future, but I have trouble seeing how it will teach us anything about that future that is important to learn now. Or, to put it another way, I wouldn’t buy one for a 14-year-old to make sure they were exposed to the future early enough.

5 Likes

I think a key difference is that back then we didn’t know what the future was going to be. Now, for this tech, we’ve a pretty good rough idea.