Vision Pro First Impressions

So… I got my Vision Pro about 5 hours ago. I’ve been using it almost constantly since then (with a brief break to walk the dog and clear my head). I thought I’d share my first impressions.

First, and most important: if you’ll recall, when I ordered it there was a previously unknown warning about incompatibility with hard contacts. Exactly what I wear. I ordered anyway, hoping VP would work. Well… there are problems.

During initial setup I failed the “eye setup” several times. This is where you gaze at a series of dots in a circle and “tap” your fingers to select each dot in order to calibrate the eye tracking. At first I was confused about which dot I was supposed to be looking at – the “highlight” on the stared-at dot is rather subtle. Another issue I had was staring at things way off in the corners. I’ve noticed this in real life with my hard contacts, too – giving someone the “side eye” is a bit painful. The contacts don’t feel comfortable when I move my eyeballs to the edges of my vision. That has never been a problem – I just turn my head instead of my eyes. But that doesn’t work with VP.

After a few eye setup failures, the system gave me a dialog that was essentially “go on anyway, or quit and get prescription lens inserts.” I chose to go on. The rest of the setup was quite painful. During the Persona setup, for example, it displays a series of screens for each step. I couldn’t get to the next screen because I couldn’t click the “Next” button! It just couldn’t track my eyes well enough to tell I was looking at that button and highlight it. This meant a two-minute process took me something like 20 minutes – I somehow finally managed to get my gaze to register, got past those screens, and continued into the main interface.

Once there, naturally, there were more problems. Stuff reviewers reported as utterly natural and easy – looking at things and tapping your fingers – was challenging. Part of the problem was that I had real trouble telling whether what I was looking at was highlighted or not. The animation is subtle. Another factor could be that since the system has trouble tracking my eyes, it isn’t registering a single point of gaze. When I later found an Accessibility setting to turn on the “eye pointer,” it actually shows a dot – similar to the trackpad pointer dot on iPad – except this one was jumping and vibrating like something insane. I don’t know if it stays steady for a typical user, keeping a button’s highlight lit, but for me the buttons flash so briefly that I have to tap my fingers at just the right moment or it doesn’t register that I wanted to click that item. A bit maddening, to say the least.

All that said, I found that in Settings you can redo the eye tracking setup. I did, and did much better after some practice, but I still failed. When I continued, however, the eye tracking was much better. Still not perfect, but that’s me and my eyes. I was also able to turn on some of the Accessibility settings, such as Voice Control, which lets you use your voice to, say, push a particular button. I don’t have to do that often, but occasionally, when I just can’t get a button to click with my eyes, it’s a lifesaver.

Apparently I can also use a Bluetooth keyboard or game controller and maybe even a trackpad for the VP interface if I want (I haven’t tried that yet). That might be helpful. I’ll have to experiment.

The good news is that with practice I got much better. It’s a matter of learning how the eye tracking works for me. I found it works great straight ahead, so if I turn my head and look directly at a user interface element, I can control it probably 98% of the time. Occasionally there’s a tricky one that just doesn’t seem to want to activate when I stare at it, but that could still be user error on my part.

One piece of good news: Optic ID seems to work fine for me. You stare straight ahead at the picture of the sensor on the screen and it works.

With that giant caveat out of the way – and keep in mind most people probably aren’t in my position with hard lenses trying something against Apple’s recommendation – here are some general thoughts and first impression experiences that I haven’t seen in reviews.

  • I do see what people mean about the passthrough video being a little unusual. It’s really great straight ahead, but blurry off to the sides, so if you turn your head quickly, you’ll get a glimpse of that pixelated video of the real world, which feels odd. On the one hand the overall view is stunningly crisp and sharp – I was amazed – but there is a bit of an oddity to it at times. It’s not bad: just different.

    Also, I can totally see and read my phone and watch with the headset on. That’s awesome. The phone view can get a little distorted depending on the angle, but it’s fully readable. Face ID won’t work with the headset on, but you can either type your passcode or briefly raise the visor so Face ID can see your eyes, and then it’s good. (Apple really should let my watch unlock the phone when I have the VP on.)

  • The overlaid user interface elements are amazing. They are so detailed, sharp, and highly readable that I really am stunned. Everything looks fantastic. Text is completely readable. I actually made some text smaller because it was just too big, and I could read it fine much smaller.

  • The microphones on VP are incredibly sensitive. You can whisper and it will hear you perfectly. While I was having vision issues trying to control stuff, I’d find myself accidentally in a typing field muttering “What the-- I don’t want that!” very softly, and it would dutifully type exactly that into the URL or text field. 😂 I think it’s actually a good thing that it can hear so well: if you were using this with another person around, or in an office, you wouldn’t have to talk loudly to control the device. You could talk softly enough that a person in a cubicle right next to you wouldn’t hear, but VP would. Very nice.

    Related to that, Siri seems to work really well with VP and is my preferred way to launch apps (way better than scrolling through screens of icons).

  • Immersive environments are unbelievable. The first one it put me in was a snow field. I could actually hear a soft wind blowing that shifted from one ear to the other, so it felt like the wind was swirling all around me. I felt cold. Just amazing. When I chose the dark view, it changed to the same scene at sunset, with a dark sky and red highlights on the distant mountains. Gorgeous.

  • The best thing Apple has done is anchoring virtual elements to a fixed place in the real world. It keeps everything stable and makes a screen or app feel like it’s really there. The one bad thing I discovered is that apps don’t seem to remember their locations – I’d get Messages set to one side where I liked it, and the next time I launched it, it was in a different position and I had to hassle with putting it back where I wanted it. I’m not sure if that’s a bug or what, but hopefully that will change. (Developer-minded readers: see the anchoring sketch after this list.)

  • Watching a movie or show in a virtual theatre is amazing. In the Max app I chose the Game of Thrones throne room and had a 100-ft screen in front of me in a grand hall with flickering torches burning on some of the walls. Very impressive. The speakers are fantastic.

    However, I also realized that I’m a fidgeter, and probably 80% of the media I watch, I watch while also doing other things – checking my phone, playing Words With Friends on my iPad, doing things on my MacBook, etc. I can’t do any of those with the headset on. I suppose I could do them within the VP environment, but that’s a very different experience, and I’m not quite sure it’s a proper substitute. It’s early days, but my first impression is that I’d probably save VP video for “special” content where I really want to focus on it – serious movies and deep TV shows, not silly sitcoms or action flicks that don’t demand my full attention.

  • I tried a FaceTime call with my 83-year-old mom to see how my avatar (sorry, Persona) looked. She thought it was me. I had a real hard time convincing her it was a drawing! I finally texted her a picture of me with the goggles on, and then she understood. I think. FaceTime doesn’t show me what my Persona looks like, so I have no idea except that during creation it looked very impressive. I didn’t get any kind of “uncanny valley” feeling. I thought it looked better than me in real life! 😂

  • Really impressed by all the Accessibility features Apple’s included. The whole OS is impressive. When iPhone came out it was relatively bare bones – nice, but not that deep. This feels like iPhone generation 3 or 4. Still limited compared to today’s other platforms, but the starting point is much further along.

  • I did experience one crash – the whole unit just rebooted – and a few bugs. I haven’t updated to the latest visionOS yet. I’ll do that soon.

  • No real issues with the weight of the goggles on my face. A few times I needed to adjust it, and sometimes VP will nag you and suggest you pull it up or down when it’s having trouble seeing your eyes. It will also do that when it needs to do Optic ID.

  • Battery life is definitely limited, though I was doing setup and testing things and it may not have had a full battery when it arrived. I did notice that it got down to 38% and when I plugged it into power and used VP, it stayed there for the next hour while I watched a TV show. In other words, charging and using VP at the same time just maintains the battery level. When I took off the headset for a bit, the battery did get up to 50%.

  • Couldn’t connect my Mac to VP. It doesn’t show up. Turns out, it requires Sonoma. Duh. Hadn’t thought of that. I guess that gives me an excuse to update. Maybe this weekend. I hope that doesn’t screw up my Mac.
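
For the developer-minded, a note on that window-position gripe above: as far as I can tell, window placement in the shared space is managed by the system, so app developers probably can’t fix the forgetting problem themselves. But an app running in a full (immersive) space can pin its own content with ARKit world anchors, which the system does persist across launches. Here’s a minimal, untested sketch of what that might look like – the class and its bookkeeping are hypothetical; only the ARKit types and calls are real:

```swift
import ARKit

/// Hypothetical sketch: how a visionOS app in a full space might pin
/// content to a fixed real-world spot and find it again on a later
/// launch. WorldAnchors are persisted by the system; mapping anchors
/// back to content is the app's own bookkeeping.
final class PlacementStore {
    private let session = ARKitSession()
    private let worldTracking = WorldTrackingProvider()

    func start() async throws {
        // World tracking only runs while the app is in an immersive space.
        try await session.run([worldTracking])
    }

    /// Pin a piece of content at a world-space transform.
    func pin(contentID: String, at transform: simd_float4x4) async throws {
        let anchor = WorldAnchor(originFromAnchorTransform: transform)
        try await worldTracking.addAnchor(anchor)
        // Remember which anchor belongs to which content.
        UserDefaults.standard.set(anchor.id.uuidString, forKey: contentID)
    }

    /// Watch for anchors the system restores on later launches.
    func restore() async {
        for await update in worldTracking.anchorUpdates where update.event == .added {
            // update.anchor.originFromAnchorTransform is the saved pose;
            // look up which content it belonged to and re-place it there.
            print("Restored anchor \(update.anchor.id)")
        }
    }
}
```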

Conclusions

It’s still very early in my usage, so I have no idea yet what I really think. On the one hand, the vision issues could create enough friction that I decide this isn’t worth it to me. I have yet to test out the Mac usage, which is critical for using this for work. (And yes, I’m already bummed I can’t have more than one big Mac screen. Hopefully that’s something Apple adds. I use a laptop, not a desktop, and the idea of multiple big screens was a big draw for me.)

The tech is damned impressive. Jaw dropping. It feels like there’s so much potential.

But beyond some flashy demos and niche use cases, will this make me more productive? Too soon to tell. Maybe it’s just a fun entertainment device. Seems expensive for that, but people do spend this much on big TVs. If VP developers come out with some amazing capabilities and killer features, that could sell this.

The key is going to be just like with the watch – will I miss it if I don’t have it?

I will admit after taking it off after a few hours and wandering around reality I felt a little strange. Not quite “I want to go back in” but more like how you feel after an intense few hours in a conference session or meeting. That could just be because everything was so new and overwhelming, but I don’t see this as being a device you’d live in all day long. Maybe it’s something you use for specific things or at certain times.

I will try not to stay up all night playing with it!


Thanks so much for this, Marc. Very interesting, and some detail I hadn’t heard before.

I was able to participate in the in-store demo on Friday morning. To my surprise, when I arrived at the Emeryville, CA store about 10:15 am, there was no crowd, and I was able to schedule a demo for 11 am. They measured my prescription glasses in their equipment and provided some ‘close-enough’ inserts for the demo Vision Pro. Although the initial view was a bit fuzzy, the focus seemed perfect after a quick calibration (including looking at a QR code customized for the lens pair). I was guided through the rest of the calibration and the various demo sections by a store employee who could see what I was viewing and selecting on a linked iPad. I think the demo length was about 20 minutes. In it, we worked with navigation and object placement, saw examples of various image types and screens, and viewed a short, supposedly exclusive movie showing clips of multiple types of footage from sports, nature documentaries, etc. Ultimately, the guide produced a spec for the appropriate sizing should I wish to order the Vision Pro.

I think it’s worth scheduling a demo if an Apple Store is convenient for you, so that you can experience the device. It’s probably also worth going through the guided demo rather than just figuring out the process yourself.

Folks with hard contacts should seriously consider the trade-off between removing the contacts for the session and using inserts versus dealing with the issues Mark Z faced.
A few notes:

  1. The store opened at 8 am, and I was surprised by the lack of a crowd.
  2. For this weekend, demos cannot be scheduled online; you must visit a store and schedule an appointment later. However, starting Monday, you should be able to schedule a demo via the Apple Store app or website.
  3. According to a Reddit poster, the clip show is available on the Apple TV app for the Vision Pro.

Great post.

In case you didn’t know, that’s a thing; my watch unlocks my phone if I have a mask on, or if I look away just as my phone is starting to unlock. You get a prominent noise/haptic on your wrist when it happens.

On the phone: Settings > Face ID & Passcode, then scroll down to the Unlock with Apple Watch setting. (Of course, it’s possible that you knew this and for some reason Apple deliberately designed this setting not to work with the Vision Pro on.)

Thanks for the reminder. I thought I had that setting on, but I only had the “unlock with mask” setting on. I also thought it still needed to see your eyes.

I turned it on and I’m having mixed results. I’m not sure why – maybe the EyeSight screen confuses Face ID and makes it think you don’t have a mask on (it can see eyes) but the eyes are “wrong” and it won’t unlock?

Sometimes it unlocks with the watch when I’m wearing VP, but only about 30% of the time. I haven’t figured out if it depends on how I’m holding or looking at the phone. Obviously there’s a way that makes it work, but looking at it like regular Face ID does not. I think it needs to see some part of your face and then the watch does the rest; I just haven’t figured out the right way yet.

So, promising – but Apple could make it easier by knowing when I’m wearing VP and unlocking my phone (even using Optic ID in the goggles).


Wow. Did some more exploring. Looked at panoramas in my photo library. Hard to describe — you really have to experience it. It’s even better than in real life. Basically the picture wraps all around you. It’s like being in a planetarium with a 360° mural around you, above you, and below you.

For example, I looked at a lake picture I took down in the Medford, Oregon, area. It feels like I could jump right into the lake before me. The mountains tower in the distance; there are people and boats in the water a hundred yards away on both sides. Rocks on the shoreline near me, below me.

I looked at a few I took at the Oregon coast and I was right on the beach. I had to check my slippers for sand. Just insane.

Then I did something I hadn’t tried yet. I recorded a spatial video – just me calling my dog, Charlie, over to me. I played it back. OMG! Freakiest thing ever. I swear to you that dog was 3D, coming out at me. In real life he was still at my feet, so in my view I saw two dogs: the one in the video playing back and the real dog sitting on the floor. The video dog was coming toward me. I reached out my hand to pet him. I literally could not tell which was which. They both looked equally real. Then my hand passed through the video dog. It was a recording. I got chills.

I’ve heard people who’ve tested this device lamenting that they didn’t record spatial videos of their kids and pets when they could, because you’re going to want those once they’re gone. It’s like a hologram of them right in front of you, as though they’re still alive. Just astonishing. Would make you cry for sure.

Jaw on the floor.

Mac Update

So I upgraded my travel Mac (M1 Air) to Sonoma this morning (took a few hours to download and install). I figured that was safer than my main Mac. But so far so good. Even 1Password 7 seems to be working (though I haven’t tested it logging me into websites yet).

The best part is I got the Air to show up in Vision Pro! I am actually typing this post right now on my MacBook’s keyboard, with the 13" screen seemingly 8’ x 15’ – about twice the size of my 75" TV. Text is pretty readable, but with the screen this big the edges are blurry (it gets sharper when you focus on an area). But I can still zoom my Mac’s screen (I do that a lot anyway) and it’s gorgeous huge in front of me. These letters are probably 5" tall as I type them, about 8’ away.

Seems to work really well. Even Spaces and Mission Control work fine. I can’t figure out how to copy/paste between the two systems, as that wasn’t working. I don’t know if that’s not possible or just typical Handoff weirdness. But I love having my Mac inside the headset! I’m not sure it’s really a productivity win – it’s not that much different or better (just bigger) – but it could depend on what I’m doing and whatever else I’m running in VP. It certainly will be helpful for writing a review of VP, since I don’t have to switch environments to test things in VP and write about them.

Minority Report

Just discovered something new and amazing! No idea why no reviewers have mentioned this. The virtual floating keyboard VP puts up for typing lets you “air type.” I actually find it quite usable. Not for long text, but great for an email address or something that’s hard to enter with voice dictation. Not bad at all. I love being able to poke the “x” close button to dismiss the keyboard.

Because of my eye trouble, I’d been wanting to poke at apps with my fingers, and that didn’t work. Or does it? It does!!!

It turns out, if you bring an app close enough, you can touch it! I have Photos open to my library about 12" to my right. It’s stunning. I can use my hands just like in Minority Report: swipe to scroll, tap to open and close, etc. Everything works like magic. The only gesture that’s a little different is zooming a photo: you “pinch” with both hands and move them in and out to resize the photo. (It makes sense to use two hands, because these windows are way too big for two fingers to work for zooming like on iPhone.)

For me this type of interface is way easier than using the eye tracking. Very cool. Will experiment with it more.
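
For any developers curious how that works under the hood, I gather these direct-touch interactions arrive as ordinary SwiftUI gestures. Here’s a rough, untested sketch of how an app might handle that two-handed zoom – the view and its names are made up for illustration; MagnifyGesture is the real SwiftUI API:

```swift
import SwiftUI

/// Hypothetical sketch: a photo view that responds to the two-handed
/// "pinch with both hands and pull apart" zoom described above, which
/// SwiftUI delivers as a magnify gesture on visionOS.
struct ZoomablePhoto: View {
    let image: Image                   // the photo being shown (placeholder)
    @State private var scale: CGFloat = 1.0

    var body: some View {
        image
            .resizable()
            .scaledToFit()
            .scaleEffect(scale)
            .gesture(
                MagnifyGesture().onChanged { value in
                    // value.magnification is the zoom factor relative to
                    // where the gesture began.
                    scale = value.magnification
                }
            )
    }
}
```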

Head Tracking

Speaking of eye tracking, I went into Settings and tried the other tracking methods. There’s a wrist one that I have no idea how to use, but I tried “index finger,” where you point with your finger. That worked, but it’s very tiring. Within seconds my arm was aching. Not good.

I switched to “head” tracking, which is sort of what I was doing anyway, since the eye tracking works best when my eyes are staring straight ahead. I’m still experimenting, but it feels almost the same, only more reliable. I may leave it on this setting. We’ll see if there are any issues with it.


Sounds very cool. I can see dentists, tattooists, and the like using these to distract people from what’s happening to them.

I had an MRI last week and would have loved this kind of distraction rather than the coffin dreams. They’ll just need to work out how to make it non-magnetic.
