Sorry, I should have been clear that almost all productive work on an iPad is done with something like a Magic Keyboard, with the iPad mounted vertically - essentially a laptop replacement where one uses the Magic Keyboard’s trackpad. Using the touch interface in this mode is very tiring and can lead to “gorilla arm”, something Jobs mentioned. Studies have shown that the ergonomics of touch are poor in this configuration.
From my experience, I have found that it is hard to do productive work with the iPad flat on a surface (or one’s lap) though one can do certain things like access the web, read email, etc. in that configuration. For one thing, one needs to use the software keyboard, often a frustrating experience. Again, this is my experience, others might do better.
It really depends on what kind of “productive work” you’re talking about.
My daughter used her iPad Pro extensively while in college, as an electronic sketchbook (she was a theater major) for costume, makeup, hair and set design work. She did most of her work with the iPad lying flat, doing her drawing with an Apple Pencil, mimicking what artists have traditionally used a pad of paper for.
But if you’re editing Office documents, well that’s a radically different use-case.
Yes, I should have excluded graphics work, where with the pencil one can do very well placing the iPad on a flat surface. Of course, using text in the drawings relies on the on-screen keyboard and all other interactions use the touch interface which can be difficult if the touch targets are too close. When I do CAD, I much prefer to use a mouse on a Mac than a pencil on an iPad. Tastes vary.
On another topic, kudos to iFixit for using the only useful measure of graphics resolution: pixels/degree. Angular resolution is what matters: if I use reading glasses I can place my face 2 feet from my 24" 4k computer monitor and get the same angular resolution as my 55" 4k OLED TV at 4.6 feet. The apparent resolution and apparent screen size as well as the image quality will be the same, everything else being equal.
I want to scream when I hear reviewers talk about having a 10-foot-wide Mac screen when the angular resolution is the same as that of a small screen viewed more closely. For similar (apparent) screen sizes, the angular resolution of the AVP is about half that of a Studio Display, which has 5k pixels across compared to the ~2.5k on the virtual Mac screen used in the AVP. The Studio Display will look much sharper - like going from a non-retina to a retina display. (On-screen objects will appear about the same size on both, since the Studio renders at ~2.5k points, and the point count is what determines object size.) Again, the iFixit article explains this beautifully.
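The monitor-vs-TV comparison two posts up is easy to check numerically. A quick sketch in Python (screen sizes and distances taken from the post; the pixels-per-degree formula is just standard geometry):

```python
import math

def width_from_diagonal(diag_in, aspect_w=16, aspect_h=9):
    # Horizontal width of a 16:9 panel, given its diagonal in inches.
    return diag_in * aspect_w / math.hypot(aspect_w, aspect_h)

def pixels_per_degree(h_pixels, width_in, distance_in):
    # Full horizontal angle the screen subtends, then pixels per degree.
    angle_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return h_pixels / angle_deg

monitor = pixels_per_degree(3840, width_from_diagonal(24), 2 * 12)    # 24" 4K at 2 ft
tv      = pixels_per_degree(3840, width_from_diagonal(55), 4.6 * 12)  # 55" 4K at 4.6 ft
print(round(monitor, 1), round(tv, 1))  # both come out at roughly 81-82 pixels/degree
```

As claimed, the two configurations deliver essentially identical angular resolution, so they look equally sharp.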
I don’t disagree, but we have to acknowledge it’s a lot less ambiguous to specify angular over spatial resolution on a device like AVP where there is a standardized distance from eye to pixel. On Mac or iPhone that’s a whole lot harder to get right since every user has their own preference and perhaps even different uses.
I like to sit up real close to the 27" display hooked up to my main work Mac (reducing its angular resolution), but I know that every ergonomist says that’s a bad thing and sure enough plenty of Mac users sit much farther from their 27" displays (increasing the apparent angular resolution of that very same display). Similar with iPhone. In bed I’ll have it 4 inches from my face to read a book, but watching a video on transit I’ll have it over a foot from my eyes. Now which angular resolution would I say is the iPhone’s? Spatial gets around that ambiguity.
Oh and BTW, the Studio Display has 5K pixels horizontally, not across. Across (diagonally) it’s almost 6K (5874). My understanding is AVP has for each eye a WUHD display, giving it per eye the same horizontal pixel count as 5K but only 2160 in the vertical (basically like a wide 4K screen), so 5557 diagonally. My understanding is further that the largest “virtual Mac display” the AVP can display is 4K (3840x2160), which is 4405 across.
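Those diagonal figures check out with a few lines of Python (taking “4K” here to be 3840x2160, which is the resolution the quoted 4405 diagonal corresponds to):

```python
import math

panels = {
    "Studio Display": (5120, 2880),  # 5K
    "AVP per eye":    (5120, 2160),  # WUHD-like
    "Virtual Mac 4K": (3840, 2160),  # assuming 16:9 4K
}
for name, (w, h) in panels.items():
    # Diagonal pixel count is just the hypotenuse of the pixel grid.
    print(name, round(math.hypot(w, h), 1))
# Prints 5874.4, 5557.0, and 4405.8 - matching the figures quoted above
```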
Sorry about the ambiguity of the word “across”. I meant horizontally. I think I would have said “diagonally” if I meant that.
I still hold there is no problem with angular resolution in any situation. Two devices with the same pixel count and the same angular resolution will subtend exactly the same angle at your eyes and provide identical images on the retina if the content is the same. Neglecting the inability of my eyes to focus close-up, an iPhone 5 inches from my eyes could look exactly the same as a TV 5 feet away if the angular resolutions are the same. Angular resolution is the sole determinant of perceived resolution, no matter what the device is or how far away it is.
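The geometry behind that claim is simple similarity: scale a screen’s width and its distance by the same factor and the subtended angle is unchanged. A tiny sketch (the 2.5-inch phone width is a made-up illustrative number):

```python
import math

def subtended_angle_deg(width_in, distance_in):
    # Full horizontal angle a screen subtends at the eye.
    return 2 * math.degrees(math.atan(width_in / (2 * distance_in)))

phone = subtended_angle_deg(2.5, 5)            # hypothetical phone: 2.5" wide, 5" away
tv    = subtended_angle_deg(2.5 * 12, 5 * 12)  # 12x larger screen, 12x farther (5 ft)
assert phone == tv  # identical angle; with equal pixel counts, identical retinal image
```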
Jason Snell has now posted his review, tying it all to the time when we bought computers not so much because they were useful but because “they were the future.”
To an extent, that’s a fair comparison—I remember my parents buying my first computer (a Franklin ACE 1000 that was an Apple ][ clone) without knowing exactly what we’d do with it.
But before I migrated to an Atari 1040ST to go to Cornell, I used that ACE 1000 for a bunch of word processing (every paper I wrote after that in high school), spreadsheet stuff (chemistry and physics lab work, mostly), rudimentary programming with BASIC, and of course games. I also learned a lot about hardware sysadmin tasks in terms of what you could get away with doing to computers, disks, and peripherals. My parents used it too for basic word processing and spreadsheet things—I can still hear the screech of the Epson LX-80 printer we got, which had replaced a daisywheel printer that let me down by breaking its S character when I had to print my college application essay.
So while I’m sure the comparison might have been apt for him and others, computers were both the future AND had immediate real-world utility that couldn’t be achieved in any other way for me. And that experience gave me background that informed everything I did in college and professionally after that.
In that sense, the Vision Pro might be a taste of the future, but I have trouble seeing how it will teach us anything about that future that is important to learn now. Or, to put it another way, I wouldn’t buy one for a 14-year-old to make sure they were exposed to the future early enough.
Brian Chen of the NYTimes has what I’d call a slightly negative review. (Link should not be paywalled.)
The Vision Pro is the start of something — of what, exactly, I’m not sure.
But the point of a product review is to evaluate the here and now. In its current state, the Vision Pro is an impressive but incomplete first-generation product with problems and big trade-offs. Other than being a fancy personal TV, it lacks purpose.
Most striking to me about the Vision Pro is, for such an expensive computer, how difficult it is to share the headset with others. There’s a guest mode, but there’s no ability to create profiles for different family members to load their own apps and videos.
So it’s a computer for people to use alone, arriving at a time when we are seeking to reconnect after years of masked solitude. That may be the Vision Pro’s biggest blind spot.
For me this seems about right. I’m impressed with the technology, but less so with this particular product.
I read and liked Brian Chen’s review but I would recommend the current ATP podcast as one of the best discussions I have heard about the device by three articulate and very computer-literate people, two of whom own the headset. The consensus seems to be that the device is great for movies and other entertainment but is of questionable value for productivity.
One thing I found interesting is some skepticism (which I share) about the utility of a gaze-oriented UI. Frankly, I don’t understand how this can work well - it seems to go completely against the way we use our eyes. We are constantly moving our eyes as we gaze at things, and superimposing a touch control on this behavior sounds like it would be very awkward, tiring, and error-prone.
Here’s another review of Vision Pro from a programmer. He’s very enthusiastic about the device from a productivity perspective. His comments on using it as a Mac display are interesting, especially how you can position your display anywhere you want. Hard to do that with a physical display, no matter how flexible the stand.
A very interesting and well-written review - thanks for the link. His use of an s-curve to explain the likely trajectory of Vision Pro is apt, and we should expect Vision Pro to follow the same s-curve as most technologies that have gone the distance.
BTW, yesterday I took my Vision Pro and had my uncle try it. It’s the first time I’ve tried guest mode for another user. He’s 78 and since he had cataract surgery, he doesn’t need glasses, so it was worth a try. He’s a computer guy – he’s the one who got me into computers when he showed me his Osborne in 1981. (After seeing it, I switched from saving for an IBM Selectric typewriter to saving for a computer.)
My uncle has been reading about Vision Pro and while he probably isn’t interested in buying one (he buys 10-year-old computers and tinkers), he was super-excited to see mine. He is paralyzed and in a wheelchair, and his hands are rather gnarled (limited finger movement), so I wasn’t at all sure how it would go. We just did it in my car with him in the passenger seat. He held up his hands for the initial scan and it worked even though his fingers wouldn’t extend!
He then did the eye scans. He struggled with the first one because he didn’t even know how to use the headset. I reminded him how to look at the dots and “tap” by pinching two fingers. It took him about a minute to do the first scan, and then he breezed through the other two in 30 seconds total. Way faster than I did! Mine always failed because of my hard contacts, but his said, “Eye scan complete” and let him use the Vision Pro.
I had him fire up the impressive rhino video. Unfortunately, I’d forgotten the only internet was tethering with my iPhone and cell coverage is terrible at the coast where he lives, so the video took forever to stream. But he got a few seconds of it finally and I heard him audibly gasp. Then he was all, “Woah! The rhinos are all around me. They look so real!” Then he was speechless for a while. It was pretty cool.
He looked at some of the photos in my Photos and was wildly impressed. He had a bit of a struggle moving the windows around to where he could see them more comfortably. “Grabbing” them can be tricky the first few times. As I explained to him, within 60 seconds of use he was already using the eye tracking and finger clicking. With a few more minutes of practice, he’d be a master. We didn’t wait that long – it was just a quick 10-minute demo. I should have thought ahead and downloaded some media for him. Next time!
But he came away wowed. I was so impressed that he was even able to use it. This could be a huge device for the physically impaired. With it he could “travel” and experience life in ways he never could in real life. The lack of a keyboard isn’t a problem since he can’t really type anyway (he works on real keyboards the same finger-poke way you do with the virtual one in Vision Pro).
Wow! Yours is the first review of the Apple Vision Pro for its optimum, though not intended, audience! Not flaky Gen X curio techs! Elderly, semi-mobile, or incapacitated intelligent people who want and NEED isolated but superb displays! What a boon to them! Deeper pockets, often situated in a head-supported recliner… without the need of mouse, keyboard, or trackpad.
Go further with this… interview him or make a video of his experience!