I had quickly browsed multiple articles about WWDC that mentioned the AVP, and none of them mentioned fitness usage. One article distinctly stated that fitness was not mentioned in the presentation, but unfortunately I did not make note of which one, so I can't confirm this with direct evidence. I just ran a MacGPT inquiry. Here is the result:
No, fitness was not mentioned in the WWDC presentation of the Apple Vision Pro device. The Apple Vision Pro is a new device that uses advanced computer vision technologies to enable people with low vision to better interact with the world around them. The device can magnify and enhance the images of objects, text, and people, and it can also provide audio descriptions of scenes and objects. While the device may have potential applications in the field of fitness, it was not discussed in the context of fitness during the WWDC presentation.
Unfortunately this is the best evidence I can currently offer.
Still wondering why the iPhone has an ultra-wideband sensor.
I can understand why they went for a VR headset. AR through natural vision has the same computational requirements but many more constraints.
But I also think applications that already feel wrong at the launch presentation will be quickly abandoned. Think about wearing a headset at someone’s birthday party, or during any other social activity.
It will free us from tiny screens, something I have been waiting on for a long time. But it will also isolate us from reality as never before. Social scientists agree that creativity arises from interaction with reality. Without it, will we become good robots?
I still suspect the reason the goggles debuted at the Developers Conference is to convince developers to start building apps ASAP for the just-announced Vision Pro App Store.
The Vision Pro can be connected to a power adapter if an outlet is nearby. And the battery pack is not really “big” — more like a slightly thicker iPhone that fits easily into an average-sized pants pocket.
I am truly impressed by the technology. Using eye tracking to reliably direct the “cursor,” having hands always be visible even in a virtual environment, and effectively using intuitive hand gestures instead of a controller are remarkable feats. The quick creation of a virtual avatar is impressive, and the visual representation of it on the external screen seemed impressively like you were looking through a semitransparent set of ski goggles — not at all off-putting. It will be interesting to see where this evolves over the next 5–6 years, especially with regard to cost (though I believe I paid more for my first Mac and printer once inflation is factored in…).
Hopefully Apple can continue to reduce the weight, and I’m sure they are working on a way to “share” the same experience contemporaneously with others more impressively than screen sharing with FaceTime. As others have mentioned, I can definitely see some huge teaching and practical uses in medicine and dentistry, engineering and design. Museum tours, nature walks for people who are homebound, astronomy lessons, etc. all seem like they could easily be made more valuable.
Like others, I am quite apprehensive about the addictive, socially isolating potential of such a fully immersive alternative-reality device (“the Entertainment” of David Foster Wallace’s Infinite Jest made real). Inevitably, some susceptible individuals will not do well if given unfettered access. I’m loath to mention it as a use case, but I’m sure the porn industry will be interested in how they can monetize it.
I can’t see myself getting a Vision Pro, but the one thing that could tempt me would be using it as a viewer for a drone.
Currently in Australia it’s against the rules to fly any FPV drone except at an approved model aircraft field. This is due to CASA regulations, which state that you must always maintain actual visual contact with the UAV.
If the outward facing cameras were deemed sufficient (it would still require a regulation change) then it would make it a worthwhile consideration.
Very interesting and thoughtful comments on the AVP. I’m not surprised that few people who tried it reported cybersickness after an initial trial; it takes a while for the eyestrain to set in. I don’t know what frame rate they are using, but latency is one well-known factor in cybersickness, though not the only one. High frame rates should reduce the problem, but won’t make it go away completely.
I had not thought of using the AVP as a computer interface, and my first impression is that it’s not something I would use, because I’m keyboard-oriented.
I can’t understand how people can focus on screens perhaps 2" from their eyes. If I get within 6" of my computer screen, things get blurry, and more so the closer I get. Now, apparently these things work clearly enough, but I just don’t get how. I am approaching 70. Is that it? Are these designed for youngsters?
Do we have anyone here who is knowledgeable about medical diagnostic imaging? I’m not… but if the Vision Pro could improve efficiency even a little, one would think the headsets would pay for themselves many times over — let alone if it could help diagnose problems that would otherwise be missed.
I have very little knowledge about medical imaging, but I do have a relative who is a radiologist. In his practice they use Macs for imaging, but Windows for office management, billing, etc.
For those who are interested in the AVP from a developer’s perspective, here’s a video from Paul Hudson. He got the same 30-minute demo as the press folks did. Even if you’re not a developer, you may find Paul’s somewhat unique perspective informative. (And yes, I’m a huge Paul Hudson fanboy, so take my enthusiasm with a grain of salt.)