Apple Vision Pro Demo
Today I tried out the Apple Vision Pro. It’s been a while since I’d tried the Microsoft HoloLens, and I was excited to see how far the AR industry has come since then. I’d heard that the Vision Pro is extremely cool but not so practical for consumer use yet, since it’s a first-generation product. But why believe the ad copy or the YouTubers when I could try it for myself?
So I popped into the Apple Store. I’ve been there enough times by now to find it easily, and I let someone know I was there for a Vision Pro demo. They directed me to an area with couches and a bunch of Apple reps. One of them came up and checked me in, and I sat at an empty table and twiddled my thumbs for a bit while they found someone to do the demo for me.
Eventually, a lady named Tonika showed up to do the demo. I figured her parents really liked a certain city in Kansas, but in reality they actually tried to name her Toniko and it was written wrong on her birth certificate. So if you're expecting, remember to be careful with the spelling! Anyway, Tonika took my glasses to a machine that measured them to help choose the optical inserts. I’m not sure the prescription was perfect, because it didn’t feel quite right, but it was at least good enough for a demo.
Anyway, this is what the device looked like:

[photo of the Apple Vision Pro]
I put it on and was immediately hit by how heavy it is! Like wow, I would not want to wear that for more than a few minutes at a time. It’s got a long way to go before your average consumer would be comfortable with it.
I followed the on-screen prompts to press the crown and finish the setup. The controls take some getting used to, but you can also see how they’re fairly intuitive: point with your eyes, and tap two fingers together to click. Tap two fingers together and move them to drag. There was a calibration step with several rounds of looking at specific dots and tapping to select them. I tried to do it fast, and most of them didn’t register properly. I did another round where I waited a few extra milliseconds for the system to properly register the selection and the click, and by the third round I had it down to a science.
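For fellow developers curious how this look-and-pinch model translates into code, here’s a minimal visionOS SwiftUI sketch (my own illustration, not something from the demo). As I understand it, looking at a control and pinching is delivered as an ordinary tap, so standard SwiftUI buttons and drag gestures work without writing any special eye-tracking code:

```swift
import SwiftUI

// A minimal sketch of the look-and-pinch interaction model on visionOS.
// Gaze picks the target; a pinch is delivered as a regular tap, so plain
// SwiftUI controls and gestures work without any eye-tracking code.
struct PinchDemoView: View {
    @State private var dragOffset = CGSize.zero

    var body: some View {
        VStack(spacing: 24) {
            // Look at the button and tap two fingers together to "click" it.
            Button("Look here and pinch") {
                print("Pinch registered as a tap")
            }

            // Pinch and move your hand to drag this card around the window.
            RoundedRectangle(cornerRadius: 16)
                .fill(.blue.opacity(0.4))
                .frame(width: 220, height: 120)
                .offset(dragOffset)
                .gesture(
                    DragGesture()
                        .onChanged { value in dragOffset = value.translation }
                        .onEnded { _ in dragOffset = .zero }
                )
        }
        .padding(40)
    }
}
```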
I was guided through the process of opening Safari, moving the window anywhere in the 3D space, opening another window, and closing both of those windows. It was mostly seamless, except that I went off script and opened a search bar in the browser, and I didn’t know how to go back; I don’t think Tonika did either. I got a brief glance at the floating on-screen keyboard. Maybe there was an escape button? Anyway, I didn’t get a good enough look at it, so I ended up closing the browser and reopening it to get back on the rails of the demo script.
Next, I was instructed to open the Photos app. There was a collection of photos to view, showcasing various features one at a time. The first photo was just a photo. The next was a photo shot on iPhone 15 Pro. The next was a photo taken with Apple Vision Pro, which had eye-popping visuals and depth. The paper streamers at that birthday party looked so real, almost like actual streamers inches away from my face. The next was a panoramic shot of somewhere in Iceland, but I accidentally scrolled too far to the right and had to go back a few pictures to get back to it.

Then I was told to press the immersive viewer button in the top right corner of the app, but there was no button in the top right corner of the photo viewer. I had seen a three-dot menu in that corner earlier, so it was kind of cool that I found a bug without even trying. I swiped back to the earlier images, and the UI in the top right was gone even on those photos. So while Tonika was dumbfounded that there was no button where her script said it was supposed to be, I closed and reopened the Photos app and scrolled back slowly to the panorama, and the UI was restored. (P.S. Apple, feel free to reach out if you need further QA services ;) ) Finally the two buttons were there, and I looked at the immersive view button, tapped my fingers together, and was transported to Iceland. It was pretty immersive, but nothing too crazy compared to existing functionality on iPhone and Android, just a slightly bigger field of view.
Finally, there was a video shot on iPhone 15 Pro, which actually had a pretty cool depth effect.
Next, we went to a video app, probably Apple TV, and played a few short demos. One was from the Mario movie. The visuals were okay, but the subliminal suggestion of playing a clip that includes the line “That was quite impressive” was not lost on me. Then there were a few superbly immersive 3D videos of various people and animals adventuring and the like. My favorite was the up-close-and-personal shot of a hippo with incredible detail, and some of the sea life was cool to see up close too. I did look around and noticed the field of view wasn’t 360 degrees, which disappointed me for these experiences. I kind of wish they gave you full range of motion. It almost seemed like they limited it on purpose to keep you from turning your head around and getting tangled in the battery wire. If that’s the case, maybe include a two-cent plastic cable clip and show us the whole thing? Let me know if you have any better info on why the immersive videos have such a limited field of view.
By the end, I could comfortably move around, open, and close windows, and I’d learned that I could tap even with my fingers resting on the desk, which was more comfortable than raising them into the air. It eventually felt pretty easy and intuitive. I asked if I could try the keyboard, but Tonika said they hadn’t trained her on how to demo that part. That was as comical as it was disappointing; I think I could have figured it out. I also wish I’d had the opportunity to try the Mac screen extension feature, as that’s the one most likely to be useful to people compared to other options like the Quest 3. I’m surprised it wasn’t in the demo.
Tonika finished her sales script, letting me know that I could buy the Vision Pro either in store or online, shipped to the store or to my home. I thanked her for her time and moseyed over to the other side of the store, where there was a Mac desktop on display with a Pro Display XDR. I opened it up to try putting VS Code on the screen and see what the experience was like, and I found myself half expecting the mouse to click wherever I was looking. Ha.
While I enjoyed trying the Apple Vision Pro, I wouldn’t buy it right now. The headset is extremely heavy and I felt that it was too uncomfortable for regular use. Also the prescription didn’t seem to quite work with my eyes, so either I need slightly different lenses or it just won’t be a comfortable product for me. It would be cool to develop apps for, but for now, I’ll stick with regular web development.
Did you try the Apple Vision Pro or other similar AR/VR hardware? Share your thoughts below!