
Despite my skeptical blog post in 2017, I did go out and buy the first version of the Apple Watch. And the second version. And I became an Apple developer, learning a whole new development framework and a new language (Swift), and then wrote my first app for the Apple Watch. The Apple Watch quickly proved to be useful. My treasured self-winding watches sit in storage. The Apple Watch doesn’t really replace anything – you can get a basic watch or a health tracker at a fraction of its price. Instead, it augments what other devices can do. I love being able to filter calls and read transcribed messages without having to find my phone.

When Apple released the new Vision Pro last month, I thought it looked kind of bulky, and adopted the nickname one of my kids gave it: “The Face Computer.” My wife had to convince me to go get a demo. I have seen my share of VR and AR demos including devices that were still in the prototype stage, and I did not think I needed to see another one. But having spent most of my career building 3D graphics technology, I couldn’t resist. 

Having fun with the demo

The demo was easy to arrange, and the Apple Store demo folks were great as usual. Unfortunately, we couldn’t get eye tracking to work despite spending over an hour on calibration attempts, so we proceeded through the rest of the demo in the much more basic and somewhat awkward finger-pointing accessibility mode. But even without eye tracking, it was clear that this is the most advanced visual computer ever made. The graphics are spectacularly clear, thanks to the incredibly high density of the OLED panels, and the fast refresh rate makes motion appear fluid and natural. But it is the immersive 3D content that is truly compelling – better than anything I’ve ever seen. The device is a technological tour de force – like a supercomputer you can wear on your head.

So did I buy one? No. Not because of the eye-tracking issue, or the high price. I can’t think of a practical application for this device in my life. Despite the amazing capabilities, I also can’t think of an app that I could write on my own that would make any sense for the experience. Vision Pro versions of my altimeter-barometer or weather station app for a mostly stationary device? Probably not worth my time. The device shines with immersive content, which is expensive to produce and requires teams of people. There’s a reason there isn’t a Vision Pro version of a full-length movie, let alone a TV series. Even with Apple’s resources, creating immersive experiences is daunting.

Perhaps one way to think about the Vision Pro is that it’s the immersive computing equivalent of Apple’s Lisa. When launched in 1983, the Lisa was the first mass-market personal computer with a graphical user interface. It was a commercial failure, but it paved the way for the Macintosh that Apple shipped a year later, which popularized the Lisa’s core concepts. Vision Pro may not be a commercial mass-market success either. Instead, it establishes the direction of human-computer interaction as all but inevitable, and I can’t wait to someday buy the Macintosh equivalent of the Vision Pro.