By Keita Funakawa
As we step into 2024, the tech world is abuzz with anticipation for Apple's imminent launch of its Vision Pro headset. At Nanome, we're not just waiting eagerly; we're preparing for a transformative shift. We wanted to share our thoughts on the Apple Vision Pro, exploring how we believe this groundbreaking headset will revolutionize the way Nanome and our users achieve more scientific breakthroughs.
A passthrough-first mixed reality device
At Nanome, we first tried passthrough mixed reality with the Varjo XR-3 in May of 2021, and ever since then, we have seen the huge potential of passthrough mixed reality. We were fortunate enough to launch on the Meta Quest Pro (the first all-in-one color passthrough mixed reality device) in October of 2022, and we've been huge believers that MR is a crucial step on the way to the promised land of lightweight AR glasses. On top of this, many of us at Nanome are huge fans of Apple, use Apple products for all of our computing, and have been paying very close attention to the rumors over the years.
So, saying that “we saw this coming” and “we’re glad Apple is taking this direction” would both be understatements. We were 99% certain Apple would go this route, and we’ve been preparing for this moment since we started the company in 2015.
No controllers: eye-, hand-, and voice-based input
At Nanome, we’ve been huge believers in voice-based input since we added voice commands to Nanome in 2019. We’re glad Apple feels the same!
Hand tracking has historically been unreliable or fatiguing (users have to keep their wrists/arms up). Based on initial impressions from the press, it seems like Apple has solved this problem by combining hand tracking with eye tracking.
We’ve been paying close attention to eye tracking and have long felt that it is underutilized and has a lot of potential. Research papers like these have piqued our interest in the past, but the bottleneck has always been the lack of readily available devices that incorporate the technology AND an SDK that lets us easily build experiences around this type of user interaction. With the Vision Pro and visionOS, both seem to finally be here, and we’re thrilled!
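To make the interaction model concrete, here is a minimal sketch of a standard SwiftUI view on visionOS: the system resolves gaze privately, highlights whatever element the user is looking at, and delivers a pinch to the app as an ordinary tap. The `selectResidue` handler and its argument are hypothetical, purely for illustration:

```swift
import SwiftUI

// Minimal sketch of visionOS's gaze-plus-pinch input model.
// The system tracks where the user is looking (raw gaze data is
// never exposed to the app), shows a hover highlight on the gazed
// element, and delivers a tap when the user pinches their fingers,
// so no controllers and no arms held up in mid-air.
struct MoleculeControls: View {
    var body: some View {
        Button("Select residue") {
            // Fires when the user looks at the button and pinches.
            selectResidue("LYS-45") // hypothetical handler + ID
        }
        .hoverEffect() // gaze-driven highlight, rendered by the system
    }

    private func selectResidue(_ id: String) {
        print("Selected \(id)")
    }
}
```

The key design point is that the app only ever receives high-level events like the tap above, which is why the combination avoids both the fatigue of raised arms and the privacy concerns of exposing raw gaze data.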
Cost: the least surprising thing about AVP
Apple has consistently introduced new product categories at the higher echelon of the market price point (the iPhone when it first launched, Pro Display XDR, Apple Watch, AirPods, M-series laptops; the list goes on).
The price is actually competitive when compared to its appropriate competition.
Given the chip that’s inside the headset, it’s actually closer to a desktop computer + VR system ($2k+):
- Approximately 5 to 9x more powerful than current all-in-one headsets (comparing Apple M2 vs. Snapdragon XR2 Geekbench scores [1][2])
In addition to the chip, there’s literally only one other PC-powered passthrough mixed reality solution, the Varjo XR-3, which runs $8k for the headset alone, plus a $2k/yr subscription, and requires a $2k+ PC, for a starting point of roughly $12k.
On visionOS, the Software:
Actual Spatial Computing vs Passthrough as a Skybox
Although passthrough mixed reality is available on Quest devices, it is not “true” spatial computing. As a simple example, when a real-life object comes between the user and a virtual object, the physical object doesn’t occlude the virtual object. This creates a weird sensation and dissonance for the user, as it’s unclear which objects ought to be where.
Based on the WWDC keynotes, it seems like Apple will be supporting actual occlusion and taking into consideration the dynamic between physical objects and virtual elements. We strongly believe this will lead to richer and more comfortable mixed reality experiences.
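For intuition, here is a toy sketch (purely illustrative, not Apple’s actual pipeline) of the per-pixel decision that separates true occlusion from passthrough-as-a-skybox: the compositor needs a sensed depth for the real world at each pixel, and draws the virtual fragment only when it is closer.

```swift
// Toy sketch of per-pixel occlusion compositing. All names are
// illustrative; the real pipeline runs on the GPU against sensed
// scene-depth data, not in a Swift function like this.
func composite(virtualDepth: Float,
               realDepth: Float,
               virtualColor: SIMD3<Float>,
               passthroughColor: SIMD3<Float>) -> SIMD3<Float> {
    // True spatial computing: a depth test against the sensed
    // real-world geometry lets a closer physical object (e.g. your
    // hand) hide virtual content behind it.
    return virtualDepth < realDepth ? virtualColor : passthroughColor
}
```

When passthrough is instead treated as a skybox, `realDepth` is effectively infinite at every pixel, so the virtual color always wins and a real object can never pass in front of a virtual one, which is exactly the dissonance described above.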
No avatars is the best avatars
Users tend to get decision fatigue when they have to choose which exact nose angle, eyebrow shape, and hairstyle best represents them out of thousands of possible combinations.
visionOS completely sidesteps this issue by using the headset itself to scan you and generate a virtual version of yourself.
No more manual guardian setup
Before the Quest 3 came along, anyone using Meta devices had to manually sketch out a safety zone to keep from bumping into stuff in the real world. With the launch of the Quest 3, Meta now has an automatic boundary drawing feature. But, if your room's a bit messy, it only marks out a small safe area. So, you might find yourself getting these constant reminders that you're about to step out of your virtual safety net.
This has been a huge point of feedback from our users that we haven’t been able to address, so we’re very glad hardware manufacturers like Apple and Meta are making progress on this.
All the productivity apps, finally!
I’ve personally been disappointed with Meta when it comes to the lack of default/out-of-the-box productivity apps (like calculator, calendar, email, Slack, etc.). They’ve announced they’re adding 2D apps like Slack and Instagram, but it requires the user to search for them on the store or sideload them. At launch, the Apple Vision Pro is slated to be compatible with all iPad and iOS apps, which is a huge leap forward in this aspect, AND visionOS looks to have similar default apps to those in iOS.
"SharePlay" ?
Meta calls this feature “shared spatial anchors,” and it is slated to be available on Meta devices soon.
SharePlay, or “shared spatial anchors,” basically ensures that when there are multiple users in the same physical location, the positions of their virtual avatars and objects stay synced, so you don’t have to align things manually, and so that when you physically point at something, your virtual avatar isn’t pointing at something else.
Apple hasn’t explicitly said that SharePlay won’t be available at launch, so unlike on Meta’s devices, it seems developers will be able to build SharePlay experiences at or near the launch of the device.
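Under the hood, colocation like this boils down to expressing poses relative to a shared physical anchor instead of each device’s private world origin. Here is a rough sketch of that coordinate math, assuming each device can query a `worldFromAnchor` transform for the shared anchor (all names are illustrative, not a specific Apple or Meta API):

```swift
import simd

// Each headset tracks its own world origin, so raw world coordinates
// can't be shared directly between users. Instead, both devices
// localize against the same physical anchor and exchange poses
// expressed relative to that anchor.

/// Sender: convert an object's pose from my world space into anchor space
/// before sending it over the network.
func anchorFromObject(worldFromObject: simd_float4x4,
                      worldFromAnchor: simd_float4x4) -> simd_float4x4 {
    // anchorFromObject = anchorFromWorld * worldFromObject
    return worldFromAnchor.inverse * worldFromObject
}

/// Receiver: convert the shared anchor-relative pose back into *my* world.
func worldFromObject(anchorFromObject: simd_float4x4,
                     worldFromAnchor: simd_float4x4) -> simd_float4x4 {
    return worldFromAnchor * anchorFromObject
}
```

Because both users’ `worldFromAnchor` refers to the same physical spot, an object placed by one user lands in the same real-world position for the other, which is exactly what keeps physical pointing and virtual pointing in agreement.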
iCloud, AirDrop, iMessage, all the things!
Getting files onto Quest devices is a cumbersome process. At Nanome, we had to build features around this, like QuickDrop and the Vault Plugin, to enable our users to seamlessly leverage their files and data from their normal work environments (PC/cloud/etc.).
As a heavy Apple user, I can’t wait to have everything from my iCloud auto-sync, to send and receive AirDrops with others nearby, and to reply to iMessages in the headset. This alone will get both our users and me into XR far more often. Although there hasn’t been confirmation as to what extent iCloud will be compatible, having Messages, Notes, and Mail in their launch screenshots makes me extremely hopeful that most of iCloud will be available within the headset at launch.
Things I look forward to evaluating upon launch:
Comfort:
The materials used make it look very heavy. Were they able to properly balance the weight? This has been an issue with some of our users’ Quest devices.
EyeSight & Reverse passthrough:
The unique feature where other people in the same physical location as the user are able to see the user’s eyes is something I’m excited to try. It could be a miss, but I’m hopeful that it’ll “just work,” as Apple products usually do.
2024: The Year of Apple Vision Pro
Apple's Vision Pro marks a significant leap forward in the realm of mixed reality, aligning with our vision at Nanome of a more immersive and intuitive future. Its blend of cutting-edge technology and user-friendly features, like seamless integration with Apple's ecosystem, will hopefully set a new benchmark in the industry.
Keep an eye on Nanome as we're gearing up to unveil a thrilling new update, designed to leverage the latest advancements in mixed reality technology. While we don’t have too much to share for now, future updates are poised to significantly enhance user experience, offering deeper immersion and more intuitive interactions. Be sure to stay connected for upcoming announcements – we're excited to elevate your experience with the newest innovations in the MR landscape to unlock more scientific breakthroughs!