Meta CEO Mark Zuckerberg told us that, in the future, we’ll hold our business meetings in virtual spaces and be represented there by avatars.
He was right about that. But it looks like Apple, rather than Meta, is building the more compelling vision for avatar-based virtual business meetings. New details have emerged about Apple’s augmented reality (AR) platform, and it looks like something that could actually transform how professionals communicate with one another.
A quick caveat: Neither platform has appeared even in beta, so they can’t actually be compared yet. All we have to go on is company statements, leaks, reporting, sleuthing, speculation — and common sense.
Let’s start with the sleuthing.
Apple is apparently working on an operating system called “realityOS,” which Apple sometimes abbreviates as “rOS.” We know this because references to “realityOS” and “rOS” have been discovered in pre-release iOS 13 builds, a GitHub repository, and even in App Store upload logs. The GitHub repository also alludes to a realityOS simulator, presumably for developers.
Apple uses “Reality” in the trademarked branding for two AR developer tools named “RealityKit” and “Reality Composer.”
And Apple usually names its operating systems after the associated hardware platforms. To wit:
Apple iPhone: iPhone OS (now shortened to iOS)
Apple Watch: watchOS
Apple iPad: iPadOS
Apple Mac: macOS
Apple TV: tvOS
Apple Reality: realityOS
All of these clues lead me to believe that Apple’s mixed-reality platform will be called Apple Reality. The name would nod both to augmented, virtual, and mixed “reality,” and to late Apple co-founder Steve Jobs’ famous “reality distortion field.”
I think this is a pretty good guess, and so for the remainder of this column I’ll refer to Apple’s forthcoming AR platform as “Apple Reality.”
Getting a grip on Reality
Good reporting, a smattering of leaks and a pinch of speculation suggest that Apple will ship (possibly next year, probably in 2024) a headset that can be used for both AR and virtual reality (VR). While the hardware will support VR, Apple will emphasize AR applications. Company statements, product launches, patents, and acquisitions all indicate Apple is obsessed with AR, and somewhat indifferent to VR.
Apple’s first headset will do AR the way an iPhone does, but through stereoscopic goggles. The iPhone does AR by capturing real-time video through the camera, then superimposing virtual objects onto that video. With Apple’s Reality goggles, you’ll still see the world around you, but only as video displayed on screens in front of your eyes.
Apple is also reportedly working on a more advanced product — more like regular glasses — that will superimpose AR virtual objects onto your natural field of view.
I predicted a year and a half ago that Apple would use Memojis — Apple’s cartoonish representation of users, currently used for iMessage and other platforms — as avatars for virtual meetings.
Now, Bloomberg reporter Mark Gurman (who has unusually good anonymous sources inside Apple or its partners) supports my prediction by asserting that Memojis “could be central” to the experience of using a mixed-reality future version of FaceTime.
What’s great about Memojis is that the avatar conveys the represented user’s real-time facial expressions, head tilts, gestures and other non-verbal communication, while speaking in the user’s voice. The ability to convey non-verbal communication without having to appear on video is more appealing and comfortable for many users than Zoom-like video calls, which can leave people feeling exposed, uncomfortable and exhausted. (It’s called Zoom fatigue.)
Apple’s virtual meeting technology originated at Spaces, a spinoff of DreamWorks Animation. Spaces initially set out to build consumer experiences for theme parks, using ground-breaking technology that enabled multiple people to interact with the same virtual objects via avatars. The company later pivoted to VR conferencing. Apple acquired Spaces in August 2020.
Integrated into a future FaceTime running on Reality glasses, Spaces’ technology would give each participant a first-person view of the others arranged in a circle or around a virtual table, with everyone able to access shared virtual resources like whiteboards, 3D models, floating charts, and other virtual objects.
Crucial to this experience: each meeting participant will see other meeting participants as holograms in their own physical space, as opposed to a virtual meeting room. This theoretically reduces the mental fatigue of experiencing VR, as the environment you see matches the environment you know is actually there. It also means you wouldn’t knock your coffee over while gesturing with your hand, because you could see your cup sitting on the desk.
To talk to the person on your right, you turn your physical head to the right and make eye contact with the Memoji of that person. You’d hear their voice coming from your right, too. While person A and person B are making eye contact, person C would see both those Memojis looking at each other.
While today’s Memojis capture non-verbal cues using a camera, the Reality headset would use both cameras and other sensors to capture those cues more finely, according to Apple patents.
Using FaceTime for virtual meetings would be brilliant, because early adopters could appear as 3D Memojis of themselves on the same call as late adopters, who would appear on ordinary video. A single business meeting could include people on iPhones getting a Zoom-like FaceTime experience alongside people on Apple Reality attending in the “metaverse” (for lack of a better term). Reality users would see fellow Reality users as 3D avatars, and old-school FaceTime users as floating rectangles showing their video.
Gurman wrote that SharePlay would be central as well, enabling business users to share presentations and documents during virtual meetings. Apple Reality could become an extremely compelling collaboration tool, thanks to 3D AR plus SharePlay.
External cameras on the Reality headset will capture hand movements, according to Gurman, which would not only enable Memojis to display real-time hand gestures, but also let users type on virtual keyboards and write on shared virtual whiteboards.
Many of Apple’s recent additions to iOS may in fact be designed to prepare the world and prime developers for realityOS and the Reality platform — the U1 chip I told you about in 2019, ARKit, spatial audio with dynamic head tracking, and others.
Apple may believe, as I do, that AR will someday rise to displace the smartphone as the world’s most dominant and ubiquitous computing platform. And it wants to lead that future.
While Meta, the company formerly known as Facebook, sees users immersed in VR all day, including for business meetings, Apple sees users dropping into an AR meeting, then exiting Reality to get back to, well, reality.
I believe Meta is wrong and Apple is right about what people will want and how much of a virtual world people can really tolerate. Zuckerberg’s vision is that people will live and work in the VR metaverse all day. In reality, I think business people will prefer Reality.