Apple Lays Foundation for Mixed Reality Headset at WWDC22

Heading into Apple’s annual developer conference, WWDC, there were rumors that a “mixed reality” AR/VR headset would dominate the event — even though the hardware itself wasn’t expected to be revealed. Others predicted that Apple would launch a new 3D development environment.
Neither of these things happened. If anything dominated WWDC this year, it was the new M2-powered MacBooks. Regardless, WWDC 2022 included several updates to AR software that are worth the attention of developers — and that broadly hint at the much-talked-about headset. The updates included ARKit 6 (the latest version of Apple’s AR development kit), Metal 3 (a graphics API for gaming, optimized for Apple silicon) and a new RoomPlan API for scanning rooms.
As Mark Gurman from Bloomberg noted, these three AR updates “will all play into the headset next year, but not today.” The latest predictions put the headset in the second quarter of 2023, with an announcement possibly in January (which, if it happens, would come exactly 16 years after the iPhone was announced). So how can developers prepare now for what could be a blockbuster new Apple device?
RoomPlan
One of the highlights of last year’s WWDC was the introduction of Object Capture, which lets you take a series of photos of a real-world object and turn them into a 3D model. This year’s WWDC extended 3D capture from a single object to an entire room.
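For developers who want to try it, Object Capture is exposed through RealityKit’s PhotogrammetrySession on macOS. Here is a minimal sketch of turning a folder of photos into a USDZ model; the file paths and detail level are placeholders of ours, not Apple’s sample code:

```swift
import RealityKit

// Minimal Object Capture sketch (macOS). Paths are placeholders.
func buildModel() async throws {
    let photosFolder = URL(fileURLWithPath: "/tmp/ObjectPhotos", isDirectory: true)
    let outputFile = URL(fileURLWithPath: "/tmp/Object.usdz")

    // PhotogrammetrySession turns a folder of photos into a 3D model.
    let session = try PhotogrammetrySession(input: photosFolder)
    try session.process(requests: [.modelFile(url: outputFile, detail: .reduced)])

    // The session reports progress and completion as an async stream.
    for try await output in session.outputs {
        switch output {
        case .requestProgress(_, let fraction):
            print("Progress: \(Int(fraction * 100))%")
        case .processingComplete:
            print("Model written to \(outputFile.path)")
        case .requestError(_, let error):
            print("Capture failed: \(error)")
        default:
            break
        }
    }
}
```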
The RoomPlan API, in the Swift programming language, allows you to do “parametric 3D room scans,” which will be useful for interior designers, real estate agents, retailers, and others. It creates 3D scans using the camera and LiDAR Scanner on iPhone and iPad (the LiDAR-equipped models from 2020 onward).
In a WWDC video, Apple’s Praveen Sharma explained that RoomPlan “uses sophisticated machine learning algorithms powered by ARKit to detect walls, windows, openings and doors; as well as room-defining objects like fireplaces, couches, tables, and cabinets.”
RoomCaptureView is the out-of-the-box scanning and processing view that developers can use in their apps. As well as integrating room scans into an app, developers can opt to export them in the USD or USDZ file formats.
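In practice, getting a scan on screen takes little code. Here is a minimal sketch of a RoomCaptureView-based screen, assuming iOS 16 on a LiDAR-equipped device; the view controller name and export path are our own invention, not Apple’s sample:

```swift
import UIKit
import RoomPlan

// A minimal RoomPlan scanning screen (iOS 16, LiDAR device assumed).
final class RoomScanViewController: UIViewController, RoomCaptureViewDelegate {
    private var captureView: RoomCaptureView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // RoomCaptureView drives the scan UI and live 3D feedback.
        captureView = RoomCaptureView(frame: view.bounds)
        captureView.delegate = self
        view.addSubview(captureView)
        // Start scanning with a default session configuration.
        captureView.captureSession.run(configuration: RoomCaptureSession.Configuration())
    }

    // Accept the raw scan so RoomCaptureView processes and presents the final model.
    func captureView(shouldPresent roomDataForProcessing: CapturedRoomData, error: Error?) -> Bool {
        return true
    }

    // Called when processing finishes; export the parametric scan as USDZ.
    func captureView(didPresent processedResult: CapturedRoom, error: Error?) {
        let destination = FileManager.default.temporaryDirectory
            .appendingPathComponent("Room.usdz")
        try? processedResult.export(to: destination)
    }
}
```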
You can see where all this is going. Last year it was 3D scanning a single object, this year it’s a whole room. Soon, it’ll be 3D scanning everything around you — and that’s when a pair of “mixed reality” glasses will become a much easier way to interface with 3D content.
Already, there are hints that the smartphone is going to be superseded soon, when it comes to AR content. In a separate WWDC video, entitled “Qualities of great AR experiences,” Apple AR designer Allie Dryer commented on the ergonomics of using a phone to consume AR, noting that “it can be exhausting to hold your arm out for long periods of time and uncomfortable to reach for buttons that aren’t positioned properly for one-handed use.”
ARKit 6
The main talking point with the latest version of ARKit is its new support for 4K video. As Apple broadly hints, “4K video is perfect for apps that integrate virtual and real-world content together for video creation.” In other words, it’s needed for mixed reality.
In a technical session at WWDC, Apple’s Christian Lipski said that “over the course of the past several years, we saw a lot of demand for high-resolution content — especially those apps which leverage the power of augmented reality for filmmaking are ever hungry for more pixels.”
Lipski went on to explain that “in 4K mode, an image area of 3840 by 2160 pixels is used and you can capture video at 30 frames per second.” Prior to ARKit 6, the maximum image area was 1920 by 1440 pixels, so this is a significant improvement in image quality.
However, Apple recommends that you only use 4K video “when there is a clear need,” due to its potential impact on memory and other system resources. “Apps that benefit from the high-resolution video are good candidates [for 4K], such as video, filmmaking and virtual production apps,” said Lipski.
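Opting into the new mode is a small change to an ARKit session’s configuration. A minimal sketch, assuming a device that actually offers a 4K-capable video format (ARKit returns nil otherwise):

```swift
import ARKit

// Opt into ARKit 6's 4K video mode, where the hardware supports it.
let configuration = ARWorldTrackingConfiguration()

// ARKit 6 surfaces the 4K format (3840 x 2160 at 30 fps) via this
// class property; it is nil on devices that can't capture 4K.
if let format4K = ARWorldTrackingConfiguration.recommendedVideoFormatFor4KResolution {
    configuration.videoFormat = format4K
}

let session = ARSession()
session.run(configuration)
```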
“Virtual production apps” is another way of saying mixed reality apps. Again, we didn’t get any details at WWDC22 of the future hardware (cough, glasses) that Apple consumers will likely soon use to experience these types of apps — but developers are being primed now to create AR and VR apps, via foundational technologies like 4K video processing in ARKit 6.
Later in his session, Lipski talked about improvements to the Object Capture API. “Now, with a new high-resolution background image API, you can take higher resolution photos of the object and create even more realistic 3D models,” he said.
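That capture happens on demand from a running session, alongside the normal video stream. A minimal sketch, with our own logging rather than anything from Apple’s session:

```swift
import ARKit

// Sketch: request a single still frame at higher-than-video resolution (ARKit 6).
let configuration = ARWorldTrackingConfiguration()

// Prefer a video format tuned for high-resolution frame capture, when available.
if let format = ARWorldTrackingConfiguration.recommendedVideoFormatForHighResolutionFrameCapturing {
    configuration.videoFormat = format
}

let session = ARSession()
session.run(configuration)

// Ask ARKit for one frame at the camera's still-photo resolution.
session.captureHighResolutionFrame { frame, error in
    if let frame = frame {
        let width = CVPixelBufferGetWidth(frame.capturedImage)
        let height = CVPixelBufferGetHeight(frame.capturedImage)
        print("Captured \(width) x \(height) frame") // e.g. as Object Capture input
    } else if let error = error {
        print("High-resolution capture failed: \(error)")
    }
}
```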
No WebXR, But Broad Hints at an XR Future
While Apple has clearly enhanced its 3D capture and processing technology this year, there is still no sign of it supporting the web version of mixed reality: WebXR.
As AR/VR entrepreneur Thomas Nigro remarked on Twitter, WebXR “still does not work on iOS 16 Safari, despite the experimental flags available and enabled.” Being slow to support web standards is, alas, nothing new with Apple. Lack of WebXR support will have an impact on developers like Nigro, since the browser will almost certainly be the on-ramp to mainstream AR adoption — especially in the months or years before Apple launches a mixed reality headset.
All that said, Apple claimed in one of its press releases that it aims to “push the app experience forward.” There were many hints at this year’s WWDC that AR software — together with a mixed reality headset likely to be released in 2023 — will indeed be the driving force in that push towards the next era of apps.