It has been about a year since Qualcomm President and CEO Cristiano Amon and Meta Founder and CEO Mark Zuckerberg announced, at IFA 2022, a multi-year strategic partnership to develop platforms and core technologies to accelerate next-generation metaverse experiences. At this year’s Meta Connect 2023, we see the first commercial fruits of that partnership: Qualcomm announced the launch of the Snapdragon XR2 Gen 2 and AR1 Gen 1 systems-on-chips (SoCs), while Meta announced the Meta Quest 3 virtual reality (VR) headset running on the former and the Ray-Ban Meta smart glasses leveraging the latter.
According to Qualcomm, the goal of its XR platforms (XR being an umbrella term encompassing VR, mixed reality (MR) and augmented reality (AR)) is to revolutionize spatial computing through three main pillars: “groundbreaking immersion, seamless in motion and realities reimagined”. While it’s always good to have a vision, what does this really mean in terms of features and capabilities?
Virtual and Mixed Reality
For the Snapdragon XR2 Gen 2, this means a beefed-up Adreno graphics processing unit (GPU) that, according to Qualcomm, delivers up to 2.5x the performance of the XR2 Gen 1 with 50% better power efficiency. In terms of groundbreaking immersion, this translates into optimized 3K x 3K display support with Snapdragon Game Super Resolution (GSR), foveated rendering and space warp. GSR upscales game content, while foveated rendering reduces the compute workload by working in conjunction with an integrated eye tracker to render the primary field of view at high quality and the peripheral vision at reduced resolution, optimizing performance for the critical visual field while minimizing power consumption (see the sketch below). Space warp, on the other hand, synthesizes missing frames on the XR device without any host-device overhead, doubling streaming performance between the host device or the cloud and the XR device.
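To make the foveated-rendering argument concrete, the back-of-the-envelope sketch below estimates how much shading work is saved when only a small foveal region is rendered at full rate. The eye-buffer size comes from the 3K x 3K figure above; the foveal-region share and peripheral shading scale are assumed values for illustration only, not Snapdragon XR2 Gen 2 specifications or details of Qualcomm's actual pipeline.

```python
# Back-of-the-envelope estimate of the shading-work savings from foveated
# rendering. The eye-buffer size reflects the "3K x 3K" figure above; the
# foveal share and peripheral shading scale are assumptions for illustration,
# NOT Snapdragon XR2 Gen 2 specifications.

EYE_WIDTH = EYE_HEIGHT = 3072      # "3K x 3K" eye buffer, in pixels
FOVEAL_FRACTION = 0.15             # assumed share of the frame rendered at full rate
PERIPHERAL_SCALE = 0.25            # assumed shading-rate scale outside the fovea

full_frame = EYE_WIDTH * EYE_HEIGHT
foveal_work = full_frame * FOVEAL_FRACTION                       # full-rate region
peripheral_work = full_frame * (1 - FOVEAL_FRACTION) * PERIPHERAL_SCALE

relative_work = (foveal_work + peripheral_work) / full_frame
print(f"Shading work vs. a uniformly full-resolution frame: {relative_work:.0%}")
# -> roughly 36% under these assumed parameters
```

Even with these rough assumptions, the frame requires only about a third of the shading work of a uniformly rendered one, which illustrates why eye-tracked foveation pairs naturally with the power-efficiency goals described above.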
The XR2 Gen 2 SoC also includes an updated Hexagon neural processing unit (NPU), which Qualcomm claims delivers up to 8x higher performance per watt than the previous generation. The SoC also incorporates an ultra-low-power, low-latency 6-degrees-of-freedom (6DoF) Engine for Visual Analytics (EVA). Qualcomm’s FastConnect software suite and High Band Simultaneous (HBS) Multi-Link deliver WiFi 6E and WiFi 7 support with peak speeds of up to 5.9Gbps at 25% less power and 80% lower latency compared with products that lack the software suite and HBS. Qualcomm included these capabilities to meet its “seamless in motion” design driver, especially in MR applications, where the ability to understand the user’s physical context in relation to the real world is not just a nice-to-have but essential to the usability of the device. These capabilities enable perception concurrency features like head, eye, face, gesture and controller tracking. Other such features include, but are not limited to, avatar encoding/decoding, plane detection, depth estimation, 3D reconstruction, semantic understanding and object recognition and tracking.
Because it is designed for XR applications, it is critical that the XR2 Gen 2 SoC delivers on the rigorous visual requirements these types of devices demand. To that end, the SoC’s Spectra image signal processor (ISP) was also improved with geometric correction, noise reduction, dynamic range capabilities and support for up to ten concurrent cameras, with the goal of seamlessly blending the physical world into the virtual world. A critical element of this blending is the device’s ability to minimize video see-through (VST) latency, which measures the delay between when the user moves their head and when the video showing the real world updates correspondingly. The lower the latency, the less it disorients the user and the more usable the device becomes. Using the enhanced Spectra ISP and an optimized Qualcomm VST pipeline, Qualcomm claims a 75% or greater reduction in VST latency compared with non-Qualcomm solutions, which brings the latency down to about 12ms from 50ms or more.
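As a quick sanity check on those numbers, the snippet below simply applies the claimed 75% reduction to the ~50ms baseline cited above; the baseline and reduction are the figures from this article, and the result lines up with the roughly 12ms latency Qualcomm quotes.

```python
# Sanity check on the claimed video see-through (VST) latency improvement:
# a 75% (or greater) reduction from the ~50 ms baseline cited in the article.

baseline_ms = 50.0   # VST latency of non-optimized pipelines, per the article
reduction = 0.75     # claimed minimum reduction with the Spectra ISP + optimized VST pipeline

optimized_ms = baseline_ms * (1 - reduction)
print(f"Estimated VST latency: {optimized_ms:.1f} ms")  # -> 12.5 ms, in line with the ~12 ms cited
```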
Augmented Reality
Along with the XR2 Gen 2 SoC, which is targeted more at the virtual reality and mixed reality segments, Qualcomm also announced the AR1 Gen 1 SoC, targeted specifically at augmented reality smart glasses. This segment requires designs that are more streamlined than virtual and mixed reality head-mounted displays and that enable fashionable form factors. Additionally, as the market continues to mature, certain features are emerging as must-haves for the segment. High-speed, low-latency connectivity and auditory capabilities such as hands-free calling, voice assistants and music streaming are becoming par for the course. Visual features such as image and video capture and heads-up notifications are also becoming established as commonly required advanced capabilities.
Consequently, the AR1 Gen 1 supports eight microphones, stereo speakers, Qualcomm’s AI Engine, the Sensing Hub, a 3DoF version of the EVA, FastConnect/HBS and a 14-bit dual Spectra ISP to enable computer-vision-aided audio and video capture, noise cancellation, voice commands, visual search and real-time text translation.
Putting the “Reality” in XR
Launching SoC platforms for specific use cases and devices always raises the question, “That’s great, but will it really be adopted by device manufacturers?” In this case, that question is answered immediately: the partnership between Qualcomm and Meta has yielded a simultaneous release of the Meta Quest 3 using the XR2 Gen 2 and the Ray-Ban Meta smart glasses using the AR1 Gen 1, with the latter serving as a definitive proof point that smart glasses can in fact deliver the desired features in a sleek, fashionable form factor. Other manufacturers are expected to release devices based on these platforms by 2024 or at CES in January. Tirias Research believes these device segments are poised for further growth as the XR market starts to achieve a critical mass of technology enablers, compelling use cases and content.