NVIDIA has maintained a close partnership with Qixinyiwei: from the early MRS (Multi-Res Shading) to VRS (Variable Rate Shading) and now VRSS (Variable Rate Supersampling), Qixinyiwei has shipped matching foveated rendering solutions. On June 8th, Epic Games introduced eye-tracked foveated rendering for the first time in the Unreal Engine 4.27 Preview. Foveated rendering (also called gaze-point rendering) is a rendering technique that keeps full resolution at the user's gaze point while lowering the shading resolution in the surrounding peripheral regions, cutting the GPU workload significantly while still delivering a pleasing visual experience for players. Currently, the feature is only available on Windows with DX12 and GPUs that support VRS Tier 2.
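To make the idea concrete, below is a minimal sketch of how a VRS-style foveated shading-rate map might be built: full shading rate near the gaze point, coarser rates toward the periphery. The tile size, eccentricity radii, resolution, and function names are illustrative assumptions, not NVIDIA's or Epic's actual implementation.

```python
import math

# Illustrative VRS-style shading-rate map: shade every pixel near the gaze
# point, and shade progressively coarser blocks (2x2, 4x4 pixels per sample)
# farther out. All thresholds below are made-up values for illustration.
TILE = 16  # screen-space tile size in pixels (VRS Tier 2 hardware reports its own tile size)

def shading_rate(tile_center, gaze, full_radius=200, mid_radius=500):
    """Return the shading rate for one tile: 1 = 1x1, 2 = 2x2, 4 = 4x4 pixels per sample."""
    dist = math.dist(tile_center, gaze)
    if dist < full_radius:
        return 1   # foveal region: full resolution
    if dist < mid_radius:
        return 2   # near periphery: a quarter of the shading work
    return 4       # far periphery: one sixteenth of the shading work

def build_rate_map(width, height, gaze):
    """Build a per-tile shading-rate map for one eye, given the current gaze point."""
    cols, rows = width // TILE, height // TILE
    return [[shading_rate(((c + 0.5) * TILE, (r + 0.5) * TILE), gaze)
             for c in range(cols)]
            for r in range(rows)]

# Example: a 2448 x 2448 per-eye image (Vive Pro 2-class) with the gaze at the center.
rate_map = build_rate_map(2448, 2448, gaze=(1224, 1224))
```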
Compared with NVIDIA's earlier foveated rendering technologies (VRS and VRSS 1), the latest VRSS 2 can supersample the region around the user's actual gaze point rather than only a fixed central area, producing a noticeably sharper image at the center of the user's vision and keeping high-quality rendering smooth. This delivers a more immersive experience for VR users, especially on high-resolution headsets.
Beyond the quality and performance gains, it is worth noting that VRSS 2 requires no extra work from content developers: all foveated-rendering-related work is handled by NVIDIA and the eye-tracking provider. This deeper collaboration among Qixinyiwei, HTC, and NVIDIA means that any VR content running on a Vive Pro 2 equipped with Qixinyiwei eye tracking and an NVIDIA GPU that supports VRSS 2 can benefit from foveated rendering.
According to reports, HTC's new VR headset reaches a 120-degree field of view, while the Pimax 8K X reaches a 200-degree diagonal field of view (roughly 170 degrees horizontal) with a per-eye resolution of 3840 × 2160, marketed as 8K across both eyes. Combined with rumors that Apple's upcoming headset will offer 8K per eye, it is clear that high resolution and a wide field of view are where VR hardware is headed.
Based on the physiological characteristics of the human eye, matching retinal resolution across a horizontal field of view of 200-220 degrees requires about 60 PPD (pixels per degree), which works out to roughly 200° × 60 PPD = 12,000 horizontal pixels. In other words, a VR headset needs on the order of 12K resolution to truly satisfy the human eye.
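As a quick sanity check on the arithmetic above (a back-of-the-envelope sketch: the 60 PPD figure and the 200-220-degree range come from the text, the 120-degree comparison is an added example, and `pixels_needed` is a hypothetical helper):

```python
def pixels_needed(fov_degrees: float, ppd: float) -> int:
    """Pixels required along one axis to hit a target pixel density over a given FOV."""
    return round(fov_degrees * ppd)

RETINAL_PPD = 60   # approximate limit of human visual acuity, as cited above

print(pixels_needed(200, RETINAL_PPD))  # 12000 -> roughly "12K" horizontally
print(pixels_needed(120, RETINAL_PPD))  # 7200, for a 120-degree headset like Vive Pro 2
```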
Facing such resolution demands, no GPU can brute-force the rendering workload without foveated rendering. This is especially true for standalone (all-in-one) VR headsets, whose mobile chipsets have limited rendering power; running AAA-class VR content on current standalone devices is nearly impossible. Eye tracking and foveated rendering will therefore become standard features of future VR products, making high-resolution, wide-field-of-view experiences achievable within a limited compute budget.
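To give a rough sense of why brute-force rendering does not scale to that target, the following back-of-the-envelope comparison uses assumed figures (60 PPD over a 200° × 130° field of view, 90 Hz refresh, and a 4K/60 monitor as the baseline); none of these numbers come from the article, and real workloads depend heavily on content and engine.

```python
# Pixel-throughput comparison under the assumptions stated above.
ppd, h_fov, v_fov, refresh_hz = 60, 200, 130, 90

per_eye_pixels = (h_fov * ppd) * (v_fov * ppd)          # ~93.6 million pixels per eye
vr_pixels_per_sec = per_eye_pixels * 2 * refresh_hz     # both eyes, every frame

baseline_pixels_per_sec = 3840 * 2160 * 60              # a 4K monitor at 60 Hz

print(vr_pixels_per_sec / baseline_pixels_per_sec)      # roughly 34x the pixel throughput
```

Foveated rendering attacks exactly this gap: only the small foveal region is shaded at full rate, so the effective workload stays far below the naive pixel count.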
Beyond foveated rendering for VR and AR headsets, Qixinyiwei is also working with major Chinese internet companies on cloud XR, using cloud-side foveated rendering and foveated streaming to reduce rendering latency and transmission bandwidth, thereby improving the cloud XR user experience.
Going forward, Qixinyiwei will continue to focus on foveated rendering technology, offering better solutions and smoother, more natural experiences to VR/AR headset makers, content developers, and end users.