The Meta XR Simulator makes it easier to develop VR and MR applications for Meta Quest by reducing the need to put on the headset.
Compared with traditional flat-screen development, VR development requires extra effort: developers have to put on a headset instead of simply looking at a monitor. Meta XR Simulator is designed to minimize this overhead and shorten iteration times.
With the tool, developers can quickly test the mechanics, design, and user experience of their VR app without a physical headset. Meta XR Simulator can simulate various Meta Quest devices and Meta APIs, as well as headset movement and Touch controller input, using a keyboard and mouse or a gamepad.
Many settings for developers
Meta XR Simulator can simulate the field of view and resolution of the selected Quest device, allowing developers to get an idea of how content will look in VR depending on the device. There is also a solution for social VR apps: the tool can simulate multiple VR users on a single machine to test interactions between them.
Finally, Meta XR Simulator also supports mixed reality development. It does this by loading artificial environments that represent the physical world and simulating Meta’s mixed reality interfaces such as passthrough and spatial anchors.
The tool was previewed at last year’s Meta Connect and moved out of experimental status at this year’s conference. During the announcement, Meta emphasized that, for all its advantages, the simulator cannot and should not fully replace testing on physical headsets.
Meta XR Simulator supports Unity, Unreal Engine, and native development. More information is available in Meta’s developer resources; announcements about this and other new developer tools appear on the Meta Quest Developer Blog.
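Because the simulator plugs in at the OpenXR level rather than per engine, one generic way to point a native OpenXR app at a simulator-style runtime is the OpenXR loader's `XR_RUNTIME_JSON` environment variable, which overrides the system's active runtime. The sketch below assumes a standard OpenXR loader; the manifest filename and install path are hypothetical examples, not Meta's official locations, and engine integrations (Unity, Unreal) typically handle this switch for you.

```shell
# Hypothetical install location of the simulator's OpenXR runtime manifest.
# (Illustrative path only -- check Meta's developer docs for the real one.)
SIMULATOR_MANIFEST="$HOME/MetaXRSimulator/meta_openxr_simulator.json"

# The OpenXR loader honors XR_RUNTIME_JSON as an active-runtime override,
# so the app loads the simulated runtime instead of the headset's.
export XR_RUNTIME_JSON="$SIMULATOR_MANIFEST"

# Launch the native OpenXR app as usual (placeholder binary name):
# ./MyVRApp
```

Unsetting `XR_RUNTIME_JSON` restores the system default, so switching between the simulator and a real device does not require rebuilding the app.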