Purpose: The XR system provides access to device details through InputTracking, XRSettings, XRNodes, and the TrackedPoseDriver.

Test Setup: Input and device information is verified by querying the XR system for device specifics, testing generic button and axis functionality, visualizing XRNode states, and using the TrackedPoseDriver. The tests are grouped into four sections.

Test Method: Perform the tests by following the instructions in the four sections below.

Input Tracking and XR Settings

XR SDK and joystick information is displayed directly ahead in the scene. Verify that the Loaded Device field shows the correct XR SDK. The Supported Devices list should contain all XR SDKs included in this build. The Joystick Names list should contain every currently attached joystick, gamepad, and controller, both XR and non-XR.
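
For reference, the values on this panel can be read straight from the scripting API. The following is a minimal sketch of how the Loaded Device, Supported Devices, and Joystick Names fields might be gathered for display; the class name is illustrative and the test scene's actual implementation may differ.

```csharp
using System.Text;
using UnityEngine;
using UnityEngine.XR;

// Illustrative readout of the Loaded Device, Supported Devices, and Joystick Names fields.
public class XRSettingsReadout : MonoBehaviour
{
    void Start()
    {
        var sb = new StringBuilder();

        // The XR SDK that is currently loaded (empty string if none).
        sb.AppendLine("Loaded Device: " + XRSettings.loadedDeviceName);

        // All XR SDKs included in this build.
        sb.AppendLine("Supported Devices: " + string.Join(", ", XRSettings.supportedDevices));

        // Every joystick, gamepad, or controller currently attached, XR or not.
        sb.AppendLine("Joystick Names: " + string.Join(", ", Input.GetJoystickNames()));

        Debug.Log(sb.ToString());
    }
}
```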

Input Visualization, Generic Buttons, and Axes

The Generic Buttons and Axes panel is located above the Input Tracking and XR Settings panels. Verify that actuating the buttons, thumbsticks, touchpads, and triggers on an XR controller causes the appropriate response from the corresponding button or axis. Platform-specific bindings are listed at https://docs.unity3d.com/Manual/XR.html for comparison.
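
For reference, a minimal sketch of how generic buttons and axes can be polled through the Input Manager. The axis and button names below ("TriggerAxis", "PrimaryButton") are placeholders: they must be defined under Edit > Project Settings > Input and mapped to the platform-specific bindings from the manual page above.

```csharp
using UnityEngine;

// Illustrative per-frame polling of one generic axis and one generic button.
// The names below are placeholders and must exist in the Input Manager.
public class GenericInputProbe : MonoBehaviour
{
    void Update()
    {
        float trigger = Input.GetAxis("TriggerAxis");
        if (trigger > 0.0f)
            Debug.Log("Trigger axis: " + trigger);

        if (Input.GetButtonDown("PrimaryButton"))
            Debug.Log("Primary button pressed");
    }
}
```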

Node States

Orange cubes should appear to the left of the Tracking Input panel. The configuration of these cubes should mirror the current physical VR setup. These cubes represent XR Nodes queried from UnityEngine.XR.InputTracking.GetNodeStates(). Take note of the red, green, and blue axes and ensure that the origin of the axes matches the expected origin for the platform, either on the ground or where the head node visualizers begin during scene initialization. Tracking references such as tracking cameras or Vive Lighthouses are also visualized for some platforms.
Compare the visual representation to the physical orientation of the hardware devices. Verify that the device’s head, left eye, right eye, and center eye appear and track correctly. If tracking references exist, then ensure that they appear and track correctly. Finally, manipulate the controllers to verify that they appear and track as expected.
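
For reference, a minimal sketch of this kind of node visualization, assuming one cube per node returned by GetNodeStates(); class and field names are illustrative and the test scene's actual implementation may differ.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Illustrative node-state visualizer: one cube per XR node, keyed by uniqueID so
// that several nodes of the same type (e.g. multiple tracking references) coexist.
public class NodeStateVisualizer : MonoBehaviour
{
    readonly List<XRNodeState> m_States = new List<XRNodeState>();
    readonly Dictionary<ulong, Transform> m_Cubes = new Dictionary<ulong, Transform>();

    void Update()
    {
        InputTracking.GetNodeStates(m_States);

        foreach (XRNodeState state in m_States)
        {
            Transform cube;
            if (!m_Cubes.TryGetValue(state.uniqueID, out cube))
            {
                cube = GameObject.CreatePrimitive(PrimitiveType.Cube).transform;
                cube.localScale = Vector3.one * 0.05f;
                cube.SetParent(transform, false); // parent under the tracking-space origin
                m_Cubes[state.uniqueID] = cube;
            }

            // Poses are reported in tracking space, relative to the platform's origin
            // (the floor or the initial head pose, depending on the SDK).
            Vector3 position;
            if (state.TryGetPosition(out position))
                cube.localPosition = position;

            Quaternion rotation;
            if (state.TryGetRotation(out rotation))
                cube.localRotation = rotation;
        }
    }
}
```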

Tracked Pose Driver

Orange spheres should appear to the right of the Tracking Input panel. The configuration of these spheres should mirror the current physical VR setup. These spheres represent physical devices controlled by the TrackedPoseDriver component. Take note of the red, green, and blue axes and ensure that the origin of the axes matches the expected origin for the platform, either on the ground or where the head node visualizers begin during scene initialization. Compare the visual representation to the physical orientation of the hardware devices. A visual representation will only exist for one of each of the following: left controller, right controller, left eye, right eye, center eye, head, color camera, and device pose. If a node is unused by the current XR SDK configuration, it will either be positioned at the origin or alias to a similar device.
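
For reference, a minimal sketch of driving one such sphere, assuming the 2018-era UnityEngine.SpatialTracking.TrackedPoseDriver component; the left-controller pose and the class name are illustrative only.

```csharp
using UnityEngine;
using UnityEngine.SpatialTracking;

// Illustrative setup: spawn a sphere and let a TrackedPoseDriver move it with
// the left controller's pose.
public class TrackedSphereExample : MonoBehaviour
{
    void Start()
    {
        GameObject sphere = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        sphere.transform.localScale = Vector3.one * 0.05f;
        sphere.transform.SetParent(transform, false); // parent under the tracking-space origin

        TrackedPoseDriver driver = sphere.AddComponent<TrackedPoseDriver>();
        driver.SetPoseSource(TrackedPoseDriver.DeviceType.GenericXRController,
                             TrackedPoseDriver.TrackedPose.LeftPose);
        driver.trackingType = TrackedPoseDriver.TrackingType.RotationAndPosition;
        driver.updateType = TrackedPoseDriver.UpdateType.UpdateAndBeforeRender;
    }
}
```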
