D2. Navigating in VR
Learning Outcomes
- Implement spatial awareness features in VR. Before class, review proxemics and build a Unity script using Physics.OverlapSphere() to detect when objects enter a defined radius, considering how proximity impacts user comfort. Also practice positioning GameObjects using world coordinates, and test both local and world rotations with transform.Rotate() using Space.Self and Space.World.
- Set up and prepare locomotion systems in VR. As preparation, configure a Unity scene with the XR Interaction Toolkit and an XR Rig, review the rig's hierarchy, and preview teleportation and snap turn setup so it's ready for in-class locomotion testing.
- Optimize a Unity project for VR performance. Ahead of the session, enable Static Batching and GPU Instancing, mark non-moving objects as Static, and use the Frame Debugger and Stats window to monitor batching, triangle count, and draw calls—aiming for under 500k triangles and 175 batches for smoother standalone VR performance.
Spatial Awareness
Spatial awareness in VR refers to the user’s ability to perceive and understand the positions, orientations, and relationships of objects—and of themselves—within a 3D virtual environment. This awareness is driven by multisensory input, including visual, auditory, and haptic cues, and is enhanced by technologies such as head tracking, hand/controller tracking, spatialized sound, and real-time feedback. To support this immersive perception, VR environments rely on a set of coordinate systems—including world, local, and camera-relative frames of reference—that define how objects are positioned, moved, and interacted with. Understanding these coordinate systems is fundamental to designing VR experiences that feel spatially coherent, responsive, and believable.
Coordinate Systems in VR
In VR development, coordinate systems provide the framework for positioning, orienting, and scaling every element in your virtual world. A deep understanding of coordinate systems in VR is critical for creating immersive and technically robust experiences. By mastering the interplay between world, local, and view coordinate systems—and understanding how Unity’s left-handed system operates—you can ensure that every object in your virtual world is placed accurately and behaves consistently. Coordinate systems are essential for ensuring that virtual objects maintain their intended spatial relationships as the user moves. Accurate positioning and orientation of objects help the user feel grounded within a believable, interactive space, and proper management of coordinate systems minimizes errors like drift or scaling mismatches that can break immersion. Unity uses a left-handed coordinate system for its 3D space. Here’s a breakdown of how the axes are oriented:
- X-axis: Represents horizontal movement, with the positive direction typically pointing to the right.
- Y-axis: Represents vertical movement, with the positive direction pointing upwards.
- Z-axis: Represents depth, with the positive direction typically pointing forward (away from the viewer) in Unity's left-handed coordinate system.

A left-handed coordinate system uses the left hand for the axis orientation. Here, the positive Z axis points away from the viewer. Unity uses a left-handed system for its 3D world. This means when you specify a position like (X, Y, Z), the Z value increases as objects move away from the camera. Knowing your system's handedness helps in importing assets, adjusting lighting, and ensuring proper camera behavior.
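For example, the following minimal snippet (the component name and values are illustrative) places an object 1.6 m above the world origin and 2 m along the positive Z axis, i.e., farther from the default camera position in Unity's left-handed system.

```csharp
using UnityEngine;

public class PlaceInWorld : MonoBehaviour
{
    void Start()
    {
        // 1.6 m above the world origin and 2 m along +Z (forward).
        // A larger Z value means farther from the default camera position.
        transform.position = new Vector3(0f, 1.6f, 2f);
    }
}
```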
World Coordinate System
The world coordinate system is the global reference frame for the entire scene. It defines a fixed, universal orientation across the entire Unity environment. The world origin (0, 0, 0)
is typically set near the player’s starting point or a defined central location in the virtual factory. This origin serves as the anchor point from which all objects’ positions, directions, and distances are ultimately measured. The world coordinate system uses a left-handed axis convention in Unity, where:
- Positive X points right
- Positive Y points up
- Positive Z points forward (away from the viewer)
These fixed world directions remain constant, regardless of an object’s orientation or parent-child relationships. This system ensures a consistent and shared spatial layout, enabling accurate placement, navigation, and alignment of objects throughout the scene.
In the logistics station, select a drone or scanner. Toggle between Global and Local handle modes in the Unity toolbar, then move the object using the Move tool (W). In Global mode, the arrows follow the fixed world axes (X: red, Y: green, Z: blue), while in Local mode, they align to the object's current rotation. This clearly shows how world and local directions differ.
Local Coordinate System
Every GameObject in Unity also has a local coordinate system—its own frame of reference that travels and rotates with it. This system defines the object’s orientation and position relative to itself, rather than the world. Local axes are especially important when dealing with hierarchical relationships, such as child objects attached to a parent, or when animating complex, articulated mechanisms. In local space:
- The X axis represents the object's right direction
- The Y axis represents the object's up direction
- The Z axis represents the object's forward direction
These directions are influenced by the object’s rotation and are used in calculations like local movement, local rotation, and parenting. This makes local coordinates essential for precise control of movement and orientation, particularly in robotic arms, vehicles, characters, and modular machines.
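To experiment with this before class (as suggested in the learning outcomes), the sketch below contrasts local and world rotation and movement using transform.Rotate() and transform.Translate() with Space.Self and Space.World. The component name and values are illustrative.

```csharp
using UnityEngine;

public class LocalVsWorldDemo : MonoBehaviour
{
    [SerializeField] float turnSpeed = 30f;  // degrees per second
    [SerializeField] float moveSpeed = 0.5f; // meters per second

    void Update()
    {
        // Rotate around the object's own Y axis (local space).
        transform.Rotate(0f, turnSpeed * Time.deltaTime, 0f, Space.Self);

        // Swap to Space.World to rotate around the fixed world Y axis instead:
        // transform.Rotate(0f, turnSpeed * Time.deltaTime, 0f, Space.World);

        // Translate along the object's own forward direction, not world +Z.
        transform.Translate(Vector3.forward * moveSpeed * Time.deltaTime, Space.Self);
    }
}
```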
In the exhibit station, select one of the leg joints on the quadruped robot (Spot). Use the Rotate tool (E) and toggle between Local and Global modes. When you rotate the joint in Local mode, the rotation follows the natural direction of the joint, consistent with the leg's intended movement. In Global mode, the gizmo remains aligned with the world axes, making the rotation feel off or unnatural for articulated motion.
Camera Coordinate System
The camera (or headset) coordinate system is dynamic—it updates in real time with the user’s head position and orientation. In XR, the camera becomes the user’s point of view, serving as the basis for rendering the scene and positioning UI or interactive elements. Key spatial behaviors and scenarios include:
- World-Locked Content: Objects are fixed in the environment and do not follow the user (i.e., the camera). Ideal for props, machines, landmarks, or spatial UI that needs to stay consistent as the user moves around. For example, the CNC machines in the manufacturing station remain anchored regardless of user movement.
- Head-Locked (View-Fixed) Content: Objects move with the user's head, maintaining a fixed offset in view space. Common for HUDs, notifications, or tooltips. Use sparingly to avoid motion sickness or visual overload (see the sketch after this list).
- Seated Experiences: The user is expected to stay seated. The world origin is usually aligned with the user's initial head position, making camera-relative offsets predictable in limited physical space. This is suitable for simulations or control panels where physical movement is minimal.
- Standing Experiences: The user stands and may walk or move within a physical play area. The world origin is aligned with the floor plane, and content should be positioned with physical scale and posture in mind. This is useful for interactive demos or exploring full-scale scenes like XFactory's exhibit zone.
- Hybrid Experiences: Some setups blend seated and standing interactions—for example, a user may stand to explore content but sit to operate a virtual machine. In XFactory's exhibit station, users may stand to explore AR/VR displays but sit for testing simulations, requiring flexible origin handling.
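As a simple illustration of head-locked content, the sketch below keeps a HUD object at a fixed offset in view space by following the rig's Main Camera every frame. The field names and offset values are assumptions for demonstration.

```csharp
using UnityEngine;

public class HeadLockedHud : MonoBehaviour
{
    [SerializeField] Transform headCamera;  // assign the XR Rig's Main Camera
    [SerializeField] Vector3 viewOffset = new Vector3(0f, -0.2f, 1.5f); // offset in view space

    void LateUpdate()
    {
        // Re-position the HUD after head tracking has updated the camera.
        transform.position = headCamera.TransformPoint(viewOffset);
        transform.rotation = headCamera.rotation;
    }
}
```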
Drifts may happen as a result of gradual inaccuracies in the system’s understanding of position over time. Techniques such as periodic re-centering of the world origin help minimize drift. Regular calibration ensures that long-duration VR sessions maintain positional accuracy and prevent the feeling of disorientation.
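One way to trigger such re-centering from script is through the XR input subsystem, as in the hedged sketch below. This assumes an XR Plug-in Management provider that supports recentering; some runtimes (e.g., on standalone headsets) handle recentering at the system level instead.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class RecenterOnDemand : MonoBehaviour
{
    // Call this from a menu button or input action to re-center tracking.
    public void Recenter()
    {
        var subsystems = new List<XRInputSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);
        foreach (XRInputSubsystem subsystem in subsystems)
        {
            // TryRecenter returns false if the active provider does not support it.
            bool recentered = subsystem.TryRecenter();
            Debug.Log("Recenter requested: " + recentered);
        }
    }
}
```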
Proxemics in VR
While coordinate systems define the logical structure of space, proxemics introduces the human dimension—how people perceive and react to spatial proximity. Proxemics is the study of personal space and social distance, and in virtual reality, it plays a crucial role in shaping interactions that feel natural, respectful, and emotionally comfortable. Designers must account not only for functional placement but also for the psychological boundaries users instinctively maintain. Respecting these spatial expectations can significantly enhance immersion and reduce discomfort. The zones of personal space in VR include:
- Intimate Space (0–0.5m): Reserved for close personal interactions, such as handoffs, high-trust exchanges, or haptic feedback experiences. Avoid placing avatars, tools, or UI elements in this space unless the user has explicitly initiated the interaction.
- Personal Space (0.5–1.5m): Ideal for one-on-one conversations, small-scale tasks, or interactive UI panels. This is the user's primary comfort zone—frequently used for interaction and control, so intrusions should be intentional and minimal.
- Social Space (1.5–4m): Best suited for group interactions, multiplayer collaboration, or shared environments. Arrange participants or elements to encourage communication while preserving personal boundaries.
- Public Space (4m+): Appropriate for presentations, lectures, or passive observation. Use this zone for content delivery that doesn't require direct engagement, such as signage, broadcasts, or performances.
Designing with proxemics in mind ensures that your VR experiences are not only technically accurate—but also socially intuitive. Always test interactions at different distances and in varied user postures (e.g., standing, seated, crouching). Pay attention to how users respond to avatar proximity, reachability, and UI positioning. To reduce discomfort and maintain boundaries, consider subtle visual cues like fading, blurring, or scaling of objects that breach a user’s personal space. In XFactory, for example, invisible boundaries or warning overlays are essential around active machinery like CNC mills or robotic arms—balancing safety with immersion.
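A simple way to detect such intrusions at runtime is to check a radius around the user each frame with Physics.OverlapSphere(), as in the sketch below. The component name, radius, and layer mask are illustrative, and the response is left as a log message; attach it to the rig's Main Camera so the check is centered on the user's head.

```csharp
using UnityEngine;

public class PersonalSpaceMonitor : MonoBehaviour
{
    [SerializeField] float personalRadius = 1.5f;   // personal space boundary in meters
    [SerializeField] LayerMask watchedLayers = ~0;  // layers to check against

    void Update()
    {
        // Collect every collider currently inside the user's personal space.
        Collider[] hits = Physics.OverlapSphere(transform.position, personalRadius, watchedLayers);
        foreach (Collider hit in hits)
        {
            if (hit.transform.root == transform.root)
                continue; // ignore the user's own rig

            Debug.Log(hit.name + " entered personal space (" + personalRadius + " m).");
            // A real implementation might fade, blur, or scale the intruding object here.
        }
    }
}
```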
Locomotion in VR
Locomotion is the method by which a user moves through and interacts with a virtual environment. In VR, effective locomotion is critical to creating an immersive, natural, and comfortable experience. Unity’s XR Interaction Toolkit
provides a range of locomotion primitives—including teleportation, snap turning, continuous turning, and continuous movement—that can be tailored to suit different applications. Key considerations in designing VR locomotion include:
- User Comfort and Safety: Design locomotion to minimize motion sickness and disorientation. Use techniques like snap turning and teleportation for rapid repositioning while considering ergonomic factors for extended VR sessions. Ensure that only one locomotion method is active at a time via a centralized system to prevent conflicting commands.
- Centralized Input Handling and Interaction Safety: Favor action-based input over device-based input to simplify control mapping across diverse VR hardware. A centralized locomotion system not only prevents simultaneous actions but also provides a consistent, predictable user experience by handling input conflicts effectively.
- Coordinate Space Management and Transformations: Precisely manage conversions between device, local, and world coordinate spaces. Accurate transformation is essential for maintaining spatial consistency, immersion, and ensuring that physical movements are correctly mirrored in the virtual environment.
- Modularity, Scalability, and Flexibility in Movement: Build locomotion systems with modularity in mind by using simple placeholder assets that can be upgraded as needed. This approach supports scalability and allows developers to tailor the locomotion behavior—whether through rapid teleportation or immersive continuous movement—to suit the specific needs of different VR applications.
In XFactory, users can explore the factory floor to observe different stations. Continuous movement lets them walk beside a moving forklift, while teleportation allows them to quickly reposition near a drone landing pad or machinery entrance. Snap turning allows the user to interact with screens or robots placed in a circular formation without physical turning.
XR Rig and Teleportation
The XR Rig
is a MonoBehaviour
that defines the user’s presence and interaction point in the virtual environment. It encapsulates the tracking origin, camera, and input controllers, and is the primary target manipulated by locomotion systems such as teleportation or continuous movement. In the context of teleportation, the XR Rig
is crucial because teleportation involves instantaneously repositioning the user within the virtual world. This repositioning is applied to the Rig’s root GameObject, ensuring that all child objects—especially the head (camera) and hands (controllers)—move together coherently. Our imported XR Rig
has the following structure:

```
XR Rig (Root)
└── Camera Offset
    ├── Main Camera
    ├── LeftHand Controller
    └── RightHand Controller
```
- XR Rig (Root): This is the locomotion target. When teleportation occurs, the Teleportation Provider applies the position and rotation changes directly to this GameObject.
- Camera Offset: This intermediary object allows vertical alignment of the headset (camera) relative to the play area. This is useful for accounting for standing, seated, or room-scale setups.
- Main Camera: Represents the user's head. Its position and rotation are driven by the physical headset, and it is critical for determining teleportation direction and destination.
- LeftHand Controller / RightHand Controller: These contain interaction components like XR Ray Interactor or XR Direct Interactor, enabling users to point to teleportation targets or interact with objects.
When using the XR Interaction Toolkit, teleportation is usually enabled via an XR Ray Interactor (typically on the left or right controller) combined with a Teleportation Provider. Upon activation (e.g., by pressing a button), the rig is moved to the selected point on a teleportation surface.
Locomotion Mediator
The Locomotion Mediator
is a centralized component that coordinates access to the XR Rig
for various locomotion methods. It ensures that only one locomotion provider—such as teleportation, snap turn, or continuous movement—is actively controlling the XR Rig
at any given time. By mediating between multiple input-driven locomotion requests, the Locomotion Mediator
maintains consistency, prevents conflicting movement commands, and helps deliver a smooth and predictable user experience. This coordination is critical in complex VR scenarios where multiple interaction modes may be available simultaneously.
In earlier versions of Unity, this component was referred to as the Locomotion System. Unity 6 replaces it with the Locomotion Mediator to better reflect its role in managing movement requests.
XR Body Transformer
The XR Body Transformer
is a required component when using the Locomotion Mediator
. It is responsible for applying position and rotation changes to the XR Rig
in response to locomotion inputs like teleportation or turning. This component acts as the physical mover—once the Locomotion Mediator
grants access to a locomotion provider, the provider delegates movement operations to the XR Body Transformer
. This separation of concerns helps ensure that movement logic remains modular and extensible.
The XR Body Transformer is typically attached to the same GameObject as the XR Rig and must be present for the Locomotion Mediator to function correctly.
Locomotion Providers
These modular components implement specific movement behaviors including teleportation, snap turning, continuous turning, and continuous movement. Each provider extends a common abstract class and is designed to request and relinquish exclusive control over the XR Rig via the Locomotion Mediator. Their modular design allows developers to mix and match movement types based on application needs. Key locomotion providers in the XR Interaction Toolkit include the Continuous Turn Provider, Continuous Move Provider, Snap Turn Provider, and Teleportation Provider.
Continuous Turn Provider
The Continuous Turn Provider
rotates the XR Rig
smoothly over time based on user input. While it offers a fluid rotation experience, it must be carefully calibrated since overly rapid or prolonged turning can cause discomfort or motion sickness. Developers often tie this to the horizontal axis of the right-hand joystick and adjust turn speed to strike a balance between responsiveness and comfort. It is best used in open spaces where users need to orient themselves frequently without relying on teleportation. Follow the instructions below to set it up.
- Attach the Continuous Turn Provider:
  - Select your XR Rig GameObject.
  - In the Inspector, add the Continuous Turn Provider component.
  - In the Mediator field, assign the existing Locomotion Mediator component in the scene. This ensures the turning provider cooperates with other movement systems like teleportation or continuous movement.
  - Ensure the XR Body Transformer is also attached to the same GameObject as the XR Rig. It is responsible for applying the rotational changes generated by the turn provider.
- Configure Turning Parameters:
  - Transformation Priority: Set this to 0. It is an integer value that defines the execution order of movement or rotation operations.
  - Turn Speed: Set the speed of rotation in degrees per second (e.g., 60 for a smooth turning feel).
- Set Up Input Actions:
  - The Left Hand Turn Input and Right Hand Turn Input fields let you assign float-based input actions for smooth turning from each controller's joystick.
  - In the Continuous Turn Provider, click the circle (object picker) icon next to the Right Hand Turn Input field.
  - In the search window that appears, type "turn" and select XR RightHand Locomotion/Turn from the list.
  - Leave the Left Hand Turn Input field empty unless you want to enable turning from the left joystick as well.
- Run the Scene:
  - On Windows, connect your headset via the Link app and enter Play mode to test directly in the Editor.
  - On Mac, build and deploy the project to your headset using the Android build pipeline, then run it on the device to test.
- Observe and Tune:
  - In the Scene view, watch the XR Rig rotate around its vertical axis (Y-axis) and verify it's pivoting from the base.
  - Adjust the Turn Speed in the Continuous Turn Provider for more comfortable or responsive rotation.
- Debug Behavior:
  - Use Debug.Log() to track rotation: Debug.Log("XR Rig Y-Rotation: " + transform.eulerAngles.y);
  - Confirm that smooth turning functions independently of other locomotion methods like movement or teleportation.
  - If rotation feels misaligned, verify the XR Rig's structure and ensure input actions are assigned correctly.
Continuous Move Provider
The Continuous Move Provider
gradually translates the XR Rig
in response to joystick input, allowing users to move smoothly through the virtual world. This form of locomotion mimics real-world walking and enhances immersion, but it must be tuned carefully to avoid causing discomfort. Movement typically follows the headset’s facing direction and supports both forward and strafing (sideways) motion. It’s ideal for exploration-based applications where natural, fluid movement is critical. Follow the steps below to set it up.
- Attach the Continuous Move Provider:
  - Select the XR Rig GameObject in your scene.
  - In the Inspector, add the Continuous Move Provider component.
  - In the Mediator field, assign the existing Locomotion Mediator in the scene. This ensures the movement system cooperates with other locomotion providers like turning or teleportation.
  - Make sure the XR Body Transformer is attached to the same GameObject as the XR Rig. It applies the translation updates triggered by the move provider.
- Configure Movement Parameters:
  - Transformation Priority: Set this to 1 to define when this movement is applied relative to other providers (e.g., after turning).
  - Move Speed: Set the speed of movement in units per second (e.g., 1.5 for walking pace).
  - Strafe: Enable this to allow lateral (sideways) movement from joystick input.
  - Forward Source: Assign the Main Camera (within the XR Rig) so that forward input aligns with the user's current gaze direction.
- Set Up Input Actions:
  - The Move field expects a 2D Vector input, typically from the left-hand joystick.
  - Click the circle (object picker) icon next to the LeftHand Move Input field.
  - In the search window, type "move" and select XR LeftHand Locomotion/Move from the list.
  - Ensure that this action is a 2D Vector bound to the X and Y axes of the left joystick (horizontal = strafe, vertical = forward/backward).
- Run the Scene:
  - On Windows, connect your headset via the Link app and enter Play mode to test directly in the Editor.
  - On Mac, build and deploy the project to your headset using the Android build pipeline, then run it on the device to test.
- Observe and Tune:
  - In the Scene view, move using the left-hand joystick and observe the XR Rig translating in space.
  - Confirm that forward movement matches the headset's orientation and that the rig moves at a comfortable speed.
  - Tweak the Move Speed and Strafe settings for responsiveness and comfort.
- Debug Movement Behavior:
  - Use Debug.Log() to print the XR Rig's position during play: Debug.Log("XR Rig Position: " + transform.position);
  - Confirm that movement is applied in world space and correctly translates joystick input to the expected direction and magnitude.
  - If movement appears off-angle, double-check that the Forward Source is correctly referencing the Main Camera.
Snap Turn Provider
The Snap Turn Provider
rotates the XR Rig
by a fixed angle—such as 45° or 90°—whenever a valid turn input is detected. Because it avoids smooth, continuous rotation, snap turning reduces visual motion and lowers the risk of motion sickness. It is especially useful in confined or object-dense environments, where quick reorientation is needed without displacing the user. Snap turn is commonly paired with teleportation for maximum comfort and usability. Here is how it is set up.
- Attach the Snap Turn Provider:
  - Select your XR Rig GameObject.
  - In the Inspector, add the Snap Turn Provider (Action-based) component.
  - Assign the existing Locomotion Mediator to the Mediator field to ensure that the snap turn provider integrates smoothly with other locomotion systems.
  - Confirm that an XR Body Transformer component is attached to the same GameObject as the XR Rig. This component handles the actual rotational transformation requested by the snap turn provider.
  - If your XR Rig already includes a Continuous Turn Provider, be sure to disable or remove it. Only one turn provider should be active at a time to avoid conflicts between smooth and snap turning behaviors.
- Configure Snap Turn Parameters:
  - Transformation Priority: Set this to 0 unless another provider (like movement) needs to apply first. Lower values apply earlier.
  - Turn Amount: Choose the angular step for each snap rotation. Typical values are 45 degrees (more granular) or 90 degrees (coarser, faster rotation).
  - Debounce Time: Set a delay (e.g., 0.5 seconds) between accepted inputs to avoid accidental rapid turning.
  - Enable Turn Left Right: Enable it to allow left and right snap turns.
  - Enable Turn Around: Optionally enable this if you want to support a 180° turn action using a separate input. This is useful in tight or back-facing scenarios.
- Assign Input Actions:
  - The provider includes Left Hand Turn Input and Right Hand Turn Input fields, each expecting a float-based Input Action (not a 2D Vector).
  - Click the circle (object picker) icon next to the Right Hand Turn Input field.
  - In the search window, type "turn" and select XR RightHand Locomotion/Snap Turn.
  - Leave the Left Hand Turn Input empty unless you wish to support turning via the left-hand joystick.
  - Ensure your input action is a Value type bound to the joystick X-axis (left = negative, right = positive).
- Run the Scene:
  - On Windows, connect your headset via the Link app and enter Play mode to test directly in the Editor.
  - On Mac, build and deploy the project to your headset using the Android build pipeline, then test it on the standalone device.
- Evaluate Snap Turn Behavior:
  - Use the joystick to perform snap turns and verify that the XR Rig rotates instantly by the set angle.
  - Observe whether the rotation happens cleanly around the rig's Y-axis and does not affect the camera height or controller offsets.
  - Adjust the Turn Amount for the right balance between speed and control based on your scene's layout (e.g., tighter turns in small spaces).
- Refine Timing and Feedback:
  - If the turn feels too rapid or unintentional, increase the Debounce Time to filter out noise or over-triggering.
  - Optionally, add haptic feedback, sound effects, or UI flashes when a turn is performed to improve user awareness.
  - Visual cues—such as a directional arc or quick snap effect—can help users stay oriented after turning in environments with similar or repetitive geometry.
Teleportation Provider
The Teleportation Provider
allows users to instantly relocate the XR Rig
to a chosen position in the virtual environment. This method avoids the sensory conflict caused by continuous movement, making it highly effective at minimizing motion sickness. It typically works with Teleportation Area
and Teleportation Anchor
interactables, and is triggered via a Ray Interactor
on the controller. Teleportation is ideal for large environments, precision positioning, or setups where natural walking isn't practical. To set up teleportation, follow the steps below.
- Attach the Teleportation Provider:
  - Select your XR Rig GameObject in the scene.
  - In the Inspector, add the Teleportation Provider component.
  - In the Mediator field, assign the Locomotion Mediator from your scene. This ensures teleportation integrates cleanly with other movement types and manages access to the XR Rig without conflict.
  - Verify that the same GameObject also includes an XR Body Transformer. This component is responsible for applying the actual transformation—teleporting the user from one location to another.
- Configure Teleportation Settings:
  - Transformation Priority: Set this to 0 to apply this movement before other providers.
  - Delay Time: Optionally delay teleport execution (e.g., for fade-out effects).
- Assign Controller Input:
  - Select the controller you want to use for teleportation (e.g., RightHand Controller).
  - Ensure the controller has an XR Ray Interactor component.
  - Set the Interaction Layer Mask to Everything to ensure teleport targets are detected during setup and testing.
  - In the Input Configuration section of the XR Ray Interactor, click the circle icon next to Activate Input and select XR RightHand Locomotion/Teleport Mode Activate (or the corresponding action for your controller).
  - Optionally, assign a reticle prefab in the Reticle field of the XR Interactor Line Visual component to provide visual feedback when a valid teleport surface is targeted.
- Set Up a Teleportation Area:
  - Go to GameObject > XR > Teleportation Area to create a surface users can teleport to.
  - Resize and position the area to match your walkable regions (e.g., floors, platforms).
  - In the Teleportation Area component, set the Interaction Manager field if it is not already populated. Use the same XR Interaction Manager used elsewhere in your scene.
  - You can assign a custom material (e.g., transparent or semi-transparent) to the Mesh Renderer of the Teleportation Area to better match your scene's visual style or to make teleport targets less visually intrusive.
- Set Up a Teleportation Anchor (Optional):
  - Add a fixed landing spot with a defined orientation via GameObject > XR > Teleportation Anchor.
  - Position and rotate the Teleportation Anchor to control where and how the user will face after teleporting.
  - In the Teleportation Anchor component, assign the Interaction Manager field using the same XR Interaction Manager used in the rest of your scene (e.g., from the XR Origin or controller objects). Click the circle icon to select it from the scene.
  - Set the Match Orientation dropdown based on how you want the user to be aligned after teleporting:
    - None keeps the user's current orientation.
    - Target Up aligns only the vertical direction (Y-axis).
    - Target Up And Forward aligns both vertical and forward directions to match the anchor's transform.
  - Optionally, you can replace the anchor with the Teleport Anchor prefab (provided by Unity) under Assets > XFactory > Prefabs > VR > Teleport to better visualize the position and direction of the teleportation anchor.
- Run the Scene and Test Teleportation:
  - On Windows, connect your headset using the Link app and test teleportation directly in Play mode.
  - On Mac, build and deploy the app to your headset via the Android build pipeline, then run it on the device.
  - Aim the controller at a Teleportation Area or Anchor using the ray.
  - Press the assigned teleportation input (e.g., joystick forward, trigger hold).
  - Upon release or confirmation, the XR Rig should instantly move to the target location.
- Fine-Tune Interactions:
  - Adjust teleport behavior for instant versus press-and-hold activation.
  - Address interaction conflicts, such as grabbing and UI control—use separate layers and input actions to avoid overlap.
  - Test in a variety of lighting and environment conditions to ensure visual feedback remains visible and intuitive.
  - Make the teleportation area's material transparent to make it less visually intrusive.
  - Teleportation can also be triggered from code; see the sketch after this list.
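As a complement to the ray-based setup above, the hedged sketch below queues a teleport from script. It assumes the XR Interaction Toolkit's TeleportationProvider and TeleportRequest API (the exact namespace varies between toolkit versions), and the spawn point reference is hypothetical.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit; // namespace may differ in newer XRI versions

public class TeleportToSpawn : MonoBehaviour
{
    [SerializeField] TeleportationProvider teleportationProvider; // provider on the XR Rig
    [SerializeField] Transform spawnPoint;                        // hypothetical destination

    // Hook this up to a UI button or input action to teleport on demand.
    public void TeleportNow()
    {
        var request = new TeleportRequest
        {
            destinationPosition = spawnPoint.position,
            destinationRotation = spawnPoint.rotation,
            matchOrientation = MatchOrientation.TargetUpAndForward
        };
        teleportationProvider.QueueTeleportRequest(request);
    }
}
```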
Optimize the Project for VR
Before proceeding to other topics in VR development, let’s follow a production-oriented workflow to optimize the Unity project for VR. These practices are essential for maintaining high performance (typically 90 FPS) in VR environments, ensuring a smooth, immersive, and comfortable user experience. Poorly optimized VR apps can lead to nausea, discomfort, and system overheating, especially on standalone devices like Meta Quest 3.
Measure Your Performance
- Set a Target Frame Rate: Most VR headsets require 90 FPS or higher for comfort. Refer to your device's specifications (e.g., Meta Quest, HTC Vive) and document your target FPS. Setting a clear target allows you to benchmark performance and detect regressions early during development.
- Monitor Frame Rate Using Stats or a Script: Click the Stats button in the top-right corner of the Game view to see real-time information such as FPS, draw calls, and triangle count. This is the quickest way to get a basic performance overview while testing in the Editor. You can also display FPS in the scene using a simple script with 1.0f / Time.deltaTime, shown in a Text or TMP_Text UI element for live monitoring (see the sketch after this list).
- Delve Deeper Into Performance Using the Profiler: For a deeper look into CPU, GPU, memory, rendering, and script performance, open the Unity Profiler via Window > Analysis > Profiler. It provides real-time charts and timelines, helping you pinpoint bottlenecks while your application is running. These tools are essential for identifying performance issues early, before building for and testing on VR hardware.
- Test on Device: Build and run on your actual VR hardware. Use platform tools like Oculus Developer Hub or the SteamVR Performance Graph to verify in-headset FPS. Editor performance is not representative of headset runtime; real-world testing reveals actual user experience and thermal limits.
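As a reference for the in-scene FPS display mentioned above, here is a minimal sketch using a TMP_Text label. The smoothing factor and field names are illustrative.

```csharp
using TMPro;
using UnityEngine;

public class FpsDisplay : MonoBehaviour
{
    [SerializeField] TMP_Text fpsLabel; // assign a TextMeshPro UI element in the scene
    float smoothedFps;

    void Update()
    {
        // Instantaneous FPS, smoothed to avoid a jittery readout.
        float currentFps = 1.0f / Time.deltaTime;
        smoothedFps = Mathf.Lerp(smoothedFps, currentFps, 0.1f);
        fpsLabel.text = Mathf.RoundToInt(smoothedFps) + " FPS";
    }
}
```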
Minimize Draw Calls
- Check Draw Calls in Stats: Open the Game View > Stats window. The "Batches" value shows your draw call count. Target fewer than 175 draw calls for smooth VR. In the figure above, for example, the number of batches is over 300, which is too high for VR, especially if we're targeting standalone headsets like Meta Quest 3 or HTC Vive Focus. High draw call counts increase CPU overhead and reduce frame rates, especially on mobile or standalone headsets.
- Use Static Batching: Go to Edit > Project Settings > Player > Other Settings. Enable Static Batching. This feature lets Unity group static objects, reducing the CPU work needed to send rendering instructions.
- Mark Static Objects: In the Hierarchy, select non-moving objects. In the Inspector, enable the Static checkbox to let Unity batch them. Marking objects as static informs the engine that these do not change, allowing more aggressive performance optimizations like lightmapping and occlusion culling. For bulk marking, see the editor sketch after this list.
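Marking many objects static by hand gets tedious. The hedged editor sketch below (place it in an Editor folder; the menu path is arbitrary) applies the relevant static flags to every object in the current selection.

```csharp
using UnityEditor;
using UnityEngine;

public static class StaticFlagTools
{
    // Applies batching, occlusion, and baked-GI static flags
    // to every GameObject currently selected in the Hierarchy.
    [MenuItem("Tools/Mark Selection Static")]
    static void MarkSelectionStatic()
    {
        foreach (GameObject go in Selection.gameObjects)
        {
            GameObjectUtility.SetStaticEditorFlags(
                go,
                StaticEditorFlags.BatchingStatic |
                StaticEditorFlags.OccluderStatic |
                StaticEditorFlags.OccludeeStatic |
                StaticEditorFlags.ContributeGI);
        }
    }
}
```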
Reduce Polygon Count
- Analyze Triangle Count: In the Stats view, check Tris (triangle count). Use Scene View > Wireframe mode to inspect object complexity. Lower triangle counts reduce GPU load, making rendering faster and more efficient, particularly for scenes with many objects.
- Inspect Meshes: Select objects and check their Mesh Filter in the Inspector. View the mesh in the Project window to see triangle details. Understanding your models' complexity helps you spot and replace unnecessary high-poly meshes early in the pipeline.
- Simplify High-Poly Assets: Use modeling tools like Blender or optimization tools like Simplygon to reduce polycounts. Use LOD Group components (GameObject > LOD Group) for objects that can have lower detail at distance. This helps maintain high visual quality up close while reducing rendering costs when objects are farther from the camera (see the sketch after this list).
Optimize Textures
- Adjust Import Settings: Select textures in the Project view. In the Inspector, adjust Max Size, Format, and Compression. Smaller and compressed textures load faster, use less memory, and reduce the bandwidth needed for GPU rendering.
- Enable Mip Maps: In the texture import settings, check Generate Mip Maps to reduce overdraw at distance. Mip maps improve performance and visual quality by lowering texture resolution for far-away objects, reducing aliasing.
- Use Anisotropic Filtering: Set Aniso Level to 2–8 for floor or angled surfaces to improve sharpness at oblique views. This reduces blurring of textures viewed from shallow angles, such as factory floors or walls, enhancing realism. These settings can also be applied automatically on import; see the sketch after this list.
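To avoid adjusting each texture by hand, an AssetPostprocessor can apply these import settings automatically. The sketch below is an editor-only example with assumed values; tune the size, compression, and aniso level per project.

```csharp
using UnityEditor;

// Editor-only: place this script in an Editor folder.
public class VrTexturePreprocessor : AssetPostprocessor
{
    void OnPreprocessTexture()
    {
        var importer = (TextureImporter)assetImporter;
        importer.maxTextureSize = 1024;                                   // cap resolution
        importer.textureCompression = TextureImporterCompression.Compressed;
        importer.mipmapEnabled = true;                                    // reduce overdraw at distance
        importer.anisoLevel = 4;                                          // sharper oblique views
    }
}
```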
Optimize Particles and Effects
- Simplify Particle Systems: Use Universal Render Pipeline/Particles/Unlit shaders for particles. Minimize particle count and lifetime. Complex particle systems are expensive and can tank performance quickly in VR; using unlit shaders avoids lighting overhead.
- Limit Post-Processing Effects: Add a Volume to the Main Camera. Create and assign a Volume Profile. Use only lightweight effects like Color Adjustments or Bloom, and avoid Motion Blur, Chromatic Aberration, and Lens Distortion. Post-processing can dramatically impact performance and user comfort, and some effects are disorienting or nauseating in VR.
Apply Efficient Anti-Aliasing
- Set MSAA in the URP Asset: Locate your UniversalRenderPipelineAsset (e.g., URP-HighQuality). In the Inspector under Quality, set MSAA to 4x. Multi-sample anti-aliasing smooths jagged edges with minimal performance hit, especially important for headset displays.
- Enable FXAA on the Camera: On the Main Camera, enable Post Processing. In Anti-Aliasing, choose FXAA or SMAA as lightweight options. These screen-space algorithms are efficient and help polish visual quality without sacrificing performance.
- Adjust High-Detail Materials: For detailed objects like machines or robots, use baked Reflection Probes and simpler shaders. Real-time reflections and complex PBR shaders are GPU-intensive and should be avoided or simulated in VR contexts.
Additional Tips
- Occlusion Culling: Open Window > Rendering > Occlusion Culling. Bake occlusion data to prevent rendering of objects blocked from view. This improves performance by skipping rendering for hidden geometry, especially useful in indoor scenes like XFactory.
- Lightmapping: Use Baked Lighting for static geometry and Mixed Lighting with Shadowmask for semi-dynamic objects. Baked lighting provides high visual fidelity at a fraction of the real-time lighting cost. To generate a lightmap, open the Lighting window (Window > Rendering > Lighting > Environment), scroll to the Lightmapping Settings section, and click Generate Lighting. Make sure your static objects are marked as Lightmap Static, and verify that Baked Global Illumination is enabled under Mixed Lighting for successful baking.
- Profiler Monitoring: In Window > Analysis > Profiler, use the Rendering tab to find bottlenecks such as Camera.Render, Shadows.Render, or PostProcessing.Render. Regular use of the Profiler helps catch performance issues early and guides your optimization decisions with real data.
Key Takeaways
Designing effective VR experiences requires a balanced integration of spatial awareness, locomotion, and performance optimization. Understanding and applying world, local, and camera coordinate systems ensures that objects behave consistently and interactions feel natural, while proxemics helps maintain user comfort through respectful spatial boundaries. Implementing locomotion systems like teleportation, snap turning, and continuous movement demands careful input handling, modular design, and attention to user comfort. Finally, optimizing for VR performance—through batching, polygon reduction, texture management, efficient effects, and targeted frame rate goals—ensures smooth, immersive, and comfortable experiences, particularly on standalone headsets.