D1. Fundamentals of VR
Learning Outcomes
- Explain the core concepts of VR and their application in immersive environments. Before class, review the definitions of immersion, interactivity, and presence, and be ready to share how these concepts apply to VR experiences you’ve seen or studied.
- Describe the VR development lifecycle and its key phases. As preparation, study the stages of Conceptual Design, Prototyping, Interaction, and Optimization, and note how each stage supports building an effective VR experience.
- Identify essential Unity tools for VR development. Ahead of the session, review XR Plug-in Management, XR Interaction Toolkit, OpenXR, XR Core Utilities, and the Input System, and understand the role each plays in the VR workflow.
- Compare different VR hardware types and their applications. For your pre-work, read the hardware categories and comparison table, and choose at least one device you would consider for engineering use cases.
- Set up a basic VR project in Unity. Prior to class, create a new Unity project using the 3D Core template with XR Interaction Toolkit enabled, without importing assets or building the project yet.
What Is VR?
Virtual Reality (VR) is a technology that simulates a three-dimensional, computer-generated environment that a user can explore and interact with. The term “virtual” means “near” or “almost,” while “reality” refers to our everyday sensory experiences. In VR, these two ideas combine to create an experience that mimics real life, even if the environment is entirely artificial. In XFactory, users can enter the virtual interior of a smart manufacturing facility and explore different stations such as the assembly or welding area. With a VR headset, the user doesn’t just view the factory—they are inside it, turning their head to observe moving robots, hearing spatial audio from nearby CNC machines, and interacting with objects and UI panels just as they would in a real-world facility. Unique affordances of VR include:
- Immersion: Immersion refers to the extent to which the VR system can convincingly substitute real-world sensory input with digital stimuli. The goal is to achieve a state where users feel fully “present” in the virtual environment, with minimal awareness of the actual physical world around them. This is accomplished through coordinated visual, auditory, and (sometimes) haptic feedback. In XFactory, when users approach the welding station, they are enveloped in ambient factory sounds, the glow and flicker of robotic welding arcs, and positional audio cues that change as they move. The environment, lighting, and sound design all contribute to the feeling of truly “being there.”
- Interactivity: A core principle of VR is that users can interact with the environment. This interactivity may involve manipulating objects, navigating through space, or triggering events—all in real time. For engineers, this means that virtual models can be tested dynamically, allowing for rapid iteration and problem solving. In XFactory’s logistics station, users can pick up boxes, scan them with a virtual barcode scanner, or load them onto a forklift. These interactions aren’t scripted—they are driven by real-time physics and user actions, simulating practical tasks in logistics operations.
- Sensory Substitution: VR works by presenting alternative sensory data to the user. By delivering precise visual and auditory cues (and sometimes tactile feedback), it “tricks the brain” into interpreting digital information as if it were coming from the real world. The challenge is to maintain sensory synchronicity to avoid discrepancies like latency or conflicting cues (which can lead to motion sickness). When using hand controllers to operate the robotic arm in the assembly station, XFactory synchronizes the haptic feedback with the visual grip and release action. Any lag would immediately break the illusion and reduce effectiveness.
- Presence: The concept of “presence” is the subjective feeling of being inside the virtual environment. When a VR system is highly immersive and interactive, users experience a sense of being “there” rather than merely looking at a simulation. This psychological state is key to the effectiveness of VR applications, from training simulations to design reviews. Presence is powerfully demonstrated in XFactory when users walk through the exhibit area and interact with futuristic machines like a quadruped robot. The experience can be so tangible that users often forget they are in a simulated world, instinctively stepping aside to avoid virtual drones flying overhead.
In collaboration with Ghost Lab at Northeastern University, SAIL contributed to the design, development, testing, and deployment of a VR training system for the Navy. Visit the project website to learn more.
VR Development Lifecycle
VR development in Unity is a structured, multistage process that blends creative vision with technical precision. From initial concept to final deployment, developers leverage Unity’s robust 3D engine, XR Interaction Toolkit, and OpenXR support to craft virtual experiences that are immersive, interactive, and optimized for performance. Each phase—whether focused on design, prototyping, interaction, or optimization—plays a critical role in shaping user engagement and ensuring the final product runs smoothly across a range of VR hardware.
Conceptual Design
Effective VR projects begin with a strong foundation: identifying the purpose of the experience, designing user flows, and setting up the XR rig and spatial layout in Unity. At this stage, aligning technical design with user intent ensures a seamless pipeline for later development.
- Ideation and Planning: Define clear goals for the experience, such as education, training, or simulation. In XFactory, the goal is to let users explore and interact with modern manufacturing systems like drones, robotic arms, and digital twins within a safe and immersive environment. Stakeholder input, including from training leads or subject matter experts, is often gathered here to inform priorities and use cases.
- User Experience (UX): Design intuitive, task-driven interactions and interfaces that guide users naturally. In XFactory, teleport anchors help users move between stations, while glowing panels and floating tooltips at machines offer cues for interaction. The UX design phase also involves accessibility considerations, ensuring all users can complete tasks without excessive motion or cognitive load.
- Technical Specifications: Select target devices and performance benchmarks, including supported frame rates and input types (e.g., HMDs, controllers, hand tracking). XFactory targets Meta Quest 3 and Windows Mixed Reality using the OpenXR plug-in, ensuring broad accessibility and smooth frame delivery. Performance constraints and platform limitations are identified early to avoid costly rework later.
- Scene Composition and `XR Rig` Setup: Start with a well-configured `XR Rig` and a clean spatial layout that distinguishes interactive zones from static backgrounds. XFactory uses a shared `XR Rig` across scenes. The rig is repositioned via teleportation to stations like logistics or assembly, maintaining consistency across the virtual space. This modular approach also simplifies rig updates and standardizes user orientation across levels.
- Visual Design: Organize static vs. dynamic assets to optimize lighting and performance. In XFactory, static meshes like walls and floors are grouped under “StaticEnvironment” with baked lighting, while interactables like forklifts and robots are grouped for real-time updates. Clear naming conventions and layer usage also help streamline rendering logic and collision setup.
Prototyping and Creation
This stage focuses on rapid testing, blocking out scenes, and building immersive environments using Unity’s 3D tools and placeholder assets. It’s an iterative loop of testing, feedback, and incremental fidelity improvements.
- Initial Prototyping: Use basic geometry and mockups to validate user navigation, object grabbing, and controller input. XFactory prototypes used graybox robots to test interaction distance and arm collision detection before modeling final assets. This phase is also ideal for confirming spatial scale and adjusting player reach or comfort zones.
- Real-Time 3D Design: Implement terrain, lighting, VFX, and spatial audio to deepen immersion. XFactory’s drone launchpad features dynamic lighting, animated propellers, and ambient machinery sounds to simulate a busy outdoor tech environment. Designers may also introduce weather systems or particle effects to test responsiveness under load.
- Scene Testing: Use Unity’s in-editor tools (like the `XR Device Simulator`) for early interaction testing. Developers validate tool pickup in XFactory’s welding cell using keyboard-driven head and hand emulation before deploying to actual VR headsets. These tests also help identify bugs related to collider bounds, interaction layering, or visual occlusion.
Interaction and Iteration
Building dynamic, responsive interactions in Unity means integrating multiple input sources, interaction systems, and feedback loops to shape user engagement. It also involves tuning these interactions based on user comfort and intended task flow.
- XR Interaction Toolkit Integration: Use components like `Grab Interactable`, `Ray Interactor`, and `Socket Interactor` to quickly implement object manipulation. In XFactory, users grab engine pistons and slot them into crankshafts using snap sockets and direct interactor logic—no custom code needed. These tools accelerate development by reducing the need for bespoke physics or interaction scripts.
- Input System Integration: Leverage Unity’s Input System for handling controller buttons, joysticks, gestures, and haptic feedback. At the assembly station, XFactory uses an `InputAction` to trigger a tightening sequence on tire lug nuts, synchronizing a ratchet sound and haptic feedback via `TorqueAction`. This system also maps well to accessibility customization, like remapping controls or adjusting sensitivity (see the sketch after this list).
- Locomotion Options: Provide both teleportation and smooth movement to accommodate user comfort. While XFactory defaults to teleportation, users inspecting the conveyor belt can switch to joystick movement via a toggle in the radial menu. Smooth locomotion can be restricted in certain areas to avoid nausea or preserve spatial clarity.
- Iterative Testing: Continuously test and refine on both simulator and hardware. Adjust scenes based on feedback regarding comfort, clarity, or interaction fidelity. After usability testing, XFactory developers might add larger UI canvases or reposition machines for better line-of-sight during training simulations. Capturing user metrics and heatmaps during these tests often guides design refinements.
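As a concrete illustration of the Input System integration described above, the following C# sketch wires an `InputAction` to a ratchet sound and a haptic pulse, loosely mirroring the lug-nut tightening example. It assumes an `InputActionReference` bound to a controller button and an `AudioSource` assigned in the Inspector; the class and field names (`TorqueActionFeedback`, `tightenAction`) are illustrative, not part of XFactory.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.XR;

// Hypothetical sketch: wires a "tighten" InputAction to a ratchet sound and a
// short haptic pulse so the audio, visual, and tactile cues stay synchronized.
public class TorqueActionFeedback : MonoBehaviour
{
    [SerializeField] private InputActionReference tightenAction; // e.g., bound to gripPressed
    [SerializeField] private AudioSource ratchetSound;           // assign a ratchet clip in the Inspector

    private void OnEnable()
    {
        tightenAction.action.performed += OnTighten;
        tightenAction.action.Enable();
    }

    private void OnDisable()
    {
        tightenAction.action.performed -= OnTighten;
        tightenAction.action.Disable();
    }

    private void OnTighten(InputAction.CallbackContext ctx)
    {
        if (ratchetSound != null)
            ratchetSound.Play();

        // If the triggering device supports rumble, send a brief haptic impulse.
        if (ctx.control.device is XRControllerWithRumble controller)
            controller.SendImpulse(0.6f, 0.1f); // amplitude, duration in seconds
    }
}
```

Attach the script to any GameObject, assign the action and audio source, and the same pattern extends to other synchronized audio-visual-haptic cues.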
Optimization and Deployment
Finalizing a VR experience involves polishing the environment for performance, ensuring platform compatibility, and preparing for delivery. Deployment readiness includes stress testing, QA, and compliance with store requirements if publishing publicly.
- Performance Tuning: Optimize lighting, LODs, physics, and rendering settings to maintain frame rate and responsiveness. In XFactory, LODs reduce the complexity of outdoor drones at a distance, and baked lighting keeps indoor welding areas performant under load. GPU and CPU profiling tools help isolate bottlenecks related to shaders, physics, or draw calls (a minimal runtime example is sketched after this list).
- User Feedback and Comfort Enhancements: Incorporate feedback on accessibility, usability, and motion sensitivity. When test users were unsure which robots were interactive, XFactory added glowing outlines and UI pop-ups to signal interactivity. Additionally, UI text size and contrast were adjusted for users with varying levels of vision and headset resolution.
- Platform Packaging and Plug-ins: Use Unity’s `XR Plug-in Management` with OpenXR for cross-platform builds. Configure runtime settings per device. XFactory builds as an `.apk` for Meta Quest and a desktop app for PC VR, using the XR Interaction Toolkit and Input System for consistent behavior across both. Build automation and cloud delivery pipelines may also be used to streamline testing and publishing.
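Most performance tuning happens in the Profiler and project settings, but a small runtime guard can complement it. The sketch below is a minimal, illustrative example (not taken from XFactory) that averages frame times and steps Unity’s quality level down when the app consistently misses an assumed 90 Hz budget; the threshold and sampling window are placeholder values to tune per headset.

```csharp
using UnityEngine;

// Illustrative sketch: a frame-time watchdog that lowers the quality level
// (cheaper shadows, lower LOD detail, etc.) when the frame budget is missed.
public class QualityWatchdog : MonoBehaviour
{
    [SerializeField] private float targetFrameTime = 1f / 90f; // ~11.1 ms for a 90 Hz headset (assumption)
    [SerializeField] private int sampleWindow = 180;           // frames to average before acting

    private float accumulated;
    private int samples;

    private void Update()
    {
        accumulated += Time.unscaledDeltaTime;
        samples++;

        if (samples < sampleWindow)
            return;

        float average = accumulated / samples;
        accumulated = 0f;
        samples = 0;

        // Consistently over budget? Step down one quality level.
        if (average > targetFrameTime * 1.1f && QualitySettings.GetQualityLevel() > 0)
            QualitySettings.SetQualityLevel(QualitySettings.GetQualityLevel() - 1, true);
    }
}
```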
VR Software
Unity provides a robust ecosystem for developing VR experiences. Its modular architecture relies on packages and plugins that streamline the development process, ensuring that developers can quickly set up projects, handle diverse input methods, and support a range of devices.
XR Plugin Management
XR Plugin Management is Unity’s system for managing and loading XR plug-ins. It abstracts the platform-specific details, allowing developers to switch between different XR runtime environments (such as AR and VR platforms) without altering much of the core project. With XR Plugin Management, you can build XR apps that work across multiple VR/AR devices (like Meta Quest, HTC Vive) by simply swapping plug-ins. It streamlines configuration for multiple XR platforms, reducing overhead in multi-device projects. In summary, its core concepts include:
- Abstraction Layer: Provides a unified interface for handling various XR hardware and platforms.
- Plug-in Ecosystem: Supports multiple providers, enabling support for devices from different manufacturers.
- Configuration and Optimization: Helps manage XR settings, optimizes performance, and ensures that the correct plug-in is active based on the target platform.
In Unity, plugins are external libraries—either managed (C# DLLs) or native (C++, Java, Objective-C)—that extend the engine’s functionality, often enabling platform-specific features or third-party integrations. They are commonly used for tasks like accessing native device APIs, integrating SDKs, or enhancing editor tools.
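For a quick sanity check of this abstraction layer, a small script can log which loader XR Plug-in Management actually activated at runtime. This is a minimal sketch assuming the XR Plug-in Management package (`UnityEngine.XR.Management`) is installed in the project; it is purely diagnostic.

```csharp
using UnityEngine;
using UnityEngine.XR.Management;

// Minimal sketch: logs which XR loader (e.g., OpenXR) was initialized at startup.
public class XRLoaderCheck : MonoBehaviour
{
    private void Start()
    {
        var manager = XRGeneralSettings.Instance != null
            ? XRGeneralSettings.Instance.Manager
            : null;

        if (manager != null && manager.activeLoader != null)
            Debug.Log($"Active XR loader: {manager.activeLoader.name}");
        else
            Debug.LogWarning("No XR loader is active. Check XR Plug-in Management settings.");
    }
}
```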
XR Interaction Toolkit
The XR Interaction Toolkit is a framework provided by Unity for creating and managing user interactions in XR environments. It offers components and tools that enable developers to easily set up common interaction patterns such as grabbing, teleportation, and UI manipulation. With the XR Interaction Toolkit, you can quickly prototype and deploy VR games or AR experiences with intuitive user interactions and create realistic interaction scenarios for simulation training programs. This toolkit provides a rapid development toolset for experimenting with new interaction paradigms in XR. Its core concepts include:
- Interaction Components: Pre-built components for common interactions (e.g., ray interactor, direct interactor, grab interactable).
- Event-Driven Architecture: Utilizes Unity events to trigger interactions, making it easier to extend and customize behavior.
- Modularity: Designed to be modular so that developers can mix and match components based on the needs of their application.
In XFactory, users grab tools, activate machines, and assemble engine parts using `XR Grab Interactable` and `Socket Interactor` components. `Ray Interactor` components allow for controller-based UI navigation in the exhibit station’s holographic panels.
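The toolkit’s event-driven design makes it easy to layer custom logic on top of these components without replacing them. Below is a hedged sketch, written against the XR Interaction Toolkit 2.x API (the interactable types moved into sub-namespaces in 3.x), that logs grab and release events on an `XR Grab Interactable`; it is illustrative rather than actual XFactory code.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: reacts when an XR Grab Interactable is picked up or released.
[RequireComponent(typeof(XRGrabInteractable))]
public class GrabLogger : MonoBehaviour
{
    private XRGrabInteractable grab;

    private void Awake()
    {
        grab = GetComponent<XRGrabInteractable>();
        grab.selectEntered.AddListener(OnGrabbed);
        grab.selectExited.AddListener(OnReleased);
    }

    private void OnDestroy()
    {
        grab.selectEntered.RemoveListener(OnGrabbed);
        grab.selectExited.RemoveListener(OnReleased);
    }

    private void OnGrabbed(SelectEnterEventArgs args)
    {
        // args.interactorObject is the hand or ray interactor that grabbed this object.
        Debug.Log($"{name} grabbed by {args.interactorObject.transform.name}");
    }

    private void OnReleased(SelectExitEventArgs args)
    {
        Debug.Log($"{name} released");
    }
}
```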
OpenXR
OpenXR is an open standard for XR application development, designed to standardize how VR and AR hardware interact with software. In Unity, this standard is supported through provider plug-ins, which serve as bridges between the OpenXR API and specific XR hardware. Using OpenXR, you can write code that works on multiple XR devices without needing to change the underlying implementation. With a standard API, different hardware manufacturers can collaborate on improving XR experiences. Core concepts include:
- Open Standard: A universal API that reduces fragmentation across various XR devices.
- Provider Plug-ins: Specific implementations that allow Unity to communicate with different hardware ecosystems (such as Meta Quest, Windows Mixed Reality, etc.).
- Interoperability: Ensures that applications built on OpenXR can work with any hardware that supports the standard, fostering a more unified development ecosystem.
When showcasing your VR experience, you can simply pair the simulation with the newest supported headset using the OpenXR plug-in—without touching any Unity scripts.
Input System Package
The Input System Package in Unity is a modern approach to handling user input. It is designed to replace the older input system by providing a more robust, flexible, and extensible framework, especially beneficial in XR contexts where diverse input methods (controllers, gestures, voice, etc.) are common. This package seamlessly integrates inputs from various XR devices, including VR controllers and hand-tracking systems. It enables the creation of sophisticated interaction schemes that require more nuanced input handling and is ideal for projects requiring customized input schemes beyond standard keyboard/mouse or gamepad controls. Core concepts include:
- Action-Based Input: Uses input actions and bindings, allowing developers to map various input devices to logical actions.
- Extensibility: Easily supports custom devices and input types without major overhauls.
- Event-Driven Model: Provides a more responsive input system that triggers events based on user interactions.
Voice commands like “Activate Welding Sequence” and thumbstick input for drone inspection in XFactory are managed using action maps defined in the Input System. This modularity allows toggling input profiles per station with ease.
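To make the action-based model concrete, the sketch below defines an `InputAction` entirely in code with a generic XR controller binding and reacts when it is performed. In practice such actions usually live in an `.inputactions` asset (as XFactory’s action maps do); the code-only version here is just for illustration.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Illustrative sketch: an action-based input defined in code rather than an asset.
public class CodeDefinedAction : MonoBehaviour
{
    private InputAction activate;

    private void OnEnable()
    {
        activate = new InputAction(
            name: "Activate",
            type: InputActionType.Button,
            binding: "<XRController>{RightHand}/triggerPressed");

        activate.performed += OnActivate;
        activate.Enable();
    }

    private void OnDisable()
    {
        activate.performed -= OnActivate;
        activate.Disable();
        activate.Dispose();
    }

    private void OnActivate(InputAction.CallbackContext ctx)
    {
        Debug.Log("Activate pressed on the right-hand controller.");
    }
}
```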
XR Device Simulator
The XR Device Simulator is a Unity tool that allows developers to simulate XR hardware input within the Unity Editor. This tool helps in testing and debugging XR interactions without needing the actual hardware during development. It allows you to quickly test and refine interactions without switching to real hardware and offers a convenient way to prototype XR interactions and debug issues directly in the editor. Core features include:
- Simulation Environment: Emulates VR/AR controllers and headset movements within the editor.
- Debugging Tool: Provides a visual and interactive interface for testing XR interactions.
- Efficiency Booster: Reduces the need for physical hardware in early stages of development, speeding up iteration cycles.
During development, XFactory features such as robot socketing mechanics and logistics scanning UI can be debugged entirely with the XR Device Simulator, allowing developers to fine-tune experience flow before hardware deployment.
VR Hardware
VR hardware includes head-mounted displays (HMDs), motion controllers, sensors, and tracking systems that work together to create immersive digital environments. These devices offer high-resolution visuals, spatial audio, and increasingly precise tracking and interaction capabilities, enabling users to experience realistic and interactive virtual worlds. As technology advances, improvements in display quality, reduced latency, hand and eye tracking, and haptic feedback continue to expand the applications of VR—from gaming and entertainment to education, engineering, and professional training. VR systems today generally fall into two broad categories, each with its own tradeoffs in performance, portability, and use case suitability:
- Tethered Systems: Devices connected to a powerful PC or workstation via cables, offering high-resolution displays, superior graphics performance, and advanced tracking capabilities. These are commonly used in engineering labs where precise 3D model visualization and high-performance simulation are required. Examples include HTC Vive Pro 2, Valve Index, and Varjo XR-4.
- Standalone Systems: Self-contained devices with built-in processing, tracking, sensors, and battery power. They offer portability and ease of use without external computers. These are ideal for mobile demonstrations, offsite training sessions, field-based data collection, or design reviews. Examples include Meta Quest 3, HTC Vive Focus, and Apple Vision Pro.
Some earlier VR devices were hybrid systems that used smartphones or detachable compute units for VR rendering. These were an earlier entry point into VR, but are now largely supplanted by more capable standalone devices.
Technical Capabilities
Understanding the technical capabilities of VR hardware is essential when integrating with tools like Unity for immersive engineering applications.
- Display & Resolution: High-resolution displays enable detailed viewing of schematics, CAD assemblies, and interface elements. For example, the Varjo XR-4 delivers human-eye-level visual clarity, allowing users to interact with detailed 3D models in VR with confidence.
- Tracking and Positional Accuracy: Accurate motion tracking is essential for realism, especially when users interact with virtual components. For example, Valve Index and HTC Vive Pro use Lighthouse tracking for sub-millimeter precision.
- Interaction and Controllers: Controllers with features like finger tracking, capacitive sensors, and haptics enable intuitive and expressive manipulation of virtual tools. The Meta Quest 3’s controllers offer fine-grained haptics and self-contained tracking, letting users simulate hand tools and interactions in VR without external cameras.
- Field of View (FOV) and Refresh Rates: A wide FOV enhances peripheral awareness, while high refresh rates (90Hz and above) reduce motion blur and VR sickness. For example, Valve Index’s 144Hz refresh rate improves user comfort during extended simulation sessions.
- Developer Ecosystem and SDK Support: Unity supports all major VR platforms through plug-ins like OpenXR and the XR Plug-in Management system. For example, Meta Quest 3 offers Unity SDKs for quick deployment and device-specific optimizations—critical for developing portable training applications.
Considerations for Engineers
VR hardware should be selected based on the specific needs of engineering workflows and environments:
- Simulation and Prototyping: Use high-end tethered devices such as Varjo XR-4 or HTC Vive Pro 2 where graphical fidelity and spatial precision are critical—for example, evaluating the mechanical fit of engine parts or inspecting factory layouts.
- Remote Collaboration and Training: Devices like Meta Quest 3 and Pico 4 support multi-user virtual environments and remote design review sessions. These are ideal for teams working across locations or conducting training without physical equipment.
- Ergonomics and Extended Use: Comfortable headsets such as the HP Reverb G2 reduce fatigue during long sessions. Adjustable head straps, balanced weight, and high-quality padding make them suitable for prolonged design reviews or educational walkthroughs.
- Integration with Existing Tools: Modern headsets often integrate with platforms like Unity, Unreal, and professional CAD tools. Unity bridges engineering software and VR visualization, enabling immersive experiences like XFactory where learners interact with machine interfaces, assembly processes, and robot paths.
Common VR Devices
- A standalone, consumer-focused VR headset featuring pancake lenses for a slimmer profile and better image clarity.
- Boasts a higher resolution than the Quest 2, with significantly reduced screen-door effect.
- Includes full-color passthrough for immersive mixed reality experiences and uses the Snapdragon XR2 Gen 2 chipset, delivering roughly twice the GPU processing power of its predecessor.
- Great for gaming, productivity, and fitness in a wireless format.
- An all-in-one enterprise-grade VR headset designed for training, collaboration, and simulations.
- Features 2448 × 2448 per eye resolution, 90Hz refresh rate, and wide 120° FOV.
- Offers inside-out tracking with four built-in cameras and swappable batteries for all-day use.
- The magnesium alloy frame ensures both strength and lightness, making it ideal for extended sessions.
- A cutting-edge standalone VR/AR headset with a focus on advanced optics, mixed reality, and enterprise applications.
- Equipped with the Snapdragon XR2+ Gen 2 chipset, delivering enhanced AI and graphics performance.
- Features full-color passthrough, improved depth sensing, and advanced hand tracking for natural user interaction in training, design visualization, or remote support scenarios.
- Premium tethered VR headsets with a focus on professional and enthusiast users.
- The Vive Pro 2 features a dual 5K resolution (2448 × 2448 per eye), 120Hz refresh rate, and a wide 120° FOV.
- Utilizes Lighthouse 2.0 base stations for room-scale tracking, offering pinpoint spatial accuracy.
- Includes integrated Hi-Res certified headphones, a comfortable adjustable head strap, and modular accessories for eye tracking and facial tracking.
- A PC-tethered headset popular among enthusiasts and developers for its superior tracking and input fidelity.
- Offers a refresh rate up to 144Hz, 130° FOV, and a custom LCD panel to reduce motion blur.
- Paired with Knuckles controllers, it allows individual finger tracking, enabling precise gesture input.
- Uses external Lighthouse base stations for high-fidelity room-scale VR with sub-millimeter accuracy.
- A high-end mixed reality headset targeted at industrial, aerospace, and research applications.
- Features dual displays with exceptionally high pixel density (over 70 PPD) and bionic display technology for human-eye resolution in the focus area.
- Offers low-latency passthrough with stereo RGB cameras, real-time environment segmentation, and occlusion handling.
- The expanded FOV and depth sensors enable more realistic and immersive VR scenarios.
Setting Up a VR Project in Unity
To set up a VR project in Unity, follow the steps below to ensure proper installation, configuration, and device compatibility.
Required Modules
- Update Unity Hub: Open Unity Hub and ensure it is updated via the settings menu.
- Install Unity 6 LTS: If not already installed, select Unity 6 LTS (Long-Term Support) from the available versions.
- Add Required Modules: When installing Unity, make sure to include `Android Build Support`, `OpenJDK`, and `SDK & NDK Tools`. These modules are essential for building and exporting VR apps to Android-based headsets like the Meta Quest.
- Complete Installation: Follow the prompts to finish installing Unity and the selected modules.
Configuring Your VR Hardware
To prepare your VR hardware for development, use the steps below. Note that this guide focuses on the Meta Quest 3, but basic compatibility tips for other headsets are included.
- Enable Developer Mode:
  - Go to the Meta Developer Portal and create a developer account.
  - Create an organization and accept the developer agreement.
  - Install the Meta Quest mobile app (iOS or Android).
  - Pair your headset in the app under `Menu > Devices`.
  - Enable `Developer Mode` and reboot your headset.
- Install Meta Quest Link (Windows):
  - Download and install the [Meta Quest Link app](https://www.meta.com/quest/setup/) by following the instructions on that page.
  - This app supports both USB-C and Air Link connections to your PC.
- Connect Your Device:
  - Use a USB-C cable or enable Air Link in the headset.
  - Put on the headset and approve the connection when prompted.
For headsets like the HTC Vive or Valve Index, download and install Steam and SteamVR. Launch SteamVR and ensure your headset and controllers are properly recognized and functioning. SteamVR is required for most PC-tethered VR devices that use OpenXR, including HTC and Valve hardware.
Creating a New Project
- Set Up a New Project:
  - Launch Unity Hub and click `New Project`.
  - Choose the `Universal 3D` template.
  - Name the project (e.g., `My_VR_Prototype`) and save it to your workspace directory.
  - Once inside the Unity Editor, immediately save the scene as `MainVRScene.unity` to avoid accidental data loss.
- Enable `XR Plugin Management`:
  - Go to `Edit > Project Settings > XR Plug-in Management`.
  - Click the `Install XR Plug-in Management` package.
  - Enable `OpenXR` under the `Android` and/or `Desktop` tab.
  - Follow the instructions under `Project Navigation` to fix compatibility issues.

  `XR Plugin Management` allows Unity to support various VR hardware through a modular system. Enabling `OpenXR` ensures broad compatibility with modern headsets like the Meta Quest 3.
- Install `XR Interaction Toolkit`:
  - Go to `Window > Package Manager`.
  - In the `Package Manager`, select `Unity Registry` and search for `XR Interaction Toolkit`.
  - Click `Install`.

  The `XR Interaction Toolkit` provides built-in components for VR input, interactions, and locomotion. Installing it saves development time and ensures standardized handling of common VR mechanics.
- Set Up Interaction Profiles:
  - Under `Project Settings > XR Plug-in Management > OpenXR > Interaction Profiles`, add `Oculus Touch Controller Profile`.
  - Set it as the default interaction profile.

  `Interaction Profiles` map controller input to specific actions and gestures in VR. Adding the `Oculus Touch Controller Profile` ensures correct input mapping for the Meta Quest 3 controllers.
Integrating VR Components
- Create an `XR Origin (VR)`:
  - In the `Hierarchy`, right-click and select `XR > XR Origin (VR)` (requires `XR Interaction Toolkit`).
  - Set its Transform `Y` position to `0.01` to prevent the camera from clipping through the floor.
  - Delete the default `Main Camera`. The camera inside `XR Origin (VR)` is automatically tracked by the headset and should be tagged as `MainCamera`.
- Set Up Input Actions:
  - In the `Project` window, go to `Assets > Create > Input Actions` to create a new Input Action Asset (e.g., `XRControls.inputactions`).
  - Double-click the asset to open the `Input Actions` editor.
- Define Action Maps:
  - Create two `Action Maps` by clicking the `+` icon in the left panel.
  - Name them `XRI LeftHand` and `XRI RightHand`.
- Add Input Actions:
  - Click on an action map and navigate to the `Actions` tab.
  - Create a `Select` action for grabbing or clicking objects.
  - Create an `Activate` action for activating held items (e.g., flashlights, tools).
  - Create a `Teleport` action for moving around the environment.
- Set the Action Types:
  - Navigate to the `Action Properties` tab.
  - Set the `Action Type` for `Select` and `Activate` to `Button`.
  - Set the `Action Type` for `Teleport` to `Value` (Vector2).
- Add Input Bindings:
  - For each action, click the action name and select `+ > Add Binding`.
  - Under `Binding Properties`, you’ll see a field labeled `Path`. This is where you define the input source manually or by listening for input.
  - To manually enter the path, you can type or paste the binding path directly into the `Path` field:
    - `Select`: `<XRController>{LeftHand}/triggerPressed` and `<XRController>{RightHand}/triggerPressed`.
    - `Activate`: `<XRController>{LeftHand}/gripPressed` and `<XRController>{RightHand}/gripPressed`.
    - `Teleport` Direction (Vector2): `<XRController>{LeftHand}/thumbstick` or `<XRController>{RightHand}/thumbstick`.
    - `Teleport` Confirmation (Button): `<XRController>{LeftHand}/thumbstickClicked` or `<XRController>{RightHand}/thumbstickClicked`.
  - To listen for input, click the small `Listen` button next to the `Path` field. Then, press the corresponding button on your VR controller to auto-detect the path.

  Repeat this for both the `XRI LeftHand` and `XRI RightHand` action maps to ensure full controller support. You can always refine or add alternate bindings later as needed.
- Final Setup:
  - Save the asset (`Ctrl/Cmd + S`).
  - In the `Hierarchy`, select `XR Origin (VR)`.
  - In the `Inspector`, find the `Input Action Manager` component.
  - Assign the `XRControls` asset to the `Action Assets` list.
Your scene now has a VR-ready rig with input mappings for interaction. You can continue by adding teleportation areas, grab interactors, or interactable objects.
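If you later want to read these actions from your own scripts (in addition to the components driven by the `Input Action Manager`), a minimal sketch like the following works, assuming the map and action names created above (`XRI RightHand`, `Select`, `Teleport`); the class name and logging are illustrative.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Sketch: reads the Select and Teleport actions from the XRControls asset at runtime.
public class XRControlsReader : MonoBehaviour
{
    [SerializeField] private InputActionAsset xrControls; // assign XRControls.inputactions in the Inspector

    private InputAction select;
    private InputAction teleport;

    private void OnEnable()
    {
        var rightHand = xrControls.FindActionMap("XRI RightHand", throwIfNotFound: true);
        select = rightHand.FindAction("Select", throwIfNotFound: true);
        teleport = rightHand.FindAction("Teleport", throwIfNotFound: true);
        rightHand.Enable();
    }

    private void Update()
    {
        if (select.WasPressedThisFrame())
            Debug.Log("Select pressed on the right controller.");

        Vector2 thumbstick = teleport.ReadValue<Vector2>();
        if (thumbstick.sqrMagnitude > 0.01f)
            Debug.Log($"Teleport direction: {thumbstick}");
    }
}
```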
Importing an XR Setup Package
To quickly make any Unity project VR-ready, you can import the pre-configured `XR Setup.unitypackage` (provided in the course repository). This package includes everything needed to start developing for XR without manually setting up the rig and input system.
- Import the Package:
  - Launch Unity and open the project you want to make VR-ready (e.g., XFactory).
  - Drag the `XR Setup.unitypackage` file into the Unity Editor, or go to `Assets > Import Package > Custom Package...` and select the `.unitypackage` file.
  - In the import window, ensure all items are selected, especially the `XR Setup` prefab, the `XRI Default Input Actions` asset, and all other supporting assets.
- Add the XR Setup to Your Scene:
  - In the `Project` window, go to the imported folder (`XR Setup`).
  - Drag the `XR Setup` prefab into your active scene.
  - Remove the default `Main Camera`.
  - This prefab contains `XR Interaction Manager`, `Input Action Manager` (linked to `XRI Default Input Actions`), and `XR Rig`.
  - The `XR Rig` should contain a `Camera Offset` child, with three child GameObjects:
    - `Main Camera` (head tracking)
    - `LeftHand Controller`
    - `RightHand Controller`
- Confirm Input System Compatibility:
  - Go to `Edit > Project Settings > Player`.
  - Under Active Input Handling, make sure it is set to `Input System (New)` or `Both` (for compatibility with older code).
- Test the Scene:
  - Enter Play mode with your VR headset connected.
  - Move your head and hands to verify that tracking and input are functioning.
This setup provides a clean starting point for VR projects using Unity’s XR Interaction Toolkit. You can now start building interactions, teleportation, and UI elements on top of this foundation.
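If tracking seems off during this test, a small diagnostic script can confirm that Unity sees the HMD at all. The sketch below uses the `UnityEngine.XR` device API to log the headset’s validity and position each frame; it is a temporary debugging aid, not part of the XR Setup package.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: logs whether a head-mounted device is detected and its tracked position.
// Remove (or disable) this component after verifying that tracking works.
public class TrackingCheck : MonoBehaviour
{
    private void Update()
    {
        InputDevice head = InputDevices.GetDeviceAtXRNode(XRNode.Head);

        if (!head.isValid)
        {
            Debug.LogWarning("No HMD detected. Is the headset connected and Link active?");
            return;
        }

        if (head.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 position))
            Debug.Log($"HMD position: {position}");
    }
}
```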
Testing the VR App in Unity
- Connect Your Device:
  - Use a USB-C cable to connect your Quest 3 to your PC.
  - Put on the headset and click `Allow USB Debugging` when prompted. Select `Always Allow` if you want to avoid repeating this.
  - Make sure `Developer Mode` is enabled (via the Meta Quest mobile app).
  - Unity will stream to the headset via the `Link` app on Meta Quest 3 (or Air Link if configured separately).
- Disable the Device Simulator:
  - In the Unity Editor, check the `Game` view tab for a dropdown labeled `Simulator`.
  - If it says `Simulator`, click the dropdown and switch it to `Game`.
  - This disables the `Device Simulator` and ensures input from your actual Quest 3 controllers is used instead.
- Enter Play Mode and Test:
  - Click the `Play` button in Unity.
  - Put on your headset. You should see your Unity scene rendered and be able to move and interact using your Quest 3 controllers.
  - Make sure your `XR Origin (VR)`, input actions, and any interactors (e.g., teleport, grab) are functioning as expected.
If you don’t see the Unity scene in your headset, check:
- The `Main Camera` inside your `XR Origin (VR)` is tagged correctly.
- `Oculus Link` is active on your headset (you may need to manually launch it from the Quest menu).
- No errors in the Console that block XR initialization.
Building to a VR Device
- Configure a `Build Profile`:
  - Go to `File > Build Profiles`, then click `Switch Platform` to `Android` if it’s not already selected.
  - In the same window, locate the `Platform Settings` button.
  - Under your current profile, ensure the `Platform` is set to `Android`, and assign a `Target Device` if needed (e.g., Quest 3).
- Build and Deploy to Device:
  - Go to `File > Build Profiles`, then click `Build and Run`.
  - Choose an output folder (e.g., `Builds/Quest`) and enter a filename like `XFactoryDemo.apk`.
  - Ensure your Meta Quest 3 is in Developer Mode, connected via USB-C, and authorized for USB debugging (check the headset if prompted).
  - Unity will build the APK and automatically install it on the headset.
  - On your Quest 3, find the app under `Library > Unknown Sources`.
If deployment fails, check that ADB is working (`adb devices`) or verify your headset connection in the Meta Quest app.
Key Takeaways
VR enables immersive, interactive experiences by blending sensory substitution, presence, and real-time responsiveness, making it a powerful tool for engineering, training, and simulation. Developing VR in Unity involves a structured lifecycle—conceptual design, prototyping, interaction, optimization, and deployment—supported by essential tools like XR Plug-in Management, XR Interaction Toolkit, OpenXR, and the Input System. Understanding hardware options, from high-fidelity tethered devices to portable standalone systems, ensures the right balance of performance and usability for the intended application. By mastering Unity’s VR setup process, from XR rig configuration to input mapping and build deployment, creators can deliver smooth, engaging experiences across platforms. Ultimately, effective VR development requires both technical proficiency and thoughtful design to produce environments where users feel truly “present” and empowered to explore, learn, and act.