D3. Grabbing Objects in VR
Learning Outcomes
- Explain the function and configuration of the `XR Grab Interactable` component in Unity. Before class, review its purpose in enabling object grabbing, explore the different movement types, and note how it supports realistic physics and throwing in VR.
- Set up ray grab interaction for objects in VR. As preparation, make an object ray-grabbable by adding a `Rigidbody` and `XR Grab Interactable`, setting `Movement Type` to `Velocity Tracking`, enabling `Smooth Position`, and testing with a ray interactor.
- Describe near grab interaction and its setup in Unity. Ahead of the session, study how near-only grabbing works by replacing a ray interactor with a direct interactor on one controller.
- Understand the role of custom attach points in VR interactions. For your pre-class work, learn how custom attach points improve hand alignment when grabbing, and optionally add a simple `GripHandle` to test the concept.
- Describe natural interaction behavior in VR grabbing. Review how leaving `Attach Transform` empty allows objects to maintain their natural rotation when grabbed via ray.
- Apply best practices for VR object interaction. Come prepared with one UX improvement and one performance optimization (e.g., simplifying colliders, enabling haptics, cleaning hierarchy) you could apply to your own project.
XR Grab Interactable
Immersive VR experiences rely on intuitive interactions, and one of the most fundamental is grabbing objects. In Unity, the `XR Interaction Toolkit` provides components, most notably the `XR Grab Interactable`, to implement these interactions. Grabbable objects are essential to building immersive and interactive VR experiences. The `XR Grab Interactable` component makes a GameObject “grabbable” in VR. When a VR controller or hand interactor selects the object, it attaches to the interactor, following its position and rotation. Upon release, the object can inherit the interactor’s velocity, enabling realistic throwing and physics-based motion.
Common Use Cases
- Tool Use and Manipulation: Allow users to grab and use tools like wrenches, screwdrivers, welders, or assembly equipment in simulated environments. In manufacturing or repair training, objects can be configured to simulate tool-specific constraints (e.g., grab points or alignment snapping).
- Object Inspection: Enable users to pick up and closely examine parts, products, or virtual models. Useful in training, education, and product design reviews, e.g., inspecting a 3D-printed component or rotating an engine part.
- Assembly Tasks: Simulate assembly by letting users grab parts and fit them together. Examples include putting together mechanical systems, engine components, or even medical instruments.
- Interactive UI Elements: Combine `XR Grab Interactable` with custom interactions for VR-based UI, for example grabbing and repositioning virtual screens, menus, or sliders.
- Simulation of Physical Properties: Use physics-based grab behavior (e.g., throw force, mass, rigidbody constraints) to simulate realistic weight and movement. Especially important for games, training, or physics education.
- Inventory and Object Transfer: Grab interactables can be passed between hands, stored in virtual inventory systems, or used to trigger events (e.g., unlocking doors with a key object).
- Safety and Hazard Training: Let users interact with safety equipment like fire extinguishers or emergency tools. Train users on correct handling, usage procedures, and interaction in time-sensitive environments.
- Role-Playing and Simulation: In healthcare, grab interactables can simulate surgical tools or patient handling. In defense, law enforcement, or tactical training, they can represent weapons, radios, or hand-held sensors.
Key Concepts
- Selection and Attachment: The object is selected by an interactor (via direct hand or ray-based interaction) and attaches at a defined point. A custom attach point (if provided) ensures that the object is held with the proper orientation. In XFactory, handheld barcode scanners in the logistics station can use custom attach points to ensure they align properly with the user’s hand for scanning or placement. This supports task accuracy and avoids visual misalignment.
- Velocity Tracking: Uses the `Rigidbody`’s velocity and angular velocity to simulate realistic physics. This method is ideal for throwing objects but may introduce slight latency. Use `Velocity Tracking` for logistics boxes that users can throw or stack with some natural momentum. It enhances realism and simulates handling in warehouse scenarios.
- Kinematic Movement: Moves the object using a kinematic `Rigidbody`, ensuring smooth synchronization between the object’s visual state and its physics state. Use `Kinematic` for the robot teach pendant in the welding station, where fine control is needed. This avoids jarring motion or misalignment during simulated robot programming.
- Instantaneous Movement: Directly updates the `Transform` each frame for immediate response. While this minimizes latency, it may bypass realistic physics interactions.
- Smoothing Settings: Options like `Smooth Position` and `Smooth Rotation` reduce jitter and improve the natural feel of the interaction by gradually matching the object’s movement to the controller. In the manufacturing station, enable smoothing on the CNC part models to simulate the feel of lifting a real metal piece. This improves realism during demonstration or training scenarios, where tactile fidelity matters.
- Custom Attach Points: By creating an empty child GameObject and assigning it as the attach point, you control where and how the object is held. This is especially useful for items like tools or sports equipment where orientation is critical. In the manufacturing station, assign a custom attach point on the robot teach pendant so it aligns correctly during interaction. This simulates proper pendant usage behavior and avoids errors in virtual training.
- Interaction Layers and Reticles: Use the `Interaction Layer Mask` to filter which interactors can grab the object. Custom reticles provide visual feedback when a grabbable object is within reach. In the CNC machine area, restrict access to the CNC control panel using a maintenance-specific interactor prefab. This reflects real-world safety constraints: only trained users can interact with sensitive systems.
- Grab Transformers: Advanced components that further modify the object’s position, rotation, and scale during grab events. They are useful for handling multi-handed interactions or enforcing axis constraints. Use grab transformers for the mobile assembly robot’s toolhead, allowing users to move it only along predefined axes (e.g., Y-axis rail), just like in real robotic calibration tasks. This teaches spatial constraints and operational safety (see the sketch after this list).
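To make the axis-constraint idea concrete, here is a minimal grab transformer sketch. It assumes the XR Interaction Toolkit 2.x `XRBaseGrabTransformer` base class, and the `YAxisRailTransformer` name and `railPoint` field are hypothetical; treat it as a starting point, not a drop-in implementation:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;
using UnityEngine.XR.Interaction.Toolkit.Transformers;

// Hypothetical transformer: locks a grabbed toolhead to a vertical rail
// by fixing the X and Z components of the grab target pose.
public class YAxisRailTransformer : XRBaseGrabTransformer
{
    [SerializeField] private Vector3 railPoint; // a point on the rail (its X/Z are enforced)

    public override void Process(XRGrabInteractable grabInteractable,
        XRInteractionUpdateOrder.UpdatePhase updatePhase,
        ref Pose targetPose, ref Vector3 localScale)
    {
        // Keep X and Z fixed; only the Y component follows the hand.
        targetPose.position = new Vector3(railPoint.x,
                                          targetPose.position.y,
                                          railPoint.z);
    }
}
```

In recent XRI versions, a transformer component placed on the same GameObject as the `XR Grab Interactable` is typically registered automatically at startup; verify this behavior against your toolkit version.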
Ray Grab Interaction
You can make tools, parts, and devices within XFactory grabbable in VR using Unity’s `XR Interaction Toolkit`. This section walks you through setup, interaction tuning, and coding enhancements using real examples from XFactory’s logistics, assembly, and exhibit stations. Let’s make the barcode scanner in the logistics station interactable using `XR Grab Interactable`, and play a click sound and vibrate the controller when it is picked up.
- Locate the Barcode Scanner in the Logistics Station:
  - Find the `Scanner_01a` in the scene or place it from the `Project` window.
  - Rename the instance to `BarcodeScanner` if you want.
  - Position it on the table inside the logistics station.
- Add Physics and Interaction Components:
  - Add a `Rigidbody` component to the `Scanner_01a` GameObject.
  - Set `Collision Detection` to `Continuous Dynamic` to handle fast hand movements.
  - Verify that it has a `Box Collider` or `Mesh Collider` (with `Convex` checked).
- Enable Grab Interaction:
  - Add an `XR Grab Interactable` component.
  - Set `Movement Type` to `Velocity Tracking`.
  - Enable `Smooth Position` and `Smooth Rotation`.
  - Set `Throw On Detach` to `true`.
  - Set `Throw Velocity Scale` to `1.0`.
  - Set `Throw Angular Velocity Scale` to `0.5`.
- Create a Script for Auditory Grab Feedback:
  - Create a new C# script named `BarcodeScanFeedback.cs`.
  - Attach this script directly to the `Scanner_01a` GameObject.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

[RequireComponent(typeof(XRGrabInteractable))]
public class BarcodeScanFeedback : MonoBehaviour
{
    [SerializeField] private AudioClip beepClip;

    private XRGrabInteractable grab;

    void Awake()
    {
        grab = GetComponent<XRGrabInteractable>();
        grab.selectEntered.AddListener(OnGrab);
    }

    void OnDestroy()
    {
        // Unsubscribe to avoid dangling listeners when the object is destroyed.
        grab.selectEntered.RemoveListener(OnGrab);
    }

    private void OnGrab(SelectEnterEventArgs args)
    {
        Debug.Log("Scanner picked up. Ready to scan boxes.");

        if (beepClip != null)
        {
            // Play the beep at the scanner's position in world space.
            AudioSource.PlayClipAtPoint(beepClip, transform.position);
        }
        else
        {
            Debug.LogWarning("No audio clip assigned for scanner feedback.");
        }
    }
}
```
- Configure the Script:
  - Locate a short beep sound in the `Assets` folder under `XFactory > Audio` (e.g., `SFX_UI_Click_3.ogg`).
  - Select the `Scanner_01a` GameObject in the `Hierarchy`.
  - In the `Inspector`, locate the `BarcodeScanFeedback` script and assign the audio clip to its exposed field.
- Add Haptic Feedback on Grab:
  - Open the same `BarcodeScanFeedback.cs` script.
  - Add the following serialized fields and method below the existing code to enable vibration when the scanner is picked up:

```csharp
[SerializeField, Range(0f, 1f)] private float hapticAmplitude = 0.5f;
[SerializeField] private float hapticDuration = 0.1f;

private void SendHaptics(IXRInteractor interactor)
{
    // Only controller-based interactors can receive haptic impulses.
    if (interactor is XRBaseControllerInteractor controllerInteractor)
    {
        controllerInteractor.SendHapticImpulse(hapticAmplitude, hapticDuration);
    }
}
```

  - Then, update the `OnGrab` method to call the `SendHaptics` method at the end:

```csharp
SendHaptics(args.interactorObject);
```
- Configure the Script:
  - Save the script and return to Unity.
  - Select the `Scanner_01a` GameObject in the `Hierarchy`.
  - In the `Inspector`, locate the `BarcodeScanFeedback` component and set `Haptic Amplitude` (e.g., 0.5) and `Haptic Duration` (e.g., 0.1 seconds).
- Deploy and Test the Behavior:
  - Deploy the app to your device.
  - When the barcode scanner is picked up, users will feel a brief vibration along with the audio beep, creating a more immersive and tactile interaction.
Near Grab Interaction
To make objects in XFactory interactable only at close range (i.e., users must reach out and grab them with their virtual hands), you can use `XR Direct Interactors` from Unity’s `XR Interaction Toolkit`. This disables ray-based grabbing while still supporting tactile, immersive object handling, ideal for realistic experiences in tasks like picking up tools or parts. Follow these steps to enable near-only interaction for the barcode scanner or any other interactable item.
- Configure the `LeftHand Controller` for Near Grabbing:
  - In the `Hierarchy`, locate your `XR Rig`.
  - Expand the rig and select the `LeftHand Controller`.
  - Remove the `XR Ray Interactor` component from the left hand (if present).
  - Click `Add Component`, search for `XR Direct Interactor`, and add it.
- Assign Interaction Layer to the Left Interactor:
  - Create a new interaction layer in Unity called `NearOnly`.
  - On the `XR Direct Interactor` (on the left hand), set the `Interaction Layer Mask` to `NearOnly`.
  - Go to `Input Configuration`.
  - Click the small circle icon next to `Select Input` and assign `XRI LeftHand Interaction/Select`.
  - Click the small circle icon next to `Activate Input` and assign `XRI LeftHand Interaction/Activate`.
- Update the Interactable Object (Barcode Scanner):
  - Select the `Scanner_01a` GameObject in the `Hierarchy`.
  - In the `XR Grab Interactable` component, set the `Interaction Layer Mask` to `NearOnly`.
- Add a Trigger Collider:
  - Select the `LeftHand Controller` GameObject in the `Hierarchy`.
  - Click `Add Component` and choose `Sphere Collider`.
  - Check `Is Trigger`.
  - Set the `Radius` to around `0.05` to `0.1`, depending on the size of your hand model.
  - Adjust the `Center` (e.g., `Z = 0.1`) to extend the collider just slightly in front of the hand.
- Add a Kinematic `Rigidbody`:
  - With `LeftHand Controller` still selected, click `Add Component` and choose `Rigidbody`.
  - Uncheck `Use Gravity`.
  - Check `Is Kinematic`.
  - This ensures proper trigger detection without unintended physics behavior.
- Test the Interaction in Play Mode or on Device:
  - On Windows, connect your Meta Quest headset using the Link app, then enter `Play Mode` in Unity to test the scene in real time.
  - On macOS, build and deploy the app to the headset using `Build Settings > Android > Build and Run`.
  - Try grabbing the barcode scanner.
  - Your left hand (configured for near interaction) should work when close.
  - Your right hand (ray-based) should not work (the scanner ignores the ray due to the `NearOnly` layer).
This setup provides a clear side-by-side comparison of near-only grabbing vs. ray-based grabbing in XFactory.
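If you prefer to apply the layer restriction from code, the following hedged sketch does the same thing at runtime. It assumes XRI 2.x’s `InteractionLayerMask` API and that a `NearOnly` interaction layer already exists in your project settings; the `NearOnlySetup` component name is hypothetical:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical setup helper: restricts an interactable to the "NearOnly"
// interaction layer from code instead of the Inspector.
[RequireComponent(typeof(XRGrabInteractable))]
public class NearOnlySetup : MonoBehaviour
{
    void Awake()
    {
        var grab = GetComponent<XRGrabInteractable>();

        // Assumes a "NearOnly" interaction layer has been defined under
        // Project Settings > XR Interaction Toolkit.
        grab.interactionLayers = InteractionLayerMask.GetMask("NearOnly");
    }
}
```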
Customized Interaction
Fine-tuning interactions in XR goes beyond just making objects grabbable. You can control how they align with the user’s hands, how controller models behave during interaction, and how the system responds to subtle inputs. These customizations help create a polished and realistic experience—especially in environments like XFactory, where precision matters when handling tools, devices, and user interfaces.
Custom Attach Points
To improve the realism and ergonomics of object interaction in VR, you can define a custom attach point so that grabbed items align properly with the user’s hand. Let’s set this up for the barcode scanner in the logistics station.
- Create a Custom Attach Point:
  - Right-click on the `Scanner_01a` GameObject and select `Create Empty`.
  - Rename the new child GameObject to `GripHandle`.
  - Position `GripHandle` on the grip area of the scanner (where a user would naturally hold it).
  - Rotate `GripHandle` so that the controller will align properly with the back of the scanner.
  - For correct hand alignment in XR, the `GripHandle` should follow this axis convention:
    - Z+ (blue): Points out the front of the scanner (in the direction of the controller’s grip).
    - Y+ (green): Points up.
    - X+ (red): Points left.
  - You can enable `Pivot` and `Local` mode in Unity’s `Scene` view to see the local axes while adjusting the transform. Try temporarily attaching a cube to `GripHandle` to visualize orientation (or use the gizmo sketch after these steps).
- Assign the Attach Transform:
  - Select the `Scanner_01a` GameObject.
  - In the `XR Grab Interactable` component, drag the `GripHandle` object into the `Attach Transform` field.
- Deploy and Test the Behavior:
  - Deploy the app to your device.
  - This makes the scanner sit properly in the user’s virtual hand, improving both comfort and realism when picked up.
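As an alternative to the temporary cube trick mentioned above, a small gizmo script can visualize the `GripHandle` axes directly in the Scene view. This is an optional, hypothetical helper (`AttachPointGizmo`) built on standard Unity gizmo calls:

```csharp
using UnityEngine;

// Optional visualization aid: draws the local axes of the GripHandle
// in the Scene view so its orientation is easy to verify.
public class AttachPointGizmo : MonoBehaviour
{
    [SerializeField] private float axisLength = 0.05f;

    void OnDrawGizmos()
    {
        // Z+ (blue) should point out the front of the scanner.
        Gizmos.color = Color.blue;
        Gizmos.DrawRay(transform.position, transform.forward * axisLength);

        // Y+ (green) should point up.
        Gizmos.color = Color.green;
        Gizmos.DrawRay(transform.position, transform.up * axisLength);

        // X+ (red) should point left per the convention above; check it
        // against your controller model.
        Gizmos.color = Color.red;
        Gizmos.DrawRay(transform.position, transform.right * axisLength);
    }
}
```

Attach it to `GripHandle` while tuning the transform, then remove it once the orientation looks right.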
Custom Gripper Models
To improve realism and avoid visual overlap or unintentional input, you can replace default hand models with your own Meta Quest controller prefabs, hide the models when grabbing, and disable joystick-based movement drift.
- In the `Hierarchy`, select the `LeftHand Controller` and `RightHand Controller`.
- In the `Inspector`, find the `XR Controller (Action-based)` component.
- Locate the section labeled `Model`, which includes `Model Prefab` and `Model Parent`.
- Drag your `XR Controller Left` or `XR Controller Right` prefab into the `Model Prefab` field.
- Align and scale your prefab appropriately so it visually matches the controller’s origin and rotation.
- Make sure your prefab’s renderers and meshes are not controlled by physics (no `Rigidbody` or `Collider` unless intentional) and do not interfere with hand colliders or XR interactions.
- Deploy and test. Your custom controller models should now appear in VR at runtime.
To animate XR controller models in Unity (e.g., hide on grab), locate the `XR Controller` component, enable the `Animate Model` checkbox, assign animation clip names to the `Model Select Transition` and `Model Deselect Transition` fields, and configure transitions using an `Animator Controller`.
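If you would rather drive visibility from script than through the animation fields above, a simple approach is to toggle the model on grab events. The following is a minimal sketch assuming XRI 2.x; `HideModelOnGrab` and its `controllerModel` field are hypothetical names, and you would assign the spawned model instance in the Inspector:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical alternative to Animate Model: toggles the controller model's
// visibility whenever this interactor grabs or releases an object.
[RequireComponent(typeof(XRBaseControllerInteractor))]
public class HideModelOnGrab : MonoBehaviour
{
    [SerializeField] private GameObject controllerModel; // assign the spawned model

    void Awake()
    {
        var interactor = GetComponent<XRBaseControllerInteractor>();
        interactor.selectEntered.AddListener(_ => controllerModel.SetActive(false));
        interactor.selectExited.AddListener(_ => controllerModel.SetActive(true));
    }
}
```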
Natural Interaction
In VR interactions, it is important that objects feel physically realistic when picked up—especially with ray-based grabbing, where the hand doesn’t directly touch the object. One common issue is that objects like the barcode scanner may unnaturally snap to a preset rotation when grabbed. To support more natural interaction, we can configure the system to preserve the object’s current rotation at the moment of pickup, while still letting it follow the controller afterward. Let’s revise our project such that the barcode scanner can be grabbed from any angle using a ray while maintaining its current orientation—just like in real life.
- Add/Confirm Required Components:
  - Select the `Scanner_01a` in the `Hierarchy`.
  - Add or confirm the following components:
    - `Rigidbody`: Enable `Use Gravity` and set `Collision Detection` to `Continuous Dynamic`.
    - `XR Grab Interactable`: Set `Movement Type` to `Velocity Tracking`, and enable `Track Position`, `Track Rotation`, and `Throw on Detach`.
- Leave `Attach Transform` Empty:
  - In the `XR Grab Interactable` component, make sure the `Attach Transform` field is left blank.
  - This prevents the scanner from snapping to a predefined pose.
- Attach the Rotation-Preserving Script:
  - In the `Hierarchy`, select the `LeftHand Controller` (or `RightHand Controller`) GameObject.
  - Ensure it has the `XR Ray Interactor` component.
  - Add the following script to preserve object rotation during ray grab:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class RayGrabPreserveRotation : MonoBehaviour
{
    private XRRayInteractor rayInteractor;

    void Awake()
    {
        rayInteractor = GetComponent<XRRayInteractor>();
        rayInteractor.selectEntered.AddListener(OnSelectEntered);
    }

    private void OnSelectEntered(SelectEnterEventArgs args)
    {
        if (args.interactorObject is XRRayInteractor ray &&
            args.interactableObject is XRGrabInteractable grab)
        {
            // Align the interactor's attach transform with the object's
            // current rotation so the grab does not snap to a preset pose.
            var attachTransform = ray.attachTransform;
            if (attachTransform != null)
            {
                attachTransform.rotation = grab.transform.rotation;
            }
        }
    }
}
```
- Test the Interaction:
  - Use your `Ray Interactor` to grab the barcode scanner from different orientations (e.g., tipped over, angled, or on its side).
  - Observe that the scanner is pulled toward your hand but keeps its original rotation, with no sudden snapping.
  - Move and rotate your hand naturally. The scanner now follows your motion smoothly, just like a real object would.
This setup creates a more natural, immersive experience for users working with handheld tools in XFactory’s logistics training scenarios.
Best Practices
Designing effective VR interactions requires attention to user experience, technical performance, and scene structure. The following best practices will help you create intuitive, efficient, and scalable grab interactions.
UX and Realism
Focus on creating interactions that feel natural, predictable, and immersive. Well-tuned physics, object scaling, and input responsiveness are essential for reinforcing presence in VR.
- Accurate Scaling: Always ensure that grabbable objects (e.g., tires, boxes, scanners) are modeled and scaled to real-world dimensions. Use reference tools like a “measuring stick” to compare sizes in Unity and maintain physical believability across all XFactory stations.
- Natural Grabbing (No Attach Transform): For objects like tires, toolboxes, or assembly parts, leave the `Attach Transform` field empty on the `XR Grab Interactable` to allow grabbing from any position with preserved hand orientation. This technique enhances realism by mimicking how users would pick up these items in the real world.
- Smooth but Responsive Behavior: Enable `Smooth Position` and `Smooth Rotation` on interactables, but fine-tune their values (e.g., 0.2–0.4) to prevent laggy movement or jitter. This is especially important when handling heavy tools (like a welding torch) or delicate items (like a headset in the exhibit station).
- Feedback and Clarity: Reinforce grab actions with subtle audio cues, haptic feedback (if supported), and visual indicators. For example, when a user picks up a barcode scanner, a gentle beep can signal readiness, supporting user confidence and immersion.
Performance Optimization
Smooth and stable performance is crucial, especially on mobile VR platforms (e.g., Meta Quest). Optimize physics and visuals to maintain low latency and high frame rates during all grab interactions.
- Simplified Colliders: Use primitive colliders (e.g., `Box Collider`, `Capsule Collider`) instead of `Mesh Collider` wherever possible, particularly for frequently grabbed items like boxes, tools, or drone parts. Simplified colliders drastically reduce the overhead of physics calculations.
- Efficient `Rigidbody` Settings: For objects with complex interactions (e.g., assembly parts that can be thrown or placed in sockets), adjust `Rigidbody` parameters mindfully:
  - Set an appropriate `mass` for realism (heavier for engine parts, lighter for handheld tools).
  - Use `Continuous Dynamic` collision detection to prevent tunneling.
  - Adjust `drag` and `angular drag` to simulate material friction or air resistance.
- Object Pooling for Interactables: When using repeatable objects (e.g., boxes in logistics or printed parts in production), implement pooling systems to reduce instantiation cost and garbage collection spikes during gameplay (see the sketch after this list).
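As an illustration of the pooling idea above, here is a compact sketch using Unity’s built-in `ObjectPool<T>` (available in Unity 2021+). The `BoxPool` class, `boxPrefab` field, and `Spawn`/`Despawn` method names are hypothetical:

```csharp
using UnityEngine;
using UnityEngine.Pool;

// Minimal pooling sketch: reuses logistics box instances instead of
// calling Instantiate/Destroy for every spawn.
public class BoxPool : MonoBehaviour
{
    [SerializeField] private GameObject boxPrefab; // hypothetical prefab reference

    private ObjectPool<GameObject> pool;

    void Awake()
    {
        pool = new ObjectPool<GameObject>(
            createFunc: () => Instantiate(boxPrefab),
            actionOnGet: box => box.SetActive(true),
            actionOnRelease: box => box.SetActive(false),
            actionOnDestroy: Destroy,
            defaultCapacity: 10);
    }

    public GameObject Spawn(Vector3 position)
    {
        var box = pool.Get();
        box.transform.position = position;
        return box;
    }

    public void Despawn(GameObject box) => pool.Release(box);
}
```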
Scene Organization
Structured scenes improve efficiency, keep workflows consistent, and make troubleshooting easier.
- Clean Hierarchy Structure: Group related objects under meaningful parent GameObjects (e.g., `AssemblyStation/Tools`, `LogisticsStation/Interactables`). This organization speeds up navigation, testing, and prefab modifications.
- Consistent Naming Conventions: Name GameObjects, prefabs, and scripts with clear, descriptive labels (e.g., `Tire_Assembly`, `Box_Logistics_A`, `Scanner_UI`). Avoid generic names like “Cube” or “Object001”.
- Logical Layer and Tag Usage: Define interaction-specific layers (e.g., `Grabbable`, `UI`, `MachineOnly`) and use tags to distinguish object roles (e.g., `Tool`, `SocketTarget`, `Fragile`). This practice allows for precise control over interaction filters, collision handling, and logic in your code (see the sketch after this list).
- Prefab Modularity: Turn frequently reused items (e.g., tools, teach pendants, screens) into modular prefabs with configurable grab settings. This allows you to reuse and adjust objects across multiple XFactory stations without duplicating logic or assets.
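To illustrate how tags can drive interaction logic as suggested above, here is a small hypothetical sketch; the `FragileHandler` name and the `Fragile` tag are assumptions drawn from the examples in this list:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Illustrative sketch: branches grab behavior based on the object's tag.
[RequireComponent(typeof(XRGrabInteractable))]
public class FragileHandler : MonoBehaviour
{
    void Awake()
    {
        var grab = GetComponent<XRGrabInteractable>();
        grab.selectEntered.AddListener(args =>
        {
            // Assumes a "Fragile" tag has been defined in the Tag Manager.
            if (CompareTag("Fragile"))
            {
                Debug.Log("Handle with care: fragile object grabbed.");
            }
        });
    }
}
```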
Key Takeaways
Grabbing objects in VR with Unity’s `XR Interaction Toolkit` centers on effectively configuring the `XR Grab Interactable` to balance realism, responsiveness, and user comfort. By understanding and applying movement types such as velocity tracking for natural throwing and kinematic for precision control, along with smoothing for tactile fidelity, developers can create interactions that feel intuitive and immersive. Features such as ray and near grab modes, custom attach points, preserved object rotation, and well-integrated feedback (audio and haptics) enhance realism and usability, while best practices in scaling, collider simplification, and scene organization ensure both performance and maintainability. Together, these approaches make VR object interaction both technically robust and engaging for the user.