D6. Touch and Sound in VR
Learning Outcomes
- Explain the role of multimodal feedback in VR. Before class, review how audio and haptic feedback enhance realism, awareness, and safety in engineering simulations, and think of scenarios where non-visual cues could improve user interaction.
- Set up haptic feedback triggers in Unity XR. In preparation, study XR Interaction Toolkit events like `On Hover Entered` and `On Select Entered`, and verify that your `XR Rig` and controllers are properly configured with `XR Ray Interactors`.
- Configure and test spatial audio in a VR environment. Ahead of the session, attach an `AudioSource` to a GameObject, adjust spatial audio settings such as `Spatial Blend` and `Min/Max Distance`, and have placeholder audio clips ready for scripting.
Multimodal VR Experiences
In VR, creating a truly immersive experience goes beyond realistic visuals. The senses of hearing and touch play pivotal roles in bridging the gap between the virtual and the physical world. In engineering applications—such as industrial simulations, training environments, and control systems—accurate auditory cues and haptic feedback are critical for:
- Enhanced Situational Awareness: Operators in control rooms or simulation environments can quickly gauge system status and spatial relationships without relying solely on visual displays.
- Improved Safety and Efficiency: Timely and clear sensory feedback helps users confirm actions (such as engaging emergency controls) and reduces the risk of errors.
- Realistic Simulation of Real-World Systems: Sound cues and tactile sensations, such as the click of a button or the vibration of a control panel, closely mimic physical interactions, making simulations more believable.
- Increased User Engagement: A rich, multimodal experience keeps users immersed in the virtual environment, essential for effective training and remote operations.
Engineering Use Cases
- Industrial Training Simulators: Teach operators to handle machinery, control panels, or robotic arms by providing immediate sensory feedback. In the assembly station of XFactory, users can be trained to work alongside a robotic arm to assemble a complex part like the V8 engine. As they correctly (or incorrectly) assemble each component, the system gives a confirmation beep and a short vibration pulse to the controller.
- Remote Monitoring and Teleoperation: Allow operators to “feel” and “hear” system statuses and anomalies when controlling equipment remotely. In the logistics station of XFactory, the drone is operated remotely. If the path becomes cluttered or a system error is detected (e.g., battery low or sensor failure), a warning tone is played, and a longer vibration indicates urgency.
- Design and Prototyping: Enable engineers to evaluate new control interfaces or equipment designs by simulating the sensory experience of interacting with physical components. In the exhibit station of XFactory, users can import and evaluate a prototype, touch-sensitive control panel. Each button provides a different audio-haptic profile to simulate physical differences (e.g., soft rubbery button vs. hard metallic switch).
- Maintenance and Troubleshooting: Help diagnose issues by associating specific haptic and audio cues with malfunctioning systems, such as abnormal vibrations or unexpected sounds. In the welding station of XFactory, a robot’s welding head alignment is faulty. When diagnostics are run, the VR system plays a distorted buzzing sound and sends a repetitive low-pulse vibration to indicate a hardware malfunction.
Design Principles
- Consistency and Predictability: Ensure each user action always produces the same combined audio–haptic response. A light tap on a console button might always play a soft click and deliver a brief, low-amplitude vibration; a heavy press should consistently yield a deeper thud paired with a stronger pulse. Predictable feedback builds trust and muscle memory.
- Context-Sensitive Intensity: Scale feedback strength and duration to the importance of the event. Routine actions (opening a drawer, toggling a switch) use short, subtle cues; critical events (safety warnings, system failures) use longer, layered signals—such as a rising alarm tone coupled with rhythmic pulses—to capture attention immediately.
- Synchronized Modalities: Align audio and haptic timing down to a few milliseconds. For example, when a virtual hammer strikes a nail, play the strike sound and trigger the controller’s impact vibration exactly together. This tight coupling reinforces the illusion of contact and weight (a minimal sketch of this pairing follows this list).
- Complementary Frequency Bands: Design haptic waveforms and audio frequencies to occupy different sensory channels. A high-frequency buzz can convey texture or fine details, while a low-frequency rumble signals mass or collision. Pair each with audio tones in a complementary range so they don’t compete for the user’s attention.
- User Comfort and Adaptability: Allow users to adjust overall feedback levels or switch to “audio-only” or “haptic-only” modes. Continuous, high-intensity vibrations can lead to fatigue; providing balance or mute controls prevents discomfort during extended sessions.
- Hardware Flexibility: Profile the capabilities of different VR controllers and headsets. Build feedback routines that degrade gracefully: if a device can’t reproduce a deep rumble, substitute a longer, lower-volume vibration with a matching low-pitch audio cue. Always test on the lowest-spec hardware you support.
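To make the synchronization and comfort principles concrete, here is a minimal, illustrative sketch. The `SyncedFeedback` class, its fields, and the comfort multiplier are assumptions rather than an established API; it only assumes the XR Interaction Toolkit's controller interactor, which the later examples also use.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical helper: fires a sound and a haptic pulse together, scaled by a
// user-adjustable comfort setting (field and class names are illustrative).
public class SyncedFeedback : MonoBehaviour
{
    public AudioSource audioSource;                    // plays the click at the interaction point
    public AudioClip clickClip;                        // assumed placeholder clip
    [Range(0f, 1f)] public float feedbackScale = 1f;   // user comfort multiplier
    public float baseAmplitude = 0.4f;
    public float duration = 0.1f;

    // Call this from an interaction event (e.g., Select Entered) and pass the
    // controller interactor so audio and haptics start on the same frame.
    public void Play(XRBaseControllerInteractor controller)
    {
        if (audioSource != null && clickClip != null)
        {
            audioSource.PlayOneShot(clickClip);
        }
        if (controller != null && feedbackScale > 0f)
        {
            controller.SendHapticImpulse(baseAmplitude * feedbackScale, duration);
        }
    }
}
```

Setting `feedbackScale` to `0` gives an audio-only mode, which is one way to satisfy the comfort and adaptability principle without rewriting interaction logic.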
Haptics in VR
In VR, haptics refers to the use of technology that simulates the sense of touch, allowing users to feel physical sensations in virtual environments. This is achieved through devices like gloves, vests, or controllers that provide feedback such as vibrations, pressure, or force, enhancing immersion and realism in virtual experiences. In XFactory, haptics bring machinery and tools to life. From operating a robot in the assembly station to pushing a button on a forklift dashboard in the logistics station, carefully designed haptic responses reinforce the user’s sense of presence and interaction. Key concepts include:
- Haptic Feedback: Haptics involve using vibration or force feedback to simulate the sense of touch. In VR, this replicates the feeling of physical interaction with virtual objects. In Unity XR, this is achieved using methods like `SendHapticImpulse(float amplitude, float duration)` on XR controllers (a minimal sketch follows this list).
- Tactile Sensation: This refers to the physical sensation provided by haptics. It is defined by parameters such as:
  - `Intensity` (amplitude): How strong the vibration is (range 0 to 1).
  - `Duration`: How long the vibration lasts (in seconds).
- Haptic Events: Predefined triggers (such as `On Hover Entered` or `On Select Entered`) that initiate a haptic response. These events are often associated with:
  - Hovering: When the user’s controller comes close to an interactable object.
  - Selection/Activation: When the user confirms an interaction (for instance, by pressing a button or grabbing an object).
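As a quick reference, both tactile parameters map directly onto that call. The sketch below is illustrative (the `HapticConceptDemo` class and `Buzz` method are assumed names, not part of the toolkit) and assumes an action-based XR controller is assigned in the `Inspector`:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Minimal concept sketch: one call covers both tactile parameters.
public class HapticConceptDemo : MonoBehaviour
{
    public XRBaseController controller;   // e.g., the LeftHand or RightHand Controller

    public void Buzz()
    {
        // Intensity (amplitude, 0 to 1) and Duration (seconds)
        controller.SendHapticImpulse(0.5f, 0.2f);
    }
}
```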
Principles
- Proportionality: The strength and duration of the haptic feedback should correlate with the significance of the interaction. For instance, a simple hover might produce a gentle buzz, while a full button press produces a stronger, longer impulse. In XFactory, a light buzz plays when hovering over the robotic tool tray in the assembly station, and a stronger pulse plays when grabbing the tool.
- Synchronization: Haptic cues should be tightly synchronized with visual and audio feedback. Simultaneous feedback helps reinforce the user’s perception of a unified interaction event. In XFactory, when a control panel button is pressed in the welding station, a light turns on, a click sound plays, and the user feels a pulse.
- Localization: Haptic feedback should feel as if it’s coming from the specific point of interaction. If a control on the panel is pressed, the corresponding controller should deliver the haptic impulse in a localized manner. In XFactory, when selecting the left control lever of a CNC machine, only the left controller vibrates.
- Device Sensitivity: Since different VR devices may have varying capabilities, it’s important to calibrate haptic feedback so that it is effective and comfortable across multiple hardware setups. For example, what feels like a strong pulse on a Quest controller may be subtle on a Vive.
Example: XFactory Haptics
To create meaningful and immersive haptic feedback in VR, we will build on the existing tire assembly example at the XFactory’s assembly station. This example involves grabbing a tire and attaching it to its base on a tire stand using lug nuts. Using Unity’s `XR Interaction Toolkit`, let’s go beyond default haptic settings by configuring both inspector-based parameters and extending functionality with custom C# scripts. Specifically, let’s introduce:
- Hover feedback: A light, short pulse when the controller is near the tire.
- Select feedback: A medium pulse when the tire is grabbed.
- Attach feedback: A stronger, longer pulse when the tire is successfully inserted into its socket on the stand.
This hands-on integration demonstrates how tactile cues can enhance spatial understanding and task precision in VR environments, making interactions more intuitive and engaging.
Prepare Scene & Controllers
- Set Up Your `XR Rig`:
  - Ensure your scene includes an `XR Rig` with both `LeftHand Controller` and `RightHand Controller` objects.
  - If `XR Rig` is not in the `Hierarchy`, go to `GameObject > XR > Room-Scale XR Rig`.
- Verify the Controllers:
  - In the `Hierarchy`, use `Ctrl`/`Cmd` + click to select both the `RightHand Controller` and `LeftHand Controller`. This allows you to apply changes simultaneously.
  - Each controller should include an `XR Controller (Action-based)` or `XR Ray Interactor` component. If not, add an `XR Ray Interactor` component to each.
- Enable Input Actions:
  - Your `XR Rig` should be using the `XR Default Input Actions` asset. These are required for triggering haptics via input bindings.
  - Check the `Input Action Manager` on the `XR Rig` to verify this.
- Deploy and Test:
  - Press triggers to feel haptic feedback on both controllers.
  - Test grip and UI interactions to confirm vibration is triggered.
  - Ensure haptics feel responsive and match the action.
  - Check for consistency across both hands (an optional scripted smoke test is sketched after this list).
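For the Deploy and Test step, a small throwaway script can confirm that haptics actually reach both controllers on the device. The sketch below is an assumed helper (not part of the XR Rig or the toolkit) that uses Unity's low-level XR input API to pulse each hand shortly after the scene loads:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR;

// Assumed smoke-test script: pulses both controllers after scene load so you
// can verify that haptic impulses are delivered on the target hardware.
public class HapticSmokeTest : MonoBehaviour
{
    public float amplitude = 0.5f;
    public float duration = 0.3f;

    IEnumerator Start()
    {
        yield return new WaitForSeconds(2f);  // give the rig time to initialize
        Pulse(XRNode.LeftHand);
        Pulse(XRNode.RightHand);
    }

    void Pulse(XRNode node)
    {
        InputDevice device = InputDevices.GetDeviceAtXRNode(node);
        if (device.isValid &&
            device.TryGetHapticCapabilities(out HapticCapabilities caps) &&
            caps.supportsImpulse)
        {
            device.SendHapticImpulse(0u, amplitude, duration);  // channel 0
        }
    }
}
```

Attach it to any GameObject in the scene, deploy, and remove it once both hands vibrate as expected.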
Configure Hover Feedback
The `Simple Haptics Feedback` component in the `XR Interaction Toolkit` provides a more flexible way to configure vibration feedback during common interaction events like hovering, selecting, or canceling. Let’s use it to create a light pulse when the user’s controller hovers near the tire in the assembly station.
- Add the `Simple Haptics Feedback` Component:
  - In the `Hierarchy`, select the `LeftHand Controller` (repeat for `RightHand Controller` if needed).
  - In the `Inspector`, click `Add Component`.
  - Search for and add `Simple Haptics Feedback`.
- Enable Hover Enter Feedback:
  - In the `Simple Haptics Feedback` component, scroll to the `Hover` section.
  - Check the box next to `Play Hover Entered`.
- Set Hover Feedback Parameters:
  - Set `Amplitude` to `0.25` for a light vibration.
  - Set `Duration` to `0.1` to create a short pulse.
  - Set `Frequency` to `0.0` (use the default frequency, or leave at 0 if unsure).
  These values create a gentle, noticeable cue as users move close to the tire, ideal for subtle prompts.
- Test Hover Feedback in Scene:
  - Press `Play` (if using the Meta Link app) or deploy the app to your headset.
  - Move your controller near the tire positioned on the table.
  - You should feel a soft pulse as the ray interactor enters hover range.
  This hover interaction feels intuitive and non-intrusive, perfect for signaling interactivity across the factory — like near tools, robot controls, or digital screens.
- Add Object-Specific Hover Haptics (Optional):
  - Optionally, write a script to enable more granular control (e.g., different objects trigger different vibrations) and define feedback per object, rather than globally on the controller:
```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

[RequireComponent(typeof(XRBaseInteractable))]
public class CustomHoverHaptics : MonoBehaviour
{
    public float intensity = 0.2f;
    public float duration = 0.1f;

    private void OnEnable()
    {
        var interactable = GetComponent<XRBaseInteractable>();
        interactable.hoverEntered.AddListener(OnHoverEntered);
    }

    private void OnDisable()
    {
        var interactable = GetComponent<XRBaseInteractable>();
        interactable.hoverEntered.RemoveListener(OnHoverEntered);
    }

    private void OnHoverEntered(HoverEnterEventArgs args)
    {
        if (args.interactorObject is XRBaseControllerInteractor controllerInteractor)
        {
            controllerInteractor.SendHapticImpulse(intensity, duration);
        }
    }
}
```
- Configure the Script:
  - Select the `Tire` GameObject in the `Hierarchy`.
  - Click `Add Component`, then add `CustomHoverHaptics.cs`.
  - Tweak `Intensity` and `Duration` in the `Inspector` for a different sensation unique to this object.

Use the `Simple Haptics Feedback` component for controller-based interaction logic. Use the custom script when you want different objects (e.g., torque wrench vs. tire) to feel distinct when hovered.
Configure Select Feedback
Now we will configure stronger haptic feedback to reinforce intentional actions, like grabbing a tire, pressing a button, or toggling a switch. In this case, the pulse confirms the user has successfully selected (grabbed) the tire in the assembly station.
- Add the `Simple Haptics Feedback` Component (if not already added):
  - In the `Hierarchy`, select the `LeftHand Controller` (repeat for `RightHand Controller`, if needed).
  - In the `Inspector`, click `Add Component`.
  - Search for and add `Simple Haptics Feedback`.
- Enable Select Enter Feedback:
  - In the `Simple Haptics Feedback` component, scroll to the `Select` section.
  - Check the box for `Play Select Entered`.
- Set Select Feedback Parameters:
  - Set `Amplitude` to `0.5` to make it more pronounced than hover.
  - Set `Duration` to `0.25` so the vibration lasts longer, signifying a “confirmed” action.
  - Set `Frequency` to `0.0` unless using a custom frequency pattern.
  This pulse acts like a tactile confirmation — perfect for grabbing the tire, flipping a safety switch, or initiating a machine cycle.
- Test Select Feedback in Scene:
  - Press `Play` (if using the Meta Link app) or deploy the app to your headset.
  - Aim at the tire, and grab it using the interactor (direct or ray).
  - You should feel a firm pulse when the `Play Select Entered` event fires, confirming successful selection.
- Add Custom Select Feedback Per Object (Optional):
  - Similar to the custom hover haptics example, you can configure different objects to give unique haptic responses on selection. Here’s a custom script you can attach directly to the object:
```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

[RequireComponent(typeof(XRBaseInteractable))]
public class CustomSelectHaptics : MonoBehaviour
{
    public float intensity = 0.5f;
    public float duration = 0.25f;

    private void OnEnable()
    {
        var interactable = GetComponent<XRBaseInteractable>();
        interactable.selectEntered.AddListener(OnSelectEntered);
    }

    private void OnDisable()
    {
        var interactable = GetComponent<XRBaseInteractable>();
        interactable.selectEntered.RemoveListener(OnSelectEntered);
    }

    private void OnSelectEntered(SelectEnterEventArgs args)
    {
        if (args.interactorObject is XRBaseControllerInteractor controllerInteractor)
        {
            // The interactor's SendHapticImpulse takes amplitude and duration;
            // frequency is configured on the Simple Haptics Feedback component instead.
            controllerInteractor.SendHapticImpulse(intensity, duration);
        }
    }
}
```
- Configure the Script:
  - Select the `Tire` GameObject in the `Hierarchy`.
  - Click `Add Component`, and choose `CustomSelectHaptics.cs`.
  - Modify the `Intensity` and `Duration` values in the `Inspector` for the desired feedback.
Configure Attach Feedback
When the user successfully attaches the tire onto its platform (i.e., socket) at the tire stand in the assembly station, we want to simulate a clear tactile “click” — like snapping something firmly into place. When the socket detects the tire has been correctly placed, the controller delivers a firm, long vibration. This feedback adds a strong sense of completion to the interaction. Here is how to configure this behavior:
- Add the `Simple Haptics Feedback` Component (if not already added):
  - In the `Hierarchy`, select the `LeftHand Controller` (and `RightHand Controller`, if applicable).
  - In the `Inspector`, click `Add Component`.
  - Add `Simple Haptics Feedback`.
- Set Attach Feedback Parameters:
  - Scroll to the `Select` section.
  - Check the box labeled `Play Select Exited`.
  - Set `Amplitude` to `0.7` – strong and satisfying.
  - Set `Duration` to `0.35` – longer to simulate the weight and resistance of snapping the tire in.
  - Set `Frequency` to `0.0` – leave at 0 unless using a specific pulse frequency.
  Why this event? In socket-based interactions, the grabbable is often released (select exited) when it snaps into the socket. This makes `Select Exited` a good trigger for “attachment complete” feedback. This feedback makes it feel like the tire is firmly and correctly mounted on the stand — just like snapping a car wheel into place.
- Test Attach Feedback in Scene:
  - Enter `Play` mode (with the Link app) or deploy the app.
  - Grab the tire and place it into the socket on the tire stand.
  - Once inserted and released, the `Play Select Exited` event fires — triggering a deep, confident pulse on the controller.
This sensation pairs well with visual alignment cues and a click sound for multisensory feedback.
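If you prefer per-object control over the attach pulse, mirroring the optional hover and select scripts above, one possible approach is sketched below. This is an assumption rather than built-in toolkit behavior: the script remembers the controller that last grabbed the tire and fires a strong pulse when an `XR Socket Interactor` takes over the selection.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical sketch: plays a strong "snap" pulse on the controller that
// last held the object when a socket (e.g., the tire stand) captures it.
[RequireComponent(typeof(XRGrabInteractable))]
public class CustomAttachHaptics : MonoBehaviour
{
    public float intensity = 0.7f;
    public float duration = 0.35f;

    private XRGrabInteractable grabInteractable;
    private XRBaseControllerInteractor lastController;

    private void OnEnable()
    {
        grabInteractable = GetComponent<XRGrabInteractable>();
        grabInteractable.selectEntered.AddListener(OnSelectEntered);
    }

    private void OnDisable()
    {
        grabInteractable.selectEntered.RemoveListener(OnSelectEntered);
    }

    private void OnSelectEntered(SelectEnterEventArgs args)
    {
        // Remember the controller that grabbed the tire...
        if (args.interactorObject is XRBaseControllerInteractor controller)
        {
            lastController = controller;
        }
        // ...and fire the attach pulse when a socket takes over the selection.
        else if (args.interactorObject is XRSocketInteractor && lastController != null)
        {
            lastController.SendHapticImpulse(intensity, duration);
        }
    }
}
```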
Audio in VR
Audio in VR plays a crucial role in creating immersive and realistic experiences by simulating 3D spatial sound that corresponds to the user’s environment and physical movements. It enhances presence, emotional engagement, and user understanding by allowing users to perceive direction, distance, and material characteristics of sound sources, just as they would in a real-world engineering environment. Key concepts include:
- Audio Feedback: Audio feedback uses sound cues to indicate interactions, state changes, or successful task completions. These auditory signals can be subtle (e.g., soft beeps for UI interaction) or mechanical (e.g., metallic clunks when machinery operates). In XFactory, placing a box into a rack at the logistics station can trigger a satisfying “clunk” sound, reinforcing task completion.
- Spatial Audio: This technique adjusts sound characteristics—such as stereo balance, volume, and filtering—based on the user’s position and orientation. It provides directional awareness, helping users interpret where sounds are coming from and how far away they are. In the manufacturing station of XFactory, the CNC machines should sound louder as users walk toward them and quieter as they move away.
- Audio Events: Audio events are specific triggers that play sound effects during interactions. These include object grabs, button presses, or environmental triggers (like a door opening). In the assembly station, interacting with the large display screen should trigger a crisp UI confirmation tone (a minimal sketch follows this list).
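As a minimal illustration of such an audio event, the sketch below plays a short confirmation clip whenever an interactable (such as the display screen) is selected. The `UiConfirmationTone` class name and clip field are assumptions; only the XR Interaction Toolkit event and `AudioSource.PlayOneShot` come from the APIs used elsewhere in this lesson.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical sketch: plays a short confirmation tone when this interactable is selected.
[RequireComponent(typeof(XRBaseInteractable), typeof(AudioSource))]
public class UiConfirmationTone : MonoBehaviour
{
    public AudioClip confirmClip;   // assumed placeholder clip
    private AudioSource audioSource;

    void OnEnable()
    {
        audioSource = GetComponent<AudioSource>();
        GetComponent<XRBaseInteractable>().selectEntered.AddListener(OnSelected);
    }

    void OnDisable()
    {
        GetComponent<XRBaseInteractable>().selectEntered.RemoveListener(OnSelected);
    }

    void OnSelected(SelectEnterEventArgs args)
    {
        if (confirmClip != null)
        {
            audioSource.PlayOneShot(confirmClip);
        }
    }
}
```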
Principles
- Immersion & Presence: Craft audio that aligns seamlessly with the visual and interactive world. Every machine, tool, or environment element should emit appropriate sounds—whether it’s the steady, low-pitched hum of a CNC lathe or the distant echo of footsteps in a factory hall. Even subtle layers of ambient noise (airflow, distant machinery, electrical buzz) reinforce spatial context and keep users “in” the scene.
- Spatial & Directional Cues: Use accurate 3D audio positioning so users can pinpoint sound sources at a glance. In a multi-station workshop, they should hear a welding arc crackle to their left or a conveyor belt whirring behind them. Consistent volume attenuation and HRTF-based panning help operators quickly locate equipment or hazards without breaking visual focus.
- Clarity & Consistency: Define a clear “audio vocabulary” for key actions and system states. Reuse the same short beep for every successful control-panel selection or the same alert tone for safety warnings across all modules. Predictable, unambiguous sounds build muscle memory and reduce cognitive load, so users instantly recognize outcomes and system responses.
- Layered Soundscapes: Build depth by stacking multiple audio layers (see the sketch after this list):
  - Ambient Base: Continuous environmental loops (air handling units, distant machinery).
  - Functional Mid-Layer: Operational sounds tied to active components (motors, hydraulics).
  - Interactive Top-Layer: Event-driven cues (button clicks, error tones).
  By blending these levels thoughtfully—mixing quieter ambient tracks under louder interactive cues—you create a rich, believable soundscape that enhances situational awareness without masking critical feedback.
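One way to realize this layering is to give each layer its own `AudioSource` so volumes can be mixed independently. The sketch below is an assumed illustration; the class, field names, and volume values are placeholders, not a prescribed setup.

```csharp
using UnityEngine;

// Assumed illustration of the three-layer idea: ambient and functional loops
// run continuously at different volumes, while interactive cues are one-shots.
public class LayeredSoundscape : MonoBehaviour
{
    public AudioClip ambientLoop;          // e.g., air handling units
    public AudioClip functionalLoop;       // e.g., motors, hydraulics
    public AudioSource interactiveSource;  // used for event-driven cues

    void Start()
    {
        CreateLayer(ambientLoop, volume: 0.2f);     // quiet base layer
        CreateLayer(functionalLoop, volume: 0.5f);  // mid layer tied to machinery
    }

    AudioSource CreateLayer(AudioClip clip, float volume)
    {
        if (clip == null) return null;
        var source = gameObject.AddComponent<AudioSource>();
        source.clip = clip;
        source.loop = true;
        source.volume = volume;
        source.spatialBlend = 1f;   // 3D, so layers attenuate with distance
        source.Play();
        return source;
    }

    // Interactive top layer: call this from interaction events.
    public void PlayCue(AudioClip cue)
    {
        if (interactiveSource != null && cue != null)
        {
            interactiveSource.PlayOneShot(cue);
        }
    }
}
```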
Event-Driven Audio
In VR, event-driven audio feedback is triggered in response to specific game or application events, such as footsteps, machinery actions, UI interaction, or object manipulation. It creates a dynamic and interactive sound experience that responds to user behavior in real time, enhancing immersion by responding to user interactions. Let’s demonstrate how to trigger different sounds when a large box is dropped onto various surfaces inside the logistics station of the `XFactory`. We will use the `XR Interaction Toolkit` for grabbing and dropping, an `AudioSource` for playing sounds, and physics-based collision detection for triggering different clips.
- Prepare Your Environment:
  - Inside the logistics station, create or place a grabbable box (e.g., `Box_Large_01a_Prefab_01`).
  - Make sure the box has a `Rigidbody`, a `Collider`, and an `XR Grab Interactable`.
  - Position the box next to the scale (`Industrial_Scale_01a`) and a wooden pallet (e.g., `Pallet_02_Prefab_02`).
  - Confirm the scale, pallet, and concrete floor have appropriate colliders.
- Create and Configure an Audio Trigger Script:
  - Make sure your box prefab (`Box_Large_01a_Prefab_01`) has an `AudioSource` component. If not, the script will automatically add one at runtime, but it’s good practice to add and configure it in advance if you want to control volume, spatial blend, or other settings.
  - Create and attach the following script to your box prefab:
```csharp
using UnityEngine;

public class SurfaceAudioTrigger : MonoBehaviour
{
    public AudioClip metalClip;
    public AudioClip woodClip;
    public AudioClip concreteClip;

    private AudioSource audioSource;

    void Start()
    {
        audioSource = GetComponent<AudioSource>();
        if (audioSource == null)
        {
            audioSource = gameObject.AddComponent<AudioSource>();
        }
    }

    private void OnCollisionEnter(Collision collision)
    {
        string tag = collision.gameObject.tag;
        switch (tag)
        {
            case "Metal":
                PlaySound(metalClip);
                break;
            case "Wood":
                PlaySound(woodClip);
                break;
            case "Concrete":
                PlaySound(concreteClip);
                break;
        }
    }

    void PlaySound(AudioClip clip)
    {
        if (clip != null)
        {
            audioSource.PlayOneShot(clip);
        }
    }
}
```
- Configure the Script:
  - Select the box GameObject in the `Hierarchy`.
  - In the `Inspector`, find the `Surface Audio Trigger` component.
  - Drag appropriate audio clips into the three fields: `Metal Clip`, `Wood Clip`, and `Concrete Clip`.
- Tag the Surfaces:
  - Tag the scale `Metal`.
  - Tag the pallet `Wood`.
  - Tag the floor `Concrete`.
  - If these tags do not exist yet, go to `Tags & Layers > Add Tag`, add `Metal`, `Wood`, and `Concrete`, and assign them to the respective objects.
- Test the Behavior:
- Play the scene (using the Link app) or deploy it to your VR headset.
- Pick up the box using the controller.
- Drop it on each surface.
- Listen for context-aware impact audio feedback.
This setup showcases dynamic feedback based on user action and environment context, tightly integrating VR interaction and audio design. You can expand this system to other factory objects—like different drop sounds for metal parts in the production station, or warning buzzers if items are dropped in restricted zones (a sketch of the latter follows).
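The restricted-zone idea could be realized with a trigger volume placed over the zone. The class name, tag, and clip below are assumptions for illustration; only the standard Unity collider and audio calls are taken as given.

```csharp
using UnityEngine;

// Assumed extension of the example above: a trigger volume that plays a
// warning buzzer when a tagged item (e.g., "Box") enters a restricted zone.
[RequireComponent(typeof(AudioSource))]
public class RestrictedZoneBuzzer : MonoBehaviour
{
    public AudioClip warningBuzzer;        // placeholder clip
    public string restrictedTag = "Box";   // assumed tag on droppable items

    private AudioSource audioSource;

    void Start()
    {
        audioSource = GetComponent<AudioSource>();
    }

    // Requires a Collider on this GameObject with "Is Trigger" enabled.
    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag(restrictedTag) && warningBuzzer != null)
        {
            audioSource.PlayOneShot(warningBuzzer);
        }
    }
}
```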
3D Spatial Audio
3D spatial audio simulates sound sources in a three-dimensional space, allowing users to perceive direction, distance, and environmental effects. This is especially useful in immersive VR environments like your `XFactory`, where spatial context enhances realism and user awareness. Let’s enable ambient machine sound for the `CNC_Mill_Set` GameObject in the manufacturing station using Unity’s 3D audio features.
- Prepare the CNC Machine:
  - In the `Hierarchy`, locate `CNC_Mill_Set`.
  - Confirm it already has an `Audio Source` component (or add one if not).
- Configure the `Audio Source`:
  - Go to the `Audio Source` component.
  - `Audio Resource`: Assign a looping ambient sound (e.g., `machine.ogg`) that represents the machine hum.
  - `Play On Awake`: Enable this so the sound starts automatically when the scene loads.
  - `Loop`: Enable this to keep the sound playing continuously.
  - `Spatial Blend`: Set to `1.0` to make the sound fully 3D and spatially aware.
  - `Min Distance`: Set to `3` meters. The sound will be at full volume within this range.
  - `Max Distance`: Set to `20` meters. Beyond this distance, the sound fades out gradually.
  Optionally, adjust `Doppler Level` to simulate pitch shift when the user or object moves, or `Spread` to control stereo width for large machines.
- Automate Playback and 3D Settings (Optional):
  - Attach the script below to the `CNC_Mill_Set` GameObject.
  - Assign an ambient clip such as `machine.ogg` in the `Inspector`.
```csharp
using UnityEngine;

// This script plays a looping ambient sound (e.g., CNC machine hum) in the environment.
[RequireComponent(typeof(AudioSource))]
public class AmbientMachineSound : MonoBehaviour
{
    [Tooltip("The looping ambient sound clip to play (e.g., CNC hum)")]
    public AudioClip ambientClip;

    private AudioSource audioSource;

    void Start()
    {
        // Get the AudioSource component (guaranteed by RequireComponent)
        audioSource = GetComponent<AudioSource>();

        // Configure AudioSource for spatial audio playback
        audioSource.clip = ambientClip;
        audioSource.loop = true;
        audioSource.playOnAwake = true;
        audioSource.spatialBlend = 1.0f;  // Fully 3D
        audioSource.minDistance = 3f;     // Full volume within 3 meters
        audioSource.maxDistance = 20f;    // Sound fades beyond 20 meters

        // Play the ambient sound
        audioSource.Play();
    }
}
```
The script will assign the clip to the `AudioSource`, enable looped playback and 3D spatial blending, set appropriate near and far distances, and start the sound automatically on scene load. Users will hear the machine hum fade in and out based on their proximity. This is ideal for use with prefabs or when managing multiple machines — saving time and ensuring consistent audio behavior.
- Add an `Audio Reverb Zone` for Factory Acoustics (Optional):
  - In the `Hierarchy`, right-click and choose `Audio > Audio Reverb Zone`.
  - Position the zone to fully encompass the desired area (e.g., the manufacturing station).
  - Go to the `Inspector`.
  - Set `Min Distance` to `5` meters. Inside this range, full reverb is applied.
  - Set `Max Distance` to `15` meters. Outside this, reverb fades out completely.
  - Set `Reverb Preset` to a built-in preset such as `Hangar` or `Room` to simulate realistic industrial acoustics.
Reverb zones only affect 3D audio sources within their area. As the user moves in or out of the zone, sounds dynamically gain or lose reverb — enhancing the spatial feel of different factory zones.
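If you manage several stations as prefabs, the same reverb settings can also be applied from a script. The sketch below is an assumed helper (not required by Unity) that configures the `Audio Reverb Zone` component at startup; `Hangar` is one of Unity's built-in reverb presets suited to large industrial spaces.

```csharp
using UnityEngine;

// Assumed helper: configures an Audio Reverb Zone from script so prefabs for
// different stations can carry their own acoustic settings.
[RequireComponent(typeof(AudioReverbZone))]
public class StationReverb : MonoBehaviour
{
    public float minDistance = 5f;    // full reverb inside this radius
    public float maxDistance = 15f;   // reverb fades out beyond this radius
    public AudioReverbPreset preset = AudioReverbPreset.Hangar;  // large industrial space

    void Start()
    {
        var zone = GetComponent<AudioReverbZone>();
        zone.minDistance = minDistance;
        zone.maxDistance = maxDistance;
        zone.reverbPreset = preset;
    }
}
```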
Proximity-Triggered Audio
A `Proximity-Triggered Audio` setup plays sound only when the user is within a defined range of a sound-emitting object. This improves realism and performance by avoiding unnecessary audio playback in distant or inactive areas. Instead of always playing machine audio, let’s make the `CNC_Mill_Set` emit sound only when the user approaches, mimicking a real-world scenario where machine noise becomes prominent only when nearby.
- Add and Configure the Audio Source:
  - Select the `CNC_Mill_Set` GameObject in the manufacturing station.
  - Add an `Audio Source` component if it doesn’t already exist.
  - Go to the `Inspector`.
  - `Audio Clip`: Assign a mechanical loop (e.g., `machine.ogg`).
  - `Loop`: Enabled.
  - `Play On Awake`: Disabled (sound will be triggered via script).
  - `Spatial Blend`: `1` (fully 3D).
  - `Min Distance`: `3` (full volume within 3 m).
  - `Max Distance`: `20` (fades out gradually).
- Add and Configure the Proximity Trigger Script:
- Prepare a script that checks the distance between the user’s headset (camera) and the CNC machine. If within a specified radius, the sound plays. If the user walks away, the sound stops.
```csharp
using UnityEngine;

/// Plays machine sound when the user is near the CNC_Mill_Set object.
public class ProximityAudioTrigger : MonoBehaviour
{
    public AudioClip machineClip;
    public Transform userHead;
    public float playDistance = 6f;

    private AudioSource audioSource;

    void Start()
    {
        audioSource = GetComponent<AudioSource>();
        audioSource.clip = machineClip;
        audioSource.loop = true;
        audioSource.spatialBlend = 1f;
        // Do not play on awake; playback is controlled by proximity
    }

    void Update()
    {
        if (userHead == null) return;

        float distance = Vector3.Distance(userHead.position, transform.position);

        if (distance < playDistance && !audioSource.isPlaying)
        {
            audioSource.Play();
        }
        else if (distance >= playDistance && audioSource.isPlaying)
        {
            audioSource.Stop();
        }
    }
}
```
- Configure the Script:
  - Attach the `ProximityAudioTrigger` script to the same `CNC_Mill_Set` GameObject.
  - Go to the `Inspector`.
  - Drag your audio clip (e.g., `machine.ogg`) into the `Machine Clip` field.
  - Drag the `Main Camera` (`XR Rig > Camera Offset > Main Camera`) into the `User Head` field.
  - Adjust the `Play Distance` value to control when the sound activates (e.g., `6` meters for a moderate range).
- Play and Test:
- Enter Play Mode in Unity with your VR headset connected.
- Walk toward the CNC machine and listen for the audio to fade in naturally as you approach.
- Move away and notice the audio fading out until it stops completely.
- Try different approach angles to confirm the sound spatialization feels realistic.
- Adjust `Play Distance`, `Min Distance`, and `Max Distance` in the `Inspector` if the audio triggers too soon, too late, or feels unnatural.
Proximity-triggered audio limits active audio sources in large scenes, reducing processing load. It also mimics how object sounds become audible only when the user is nearby and enhances spatial awareness by tying audio to physical proximity.
Key Takeaways
Designing VR experiences that incorporate both touch and sound transforms user interaction from purely visual engagement into a multisensory, intuitive process. In engineering simulations like XFactory, synchronized haptic and audio cues enhance realism, situational awareness, and task precision—whether it’s a gentle buzz to guide a hand, a crisp click confirming an action, or spatial audio placing machine noise exactly where it belongs. Effective implementations follow clear principles: feedback should be consistent, proportional to the action’s importance, contextually synchronized across senses, and adaptable to user comfort and hardware capabilities. By layering and tailoring these cues to different scenarios—training, remote operation, design evaluation, or troubleshooting—developers can create VR environments that are not only more immersive but also safer, more efficient, and more engaging for users.