F01


Learning Outcomes

  • Describe the role of spatial UIs in VR environments. Before class, review how immersive UIs support interaction, control, and feedback, drawing on engineering examples like CNC HMIs, teach pendants, and super displays in the XFactory environment.
  • Explain how VR UIs function as input systems. Ahead of the session, study how components such as buttons, toggles, and sliders can trigger actions like animations or custom scripts, focusing especially on the CNC HMI example.
  • Explain how VR UIs function as output systems. To prepare, explore how UIs convey system status, warnings, and environmental data, using the logistics scale example that outputs weight and visual alerts as a reference.
  • Apply best practices for VR UI design. In advance, identify 2–3 key principles—such as clarity, performance, or ergonomic design—and reflect on how they improve trust, comfort, and usability in VR engineering contexts.

Role of UI in VR

User interfaces (UIs) in VR function as the critical link between the user and the virtual environment, enabling inputs that affect the system and providing outputs that reflect changes or feedback from it. Imagine virtual control panels for industrial machinery, immersive data dashboards for real-time monitoring, or collaborative design reviews in a simulated environment. VR UIs support intuitive, spatial interactions, allowing engineers to manipulate digital representations of physical systems with precision and ease. Core Unity UI definitions are as follows:

  • User Interface (UI): In VR, a UI is a spatial, interactive system element embedded in the 3D world, acting as both an input mechanism and an output surface. For example, in XFactory, the robot teach pendant is a virtual tablet near the UR10e robot. Users interact with it to start or stop sequences, jog joints, or trigger safety overrides—clear examples of UI as input. In response, the UI provides output through visual changes, audio cues, or haptic feedback confirming the action.

  • World Space UI: A UI that exists within 3D space like any other object in the virtual world. These UIs allow physical-like interaction (input) and visible system response (output). In XFactory, the CNC HMI is a wall-mounted panel on the machine. Users interact with its buttons and sliders to control machine behavior (input), and the panel displays real-time system status and parameters like spindle speed or G-code load state (output).

  • Screen Space UI: A UI that is fixed to the user’s view, independent of head movement or world position. While rarely used in immersive VR due to its break in spatial continuity, screen space UI can serve targeted output functions in XFactory—such as temporary overlays for calibration, loading indicators, or system warnings that must be seen regardless of user orientation.

  • XR Ray Interactor: A component of the XR Interaction Toolkit that enables users to point at and interact with UI elements at a distance using a laser-style pointer. This tool facilitates input when users select buttons, sliders, or menu items without walking up to the UI. In the curved super display at the exhibit station, users employ ray interaction to control data views, play training content, or filter performance metrics—all while receiving responsive output in the form of animations, audio cues, and updated visual states.

  • Event System: Unity’s internal system that processes and routes user interactions to the appropriate UI elements. For VR applications, it must include the XR UI Input Module, which enables controller-based interaction (e.g., raycasting or direct touch) to properly register events like OnClick or OnValueChanged. In XFactory, this system ensures that inputs from the user—such as pressing the “Emergency Stop” button—are correctly interpreted and trigger output responses like system shutdown indicators or alert messages on the display.
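
To make these pieces concrete, the following minimal sketch (not part of XFactory, just an illustration) checks at runtime that a world-space Canvas is wired for XR interaction—World Space render mode, a Tracked Device Graphic Raycaster, and an Event System using the XR UI Input Module. It assumes the XR Interaction Toolkit is installed; the TrackedDeviceGraphicRaycaster and XRUIInputModule types live in the UnityEngine.XR.Interaction.Toolkit.UI namespace in recent toolkit versions, but exact namespaces may vary by version.

    using UnityEngine;
    using UnityEngine.EventSystems;
    using UnityEngine.XR.Interaction.Toolkit.UI;  // TrackedDeviceGraphicRaycaster, XRUIInputModule

    // Attach to a world-space Canvas to verify it is ready for XR interaction.
    public class XRCanvasCheck : MonoBehaviour
    {
        void Start()
        {
            var canvas = GetComponent<Canvas>();
            if (canvas == null || canvas.renderMode != RenderMode.WorldSpace)
                Debug.LogWarning($"{name}: Canvas should use the World Space render mode for in-scene UI.");

            // XR Ray Interactors can only hit UI elements on a canvas with this raycaster.
            if (GetComponent<TrackedDeviceGraphicRaycaster>() == null)
                gameObject.AddComponent<TrackedDeviceGraphicRaycaster>();

            // The Event System must route XR input (XR UI Input Module instead of the
            // Standalone Input Module) for events like OnClick or OnValueChanged to fire.
            var eventSystem = EventSystem.current;
            if (eventSystem == null || eventSystem.GetComponent<XRUIInputModule>() == null)
                Debug.LogWarning("Scene needs an Event System with an XR UI Input Module.");
        }
    }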

F02

In XFactory, VR UIs are integrated across the facility to mimic real industrial applications:

  • Robot Teach Pendant: A world-space UI with buttons for jogging robot joints, initiating pick/place routines, and loading saved paths (UI as input).
  • CNC HMI: A touch display with sliders to adjust spindle speed, buttons to load G-code, and real-time status indicators (UI as input and output).
  • Large Touch Displays: Used for real-time monitoring, safety overrides, and visualizing data analytics (primarily UI as output).
  • Curved Super Display: A panoramic interface for exploring factory performance metrics, digital twins, and training modules using ray-based interaction (UI as input and output).

Key Concepts

  • UI as Input: UIs in VR allow users to send signals to the environment—controlling machines, activating functions, or changing system states. These are deliberate user actions, typically through buttons, sliders, toggles, or ray interactions. For instance, in the robot teach pendant, users jog robot joints, initiate routines, or load saved paths. Each of these UI elements serves as an input—triggering corresponding actions in the virtual system.

  • UI as Output: UIs in VR also serve as a surface where the environment communicates back to the user—providing feedback, displaying system status, and offering confirmation or alerts. Outputs may take the form of visual changes (like a button flashing), audio cues, updated text, or system-generated animations. For example, the CNC HMI panel shows current spindle speed, loaded program status, or warning indicators based on the machine’s virtual behavior.

  • Spatial Awareness: Users interact with VR UIs from within a 3D world, so interfaces must be placed with depth and user mobility in mind. In the robot teach pendant, UI panels are placed within arm’s reach but allow slight movement without losing focus. In contrast, the super display in the exhibit station uses distance-based layout to give an immersive data exploration experience.

  • Interactivity and Feedback: Every action in VR should respond with multimodal feedback. When users press the “Start Spindle” button on the CNC HMI, a button press sound plays, the button visually animates (color change or scale), and the display updates to show machine status. In the assembly station display, sliders animate smoothly, and button presses may trigger robot arms or display warnings—each accompanied by audio and visual cues (see the feedback sketch after this list).

  • Immersion and Presence: UI elements must match the visual fidelity and logic of the environment. The large touch displays mounted across the factory use metal-like frames and appropriate screen textures to blend into the industrial setting. Even though they are interactive, their design mirrors real-world hardware, maintaining the illusion of presence.

  • Ergonomics and Usability: UI design must consider hand/controller ergonomics. The teach pendant includes large buttons (5-8 cm in world space) with ample spacing to avoid accidental presses. The CNC panel also places critical controls like “Emergency Stop” in obvious, central positions with distinguishable colors (red for stop, green for start). Layout mimics real industrial HMIs to reduce learning curves.

  • Consistency and Clarity: Across the factory, UI panels use a consistent visual style: same button size, fonts, border designs, and interaction logic. Whether a user is in the welding station or the exhibit area, a “Back” button looks and behaves the same. This consistency reduces user errors and increases trust in the system.
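
As a minimal illustration of the interactivity-and-feedback concept above, the sketch below flashes a button’s color and plays a click sound whenever it is pressed. The component and field names (ButtonPressFeedback, clickSource) are hypothetical and not part of XFactory.

    using System.Collections;
    using UnityEngine;
    using UnityEngine.UI;

    // Minimal multimodal feedback: flash the button's color and play a click sound on press.
    [RequireComponent(typeof(Button))]
    public class ButtonPressFeedback : MonoBehaviour
    {
        public AudioSource clickSource;            // assign a short click sound
        public Color pressedColor = Color.cyan;    // brief highlight color
        public float flashDuration = 0.15f;

        private Image image;
        private Color originalColor;

        void Awake()
        {
            image = GetComponent<Image>();
            originalColor = image != null ? image.color : Color.white;
            GetComponent<Button>().onClick.AddListener(OnPressed);
        }

        private void OnPressed()
        {
            if (clickSource != null) clickSource.Play();   // audio cue
            if (image != null) StartCoroutine(Flash());    // visual cue
        }

        private IEnumerator Flash()
        {
            image.color = pressedColor;
            yield return new WaitForSeconds(flashDuration);
            image.color = originalColor;
        }
    }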

Design Principles

  • Design for the 3D Environment: Embrace the spatial nature of VR. The super curved display in the exhibit station wraps slightly around the user’s view, creating a panoramic effect that encourages head and body movement. The robot teach pendant panel rotates subtly to face the user’s head direction when they enter the interaction zone, ensuring optimal visibility (see the sketch after this list).

  • Prioritize Interaction: Use the right interaction method for the context. The CNC HMI favors touch input due to close proximity, while the exhibit display relies on ray-based input due to its scale. For the teach pendant, users can toggle between direct touch and ray interaction depending on their distance—both provide responsive feedback through animation and sound.

  • Optimize for Performance: Heavy use of UI elements can affect performance, especially with dynamic or animated content. In XFactory, the large assembly station display uses canvas batching and simple materials to ensure UI remains performant even when multiple elements are visible. This ensures smooth interaction even on lower-end VR headsets.

  • Accessibility and Adaptability: Every UI in XFactory can be adjusted via user preferences. For instance, the super display includes a settings panel allowing the user to scale text size or enable high-contrast mode. The teach pendant can reposition itself based on the user’s height, ensuring usability whether the user is standing, seated, or in a wheelchair.
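
The facing behavior described for the teach pendant can be sketched as follows. This is an illustrative example only: it assumes the panel is a world-space UI, that userHead references the XR camera transform, and that the canvas reads correctly when its forward axis points away from the viewer—flip the sign if your panel is oriented the other way.

    using UnityEngine;

    // Rotates a world-space UI panel to face the user's head while they are nearby.
    public class FaceUserPanel : MonoBehaviour
    {
        public Transform userHead;            // typically the XR rig's camera (head) transform
        public float activationDistance = 2f; // only track the user inside this radius
        public float turnSpeed = 2f;          // smoothing factor for the rotation

        void Update()
        {
            if (userHead == null) return;

            Vector3 toUser = userHead.position - transform.position;
            if (toUser.magnitude > activationDistance) return;

            // Keep the panel upright by ignoring the vertical component.
            toUser.y = 0f;
            if (toUser.sqrMagnitude < 0.001f) return;

            // Point the canvas's forward axis away from the user so the UI reads correctly.
            Quaternion target = Quaternion.LookRotation(-toUser);
            transform.rotation = Quaternion.Slerp(transform.rotation, target, turnSpeed * Time.deltaTime);
        }
    }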


UI as Input

In VR, UI serves as a primary means of user input, allowing users to interact with digital systems as they would with real-world objects. UI as input enables the triggering of actions such as starting a motor, toggling power, or initiating sequences. These inputs often simulate pressing physical buttons, adjusting sliders, or flipping switches in a spatial, embodied context. In engineering scenarios, UI as input can replicate real industrial control logic, where a physical interface sends signals to a machine. In VR, we map this interaction into 3D space using Unity’s XR Interaction Toolkit.

F03

UI Input Elements

UI input elements are the building blocks that allow users to interact with and control systems in VR. These components translate physical-like gestures—such as pressing, sliding, or pointing—into digital signals that drive behavior in the virtual environment.

  • Virtual Buttons and Switches: Used to trigger one-time actions or toggle machine states. Buttons typically fire events when clicked (e.g., “Start” or “Stop”), while switches (toggles) are better for persistent states like power or mode settings.

  • Toggles and Sliders: Toggles simulate on/off switches, while sliders represent adjustable parameters such as speed, volume, or temperature. These are useful for systems that require gradual or continuous input.

  • Touch and Ray Interaction: Users can interact with UI through direct touch (using hands or controllers) or remotely using a ray (laser pointer). The choice depends on the UI’s size, distance, and context of use.

  • Dropdowns and Scrollbars: Useful for selecting from a list of items or navigating through long content. In VR, these must be simplified and large enough for easy interaction.

  • Input Fields: Allow text or numerical input using a virtual keyboard. Less common in VR but can be useful for entering parameters or codes in engineering simulations.

  • Gestural Zones or Proximity Triggers: Although not standard UI elements, these can function as inputs by detecting when the user enters a zone or performs a hand gesture, triggering actions without explicit button presses.
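
A minimal sketch of how such input elements translate gestures into signals is shown below: a Toggle, a Slider, and a Button are wired to listeners that update hypothetical machine parameters. All names here are placeholders for illustration, not XFactory objects.

    using UnityEngine;
    using UnityEngine.UI;

    // Wires basic UI input elements to hypothetical machine parameters.
    public class MachineInputPanel : MonoBehaviour
    {
        public Toggle powerToggle;     // on/off switch
        public Slider speedSlider;     // continuous parameter, e.g., spindle speed
        public Button startButton;     // one-shot action

        private bool powered;
        private float spindleSpeed;

        void Start()
        {
            // Each listener converts a UI gesture into a digital signal for the system.
            if (powerToggle != null) powerToggle.onValueChanged.AddListener(on => powered = on);
            if (speedSlider != null) speedSlider.onValueChanged.AddListener(v => spindleSpeed = v);
            if (startButton != null) startButton.onClick.AddListener(() =>
            {
                if (powered) Debug.Log($"Starting cycle at speed {spindleSpeed:F0}");
            });
        }
    }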

Example: CNC Machine HMI

Let’s implement a fully interactive VR UI panel on the CNC lathe machine (CNC_Lathe) in the manufacturing station. Users will be able to POWER the machine on/off (i.e., turn the machine on/off, play/stop the machine sound, and activate/deactivate the machine lights), START it (close the doors and rotate the chuck), and STOP it (open the doors and stop the chuck). Here is how to set up the UI input and implement this behavior.

  1. Configure the Machine HMI’s Canvas:
    • In the Hierarchy, navigate to Manufacturing > CNC_Lathe > CNC_HMI. This is a 3D model representing the machine’s control interface. It does not yet include a Canvas for UI elements.
    • With CNC_HMI selected, right-click and choose XR > UI Canvas. This adds a new child Canvas set up for XR interaction.
    • Rename the new object to Canvas_HMI for clarity.
    • Use the gizmos and Rect Tool to rescale and reposition the Canvas so it fits cleanly on the front face of the CNC_HMI model. Make sure it is not embedded inside or blocked by the mesh.
    • Confirm Render Mode is set to World Space.
    • Use XR UI Input Module (remove Standalone Input Module).
    • Canvas must include Tracked Device Graphic Raycaster.

    01

  2. Create Overlay Buttons on the Canvas:
    • Under Canvas_HMI, add three UI elements: a Toggle named PowerButton and two Buttons named StartButton and StopButton. To do so, right-click on Canvas_HMI and choose UI > Toggle or UI > Button - TextMeshPro.
    • Position each UI button directly over the corresponding 3D button model in the scene.
    • Use the RectTransform to size each button so it matches the shape and boundary of the physical button model.
    • Select each button’s child Text (TMP) object and disable or delete it.

    02

    The CNC_HMI model already includes 3D models of the POWER, START, and STOP buttons. Rather than replacing these with visible Unity UI elements, we’ll overlay invisible, functional UI buttons on top of the models to make them interactive.

  3. Make Buttons and Toggle Transparent:
    • Select the button objects (StartButton and StopButton).
    • Go to the Image component in the Inspector.
    • Set the A (color alpha) to 0 (fully transparent). Alternatively, assign a transparent UI sprite as the source image.
    • Disable or delete the child Text (TMP) element if you haven’t done so already.
    • Select the toggle object (PowerButton).
    • Expand its child hierarchy.
    • Select the Background image. Set A to 0.
    • Select the Checkmark image. Set A to 0.
    • Disable or delete the Label (TMP) object if you haven’t done so already.

    03

    These steps ensure the UI components remain invisible but fully functional, enabling interaction through ray or touch without obscuring the 3D button models.

  4. Confirm Interaction Readiness:
    • Ensure the Canvas includes a Tracked Device Graphic Raycaster.
    • Ensure buttons have Interactable enabled (Toggle > Interactable and Button > Interactable).
    • Leave space between buttons to prevent accidental interaction.

    At runtime, the user will see only the original button models, but interaction will be handled via the overlaid invisible Unity UI buttons. This preserves the realism of the HMI panel while enabling XR input support.

  5. Prepare Scene Elements for Interaction:
    • Attach an AudioSource to Canvas_HMI for the machine sound (the HMI script in step 8 looks for an AudioSource on the same GameObject and adds one automatically if it is missing).
    • Locate the three machine light GameObjects (Spot Light_01, Spot Light_02, Spot Light_03).
    • Locate the chuck (the rotating part of the machine) GameObject (CNC_Chuck).
    • Locate the left and right door GameObjects (CNC_Door_L, CNC_Door_R).

    04

  6. Script the Chuck Rotation Behavior:
    • Create the following script and attach it to the CNC_Chuck GameObject to enable/disable rotation. This is a prerequisite for scripting the HMI behavior, as discussed next.
     using UnityEngine;

     // Spins the chuck around its local X axis while isRotating is true.
     public class ChuckRotator : MonoBehaviour
     {
         public float rotationSpeed = 180f;   // degrees per second
         public bool isRotating = false;      // set at runtime by the HMI controller

         void Update()
         {
             if (isRotating)
                 transform.Rotate(Vector3.right, rotationSpeed * Time.deltaTime);
         }
     }
    
  7. Configure the Script:
    • Rotation Speed: Set to 180 (or adjust for desired RPM effect).
    • Is Rotating: Leave unchecked by default. This will be controlled by the following HMI script at runtime.

    05

  8. Script the HMI Behavior:
    • Attach the following script to Canvas_HMI.
    • This script handles input events from buttons and triggers machine animation and chuck control based on user interaction.
     using UnityEngine;
     using UnityEngine.UI;
    
     public class CNCPanelController : MonoBehaviour
     {
         [Header("Machine Elements")]
         public GameObject[] spotlights;
         public GameObject chuck;
         public Animator machineAnimator; // Animator on CNC_Lathe
    
         [Header("UI Buttons")]
         public Toggle powerToggle;
         public Button startButton;
         public Button stopButton;
    
         [Header("Audio Clips")]
         public AudioClip machineRunningClip;
    
         private AudioSource audioSource;
         private bool isMachineOn = false;
         private bool isRunning = false;
    
         void Awake()
         {
             audioSource = GetComponent<AudioSource>();
             if (audioSource == null)
             {
                 audioSource = gameObject.AddComponent<AudioSource>();
             }
    
             audioSource.playOnAwake = false;
             audioSource.loop = true;
         }
    
         void Start()
         {
             if (powerToggle != null) powerToggle.onValueChanged.AddListener(OnPowerToggle);
             if (startButton != null) startButton.onClick.AddListener(OnStart);
             if (stopButton != null) stopButton.onClick.AddListener(OnStop);
         }
    
         void OnPowerToggle(bool isOn)
         {
             isMachineOn = isOn;
    
             if (isOn && machineRunningClip != null)
             {
                 audioSource.clip = machineRunningClip;
                 audioSource.Play();
             }
             else
             {
                 audioSource.Stop();
             }
    
             foreach (var light in spotlights)
                 if (light != null) light.SetActive(isOn);
    
             if (!isOn)
             {
                 isRunning = false;
                 SetChuckRotation(false);
                 // Doors are not affected by power toggle
             }
         }
    
         void OnStart()
         {
             if (!isMachineOn) return;
    
             TriggerDoorAnimation("Close_Doors");
             SetChuckRotation(true);
             isRunning = true;
         }
    
         void OnStop()
         {
             if (!isMachineOn) return;
    
             SetChuckRotation(false);
             TriggerDoorAnimation("Open_Doors");
             isRunning = false;
         }
    
         void SetChuckRotation(bool rotate)
         {
             if (chuck == null) return;
    
             var rotator = chuck.GetComponent<ChuckRotator>();
             if (rotator != null)
                 rotator.isRotating = rotate;
         }
    
         void TriggerDoorAnimation(string triggerName)
         {
             if (machineAnimator != null)
                 machineAnimator.SetTrigger(triggerName);
         }
     }
    
  9. Configure the Script:
    • Machine Elements:
      • Spotlights: Add all 3 spotlight GameObjects to the array.
      • Chuck: Drag the CNC_Chuck GameObject (must have ChuckRotator.cs script).
      • Machine Animator: Drag the CNC_Lathe GameObject that has the Animator with the Open_Doors and Close_Doors triggers.
    • UI Buttons:
      • Power Toggle: Drag the Toggle UI element for the Power button.
      • Start Button: Drag the Button UI element for Start.
      • Stop Button: Drag the Button UI element for Stop.
    • Audio Clips:
      • Machine Running Clip: Drag the looping audio file to play while the machine is powered on.

    06
  10. Play and Test in VR:
    • Enter Play Mode in Unity with your VR headset connected.
    • Point at the invisible UI areas over the HMI buttons and use your VR controller or hand tracking to interact.
    • Toggle Power and watch the spotlights turn on/off and machine sound start/stop.
    • Press Start to see the chuck begin rotating and doors close automatically.
    • Press Stop to halt rotation and open the doors.
    • Check that all actions feel natural in VR, with no accidental button presses or alignment issues.
  11. Optional Extensions:
    • You can extend this system with UI as output by adding status text (e.g., “Machine Running” or “Doors Closed”), visual button feedback (e.g., color change or glow on press), or haptic or audio cues to confirm each action. A minimal status-text sketch follows this list.
    • To further improve realism, consider animating the door movement instead of snapping the doors between positions, and fading the spotlights in and out smoothly rather than toggling them instantly.
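
As a sketch of the status-text extension, the hypothetical component below exposes a method that CNCPanelController could call from OnPowerToggle(), OnStart(), and OnStop()—for example ShowStatus("Machine Running", Color.green) after a successful start. The class and field names are illustrative, and the TMP text element it references would need to be added to Canvas_HMI.

    using TMPro;
    using UnityEngine;

    // Optional extension (illustrative): a status readout that the HMI controller
    // can update whenever the machine state changes.
    public class MachineStatusText : MonoBehaviour
    {
        public TextMeshProUGUI statusLabel;   // assign a TMP text element on Canvas_HMI

        public void ShowStatus(string message, Color color)
        {
            if (statusLabel == null) return;
            statusLabel.text = message;       // e.g., "Machine Running" or "Doors Closed"
            statusLabel.color = color;        // e.g., green for running, red for stopped
        }
    }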

This entire system demonstrates UI as input—the user presses buttons that cause state changes in the virtual environment, such as turning the machine on/off, starting/stopping the chuck, or opening/closing doors.


UI as Output

In VR, UI elements serve as critical output channels, displaying information from the environment back to the user. These outputs help users understand system status, receive confirmation of actions, monitor sensor readings, or respond to changing conditions. UI as output mimics real-world feedback systems, like indicator lights, gauges, or control displays, that reflect the internal state of a machine or process. In VR, this concept is extended through dynamic text updates, color changes, flashing indicators, and sound or haptic responses that mirror real-world cues. In Unity, these outputs are typically driven through scripts that update UI elements based on runtime events or data.
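
A minimal, event-driven sketch of this pattern is shown below: the display subscribes to a C# event and updates its text only when the monitored value changes, rather than polling every frame. The event and class names are hypothetical and only illustrate the output-update approach.

    using System;
    using TMPro;
    using UnityEngine;

    // Event-driven output: the display updates only when the monitored value changes,
    // rather than polling it every frame.
    public class TemperatureReadout : MonoBehaviour
    {
        public TextMeshProUGUI readout;                         // the display text to drive

        public static event Action<float> TemperatureChanged;   // raised by the simulation

        void OnEnable()  { TemperatureChanged += UpdateReadout; }
        void OnDisable() { TemperatureChanged -= UpdateReadout; }

        private void UpdateReadout(float celsius)
        {
            if (readout != null)
                readout.text = $"{celsius:F1} °C";
        }

        // Example: simulation code calls this when the machine's state changes.
        public static void ReportTemperature(float celsius) => TemperatureChanged?.Invoke(celsius);
    }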

F04

UI Output Elements

UI output components visually communicate real-time or event-based data from the virtual environment. These elements often work in tandem with system states or user input to present contextual feedback.

  • Dynamic Text Fields: Display numerical or string values (e.g., machine status, object weight, task completion messages) updated via scripts. These fields can reflect constantly changing variables like temperature, load, or runtime conditions in real-world equipment simulations.

  • Image Components and Icons: Change sprite, color, or visibility based on system state (e.g., warnings, indicators, error lights). They serve as intuitive visual cues, replacing or enhancing physical LEDs or indicator panels in industrial contexts.

  • Animations or Flashing Effects: Draw user attention to critical changes, such as alerts or unsafe conditions. Flashing, pulsing, or movement effects increase visibility and urgency, especially in safety-critical applications.

  • Audio or Haptic Feedback: While technically sensory, these often accompany UI output to reinforce state transitions or warnings. Sounds like beeps or buzzers and controller vibrations strengthen immersion and ensure important events are noticed even if not directly seen.

  • UI Color States: Color changes (e.g., green = safe, red = warning) are widely used in engineering contexts to indicate output status. These visual conventions provide instant recognition and align with industry-standard signage for hazards and system conditions.

  • Graphs, Gauges, and Meters: More complex UIs may visualize data trends or magnitudes, though simplified for VR usability. These elements help users monitor performance or diagnostics over time, useful in training dashboards or factory analytics displays.
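
As a small illustration of the gauge idea above, the sketch below drives a Filled Image (Image Type set to Filled in the Inspector) from a normalized value and shifts its color toward a warning state near the limit. The names are hypothetical.

    using UnityEngine;
    using UnityEngine.UI;

    // Simple gauge: drives a Filled Image's fill amount and color from a 0-to-max value.
    public class SimpleGauge : MonoBehaviour
    {
        public Image gaugeFill;            // an Image with Image Type set to Filled
        public float maxValue = 100f;
        public Color safeColor = Color.green;
        public Color warnColor = Color.red;

        public void SetValue(float value)
        {
            if (gaugeFill == null) return;
            float t = Mathf.Clamp01(value / maxValue);
            gaugeFill.fillAmount = t;                               // how full the gauge appears
            gaugeFill.color = Color.Lerp(safeColor, warnColor, t);  // shift toward warning near the limit
        }
    }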

Example: Industrial Scale HMI

Let’s create a fully functional HMI display for the Industrial_Scale_01a located in the logistics station. We will simulate a real industrial weighing scale that reads the weight (i.e., Rigidbody.mass) of any GameObject placed on it and outputs that weight to a display. If the object exceeds a configurable maximum weight threshold, the display turns red and begins flashing to indicate overload. Here is how to implement this behavior.

  1. Identify and Prepare UI Elements:
    • In the Hierarchy, locate Industrial_Scale_01a > HMI.
    • Confirm the HMI contains a Display Background (an Image component for visual styling) and a Display Text (a TextMeshProUGUI component used to show the numeric weight).

    07

    These elements are pre-designed but currently static. We will drive them through a new script that monitors object placement and updates the text dynamically.

  2. Add a Trigger Zone for Weighing:
    • In the Hierarchy, right-click on Industrial_Scale_01a and select Create Empty. Rename the new GameObject to Weighing Platform.
    • Set its Position to 0, 0, 0, then adjust its local position (via the Transform) so it sits slightly above the actual platform surface, right where the user would place a box or object for weighing.
    • With Weighing Platform selected, go to Add Component and add a Box Collider.
    • In the Box Collider component, check the Is Trigger box.
    • Use the Box Collider’s Center and Size values to define a tight bounding box that covers only the weighable surface area of the scale.

    08

    This method avoids modifying the original Mesh Collider used for physical collisions, while still enabling trigger-based logic for detecting when objects are placed on the scale.

  3. Create and Configure the Scale Display Script:
    • Create a script called ScaleDisplay.cs and attach it to the Weighing Platform GameObject.
     using UnityEngine;
     using TMPro;
     using UnityEngine.UI;
    
     public class ScaleDisplay : MonoBehaviour
     {
         [Header("Display Components")]
         public TextMeshProUGUI displayText;
         public Image displayBackground;   // not used by this script yet; see the optional background-animation extension
    
         [Header("Weight Settings")]
         public float weightThreshold = 50f; // Max allowed weight in kg
         private Rigidbody currentObject;
         private float currentWeight;
    
         [Header("Flash Settings")]
         public Color normalColor = Color.green;
         public Color warningColor = Color.red;
         public float flashInterval = 0.5f;
    
         private bool isOverloaded = false;
         private float flashTimer = 0f;
         private bool flashState = false;
    
         void Update()
         {
             if (currentObject != null)
             {
                 currentWeight = currentObject.mass;
                 displayText.text = $"{currentWeight:F1} kg";
    
                 if (currentWeight > weightThreshold)
                 {
                     if (!isOverloaded)
                     {
                         isOverloaded = true;
                         flashTimer = 0f;
                         flashState = true;
                     }
    
                     HandleFlashing();
                 }
                 else
                 {
                     isOverloaded = false;
                     ResetDisplayColor();
                 }
             }
             else
             {
                 currentWeight = 0f;
                 displayText.text = "---";
                 ResetDisplayColor();
             }
         }
    
         private void HandleFlashing()
         {
             flashTimer += Time.deltaTime;
    
             if (flashTimer >= flashInterval)
             {
                 flashTimer = 0f;
                 flashState = !flashState;
    
                 displayText.color = flashState ? warningColor : normalColor;
             }
         }
    
         private void ResetDisplayColor()
         {
             displayText.color = normalColor;
         }
    
         void OnTriggerEnter(Collider other)
         {
             var rb = other.attachedRigidbody;
             if (rb != null)
             {
                 currentObject = rb;
             }
         }
    
         void OnTriggerExit(Collider other)
         {
             if (currentObject != null && other.attachedRigidbody == currentObject)
             {
                 currentObject = null;
             }
         }
     }
    
  4. Configure the Script:
    • Drag the Display Text object from Industrial_Scale_01a > HMI > Display Text into the Display Text field.
    • Drag the Display Background object from Industrial_Scale_01a > HMI > Display Background into the Display Background field.
    • Set Weight Threshold to 50 kg (or another appropriate limit).
    • Set Normal Color to green (e.g., #00FF00).
    • Set Warning Color to red (e.g., #FF0000).
    • Set Flash Interval to 0.5 seconds for a visible flashing rhythm.

    09

    This setup links the virtual scale’s display to the underlying system logic so that UI elements are updated automatically in real time based on physical object interaction.

  5. Test the Output Behavior:
    • Place several objects with different Rigidbody.mass values in the scene.
    • During play mode, place an object onto the scale surface.
    • If the object has a mass below the threshold, the display shows the weight in green.
    • If the object exceeds the threshold, the weight text begins flashing red to signal overload.
    • Remove the object to reset the display, which returns to showing --- and green text.
    • This is a pure UI as output example. The user does not directly manipulate the interface—the UI passively displays dynamic system feedback, visually communicating the environment’s state.
  6. Optional Extensions:
    • Audio Alert: Add an AudioSource to the scale and play an alert sound when overload is detected. You can trigger this in the OnTriggerEnter or within the HandleFlashing() logic.
    • Animated Background: Animate the Display Background to pulse or fade during overload using a coroutine or material animation.
    • Unit Toggle (kg/lb): Extend the display to allow unit switching by integrating a small UI as Input toggle nearby.
    • Weight History Log: Add a small scrollable log UI that records the last few weight readings for QA or training purposes.
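
As a sketch of the unit-toggle extension, the hypothetical helper below formats a weight in kilograms or pounds depending on a nearby Toggle; integrating it would mean replacing the formatting line in ScaleDisplay.Update() with a call to FormatWeight(currentWeight) on a serialized reference to this component.

    using UnityEngine;
    using UnityEngine.UI;

    // Unit-toggle extension (illustrative): formats a weight in kg or lb based on a Toggle.
    public class WeightUnitToggle : MonoBehaviour
    {
        public Toggle unitToggle;                 // off = kg, on = lb
        private const float KgToLb = 2.20462f;

        public string FormatWeight(float kilograms)
        {
            bool useLb = unitToggle != null && unitToggle.isOn;
            return useLb ? $"{kilograms * KgToLb:F1} lb" : $"{kilograms:F1} kg";
        }
    }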

These enhancements further bridge UI output with additional sensory modalities (audio, motion) and can introduce secondary UI inputs to control display behavior, creating a more complete I/O interaction model.


Best Practices

Designing effective VR user interfaces requires attention to clarity, spatial usability, and immersive interaction. Below are key best practices—grounded in both design theory and real-world implementation using Unity’s XR Interaction Toolkit.

  • Clarity & Simplicity: Use large, legible fonts like TextMeshPro with strong contrast (e.g., white on dark panels), and reduce UI clutter by showing only contextually relevant information. High-contrast color schemes and minimal visible elements help focus user attention. Clear and simple UI reduces cognitive load, speeds up decision-making, and prevents user error—especially in time-sensitive or technical tasks. In the CNC HMI panel, use bold text to indicate spindle state, and hide G-code settings when the spindle is running.

  • Spatial Layout Consistency: Arrange UI elements in realistic, logical groupings that mirror real-world controls, with consistent padding, spacing, and alignment across panels. Use left-aligned or center-aligned layouts consistently within the same interaction context. Consistent layouts make interfaces more predictable and easier to learn, which is crucial for users switching between different stations or tools in a complex VR environment. In XFactory, all large wall displays follow a consistent top-to-bottom flow: title → message → action buttons → secondary links.

  • Performance Optimization: Use lightweight materials, throttle frequent updates (e.g., limit chart refreshes to once per second), and avoid excessive animations or high-resolution textures. Rely on event-based scripting instead of polling to keep frame rates above 90 FPS. Optimized UIs ensure smooth, responsive interaction—especially important in VR, where dropped frames or stutters can cause discomfort, motion sickness, or loss of immersion. On the curved analytics dashboard in the exhibit station, dynamic charts should update at a throttled rate (e.g., every 1 second) rather than every frame.

  • Robust Interaction Feedback: Provide immediate visual (e.g., button glow), audio (click sounds), and haptic (controller vibration) feedback on interactions. Make this feedback consistent across all UI to build trust and improve usability. Immediate feedback reassures users that their actions have registered and helps confirm successful input, reducing frustration and accidental repetitions. Teach pendant buttons highlight and play a subtle tone when pressed, confirming joint jog commands.

  • Cross-Device Compatibility: Since different headsets vary in field of view, resolution, and input style, use Unity’s XR Device Simulator and test on multiple devices (e.g., Quest, Vive, Rift). Provide adjustable UI scaling and repositioning for user comfort and accessibility. Designing for device variability ensures a consistent and inclusive experience across different hardware, which is especially important for courses, deployments, or teams using mixed equipment. In the factory-wide reset display, allow users to reposition or scale the panel using VR controllers to accommodate height and reach differences.

  • Design for Ergonomics: Position UI panels in a comfortable reach zone (typically 0.5m–2m away and 15° below eye level), and use angled or floating panels for wall-mounted interfaces. Buttons should be at least ~2cm in world units for reliable interaction. Ergonomic design prevents fatigue, improves precision, and allows users to maintain a natural posture—essential for longer sessions or physically demanding tasks. The robot teach pendant is angled slightly toward the user and positioned at waist height to reduce arm fatigue.

  • Error Recovery: Include undo, cancel, and reset options for actions that change system state, and confirm critical tasks with dialog prompts. Allow users to safely back out of flows or retry without restarting the scene. Supporting graceful recovery from mistakes builds user confidence and prevents irreversible errors—key for high-stakes or training-based environments like virtual factories. In the manufacturing station, the CNC HMI includes a “Reset System” button that first opens a confirmation panel with Confirm and Cancel options. If Confirm is pressed, a second dialog asks the user to validate the current machine state (e.g., “Is the spindle stopped?”) before finalizing the reset. This prevents unsafe interruptions and gives users a chance to back out safely. If no response is given within 10 seconds, the UI defaults to canceling the operation.
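
A minimal sketch of the confirmation-with-timeout pattern described above is shown below. It assumes a dialog panel with Confirm and Cancel buttons and should live on an always-active GameObject so the timeout coroutine keeps running; all names are illustrative rather than part of the XFactory scene.

    using System.Collections;
    using UnityEngine;
    using UnityEngine.UI;

    // Error-recovery sketch: a confirmation dialog that defaults to cancel after a timeout.
    public class ConfirmResetDialog : MonoBehaviour
    {
        public GameObject confirmationPanel;   // the dialog with Confirm / Cancel buttons
        public Button confirmButton;
        public Button cancelButton;
        public float timeoutSeconds = 10f;     // no response within this window cancels the action

        private Coroutine timeoutRoutine;

        public void Open()                     // call this from the "Reset System" button
        {
            confirmationPanel.SetActive(true);
            confirmButton.onClick.AddListener(OnConfirm);
            cancelButton.onClick.AddListener(Close);
            timeoutRoutine = StartCoroutine(AutoCancel());
        }

        private IEnumerator AutoCancel()
        {
            yield return new WaitForSeconds(timeoutSeconds);
            Close();                           // user did not respond: back out safely
        }

        private void OnConfirm()
        {
            Close();
            Debug.Log("Reset confirmed");      // trigger the actual reset logic here
        }

        private void Close()
        {
            if (timeoutRoutine != null) StopCoroutine(timeoutRoutine);
            confirmButton.onClick.RemoveListener(OnConfirm);
            cancelButton.onClick.RemoveListener(Close);
            confirmationPanel.SetActive(false);
        }
    }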


Key Takeaways

Effective VR user interfaces bridge the gap between the user and immersive environments by serving as both input mechanisms—enabling precise control through buttons, sliders, toggles, and ray-based interactions—and output channels that provide timely feedback via visual, audio, or haptic cues. In engineering contexts like the XFactory environment, spatial UIs replicate real-world industrial controls while leveraging VR’s 3D capabilities for intuitive placement, ergonomic design, and rich feedback loops. Well-designed VR UIs balance clarity, consistency, and performance with accessibility and adaptability, ensuring they remain comfortable, responsive, and trustworthy across diverse hardware setups. By integrating robust feedback, maintaining immersive presence, and following ergonomic and usability best practices, these interfaces support confident decision-making, reduce user error, and create seamless, high-fidelity interaction experiences.