C5. Namespaces
Learning Outcomes
- Explain the purpose of namespaces in C# and their role in Unity projects. Before class, review how namespaces organize scripts and prevent naming conflicts, using the provided examples as a reference.
- Recognize core UnityEngine classes and lifecycle methods. As preparation, study the roles of `MonoBehaviour`, `GameObject`, and `Component`, and note how lifecycle methods like `Start()`, `Update()`, `Awake()`, and `FixedUpdate()` are applied in scripts.
- Identify audio-related classes within the UnityEngine.Audio namespace. Ahead of the session, explore `AudioSource` and `AudioClip` to understand their purpose in Unity’s audio system.
- Explore UnityEngine.UI components and hierarchy. For your pre-work, create a Canvas with `Text` and `Button` elements in Unity and examine their properties in the Inspector.
- Describe scene management capabilities in Unity. Prior to class, review how `SceneManager` methods and events such as `LoadScene()`, `UnloadSceneAsync()`, and `sceneLoaded` control scene transitions.
- Summarize the purpose of XR-related namespaces in Unity. In preparation, study the role of `UnityEngine.XR`, `XR.Interaction.Toolkit`, and `ARFoundation` in XR and AR workflows.
Namespaces
Namespaces are a mechanism for organizing code into logical groups, much like folders in an operating system. They serve to encapsulate related type definitions (such as classes, structs, interfaces, etc.) into distinct groups that work together. In Unity XR development—especially for engineering applications such as training simulations, digital twins, or system diagnostics—projects quickly grow in complexity. Namespaces help manage that complexity by keeping your code organized, modular, and conflict-free. Namespaces are not just a programming convention—they reflect the same mindset engineers apply when designing real systems: modular architecture, clear boundaries, and reuse of standard components.
Why Namespaces Matter
- Organization and Readability: In XR projects, your Unity application may include multiple systems such as device tracking, UI panels, scene management, object interaction, and feedback mechanisms. Namespaces help group these elements into logical units. Just as engineers organize CAD assemblies or modular control systems, namespaces keep your software structured, modular, and easier to navigate.
- Avoiding Naming Conflicts: Common class names like `Controller`, `Manager`, or `Sensor` can exist in multiple areas of an XR project—for UI handling, input processing, scene control, or object interaction. Without namespaces, these duplicate names lead to conflicts. Namespaces isolate them in their respective contexts, reducing errors and improving code clarity.
- Reusability and Encapsulation: Namespaces enable you to develop reusable components, such as a hand interaction system, UI panels, or scene transition logic, and apply them across different scenes or even projects. This encapsulation allows XR projects to be built from well-defined modules, minimizing redundancy and improving maintainability.
- Team Collaboration: In team-based XR development, namespaces provide clear boundaries between systems. This helps avoid accidental overlap between features, simplifies onboarding for new developers, and supports parallel workstreams. Just like standardized documentation and naming conventions in engineering, namespaces promote clarity and coordination in collaborative codebases. A minimal sketch of declaring and using namespaces follows this list.
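To make this concrete, here is a minimal sketch of how a project might declare and consume its own namespaces. The `XFactory.Inspection` and `XFactory.Audio` names (and the classes inside them) are purely illustrative, not part of any Unity API:

```csharp
using UnityEngine;
using XFactory.Audio;            // bring another project module into scope

namespace XFactory.Inspection    // group all inspection-related scripts
{
    public class PartInspector : MonoBehaviour
    {
        // AlarmPlayer lives in XFactory.Audio; the using directive above
        // lets us refer to it without the fully qualified name.
        public AlarmPlayer alarm;

        public void FlagDefect()
        {
            Debug.Log("Defect flagged by " + nameof(PartInspector));
            if (alarm != null) alarm.PlayWarning();
        }
    }
}

namespace XFactory.Audio         // a separate module with its own classes
{
    public class AlarmPlayer : MonoBehaviour
    {
        public void PlayWarning()
        {
            Debug.Log("Warning tone played.");
        }
    }
}
```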
UnityEngine Namespaces
Namespaces in Unity serve as organizational units that group related classes, functions, and types, helping you structure your code in a modular and manageable way. They allow you to reference specific functionality, such as rendering, physics, or input, without conflicts between similarly named classes. The list below summarizes the most important namespaces used in Unity scripting along with brief definitions (a short disambiguation example follows the list):

- `UnityEngine`: Core Unity engine classes including `MonoBehaviour`, `GameObject`, `Transform`, and other fundamental types.
- `UnityEngine.Audio`: Provides classes for audio playback and control, including `AudioSource` and `AudioMixer`.
- `UnityEngine.UI`: Provides access to Unity’s UI elements, such as Buttons, Text, and other UI components.
- `UnityEngine.SceneManagement`: Offers classes and methods for loading, unloading, and managing scenes.
- `UnityEngine.EventSystems`: Contains interfaces and classes for handling UI events and input (e.g., pointer events).
- `UnityEngine.InputSystem`: The modern, flexible input system that supports multiple devices and action mapping.
- `UnityEngine.XR`: Provides basic classes for XR device tracking and input for VR/AR projects. It includes `UnityEngine.XR.Interaction.Toolkit`, which offers advanced components and tools for handling interactions in XR applications using the XR Interaction Toolkit.
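One practical consequence of working across these namespaces is that the same short type name can exist in more than one of them, and you must disambiguate. A classic case is `Random`, which exists both as `UnityEngine.Random` and `System.Random`. The sketch below is illustrative (the `SpawnHelper` class name is made up):

```csharp
using System;          // brings System.Random into scope
using UnityEngine;     // brings UnityEngine.Random into scope

public class SpawnHelper : MonoBehaviour
{
    void Start()
    {
        // "Random" alone would be ambiguous here, so qualify the one you mean
        float unityValue = UnityEngine.Random.value;    // float in [0, 1]
        var systemRng = new System.Random();
        int systemValue = systemRng.Next(0, 100);       // int in [0, 99]

        Debug.Log("Unity random: " + unityValue + ", System random: " + systemValue);
    }
}
```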
UnityEngine
The `UnityEngine` namespace is the cornerstone of Unity development. It provides the essential classes and methods that power the Unity engine, enabling you to create, control, and manipulate game objects and behaviors. Among its fundamental types are `MonoBehaviour`, `GameObject`, and `Component`, whose key methods are covered below.
MonoBehaviour Methods
These methods form the backbone of scripting in Unity. They allow you to control the lifecycle of your scripts and react to events during gameplay. Examples of `MonoBehaviour` methods include (a minimal lifecycle sketch follows the list):

- `Start()`: Called before the first frame update—ideal for initialization (e.g., robot parameters or sensor states).
- `Update()`: Called once per frame—used for frame-based logic (e.g., quadruped or robot arm positioning).
- `Awake()`: Called when the script instance is being loaded—often used for registering components.
- `FixedUpdate()`: Called on a fixed time interval—perfect for physics calculations (e.g., forklift or drone movement).
- `OnCollisionEnter()`: Called when this `Collider`/`Rigidbody` has begun touching another `Rigidbody`/`Collider`—useful for identifying when a box hits a rack or a robot encounters an obstacle.
`MonoBehaviour` has been extensively discussed in C4.
GameObject Methods
`GameObject` is the primary building block of a Unity scene. It represents objects in your game world and can have multiple components attached to it. From instantiating and destroying objects to adding components and sending messages, these methods form the backbone of game object management in Unity. They are generally classified into `GameObject` methods (below) and `Component` methods (covered later):

- `Instantiate()`: Creates a new instance of a `GameObject` or prefab at runtime. For example, `Instantiate(enginePart, new Vector3(2, 0, 0), Quaternion.identity);` spawns a new engine part at the given position.
- `Destroy()`: Removes a `GameObject` or component from the scene, either immediately or after a specified delay. For example, `Destroy(defectivePart, 2f);` deletes the object after 2 seconds.
- `Find()`: Searches the current scene for a `GameObject` by name and returns the first match found. For example, `GameObject hmi = GameObject.Find("HMI_Display");` locates a control panel named HMI_Display.
- `AddComponent<T>()`: Dynamically adds a component of type `T` to the `GameObject`. For example, `gameObject.AddComponent<Rigidbody>();` adds physics behavior to the object at runtime.
- `SetActive()`: Activates or deactivates a `GameObject`, controlling whether it is visible and interactive in the scene. For example, `uiPanel.SetActive(false);` hides a UI element.
- `CompareTag()`: Checks whether the `GameObject`’s tag matches a specified string for condition-based logic. For example, `if (gameObject.CompareTag("Tool"))` filters actions to tool-related objects.
Example: Let’s use `GameObject` methods to simulate interaction with tools on the drawer next to the CNC machine in XFactory’s manufacturing station. Pressing the “New Part” button on the phone spawns a randomly selected prefab (either a raw material or a finished part), attaches physics, and updates the UI with its tag. Pressing “Pass” or “Fail” removes the spawned part. Pressing the on/off button on the digital caliper toggles its display on or off. Create an empty GameObject named `DrawerManager` under the drawer GameObject and attach the following `InspectionDrawer.cs` script to it.
using UnityEngine;
using TMPro;
public class InspectionDrawer : MonoBehaviour
{
public GameObject rawMaterialPrefab; // Prefab for raw material
public GameObject finishedPartPrefab; // Prefab for finished part
public TextMeshProUGUI partTypeText; // UI text to show part tag
private GameObject currentPart;
// Called when "New Part" button is pressed on the phone
public void SpawnNewPart()
{
// Randomly pick a prefab to spawn
GameObject prefabToSpawn =
Random.value > 0.5f ? rawMaterialPrefab : finishedPartPrefab;
// Instantiate the part at a fixed position with random Y rotation
currentPart = Instantiate(
prefabToSpawn,
new Vector3(-13.38f, 1.12f, -13.86f),
Quaternion.Euler(0, Random.Range(0, 360), 0)
);
// Add Rigidbody for physics
currentPart.AddComponent<Rigidbody>();
// Use tag to update the part type text
if (currentPart.CompareTag("Raw Material"))
{
partTypeText.text =
"This is a Raw Material";
}
else if (currentPart.CompareTag("Finished Part"))
{
partTypeText.text =
"This is a Finished Part";
}
}
// Called when "Pass" or "Fail" button is pressed
public void RemoveCurrentPart()
{
if (currentPart != null)
{
// Remove the part from the drawer
Destroy(currentPart);
}
}
// Called when caliper on/off button is pressed
public void ToggleCaliperDisplay()
{
// Find the display under Caliper_Digital_Bottom
GameObject display = GameObject.Find(
"Caliper_Digital_Bottom/Display"
);
// Toggle its active state
if (display != null)
{
display.SetActive(!display.activeSelf);
}
}
}
- Create and Attach the Script:
  - In the `Hierarchy`, right-click and select `Create Empty`. Rename it to `DrawerManager`.
  - In the `Inspector`, click `Add Component` and attach the `InspectionDrawer.cs` script.
- Assign Prefabs to the Script:
  - Select `DrawerManager` in the `Hierarchy`.
  - Drag and drop the `CNC Mill Stock` prefab from the `Project` window (`Assets > XFactory > Prefabs > Production Equipment > CNC Parts`) into the `Raw Material Prefab` field. Make sure the prefab’s `Tag` is set to `Raw Material`.
  - Drag and drop the `CNC Mill Part` prefab from the `Project` window (`Assets > XFactory > Prefabs > Production Equipment > CNC Parts`) into the `Finished Part Prefab` field. Make sure the prefab’s `Tag` is set to `Finished Part`.
- Assign the UI Text for Part Type:
  - Locate the `TextMeshProUGUI` text object under `Phone/Display/Part Type`.
  - Drag this object into the `Part Type Text` field of the `InspectionDrawer.cs` script on `DrawerManager`.
- Connect Phone Button Events:
  - Select the `New Part` button in the `Hierarchy`.
  - In the `Button (Script)` component, click the `+` in the `On Click()` section.
  - Drag `DrawerManager` into the object field and select `InspectionDrawer -> SpawnNewPart()` from the dropdown.
  - Repeat this for the `Pass` and `Fail` buttons, assigning them to `RemoveCurrentPart()`.
- Connect the Caliper On/Off Button:
  - Select the `On/Off` button under the caliper.
  - In the `On Click()` section, add a new event.
  - Drag `DrawerManager` into the object field and assign it to `InspectionDrawer -> ToggleCaliperDisplay()`.
- Test the Scene:
  - Enter `Play` mode.
  - Press `New Part` to spawn a random part in the drawer.
  - Press `Pass` or `Fail` to remove the part.
  - Press the `On/Off` button on the caliper to toggle the digital display. You may have noticed that the on/off button turns off the caliper’s display but doesn’t turn it back on. Can you explain the bug? (One possible fix is sketched after this list.)
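If you want to check your reasoning about the bug: `GameObject.Find()` only returns active objects, so once the display has been deactivated the script can no longer locate it to turn it back on. One common fix is to hold a direct reference instead of searching by name. The sketch below assumes a hypothetical `CaliperDisplayToggle` component whose `caliperDisplay` field is assigned in the Inspector:

```csharp
using UnityEngine;

public class CaliperDisplayToggle : MonoBehaviour
{
    // Assign the Display child of Caliper_Digital_Bottom in the Inspector
    // so it never has to be found by name at runtime.
    [SerializeField] private GameObject caliperDisplay;

    // Called when the caliper on/off button is pressed
    public void ToggleCaliperDisplay()
    {
        if (caliperDisplay != null)
        {
            // Works even while the display is inactive, because we hold
            // a direct reference rather than calling GameObject.Find()
            caliperDisplay.SetActive(!caliperDisplay.activeSelf);
        }
    }
}
```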
Component Methods
Components add functionality to GameObjects and are a core part of Unity’s component-based architecture. Whether you are retrieving a single component, finding one among children or parents, or invoking methods across multiple components, these methods enable a modular and flexible design approach. Key methods to work with components include:

- `GetComponent<T>()`: Retrieves a component of type `T` that is attached to the same GameObject. For example, `Rigidbody rb = currentPart.GetComponent<Rigidbody>();` accesses the physics component on the newly spawned part in the example above.
- `GetComponentInChildren<T>()`: Searches for a component of type `T` on the GameObject or any of its child objects. For example, `TextMeshProUGUI displayText = phoneDisplay.GetComponentInChildren<TextMeshProUGUI>();` finds the part type text under the phone’s display.
- `GetComponents<T>()`: Retrieves all components of type `T` attached to the GameObject, returning an array. For example, `AudioSource[] alerts = phone.GetComponents<AudioSource>();` collects all sound sources from the phone device for control or diagnostics.
- `GetComponentInParent<T>()`: Searches the GameObject’s parent hierarchy for a component of type `T`. For example, `InspectionDrawer manager = GetComponentInParent<InspectionDrawer>();` accesses the logic controller from a UI button nested deep in the phone’s hierarchy.
Example: Let’s verify key components attached to drawer-related objects in the XFactory scene, such as parts, the phone, and UI elements. The following script checks for a `Rigidbody`, reads text from the phone’s display, counts `AudioSource` components, and locates the main drawer manager script.
using UnityEngine;
using TMPro;
public class ComponentChecker : MonoBehaviour
{
void Start()
{
// Get Rigidbody on the current part
Rigidbody rb = GetComponent<Rigidbody>();
if (rb != null)
{
Debug.Log("Rigidbody found on part.");
}
// Find the TextMeshPro component in the phone display
TextMeshProUGUI displayText = GetComponentInChildren<TextMeshProUGUI>();
if (displayText != null)
{
Debug.Log("Found text component: " + displayText.text);
}
// Get all audio sources on the phone
AudioSource[] audioSources = GetComponents<AudioSource>();
Debug.Log("Number of audio sources: " + audioSources.Length);
// Find the InspectionDrawer script in the parent
InspectionDrawer manager = GetComponentInParent<InspectionDrawer>();
if (manager != null)
{
Debug.Log("Connected to InspectionDrawer script.");
}
}
}
- Attach the Script:
  - Create or select a GameObject involved in the drawer interaction (e.g., `Phone`).
  - In the `Inspector`, click `Add Component` and attach the `ComponentChecker.cs` script.
- Run and Observe:
  - Enter `Play` mode in Unity.
  - Watch the `Console` for log messages confirming which components were found and accessed correctly.
UnityEngine.Audio
The `UnityEngine.Audio` namespace is designed to manage audio features within Unity. It provides classes and methods to control playback, mixing, and overall audio management in your projects.
AudioSource
The `AudioSource` class is a component that plays back audio clips in your scene. It controls audio playback through various methods, enabling you to start, pause, and stop sounds. This component is essential for integrating sound effects, music, and voice-overs into your game or simulation. Important `AudioSource` methods include:
- `Play()`: Initiates playback of the assigned `AudioResource`. When called, it starts playing the audio from the beginning (or resumes if the audio was previously paused) on the `AudioSource` component. Use `Play()` when you want to start sound effects or background music immediately, such as when a game begins or when an event occurs.
- `Pause()`: Temporarily halts playback of the assigned `AudioResource` while retaining the current position in the audio clip, so that playback can resume from the same point. Use `Pause()` when you want to temporarily interrupt the audio—for example, when pausing the game—so that it can be resumed later without restarting the clip.
- `Stop()`: Completely halts audio playback and resets the playback position to the beginning of the `AudioResource`. This is useful when you want to end the sound entirely or prepare the `AudioSource` for a new clip. Use `Stop()` when you want to immediately cease audio playback and ensure that it restarts from the beginning the next time you call `Play()`.
Example: The script below controls the CNC machine using the ON and OFF buttons on the `HMI` mounted to the control panel (`CNC_Control_Panel`). When the ON button is pressed, the `AudioSource` on the `CNC_Mill_Set` GameObject plays the machine sound, simulating the machine being turned on. Pressing OFF stops the machine (i.e., the sound).
using UnityEngine;
public class CNCControlPanel : MonoBehaviour
{
// Assign the AudioSource from CNC_Mill_Set in Inspector
public AudioSource cncAudio;
// Called when "On" button is pressed
public void TurnOnCNC()
{
if (cncAudio != null &&
!cncAudio.isPlaying)
{
cncAudio.Play();
Debug.Log("CNC machine started.");
}
}
// Called when "Off" button is pressed
public void TurnOffCNC()
{
if (cncAudio != null &&
cncAudio.isPlaying)
{
cncAudio.Stop();
Debug.Log("CNC machine stopped.");
}
}
}
- Attach the Script:
  - Select the `CNC_Control_Panel` GameObject in the hierarchy.
  - Click `Add Component` and attach the `CNCControlPanel` script.
- Assign the `AudioSource`:
  - In the Inspector for `CNC_Control_Panel`, locate the `cncAudio` field in the script.
  - Drag the `CNC_Mill_Set` GameObject (which contains the `AudioSource`) into this field.
- Wire Up the Buttons:
  - In the `Hierarchy`, expand `CNC_Mill_Set > CNC_Body > CNC_Control_Panel > HMI` (Canvas).
  - Select the `ON` button.
  - In the Inspector, under the `Button (Script)` component, find the `OnClick()` event list.
  - Click the `+` button to add a new event.
  - Drag the `CNC_Control_Panel` into the object field.
  - From the dropdown, choose `CNCControlPanel → TurnOnCNC()`.
  - Repeat the same steps for the `OFF` button, but assign `TurnOffCNC()` instead.
- Test the Setup:
  - Enter `Play` mode.
  - Click the ON button to hear the machine startup sound.
  - Click the OFF button to stop the sound.
AudioResource
`AudioResource` (which generalizes the `AudioClip` used in older versions of Unity) is used to store imported audio data, such as machine sounds, operational alerts, ambient noise, or voice instructions. As of Unity 6, `AudioResource` better reflects its role as a managed and reusable audio asset across different parts of a project. In the CNC machine control panel example above, an `AudioResource` is assigned to an `AudioSource` attached to the `CNC_Mill_Set`. Playback is triggered by the ON/OFF buttons in the HMI, allowing the simulation to respond with realistic feedback when machinery is toggled. In XR engineering applications, `AudioResource` is often used for simulating machine sounds or alarms in training environments, giving auditory feedback for UI interactions (e.g., button clicks, confirmation tones), or adding realism to immersive simulations through ambient background sounds. Important considerations:

- Storage: Holds waveform data (e.g., WAV, MP3, Ogg) used at runtime. The asset is preloaded or streamed depending on performance needs.
- Usage: Assigned to an `AudioSource`, which is then attached to a scene object. Playback is controlled by scripting, animation events, or user interaction (e.g., XR controller input or UI triggers).
- Configuration: Playback options such as volume, pitch, spatialization, and looping are configured via the `AudioSource`. Unity also supports 3D audio spatialization for immersive sound positioning in XR.

By separating sound assets (`AudioResource`) from playback logic (`AudioSource`), Unity enables a flexible and modular approach to integrating audio into interactive and reactive XR systems.
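As a minimal sketch of this separation, the clip holds the audio data while the `AudioSource` controls playback. The `MachineAlarm` script and `alarmClip` field below are illustrative; the sketch uses the long-standing `AudioClip` field for broad compatibility (newer Unity versions also let you assign an `AudioResource` directly on the `AudioSource`):

```csharp
using UnityEngine;

public class MachineAlarm : MonoBehaviour
{
    // The audio asset (the data); assign an imported clip in the Inspector
    public AudioClip alarmClip;

    // The playback component (the logic); added at runtime if missing
    private AudioSource source;

    void Awake()
    {
        source = GetComponent<AudioSource>();
        if (source == null)
        {
            source = gameObject.AddComponent<AudioSource>();
        }

        source.clip = alarmClip;    // bind the asset to the player
        source.loop = true;         // playback options live on the AudioSource
        source.spatialBlend = 1f;   // fully 3D sound for immersive positioning
    }

    public void StartAlarm()
    {
        if (source.clip != null && !source.isPlaying) source.Play();
    }

    public void StopAlarm()
    {
        source.Stop();
    }
}
```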
UnityEngine.UI
The `UnityEngine.UI` namespace provides a robust framework for building and managing user interfaces in Unity. It contains components that let you design interactive and visually engaging UIs for games and applications. With these tools, you can create text displays, buttons, sliders, images, toggles, and more. This namespace is essential for any project that requires user interaction, as it simplifies the process of connecting UI elements with your game logic. Key UI components include `Text`, `Button`, `Image`, and `Toggle`.
Text
The `Text` component is used for displaying and updating text on the screen. It is a fundamental UI element for showing instructions, scores, messages, or any information that the user needs to read. Important properties include:

- `text`: Property to get or set the displayed string.
- `color`: Property to change the text color.
- `fontSize`: Property to adjust the size of the text.
Example: Using `Text` properties, let’s display the real-time weight of a box on an industrial scale. If the weight exceeds 50 KG, the text turns red to signal an overload condition.
using UnityEngine;
using TMPro;
public class ScaleWeightDisplay : MonoBehaviour
{
// Assign Box_Large_01a in Inspector
public Rigidbody box;
// Assign the Display text under HMI Canvas
public TextMeshProUGUI displayText;
public float warningThreshold = 50f;
void Update()
{
if (box != null &&
displayText != null)
{
// Use the Rigidbody's mass as the displayed weight (in KG)
float weight = box.mass;
displayText.text =
weight.ToString("00.00") + " KG";
// Change color if weight exceeds threshold
if (weight > warningThreshold)
{
displayText.color = Color.red;
}
else
{
displayText.color = Color.green;
}
}
}
}
- Attach the Script:
  - Create or select a logic controller under `Industrial_Scale_01a` (e.g., an empty GameObject named `ScaleLogicController`) or use the GameObject itself.
  - Add the `ScaleWeightDisplay` script to that object.
- Assign References in the `Inspector`:
  - Drag the `Box_Large_01a` GameObject (with a `Rigidbody`) into the `box` field.
  - Drag the `TextMeshProUGUI` `Display Text` object (under the `HMI` Canvas of the scale) into the `displayText` field.
  - (Optional) Set the `warningThreshold` field in the Inspector to define the weight (in KG) that triggers a red warning color (default is 50 KG).
- Test the Setup:
  - Enter `Play` mode.
  - In the `Inspector`, modify the `mass` value of the box’s `Rigidbody`.
  - Observe the `Display` text updating in real time and changing color to red when the weight exceeds the threshold (e.g., 50 KG).
Button
The `Button` component is used for handling user clicks. It allows you to assign functions to be called when the button is pressed. The most important event related to this component is `onClick`, to which you can add listeners using `AddListener()`.
Example: Let’s use Unity’s `Button.onClick.AddListener()` to handle user input via the HMI panel on the CNC machine, so that the ON and OFF buttons on the `HMI` log messages when pressed. This mirrors the earlier example where these buttons were wired to play and stop the CNC machine’s sound via an `AudioSource` on the `CNC_Mill_Set`. Attach the following script to the `HMI` GameObject in the `Hierarchy`. Clicking the ON or OFF button on the CNC machine’s HMI will trigger a debug message in the Console confirming the button press.
using UnityEngine;
using UnityEngine.UI;
public class CNCButtonController : MonoBehaviour
{
// Assign in Inspector (HMI On button)
public Button onButton;
// Assign in Inspector (HMI Off button)
public Button offButton;
void Start()
{
// Register click events for both buttons
onButton.onClick.AddListener(
HandleCNCOn
);
offButton.onClick.AddListener(
HandleCNCOff
);
}
void HandleCNCOn()
{
Debug.Log("CNC ON button pressed.");
// Add logic to start CNC here
}
void HandleCNCOff()
{
Debug.Log("CNC OFF button pressed.");
// Add logic to stop CNC here
}
}
Image
The `Image` component is used for displaying images (sprites) in the UI. It can be used for icons, backgrounds, or any other visual element. Important image properties include:

- `sprite`: Property to get or set the current sprite.
- `color`: Property to modify the tint of the image.
Example: This example demonstrates how to create a simple slideshow of engine diagrams on the `Interactive Engine Diagram` UI element in the `Display GT` monitor at the assembly station. It uses the `Image.sprite` property to cycle through a list of three sprites and the `Image.color` property to ensure the image is fully visible. A UI button with a `>` icon is placed below the image, and each click updates the displayed diagram, mimicking an interactive digital manual or display panel.
using UnityEngine;
using UnityEngine.UI;
public class EngineDiagramSlideshow : MonoBehaviour
{
// The single Image UI element
public Image diagramImage;
// Array of slides as sprites
public Sprite[] diagramSprites;
private int currentIndex = 0;
void Start()
{
ShowSlide(currentIndex);
}
public void ShowNextSlide()
{
currentIndex = (currentIndex + 1) %
diagramSprites.Length;
ShowSlide(currentIndex);
}
private void ShowSlide(int index)
{
if (diagramSprites.Length == 0 ||
diagramImage == null)
{
return;
}
diagramImage.sprite = diagramSprites[index];
// Ensure the image is fully visible
// (in case it was tinted or transparent)
diagramImage.color = Color.white;
}
}
- Attach the Script:
  - Add the `EngineDiagramSlideshow` script to the `Canvas` GameObject under `Display GT`.
  - Drag the `Interactive Engine Diagram` UI Image into the `diagramImage` field.
  - Add the three diagram slides (as `Sprite` assets) to the `diagramSprites` array in the `Inspector`.
- Set Up the Button:
  - Place a UI Button (with a `>` icon) under the image. Name it `Slideshow` or something similar.
  - In the button’s `OnClick()` list, add the Canvas or GameObject with the script.
  - Choose `EngineDiagramSlideshow → ShowNextSlide()`.
- Preview the Result:
  - Enter `Play` mode.
  - Click the button to cycle through the slideshow sprites.
Toggle
The `Toggle` component is a checkbox-like element that allows users to enable or disable options. It is useful for settings and options menus. Important properties include:

- `isOn`: Property indicating whether the toggle is on (`true`) or off (`false`).
- `onValueChanged`: An event that gets triggered when the toggle state changes.
Example: This example demonstrates how to use the `Toggle` component’s `isOn` property and `onValueChanged` event to control the visibility of an instruction panel (`Instruction Scrollbar`) on the `DisplayGT` monitor in the assembly station. When the user interacts with the `Instructions Toggle`, the script shows or hides the scrollable panel and logs a message to the `Console` indicating the current state.
using UnityEngine;
using UnityEngine.UI;
public class InstructionToggleController : MonoBehaviour
{
// Assign the "Instructions Toggle"
public Toggle instructionsToggle;
// Assign the "Instruction Scrollbar"
public GameObject instructionPanel;
void Start()
{
// Set initial visibility based on the toggle state
instructionPanel.SetActive(
instructionsToggle.isOn
);
// Add listener to handle toggle interaction
instructionsToggle.onValueChanged
.AddListener(delegate
{
instructionPanel.SetActive(
instructionsToggle.isOn
);
// Log a message indicating current state
if (instructionsToggle.isOn)
{
Debug.Log("Instructions panel shown.");
}
else
{
Debug.Log("Instructions panel hidden.");
}
});
}
}
- Attach the Script:
  - Add the `InstructionToggleController` script to the `Canvas` GameObject under `DisplayGT` (or any suitable UI controller object in that hierarchy).
  - Drag the `Instructions Toggle` (the UI Toggle element) into the `instructionsToggle` field.
  - Drag the `Instruction Scrollbar` (the scrollable text panel GameObject) into the `instructionPanel` field.
- Configure Initial Visibility:
  - Ensure the `Instruction Scrollbar` is active or inactive in the scene based on your preferred default state.
  - The script will automatically sync its visibility with the toggle’s `isOn` state at runtime.
- Test the Toggle:
  - Enter `Play` mode in Unity.
  - Click the toggle to show or hide the instruction panel.
  - Open the `Console` to see log messages confirming whether the instructions are shown or hidden.
UnityEngine.SceneManagement
The `UnityEngine.SceneManagement` namespace is essential for managing scenes in Unity. It provides developers with the tools needed to load, unload, and transition between different scenes. Scenes represent different levels, game states, or environments. Using these methods allows you to build fluid game experiences, such as level transitions, dynamic loading of game content, and asynchronous scene management for smoother performance. Key `SceneManager` methods and events include (a short sketch follows the list):

- `LoadScene()`: The `SceneManager.LoadScene()` method loads a new scene by name or index. This is commonly used to switch between levels or restart a game.
- `UnloadSceneAsync()`: The `SceneManager.UnloadSceneAsync()` method unloads a scene asynchronously, freeing up resources. This is useful when you need to remove a scene without causing a frame rate drop or interruption.
- `GetActiveScene()`: The `SceneManager.GetActiveScene()` method returns the currently active scene. This is useful for obtaining scene-specific information, such as the scene name or build index.
- `sceneLoaded` (event): The `SceneManager.sceneLoaded` event is triggered once a scene has finished loading. It allows you to perform additional actions immediately after a scene transition, such as initializing game objects or setting up UI elements.
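The two members not exercised by the door example below, `GetActiveScene()` and the `sceneLoaded` event, can be tried with a small sketch like this (the `SceneWatcher` name is just for illustration):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

public class SceneWatcher : MonoBehaviour
{
    void OnEnable()
    {
        // Subscribe so we are notified whenever any scene finishes loading
        SceneManager.sceneLoaded += OnSceneLoaded;
    }

    void OnDisable()
    {
        // Always unsubscribe to avoid dangling callbacks
        SceneManager.sceneLoaded -= OnSceneLoaded;
    }

    void Start()
    {
        // Report which scene is currently active
        Scene active = SceneManager.GetActiveScene();
        Debug.Log("Active scene: " + active.name + " (build index " + active.buildIndex + ")");
    }

    void OnSceneLoaded(Scene scene, LoadSceneMode mode)
    {
        Debug.Log("Scene loaded: " + scene.name + " in mode " + mode);
    }
}
```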
Example: Let’s trigger both loading and unloading of the XFactory yard scene (exterior) by opening and closing a door in the XFactory using keyboard input. Pressing the `O` key gradually rotates the door open on the `Y`-axis from `0°` to `-120°`. As soon as the door’s `Y` rotation dips below `0°`, the `Factory Yard Scene` is loaded additively using `SceneManager.LoadScene()`. Pressing the `C` key closes the door by rotating it back to `0°`, and once fully closed, the scene is unloaded using `SceneManager.UnloadSceneAsync()`. This simulates spatially-aware scene streaming, improving performance and immersion in large environments.
using UnityEngine;
using UnityEngine.SceneManagement;
public class FactoryYardLoader : MonoBehaviour
{
public string exteriorSceneName =
"Factory Yard Scene";
public float rotationSpeed = 60f;
private Quaternion openRotation;
private Quaternion closedRotation;
private bool opening = false;
private bool closing = false;
private bool sceneLoaded = false;
void Start()
{
closedRotation = Quaternion.Euler(
0f, 0f, 0f
);
openRotation = Quaternion.Euler(
0f, -120f, 0f
);
}
void Update()
{
if (Input.GetKeyDown(KeyCode.O))
opening = true;
if (Input.GetKeyDown(KeyCode.C))
closing = true;
if (opening)
{
transform.rotation = Quaternion.RotateTowards(
transform.rotation,
openRotation,
rotationSpeed * Time.deltaTime
);
float yAngle = NormalizeAngle(
transform.eulerAngles.y
);
if (!sceneLoaded && yAngle < 0f)
{
SceneManager.LoadScene(
exteriorSceneName,
LoadSceneMode.Additive
);
sceneLoaded = true;
Debug.Log("Factory Yard Scene loaded.");
}
if (Quaternion.Angle(
transform.rotation,
openRotation
) < 0.1f)
{
opening = false;
}
}
if (closing)
{
transform.rotation = Quaternion.RotateTowards(
transform.rotation,
closedRotation,
rotationSpeed * Time.deltaTime
);
if (Quaternion.Angle(
transform.rotation,
closedRotation
) < 0.1f)
{
closing = false;
// Unload the scene when the door fully closes
if (sceneLoaded &&
SceneManager.GetSceneByName(
exteriorSceneName
).isLoaded)
{
SceneManager.UnloadSceneAsync(
exteriorSceneName
);
sceneLoaded = false;
Debug.Log("Factory Yard Scene unloaded.");
}
}
}
}
float NormalizeAngle(float angle)
{
return (angle > 180f) ?
angle - 360f :
angle;
}
}
- Attach the Script:
  - Add the `FactoryYardLoader` script to the door GameObject (e.g., `Door_01`).
  - Ensure the door GameObject has a `Collider` component (it does not need to be a trigger).
- Configure Scene Reference:
  - In the script’s `exteriorSceneName` field, type the exact name of your scene asset (e.g., `Factory Yard Scene`).
  - Go to `File > Build Profiles` and open your active build profile.
  - In the profile’s `Scene List`, click `Add Open Scene` after additively opening the `Factory Yard Scene` in the Editor (right-click > `Open Scene Additive`).
- Testing in Play Mode:
  - Enter `Play` mode in the Unity Editor.
  - Press `O` to open the door. Once its `Y` rotation passes below `0°`, the `Factory Yard Scene` loads additively.
  - Press `C` to close the door. Once it fully returns to `0°`, the `Factory Yard Scene` is unloaded automatically.
UnityEngine.XR
Finally, let’s review the foundational namespaces in Unity that power XR development. The `UnityEngine.XR` namespace is Unity’s base-level XR API, giving you direct access to low-level XR device data, including tracking information, device presence, and positional data for VR/AR hardware. Even if you later use higher-level systems like the XR Interaction Toolkit, understanding how Unity tracks hardware through this namespace helps you troubleshoot, extend, or fine-tune your XR experiences. It is especially useful in cases where raw tracking data is needed—like monitoring head position in safety-critical areas. Key components include (a short tracking sketch follows the list):

- `XRDevice`: Provides basic information about the connected XR hardware (e.g., whether a headset is present). It is useful for checking hardware availability at runtime.
- `InputTracking`: Allows access to the position and rotation of XR input sources like the headset or controllers. You can track specific nodes (e.g., head, hands) in real time.
- `XRNode`: An enumeration that defines which part of the XR device you’re tracking (`Head`, `LeftHand`, `RightHand`, etc.).
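A minimal sketch of reading raw tracking data is shown below. The `HeadPositionMonitor` name is illustrative, and the sketch uses `InputDevices` (also in `UnityEngine.XR`) as a modern alternative to the legacy `InputTracking` calls; exact APIs vary slightly between Unity versions:

```csharp
using UnityEngine;
using UnityEngine.XR;

public class HeadPositionMonitor : MonoBehaviour
{
    void Update()
    {
        // Get the device currently mapped to the head node (the HMD)
        InputDevice headset = InputDevices.GetDeviceAtXRNode(XRNode.Head);

        if (!headset.isValid)
        {
            // No headset detected this frame
            return;
        }

        // Try to read the head position feature from the device
        if (headset.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 headPosition))
        {
            Debug.Log("Head position: " + headPosition);
        }
    }
}
```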
UnityEngine.XR.Management
This namespace extends `UnityEngine.XR` by managing the lifecycle and configuration of XR subsystems through the XR Plugin Management system. It allows Unity to interface with different XR platforms such as OpenXR, Oculus, or Windows MR through a standardized plugin framework. Without this namespace, Unity wouldn’t know which XR runtime to launch or how to configure it. You will work with it when setting up your project to use OpenXR for both VR and AR platforms. It is essential for building cross-platform apps that automatically load the right XR environment at runtime. Key components include (a short initialization sketch follows the list):

- `XRGeneralSettings`: Stores global XR configuration and determines how and when XR initializes during app startup.
- `XRManagerSettings`: Controls startup/shutdown of XR subsystems (e.g., input, rendering, tracking).
- `XRLoader`: Each XR plugin provides a loader that handles initializing that specific platform’s features.
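For cases where automatic XR initialization is disabled, a manual startup sequence can look roughly like the sketch below. It assumes the XR Plugin Management package is installed; the method names come from its public API, but check your package version:

```csharp
using UnityEngine;
using UnityEngine.XR.Management;

public class ManualXRStarter : MonoBehaviour
{
    void Start()
    {
        XRManagerSettings manager = XRGeneralSettings.Instance.Manager;

        if (manager.activeLoader == null)
        {
            // Pick and initialize the first available XR loader (e.g., OpenXR)
            manager.InitializeLoaderSync();
        }

        if (manager.activeLoader != null)
        {
            // Start the XR subsystems (rendering, input, tracking)
            manager.StartSubsystems();
            Debug.Log("XR started with loader: " + manager.activeLoader.name);
        }
        else
        {
            Debug.LogWarning("No XR loader could be initialized.");
        }
    }

    void OnDestroy()
    {
        XRManagerSettings manager = XRGeneralSettings.Instance.Manager;
        if (manager != null && manager.activeLoader != null)
        {
            manager.StopSubsystems();
            manager.DeinitializeLoader();
        }
    }
}
```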
UnityEngine.XR.Interaction.Toolkit
The XR Interaction Toolkit builds on `UnityEngine.XR` and `XR.Management` to offer a high-level, component-based framework for common VR interactions. It removes the need to code most input logic from scratch and offers prefab-ready systems for locomotion, grabbing, UI, and physics-based interaction. This namespace will be your primary toolset during the VR development module. It drastically reduces development time by abstracting away controller input logic and interaction mechanics, allowing you to focus on designing interactive scenes. It’s also extensible, so you can build custom behaviors atop a robust base. Key components include (a short event-handling sketch follows the list):

- `XRController`: Maps input from XR devices such as hand controllers.
- `XRGrabInteractable`: Enables objects to be picked up, moved, or manipulated using XR input.
- `XRRayInteractor`: Provides ray-based interaction, like a laser pointer, for UI or distant objects.
- `XRInteractorLineVisual`: Visually represents the ray or interaction line in VR scenes.
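Most of this toolkit is configured in the Inspector rather than in code, but interactables also expose events you can hook from scripts. The sketch below assumes the XR Interaction Toolkit package is installed and a hypothetical `GrabLogger` is placed on an object that already has an `XRGrabInteractable`; event names follow XRI 2.x and may differ in other versions:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

[RequireComponent(typeof(XRGrabInteractable))]
public class GrabLogger : MonoBehaviour
{
    private XRGrabInteractable grabInteractable;

    void Awake()
    {
        grabInteractable = GetComponent<XRGrabInteractable>();

        // Log whenever the object is grabbed or released
        grabInteractable.selectEntered.AddListener(OnGrabbed);
        grabInteractable.selectExited.AddListener(OnReleased);
    }

    void OnDestroy()
    {
        grabInteractable.selectEntered.RemoveListener(OnGrabbed);
        grabInteractable.selectExited.RemoveListener(OnReleased);
    }

    private void OnGrabbed(SelectEnterEventArgs args)
    {
        Debug.Log(name + " grabbed.");
    }

    private void OnReleased(SelectExitEventArgs args)
    {
        Debug.Log(name + " released.");
    }
}
```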
UnityEngine.XR.ARFoundation
AR Foundation extends Unity’s XR support to AR, abstracting the differences between ARKit (iOS), ARCore (Android), and platforms like Magic Leap. It builds on `UnityEngine.XR` by introducing AR-specific features like plane detection, light estimation, and anchors. This namespace will power your AR development module, enabling real-world spatial understanding and overlaying virtual content. It allows you to build cross-device AR experiences without writing platform-specific code, making your AR apps portable and future-ready. Key components include (a short plane-tracking sketch follows the list):

- `ARSession`: Manages the AR lifecycle, including reset and pause/resume behavior.
- `ARPlaneManager`: Detects and tracks real-world flat surfaces using device sensors and camera input.
- `ARCameraManager`: Accesses and controls camera features such as light estimation, focus modes, and exposure.
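As a sketch of how these managers are typically consumed from code, the example below assumes the AR Foundation package is installed and an `ARPlaneManager` exists in the scene; the `planesChanged` event shown here is from AR Foundation 4/5 and has been renamed in newer releases:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class PlaneCounter : MonoBehaviour
{
    // Assign the ARPlaneManager from the XR Origin / AR Session Origin
    public ARPlaneManager planeManager;

    void OnEnable()
    {
        planeManager.planesChanged += OnPlanesChanged;
    }

    void OnDisable()
    {
        planeManager.planesChanged -= OnPlanesChanged;
    }

    private void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // Report how many planes were added or removed this update
        Debug.Log("Planes added: " + args.added.Count +
                  ", removed: " + args.removed.Count);
    }
}
```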
UnityEngine.XR.OpenXR
This namespace represents Unity’s implementation of the OpenXR standard, which aims to unify XR development across devices and vendors. It acts as the foundation behind XR Plugin Management when the OpenXR loader is selected. OpenXR allows you to write one XR application and deploy it across multiple platforms. It’s the most scalable way to support both Meta Quest (VR) and Magic Leap 2 or mobile (AR) with the same codebase. Understanding how OpenXR fits into Unity’s XR stack ensures you are ready to build long-lasting, portable, and maintainable XR apps. Key points to consider:
- OpenXR is a vendor-neutral standard supported by Meta, HTC, Microsoft, Magic Leap, and others.
- Most functionality is configured through the Unity Editor and XR Plugin Management, not through scripting in this namespace.
- It supports device-specific extensions for advanced features (e.g., hand tracking, eye tracking, passthrough) in a modular, future-proof way.
Key Takeaways
Namespaces in C# and Unity are essential for structuring XR projects into organized, conflict-free, and reusable modules, enabling smoother collaboration and scalability. Mastering Unity’s core namespaces—such as `UnityEngine` for fundamental classes and lifecycle methods, `UnityEngine.Audio` for sound management, `UnityEngine.UI` for interactive interfaces, `UnityEngine.SceneManagement` for scene transitions, and the XR-related namespaces for VR/AR functionality—empowers developers to build modular, maintainable, and immersive experiences. By understanding how each namespace groups related functionality, you can streamline code organization, prevent naming collisions, and design interactive systems that are both robust and adaptable across different platforms.