

Learning Outcomes

  • Explain the role of AR Foundation in cross-platform AR development. Before class, review how AR Foundation provides an abstraction layer for platforms like ARKit, ARCore, and OpenXR, and prepare to discuss deployment strategies and device compatibility.
  • Set up a Unity project with AR Foundation and a compatible provider. Ahead of the session, create a new 3D project, install AR Foundation with at least one provider plug-in, review URP configuration steps, and—if possible—deploy a basic AR scene (e.g., a cube) to a mobile device.
  • Describe the architecture of Unity’s AR Foundation. To get ready, read up on subsystems, descriptors, and managers, focusing on how they standardize AR functionality and manage real-world trackables as GameObjects.

What Is AR Foundation?

AR Foundation is a high-level framework developed by Unity that allows you to create AR experiences that run seamlessly across multiple AR platforms. Rather than working directly with platform-specific SDKs like ARKit (Apple) or ARCore (Google), AR Foundation provides a unified API and a set of manager components that handle the underlying AR functionality. These components include session tracking, plane detection, anchors, environment probes, light estimation, and more. When deployed, AR Foundation automatically connects your Unity code to the appropriate native AR SDK, abstracting away the differences between platforms while giving you access to platform-specific features when needed. AR Foundation offers several key benefits, including:

  • Multi-Platform Deployment: Write your app once in Unity and deploy it to a range of AR devices, from smartphones to advanced headsets. This shared codebase approach dramatically reduces development time and streamlines updates.

  • Simplified Development: AR Foundation unifies the AR workflow by abstracting platform differences. Developers can focus on designing compelling user experiences instead of managing low-level device-specific logic.

  • Extensibility: AR Foundation plays well with Unity’s broader ecosystem—including XR Interaction Toolkit, Shader Graph, and URP/HDRP—allowing you to extend functionality and tailor experiences to different hardware capabilities.

  • Access to Native Features: While AR Foundation simplifies development, it also allows access to underlying platform-specific features through subsystem extensions or conditional logic, giving you full control when needed.

  • OpenXR Compatibility: Support for OpenXR makes AR Foundation future-proof and aligned with industry standards for XR development, easing the integration of new devices as they emerge.

  • Toolchain Integration: Unity’s integration with popular version control systems, CI/CD pipelines, and asset management tools makes AR Foundation projects easy to collaborate on in large teams.

  • Rapid Prototyping: Unity’s real-time editor combined with AR Foundation enables fast iteration cycles, making it easier to test, debug, and refine AR interactions early in the development process.

  • Active Ecosystem & Support: Backed by Unity’s documentation, tutorials, and community support, AR Foundation benefits from continuous updates and a robust knowledge base for troubleshooting and learning.

This broad support makes AR Foundation a powerful tool for creating immersive AR experiences across both mobile and head-worn devices used in engineering, manufacturing, healthcare, education, and beyond.

Required Packages

The AR Foundation package provides the unified interface for accessing AR features: it is central to your AR project because it defines the common APIs used to interact with AR functionality. The actual implementations come from provider plug-in packages, each of which enables AR capabilities on its corresponding platform. This modular approach lets developers mix and match components based on target devices and feature requirements.

  • Apple ARKit XR Plug-in (iOS): Enables AR functionalities for iPhones and iPads. This plug-in allows developers to access features like plane detection, face tracking, object recognition, and LiDAR-based scene understanding when available on supported Apple devices.

  • Google ARCore XR Plug-in (Android): Implements AR features on Android smartphones and tablets. It supports core features like motion tracking, environmental understanding, and image tracking, providing a consistent AR experience across a wide range of Android devices.

  • Magic Leap XR Plug-in (Magic Leap 2): Enables native AR capabilities specific to Magic Leap 2 devices, including spatial mapping, hand tracking, and eye tracking. This plug-in is optimized for Magic Leap’s spatial computing platform, but can also work with Unity’s OpenXR integration to support cross-platform development.

  • OpenXR Plug-in (HoloLens 2): Provides AR support for Microsoft HoloLens 2 via the OpenXR standard. It enables access to features like hand tracking, spatial anchors, eye gaze, and scene understanding while maintaining compatibility with other OpenXR-compliant devices.

  • Apple visionOS XR Plug-in (visionOS): Supports AR on Apple’s visionOS platform, including Apple Vision Pro. This package provides access to visionOS-specific capabilities such as volumetric rendering, spatial scenes, and direct interaction using hands and eyes in Apple’s spatial computing ecosystem.

  • Unity OpenXR: Meta (Meta Quest): Dedicated to AR experiences on Meta Quest and Meta Quest Pro headsets using passthrough AR and spatial anchors. This enables developers to create mixed reality applications using color passthrough, shared spatial anchors, and Meta’s interaction profiles, especially useful for MR training and productivity tools.

  • Unity OpenXR: Android XR (Android XR): Delivers AR support on Android-based XR devices such as enterprise-focused smart glasses. Designed for hardware like Vuzix or Lenovo ThinkReality, this plug-in supports enterprise-grade use cases such as remote guidance, logistics, and manufacturing overlays.

AR Foundation does not implement AR features by itself. It requires additional provider plug-in packages. Without these platform-specific extensions, AR Foundation would not be able to leverage the native capabilities of each AR-enabled device.

When planning an AR project, it is crucial to verify feature availability and compatibility for your target platform(s) and to install the corresponding provider plug-ins. Review Unity’s AR Foundation documentation and the manufacturer’s SDKs to ensure your application behaves as expected on each supported device.
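Because support varies by device, it can also help to verify AR availability at runtime before starting a session. Below is a minimal sketch using AR Foundation’s ARSession.CheckAvailability(); the class name is illustrative.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Attach to any GameObject in the startup scene.
public class ARAvailabilityCheck : MonoBehaviour
{
    IEnumerator Start()
    {
        // Ask the platform whether AR is supported (some devices require an async query).
        if (ARSession.state == ARSessionState.None ||
            ARSession.state == ARSessionState.CheckingAvailability)
        {
            yield return ARSession.CheckAvailability();
        }

        if (ARSession.state == ARSessionState.Unsupported)
            Debug.Log("AR is not supported on this device; fall back to a non-AR mode.");
        else
            Debug.Log($"AR available. Session state: {ARSession.state}");
    }
}
```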

Key Features

AR Foundation provides several key features that together form a robust framework, supporting everything from simple AR demonstrations to complex, interactive, multi-user experiences. These features are the essential building blocks of compelling AR applications.

  • Session: Manages the AR experience by enabling, disabling, and configuring AR sessions on the device. This feature ensures that the AR experience is initialized correctly and can be dynamically managed during runtime.
  • Device Tracking: Tracks the position and orientation of the device within the physical space, which is crucial for aligning virtual elements with the real world.
  • Camera: Renders images from the device’s camera while providing light estimation to integrate virtual content with real-world lighting conditions, helping virtual objects to appear natural within the scene.
  • Plane Detection: Identifies flat surfaces such as floors, tables, or walls in the environment, enabling precise placement of virtual content on detected surfaces.
  • Bounding Box Detection: Detects and tracks bounding boxes around 3D objects, which facilitates interactive object manipulation.
  • Image Tracking: Recognizes and tracks 2D images in the real world, allowing the app to trigger AR content or interactive experiences based on visual cues.
  • Object Tracking: Recognizes and tracks physical 3D objects, creating opportunities for interactive experiences rooted in object recognition.
  • Face Tracking: Detects and monitors human faces for expressions and positioning, enabling the creation of personalized, context-aware AR experiences.
  • Body Tracking: Captures and follows the pose and movement of a human body, allowing for full-body interactive engagements in AR.
  • Point Clouds: Detects sets of feature points in the environment that help map spatial structures and improve the accuracy of AR scene reconstruction.
  • Ray Casts: Casts virtual rays to interact with detected surfaces and objects, supporting intuitive user interactions such as selecting or placing virtual items.
  • Anchors: Attaches virtual objects to fixed points in the physical world to maintain consistent positioning as the user moves, ensuring a persistent AR experience.
  • Meshing: Generates a mesh representation of the physical environment, which can be used to create occlusion effects or provide a dynamic surface model for interactions.
  • Environment Probes: Captures environmental lighting and reflections by generating cubemaps, ensuring that virtual objects reflect real-world lighting accurately.
  • Occlusion: Renders virtual content behind physical objects using depth and human segmentation data, enhancing realism by properly layering virtual and real elements.
  • Participants: Supports multi-user AR experiences by tracking other devices in a shared session, enabling collaborative and social AR applications.

To add AR Foundation features to your project, select the XR Origin in your scene and use the Inspector’s Add Component button to attach the relevant AR Foundation managers, such as AR Plane Manager, AR Raycast Manager, AR Camera Manager, or AR Face Manager. These components automatically enable the corresponding features at runtime. Most features also require an AR Session GameObject in the scene to initialize the AR system. Make sure you’ve installed the required platform provider packages via the Package Manager before configuring components.
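As a quick illustration, once an AR Plane Manager is attached, a script can react to newly detected planes through the manager’s events. This is a minimal sketch assuming AR Foundation 5.x’s planesChanged event (newer versions expose an equivalent trackablesChanged event); the class name is illustrative.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Logs planes as the AR Plane Manager detects them.
public class PlaneLogger : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager;

    void OnEnable() => planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (ARPlane plane in args.added)
            Debug.Log($"Detected plane {plane.trackableId} with alignment {plane.alignment}");
    }
}
```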


Setting Up an AR Project

Now, let’s configure a Unity project for AR development using AR Foundation. This involves installing required packages and settings, setting up the Universal Render Pipeline (URP), creating an AR-ready scene, and deploying your project to an iOS or Android device. To make this concrete, let’s bring the virtual drone from XFactory into the (physical) room via AR.

Install AR Foundation

AR Foundation is available via the Unity Package Manager and provides the common interfaces to access AR features. This package allows you to develop AR apps in a standardized way across multiple devices. It acts as the core framework to which platform-specific providers are attached. To install:

  1. In the Unity Hub, create a new project using the Universal 3D template. Name it XFactoryAR and click Create project to open it.
  2. In the Unity Editor, navigate to Window > Package Manager.
  3. In the Package Manager, locate AR Foundation under Unity Registry and install it.


Installing AR Foundation is critical for building the basic AR framework and functionalities in your project.

Install Provider Plug-ins

The provider plug-ins implement AR features for specific target platforms. Each plug-in bridges AR Foundation calls to the native AR SDK of the device, ensuring compatibility and optimal performance, and is essential for unlocking full AR functionality on its platform. Some plug-ins require extra project configuration; aligning project settings with each provider’s requirements helps AR features work seamlessly across devices and prevents runtime issues. To install:

  1. Navigate to Edit > Project Settings > XR Plug-in Management.
  2. If prompted, click Install XR Plug-in Management.
  3. Once installed, choose your target platform tab. If iOS, check ARKit. If Android, check ARCore.
  4. Unity will now automatically install the required provider packages: ARKit XR Plugin for iOS or ARCore XR Plugin for Android.


Set Up an AR Scene

Correctly setting up your AR scene ensures that virtual objects align with the physical world, that tracking works reliably, and that AR content is rendered from the device’s perspective.

  1. Start with a Clean Scene:
    • Delete the default Main Camera. This is replaced by the camera inside the XR Origin prefab.
    • Keep or customize the default Directional Light. Delete it only if you plan to use environmental lighting or AR light estimation.
    • Delete the default Global Volume (if present), unless you are intentionally using post-processing effects compatible with URP and AR. These can often interfere or add unnecessary complexity in early AR prototyping.
    • Save the scene as AR101.


  2. Add AR Session:
    • In the Hierarchy, right-click and go to XR > AR Session. This creates the AR Session object, which initializes and manages the AR system. It is essential for tracking and accessing features like plane detection and light estimation using various managers (components).


  3. Add XR Origin (Mobile AR):
    • Right-click again and select XR > XR Origin (Mobile AR). This adds the XR Origin prefab, which includes an AR-compatible camera and the proper spatial hierarchy to reflect the user’s device in the virtual space.
    • Confirm the XR Origin structure. It should contain Camera Offset > Main Camera (tagged as MainCamera).


    This Main Camera under XR Origin will automatically follow the device’s position and orientation in the real world.

  4. Configure AR Camera Settings:
    • Select the Main Camera inside the XR Origin.
    • Locate the AR Camera Manager component.
    • Enable Auto Focus to keep the physical camera lens focused automatically. Useful for dynamic scenes or close-up content.
    • Choose an appropriate Light Estimation setting. For example, None uses no light data from the environment, Ambient Intensity or Ambient Spherical Harmonics enable virtual objects to reflect real-world lighting conditions, and Environmental HDR (if supported) enables realistic lighting and reflections but may reduce performance on lower-end devices. A script sketch for reading light estimation data appears after this list.
    • Set Facing Direction to World (back camera) for most AR use cases. Use User (front camera) only for face-tracking or selfie-style AR experiences.
    • Set Render Mode to Before Opaques unless you have a custom render ordering need. This mode ensures the camera feed renders before your AR content.
    • Also, on the Main Camera, configure the AR Camera Background component.
    • Leave Use Custom Material unchecked for most cases. AR Foundation automatically uses a platform-appropriate shader to render the camera feed. Check this only if you’re applying a custom shader/material (e.g., to add visual effects like color tint, distortion, or depth blending).


  5. Import Custom Assets:
    • Drag and drop the XFactoryAssets.unitypackage into your Project window, or import it via Assets > Import Package > Custom Package....
    • In the import dialog, make sure all items are selected and click Import.


  6. Place the Drone into the Scene:
    • Drag the Drone prefab from Assets > XFactory > Prefabs > Drone into the Hierarchy as a top-level GameObject.
    • Select the Drone GameObject in the Scene view and position it approximately 1.5 meters in front of the camera (e.g., Position = (0, 0, 1.5)).


    Since the drone is placed in world space, it will appear in front of the user at app launch and remain fixed in place as the user moves around. This setup gives you immediate visibility for testing without the drone following the device.

  7. Locate the URP Asset:
    • Open Edit > Project Settings > Graphics.
    • Under Default Render Pipeline, click the asset selector and choose Mobile_RPAsset.


  8. Configure URP for AR:
    • Go to your Project window and locate the asset named Mobile_Renderer (usually found under Assets/Settings/).
    • Select it to open its Inspector.
    • Scroll down to Renderer Features.
    • Click Add Renderer Feature.
    • Choose AR Background Renderer Feature.


AR Foundation supports both the Built-in Render Pipeline and the Universal Render Pipeline (URP). If you’re using URP, Unity 6 already includes default render pipeline assets you can use for AR — no need to create custom ones.
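To see the light estimation data from step 4 in action, a script can subscribe to the AR Camera Manager’s frame events and apply the estimated brightness to a scene light. This is a minimal sketch assuming Light Estimation is set to Ambient Intensity; the class and field names are illustrative.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Drives a light from AR light estimation data.
public class LightEstimationReader : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager; // the Main Camera's AR Camera Manager
    [SerializeField] Light sceneLight;              // e.g., the Directional Light

    void OnEnable() => cameraManager.frameReceived += OnFrameReceived;
    void OnDisable() => cameraManager.frameReceived -= OnFrameReceived;

    void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        // averageBrightness is only populated when Ambient Intensity estimation is enabled.
        if (args.lightEstimation.averageBrightness.HasValue)
            sceneLight.intensity = args.lightEstimation.averageBrightness.Value;
    }
}
```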

Deploying to iOS

Testing your AR application on an iOS device is crucial because it offers authentic feedback on performance, user interface behavior, and compatibility with Apple’s hardware and software ecosystem. Before you proceed, download and install Xcode from the Mac App Store. Xcode is required to compile and deploy your Unity project on iOS devices, making it an essential tool for iOS development.


  1. Enable ARKit:
    • Go to Edit > Project Settings > XR Plug-in Management.
    • Select the iOS tab and ensure that ARKit is enabled. Enabling ARKit connects your project to Apple’s native AR framework, allowing you to access advanced AR functionalities on iOS devices.
  2. Configure Bundle Identifier:
    • Open Edit > Project Settings > Player.
    • Set the Company Name and Product Name. This step generates a unique Bundle Identifier, which is essential for app identification and deployment on the Apple ecosystem.
    • Ensure that the Bundle Identifier (formatted as com.CompanyName.ProductName) is unique and free of spaces or special characters. A correct identifier prevents build errors and is required for distribution on the App Store.
    • Scroll down to Camera Usage Description and enter a brief explanation. This text is shown to users the first time the app requests camera access.


  3. Configure Build Settings:
    • Navigate to File > Build Profiles.
    • Go to Scene List, click the Add Open Scenes button, and only check Scenes/AR101. Adding your current scene ensures that it is included in the build.
    • Select iOS as your target platform and click Switch Platform. Switching the platform configures your project with iOS-specific libraries and settings.
    • Click Build to create an Xcode project. This exports your Unity project into an Xcode project, which you can then compile and deploy to an iOS device.


  4. Build to Device Using Xcode:
    • Open the generated Xcode project (Unity-iPhone.xcodeproj).
    • Connect your iOS device via cable.
    • Resolve any signing issues. Enable Automatically manage signing and assign your team under Signing & Capabilities. Resolving signing issues ensures that your app is properly authenticated and trusted by the iOS system.
    • Once signing issues are resolved, select your device as the run destination and click the run button in Xcode. Running the project on your device lets you test AR functionality in real conditions.
    • Trust the developer profile on your iOS device if prompted (navigate to Settings > General > VPN & Device Management). Trusting the profile allows your device to run the app without security blocks.


  5. Test on Your iOS Device:
    • On your iOS device, enable Developer Mode if prompted (Settings > Privacy & Security > Developer Mode).
    • When launched, you will be prompted for camera access.
    • Once camera access is granted, the app will show your real-world environment as the background, with the drone hovering in front of you, anchored in world space.


Deploying to Android

Testing your AR application on an Android device is vital since it provides accurate insights into performance, adaptability, and the overall user experience on a wide range of Android hardware configurations. Before you proceed, ensure the Android Build Support module is installed with Unity. This module is necessary to compile your project into an APK file for Android devices. On your Android device, enable USB Debugging from the Developer Options menu. USB Debugging enables communication between your Android device and the Unity Editor for testing purposes.


  1. Enable ARCore:
    • Go to Edit > Project Settings > XR Plug-in Management.
    • Select the Android tab and ensure that the ARCore plug-in is enabled. Enabling ARCore connects your project to Google’s AR platform, providing access to device-specific AR features.
    • Go to Project Validation and fix all the issues.


  2. Configure Build Profiles:
    • Navigate to File > Build Profiles.
    • Add the open scene(s) with Add Open Scenes. This ensures that the scene you are working on will be part of the final APK.
    • Select Android and click Switch Platform. This step converts your project for Android, applying the necessary settings and libraries.


  3. Deploy Your App:
    • Connect your Android device via cable.
    • In the Build Profiles window, click Refresh to list your connected device.
    • Click Build and Run. Save your build as an APK file (for example, in a folder named Builds). This creates an APK, the standard format for Android applications, allowing you to install and test your AR app.
    • Once the app installs and runs on your device, grant any necessary permissions (such as camera access). Granting permissions ensures that the app can access essential hardware and run correctly.


Iterate by making small modifications (for instance, adjust the scale of a GameObject) to confirm that updated builds deploy correctly. Iterative testing is key to identifying and resolving issues quickly. Configure your Game view’s aspect ratio (for example, set it to 16:9 Portrait) within Unity to accurately simulate the mobile device experience during development. This allows you to preview and adjust how your app will appear on actual devices.


Architecture

Understanding the AR Foundation architecture is key to making informed decisions when building your AR application. Software architecture refers to the overall structure of a software system. In the context of AR Foundation, this architecture is designed to abstract platform-specific implementations into a unified interface, enabling you to develop AR apps across multiple devices with a consistent API. By understanding these elements, you can better design your application, optimize performance, and troubleshoot issues when integrating native AR capabilities.


Subsystems

Subsystems (or SubsystemWithProvider) define the core features and life cycle of AR-related functionality. They provide an abstraction layer over platform-specific SDKs such as Google ARCore or Apple ARKit. This abstraction allows you to write common application code that interacts with multiple platforms seamlessly, reducing development complexity.
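In practice you rarely create subsystems yourself; Unity’s active XR loader creates them, and you can query the loader directly when needed. A minimal sketch (the class name is illustrative):

```csharp
using UnityEngine;
using UnityEngine.XR.ARSubsystems;
using UnityEngine.XR.Management;

// Looks up the plane subsystem from the active XR loader.
public class SubsystemQuery : MonoBehaviour
{
    void Start()
    {
        var loader = XRGeneralSettings.Instance?.Manager?.activeLoader;
        var planeSubsystem = loader?.GetLoadedSubsystem<XRPlaneSubsystem>();

        if (planeSubsystem == null)
            Debug.Log("No plane subsystem loaded (provider plug-in missing or unsupported).");
        else
            Debug.Log($"Plane subsystem loaded. Running: {planeSubsystem.running}");
    }
}
```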

  • Life Cycle of Subsystems: All subsystems share the same life cycle. This predictable life cycle, managed by Unity’s active XRLoader, ensures consistency when initiating or terminating AR processes. It also helps in managing system resources effectively by monitoring the state of a subsystem; e.g., created (the subsystem is instantiated), started (it begins processing and delivers data), stopped (processing is halted), and destroyed (the subsystem is disposed).

  • Subsystem Descriptors: Each subsystem has a corresponding descriptor that outlines its capabilities and limitations (for example, whether an image tracking subsystem supports mutable libraries). Descriptors allow you to query capabilities at runtime, which is essential for creating adaptive and robust AR experiences that check for feature support on the target device (a capability check sketch follows this list).

  • Tracking Subsystems: A tracking subsystem is responsible for detecting and continuously tracking objects (known as trackables) in the physical environment. Trackables are entities such as planes, images, or points that are detected and monitored by the subsystem, and each is uniquely identified by a 128-bit GUID, ensuring consistent tracking across frames. For plane detection, your app calls methods defined by XRPlaneSubsystem to start, update, or stop detection; the underlying provider (ARCore on Android, ARKit on iOS) implements these methods, so your application code remains unchanged across platforms. This unified interface saves development time and minimizes errors since you don’t have to write separate implementations for each platform.
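As mentioned under Subsystem Descriptors, capabilities can be checked at runtime. The sketch below queries whether the image tracking provider supports mutable reference image libraries; the class name is illustrative, and the manager’s descriptor is only available once its subsystem has been created.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Queries the image tracking descriptor for mutable-library support.
public class ImageTrackingCapabilityCheck : MonoBehaviour
{
    [SerializeField] ARTrackedImageManager imageManager;

    void Start()
    {
        var descriptor = imageManager.descriptor;
        if (descriptor != null && descriptor.supportsMutableLibrary)
            Debug.Log("Reference images can be added to the library at runtime.");
        else
            Debug.Log("This provider does not support mutable image libraries.");
    }
}
```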

Managers

Managers in AR Foundation are components that make subsystem data accessible to your scene as GameObjects and components. They also handle the life cycle of their respective subsystems. Managers simplify your workflow by abstracting the low-level details of subsystem management, allowing you to focus on application logic and user interaction. To add managers to the scene, select the XR Origin and click the Add Component button in the Inspector and search for the specific manager (e.g., AR Plane Manager, AR Tracked Image Manager). This process creates the necessary components to interface with the underlying AR subsystems, enabling features like plane detection or image tracking. This is especially useful because it lets you treat real-world features (like surfaces, machines, or tracked parts) as interactive GameObjects that you can program, visualize, or annotate—without manually handling the raw tracking data.

  • Trackable Managers: Trackable managers wrap tracking subsystems, translating raw tracking data into usable components in your scene. For instance, ARPlaneManager creates and manages ARPlane components when the XRPlaneSubsystem detects a physical surface.

  • Trackable GameObjects: These are instantiated as children of the XR Origin, and their positions, rotations, and scales are defined relative to that origin. This setup provides a clear, organized way to work with dynamic AR content, ensuring that virtual elements align correctly with the physical environment.

  • Configuring a Trackable’s GameObject: When a new trackable is detected, the manager instantiates a prefab (if specified) or creates an empty GameObject with the corresponding ARTrackable component. If a prefab is set in the manager’s Inspector, it will be instantiated for each detected trackable. If no prefab is specified, an empty GameObject is created, and the appropriate ARTrackable component is added automatically. This mechanism guarantees that every detected trackable has a visual and interactive representation in the scene. It also makes it easy to customize the appearance and behavior of trackables by editing the prefab.

  • Enabling and Disabling Managers: Managers can be enabled or disabled to control which AR features are active. When a manager is enabled, it starts its subsystem; when disabled, it stops the subsystem. This flexibility is important for managing power consumption and processing resources, as you can turn off unused features to optimize performance. For example, imagine you want to place a floating drone in your AR application that hovers in mid-air and maintains its position as the user moves around. You add an AR Anchor Manager component to your XR Origin. When the user taps the screen, your app performs a raycast against the AR environment (typically planes or feature points). If a valid point is found, an ARAnchor is created at that position, and the drone prefab is instantiated as a child of that anchor. The drone prefab includes animation for spinning propellers and a smooth hover motion, giving the illusion of autonomous flight in the room. A minimal version of this tap-to-place interaction is sketched below.
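Here is a minimal sketch of that anchor workflow. It assumes the legacy Input Manager for touch input and a serialized dronePrefab field (both illustrative), and it expects an AR Raycast Manager and an AR Anchor Manager on the XR Origin.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Tap to place a drone anchored to a detected plane.
public class DronePlacer : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager;
    [SerializeField] GameObject dronePrefab;

    static readonly List<ARRaycastHit> s_Hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Ray cast from the screen point against detected planes.
        if (raycastManager.Raycast(touch.position, s_Hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = s_Hits[0].pose;

            // Create an anchor at the hit pose and parent the drone to it.
            var anchorObject = new GameObject("DroneAnchor");
            anchorObject.transform.SetPositionAndRotation(hitPose.position, hitPose.rotation);
            anchorObject.AddComponent<ARAnchor>(); // requires an ARAnchorManager in the scene

            Instantiate(dronePrefab, anchorObject.transform);
        }
    }
}
```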

Review this Unity documentation to learn more about the AR Foundation architecture, subsystems, and managers.


Key Takeaways

AR Foundation streamlines cross-platform AR development by providing a unified API that bridges Unity projects to native AR SDKs like ARKit, ARCore, and OpenXR, allowing developers to write once and deploy across a wide range of devices. Its modular provider plug-in system enables targeted feature access, while its robust set of core features—such as plane detection, image tracking, anchors, and light estimation—supports both simple prototypes and complex multi-user experiences. By understanding its architecture of subsystems, descriptors, and managers, developers can efficiently map real-world trackables to interactive GameObjects, manage resources, and adapt to device capabilities. Proper setup in Unity, including installing AR Foundation and provider packages, configuring the AR scene, and following platform-specific deployment steps, ensures reliable performance and compatibility, making AR Foundation a powerful and future-proof tool for building immersive AR applications.