B3. Physics & Animation
Learning Outcomes
- Explain the role of physics engines in Unity for XR realism. Before class, read about Unity’s physics engines (PhysX vs. Box2D) and explore how they contribute to realistic XR experiences.
- Add and configure a Rigidbody component in Unity. As preparation, open Unity, attach a `Rigidbody` to a GameObject, and test settings like `Use Gravity`, `Is Kinematic`, and `Mass` to see their effects in play mode.
- Identify and use different collider types in Unity. Ahead of the session, review collider types, attach one to a primitive shape, and toggle `Is Trigger` to compare collision detection with trigger events.
- Create and apply Physic Materials to modify object interactions. In advance, learn about properties like `Dynamic Friction` and `Bounciness`, then create a `Physic Material` and apply it to a GameObject to see how movement and surface response change.
- Set up and experiment with joints in Unity. For your prep work, review hinge and fixed joints, then connect two GameObjects with a `Hinge Joint` to observe how they behave under gravity.
- Describe the basics of animating GameObjects in Unity. Prior to class, review animation terminology and the `Animation` window to get familiar with keyframes, clips, and controllers.
- Create a simple hover animation using the Animation window. Before arriving, animate a cube's Y-position to float up and down smoothly over 5 seconds, ensuring the motion loops seamlessly.
- Design and use an Animator Controller to manage state transitions. As a pre-class exercise, create an `Animator Controller` for your animated cube and set up a parameter-driven transition (e.g., Idle → Bounce).
- Import external animations into Unity. In preparation, review the process for importing `.FBX` animations from tools like Blender or Maya, then practice importing one into Unity for use in a scene.
Physics in Unity
Unity’s built-in 3D physics system enables objects to interact realistically using physical principles such as gravity, collisions, force, motion, and constraints. These mechanics are fundamental to developing engineering simulations, digital twins, and real-time system visualizations, especially when modeling equipment behavior or human-machine interactions. Physics systems bring immersion, realism, and interactivity to XR:
- Immersion: Realistic object behavior enhances user presence in virtual environments.
- Real-Time Interaction: Physics enables objects to respond naturally to user input (e.g., throwing, grabbing, pushing).
- Human-Machine Interface Simulation: Physics allows for accurate modeling of interactions between humans and devices (e.g., levers, touch panels, virtual machines).
- Robotics & Digital Twins: Simulating mechanical systems and sensors with correct forces and constraints helps replicate real-world behavior.
- Spatial Awareness: Physics-driven collisions and movement inform spatial reasoning, which is vital in training and simulation apps.
In XFactory, the realistic movement of drones in logistics, robotic arms in assembly, or a quadruped robot navigating a tech station floor all rely on properly applied Unity physics principles.
Core Physics Features
- Character Control: Configures physics-based control systems for first-person and third-person characters, enabling realistic movement and interaction. Ideal for player-controlled avatars in action, adventure, and simulation games.
- Rigidbody Physics: Applies physics-based behavior to GameObjects using `Rigidbody` components, allowing for gravity, forces, torque, and momentum. Common in any dynamic object interaction, such as rolling balls, falling crates, or physics-based puzzles.
- Collision: Uses `Collider` components to detect and configure collisions between GameObjects, supporting both physical and trigger interactions. Essential for environment interaction, player movement boundaries, and hit detection.
- Joints: Connects GameObjects using joints to simulate physical behaviors such as pivoting, movement constraints, and mechanical linkages. Useful in building systems like swinging doors, suspension bridges, or robotic arms.
- Articulations: Sets up complex systems of rigid bodies and joints with more advanced constraints and physical accuracy, useful for robotics and machinery. Best for simulating biomechanics or robotic systems with precise motion and control requirements.
- Cloth: Simulates fabric behavior for character clothing, flags, curtains, and other dynamic textiles in real time. Adds realism to clothing and decorative elements in characters and environments.
- Multi-Scene Physics: Manages and simulates separate physics contexts across multiple scenes in a single project, useful for layered or modular level design. Enables complex simulation setups like multiplayer environments or split gameplay areas.
- Physics Profiler Module: Analyzes and monitors physics performance metrics in your application to identify and resolve bottlenecks. Critical for optimizing physics-heavy scenes and maintaining smooth frame rates on target hardware.
Unity’s Physics Engines
Unity provides two main physics engines to simulate object behaviors:
- NVIDIA PhysX: Used for 3D physics simulation (handling depth-based interactions in 3D space). PhysX powers rigidbody dynamics, collision detection, joints, and other essential features for realistic object behavior in three dimensions. It is widely used in games and simulations for its performance and accuracy.
- Box2D: Used for 2D physics (interactions in a flat plane using only the X and Y axes). Box2D provides lightweight, efficient physics tailored for side-scrollers, puzzle games, or schematic simulations where depth is not required.
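The split shows up directly in scripting: 3D queries go through the `Physics` class (PhysX), while 2D queries go through `Physics2D` (Box2D). A minimal sketch (the class name is ours, chosen for illustration):

```csharp
using UnityEngine;

// Illustrative only: the same "what is ahead of me?" query in each engine.
public class EngineComparison : MonoBehaviour
{
    void Update()
    {
        // 3D (PhysX): a ray cast through full 3D space.
        if (Physics.Raycast(transform.position, transform.forward, out RaycastHit hit3D, 10f))
            Debug.Log("3D hit: " + hit3D.collider.name);

        // 2D (Box2D): a ray cast confined to the XY plane.
        RaycastHit2D hit2D = Physics2D.Raycast(transform.position, Vector2.right, 10f);
        if (hit2D.collider != null)
            Debug.Log("2D hit: " + hit2D.collider.name);
    }
}
```

Note the API asymmetry: `Physics.Raycast` returns a `bool` with an `out` parameter, while `Physics2D.Raycast` returns a `RaycastHit2D` struct you test for a non-null collider.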
Rigidbody
A `Rigidbody` component applies Newtonian physics (laws of motion) to a GameObject, allowing it to move and rotate realistically based on physics calculations. By adjusting properties like mass, drag, and constraints, developers can fine-tune how the object behaves in different environments. The `Rigidbody` component makes a GameObject responsive to:
- Gravity: The natural force pulling objects downwards
- External and Internal Forces: A push or pull on an object (e.g., propulsion)
- Collisions: Interactions between physical objects with contact
- Joints and Mechanical Constraints: Restrictions on movement or rotation
`Rigidbody` Properties
- `Mass`: Determines how resistant the object is to forces ($F = ma$, where $m$ is mass).
- `Linear Damping`: Simulates linear air resistance or surface friction (slows linear motion). It is called `Drag` in older versions of Unity.
- `Angular Damping`: Simulates resistance to rotational motion (slows spinning). It is called `Angular Drag` in older versions of Unity.
- `Automatic Center of Mass`: If enabled, Unity automatically calculates the object's center of mass from its collider shapes. Disabling it allows you to set a custom center, which is useful for asymmetrical objects.
- `Automatic Tensor`: Controls whether Unity auto-calculates the inertia tensor (rotational mass distribution). Turn it off to manually adjust inertia for advanced simulations like flywheels or robotic arms.
- `Use Gravity`: Enables the object to fall naturally under simulated gravity.
- `Is Kinematic`: If enabled, disables physics interaction; the object must be moved via script or animation (useful for controlled robotic arms or assembly parts).
- `Interpolate`: Smooths position/rotation updates between physics steps (prevents jitter in motion rendering).
- `Collision Detection`: Determines how accurately collisions are handled (e.g., continuous detection prevents fast objects from tunneling).
- `Constraints`: Locks position or rotation on specific axes (useful for stability in industrial machines).
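The same properties can be set from a script, which is handy when spawning objects at runtime. A minimal sketch (note that `linearDamping` and `angularDamping` are the current Unity names; older versions expose them as `drag` and `angularDrag`, and the constraint chosen here is purely illustrative):

```csharp
using UnityEngine;

// Sketch: configure a crate's Rigidbody in code instead of the Inspector.
public class CrateSetup : MonoBehaviour
{
    void Start()
    {
        Rigidbody rb = GetComponent<Rigidbody>();
        rb.mass = 40f;                  // F = ma: heavier objects need more force to move
        rb.linearDamping = 0.5f;        // "Drag" in older Unity versions
        rb.angularDamping = 0.3f;       // "Angular Drag" in older versions
        rb.useGravity = true;           // fall naturally
        rb.isKinematic = false;         // respond to forces and collisions
        rb.interpolation = RigidbodyInterpolation.Interpolate;
        rb.collisionDetectionMode = CollisionDetectionMode.Discrete;
        rb.constraints = RigidbodyConstraints.FreezeRotationZ; // example axis lock
    }
}
```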
Adding a Rigidbody
Let's simulate a large shipping crate falling onto an industrial scale in the logistics station. This example demonstrates how to apply realistic physics to a GameObject so it reacts naturally to gravity, impact, and damping forces.
- Add `Rigidbody`:
  - Select the box prefab instance in the scene (`Box_Large_01a_Prefab_01`, positioned above `Industrial_Scale_01a_Prefab_01`).
  - In the `Inspector`, click `Add Component > Rigidbody` to enable physics-based motion.
  - This allows the box to fall, collide, and respond to gravity, force, and surface interaction.
- Adjust `Rigidbody` Properties:
  - `Mass = 40` to represent the realistic weight of a loaded crate.
  - `Linear Damping = 0.5` to simulate moderate air or surface resistance, slowing linear motion gradually.
  - `Angular Damping = 0.3` to slightly reduce rotational spin after impact, keeping the crate stable and upright.
  - `Interpolate = Interpolate` to smooth physics updates between frames and reduce jitter, especially when the object moves slowly or is pushed by other objects.
  - `Use Gravity = Enabled` to allow the crate to fall naturally onto the scale.
  - `Is Kinematic = Disabled` to ensure the crate responds dynamically to physics forces and collisions.
  - `Collision Detection = Discrete`, appropriate for large, slow-moving crates; use `Continuous` only if the object moves rapidly.
- Run the Scene:
  - Press `Play` to simulate the drop.
  - Watch the crate fall with realistic weight, settle naturally on the scale, and respond to physical forces in a stable and believable way, mirroring real-world logistics handling like robotic loading or warehouse stacking.
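Beyond gravity, you can also drive the crate with forces from a script. The sketch below (the `pushForce` value is a hypothetical starting point, to be tuned against the crate's mass) applies an impulse when the spacebar is pressed, so you can shove the settled crate around the scale:

```csharp
using UnityEngine;

// Sketch: push a Rigidbody with an instantaneous impulse on key press.
public class CratePush : MonoBehaviour
{
    [SerializeField] float pushForce = 200f; // hypothetical value; tune per crate mass
    Rigidbody rb;

    void Start()
    {
        rb = GetComponent<Rigidbody>();
    }

    void Update()
    {
        // ForceMode.Impulse applies the whole force in one physics step.
        if (Input.GetKeyDown(KeyCode.Space))
            rb.AddForce(transform.forward * pushForce, ForceMode.Impulse);
    }
}
```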
Colliders
A `Collider` defines an object's physical boundary for detecting and resolving collisions. It is essential for interactivity in XR simulations. Colliders don't cause motion; they only define space for collision detection (when and where objects make contact). They can be either primitive shapes (boxes, spheres, capsules) for performance or mesh-based for more accurate representations. Colliders can also be set as triggers to detect overlap events without physically blocking movement.
In XFactory, use a `Mesh Collider` on the quadruped robot for precise collision zones, or a `Box Collider` for tools or crates for simplicity and performance.
Types of Colliders
- `Box Collider`: A cuboid (rectangular prism) boundary used for boxy or regular-shaped objects. In XFactory, this is ideal for simulating crates, pallets, racks, or the forklift body, where clean, flat surfaces define the object's shape.
- `Sphere Collider`: A spherical boundary used for round or symmetrical objects. Useful in XFactory for parts like robotic ball joints, spherical sensor modules, or small scanning devices dropped from drones.
- `Capsule Collider`: A cylindrical collider with rounded ends, well suited for elongated or humanoid shapes. In XFactory, apply it to mobile drone bodies or the quadruped robot's legs to model their streamlined movement.
- `Mesh Collider`: Uses the actual mesh geometry of an object for detailed and precise collision detection. In XFactory, Mesh Colliders are used for complex machinery such as CNC machines, 3D printers, or the car body in the welding station, where accurate geometry matters for realistic interaction.
- `Terrain Collider`: A collider specialized for large, natural, or irregular ground surfaces created with Unity's Terrain system. This is useful in XFactory's exterior scene to simulate the factory yard, roads, or loading zones with varied terrain elevation.
Adding a Collider
Now, let's simulate the barcode scanner in the XFactory logistics station falling onto the table. This demonstrates how Unity's physics system handles collisions using different collider types, and how collider choice affects realism and performance.
- Use the Table as Ground for the Scanner:
  - In the `Hierarchy`, locate the table object already present in the scene (e.g., `Table_01a`).
  - Ensure the table has a `Box Collider` or `Mesh Collider` component.
  - If it's missing, add one via `Inspector > Add Component > Box Collider` or `> Mesh Collider`.
  - This acts as the flat surface the scanner will land on, simulating a physical tabletop.
- Prepare the Barcode Scanner:
  - In the `Hierarchy`, locate the barcode scanner object (e.g., `Scanner_01a_Prefab_01`), which should already be positioned above the table.
  - Add a `Rigidbody` component to the barcode scanner.
  - Set `Mass = 2` (represents a lightweight handheld scanner).
  - Set `Linear Damping = 0.1` and `Angular Damping = 0.4` (adds light air resistance and rotational stability).
  - Set `Use Gravity = Enabled` and `Interpolate = Interpolate`.
- Add a `Box Collider` to the Barcode Scanner:
  - Select the barcode scanner in the `Hierarchy`.
  - Add a `Box Collider` via `Inspector > Add Component > Box Collider`. This approach is fast and efficient, but less accurate for non-boxy shapes. It is useful for general physics approximations.
- Add a `Mesh Collider` to the Barcode Scanner:
  - Remove the `Box Collider` from the barcode scanner.
  - Add a `Mesh Collider`. In the `Inspector`, enable `Convex` to allow physics simulation with the `Rigidbody`. This approach closely matches the scanner's shape but can be more performance-intensive.
  - You can switch between the `Box Collider` and `Mesh Collider` to observe how each affects fall behavior and collision accuracy.
- Run the Scene:
  - Press `Play` to simulate.
  - Watch as the barcode scanner falls from its elevated position and collides with the tabletop.
  - Compare the results between the two collider types: the `Box Collider` is faster but may not align perfectly with the scanner's geometry, while the `Mesh Collider` gives more precise collisions and better realism but adds processing overhead, especially in large-scale scenes.
This test is useful for evaluating collider strategies when balancing performance vs. physical accuracy in real-time simulations like AR/VR training or robotics prototyping.
Trigger Zones
`Is Trigger` (a checkbox in the `Collider` component) enables an object to detect when something enters, exits, or stays within its collider without physically interacting (no collision force is applied). Use this for non-physical detection systems. For example, when a box passes into a scanner area at the logistics station, it can trigger inventory logging or sensor activation in a digital twin simulation.
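In a script, a trigger zone is handled with the `OnTriggerEnter`/`OnTriggerExit` callbacks. A minimal sketch of the scanner-area idea (the "Box" tag is an assumption; use whatever tag your crates actually carry):

```csharp
using UnityEngine;

// Sketch: attach to a GameObject whose Collider has "Is Trigger" enabled.
public class ScannerZone : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        // "Box" is a hypothetical tag for scannable crates.
        if (other.CompareTag("Box"))
            Debug.Log("Logged inventory item: " + other.name);
    }

    void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Box"))
            Debug.Log(other.name + " left the scan zone");
    }
}
```

At least one of the two objects involved (the zone or the box) must have a `Rigidbody` for trigger events to fire.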
Physic Materials
In Unity, a Physic Material allows fine-tuning how objects slide, bounce, or stick when they collide. This affects both realism and safety simulation. You can adjust properties like friction and bounciness to achieve the desired physical behavior. Physics materials are often used on colliders to create varied surface interactions, such as slippery ice or rough terrain.
In XFactory, apply a high-friction material to the robot’s end-effector to prevent tool slippage, or a bouncy material to a tire being tested in the assembly station.
`Physic Material` Properties
- `Dynamic Friction`: Friction applied when an object is already in motion, affecting how easily it slides across a surface and how much force is required to keep it moving (e.g., a forklift's tire sliding slightly on a smooth concrete floor).
- `Static Friction`: Friction that resists the start of movement when an object is at rest, determining how much force is needed to overcome initial inertia (e.g., a crate resisting movement as it is pushed).
- `Bounciness`: Controls how much energy is retained after a collision, directly affecting how high or far an object rebounds (e.g., a tire bouncing slightly during a vertical drop).
- `Friction Combine`: Defines how friction values from two colliding surfaces are combined (`Minimum`, `Maximum`, `Average`, or `Multiply`) to determine the resulting surface interaction (e.g., simulating the contact between a rubber tire and a metal loading ramp).
- `Bounce Combine`: Determines how bounciness values from two surfaces are blended during impact, which influences how much an object rebounds (e.g., a plastic tool dropped on the concrete floor).
Use these properties to fine-tune interactions in simulations where physical realism—such as sliding resistance, bounce-back, or grip—is critical.
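These values can also be assigned from code, which is useful when surface behavior must change at runtime (e.g., switching between a dry and a wet floor). A sketch assuming current Unity naming, where the class is `PhysicsMaterial` (older versions call it `PhysicMaterial` and the combine enum `PhysicMaterialCombine`); the numbers are illustrative:

```csharp
using UnityEngine;

// Sketch: build a surface material in code and assign it to this object's collider.
public class TireSurface : MonoBehaviour
{
    void Start()
    {
        var mat = new PhysicsMaterial("TireMaterial")
        {
            dynamicFriction = 0.3f,  // resistance while sliding
            staticFriction  = 0.4f,  // resistance to starting to move
            bounciness      = 0.6f,  // energy retained on impact
            bounceCombine   = PhysicsMaterialCombine.Maximum,
            frictionCombine = PhysicsMaterialCombine.Average
        };
        GetComponent<Collider>().material = mat;
    }
}
```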
Adding a Physic Material
Let's simulate the tire sitting on a wooden pallet in the assembly station rolling off and bouncing across the floor. This exercise demonstrates how to use a `Physic Material` to control surface properties like friction and bounciness, simulating realistic motion and impact behavior.
- Locate and Prepare the Tire:
  - In the `Hierarchy`, locate the tire object already positioned on a wooden pallet in the assembly station (e.g., `Tire` on `Pallet_01a`).
  - Ensure the tire has a `Rigidbody` component assigned to its parent GameObject to enable physics simulation. If missing, add it via `Inspector > Add Component > Rigidbody`. Set `Mass = 50`, `Linear Damping = 0.2`, `Angular Damping = 0.5`, `Interpolate = Interpolate`, and `Collision Detection = Discrete`.
  - Add a `Collider` to allow the tire to interact with surfaces. Use either a `Capsule Collider` (simpler and more efficient) or a `Mesh Collider` (for a detailed shape; enable `Convex`). Make sure to assign the `Mesh Collider` to the child GameObjects that contain both the `Mesh Filter` and `Mesh Renderer`, as the collider requires a mesh to function properly.
- Create a `Physic Material`:
  - In the `Project` window, navigate to `Assets > Materials > Physic Material` (or a similar path to ensure proper organization).
  - Right-click and choose `Create > Physics Material`, then name it `TireMaterial`.
  - In the `Inspector`, configure the material to define how the tire behaves when rolling and bouncing: `Dynamic Friction = 0.3` simulates rolling resistance and contact with the floor; `Static Friction = 0.4` requires some force to get the tire moving from rest; `Bounciness = 0.6` allows moderate bounce upon impact; `Bounce Combine = Maximum` prioritizes the higher bounciness value during collisions.
- Assign the `Physic Material` to the Tire:
  - Select the tire in the scene.
  - In the `Collider` component (e.g., `Mesh Collider`) of the child GameObject with the rubbery material, assign `TireMaterial` to the `Material` field by dragging it from the `Project` window.
  - This controls how the tire interacts with the floor and other surfaces as it moves and collides.
- Ensure the Floor, Walls, and Pallet Have Colliders:
  - Select the shop floor object (`Floor_Merged`) and add a `Box Collider` so it can register collisions.
  - Repeat the same for the walls (`Walls_Merged`).
  - The wooden pallet should also have a `Box Collider` to support realistic contact when the tire rolls off its edge.
- Set Up the Tire for Movement:
  - Use the `Move Tool` and `Rotate Tool` to position the tire above and near the edge of the pallet.
  - Tilt it slightly or raise one side so that it naturally begins to roll off the pallet when gravity is applied at runtime.
- Run the Simulation:
  - Press `Play` to start the scene.
  - Watch the tire roll off the pallet, bounce on the floor, and gradually come to rest based on the defined mass, damping, bounciness, and friction values.
  - Try switching between a `Box Collider` and a `Mesh Collider` on the tire to observe how collider shape influences accuracy and realism.
Joints
Joints connect `Rigidbody` objects to simulate mechanical constraints, linkages, and articulated mechanisms (multi-part moving systems like robot arms or doors). They allow for controlled relative motion between objects and can restrict movement along specific axes or apply forces to maintain alignment. Unity provides various joint types, such as `Hinge Joint`, `Fixed Joint`, and `Spring Joint`, to suit different physical behaviors and interactions.
In XFactory, use `Hinge Joint` components for the robotic arm at the welding station, or `Fixed Joint` components to attach tools to the assembly robot on a mobile base.
Types of Joints
- `Hinge Joint`: Allows rotation around one axis, making it suitable for simulating mechanical pivots and simple articulated motion. This can be used for simulating robotic gripper movements or the swinging door of a storage cabinet in the logistics area.
- `Fixed Joint`: Locks two objects together rigidly so they move as one while still reacting to external forces and collisions. This is ideal for attaching tools to the robot on the mobile base in the assembly station.
- `Spring Joint`: Connects objects with spring-like behavior, allowing for controlled elasticity and damping during movement. This can be used to simulate shock-absorbing mounts for equipment or cable tensioning in the production station.
- `Configurable Joint`: Provides detailed control over movement and rotation constraints along all axes, supporting complex mechanical interactions. This is useful for advanced simulation of a 6-DOF robotic arm in the tech station or fine-tuning a robotic gripper mechanism in the assembly area.
Simulating a Joint
Let's simulate the opening and closing motion of a hinged door. This example demonstrates how to configure a `Hinge Joint` to replicate realistic door articulation, commonly needed in physical environments with interactive mechanics. This setup is suitable for building realistic simulations of access control, safety barriers, or interactive environments.
- Use an Existing Door in the Scene:
  - In the `Hierarchy`, locate a hinged door GameObject in the XFactory scene (e.g., `Door_01` or another appropriate door object).
  - Ensure the door is a separate GameObject, parented under a static door frame or wall (e.g., `Door_01_Prefab_01`), which serves as the stable base.
- Add a `Hinge Joint` to the Door:
  - Select the door GameObject.
  - In the `Inspector`, click `Add Component > Hinge Joint`.
  - Unity will automatically add a `Rigidbody` if one is not present (required for physics joints).
  - In the `Rigidbody` component, verify that both `Is Kinematic` and `Use Gravity` are disabled (so the door reacts to physics but isn't pulled down by gravity unnecessarily).
- Configure `Hinge Joint` Settings in the `Inspector`:
  - Leave the `Connected Body` field empty. Unity will automatically connect the joint to the nearest static object (i.e., the door frame) as long as that object does not have a `Rigidbody`.
  - Set the `Anchor` to the hinge side of the door, typically near the edge where the door rotates. Use Scene view Gizmos to adjust and preview the rotation axis.
  - Set the `Axis` to define the direction of rotation. For most doors, use `(0, 1, 0)` so the door rotates around the Y-axis.
- Apply Joint `Limits`:
  - Enable `Use Limits`.
  - Expand the `Limits` section of the `Hinge Joint`.
  - Set `Min = -90` and `Max = 0` (so the door opens outward).
  - This constrains the door to swing open up to 90°, simulating the realistic physical constraints of a hinged door.
- Enable the Motor (Optional):
  - Enable `Use Motor`.
  - Set `Target Velocity = -45` (degrees per second), so the door opens counterclockwise.
  - Set `Force = 50` to define how strongly the motor pushes.
  - Set `Free Spin = false` to allow precise, controlled rotation.
- Run the Scene:
  - Press `Play` to test the simulation.
  - The door should rotate naturally within the defined limits.
  - If `Use Motor` is enabled, the door will automatically begin swinging open, simulating an automated door mechanism.
  - Make sure the `Static` checkbox at the top right of the `Inspector` is unchecked.
Joints are important for simulating real-time, physics-based interactions, allowing objects like doors to respond dynamically to forces, collisions, and user input. Unlike animations—which play predefined motions—joints enable interactive, physically accurate behavior that’s essential for realistic simulations.
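That interactivity can be scripted. For instance, toggling the hinge motor at runtime lets a user open and close the door on demand. A sketch (the key binding and numeric values are illustrative, matching the walkthrough's motor settings):

```csharp
using UnityEngine;

// Sketch: press E to drive the door's HingeJoint motor open or closed.
public class DoorMotor : MonoBehaviour
{
    HingeJoint hinge;
    bool open;

    void Start()
    {
        hinge = GetComponent<HingeJoint>();
    }

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.E))
        {
            open = !open;
            JointMotor motor = hinge.motor;
            motor.targetVelocity = open ? -45f : 45f; // degrees per second
            motor.force = 50f;
            hinge.motor = motor;   // JointMotor is a struct, so it must be reassigned
            hinge.useMotor = true;
        }
    }
}
```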
Animating GameObjects
Unity’s animation system is a powerful tool for adding movement and dynamic behavior to objects within a scene. Animations enhance interactive experiences by making characters, environments, and UI elements feel more engaging and lifelike. While complex character animations (such as facial expressions or fluid movements) are typically created in external digital content creation software like Maya, 3ds Max, and Blender, Unity also provides built-in tools for creating scene-based animations (such as moving platforms, opening doors, robotic arm movement, or UI transitions).
In XFactory, animations can be used to illustrate a forklift loading a box onto a rack, a robotic arm assembling engine components, or CNC machinery operating as part of a production process. These animations not only improve realism but also support educational and operational objectives in engineering simulations.
Core Concepts
- Animation Clips: An Animation Clip is a timeline of movement and property changes applied to GameObjects. Each clip consists of keyframes, which record specific attributes (like position, rotation, scale, or material properties) at precise moments. Unity then interpolates the changes between keyframes to create smooth, continuous motion.
In XFactory, you can use an animation clip to show a drone lifting off from the logistics station and scanning QR codes on boxes.
- Animator Controller: An Animator Controller manages different animation clips and defines how an object transitions between them. It provides logic to switch between animations using conditions such as user input, machine states, or scripted triggers.
A robotic arm in the assembly station might switch between “Idle”, “Pick Part”, and “Assemble Part” states depending on a simulation event.
- Animation States and Transitions: These are part of the Animator Controller. Each state represents a single animation (e.g., “Idle”, “Moving”, “Operating”), and transitions define when and how an object shifts from one state to another. Conditions like a sensor detecting an object or a timer completing can trigger transitions.
The mobile robot in XFactory might transition from “Waiting” to “Moving to Assembly Station” when the operator triggers a command.
- Rigging and Skeletal Animation: For complex models like humanoid figures or articulated robots, a rig is used. A rig consists of a skeleton (a hierarchy of bones) that drives mesh deformation. Skeletal animation manipulates these bones to animate the model.
The quadruped robot in the tech station requires a skeletal rig to animate each leg’s movement while walking across the lab floor.
- Keyframes: Keyframes mark when specific properties of a GameObject change. Unity’s Animation Window allows users to define keyframes for objects manually. Between keyframes, Unity calculates the in-between frames to ensure smooth transitions.
Keyframes can animate the movement of a welding robot arm in the welding station as it joins two car parts.
- Blend Trees: A Blend Tree blends between multiple animations based on input parameters (e.g., speed, direction). This is useful for continuous movement where smooth transitions are necessary.
A mobile robot in the assembly area could use a Blend Tree to blend between turning, accelerating, and reversing animations.
- Animation Events: These allow you to trigger code or actions at a specific frame of an animation. Useful for syncing animations with sound, effects, or gameplay logic.
During a robotic machine tending animation, an event can be triggered when the part hits the machine table or tending table to play a thud sound and activate a vibration effect.
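An Animation Event simply calls a public method on a script attached to the animated GameObject. A minimal receiver sketch (the method and field names here are hypothetical; in the Animation window, add an event at the impact frame and point it at `OnPartImpact`):

```csharp
using UnityEngine;

// Sketch: receiver for an Animation Event fired at the part-impact frame.
public class TendingEvents : MonoBehaviour
{
    [SerializeField] AudioSource thudSound; // assumed to be assigned in the Inspector

    // Called by the Animation Event when the part lands on the table.
    public void OnPartImpact()
    {
        if (thudSound != null)
            thudSound.Play();
        Debug.Log("Part placed on tending table");
    }
}
```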
Animation Methods
- Using the Animation Window (In-Editor Animation): Unity's Animation Window enables visual keyframe-based animation directly within the editor. It is ideal for animating simple or mechanical objects that require straightforward motion. In XFactory, you can animate the opening and closing of the CNC machine's door or a forklift belt moving pallets from one station to another.
- Animator Controller Setup: The Animator Controller manages logic-driven animations. It enables objects to change animations dynamically based on simulation or user-defined conditions. In XFactory, the CNC machine in the production station starts its "Processing" animation when the simulation triggers a start command.
- Importing Animations from External Software: For complex motion and high-quality rigs, animations are often created in tools like Maya, 3ds Max, or Blender, then exported as `.FBX` files and imported into Unity. These can be configured for humanoid or generic rigs in Unity's import settings. In XFactory, you can import detailed quadruped movement sequences created in Blender to use in the exhibition station simulation.
Animation Window
Let's create an animation in Unity, focusing on a flying drone in the logistics station that moves between racks, hovers to scan packages, and lands and takes off. For a drone, this includes flying between racks (`Position`), hovering to scan (`Scale` or a light pulse), landing/take-off (`Position`, `Scale`), and a scanning pulse effect (`Material color`, `Emission`, or `Scale`). This kind of animation enhances realism for warehouse simulations, drone fleet management systems, or XR-based operational training.
Creating an Animation Clip
An `Animation Clip` is a Unity asset that stores timed transformations (position, rotation, scale, etc.). Let's start by creating a simple hover-and-scan animation for the drone:
- Open your XFactory Unity scene and navigate to the logistics station.
- In the `Hierarchy`, select the drone GameObject (`Drone`) or the scanning component (`Eye`).
- Open the `Animation` window (`Window > Animation > Animation`) and click `Create`.
- Name the clip `Drone_HoverScan` and save it in an `Animations` folder.
- Select the drone GameObject (`Drone`) in the `Hierarchy`.
  - Add a `Transform > Position` property to create a hovering effect.
  - Start with `Y = 0.50`, move to `Y = 0.55` at 0.5 s, and return to `Y = 0.50` at 1 s for a gentle hover loop.
- Select the scanning component (`Eye`).
  - In the `Inspector`, click `Add Component > Animator` if it doesn't already have one.
  - Create a new animation for this component.
  - Add `Transform > Scale` to simulate a pulsing scan light.
  - Set the initial scale to `(1, 1, 1)`, increase to `(1.2, 1.2, 1.2)` at 0.25 s, and return to `(1, 1, 1)` at 0.5 s.
- Press the `Play` button in the Unity Editor. Observe the drone gently bobbing up and down while the scanner component pulses in scale, simulating a live hover-and-scan behavior.
If the animation doesn't play, make sure the object has an `Animator` component and that your clip is assigned to it. You can also loop the animation by enabling `Loop Time` in the `Animation` clip's settings (`Inspector > Loop Time`).
Smoothing an Animation
To make the drone's hover animation feel more realistic, you can smooth its vertical motion using Unity's `Curves` editor, which controls the speed and acceleration of animated values.
- Open the `Animation` window and select your `Drone_HoverScan` clip.
- Switch to `Curves` mode using the `Curves` icon in the bottom-left of the timeline.
- Select the `Transform.Position.y` curve.
- Right-click on keyframes and choose `Auto` or `Ease In Out` to apply natural acceleration and deceleration.
- Adjust curve handles to fine-tune the timing and smoothness of the rise and fall.
This gives the drone a more natural hover effect, mimicking how drones subtly slow down before changing vertical direction, which is ideal for training simulations or drone fleet visualizations.
Recording an Animation
Now, let’s animate the emission color of the drone’s scanner eye to pulse from yellow → red → yellow, simulating an active scanning state. In this example, we will use the recording method to add it manually.
- Select the material applied to the `Eye` GameObject. In the `Inspector`:
  - Ensure the shader is `URP/Lit` or another shader that supports emission.
  - Check the `Emission` box to enable emission properties.
  - Set the initial `Emission Color` to a visible value like yellow.
- In the `Hierarchy`, select the `Eye` GameObject. Open the `ScannerPulse` clip.
- Click the record button (red circle) in the `Animation` window.
- In the `Inspector`, under the `Material` section:
  - Click the color box next to `Emission Color` and change it to red.
  - Move the playhead to `0.1s`, then change the color back to yellow.
  - Move the playhead to `0.2s`, and return to red again.
  - Click record again to stop recording.
- Select the `ScannerPulse` animation clip in the `Project` window. In the `Inspector`, check `Loop Time` so the pulse continues during runtime.
- When you press `Play`, the drone’s eye will emit a glowing pulse that cycles between red and yellow, simulating an active scanning state.
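For comparison, the same yellow-to-red pulse can be driven entirely from a script rather than recorded keyframes. This is a sketch, not the tutorial's method: it assumes the `Eye` renderer uses a URP Lit material with emission enabled, and `"_EmissionColor"` is the URP shader property name for that slot.

```csharp
using UnityEngine;

// Sketch: scripted emission pulse equivalent to the recorded ScannerPulse clip.
[RequireComponent(typeof(Renderer))]
public class ScannerPulseScript : MonoBehaviour
{
    public float pulsesPerSecond = 5f;   // 5 pulses/s matches the 0.2 s keyframe cycle

    private Material eyeMaterial;

    void Start()
    {
        // Accessing .material instantiates a per-object copy, so other
        // objects sharing the material are unaffected.
        eyeMaterial = GetComponent<Renderer>().material;
        eyeMaterial.EnableKeyword("_EMISSION");
    }

    void Update()
    {
        // PingPong sweeps 0 -> 1 -> 0, cycling yellow -> red -> yellow.
        float t = Mathf.PingPong(Time.time * pulsesPerSecond * 2f, 1f);
        eyeMaterial.SetColor("_EmissionColor", Color.Lerp(Color.yellow, Color.red, t));
    }
}
```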
Animator Controller
The `Animator` in Unity is a robust system for managing and controlling animations on GameObjects. In engineering environments like XFactory, it enables lifelike simulation of robots, machinery, and UI behaviors. Whether animating a quadruped robot at the exhibit station or a robotic arm in the manufacturing station, the Animator allows smooth transitions between dynamic states.
Core Concepts
- Animator Component: Attached to a GameObject, it links to an `Animator Controller` that drives the animation logic.
- Animator Controller: The control hub for animations, defining states, transitions, and parameters.
- State Machine: Organizes animation states (e.g., Idle, Walking) and controls how they transition.
- Parameters: Input values that trigger transitions:
  - `Bool`: Toggle behavior (`IsEngaged`)
  - `Int`: Mode selector (`OperationMode`)
  - `Float`: Sensor values (`MotorLoad`)
  - `Trigger`: One-time actions (`StartWalking`, `ReturnToBase`)
For the quadruped robot in XFactory, typical states may include `Idle → Scanning → Walking → Returning`.
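From a script, each of the four parameter types is set through the Animator API. The sketch below uses the example parameter names from the list above and assumes a controller that actually defines them.

```csharp
using UnityEngine;

// Sketch: setting each Animator parameter type from code. Parameter names
// (IsEngaged, OperationMode, MotorLoad, StartWalking, ReturnToBase) are the
// examples above and must exist in the attached Animator Controller.
[RequireComponent(typeof(Animator))]
public class RobotStateDriver : MonoBehaviour
{
    private Animator animator;

    void Start() => animator = GetComponent<Animator>();

    public void Engage(float motorLoad, int mode)
    {
        animator.SetBool("IsEngaged", true);        // Bool: toggle behavior
        animator.SetInteger("OperationMode", mode); // Int: mode selector
        animator.SetFloat("MotorLoad", motorLoad);  // Float: sensor value
        animator.SetTrigger("StartWalking");        // Trigger: one-time action
    }

    public void Recall()
    {
        animator.SetBool("IsEngaged", false);
        animator.SetTrigger("ReturnToBase");
    }
}
```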
Animate with Animator
Let’s use Unity’s `Animator` to control simple drone states, hovering and moving, using basic transitions. This is a simpler setup than animating limb joints and is ideal for creating modular drone behavior like patrolling, scanning, or flying between zones.
- Set Up the Scene:
  - Open your Unity scene with the drone in the logistics station.
  - Select the root GameObject (`Drone`).
  - In the `Inspector`, add an `Animator` component if it is not already present.
- Set Up the Animator Controller:
  - In the `Project` window, locate the auto-created `Animator Controller` (e.g., `Drone`).
  - Double-click to open the `Animator` window.
  - Drag your `Drone_HoverScan` animation into the grid; this becomes a new state.
  - Right-click and choose `Make Transition` from `Entry` to `Drone_HoverScan`.
- Create a Simple Patrol Movement Animation:
  - With `Drone` still selected, create a new clip named `Drone_MoveForward`.
  - Animate `Transform.Position.z` to simulate basic forward motion: set `z = -10` at frame 0 and `z = -8.5` at the 2-second mark.
  - Save and stop recording.
- Build the State Machine:
  - Switch to the `Animator` window.
  - Drag `Drone_MoveForward` into the grid.
  - Create a transition from `Drone_HoverScan → Drone_MoveForward`.
  - Add a new `Trigger` parameter named `StartFlying`.
  - Set this as the transition condition.
  - Create a return transition from `Drone_MoveForward → Drone_HoverScan`.
  - Add another `Trigger` called `ReturnToHover`.
  - Uncheck `Has Exit Time` and reduce `Transition Duration` (e.g., `0.1`) for instant state changes.
- Test the Setup:
  - Enter `Play` mode.
  - Open the `Animator` window while the scene is running.
  - Manually click the `StartFlying` trigger. The drone should fly forward.
  - Click `ReturnToHover`. The drone returns to hover mode.
- Optional State Ideas:
  - `ScanPulse`: Animate emission color for the eye using material properties.
  - `Land`: Animate the Y-position to descend onto a platform.
  - `Idle`: No animation; used for drones in standby.
  - `TakeOff`: One-time lift-off motion before switching to hover.
This Animator-based setup is perfect for simple autonomous drone behavior, simulation training, or AI-driven animation without scripting complex logic.
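If you later want the patrol to run without clicking triggers by hand, a few lines of script can fire them on a timer. This is an optional sketch: it assumes the `StartFlying` and `ReturnToHover` triggers created above, and the 3-second interval is an arbitrary choice.

```csharp
using UnityEngine;

// Sketch: automate the manual trigger-clicking from the test step by
// alternating StartFlying / ReturnToHover on a fixed interval.
[RequireComponent(typeof(Animator))]
public class DronePatrol : MonoBehaviour
{
    public float interval = 3f;   // seconds between state toggles (assumed)

    private Animator animator;
    private bool flying;

    void Start()
    {
        animator = GetComponent<Animator>();
        InvokeRepeating(nameof(Toggle), interval, interval);
    }

    void Toggle()
    {
        flying = !flying;
        animator.SetTrigger(flying ? "StartFlying" : "ReturnToHover");
    }
}
```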
Importing Animations
Detailed animations, such as walk or patrol cycles for quadruped robots, are often authored outside Unity. You can create them in Blender (a free, open-source 3D suite) or Maya (an industry standard for animation pipelines), or download ready-made animations from Mixamo (Adobe’s free animation library, which works best with biped rigs but can be adapted with retargeting), TurboSquid, Sketchfab, or ActorCore. Now, let’s use an imported animation to animate the quadruped robot in XFactory:
- Import the Animation File:
  - Download or export the animation as an `.FBX` file.
  - Drag it into your Unity project’s `Assets > Animations > Spot` folder.
  - Select the `.FBX` file in the `Project` window.
  - In the `Inspector`, switch to the `Rig` tab.
  - Set `Animation Type` to `Generic`.
  - In `Avatar Definition`, choose `Create From This Model` since the FBX has its own skeleton. Set the `Root node` to `CINEMA_4D_EDITOR`.
  - Click `Apply`.

The `Generic` animation type is ideal for Spot’s robotic skeleton. Avoid `Humanoid` unless you’re retargeting from a biped source.
- Assign to the Model:
  - In the `Project` window, create a new `Animator Controller`, e.g., `Spot_Controller`.
  - Drag your animation clips into the `Animator` window as states.
  - Select your `Spot` model in the `Hierarchy`.
  - In the `Inspector`, assign `Spot_Controller` to the `Animator` component.
  - Set `Avatar = WALKAvatar`.
- Test the `Animator`:
  - Open the `Animator` window and enter `Play` mode.
  - Manually trigger the animation by adding `Trigger` parameters like `StartWalk`, `StartScan`, etc., or by setting up transitions based on those triggers in the Animator state machine.
  - Click each trigger to preview Spot’s motion live in-scene.
For XR or robotics simulations, combine animation states with logic from scripts or input devices to control Spot’s behavior in real time.
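As one way to wire that up, the sketch below maps keyboard input to Spot's triggers using the legacy Input Manager. It assumes `StartWalk` and `StartScan` `Trigger` parameters exist in `Spot_Controller`; the key bindings are arbitrary, and an XR project would swap in controller or hand-tracking input instead.

```csharp
using UnityEngine;

// Sketch: drive Spot's imported animation states from keyboard input.
// Attach to the Spot model that carries the Animator component.
[RequireComponent(typeof(Animator))]
public class SpotInputController : MonoBehaviour
{
    private Animator animator;

    void Start() => animator = GetComponent<Animator>();

    void Update()
    {
        // Hypothetical bindings: W starts walking, S starts scanning.
        if (Input.GetKeyDown(KeyCode.W)) animator.SetTrigger("StartWalk");
        if (Input.GetKeyDown(KeyCode.S)) animator.SetTrigger("StartScan");
    }
}
```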
Key Takeaways
In Unity, mastering physics and animation tools allows you to create immersive, realistic XR experiences that respond naturally to user interactions. By using Rigidbody components, colliders, and physic materials, you can control how objects move, collide, and react to forces, balancing performance with accuracy. Joints provide physically accurate mechanical connections, while animation systems—through clips, controllers, and imported assets—bring objects and environments to life with dynamic, responsive behaviors. Together, these features enable simulations that blend visual engagement with authentic physical interaction, essential for engineering, training, and interactive storytelling in XR.