B2. Rendering
Learning Outcomes
- Explain the basic concepts of rendering, pipelines, and methods in Unity and XR. Before class, review the introductory material on rendering, then explore Unity's `Scene` view with the XFactory project to spot where materials, lighting, and post-processing appear.
- Identify and inspect meshes in Unity. In preparation, read about mesh fundamentals and Unity's documentation, then open a scene and select several GameObjects to examine their `Mesh Filter` and `Mesh Renderer` components to see how meshes are organized.
- Recognize and modify materials in Unity. Ahead of the session, locate a few GameObjects in the XFactory scene and check their materials in the `Inspector`. Try dragging an existing material onto a GameObject to see how it alters its look.
- Differentiate between Lit and Unlit shaders in Unity. As prep work, review PBR and URP shaders, focusing on Lit vs. Unlit types, then in Unity browse shader options in a material's settings without making any changes yet.
- Apply textures to materials in Unity. Prior to class, find or download texture files (PNG or JPG), import them into your project's `Textures` folder, and drag one onto the `Base Map` slot of a material to preview its effect in the scene.
- Describe lighting types and their effects in Unity scenes. For preparation, read the overview of light types and real-time vs. baked lighting, then in Unity adjust existing lights in a sample scene to observe changes in intensity or type.
What Is Rendering?
Rendering is the process of transforming 3D models, textures, lights, and effects into a 2D image that is displayed on the screen. It is the digital equivalent of photographing a real-world scene using a virtual camera. In XFactory, for example, rendering makes it possible to visualize a high-fidelity, frame-by-frame representation of the factory's interior, exterior, and equipment. Rendering relies on several core components that define how each object is visually represented. These components are foundational to any real-time visualization in XR environments:
- Materials: Define how an object's surface interacts with light—whether it's shiny like a steel engine part, matte like a rubber tire, or translucent like a drone scanner lens. Materials control visual cues such as color, smoothness, metallicity, and emissiveness (e.g., glowing indicators on HMI screens).
- Shaders: Small programs that run on the GPU to determine how pixels are drawn. Shaders control color blending, lighting response, reflection, and special effects. For example, a custom shader might be used to show the heat pattern on a welding robot during operation or the real-time shadows under mobile robots.
- Textures: 2D images wrapped onto 3D surfaces to add fine detail—such as the control panel labels on the production station, the scratches on a forklift, or the tread patterns on a tire. Textures enhance realism without increasing geometric complexity.
- Lighting: Simulates how light behaves in a scene. Proper lighting can indicate the time of day in the exterior yard, or illuminate the workspace around the assembly station to highlight active robotic processes. Lighting types include directional, point, and spotlights, and can cast real-time shadows for enhanced spatial understanding.
- Cameras: Define what part of the scene is rendered, similar to placing surveillance cameras around XFactory. You can simulate first-person views (e.g., from a drone or worker), wide-angle overviews, or cinematic flythroughs.
- Post-Processing: After the basic image is rendered, post-processing effects add polish. These include anti-aliasing (to smooth jagged edges on machine parts), bloom (for glowing LED panels), motion blur, and depth of field (to mimic camera focus).
Rendering Pipelines
Unity provides several Rendering Pipelines. These are pre-defined sets of tools and processes that determine how rendering is performed. Each pipeline offers different trade-offs in performance, flexibility, and visual fidelity. For engineering simulations like XFactory, choosing the right pipeline is critical based on target devices and required realism. To select a pipeline in Unity, go to `Edit > Project Settings > Graphics` and assign your preferred pipeline asset as the `Default Render Pipeline`.
- Built-in Render Pipeline: Unity's default rendering system. Offers general flexibility but lacks optimization for XR. Suitable for rapid prototyping.
- Universal Render Pipeline (URP): Optimized for cross-platform performance. Ideal for VR headsets and mobile AR, such as visualizing the tech station or logistics operations on a tablet.
- High Definition Render Pipeline (HDRP): Delivers high-fidelity graphics and advanced lighting. Recommended for high-end PCs or immersive simulations in a CAVE environment — perfect for detailed walkthroughs of welding robots or visualizing dynamic reflections on machine surfaces.
URP is the go-to rendering solution for XR in Unity, offering optimized performance and cross-platform compatibility for both AR and VR experiences. Review Unity's URP documentation to learn more, and see Unity's guide on choosing the right render pipeline for your project.
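If you want to confirm which pipeline a build is actually using, a minimal script can query Unity's `GraphicsSettings`. This is a sketch, assuming it is attached to any GameObject in an active scene:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Logs which render pipeline asset is active, so you can verify that URP
// (not the Built-in pipeline) is driving rendering on the target device.
public class PipelineCheck : MonoBehaviour
{
    void Start()
    {
        RenderPipelineAsset active = GraphicsSettings.currentRenderPipeline;
        if (active == null)
            Debug.Log("Built-in Render Pipeline is active.");
        else
            Debug.Log($"Active pipeline asset: {active.name} ({active.GetType().Name})");
    }
}
```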
Types of Rendering Methods
Rendering methods determine how lighting and materials are calculated for objects in the scene. The choice impacts performance, visual fidelity, and compatibility with certain effects—essential when building scalable engineering XR applications.
- Forward Rendering: Each object is rendered individually, with lighting calculated per object. It is simple and supports real-time shadows, but can become inefficient with many lights and complex materials.
In XFactory, this is ideal for smaller scenes like the exhibit station, where limited dynamic lights are active and objects require transparency.
- Deferred Rendering: Renders the scene in two passes: geometry first, lighting second. It handles many dynamic lights efficiently and is ideal for realism, but offers limited support for transparent materials (e.g., glass or UI overlays).
In XFactory, this is suitable for the welding and assembly stations, where many lights (e.g., robotic welders, ambient factory lights) interact with metallic surfaces.
- Hybrid Rendering: Combines Forward and Deferred rendering based on object needs. It offers flexibility for mixed content, but is slightly more complex to configure.
In XFactory, this can enable both detailed reflective machinery and transparent AR panels in the same scene. This is beneficial when toggling between XR modes (e.g., engineer view vs. operator view).
Adjusting Rendering Methods
In Unity 6 using URP, rendering methods like `Forward`, `Deferred`, and `Hybrid` are configured through URP assets, as follows.
- Go to `Edit > Project Settings > Graphics`.
- Under `Default Render Pipeline`, assign your URP Pipeline Asset (e.g., `PC_PipelineAsset`). This tells Unity to use URP as your rendering system.
- In the `Project` window, locate and select your URP Renderer Asset (e.g., `PC_PipelineAsset_ForwardRenderer`).
- In the `Inspector` under `Rendering`, set the `Rendering Path` to `Forward` or `Deferred` (if supported by the target platform and enabled features).
Unity URP supports Hybrid Rendering by default. Opaque objects use the `Deferred` path when it is enabled, while transparent objects always render using the `Forward` path. You don't need to manually override these—URP handles it automatically.
Meshes
In Unity, a mesh is the geometric structure that defines the shape of a 3D GameObject. It is a collection of vertices, edges, and faces that form a 3D object: each vertex has XYZ coordinates that define its position in space, and multiple flat polygons collectively form complex 3D shapes. Understanding meshes is essential for rendering, since shaders, materials, and lighting all act on this geometry to define how an object appears. A mesh is generally structured as follows:
- Vertices: Points in 3D space that define the mesh. Each vertex stores spatial information and sometimes additional data like UV coordinates (for textures).
- Edges: Lines connecting two vertices.
- Faces (Polygons): Enclosed surfaces defined by edges. Typically made of triangles (3 vertices) or quads (4 vertices). For example, the surface of the CNC machine in the manufacturing station may look smooth even though it is made up of multiple flat polygons.
- Normals: Perpendicular vectors to each face or vertex that influence lighting and shading. Normals are crucial in how light reflects off the surface, making them essential for realistic rendering. Meshes appear faceted unless normals smooth their appearance. A sphere, for example, consists of flat polygons but appears round because its vertex normals are smoothed and interpolated across faces.
Review this Unity documentation to learn more about the `Mesh` component.
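The structural elements above (vertices, triangles, normals) can also be inspected from a script. A minimal sketch, assuming the GameObject has a `Mesh Filter` with a mesh assigned:

```csharp
using UnityEngine;

// Logs basic mesh statistics for the GameObject this script is attached to.
// Assumes a MeshFilter component is present with a valid mesh.
public class MeshStats : MonoBehaviour
{
    void Start()
    {
        Mesh mesh = GetComponent<MeshFilter>().sharedMesh;
        Debug.Log($"Vertices: {mesh.vertexCount}");
        Debug.Log($"Triangles: {mesh.triangles.Length / 3}"); // triangle list stores 3 indices per face
        Debug.Log($"Normals: {mesh.normals.Length}");
        Debug.Log($"First vertex position: {mesh.vertices[0]}");
    }
}
```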
Exploring Meshes
To view meshes in wireframe mode:
- Open Unity and navigate to the `Scene` view.
- Locate the four shading mode buttons at the top of the window.
- Select the `Wireframe Draw Mode` (circle with two lines inside).
- Observe how different objects (e.g., a cube vs. a sphere or the CNC machine in the manufacturing station) have different mesh structures.
Wireframe mode allows you to see the raw mesh without materials or shading—great for analyzing imported CAD models.
Mesh Filter
The `Mesh Filter` component stores the actual mesh data. It defines the shape of a GameObject but does not control rendering. To view and edit the `Mesh Filter` of a GameObject:
- Select the GameObject.
- In the `Inspector`, find the `Mesh Filter` component.
- The assigned mesh is displayed here. You can replace it by selecting another mesh from the project.
In XFactory, if you are visualizing the car body in the welding station, its `Mesh Filter` defines the geometry that makes up the car's body.
Mesh Renderer
The `Mesh Renderer` component defines how the mesh is rendered. It determines which materials and shaders are applied to the mesh. To view and edit the `Mesh Renderer` of a GameObject:
- Select the GameObject.
- In the `Inspector`, find the `Mesh Renderer` component.
- The `Materials` section shows which materials are applied. The `Lighting` and `Probes` settings control how the object interacts with light and reflection. Under `Additional Settings`, you can tweak shadows, motion vectors, and other rendering options.
The `Mesh Renderer` ensures that the mesh is visible in the game by applying rendering settings such as lighting response, transparency, and texture.
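The same `Mesh Renderer` settings can be adjusted from a script. A sketch, assuming the GameObject has a `MeshRenderer`; `altMaterial` is a hypothetical material you would assign in the `Inspector`:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Adjusts Mesh Renderer settings from script: swaps the applied material,
// disables shadow casting, and turns off shadow reception.
public class RendererTweaks : MonoBehaviour
{
    public Material altMaterial; // hypothetical material, assigned in the Inspector

    void Start()
    {
        var meshRenderer = GetComponent<MeshRenderer>();
        if (altMaterial != null)
            meshRenderer.sharedMaterial = altMaterial;          // replace the applied material
        meshRenderer.shadowCastingMode = ShadowCastingMode.Off; // skip the shadow pass for this object
        meshRenderer.receiveShadows = false;                    // ignore shadows cast onto it
    }
}
```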
CAD Models as Meshes
Engineering use cases often require importing components that are created using CAD software such as SolidWorks or Fusion 360. These CAD models can be exported and imported into Unity as meshes for visualization. For example, many components of XFactory (e.g., UR10e robot, quadruped, CNC parts, drone) have been created in and imported from CAD software. To import a custom mesh into Unity:
- Export the CAD Model from a 3D Software:
  - `.FBX`: Best for animation and material support.
  - `.OBJ`: Simple and widely supported.
  - `.GLTF` or `.GLB`: Lightweight and modern format with good PBR support.
- Import the Model to Unity:
  - Place the file into the Unity `Assets > Models` folder (or a similar path, to keep things organized).
  - Select the imported mesh in the Unity Editor and adjust import settings in the `Inspector`.
  - Adjust the `Scale Factor` value to match real-world units (e.g., set to `0.01` if importing from millimeters).
  - Set `Mesh Compression` to `Off`, `Low`, `Medium`, or `High` to reduce file size at the cost of precision; use carefully for detailed models.
  - Check `Optimize Mesh` to improve rendering performance by reordering vertices and indices for the GPU.
  - For the `Normals` setting, choose between `Import`, `Calculate`, or `None` to control how lighting and shading are applied.
  - Set `Tangents` to `Import`, `Calculate`, or `None` based on whether normal mapping is required for your materials.
  - Be sure to click `Apply` at the bottom of the `Inspector` after making changes.
Tools like Blender can be used as an intermediary to convert native CAD files into Unity-friendly formats.
Optimizing Meshes
Efficient rendering is crucial for a real-time application like XFactory, especially when dealing with high-resolution engineering models. In VR/AR, it is important to keep the poly count low due to limited GPU resources and strict performance budgets for triangles and polygons. Below are some optimization tips:
- Reduce Polygon Count: Use decimation or retopology tools in modeling software to lower complexity while preserving shape.
- Use LOD (Level of Detail): Configure Unity to display simpler mesh versions when objects are farther from the camera.
- Enable Mesh Compression: Helps reduce file size and memory usage.
- Optimize UV Mapping: Ensures that textures are used efficiently without distortion or overlap.
Tools like PiXYZ (for decimating and optimizing CAD and engineering models) are ideal for preparing assets for real-time use. Free alternatives include Blender for manual mesh cleanup. In XFactory, for example, the quadruped robot at the exhibit station may have a high-poly original model. By simplifying its mesh for distant views, XFactory runs smoother on lower-end hardware.
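As an illustration of the LOD tip above, a `LODGroup` can also be configured from script. This is a sketch; `DetailedModel` and `SimpleProxy` are hypothetical child objects holding high- and low-poly versions of the same asset:

```csharp
using UnityEngine;

// Builds a two-level LOD group: a detailed mesh up close and a simplified
// proxy farther away, so distant views cost fewer triangles.
public class LodSetup : MonoBehaviour
{
    void Start()
    {
        var group = gameObject.AddComponent<LODGroup>();

        // Hypothetical child objects containing the high- and low-poly renderers.
        Renderer[] detailed = transform.Find("DetailedModel").GetComponentsInChildren<Renderer>();
        Renderer[] simple = transform.Find("SimpleProxy").GetComponentsInChildren<Renderer>();

        var lods = new LOD[]
        {
            new LOD(0.5f, detailed), // used while the object covers > 50% of screen height
            new LOD(0.1f, simple),   // used down to 10% of screen height, then culled
        };
        group.SetLODs(lods);
        group.RecalculateBounds();
    }
}
```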
Materials
Materials are essential to Unity’s rendering process, defining how objects appear in a scene. They encapsulate properties such as color, texture, reflectivity, and transparency, and serve as the link between 3D meshes and shaders. Materials simulate the physical appearance of surfaces, helping create visual realism and functional clarity in engineering simulations. In XFactory, the welding station uses high-reflectivity metallic materials for robot arms and the car body, the logistics station includes wood-textured pallets, matte plastic drones, and shiny scanning devices, and the exhibit station includes transparent screen overlays and glass surfaces that use materials with custom transparency settings.
Review this Unity documentation to learn more about Materials in Unity, such as creating and assigning materials or accessing material properties in a script (which will be covered later).
Default Material
When you create a new 3D object in Unity, it is automatically assigned a default material. It is named `Lit` (in the Universal Render Pipeline), is non-editable via the editor (only accessible through scripts), and uses a generic shader that adapts to the active rendering pipeline. This neutral material is useful for placeholders during prototyping. To examine and modify the default material:
- Create a 3D Object from the menu: `+` > `3D Object > Cube` or `Sphere`.
- Select the object and inspect the default material and shader in the `Inspector` window.
- Try dragging a different material from the `Project > XFactory > Materials` folder onto the object in the `Scene` view to override it.
To locate materials, open the `Project` window and use the search filter `t:Material`. Use scopes like `In Assets` (searches your project files) or `In Packages` (includes imported assets and plugins).
Adding a New Material
Creating custom materials is essential for adding realism and differentiation between components in engineering scenes. To create a new material:
- Create a `Material`:
  - Navigate to `Assets > XFactory > Materials` in the `Project` window.
  - Right-click and select `Create > Material`.
  - Name it descriptively (e.g., `Box_Metal`).
- Configure Material Properties in the `Inspector`:
  - `Shader`: Use `Universal Render Pipeline/Lit` for realistic PBR-based shading (or `Unlit` for stylized/transparent effects).
  - `Base Map`: Set the color or assign a texture image (e.g., `.png`, `.jpg`, or `.tga`).
  - `Metallic Map` & `Smoothness`: Use sliders or assign a texture to control surface reflectivity and glossiness.
  - `Normal Map`: Add a normal map texture to simulate detailed surface bumps and grooves without additional geometry.
  - `Occlusion Map`: Optional—enhances depth in crevices using ambient occlusion data.
  - `Emission`: Use this if the material should glow or emit light.
- Apply the Material to a GameObject:
  - Drag & Drop: From the `Project` window to an object in the `Scene` window.
  - Inspector Assignment: Drop the material into the `Mesh Renderer > Materials` field.
  - Object Selector: Click the `⊙` icon next to the material field and pick from the list.
  - Copy from Another Object: Drag a material thumbnail from one object's `Inspector` to another.
Materials in Unity are shared assets. This means any changes made to a material will affect all objects using it. To create variations without affecting the original, duplicate the material by clicking on it and pressing `Ctrl+D` (`Cmd+D` on Mac).
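The shared-asset behavior also matters in scripts: `sharedMaterial` edits the asset itself, while `material` creates a per-object copy. A minimal sketch, assuming a `MeshRenderer` on the GameObject:

```csharp
using UnityEngine;

// Demonstrates the difference between sharedMaterial (the shared asset)
// and material (a private per-renderer copy).
public class MaterialInstanceDemo : MonoBehaviour
{
    void Start()
    {
        var meshRenderer = GetComponent<MeshRenderer>();

        // Editing sharedMaterial changes the asset itself, affecting every
        // object that uses it (and the change persists in the Editor):
        // meshRenderer.sharedMaterial.color = Color.red; // avoid unless intended

        // Accessing .material instantiates a private copy for this renderer,
        // so only this object changes (at the cost of an extra material instance).
        meshRenderer.material.color = Color.red;
    }
}
```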
Fixing Magenta Materials
Bright magenta materials are Unity’s way of indicating a problem—most commonly a shader compatibility error. This frequently occurs when you import new assets that use shaders incompatible with your project’s current render pipeline (e.g., URP vs. Built-in). The material is likely referencing a shader that no longer exists or isn’t supported, resulting in Unity rendering it as magenta to signal the issue. You can fix magenta materials in two different ways:
- Manual Fix:
  - Select the affected GameObject.
  - In the `Inspector`, change the shader to `Universal Render Pipeline > Lit`.
- Automatic Fix via `Render Pipeline Converter`:
  - Open `Window > Rendering > Render Pipeline Converter`.
  - Select `Built-in to URP`.
  - Enable the `Material Upgrade` option.
  - Click `Initialize And Convert`.
This ensures all your imported 3D assets are visually correct under the URP.
Shaders
A shader is a specialized script that determines how a material is visually rendered in Unity. It defines how light interacts with a surface and how colors, textures, and reflections are computed on screen. Shaders play a vital role in simulating realistic environments, functional interfaces, and performance-optimized visualizations. Shaders are closely tied to Unity’s Render Pipeline, which governs the rendering process. Each material in Unity references a shader, and this shader dictates how the associated mesh will appear when viewed in the scene. At a high level, shaders operate at two stages:
- Vertex Shading: Modifies the position and appearance of each vertex in a 3D model. This is essential for creating effects like wave motion, bending, or object deformation.
- Fragment Shading (Pixel Shading): Calculates the final color and lighting of each pixel on the surface of the mesh. This determines how the object looks under different lighting and environmental conditions.
Review this e-book to learn how to create shaders and visual effects in Unity.
Physically Based Rendering
Physically Based Rendering (PBR) is a modern rendering approach used in Unity's `Lit Shader`, aiming to simulate how light actually behaves in the physical world. PBR distinguishes between two properties to allow materials to behave consistently across different lighting scenarios:
- Light properties (e.g., brightness, color, angle, intensity).
- Surface properties (e.g., reflectivity, roughness, albedo).
PBR is key to ensure visual fidelity and material behavior, critical for immersive applications like XR.
Shaders in URP
Unity’s Universal Render Pipeline (URP) includes a set of optimized shaders that support both high-quality visuals and real-time performance—ideal for XR environments deployed across various devices.
- `Lit Shader`: A PBR shader that responds to dynamic lighting, reflections, and shadows. Used for realistic materials like metal, plastic, and glass.
- `Unlit Shader`: Ignores all lighting. Best for overlays, UI elements, and low-performance scenarios.
- `Baked Lit Shader`: Optimized for precomputed lighting setups (lightmaps), reducing real-time computation.
- `Terrain Lit Shader`: Special shader for large terrains with PBR support and vegetation blending.
- `Particles` (`Lit`, `Simple Lit`, `Unlit`): Used for visual effects like sparks from a welding robot, smoke from a 3D printer, or exhaust from a mobile platform.
- `Sprite-Lit-Default`: For 2D UI or elements (like AR overlays) that react to lighting.
- `Sprite-Unlit-Default`: For icons, markers, or HUDs that remain unaffected by lighting.
- `Simple Lit` / `Complex Lit`: Lightweight alternatives for platforms where full PBR isn't necessary.
Use the `URP/Lit` shader for reflective surfaces like polished metal, the `Unlit` shader for fixed-color UI overlays on control screens, and `Particles > Lit` for real-time welding sparks in the welding station.
Exploring URP Shaders
You can experiment with URP shaders directly in the Unity Editor:
- Select a 3D GameObject in the Scene (e.g., a robotic arm).
- In the `Inspector`, find the `Material` component.
- Locate the `Shader` field at the top of the material settings.
- Open the dropdown and navigate to `Universal Render Pipeline`.
- Choose from `Lit`, `Unlit`, `Particles > Lit`, etc.
- Observe how different shaders affect lighting, transparency, and reflectivity.
Try switching the `Lit` shader of the car body in the welding station to an `Unlit` shader. You will see the difference in how the surface responds to light and reflections.
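The same Lit-to-Unlit swap can be done from a script using the standard URP shader paths. A sketch, assuming the object has a `MeshRenderer`:

```csharp
using UnityEngine;

// Switches a material's shader at runtime, e.g., from Lit to Unlit,
// to compare lighting response on the same object.
public class ShaderSwitcher : MonoBehaviour
{
    void Start()
    {
        var meshRenderer = GetComponent<MeshRenderer>();
        Shader unlit = Shader.Find("Universal Render Pipeline/Unlit");
        if (unlit != null)
            meshRenderer.material.shader = unlit; // per-object copy keeps other objects Lit
        else
            Debug.LogWarning("URP Unlit shader not found; is URP installed?");
    }
}
```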
Textures
Textures are 2D images applied to 3D models to simulate surface detail and realism. While materials define how an object interacts with light, textures provide the visual content that brings surfaces to life—like rust, grain, wear, paint, or logos. Textures are typically assigned through the `Base Map` of a material, but may also be used in other properties like `Normal Map`, `Metallic Map`, and `Occlusion Map`.
In XFactory, textures help communicate function and realism—for instance, wood grain textures on logistics pallets, scratch textures on welding station robots, and touchscreen interface graphics on tech station panels.
Types of Textures
Different types of texture maps simulate specific surface properties:
- `Base Map` (Albedo): The primary color and surface detail. Think of it as "paint on a surface." Used for most visual appearances.
- `Normal Map`: Simulates bumps and fine surface details like scratches, tread patterns, weld seams, or brushed metal.
- `Metallic Map`: Dictates which parts of the object act metallic vs. non-metallic. Black = non-metal, white = metal.
- `Smoothness/Roughness Map`: Controls gloss and light scattering.
- `Occlusion Map`: Enhances shadows in crevices and seams for depth perception.
The robotic arms in the welding station use three key textures to achieve a realistic appearance: the `Base Map` provides the core color and surface details, the `Metallic Map` controls how reflective and metallic different parts of the surface appear, and the `Normal Map` simulates fine bumps and panel seams without adding extra geometry.
Applying Textures to Materials
Textures are usually applied through the material's `Inspector`. To apply a texture:
- Select a material (e.g., `prop_ind_robot_arm`).
- In the `Inspector`, find the `Base Map` slot (or other maps as necessary).
- Click the small circle `⊙` next to the texture field.
- Choose a texture from the list or drag a texture file into the field.
- Preview changes in the `Scene` or `Game` view.
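Textures can also be assigned from script. In URP's Lit shader the Base Map is exposed as the `_BaseMap` property; the sketch below assumes a hypothetical texture named `crate_albedo` placed in a `Resources` folder:

```csharp
using UnityEngine;

// Assigns a texture to a URP material's Base Map at runtime and sets tiling.
public class TextureAssigner : MonoBehaviour
{
    void Start()
    {
        // "crate_albedo" is a hypothetical texture under Assets/Resources/.
        Texture2D tex = Resources.Load<Texture2D>("crate_albedo");
        var meshRenderer = GetComponent<MeshRenderer>();

        // URP's Lit shader exposes the Base Map as the "_BaseMap" property.
        meshRenderer.material.SetTexture("_BaseMap", tex);

        // Tile the texture twice in each direction (for repeating surfaces).
        meshRenderer.material.SetTextureScale("_BaseMap", new Vector2(2f, 2f));
    }
}
```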
Using Imported Textures
Unity supports various image formats such as `.PNG`, `.JPG`, and `.TGA`. To import textures:
- Import the Texture:
  - Drag the image file into the `Assets > XFactory > Textures` folder (or any organized subfolder in your project).
  - Select the texture and adjust `Import Settings` in the `Inspector`.
  - Set `Texture Type` to `Default` for most uses.
  - Enable `sRGB (Color Texture)` for the `Base Map`; disable for masks or data maps.
  - Set `Alpha Source` appropriately for transparency needs.
  - Set `Wrap Mode` to `Repeat` (for tiling surfaces like floors) or `Clamp` (for UI graphics).
  - Set `Filter Mode` to `Bilinear` or `Trilinear` for smooth transitions.
- Assign the Texture to a Material:
  - In the `Inspector`, drag the texture into the appropriate field.
  - Use the texture as a `Base Map` for the object's surface color and pattern.
  - Assign it to the `Normal Map` slot (if a normal texture is available) to simulate fine surface detail without extra geometry.
  - Use grayscale textures as a `Metallic Map`, `Roughness Map`, or `Occlusion Map` to control reflectivity, surface roughness, and ambient shading. For instance, a cardboard texture applied to a cube's material (`Base Map`) to resemble a cardboard box should use `Repeat` wrap mode and `Bilinear` or `Trilinear` filtering to ensure the texture tiles seamlessly across all sides and maintains visual sharpness at various viewing angles and distances.
Higher-resolution textures provide more detail but can affect performance. For XR applications in engineering simulations, balance is key. For example, 2K (2048x2048) is suitable for mid-size objects (e.g., drones, carts), 4K (4096x4096) is best for close-up views (e.g., HMI screens), and 512 or 1K can be used for background or repeated elements (e.g., storage boxes).
Lighting
Lighting is a foundational component of rendering in Unity. It defines how objects appear in a scene, influences performance, and contributes significantly to realism and usability in XR environments. Thoughtful lighting design enhances clarity, immersion, and usability in interactive environments:
- Clarifying Spatial Relationships and Depth: Well-placed lighting helps users distinguish between foreground, background, and overlapping objects, making navigation more intuitive.
- Highlighting Active vs. Inactive Elements: Dynamic lighting can be used to indicate the status of scene elements—e.g., a CNC machine might glow or cast a light when powered on, drawing the user’s attention.
- Simulating Time-of-Day Conditions: Lighting setups can mimic morning, midday, or evening environments, contributing to narrative context or mood.
- Improving Realism: Realistic lighting and shadowing reinforce depth perception and material authenticity, especially when simulating industrial or physical environments.
- Managing Performance: Lighting affects rendering complexity. Real-time shadows, global illumination, and baked lighting all carry different performance costs—crucial considerations for VR or AR applications where frame rate is critical.
Use baked lighting where possible for static objects to reduce runtime overhead, and consider light probes for dynamic objects to maintain visual consistency.
Light Types
Unity supports several core light types, each serving a unique role in shaping how environments are lit—especially important in engineering visualizations like factory simulations.
- Directional Light simulates sunlight or other distant light sources by casting parallel rays across the entire scene, regardless of distance or size. This is ideal for global lighting, such as simulating daylight over the exterior of a factory or providing uniform light coverage across large workspaces.
- Point Light emits light in all directions from a single point in space, similar to a bare bulb. It is commonly used for overhead lights, indicator LEDs on mobile robots, or illuminating small localized areas.
- Spot Light projects a focused cone of light in a specific direction, allowing for targeted illumination. This type is effective for robotic weld torches, inspection lamps, or simulating focused task lighting in work cells.
- Area Light emits light from a rectangular surface, producing soft, realistic illumination—though it's only available for baked lighting (not real-time). Use area lights to simulate architectural wall panels or soft, diffuse lighting in fixed installations.
- Ambient Light provides low-level, omnidirectional illumination across the entire scene, without casting shadows. This helps avoid harsh contrast and can simulate subtle background lighting in large factory halls or uniformly lit interiors.
Choosing the right light type is essential for balancing visual realism with performance, especially in XR environments.
Lighting Behavior
Light in Unity simulates how it interacts with real-world surfaces:
- Reflection: Smooth surfaces (e.g., polished robot arms) reflect light directionally.
- Refraction: Transparent materials like safety glass bend light, altering object appearance.
- Shadows: Realistic occlusion improves depth perception and immersion. Main shadow types include `Hard` (crisp, defined edges), `Soft` (smoother, more natural shadows), `Realtime` (for dynamic objects and lighting), and `Baked` (for static scenes with better performance).
- Emission: Surfaces can appear to emit light—used for status indicators or screens.
Shadows enhance realism but impact performance. Use shadow-casting lights sparingly, and tune `Shadow Distance` and `Resolution` under `Project Settings > Quality`.
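The shadow budget can also be tuned from script via `QualitySettings`. Note that these values apply directly in the Built-in pipeline; under URP, the pipeline asset's own shadow settings take precedence. A sketch with illustrative values for a mid-range XR target:

```csharp
using UnityEngine;

// Tunes global shadow quality at runtime. Values here are illustrative;
// profile on the target device before committing to them.
public class ShadowBudget : MonoBehaviour
{
    void Start()
    {
        QualitySettings.shadowDistance = 30f;                       // skip shadows beyond 30 m
        QualitySettings.shadowResolution = ShadowResolution.Medium; // balance crispness vs. cost
        QualitySettings.shadows = ShadowQuality.HardOnly;           // hard shadows are cheaper than soft
    }
}
```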
Lighting Source
- Direct Lighting: Direct light comes straight from a light source to a surface, creating strong highlights and defined shadows. It is used for clear visibility and focus (e.g., welding torch or spotlight on HMI).
- Indirect Lighting: Indirect lighting results from light bouncing off surfaces, creating soft shadows, ambient effects, and overall realism. It is especially important in interior factory scenes with reflective materials.
- Global Illumination: Global Illumination (GI) simulates the indirect bounce of light in a scene. Baked GI is precomputed lighting stored in lightmaps, while Realtime GI is calculated at runtime (deprecated in URP).
In URP (used in XFactory), only baked GI is supported—ideal for static environments like factory interiors.
Lighting Methods
- Realtime Lighting: Calculated every frame. Best for dynamic objects, like drones, machines, AMRs, or robots that move or turn. Computationally expensive—use selectively.
- Baked Lighting: Precomputed and applied to static objects. Ideal for factory structure, floors, and large machinery. Provides smooth, realistic illumination without runtime cost.
- Mixed Lighting: Combines realtime and baked lighting. Common in XR to balance fidelity and performance.
Configuring Lights
- Add a Light:
  - In the `Hierarchy`, click `+ > Light > [Type]` (e.g., `Directional`, `Point`, `Spot`).
  - The light appears in the `Scene` view.
- Configure the `Light`:
  - `Type`: Choose the light type (`Directional`, `Point`, `Spot`, `Area`) based on the lighting scenario or physical source you are simulating.
  - `Mode`: Select how the light is calculated (`Realtime`, `Mixed`, or `Baked`).
  - `Light Appearance`: Adjust settings like cookie textures (to simulate patterned light), flare, or bounce intensity to influence the visual feel of the light.
  - `Color`: Choose a color that suits the lighting context—use accurate, neutral tones for realism or warmer/cooler hues for mood and contrast.
  - `Intensity`: Controls how bright the light appears; tune it to match real-world brightness or to ensure visibility in dark areas.
  - `Range`: Sets the distance the light reaches (relevant for `Point` and `Spot` lights); larger values illuminate more area but may affect performance.
  - `Culling Mask`: Defines which layers the light affects—useful for optimizing performance or isolating lighting to specific objects (e.g., only affecting machinery, not background props).
  - `Shadow Type`: Choose whether the light casts hard shadows, soft shadows, or no shadows, depending on the desired realism and performance budget.
Use `Spot` lights to mimic focused, high-intensity work lights on the XR headset stand.
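The same configuration can be done from a script, which is handy for spawning task lighting at runtime. A sketch with illustrative values:

```csharp
using UnityEngine;

// Creates and configures a focused spot light from script, mirroring the
// Inspector settings described above.
public class WorkLightSpawner : MonoBehaviour
{
    void Start()
    {
        var go = new GameObject("WorkLight");
        go.transform.position = transform.position + Vector3.up * 3f;
        go.transform.rotation = Quaternion.Euler(90f, 0f, 0f); // aim straight down

        var spot = go.AddComponent<Light>();
        spot.type = LightType.Spot;
        spot.range = 8f;                             // how far the cone reaches
        spot.spotAngle = 45f;                        // cone width in degrees
        spot.intensity = 3f;
        spot.color = new Color(1f, 0.95f, 0.85f);    // warm task-light tone
        spot.shadows = LightShadows.Soft;
    }
}
```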
Ambient Lighting
Ambient lighting provides overall scene brightness and ensures no object is left in complete darkness. To configure:
- Open `Window > Rendering > Lighting`.
- Go to the `Environment` tab.
- Set `Ambient Source`:
  - `Skybox`: Uses the sky environment (most realistic).
  - `Gradient`: Blend from sky to ground (efficient).
  - `Color`: Uniform ambient light.
- Tweak the `Intensity Multiplier` for brightness.
- Click `Generate Lighting` to apply.
Use ambient lighting to softly illuminate the logistics station’s walls without placing individual light sources.
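Ambient settings are also exposed to scripts through `RenderSettings`, mirroring the `Environment` tab. A sketch using a uniform (`Color`) ambient source:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Sets ambient lighting from script, equivalent to choosing "Color" as the
// Ambient Source in the Lighting window's Environment tab.
public class AmbientSetup : MonoBehaviour
{
    void Start()
    {
        RenderSettings.ambientMode = AmbientMode.Flat;               // uniform "Color" source
        RenderSettings.ambientLight = new Color(0.3f, 0.32f, 0.35f); // dim, cool fill light
        RenderSettings.ambientIntensity = 1.0f;                      // multiplier used in Skybox mode
    }
}
```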
Light Probes
While baked lighting works well for static objects, dynamic objects—such as moving robots or mobile equipment—need additional data to appear realistically lit in baked environments. Light Probes address this by storing information about indirect lighting (light bounced from nearby surfaces) and applying it to dynamic GameObjects. To use light probes:
- In the `Hierarchy`, click `+ > Light > Light Probe Group`.
- Position probes around areas where dynamic objects operate.
- Bake lighting again (`Lighting > Generate Lighting`).
Use Light Probes along forklift routes or drone pathways in XFactory to ensure lighting stays consistent as they move through the environment.
Reflection Probes
In addition to light, dynamic and reflective objects also require believable environmental reflections. Reflection Probes solve this by capturing the surrounding scene and projecting that reflection onto materials—especially important for metallic or glossy surfaces. To use reflection probes:
- In the `Hierarchy`, click `+ > Light > Reflection Probe`.
- Position the probe in the environment (e.g., above a workstation).
- Adjust the bounding box to define the area of influence.
- Choose update mode: `Baked`, `Realtime`, or `Custom`.
- Bake lighting or update probes at runtime as needed.
In XFactory, place a `Reflection Probe` in the logistics station to reflect glow effects on surrounding shelves and objects like the drone.
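Reflection probes can likewise be created and refreshed from script, which is useful when reflections must update on demand. A sketch, assuming this component sits on an empty GameObject near the reflective surfaces:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Configures a realtime Reflection Probe from script and triggers a capture.
public class ProbeSetup : MonoBehaviour
{
    void Start()
    {
        var probe = gameObject.AddComponent<ReflectionProbe>();
        probe.mode = ReflectionProbeMode.Realtime;
        probe.refreshMode = ReflectionProbeRefreshMode.ViaScripting; // update only on demand
        probe.size = new Vector3(10f, 5f, 10f);                      // area of influence
        probe.RenderProbe();                                         // capture the surroundings once
    }
}
```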
Key Takeaways
Rendering in Unity is the process of converting 3D assets—meshes, materials, textures, lighting, and camera views—into the 2D images we see on screen, and it is central to achieving both visual fidelity and performance in XR projects like XFactory. Mastery of core elements such as meshes (geometry), materials (surface properties), shaders (light–surface interaction), textures (detail), and lighting (illumination and mood) ensures realistic and efficient scene presentation. Unity’s rendering pipelines—Built-in, URP, and HDRP—offer different balances of speed and quality, with URP being ideal for cross-platform XR. Choosing appropriate rendering methods (Forward, Deferred, Hybrid) and optimizing assets through mesh simplification, proper material setup, and efficient lighting can dramatically improve performance without sacrificing realism. By integrating baked lighting for static elements, real-time effects for dynamic ones, and leveraging tools like light probes and reflection probes, you can create immersive, responsive XR environments that run smoothly across a range of devices.