B2. Rendering
Learning Outcomes
- Identify and inspect meshes in Unity. In preparation, read about mesh fundamentals and Unity’s documentation, then open a scene and select several GameObjects to examine their `Mesh Filter` and `Mesh Renderer` components to see how meshes are organized.
- Recognize and modify materials in Unity. Ahead of the session, locate a few GameObjects in the XFactory scene and check their materials in the `Inspector`. Try dragging an existing material onto a GameObject to see how it alters its look.
- Differentiate between Lit and Unlit shaders in Unity. As prep work, review URP shaders, focusing on Lit vs. Unlit types, then in Unity browse shader options in a material’s settings without making any changes yet.
- Apply textures to materials in Unity. Prior to class, find or download texture files (PNG or JPG), import them into your project’s `Textures` folder, and drag one onto the `Base Map` slot of a material to preview its effect in the scene.
- Describe lighting types and their effects in Unity scenes. For preparation, read the overview of light types and real-time vs. baked lighting, then in Unity adjust existing lights in a sample scene to observe changes in intensity or type.
- Explain the basic concepts of rendering, pipelines, and methods in Unity and XR. Before class, review the introductory material on rendering, then explore Unity’s `Scene` view with the XFactory project to spot where materials, lighting, and post-processing appear.
What Is Rendering?
Rendering is the process of transforming 3D models, textures, lights, and effects into a 2D image that is displayed on the screen. It is the digital equivalent of photographing a real-world scene using a virtual camera. In XFactory, rendering enables high-fidelity, frame-by-frame visualization of the factory’s interior, exterior, and equipment. To achieve this, Unity relies on a set of core rendering components that determine how each object appears and behaves visually in an XR environment:
- Meshes (Geometry): The 3D models that define an object’s shape using vertices, edges, and polygons. They form the structural foundation of everything rendered in the scene, from factory equipment to vehicles and worker avatars.
- Materials: Control how an object’s surface interacts with light—whether shiny like a steel engine part, matte like a rubber tire, or translucent like a drone scanner lens. Properties include color, smoothness, metallicity, and emissiveness (e.g., glowing indicators on HMI screens).
- Textures: 2D images applied to 3D surfaces to add fine details—such as control panel labels, forklift scratches, or tire tread patterns—enhancing realism without increasing geometric complexity.
- Shaders: GPU programs that define how pixels are drawn. They determine lighting response, color blending, reflections, and special effects. For instance, a custom shader might display heat patterns on a welding robot or simulate real-time shadows under mobile robots.
- Lighting: Simulates real-world light behavior. It conveys time of day in the exterior yard or illuminates active robotic stations inside. Types include directional, point, and spotlights, with support for real-time shadows that enhance spatial understanding.
- Rendering Pipeline: The overarching process that converts 3D scene data into the final 2D image on screen. It governs how objects are culled (discarding what the camera cannot see), how geometry is transformed from 3D to 2D space, and how the image is enhanced with post-processing effects. Unity provides different pipelines—Built-in, Universal Render Pipeline (URP), and High Definition Render Pipeline (HDRP)—each optimized for different performance and visual fidelity needs.
Unity’s rendering begins with meshes, which provide the 3D structure, made visible by materials, textures, and shaders to define surface detail and behavior. Scene lighting adds realism by simulating how objects interact with light. Finally, the rendering pipeline determines visibility (culling), projects the 3D scene into 2D, and applies post-processing effects to deliver the polished final frame.
Meshes (Geometry)
In Unity, a mesh is the geometric structure that defines the shape of a 3D GameObject—a collection of vertices, edges, and faces that together form a 3D object. Understanding meshes is essential for rendering, since shaders, materials, and lighting all act on this geometry to define how objects appear. Each vertex has XYZ coordinates that define its position in space, and meshes are made up of multiple flat polygons that collectively form complex 3D shapes. A mesh is generally structured as follows:
- Vertices: Points in 3D space that define the mesh. Each vertex stores spatial information and sometimes additional data like UV coordinates (for textures).
- Edges: Lines connecting two vertices.
- Faces (Polygons): Enclosed surfaces defined by edges, typically made of triangles (3 vertices) or quads (4 vertices). For example, the surface of the CNC machine in the manufacturing station can look smooth even though it is made up of multiple flat polygons.
- Normals: Vectors perpendicular to each face or vertex that influence lighting and shading. Normals are crucial to how light reflects off a surface, making them essential for realistic rendering. Meshes appear faceted unless their normals are smoothed; a sphere, for example, consists of flat polygons but appears round because its vertex normals are interpolated across faces.
Review this Unity documentation to learn more about the Mesh component.
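To make the structure concrete, the sketch below builds a single quad (two triangles) from raw vertices, triangles, UVs, and normals at runtime. It is a minimal illustration; the class name and layout are arbitrary.

```csharp
using UnityEngine;

// Illustrative sketch: builds a one-quad mesh (two triangles) at runtime
// to show how vertices, triangles, UVs, and normals make up a mesh.
[RequireComponent(typeof(MeshFilter), typeof(MeshRenderer))]
public class QuadMeshExample : MonoBehaviour
{
    void Start()
    {
        var mesh = new Mesh();

        // Four vertices: the corners of a 1 x 1 quad in the XY plane.
        mesh.vertices = new Vector3[]
        {
            new Vector3(0f, 0f, 0f), // 0: bottom-left
            new Vector3(1f, 0f, 0f), // 1: bottom-right
            new Vector3(0f, 1f, 0f), // 2: top-left
            new Vector3(1f, 1f, 0f)  // 3: top-right
        };

        // Two triangles (three vertex indices each), wound so the
        // face points toward the default camera (-Z).
        mesh.triangles = new int[] { 0, 2, 1, 2, 3, 1 };

        // UV coordinates map each vertex to a point on a texture.
        mesh.uv = new Vector2[]
        {
            new Vector2(0f, 0f), new Vector2(1f, 0f),
            new Vector2(0f, 1f), new Vector2(1f, 1f)
        };

        // Normals control shading; here Unity derives them per vertex.
        mesh.RecalculateNormals();

        GetComponent<MeshFilter>().mesh = mesh;
    }
}
```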
Exploring Meshes
To view meshes in wireframe mode:
- Open Unity and navigate to the `Scene` view.
- Locate the four shading mode buttons at the top of the window.
- Select the `Wireframe Draw Mode` (circle with two lines inside).
- Observe how different objects (e.g., a cube vs. a sphere or the CNC machine in the manufacturing station) have different mesh structures.
Wireframe mode allows you to see the raw mesh without materials or shading—great for analyzing imported CAD models.
Mesh Filter
The `Mesh Filter` component stores the mesh asset—the raw vertex, edge, and triangle data that defines the shape of a GameObject. Think of it as the blueprint of the object’s geometry. By itself, a `Mesh Filter` does not display anything on screen—it only provides the data (typically paired with a `Mesh Renderer`). A `Mesh Filter` usually references a shared asset (like a cube, sphere, or imported model). If you assign a different mesh, the object instantly changes shape. To inspect it in the Editor:
- Select the GameObject.
- In the `Inspector`, find the `Mesh Filter` component.
- The assigned mesh is displayed here. You can replace it by selecting another mesh from the project.
In XFactory, if you are visualizing the car body in the welding station, the `Mesh Filter` defines the geometry of the car’s body panels.
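A brief sketch of the same mesh-swapping behavior from script, assuming a replacement mesh asset is assigned in the `Inspector` (the `replacementMesh` field is hypothetical):

```csharp
using UnityEngine;

// Illustrative sketch: swaps the mesh referenced by a Mesh Filter at
// runtime; the Mesh Renderer keeps drawing with the same materials.
[RequireComponent(typeof(MeshFilter))]
public class MeshSwapExample : MonoBehaviour
{
    [SerializeField] Mesh replacementMesh; // placeholder: any mesh asset

    void Update()
    {
        // Press M (legacy Input Manager) to swap the geometry.
        if (Input.GetKeyDown(KeyCode.M))
        {
            // Assigning a different mesh instantly changes the shape.
            GetComponent<MeshFilter>().mesh = replacementMesh;
        }
    }
}
```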
Mesh Renderer
The `Mesh Renderer` component is responsible for drawing the mesh on screen. It takes the geometry from the `Mesh Filter`, applies materials and shaders, and makes the object visible under the scene’s lighting conditions.
- The `Materials` section controls surface look (textures, colors, metallic properties).
- `Lighting` and `Probes` settings decide how the object responds to scene lights, global illumination, and reflection probes.
- `Shadows` can be enabled/disabled, or set to cast/receive. This affects both performance and visual realism.
- Additional settings like `Motion Vectors` (for motion blur), `Lightmap Static` (for baked lighting), and `Rendering Layer Masks` (for selective lighting) give fine-grained control.
Without a `Mesh Filter`, the `Mesh Renderer` has nothing to draw; without a `Mesh Renderer`, the `Mesh Filter` has nothing to display. Together, they form the backbone of any visible 3D object in Unity.
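These settings can also be adjusted from a script through the `Renderer` API, as in the minimal sketch below; the chosen values are illustrative.

```csharp
using UnityEngine;
using UnityEngine.Rendering; // for ShadowCastingMode

// Illustrative sketch: driving Mesh Renderer settings from code,
// mirroring the Inspector options described above.
[RequireComponent(typeof(MeshRenderer))]
public class RendererSettingsExample : MonoBehaviour
{
    void Start()
    {
        var meshRenderer = GetComponent<MeshRenderer>();

        meshRenderer.shadowCastingMode = ShadowCastingMode.On; // cast shadows
        meshRenderer.receiveShadows = true;                    // receive shadows
    }
}
```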
CAD Models as Meshes
Engineering use cases often require importing components that are created using CAD software such as SolidWorks or Fusion 360. These CAD models can be exported and imported into Unity as meshes for visualization. For example, many components of XFactory (e.g., UR10e robot, quadruped, CNC parts, drone) have been created in and imported from CAD software. To import a custom mesh into Unity:
- Export the CAD model from 3D software:
  - `.FBX`: Best for animation and material support.
  - `.OBJ`: Simple and widely supported.
  - `.GLTF` or `.GLB`: Lightweight and modern formats with good PBR support.
- Import the model into Unity:
  - Place the file into the Unity `Assets > Models` folder (or a similar path, to keep things organized).
  - Select the imported mesh in the Unity Editor and adjust import settings in the `Inspector`.
  - Adjust the `Scale Factor` value to match real-world units (e.g., set to `0.01` if importing from millimeters).
  - Set `Mesh Compression` to `Off`, `Low`, `Medium`, or `High` to reduce file size at the cost of precision; use carefully for detailed models.
  - Check `Optimize Mesh` to improve rendering performance by reordering vertices and indices for the GPU.
  - For the `Normals` setting, choose between `Import`, `Calculate`, or `None` to control how lighting and shading are applied.
  - Set `Tangents` to `Import`, `Calculate`, or `None` based on whether normal mapping is required for your materials.
  - Be sure to click `Apply` at the bottom of the `Inspector` after making changes.
Tools like Blender can be used as an intermediary to convert native CAD files into Unity-friendly formats.
Optimizing Meshes
Efficient rendering is crucial for a real-time application like XFactory, especially when dealing with high-resolution engineering models. In VR/AR, keeping the polygon count low is essential due to limited GPU resources and strict per-frame budgets for triangles and polygons. Below are some optimization tips:
- Reduce Polygon Count: Use decimation or retopology tools in modeling software to lower complexity while preserving shape.
- Use LOD (Level of Detail): Configure Unity to display simpler mesh versions when objects are farther from the camera.
- Enable Mesh Compression: Helps reduce file size and memory usage.
- Optimize UV Mapping: Ensures that textures are used efficiently without distortion or overlap.
Tools like PiXYZ (for decimating and optimizing CAD and engineering models) are ideal for preparing assets for real-time use. Free alternatives include Blender for manual mesh cleanup. In XFactory, for example, the quadruped robot at the exhibit station may have a high-poly original model. By simplifying its mesh for distant views, XFactory runs smoother on lower-end hardware.
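To illustrate the LOD idea in practice, here is a minimal C# sketch using Unity’s `LODGroup` API. The three renderer fields (high-, medium-, and low-poly versions of the same model) and the transition thresholds are assumptions you would tune per asset.

```csharp
using UnityEngine;

// Illustrative sketch: configures Level of Detail (LOD) from a script.
// Assumes the GameObject has three child renderers holding high-,
// medium-, and low-poly versions of the same model.
public class LodSetupExample : MonoBehaviour
{
    [SerializeField] Renderer highPoly;   // placeholder renderers,
    [SerializeField] Renderer mediumPoly; // assigned in the Inspector
    [SerializeField] Renderer lowPoly;

    void Start()
    {
        var lodGroup = gameObject.AddComponent<LODGroup>();

        // Each LOD lists its renderers and the screen-height fraction
        // below which Unity switches to the next (simpler) level.
        var lods = new LOD[]
        {
            new LOD(0.60f, new[] { highPoly }),   // close-up
            new LOD(0.25f, new[] { mediumPoly }), // mid-distance
            new LOD(0.05f, new[] { lowPoly })     // far away
        };

        lodGroup.SetLODs(lods);
        lodGroup.RecalculateBounds();
    }
}
```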
Materials
Materials are essential to Unity’s rendering process, defining how objects appear in a scene. They encapsulate properties such as color, texture, reflectivity, and transparency, and serve as the link between 3D meshes and shaders. Materials simulate the physical appearance of surfaces, helping create visual realism and functional clarity in engineering simulations. In XFactory, the welding station uses high-reflectivity metallic materials for robot arms and the car body; the logistics station includes wood-textured pallets, matte plastic drones, and shiny scanning devices; and the exhibit station includes transparent screen overlays and glass surfaces that use materials with custom transparency settings.
Review this Unity documentation to learn more about Materials in Unity, such as creating and assigning materials or accessing material properties in a script (which will be covered later).
Default Material
When you create a new 3D object in Unity, it is automatically assigned a default material. In the Universal Render Pipeline this material is named `Lit`; it is non-editable in the editor (only accessible through scripts) and uses a generic shader that adapts to the active rendering pipeline. This neutral material is useful for placeholders during prototyping. To examine and modify the default material:
- Create a 3D object from the menu: `+ > 3D Object > Cube` or `Sphere`.
- Select the object and inspect the default material and shader in the `Inspector` window.
- Try dragging a different material from the `Project > XFactory > Materials` folder onto the object in the `Scene` view to override it.
To locate materials, open the `Project` window and use the search filter `t:Material`. Use scopes like `In Assets` (searches your project files) or `In Packages` (includes imported assets and plugins).
Adding a New Material
Creating custom materials is essential for adding realism and differentiation between components in engineering scenes. To create a new material:
- Create a `Material`:
  - Navigate to `Assets > XFactory > Materials` in the `Project` window.
  - Right-click and select `Create > Material`.
  - Name it descriptively (e.g., `Box_Metal`).
- Configure material properties in the `Inspector`:
  - `Shader`: Use `Universal Render Pipeline/Lit` for realistic PBR-based shading (or `Unlit` for stylized/transparent effects).
  - `Base Map`: Set the color or assign a texture image (e.g., `.png`, `.jpg`, or `.tga`).
  - `Metallic Map` & `Smoothness`: Use sliders or assign a texture to control surface reflectivity and glossiness.
  - `Normal Map`: Add a normal map texture to simulate detailed surface bumps and grooves without additional geometry.
  - `Height Map`: Optional—used for parallax or displacement mapping, it creates the illusion (or actual modification) of surface depth by shifting pixels or vertices based on height values.
  - `Occlusion Map`: Optional—enhances depth in crevices using ambient occlusion data.
  - `Emission`: Use this if the material should glow or emit light.
- Apply the material to a GameObject:
  - Drag & Drop: Drag the material from the `Project` window onto an object in the `Scene` view.
  - Inspector Assignment: Drop the material into the `Mesh Renderer > Materials` field.
  - Object Selector: Click the `⊙` icon next to the material field and pick from the list.
  - Copy from Another Object: Drag a material thumbnail from one object’s `Inspector` to another.

Materials in Unity are shared assets. This means any changes made to a material will affect all objects using it. To create variations without affecting the original, duplicate the material by selecting it and pressing `Ctrl+D` (`Cmd+D` on Mac).
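As a small preview of accessing material properties from a script (covered in detail later), the sketch below creates a metallic URP material at runtime and assigns it to the current object. It is a minimal illustration, assuming the URP Lit shader is included in the project; the color and slider values are arbitrary.

```csharp
using UnityEngine;

// Illustrative sketch: creating a material and setting URP Lit
// properties from code. Note: editing renderer.sharedMaterial changes
// the shared asset, while renderer.material creates a per-object copy.
[RequireComponent(typeof(MeshRenderer))]
public class MaterialSetupExample : MonoBehaviour
{
    void Start()
    {
        // Create a new material that uses the URP Lit shader.
        var metal = new Material(Shader.Find("Universal Render Pipeline/Lit"));

        // URP Lit exposes its Inspector sliders as shader properties.
        metal.SetColor("_BaseColor", new Color(0.75f, 0.75f, 0.78f));
        metal.SetFloat("_Metallic", 1.0f);    // fully metallic
        metal.SetFloat("_Smoothness", 0.85f); // glossy, reflective

        // Assigning via .material instantiates a copy, so other objects
        // sharing the original material are unaffected.
        GetComponent<MeshRenderer>().material = metal;
    }
}
```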
Fixing Magenta Materials
Bright magenta materials are Unity’s way of indicating a problem—most commonly a shader compatibility error. This frequently occurs when you import new assets that use shaders incompatible with your project’s current render pipeline (e.g., URP vs. Built-in). The material is likely referencing a shader that no longer exists or isn’t supported, resulting in Unity rendering it as magenta to signal the issue. You can fix magenta materials in two different ways:
- Manual Fix:
  - Select the affected GameObject.
  - In the `Inspector`, change the shader to `Universal Render Pipeline > Lit`.
- Automatic Fix via `Render Pipeline Converter`:
  - Open `Window > Rendering > Render Pipeline Converter`.
  - Select `Built-in to URP`.
  - Enable the `Material Upgrade` option.
  - Click `Initialize And Convert`.
This ensures all your imported 3D assets are visually correct under the URP.
Textures
Textures are 2D images applied to 3D models to simulate surface detail and realism. While materials define how an object interacts with light, textures provide the visual content that brings surfaces to life—like rust, grain, wear, paint, or logos. Textures are typically assigned through the `Base Map` of a material, but may also be used in other properties like `Normal Map`, `Metallic Map`, and `Occlusion Map`.
In XFactory, textures help communicate function and realism—for instance, wood grain textures on logistics pallets, scratch textures on welding station robots, and touchscreen interface graphics on tech station panels.
Types of Textures
Different types of texture maps simulate specific surface properties:
- `Base Map` (Albedo): The primary color and surface detail. Think of it as “paint on a surface.” Used for most visual appearances.
- `Normal Map`: Simulates bumps and fine surface details like scratches, tread patterns, weld seams, or brushed metal.
- `Height Map`: Adds the illusion (or actual modification) of depth by offsetting surface details. Used for parallax effects or displacement, making surfaces appear raised or recessed.
- `Metallic Map`: Dictates which parts of the object act metallic vs. non-metallic. Black = non-metal, white = metal.
- `Smoothness/Roughness Map`: Controls gloss and light scattering.
- `Occlusion Map`: Enhances shadows in crevices and seams for depth perception.

The robotic arms in the welding station use three key textures to achieve a realistic appearance: the `Base Map` provides the core color and surface details, the `Metallic Map` controls how reflective and metallic different parts of the surface appear, and the `Normal Map` simulates fine bumps and panel seams without adding extra geometry.
Applying Textures to Materials
Textures are usually applied through the `Material Inspector`. To apply a texture:
- Select a material (e.g., `prop_ind_robot_arm`).
- In the `Inspector`, find the `Base Map` slot (or other maps as necessary).
- Click the small circle `⊙` next to the texture field.
- Choose a texture from the list or drag a texture file into the field.
- Preview changes in the `Scene` or `Game` view.
Using Imported Textures
Unity supports various image formats such as `.PNG`, `.JPG`, and `.TGA`. To import textures:
- Import the texture:
  - Drag the image file into the `Assets > XFactory > Textures` folder (or any organized subfolder in your project).
  - Select the texture and adjust `Import Settings` in the `Inspector`.
  - Set `Texture Type` to `Default` for most uses.
  - Enable `sRGB (Color Texture)` for a `Base Map`; disable it for masks or data maps.
  - Set `Alpha Source` appropriately for transparency needs.
  - Set `Wrap Mode` to `Repeat` (for tiling surfaces like floors) or `Clamp` (for UI graphics).
  - Set `Filter Mode` to `Bilinear` or `Trilinear` for smooth transitions.
- Assign the texture to a material:
  - In the `Inspector`, drag the texture into the appropriate field.
  - Use the texture as a `Base Map` for the object’s surface color and pattern.
  - Assign it to the `Normal Map` slot (if a normal texture is available) to simulate fine surface detail without extra geometry.
  - Use grayscale textures as a `Metallic Map`, `Roughness Map`, or `Occlusion Map` to control reflectivity, surface roughness, and ambient shading.

For instance, a cardboard texture applied to a cube’s material (`Base Map`) to resemble a cardboard box should use `Repeat` wrap mode and `Bilinear` or `Trilinear` filtering to ensure the texture tiles seamlessly across all sides and maintains visual sharpness at various viewing angles and distances.
Higher-resolution textures provide more detail but can affect performance. For XR applications in engineering simulations, balance is key. For example, 2K (2048x2048) is suitable for mid-size objects (e.g., drones, carts), 4K (4096x4096) is best for close-up views (e.g., HMI screens), and 512 or 1K can be used for background or repeated elements (e.g., storage boxes).
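The same texture assignments can be made from a script. The following is a minimal sketch, assuming a URP Lit material; the `albedo` and `normal` fields are hypothetical texture assets you would assign in the `Inspector`.

```csharp
using UnityEngine;

// Illustrative sketch: assigning imported textures to URP Lit material
// slots from code.
[RequireComponent(typeof(MeshRenderer))]
public class TextureAssignExample : MonoBehaviour
{
    [SerializeField] Texture2D albedo; // placeholder: e.g., cardboard Base Map
    [SerializeField] Texture2D normal; // placeholder: optional normal map

    void Start()
    {
        var material = GetComponent<MeshRenderer>().material;

        material.SetTexture("_BaseMap", albedo); // surface color/pattern
        if (normal != null)
        {
            material.SetTexture("_BumpMap", normal); // URP's normal map slot
            material.EnableKeyword("_NORMALMAP");    // turn the feature on
        }

        // Tile the Base Map 2x2 across the surface (pairs with Repeat wrap mode).
        material.SetTextureScale("_BaseMap", new Vector2(2f, 2f));
    }
}
```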
Shaders
A shader is a specialized script that determines how a material is visually rendered in Unity. It defines how light interacts with a surface and how colors, textures, and reflections are computed on screen. Shaders play a vital role in simulating realistic environments, functional interfaces, and performance-optimized visualizations. Shaders are closely tied to Unity’s Render Pipeline, which governs the rendering process. Each material in Unity references a shader, and this shader dictates how the associated mesh will appear when viewed in the scene. At a high level, shading proceeds through the following stages:
- Vertex Shading: Modifies the position and appearance of each vertex in a 3D model. This is essential for creating effects like wave motion, bending, or object deformation.
- Geometry Shading: Optionally processes entire primitives (points, lines, triangles) after vertex shading, allowing for tasks like tessellation, shadow volume extrusion, or adding/removing geometry dynamically.
- Rasterization: Converts primitives (triangles, lines) into fragments (potential pixels). This stage handles visibility, perspective correction, and interpolates values (e.g., texture coordinates, normals) across surfaces.
- Fragment Shading (Pixel Shading): Calculates the final color and lighting of each fragment. This determines how the object looks under different lighting models, textures, and material properties.
- Per-Fragment Operations: Final tests and operations before writing to the framebuffer, including depth testing, stencil testing, blending, and discarding fragments based on transparency or other conditions.
In Unity, “shading” means evaluating how geometry or pixels should look on screen, whether that involves positions, colors, lighting, or other attributes. Shaders are like a group of painters working on a mural: the vertex shaders are the sketch artists who lay out and adjust the framework of shapes, deciding where edges and corners go, while the fragment (pixel) shaders are the painters who fill in every patch with color, shading, and texture, making the final image come to life under light. Review this e-book to learn how to create shaders and visual effects in Unity.
Shaders in URP
Unity’s Universal Render Pipeline (URP) includes a set of optimized shaders that support both high-quality visuals and real-time performance—ideal for XR environments deployed across various devices.
- `Lit Shader`: Responds to dynamic lighting, reflections, and shadows. Used for realistic materials like metal, plastic, and glass. Like real paint on a sculpture, it changes appearance under different lights.
- `Unlit Shader`: Ignores all lighting. Best for overlays, UI elements, and low-performance scenarios. Like a poster or sticker, it always looks the same regardless of light.
- `Baked Lit Shader`: Optimized for precomputed lighting setups (lightmaps), reducing real-time computation. Like a photograph with lighting already captured, it won’t change if you move a lamp around.
- `Terrain Lit Shader`: Special shader for large terrains with PBR support and vegetation blending. Like durable outdoor paint designed for soil, grass, and landscapes under sunlight.
- `Particles` (`Lit`, `Simple Lit`, `Unlit`): Used for visual effects like sparks from a welding robot, smoke from a 3D printer, or exhaust from a mobile platform. Like special-effects paint or glow powder—sometimes reacting to light, sometimes kept simple for performance.
- `Sprite-Lit-Default`: For 2D UI or elements (like AR overlays) that react to lighting. Like a paper cutout under a lamp, still catching shadows and light.
- `Sprite-Unlit-Default`: For icons, markers, or HUDs that remain unaffected by lighting. Like a glowing sticker, always clear and bright no matter the lighting.
- `Simple Lit` / `Complex Lit`: Lightweight alternatives for platforms where full PBR isn’t necessary. Like using cheap vs. premium paint—one runs faster, the other gives richer detail.
Use the `URP/Lit` shader for reflective surfaces like polished metal, the `Unlit` shader for fixed-color UI overlays on control screens, and `Particles > Lit` for real-time welding sparks in the welding station.
Exploring URP Shaders
You can experiment with URP shaders directly in the Unity Editor:
- Select a 3D GameObject in the scene (e.g., a robotic arm).
- In the `Inspector`, find the `Material` component.
- Locate the `Shader` field at the top of the material settings.
- Open the dropdown and navigate to `Universal Render Pipeline`.
- Choose from `Lit`, `Unlit`, `Particles > Lit`, etc.
- Observe how different shaders affect lighting, transparency, and reflectivity.
Try switching the `Lit` shader of the car body in the welding station to the `Unlit` shader. You will see the difference in how light and reflections interact with the surface.
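To compare the two interactively, the sketch below toggles a material between URP’s `Lit` and `Unlit` shaders at runtime; the key binding and component placement are illustrative.

```csharp
using UnityEngine;

// Illustrative sketch: toggles a material between the URP Lit and Unlit
// shaders to compare how each responds to scene lighting.
[RequireComponent(typeof(Renderer))]
public class ShaderToggleExample : MonoBehaviour
{
    Shader litShader;
    Shader unlitShader;
    Renderer objectRenderer;

    void Start()
    {
        litShader = Shader.Find("Universal Render Pipeline/Lit");
        unlitShader = Shader.Find("Universal Render Pipeline/Unlit");
        objectRenderer = GetComponent<Renderer>();
    }

    void Update()
    {
        // Press U (legacy Input Manager) to toggle shaders.
        if (Input.GetKeyDown(KeyCode.U))
        {
            var material = objectRenderer.material; // per-object copy
            material.shader =
                material.shader == litShader ? unlitShader : litShader;
        }
    }
}
```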
Lighting
Lighting is a foundational component of rendering in Unity. It defines how objects appear in a scene, influences performance, and contributes significantly to realism and usability in XR environments. Thoughtful lighting design enhances clarity, immersion, and usability in interactive environments:
- Clarifying Spatial Relationships and Depth: Well-placed lighting helps users distinguish between foreground, background, and overlapping objects, making navigation more intuitive.
- Highlighting Active vs. Inactive Elements: Dynamic lighting can be used to indicate the status of scene elements—e.g., a CNC machine might glow or cast a light when powered on, drawing the user’s attention.
- Simulating Time-of-Day Conditions: Lighting setups can mimic morning, midday, or evening environments, contributing to narrative context or mood.
- Improving Realism: Realistic lighting and shadowing reinforce depth perception and material authenticity, especially when simulating industrial or physical environments.
- Managing Performance: Lighting affects rendering complexity. Real-time shadows, global illumination, and baked lighting all carry different performance costs—crucial considerations for VR or AR applications where frame rate is critical.
Use baked lighting where possible for static objects to reduce runtime overhead, and consider light probes for dynamic objects to maintain visual consistency.
Light Types
Unity supports several core light types, each serving a unique role in shaping how environments are lit—especially important in engineering visualizations like factory simulations.
- Directional Light simulates sunlight or other distant light sources by casting parallel rays across the entire scene, regardless of distance or size. This is ideal for global lighting, such as simulating daylight over the exterior of a factory or providing uniform light coverage across large workspaces.
- Point Light emits light in all directions from a single point in space, similar to a bare bulb. It is commonly used for overhead lights, indicator LEDs on mobile robots, or illuminating small localized areas.
- Spot Light projects a focused cone of light in a specific direction, allowing for targeted illumination. This type is effective for robotic weld torches, inspection lamps, or simulating focused task lighting in work cells.
- Area Light emits light from a rectangular surface, producing soft, realistic illumination—though it’s only available for baked lighting (not real-time). Use area lights to simulate architectural wall panels or soft, diffuse lighting in fixed installations.
- Ambient Light provides low-level, omnidirectional illumination across the entire scene, without casting shadows. This helps avoid harsh contrast and can simulate subtle background lighting in large factory halls or uniformly lit interiors.
Choosing the right light type is essential for balancing visual realism with performance, especially in XR environments.
Lighting Behavior
Unity simulates how light interacts with real-world surfaces:
- Reflection: Smooth surfaces (e.g., polished robot arms) reflect light directionally.
- Refraction: Transparent materials like safety glass bend light, altering object appearance.
- Shadows: Realistic occlusion improves depth perception and immersion. Main shadow types include `Hard` (crisp, defined edges), `Soft` (smoother, more natural shadows), `Realtime` (for dynamic objects and lighting), and `Baked` (for static scenes with better performance).
- Emission: Surfaces can appear to emit light—used for status indicators or screens.
Shadows enhance realism but impact performance. Use shadow-casting lights sparingly, and tune `Shadow Distance` and `Resolution` under `Project Settings > Quality`.
Lighting Source
- Direct Lighting: Direct light comes straight from a light source to a surface, producing strong highlights, defined shadows, and clear visibility of object contours. It is often used for attention and focus (e.g., a welding torch, floodlight on a machine, or a spotlight over an HMI).
- Indirect Lighting: Indirect lighting results from light bouncing between surfaces, adding soft shadows, ambient fill light, and subtle color bleeding from surrounding materials (e.g., a red machine reflecting onto the floor). This contributes heavily to immersion and realism, especially in interior factory scenes with metallic or glossy materials.
- Global Illumination (GI): GI simulates how light scatters and bounces across an entire scene, enhancing depth and realism.
  - Baked GI: Precomputed lighting stored in lightmaps or light probes, optimized for static environments.
  - Realtime GI: Updates dynamically as lighting changes, but is deprecated in URP due to high performance cost.
  - Progressive Lightmapper: Unity’s tool for baking GI, offering GPU/CPU-based solutions for faster iteration.
Unity supports Realtime Lighting (best for dynamic objects but costly in XR), Baked Lighting (ideal for static elements with no runtime cost), and Mixed Lighting (a balance that bakes static geometry while allowing dynamic objects to cast realtime shadows). In URP (used in XFactory), only baked GI is supported, which is well-suited for static interior environments like factory floors, walls, and machinery. Review this Unity documentation to learn more about light modes in Unity.
Configuring Lights (Direct)
- Add a Light:
  - In the `Hierarchy`, click `+ > Light > [Type]` (e.g., `Directional`, `Point`, `Spot`).
  - The light appears in the `Scene` view.
- Configure the `Light`:
  - `Type`: Choose the light type (`Directional`, `Point`, `Spot`, `Area`) based on the lighting scenario or physical source you are simulating.
  - `Mode`: Select how the light is calculated (`Realtime`, `Mixed`, or `Baked`).
  - `Light Appearance`: Adjust settings like cookie textures (to simulate patterned light), flare, or bounce intensity to influence the visual feel of the light.
  - `Color`: Choose a color that suits the lighting context—use accurate, neutral tones for realism or warmer/cooler hues for mood and contrast.
  - `Intensity`: Controls how bright the light appears; tune it to match real-world brightness or to ensure visibility in dark areas.
  - `Range`: Sets the distance the light reaches (relevant for `Point` and `Spot` lights); larger values illuminate more area but may affect performance.
  - `Culling Mask`: Defines which layers the light affects—useful for optimizing performance or isolating lighting to specific objects (e.g., only affecting machinery, not background props).
  - `Shadow Type`: Choose whether the light casts hard shadows, soft shadows, or no shadows, depending on the desired realism and performance budget.
Use `Spot` lights to mimic focused, high-intensity work lights on the XR headset stand.
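The same properties are available from script. Below is a minimal sketch that adds and configures a `Spot` light in code; all values are illustrative and would be tuned to the scene.

```csharp
using UnityEngine;

// Illustrative sketch: configuring a Spot light from a script with the
// same properties exposed in the Inspector.
public class WorkLightExample : MonoBehaviour
{
    void Start()
    {
        var workLight = gameObject.AddComponent<Light>();

        workLight.type = LightType.Spot;               // focused cone of light
        workLight.color = new Color(1f, 0.95f, 0.85f); // warm task lighting
        workLight.intensity = 3f;                      // brightness
        workLight.range = 6f;                          // reach in meters
        workLight.spotAngle = 45f;                     // cone width in degrees
        workLight.shadows = LightShadows.Soft;         // soft real-time shadows
    }
}
```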
Ambient Lighting (Indirect)
Ambient lighting provides overall scene brightness and ensures no object is left in complete darkness. To configure:
- Open `Window > Rendering > Lighting`.
- Go to the `Environment` tab.
- Set the `Ambient Source`:
  - `Skybox`: Uses the sky environment (most realistic).
  - `Gradient`: Blends from sky to ground (efficient).
  - `Color`: Uniform ambient light.
- Tweak the `Intensity Multiplier` for brightness.
- Click `Generate Lighting` to apply.
Use ambient lighting to softly illuminate the logistics station’s walls without placing individual light sources.
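Ambient settings can also be driven from a script through `RenderSettings`. The sketch below is a minimal illustration with arbitrary values.

```csharp
using UnityEngine;
using UnityEngine.Rendering; // for AmbientMode

// Illustrative sketch: setting ambient lighting from code. These map to
// the Environment tab settings described above.
public class AmbientSetupExample : MonoBehaviour
{
    void Start()
    {
        // Uniform ambient color (the "Color" ambient source).
        RenderSettings.ambientMode = AmbientMode.Flat;
        RenderSettings.ambientLight = new Color(0.35f, 0.37f, 0.40f);

        // For the Skybox source, switch the mode and tune the
        // intensity multiplier instead:
        // RenderSettings.ambientMode = AmbientMode.Skybox;
        // RenderSettings.ambientIntensity = 1.2f;
    }
}
```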
Light Probes (Indirect)
While baked lighting works well for static objects, dynamic objects—such as moving robots or mobile equipment—need additional data to appear realistically lit in baked environments. Light Probes address this by storing information about indirect lighting (light bounced from nearby surfaces) and applying it to dynamic GameObjects. To use light probes:
- In the `Hierarchy`, click `+ > Light > Light Probe Group`.
- Position probes around areas where dynamic objects operate.
- Bake lighting again (`Lighting > Generate Lighting`).
Use Light Probes along forklift routes or drone pathways in XFactory to ensure lighting stays consistent as they move through the environment.
Reflection Probes (Indirect)
In addition to light, dynamic and reflective objects also require believable environmental reflections. Reflection Probes solve this by capturing the surrounding scene and projecting that reflection onto materials—especially important for metallic or glossy surfaces. To use reflection probes:
- In the `Hierarchy`, click `+ > Light > Reflection Probe`.
- Position the probe in the environment (e.g., above a workstation).
- Adjust the bounding box to define the area of influence.
- Choose an update mode: `Baked`, `Realtime`, or `Custom`.
- Bake lighting or update probes at runtime as needed.

In XFactory, place a `Reflection Probe` in the logistics station to reflect glow effects on surrounding shelves and objects like the drone.
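For dynamic scenarios, a probe can also be created and refreshed from a script. The following is a minimal sketch; the probe size and refresh strategy are assumptions to adapt per station.

```csharp
using UnityEngine;
using UnityEngine.Rendering; // for ReflectionProbeMode, ReflectionProbeRefreshMode

// Illustrative sketch: adding a realtime Reflection Probe from code and
// triggering a capture on demand.
public class ProbeSetupExample : MonoBehaviour
{
    void Start()
    {
        var probe = gameObject.AddComponent<ReflectionProbe>();

        probe.mode = ReflectionProbeMode.Realtime; // update at runtime
        probe.refreshMode = ReflectionProbeRefreshMode.ViaScripting;
        probe.size = new Vector3(10f, 4f, 10f);    // area of influence

        probe.RenderProbe(); // capture the surroundings once
    }
}
```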
Rendering Pipeline
The rendering pipeline in Unity is the ordered process through which scene data is transformed into the final image displayed on screen or in an XR headset. It begins by determining which objects need to be drawn, processes those objects with materials, shaders, and lighting to generate pixel data, and finally enhances the image with post-processing effects. This pipeline ensures efficient performance while maintaining high visual fidelity.
- Culling: Culling is the first stage, where Unity decides which objects are unnecessary to render. Objects outside the camera’s view, obscured behind others (occlusion culling), or facing away can be skipped. By removing irrelevant geometry early, Unity reduces rendering workload and improves frame rates.
- Rendering: Rendering is the core stage where visible objects are drawn. Unity processes scene geometry, applies materials and shaders, evaluates lighting and shadows, and issues draw calls—commands sent from the CPU that tell the GPU to “draw something.” This produces the base image containing all visible elements. The rendering stage varies in detail depending on the pipeline used (Built-in, URP, or HDRP).
- Post-Processing: Post-processing is the final stage, where full-screen effects are applied to polish the image. Common effects include anti-aliasing to smooth jagged edges, bloom for glowing lights, depth of field for cinematic focus, motion blur, and color grading. These effects elevate realism and provide artistic control before the image is sent to the display or XR device.
Imagine making a movie: Culling is like the director telling unnecessary extras to step off set so the crew only films what the audience will see; Rendering is the actual shoot, where actors perform under lights with costumes and props to capture the raw footage; and Post-Processing is the film editor polishing that footage with filters, color grading, and special effects before releasing the final cut to the audience.
Types of Rendering Pipeline
Unity provides several Rendering Pipelines. These are pre-defined sets of tools and processes that determine how rendering is performed. Each pipeline offers different trade-offs in performance, flexibility, and visual fidelity. For engineering simulations like XFactory, choosing the right pipeline is critical based on target devices and the required level of realism. Unity’s modern pipelines are part of the Scriptable Render Pipeline (SRP) system, which allows developers to configure and extend how rendering works. To select a pipeline in Unity, go to `Edit > Project Settings > Graphics` and assign your preferred pipeline asset as the `Default Render Pipeline`.
- Built-in Render Pipeline: Unity’s legacy default rendering system. Offers general flexibility and a broad ecosystem of shaders and assets, but lacks optimization for XR and modern rendering techniques. Suitable for rapid prototyping or projects where backward compatibility is important.
- Universal Render Pipeline (URP): Optimized for cross-platform performance. Ideal for VR headsets and mobile AR, such as visualizing the tech station or logistics operations on a tablet. URP balances fidelity and efficiency, making it the most common choice for XR projects.
- High Definition Render Pipeline (HDRP): Delivers cutting-edge visual fidelity with advanced lighting and materials. Recommended for high-end PCs or immersive simulations in a CAVE environment—perfect for detailed walkthroughs of welding robots or dynamic reflections on machine surfaces. HDRP is not suitable for low-power XR devices but excels in tethered, high-performance setups.
URP is the go-to rendering solution for XR in Unity, offering optimized performance and cross-platform compatibility for both AR and VR experiences. Review this Unity documentation to learn more about URP. This Unity documentation provides detailed guidelines about choosing the right Unity render pipeline for your project. Note that switching between pipelines is not seamless. Materials, shaders, and post-processing settings often require conversion or re-creation when moving from Built-in to URP or HDRP.
Rendering Paths
Rendering paths determine how lighting and materials are calculated for objects in the scene. The choice directly impacts performance, visual fidelity, and hardware compatibility.
- Forward Rendering: In forward rendering, each object is rendered individually in a single pass that processes its geometry, material, and shading. While straightforward, additional passes are required for lighting, shadows, and other effects, meaning that each extra light increases rendering cost. This method is efficient for scenes with a small number of lights and simple materials, making it well-suited to lower-end hardware and mobile devices, but it becomes less optimal in complex, light-heavy environments. Forward+ improves this by using tiled/clustered lighting so many lights can be handled more efficiently while retaining MSAA support.
- Deferred Rendering: Deferred rendering begins with a geometry pass (G-buffer) that stores per-pixel data such as positions, normals, and material properties, instead of shading objects directly. Lighting and shading are then applied in later passes using this data, which makes handling many dynamic lights much more efficient. This approach minimizes overdraw and scales better with complex lighting, but it has drawbacks such as limited transparency support, higher memory usage due to the G-buffer, and weaker MSAA performance. Deferred+ extends this by also using tiled/clustered lighting, making it more scalable with very large numbers of lights on modern GPUs.
Imagine you’re rendering a bunch of cubes in Unity: Forward Rendering → Each cube is drawn one at a time, and for every light that hits it, Unity runs another pass to calculate shading. A few lights? Smooth. Ten lights? Each cube is redrawn many times, which gets expensive. Deferred Rendering → All the cubes are drawn once into the G-buffer, storing each cube’s position, normal, and material info. Then Unity applies all the lights afterward in a single lighting pass—much faster when lots of lights are shining on the cubes.
Adjusting Rendering Paths
In Unity with URP, rendering paths like `Forward` and `Deferred` are configured through URP assets, as follows:
- Go to `Edit > Project Settings > Graphics`.
- Under `Default Render Pipeline`, assign your URP Pipeline Asset (e.g., `PC_PipelineAsset`). This tells Unity to use URP as your rendering system.
- In the `Project` window, locate and select your URP Renderer Asset (e.g., `PC_PipelineAsset_ForwardRenderer`).
- In the `Inspector` under `Rendering`, set the `Rendering Path` to `Forward` or `Deferred` (if supported by the target platform and enabled features).
When you switch rendering paths in Unity, what you’ll notice mainly comes down to how lights and edges behave. Forward is simplest—great if you only have a few lights, but it slows down fast as lights increase. Forward+ looks the same but runs better with many lights, especially in VR/XR where smooth edges matter. Deferred often looks richer in light-heavy scenes since dozens of lights can affect objects consistently, but transparent effects may look different. Deferred+ pushes this further, letting you pack in even more lights efficiently on modern GPUs.
Key Takeaways
Rendering in Unity is the process of converting 3D assets—meshes, materials, textures, lighting, and camera views—into the 2D images we see on screen, and it is central to achieving both visual fidelity and performance in XR projects like XFactory. Mastery of core elements such as meshes (geometry), materials (surface properties), shaders (light–surface interaction), textures (detail), and lighting (illumination and mood) ensures realistic and efficient scene presentation. Unity’s rendering pipelines—Built-in, URP, and HDRP—offer different balances of speed and quality, with URP being ideal for cross-platform XR. Choosing appropriate rendering paths (Forward, Forward+, Deferred, Deferred+) and optimizing assets through mesh simplification, proper material setup, and efficient lighting can dramatically improve performance without sacrificing realism. By integrating baked lighting for static elements, real-time effects for dynamic ones, and leveraging tools like light probes and reflection probes, you can create immersive, responsive XR environments that run smoothly across a range of devices.