Virtual Production Pipeline: Overview of the Full Process
If you're exploring a career in film or digital arts, you've probably come across the term virtual production. But what does it actually involve, and how does it fit into the filmmaking process? In this article, I’ll walk you through the full virtual production pipeline, from the earliest development stages to final delivery, so you understand what each step involves and where real-time artists fit in.
Table of contents:
- What is virtual production?
- Virtual production vs. traditional filmmaking pipeline
- What does the virtual production pipeline look like?

What is virtual production?
Virtual production is the process of joining the physical world of filmmaking with the digital world in real time, using technology such as game engines. On a practical level, this means that actors, props, and physical set pieces can interact with digitally created environments, characters, and effects during the actual shoot, not just in post-production.
It's not a single technique. Virtual production is an umbrella term for a set of workflows and tools that are used across all stages of filmmaking:
- Pre-production.
- Principal photography.
- Post-production.
These include previs, pitchvis, techvis, stuntvis, virtual scouting, in-camera VFX, and more. Each of these will be covered in detail later in this article.
The roots of virtual production are in broadcast. For years, live sports shows and election coverage used real-time graphics to display constantly updating data on air. As real-time rendering engines, particularly Unreal Engine, became more powerful, the same underlying technology started producing high-fidelity, photorealistic visuals fast enough to be useful on a film set. That shift brought virtual production into film, television, and other industries, where it now plays a role at every stage of production.

Virtual production vs. traditional filmmaking pipeline
Traditional filmmaking follows a linear process: pre-production (planning and concepting) → production (filming) → and post-production (editing, color grading, and visual effects). Each stage is largely separate, and most creative and technical decisions about how the final film will look are resolved at the end. If something needs to change late in the process, it's costly and time-consuming.
Virtual production works differently because it's non-linear. The VFX process and asset creation begin in pre-production, continue through the shoot, and carry into post, which means directors and cinematographers can see a close representation of the final result much earlier. That early visibility lets them make changes before those changes become expensive, and helps avoid unnecessary reshoots.
Here's how that structural difference plays out in practice:
Visual effects timing
In traditional production, VFX work begins after the shoot is completed, while in virtual production, it starts during pre-production and runs in parallel with principal photography. This earlier involvement produces better integration between CG and live-action elements and reduces the volume of work left for post.
Set and location requirements
Traditional productions require either physical sets or on-location shooting, both of which carry high costs and logistical challenges. Virtual production uses digitally created environments projected onto LED volume screens surrounding the set. As a result, virtual production lowers costs, makes reshoots simpler, and helps maintain visual continuity across shooting days.
Lighting workflow
On a traditional set, lighting requires extensive physical rigs and setup time. On an LED volume stage, the screens themselves emit the light, casting accurate reflections and ambient light onto actors and physical props. This creates a more realistic integration between the live-action and digital elements, while reducing on-set lighting overhead.
Asset reusability
In traditional production, physical props and sets are built for specific shots and then retired. With virtual production, digital assets can be reused across multiple shots and stages, which saves significant time and money.

What does the virtual production pipeline look like?
The virtual production pipeline is not a strict sequence where one stage finishes before the next begins. The stages overlap, feed into each other, and involve close collaboration between departments. Below is a walkthrough of each stage, from the earliest development work through to final delivery.
Pre-production: pitchvis and development
The pipeline starts before a project is even approved. 3D artists create digital scenes that visualize the story and tone of a project, a process called pitchvis. Think of it as a rough, animated preview of the film, built to show investors and studios what the production will look and feel like.
Pitchvis has been used to greenlight major Hollywood productions, including Godzilla (2014), Men in Black 3, and World War Z. The ability to show a vivid, moving representation of the creative vision, rather than static concept art or written descriptions, makes it a much more effective tool for securing funding and buy-in.
Virtual art department (VAD)
Once a project moves forward, the virtual art department begins building the assets that will be used throughout production. This includes everything from rough 3D prototypes to camera-ready props, characters, and environments.
One of the VAD's key responsibilities is deciding which assets will be digital (displayed on the LED volume), which will be physical props on set, and which will be handled in post-production by a VFX studio. Getting this right early saves significant time and money later. The VAD works closely with previs artists and VFX teams to make sure the production's priorities are met before principal photography begins.
Visualization: previs, techvis, and stuntvis
Visualization covers several distinct workflows, each serving a different purpose.
Previs (previsualization) uses CGI to rough out sequences and shots before filming begins. It helps define how the script will be visually presented, and allows directors and cinematographers to test camera angles, lighting, and staging without committing crew and equipment. Previs also forms the basis for VFX bidding, where studios compete for contracts to complete specific shots.
Techvis focuses on the technical side. It works out precise practical requirements before the shoot, including camera placements, crane positions, and motion-control rig setups. Unlike previs, techvis uses basic assets because the goal is practical planning, not creative decision-making. It answers questions like:
- Will the crane fit on this set?
- Will this camera move actually work on the day?
Stuntvis (also called action design) combines elements of both previs and techvis, specifically for stunt and action sequences. It uses real-world physics simulation available in real-time engines like Unreal Engine to choreograph shots with precision, test stunts before live performers attempt them, and maintain safety standards on set.
Virtual scouting and VR location scouting
Virtual scouting allows key creatives, including the director, cinematographer, and production designer, to review a digitally built set in VR before anything is physically constructed. They can move through the space at human scale, assess proportions, adjust set dressing, and plan shot choreography. This helps avoid costly surprises during the build or on shooting days.
VR location scouting takes the same approach but applies it to real locations. A team member visits the location and captures it using photogrammetry tools, converting it into a 3D model. The rest of the team can then review it remotely in Unreal Engine, cutting down on travel costs and speeding up the decision-making process when comparing multiple locations.
Principal photography: on-set virtual production and in-camera VFX
At this stage, the digital and physical worlds come together on set. Virtual environments are projected onto large LED volume screens that surround the actors and physical set pieces. Because the screens emit light, they produce realistic reflections and ambient lighting on everything in front of them, reducing the need for separate lighting rigs.
Camera tracking technology adjusts the virtual scene as the camera moves, creating parallax: the digital background shifts in sync with the camera, making the two worlds appear to occupy the same space.
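To make the parallax idea concrete, here is a minimal sketch (illustrative only, not part of any real virtual production toolkit) using a simple pinhole camera model. It shows why the tracked background must be re-rendered as the camera translates: points closer to the camera shift far more on screen than distant ones, and the LED wall has to reproduce exactly that depth-dependent shift for the illusion to hold.

```python
# Illustrative pinhole-camera sketch of parallax (hypothetical example,
# not a real virtual production API). A 3D point is projected to a
# screen x-coordinate; translating the camera shifts near points more
# than far ones.

def project(point, camera_x, focal_length=35.0):
    """Project a 3D point (x, y, depth) to a screen x-coordinate
    for a camera translated horizontally by camera_x."""
    x, y, depth = point
    return focal_length * (x - camera_x) / depth

# Two points at different depths; the camera moves 1 unit to the right.
near, far = (0.0, 0.0, 2.0), (0.0, 0.0, 20.0)
shift_near = project(near, 1.0) - project(near, 0.0)
shift_far = project(far, 1.0) - project(far, 0.0)

print(round(shift_near, 2))  # -17.5: the near point shifts a lot
print(round(shift_far, 2))   # -1.75: the far point barely moves
```

In a real LED volume, the tracking system feeds the physical camera's position into the engine many times per second, and the engine re-renders the virtual scene from that viewpoint so the wall always shows the correct perspective for the lens.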
VFX artists can update the virtual environment live during the shoot, adjusting lighting, colors, or elements as needed. Directors and cinematographers can also use live compositing (Simulcam), which overlays CG elements onto the live footage in real time on a monitor. This gives them an accurate on-set reference for what the final shot will look like, without waiting for post-production.
Post-production: postvis and final VFX
Even with the amount of VFX work completed during earlier stages, post-production is still required. Additional 3D modeling, compositing, and FX work is completed here, particularly for elements that couldn't be handled on set, such as digital creatures or complex environmental effects.
Postvis is used as a bridge during this stage. It combines the live-action footage with rough VFX as editorial placeholders, giving the editor a working cut that reflects the intended final look while the actual VFX are still being completed. It also communicates the filmmakers' intent to VFX teams, and can be used in early test screenings to give audiences a representative version of the film before everything is finalized.
Final delivery
Once all VFX work is completed and approved, the film is rendered and delivered. In virtual production, this stage is less of a dramatic endpoint than it is in traditional filmmaking, because much of the final look has already been established and refined across the earlier stages. By this point, there are far fewer surprises.
Final word
The virtual production pipeline is not just a new set of tools. It's a fundamentally different way of making films, one where creative decisions are made earlier, visual effects are integrated from the start, and the gap between concept and final image is much smaller than in traditional production.
For 3D artists working on real-time projects, it means being involved at every stage of a project, from creating pitchvis scenes before a film is approved to supporting on-set production and post-production work. The range of roles is broad, and demand for artists with these skills continues to grow.
If you want to work in this field, practical experience with tools like Unreal Engine is where to start. At CADA, we have prepared a new course for Virtual Production with Unreal Engine. It is a 12-week online program that covers the pipeline we have talked about in this article. It's taught live by Mahmoud Alkawadri, an Epic Games educational advisor with over 20 years of industry experience working with clients including Netflix and Epic Games.
