As we near the end of 2016, current generation Virtual Reality remains a promising new medium that is far from reaching its potential. Whether it succeeds or fails in fulfilling its promise, many people in the visual effects community are racing to cater to the demands of 360-video virtual reality production.

For me personally, virtual reality is much more than a business opportunity. It excites the explorer in me by providing new uncharted territories and many unsolved challenges. So when Outpost|VFX was approached by YouTube to do VFX for several stereo-360 videos, I was immediately intrigued and eager to rise to the challenge.

Short turn-around time and a tight budget made it even more challenging, but didn’t deter me from signing up for the task. Though we ultimately succeeded and delivered all the visual effects to the client’s satisfaction, the road was full of obstacles that we didn’t anticipate. Over time these obstacles will inevitably be removed, but for now I hope this post will help newcomers by sharing some of the lessons learned during the process.

Before we begin, a brief note on the terms “360 video” and “virtual reality”:

360-videos use the viewer’s head orientation only, displaying whichever part of the surroundings the viewer is facing. The “surroundings” are essentially a flat image warped into a sphere. This lets you look in any direction while remaining locked to a fixed point in space.
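To make that warping concrete, here is a minimal sketch (in Python, assuming the common equirectangular layout and an illustrative 4096×2048 frame) of how a head orientation maps to a pixel in that flat image:

```python
def direction_to_equirect(yaw_deg, pitch_deg, width, height):
    """Map a view direction to pixel coordinates in an equirectangular image.

    yaw: -180..180 degrees (0 = image center), pitch: -90..90 (0 = horizon).
    Resolution and layout are illustrative assumptions, not a standard.
    """
    u = (yaw_deg + 180.0) / 360.0      # 0..1 across the full width
    v = (90.0 - pitch_deg) / 180.0     # 0 at top (straight up), 1 at bottom
    return int(u * (width - 1)), int(v * (height - 1))

# Looking straight ahead lands at the center of the frame:
print(direction_to_equirect(0, 0, 4096, 2048))   # → (2047, 1023)
```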

Virtual Reality experiences use both the viewer’s head orientation and position to place the viewer in a video-game-like virtual world. This allows more freedom of movement than 360 video, but content is currently limited to real-time CGI due to limitations of video-capture technology.

This post will discuss visual effects for 360-videos only. However, much like the rest of the entertainment industry, I’ll use the term “VR” as well when referring to 360-videos.



In many ways 360 video is nothing more than a wide format. A 360-degree image is essentially two 180-degree fish-eye lens images stitched together. So it’s easy to assume that the main challenges for VFX in this medium are matching lens distortions and figuring out how to review and give notes on effects in 360 degrees.

In reality, the differences between 360-degree and traditional video are so vast that they affect every step of the process in more ways than one. Entering this new realm often requires leaving behind our most trusted tools and fail-safe devices. Our ability to rely on past experience is greatly reduced, and even the simplest VFX task can suddenly turn into a nightmare. That might sound like an exaggeration. Is it really that different? Obviously it depends on the specific needs of your project, but here are a few things to keep in mind:


If you’ve been in the game long enough you know that VFX supervisors rely on a variety of “tricks” that save time and money. Many of them don’t apply to 360 videos:

Editing is usually VFX’s best friend. Efficient use of cut-aways can hide actions that are hard to create or imagine (like a monster grabbing a kid by the head and lifting him off the ground) while adding value to the scene. Cuts in VR can cause disorientation if they’re not carefully pre-planned, which means they can’t be relied on as an escape pod late in the game. Instead, 360 videos tend to incorporate long uninterrupted takes that call for long uninterrupted VFX. Keep that in mind when calculating render-times.
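A back-of-the-envelope budget shows how quickly those long takes add up. All numbers below are illustrative assumptions, not figures from any real production:

```python
# Back-of-the-envelope render budget (every number here is an assumption):
fps = 60                  # a common target for smooth VR playback
shot_seconds = 120        # one uninterrupted two-minute take
eyes = 2                  # stereoscopic: every frame rendered twice
minutes_per_frame = 5     # assumed render cost per eye per frame

frames = shot_seconds * fps * eyes
hours = frames * minutes_per_frame / 60
print(frames, "frames →", hours, "render-hours")  # 14400 frames → 1200.0 render-hours
```

A traditional edit might cover the same beat with a ten-second insert shot, cutting that figure by more than an order of magnitude.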



Framing is another trusted support beam for VFX artists. Traditionally the director decides how a scene will be framed, and can change the framing to leave out certain challenging effects, or at least trim parts that are unnecessarily complicated. In Reservoir Dogs, Tarantino famously framed out the cutting of an ear, forcing the audience to imagine it instead, demonstrating how powerful these decisions can be. By now we are so used to designing VFX for a 16:9 frame that we automatically label effects as easy or complicated, assuming they’ll be framed a certain way. There is no framing in VR. Well, technically there is, but it’s driven by the orientation of the viewer, who is free to choose whether to look away or stare directly at an effect. As a result, any effect is potentially fully exposed from birth to death. Consider that when calculating simulation times for your next fluid-dynamics shot in VR.


Camera movements are often perceived as something VFX supervisors prefer to avoid, as they raise the need for camera-tracking and other processes. In some situations, though, moving the camera (whether physically or in post-production) can help sell an effect. A good example is camera shakes following an explosion: they make the explosion feel bigger by suggesting there’s a shock-wave, and at the same time reduce the visibility of the VFX and allow certain demanding simulations to be avoided.

In 360 video, camera movements can cause motion-sickness-like symptoms, also known as “VR-sickness”, and therefore aren’t used very often. Even when they are used, camera movements in VR can direct the viewer’s attention away from an effect, but they can never take it off-screen completely.


Optical Effects such as lens flares, depth of field, motion blur, chromatic aberration, film grain and others are commonly added on top of visual effects to make them blend better and feel photorealistic while masking areas of low detail. These optical effects are so entrenched in our VFX workflow that it’s easy to forget how “naked” or “empty” our effects feel without them.

The absence of a single lens, frame and variable focal length makes 360 videos extremely crisp and sharp. Lens flares are possible, but behave strangely if they aren’t dynamically changing based on the viewing angle, which requires a layer of interactivity that isn’t standardized yet. Depth of field and motion blur are largely avoided because they impose optical limitations and disrupt the immersion. Image-degradation effects like film grain and chromatic aberration are rarely used, as they create a “dirty glass” effect that is more distracting in VR than in traditional film.
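The view-angle dependency of a flare can be sketched in a few lines. This is purely illustrative (the function name, the linear falloff and the 45-degree default are all assumptions): a real player would evaluate something like this per frame from the headset’s orientation.

```python
def flare_intensity(view_yaw_deg, light_yaw_deg, falloff_deg=45.0):
    """Fade a lens flare out as the viewer turns away from the light source.

    Illustrative only: linear falloff over an assumed angular range.
    """
    # Shortest angular distance between the view and the light, in degrees:
    delta = abs((view_yaw_deg - light_yaw_deg + 180) % 360 - 180)
    return max(0.0, 1.0 - delta / falloff_deg)

print(flare_intensity(0, 0))    # looking straight at the light → 1.0
print(flare_intensity(45, 0))   # 45 degrees away → 0.0
```

Baking a fixed flare into the stitched plate skips this step, which is exactly why such flares feel wrong when the viewer turns their head.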


Eye Tracking is more of an optical analysis that drives aesthetic decisions than a technical tool. I’m talking about being able to anticipate what viewers are likely to look at during a scene, and focusing more effort in those places. This ability is greatly reduced in VR because viewers have a much wider area they can explore visually, and are more prone to missing visual cues designed to grab their attention. This means you are forced to treat every part of the effect as a potential point of focused attention.







While this list may seem intimidating, many of these restrictions are easy to predict and address when breaking down the script of a VR film. By being involved in pre-production stages of the project I was supervising, I was able to identify problematic scenes and offer suggestions that simplified and reduced the workload of certain effects.

For instance, the script had a character carry a magical amulet that was supposed to glow with light. Making a practical glow wasn’t possible so it became a VFX task. Had this been a traditional film I’d consider this a simple task that requires brief design work, some tracking and maybe a bit of rotoscoping. I would expect the amulet to make brief appearances in one or two close-up shots, a medium-shot and potentially a couple of long-shots. I’d be able to suggest framing and blocking adjustments for each shot individually to simplify tasks and save costs.

But this being a 360-video, I suspected the scene might be filmed in a continuous take, capturing the entire spherical environment – meaning the action wouldn’t be broken down into sizeable chunks, and we wouldn’t have editing to rely on as a safeguard. This effect would potentially be visible for the entire duration of the scene.
Anticipating this in advance gave us a great advantage. Given our time and budget constraints I recommended limiting the visibility of the glowing amulet either by making it glow intermittently or covering it while not in use. The director decided to have it stashed in a pocket for the majority of the scene – limiting the effect to when it was truly needed.

This saved us a great deal of work that wasn’t crucial for the story. It also served as an important thought-exercise that made the entire crew more mindful of similar pitfalls and on the look-out for similar opportunities, which brings up another challenge of working in this medium:


Creating compelling Virtual Reality experiences has been attempted several times in the last few decades, but it wasn’t until now that the entertainment industry started taking it more seriously. Even established filmmakers are leaving their first footprints in virtual reality, and there aren’t many notable VR films for inspiration. This makes every VR creator somewhat of an experimental filmmaker, working with a high risk of failure or undesired outcomes.
Whether the result of your own mistakes or of decisions made against your advice, fixing VR-related issues could become your responsibility, adding new challenges to your work.

Here are examples of mishaps that may originate from a lack of experience in VR storytelling:

Failing to direct viewers’ focus. 360 video lets viewers look around freely. This freedom introduces the risk of missing out on key information. Guiding viewers’ focus through a 360-degree panoramic view is much harder than through a traditional screen. With tools like framing and editing either gone or severely limited, directors who have never faced this challenge might fail to recognize the importance of preparation and pre-visualization. A proper workflow demands thorough planning of actors’ positions and movements, careful timing, visual composition that dissuades viewers from looking away, and various other approaches. Giving in to the temptation of positioning parallel action all around the 360 camera can cause confusion and frustration for viewers, as they can only look in one direction at a time.
Because live-previewing is not yet widely available, such mistakes might remain unnoticed until after the shoot is over, when the director gets to review footage in a VR headset. By then it might be too late to re-shoot the footage, turning the problem over to post-production. Certain things can be fixed in post-production by manipulating the timing of certain events, adding CG elements to grab viewers’ attention and aim them in the right direction, etc. But such fixes shouldn’t be taken lightly, as they are equally affected by the heightened complexity the VR medium introduces. Finally, in the case of an unfixable mistake, you could end up creating an effect knowing that a large chunk of viewers might miss it by looking elsewhere.

Remaining too stationary. I mentioned earlier that VR tends to incorporate long uninterrupted shots, and that camera movements tend to be avoided. This is mainly because early attempts to move cameras in VR caused nausea, and jump-cuts caused disorientation.
With the advancement of VR headsets, growing understanding of the causes of VR-sickness, and understanding of orientation cues, these side-effects can now be avoided without much compromise. This requires a bit of testing and prototyping, but the benefits are great compared to the limitations of a stationary camera and inability to cut. Needless to say, reintroducing cuts can help shrink the length of VFX shots, and moving the camera can further optimize viewing angles and save costs.

Inefficient visual vocabulary. Even experienced directors can find themselves out of their element when working in VR for the first time, and struggle to visualize the project they’re creating. Pre-visualization tools are extremely helpful here; if they’re not used for any reason, the director’s ability to envision their creation, let alone communicate it outwards, may be hindered.
This could not only disrupt your ability to plan and execute effects efficiently, but also force you to backtrack and pre-visualize using final assets, while constrained to footage that’s far from optimal.

Relying on outdated concepts of VFX workflows. Generally speaking, directors who are familiar with VFX workflows are great to work with. In some cases they do your job for you by anticipating and dodging certain VFX traps in the early planning stages of certain scenes. Entering the VR realm alongside such a director can be a great experience of joint discovery. But over-confidence could lead a director to prep and shoot without a VFX supervisor, assuming the same rules that apply to traditional video apply to VR as well. Even if they follow every rule in the rulebook, without having researched and tested various tools and calculated workloads and bottlenecks, they are shooting in the dark. While it’s always best to err on the side of caution when accepting VFX tasks you didn’t supervise on set, that’s especially true in VR.

Being overly ambitious. For all the reasons mentioned, and especially the ones that I’ll be getting to shortly, creating a VR experience is already an ambitious undertaking that can be surprisingly challenging and mentally taxing. But that’s not going to stop dreamers from dreaming, and you may end up working for a director who believes anything is possible. It’s therefore important to communicate from the very beginning the compounded complexities involved in creating VFX for VR, and to prepare your collaborators for the unexpected limitations they are going to encounter.


Everyone is fairly new to VR and many of the restrictions and pitfalls are being discovered for the first time. The good news is that lessons are being learned and documented and a growing number of filmmakers are gaining experience. Old perceptions will inevitably be replaced by new ones, and our jobs as filmmakers and VFX creators will become easier as a result.

Beyond conceptual challenges, VR reintroduces a healthy amount of technical challenges as well. Even a fairly simple effect like adding glowing light-rays to an amulet can be quite time-consuming when you’re working in stereoscopic 6K resolution.
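A quick pixel count makes the scale obvious. Assuming “6K” here means a 6144×3072 equirectangular frame per eye (the exact dimensions vary by pipeline), the comparison to a traditional HD frame looks like this:

```python
# Pixel-count comparison (assuming "6K" = a 6144×3072 equirectangular frame per eye):
stereo_6k = 6144 * 3072 * 2   # both eyes of a stereoscopic frame
full_hd = 1920 * 1080         # one traditional Full HD frame

print(stereo_6k, "pixels per stereo frame,",
      round(stereo_6k / full_hd, 1), "x Full HD")
# → 37748736 pixels per stereo frame, 18.2 x Full HD
```

Every roto stroke, paint fix and render touches roughly eighteen HD frames’ worth of pixels, twice.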

Part 2 of this post lists many of the technical challenges facing VFX creators working on VR productions.
