In this post I’ll explain my process for making immersive media using only free/libre open-source software (FLOSS). I will explain one method that uses Blender to make a simple monoscopic 180-degree planetarium projection, and then discuss options for making 360-degree stereoscopic renders one might use in a virtual reality headset.
Immersive media is a general term covering everything from virtual reality and planetariums to non-conventional display technology such as malleable LED installations in airports, theaters, shopping centers, etc. Blender is a useful tool for creating immersive media. It is a robust and powerful 3D animation and rendering suite that has, for many years, been able to render panoramic views for non-flat surfaces in a fairly simple way. It’s also completely free and open source. This explanation assumes you have some experience using Blender. If not, I recommend you find Blender’s YouTube channel and familiarize yourself with the basics.

Blender’s panoramic modes currently only work in the Cycles render engine, so there are a few things to keep in mind that may be confusing. Blender’s 3D viewport has four modes: Wireframe, Solid, Material Preview, and Rendered. The panoramic effect can only be seen in the Rendered mode of the 3D viewport. Perhaps more importantly, selection, object outlines, and control gizmos no longer function as normal when you are in panoramic mode. Lastly, what you see on a flat 2D monitor while working in panoramic mode is not what you will see when that image wraps around a dome-like surface, so you must imagine how elements on the screen will look when stretched and warped by the surface onto which they are projected or displayed. (I have a workaround for this problem, which you’ll see toward the end.)
First, make sure you are using Blender’s Cycles render engine by selecting it in the Render settings panel. Also, if you have a discrete graphics processing unit (GPU) in your computer, you can greatly speed up rendering by switching the device from CPU to GPU, as seen below. That said, rendering at high resolution requires a GPU with sufficient memory.
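If you like to script your setup, the same switches can be flipped from Blender’s Python console. Here’s a minimal sketch; the GPU backend name (`CUDA` below) is only an example and depends on your hardware:

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'   # panoramic cameras require Cycles

# Render on the GPU instead of the CPU.
scene.cycles.device = 'GPU'

# Pick a GPU backend in the add-on preferences. 'CUDA' is an example;
# use 'OPTIX', 'HIP', or 'METAL' depending on your hardware.
prefs = bpy.context.preferences.addons['cycles'].preferences
prefs.compute_device_type = 'CUDA'
```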

Keep your 3D viewport in either Solid Mode or Material Preview mode for the time being. Set up your desired scene in Blender. I’ll make my usual vaporwave landscape as it is easy to render, and I am a nerd.
Then I set up my shading, lights, camera, and animation as I usually would. This is what the animated scene looks like in a conventional 2D 1080p render. You can download the blend here.
Now, let’s make it immersive. Make sure your 3D viewport is in Rendered mode and you are using Cycles as your render engine. Select the camera and go to the camera settings in the Properties panel. Under Lens, switch the Type option to Panoramic.

The default panorama type is Equirectangular, which is good for full 360 VR rendering. Your 3D viewport should now look something like this:

For now, we are going to focus on making a rendering for a planetarium. So, we are going to change the panorama type to Fisheye Equidistant. This will make your viewport look something like this:
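For the script-minded, the camera switch looks roughly like this in recent Blender releases, where the panorama settings live directly on the camera data (older versions kept them under the camera’s Cycles settings):

```python
import bpy
import math

cam = bpy.context.scene.camera.data
cam.type = 'PANO'                          # Perspective -> Panoramic

# 'EQUIRECTANGULAR' is the default and suits full 360 VR;
# 'FISHEYE_EQUIDISTANT' is the one we want for a planetarium dome.
cam.panorama_type = 'FISHEYE_EQUIDISTANT'
cam.fisheye_fov = math.pi                  # 180-degree field of view
```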

There are two immediate problems we need to resolve. The first is that most planetariums need media of at least 4096 x 4096 pixels. Currently, our render is set to 1920 x 1080, Blender’s default resolution. So, let’s fix it. Go to the Output settings in the Properties panel and change the resolution X and Y to 4096 and 4096.
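The same change from the Python console, if you prefer:

```python
import bpy

render = bpy.context.scene.render
render.resolution_x = 4096   # fulldome masters are square,
render.resolution_y = 4096   # 4096 x 4096 or larger
render.resolution_percentage = 100
```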

Your 3D viewport should now be a circle and look like this:

Now here is where almost half my students make mistakes despite repeated warnings. The image above is projected onto the ceiling of the planetarium. As a result, the ‘sun’ in the center of the image sits at the exact center of the planetarium’s ceiling. Most seating in planetariums orients viewers to look toward the front of the dome, not the top. So viewers would automatically crane their necks upward with this animation, since the eyes tend to move in the direction of movement, and they would get the feeling they were moving ‘up’ rather than forward. That may or may not be what you are trying to do. But if we want viewers to face forward, we need to re-orient the camera to face forward in the planetarium.


So, we need to angle the camera upward to bring the focus of the composition down to the front of our audience. This seems like a simple change, but it is counterintuitive to many. The new view seems to show less of our landscape than before, but viewers in the planetarium will feel they are travelling forward on a road to synthwave Valhalla.
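If you’d rather do this numerically, a sketch like the following tilts the camera up around its X axis. The 90 degrees here is only a starting point; the right angle is a creative choice for your composition:

```python
import bpy
import math

cam_obj = bpy.context.scene.camera

# Tilt the camera upward so the focus of the composition lands at the
# front of the dome instead of directly overhead. Adjust to taste.
cam_obj.rotation_euler.x += math.radians(90)
```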
Now for the workaround I promised: let’s build a virtual planetarium inside Blender so we can preview the render. Add a UV sphere to a new scene to serve as the dome. In Edit Mode, select the bottom half of the sphere and delete that part of the mesh.
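If you’d rather script the dome than model it by hand, a sketch along these lines builds the half sphere (the radius is arbitrary):

```python
import bpy

# Add a UV sphere to act as the planetarium dome.
bpy.ops.mesh.primitive_uv_sphere_add(radius=10, location=(0, 0, 0))
dome = bpy.context.active_object

# Flag every vertex below the equator while in Object Mode
# (the loop assigns True or False to every vertex).
for v in dome.data.vertices:
    v.select = v.co.z < -1e-6

# Delete the flagged vertices in Edit Mode, leaving the dome.
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.delete(type='VERT')
bpy.ops.object.mode_set(mode='OBJECT')
```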

Then go into the UV Editing layout by selecting the UV Editing tab at the top of the screen. Select all of the half-sphere’s mesh, then change the view in the 3D Viewport to Bottom by selecting View > Viewpoint > Bottom.

When you switch to this view, the 3D Viewport automatically switches to orthographic (no perspective) view. We actually need perspective, so turn it back on by hitting 5 on the number pad or by clicking the little grid under the camera icon in the top right of the viewport. It should look like this after you click it. Click UV at the top of the viewport and select “Project from View (Bounds)”. On the left, you should see the UVs become the same shape as the sphere.
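The UV projection can also be driven from a script, though Project from View needs a 3D viewport in its context when run from code, so it’s usually easier to use the menu as described above. For completeness, the operator call is:

```python
import bpy

# With the dome active and the viewport looking up from the Bottom view:
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')

# Equivalent of UV > Project from View (Bounds).
bpy.ops.uv.project_from_view(orthographic=False, scale_to_bounds=True)
bpy.ops.object.mode_set(mode='OBJECT')
```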

Let’s take our rendered animation and project it onto our planetarium. Switch to the Shading layout and make sure you are in Object mode. Right-click on the dome and select Shade Smooth. Create a new material for the dome and delete its Principled BSDF node; the dome should turn black. Then add an Image Texture node and connect its Color output to the Surface input of the Material Output node.

On the Image Texture node, click Open and browse to your rendered fisheye video or image sequence. If it’s an image sequence, select the first frame, then hit the A key to select all the other frames (Blender will load them as a single sequence). Click the Auto Refresh option in the Image Texture node after the sequence or video loads. If you loaded a video file, you sometimes have to manually type in the number of frames or click the frame refresh button in the node’s settings in the side panel.
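Here is the whole material setup as a script sketch; the object name and file path below are placeholders for your own:

```python
import bpy

dome = bpy.data.objects['Sphere']            # your dome object's name
mat = bpy.data.materials.new('DomeProjection')
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links

# We want the raw footage, not a shaded surface.
nodes.remove(nodes['Principled BSDF'])

tex = nodes.new('ShaderNodeTexImage')
tex.image = bpy.data.images.load('//renders/fisheye_0001.png')  # first frame
tex.image.source = 'SEQUENCE'                # treat numbered files as frames
tex.image_user.frame_duration = 250          # match your animation length
tex.image_user.use_auto_refresh = True       # the Auto Refresh checkbox

links.new(tex.outputs['Color'],
          nodes['Material Output'].inputs['Surface'])
dome.data.materials.append(mat)
```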

Switch back to the Layout tab and switch your viewport to either Material Preview or Rendered. You can keep the render engine as Eevee for this bit. In the 3D Viewport, you should see the Fisheye rendering now being projected on both the inside and outside of the dome. This is a fairly accurate simulation of what it will look like in the planetarium. If your computer is fast enough, you can even tap the play button and your animation will play in the dome.

Lastly, you can put your Blender camera in the dome as if it were a viewer in the planetarium and make a render from that point of view.

Another way to get a preview of your panoramic render is to load it onto a VR headset. There are many VR video players in the Meta Quest store, but the DeoVR app is both free and plays VR video at decent quality. Through PC Link you can load H.264 MP4 files directly onto the headset and view 180-degree, 360-degree, and stereoscopic VR renders.
One thing I particularly like about the DeoVR app is that it allows you to load single-frame renders into the headset – even stereoscopic renders.
The last way I’ll discuss to view your panoramic renders is simply to upload them to YouTube or Vimeo, both of which support equirectangular renders. Before you upload a 360 render, you’ll need to inject a bit of metadata into the video file. Adobe makes this a single click in Media Encoder, but this is a free and open-source demo, and as such it takes an extra step. Google makes a little tool that injects the VR metadata into the video file so that YouTube and Vimeo treat the video as VR. That tool is here: https://github.com/google/spatial-media/releases/tag/v2.1. After you inject the metadata into the file, you can use VLC to see the effect before you upload it to YouTube.
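Going by the project’s README, the injector runs from the command line roughly like this (double-check the flags against the release you download):

```
# 360 monoscopic video
python spatialmedia -i render.mp4 render_injected.mp4

# top-bottom stereoscopic video
python spatialmedia -i --stereo=top-bottom render.mp4 render_injected.mp4
```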

Further Study: Stereoscopy
Note that the injector tool above also supports stereoscopic renders, and Blender is fully capable of stereoscopy. For this demo I tried rendering a full top-bottom 4K equirectangular animation, and Blender handled it fine, but I discovered that my VR headset was not powerful enough to play back that much data (and there may be resolution limits in H.264 compression). For stereoscopic rendering in immersive situations, everything above still applies, but you have to choose how the two views are stored. In the Output section of the Properties panel you’ll find a Stereoscopy section with multiple options for 3D displays, anaglyph glasses, and VR headsets. Top-Bottom appears to be the stereo mode of choice, at least for now.
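Those Stereoscopy settings can also be flipped from a script; a minimal sketch for a top-bottom render:

```python
import bpy

scene = bpy.context.scene

# Render a left/right view pair.
scene.render.use_multiview = True
scene.render.views_format = 'STEREO_3D'

# Save both eyes in one image, stacked top over bottom.
scene.render.image_settings.views_format = 'STEREO_3D'
scene.render.image_settings.stereo_3d_format.display_mode = 'TOPBOTTOM'
```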


That’s all for now. As our tech evolves and gets faster/higher resolution (and our AI overlords let us continue to make work), keep an eye out for more stereoscopic and, sooner or later, HDR immersive media.