In this post I’ll explain my process for making immersive media using only free/libre open-source software (FLOSS). I’ll walk through one method that uses Blender to make a simple monoscopic 180-degree planetarium projection, then discuss options for making 360-degree stereoscopic renders one might use in a virtual reality headset.

Renderings made in Blender by my students and myself in the Whitworth Ferguson Planetarium on the campus of Buffalo State University. May 2025.

Immersive media is a general term covering everything from virtual reality and planetariums to other non-conventional display technology, such as malleable LED installations in airports, theaters, shopping centers, and the like. Blender is a useful tool for creating immersive media. It is a robust and powerful 3D animation and rendering suite that for many years has been able to render panoramic views for non-flat surfaces in a fairly simple way. It’s also completely free and open source. This explanation assumes you have some experience using Blender. If not, I recommend you find Blender’s YouTube channel and familiarize yourself with the basics.

The Blender 4.4 interface.

Blender’s panoramic modes currently only work in the Cycles render engine, so there are a few things to keep in mind that may be confusing. Blender’s 3D viewport has four modes: Wireframe, Solid, Material Preview, and Rendered. The panoramic effect can only be seen in the Rendered mode of the 3D viewport. Perhaps more importantly, selection, object boundaries, and the control gizmos no longer function as normal while you are in panoramic mode. Lastly, what you see on a flat 2D monitor in panoramic mode is not what you will see when that image wraps around a dome-like surface, so you must imagine how elements on screen will look when stretched and warped by the surface onto which they are projected or displayed. (I have a workaround for this problem, which you will see toward the end.)

First, make sure you are using Blender’s Cycles render engine by selecting it in the Render settings panel. Also, if your computer has a discrete graphics processing unit (GPU), you can greatly speed up rendering by switching the device from CPU to GPU, as seen below. That said, rendering at high resolution requires a GPU with sufficient memory.

Blender’s render settings panel.
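If you prefer to script your setup, the same two settings can be made from Blender’s Python console. This is a minimal sketch; the `'OPTIX'` backend is an assumption for NVIDIA cards, so substitute `'CUDA'`, `'HIP'`, or `'METAL'` as appropriate for your hardware.

```python
import bpy

# Use Cycles -- panoramic cameras are not available in Eevee
scene = bpy.context.scene
scene.render.engine = 'CYCLES'

# Render on the GPU instead of the CPU
scene.cycles.device = 'GPU'

# Pick a compute backend (assumption: an NVIDIA card;
# use 'CUDA', 'HIP', or 'METAL' for other hardware)
prefs = bpy.context.preferences.addons['cycles'].preferences
prefs.compute_device_type = 'OPTIX'
```

This only needs to be run once per file; the backend preference is saved with your Blender preferences.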

Keep your 3D viewport in either Solid Mode or Material Preview mode for the time being. Set up your desired scene in Blender. I’ll make my usual vaporwave landscape as it is easy to render, and I am a nerd.

Then I set up my shading, lights, camera, and animation as I usually would. This is what the animated scene looks like in a conventional 2D 1080p render. You can download the blend here.

A conventional perspective 2D 1080p render.

Now, let’s make it immersive. Make sure your 3D viewport is in Rendered Mode and you are using Cycles as your render engine. Select the camera and go to Camera settings in the Properties Panel. In the type option, switch it to Panorama.

Camera settings in the Properties Panel (have the camera selected to see this).
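The same switch can be made in the Python console. A sketch, assuming the scene has an active camera; note that in older Blender versions the panorama type lived under `cam.cycles.panorama_type` rather than directly on the camera data.

```python
import bpy

cam = bpy.context.scene.camera.data
cam.type = 'PANO'                      # switch from Perspective to Panoramic
cam.panorama_type = 'EQUIRECTANGULAR'  # the default panorama type
```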

The default panorama type is Equirectangular, which is good for full 360 VR rendering. Your 3D viewport should now look something like this:

Panoramic view using the equirectangular option.

For now, we are going to focus on making a rendering for a planetarium. So, we are going to change the panorama type to Fisheye Equidistant. This will make your viewport look something like this:

Panoramic view using the Fisheye Equidistant option.
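If it helps to understand what this projection is doing: an equidistant fisheye maps a ray’s angle from the optical axis linearly to image radius, r = θ / (fov/2). Here is a small self-contained sketch of that mapping (not Blender code, just the underlying math):

```python
import math

def equidistant_uv(theta, phi, fov=math.pi):
    """Map a viewing direction to normalized dome-master coordinates.

    theta: angle from the optical axis (radians)
    phi:   azimuth around the axis (radians)
    Image radius grows linearly with theta: r = theta / (fov / 2).
    """
    r = theta / (fov / 2.0)
    return r * math.cos(phi), r * math.sin(phi)

# The optical axis maps to the center of the circular image...
print(equidistant_uv(0.0, 0.0))          # (0.0, 0.0)
# ...and a ray 90 degrees off-axis maps to the rim (radius 1).
print(equidistant_uv(math.pi / 2, 0.0))  # (1.0, 0.0)
```

That linear angle-to-radius relationship is why straight lines in the scene bow outward as they approach the edge of the circle.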

There are two immediate problems we need to resolve. The first is that most planetariums need media of at least 4096 x 4096 pixels, while our render is currently set to Blender’s default of 1920 x 1080. Let’s fix that: go to the Output settings in the Properties panel and change the X and Y resolution to 4096 and 4096.

Change the image resolution to 4096 x 4096 for planetariums.
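Scripted, the Output-panel change is a few lines in the Python console:

```python
import bpy

r = bpy.context.scene.render
r.resolution_x = 4096          # square "dome master" resolution
r.resolution_y = 4096
r.resolution_percentage = 100  # make sure no percentage scaling is applied
```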

Your 3D viewport should now be a circle and look like this:

Panoramic view with the Equidistant option and a resolution set to 4096 x 4096.

Now here is where almost half my students make mistakes despite repeated warnings. The image above is projected onto the ceiling of the planetarium, so the ‘sun’ at the center of the frame sits at the exact center, the top, of the dome. Most planetarium seating orients viewers toward the front of the dome, not the top. Viewers would therefore crane their necks upward to follow this animation, since the eyes tend to move in the direction of motion, and they would feel as if they were moving ‘up’ rather than forward. That may or may not be what you are trying to do. But if we want them to face forward, we need to re-orient the camera to face the front of the planetarium.

The seating in the planetarium at Buffalo State University.
Orientation of projected image in the planetarium.

So, we need to angle the camera upward to bring the focus of the composition down to the front of our audience. This seems like a simple change, but it is counterintuitive to many: the new view appears to show less of our landscape than before, yet to the viewers in the planetarium, they are travelling forward on a road to synthwave Valhalla.

Rendered Fisheye equidistant animation with the camera oriented to the front.
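In script form, the re-orientation is just a rotation of the camera. The 90-degree tilt below is an assumption for a camera that was previously aimed straight along its forward axis; tune the angle for your scene and seating layout.

```python
import bpy
import math

# Tilt the camera upward so the composition lands at the front of the dome.
# The 90-degree value is an assumption -- adjust it to taste.
cam_obj = bpy.context.scene.camera
cam_obj.rotation_euler.rotate_axis('X', math.radians(90))
```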

I have found that students have a hard time working with this distorted view, so a simple solution I advise is to make a mini-planetarium in Blender. To do this, we make a UV Sphere, cut it in half, and then map our rendered video onto it. Here are the steps.

A UV Sphere

In edit mode, select the bottom half of the sphere and delete that part of the mesh.

Selecting and deleting the lower half of the UV Sphere.
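The sphere-and-delete step can also be scripted. A sketch using `bmesh`, assuming a fresh UV Sphere added at the origin:

```python
import bpy
import bmesh

# Add a UV Sphere and remove every vertex below the equator,
# leaving the upper half as our mini-planetarium dome.
bpy.ops.mesh.primitive_uv_sphere_add(radius=5.0)
dome = bpy.context.active_object

bm = bmesh.new()
bm.from_mesh(dome.data)
for v in [v for v in bm.verts if v.co.z < -1e-6]:
    bm.verts.remove(v)  # removing a vertex also removes its edges/faces
bm.to_mesh(dome.data)
bm.free()
```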

Then go into the UV Editing layout by selecting the UV Editing tab at the top of the screen. Select all the mesh of the half sphere and then change the view in the 3D Viewport to Bottom by selecting View > Viewpoint > Bottom.

The UV Editing layout. UVs on the left and the half sphere on the right.

When you switch to this view, the 3D Viewport automatically switches to orthographic (no perspective) view. We actually need perspective, so turn it back on by hitting 5 on the number pad or clicking the little grid under the camera icon in the top right of the viewport. Then click UV at the top of the viewport and select “Project from View (Bounds)”. On the left, you should see the UVs take on the same shape as the sphere.

UVs unwrapped for our mini-planetarium.

Let’s take our rendered animation and project it onto our planetarium. Switch to the Shading layout and make sure you are in Object mode. Right-click on the dome and select Shade Smooth, then create a new material for the dome. Delete the Principled BSDF node; the dome should turn black. Add an Image Texture node and connect it to the Surface input of the Material Output node.

The dome with a new material and an image texture node connected directly to the surface input.
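The same node setup can be built in a few lines of Python. A sketch, assuming the dome is the active object; the file path is a hypothetical placeholder for your rendered video.

```python
import bpy

dome = bpy.context.active_object

# New material with nodes, then strip out the default Principled BSDF
mat = bpy.data.materials.new("DomeProjection")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
nodes.remove(nodes['Principled BSDF'])

# Image Texture wired straight into the Material Output's Surface input
tex = nodes.new('ShaderNodeTexImage')
tex.image = bpy.data.images.load('//fisheye_render.mp4')  # hypothetical path
links.new(tex.outputs['Color'], nodes['Material Output'].inputs['Surface'])

dome.data.materials.append(mat)
```

Skipping the BSDF means the texture is emitted unshaded, which is what we want for a projection preview.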

On the Image Texture node, click Open and browse to your rendered fisheye video or image sequence. If it’s an image sequence, select the first frame, then hit the A key to select all the other frames (Blender loads frame sequences in the order you select them). Click the Auto Refresh option in the Image Texture node after the sequence or video loads. If you loaded a video file, you sometimes have to manually type in the number of frames or click the frame refresh button in the Node settings in the side panel.

The Image texture node with an image sequence loaded.

Switch back to the Layout tab and switch your viewport to either Material Preview or Rendered. You can keep the render engine as Eevee for this bit. In the 3D Viewport, you should see the Fisheye rendering now being projected on both the inside and outside of the dome. This is a fairly accurate simulation of what it will look like in the planetarium. If your computer is fast enough, you can even tap the play button and your animation will play in the dome.

The Fisheye render projected as a texture on the dome.

Lastly, you can put your Blender camera in the dome as if it were a viewer in the planetarium and make a render from that point of view.

The camera placed in the dome to visualize the fisheye projection in a planetarium.

Another way to preview your panoramic render is to load it onto a VR headset. There are many VR video players on the Meta Quest store, but the DeoVR app is both free and plays VR video at a decent quality. Through PC Link you can load MP4 H.264 files directly onto the headset and view panoramic 180, 360, and stereoscopic VR renders.

The above animation as viewed in a VR headset.

One thing I particularly like about the DeoVR app is that it allows you to load single-frame renders into the headset – even stereoscopic renders.

A single frame equirectangular render as viewed in a VR headset.

The last way I’ll discuss to view your panoramic renders is simply to upload them to YouTube or Vimeo, which fully support equirectangular video. Before you upload a 360 render, you’ll need to inject a bit of metadata into the video file. Adobe makes this a simple click in Media Encoder, but this is a free and open-source demo, and as such it is not so easy. So, Google makes a little tool that injects the VR metadata into the video file so that YouTube and Vimeo treat the video as VR. That tool is here: https://github.com/google/spatial-media/releases/tag/v2.1. After you inject the metadata into the file, you can use VLC to see the effect before you upload it to YouTube.

The spatial media injector interface.
360 equirectangular YouTube video – view in the YouTube app when on mobile for 360 options.
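The injector in that repository can also be run from the command line as a Python module; a sketch of invoking it from a script, where the filenames are placeholders and the exact invocation depends on how you downloaded the tool:

```python
import subprocess

# Run Google's spatialmedia injector (from a checkout of the
# spatial-media repo) on a rendered file; filenames are placeholders.
subprocess.run(
    ["python", "spatialmedia", "-i",
     "render_360.mp4", "render_360_injected.mp4"],
    check=True,
)
```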

Further Study: Stereoscopy

Note that the injector tool above also supports stereoscopic renders, and Blender is fully capable of stereoscopy. For this demo I tried rendering a full top-bottom 4K equirectangular animation; Blender handled it fine, but I discovered that my VR headset was not powerful enough to play back that much data (and there may be resolution limits with H.264 compression). For stereoscopic rendering in immersive situations, all of the above applies, but you have to choose among a few different ways of storing the stereo pair. In the Output section of the Properties panel you will find a Stereoscopy section with multiple options for viewing stereoscopic renders on 3D displays, through anaglyph glasses, and in a VR headset. The Top-Bottom stereo mode appears to be the format of choice, at least for now.

The stereoscopy options in Blender.
A single rendered frame with the left and right view on the top and bottom. The original resolution was 4096 x 8192.
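Scripted, the Top-Bottom setup looks roughly like this (a sketch of the Output-panel options described above):

```python
import bpy

render = bpy.context.scene.render

render.use_multiview = True        # turn on stereoscopy
render.views_format = 'STEREO_3D'  # a simple left/right eye pair

# Store both eyes in one frame, left on top and right on the bottom
render.image_settings.views_format = 'STEREO_3D'
render.image_settings.stereo_3d_format.display_mode = 'TOPBOTTOM'
```

With a square 4096 x 4096 frame per eye, the combined output is the 4096 x 8192 file shown above.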

That’s all for now. As our tech evolves and gets faster/higher resolution (and our AI overlords let us continue to make work), keep an eye out for more stereoscopic and, sooner or later, HDR immersive media.