Aside from the sample images provided, you can create your own. It is just like rendering a 2D image in OctaneRender as a 360 x 180 panorama. When you are happy with how your panorama looks, turn on stereo cube map rendering in the camera node and make sure post-processing is off. Finally, select the Gear VR stereo cube map resolution … and render!
Here are 10 useful guidelines for rendering scenes for VR headsets:
Make sure the scene units are in meters. Remember, IPD (Inter-Pupillary Distance) is set in real-world metric units, and scale matters a lot. This is also going to be critical for light-field VR renders in the future.
Make sure that objects in the view are at least 10x the stereo offset distance from the camera. With an IPD of 65–125 mm (e.g. 125 mm if you want to double the stereo strength), the nearest object should be roughly 0.7 to 1.5 meters from the camera.
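The 10x rule above can be sketched as a quick sanity check. This is an illustrative helper, not an Octane feature; scene units are assumed to be meters, per the earlier guideline.

```python
def min_object_distance(ipd_m, factor=10.0):
    """Closest comfortable object distance for a given IPD, in meters."""
    if ipd_m <= 0:
        raise ValueError("IPD must be positive")
    return factor * ipd_m

# Default 65 mm IPD -> keep objects ~0.65 m away or more;
# a doubled 125 mm IPD pushes that out to ~1.25 m.
near_default = min_object_distance(0.065)
near_doubled = min_object_distance(0.125)
```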
Keep the camera upright and the horizon as a straight line in front of the viewer, especially for environment renders (interiors or exteriors). That is why we added the ‘keep upright’ option in the camera node editor: it makes looking at panoramas through the HMD as comfortable as possible.
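Conceptually, a “keep upright” correction like the one described above can be sketched as projecting the camera’s forward vector onto the horizontal plane, discarding roll and pitch while keeping the heading. This is a hypothetical illustration, not Octane’s actual implementation.

```python
import math

def keep_upright(forward):
    """Return a level, yaw-only forward vector (y is up; input is (x, y, z))."""
    x, y, z = forward
    horizontal = math.hypot(x, z)
    if horizontal < 1e-9:
        # Looking straight up or down: heading is undefined, fall back to +z.
        return (0.0, 0.0, 1.0)
    # Drop the vertical component and renormalize so the horizon stays level.
    return (x / horizontal, 0.0, z / horizontal)
```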
Workflow suggestions: set up your scene using a preview render target with a normal spherical panoramic camera at low resolution (for example, 1024×512). Preview your scene with a cube map projection using anaglyph or side-by-side stereo rendering to test the stereo offset easily (we may support 3D displays if enough users have them). When you are satisfied, create a final-quality render target for the 18K cube map render (the Gear VR stereo cube map resolution) that shares the camera position and orientation of the preview one. Make sure that your scene covers all directions: there may be something captivating in the center of the view, but viewers will generally want to look around. The VR feeling is more realistic if you have something underneath or behind you and you’re not just floating in space. If your scene is supposed to be viewed from a regular viewpoint, it is advisable to place the camera somewhere between 1.4 m and 1.7 m above the ground.
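The preview-to-final workflow above can be sketched as two render-target configurations sharing one camera. The dictionaries and field names here are illustrative, not Octane’s API; the point is that both targets reference the same camera object, so the final cube map matches the preview’s position and orientation exactly.

```python
def make_render_targets(position, direction):
    camera = {
        "position": position,       # e.g. ~1.4-1.7 m above the ground
        "direction": direction,
        "keep_upright": True,
    }
    preview = {
        "camera": camera,                    # shared reference
        "projection": "spherical panorama",
        "resolution": (1024, 512),           # low-res for fast iteration
        "stereo": "side-by-side",
    }
    final = {
        "camera": camera,                    # same camera object
        "projection": "cube map",
        "resolution": "Gear VR stereo cube map (18K)",
    }
    return preview, final

preview, final = make_render_targets((0.0, 1.6, 0.0), (0.0, 0.0, 1.0))
# Adjusting the shared camera updates both targets at once.
```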
Because these renders are so large, it really helps to use region rendering on noisy areas that show up early in the render. We will probably add a stereo region render tool to ensure the region render is applied to each eye identically. Right now this is a manual process, and it is very important that one eye does not end up noisier than the other, or you get bad stereo speckling.
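The manual process described above amounts to mirroring one region rectangle onto both halves of a side-by-side stereo frame. A hypothetical helper (no such tool exists in Octane yet) might look like this:

```python
def stereo_regions(region, eye_width):
    """region = (x, y, w, h) in left-eye pixel coordinates.

    Returns the same rectangle for both eyes of a side-by-side frame,
    so each eye receives identical extra samples and matched noise.
    """
    x, y, w, h = region
    if x < 0 or y < 0 or x + w > eye_width:
        raise ValueError("region must fit inside one eye's half")
    # Same rectangle, shifted by one eye's width for the right half.
    return [(x, y, w, h), (x + eye_width, y, w, h)]

# A noisy 100x80 patch at (300, 200); frame is 2048 px wide (1024 per eye):
print(stereo_regions((300, 200, 100, 80), eye_width=1024))
# [(300, 200, 100, 80), (1324, 200, 100, 80)]
```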
Lighting is important. Part of the VR feeling comes from very realistic lighting, and you are also inclined to look at the image and stay immersed in it far longer than usual. Make the lighting as realistic as possible; for example, try the Pathtracing or PMC kernels. Also use hotpixel removal to get rid of very bright fireflies: fireflies in stereo look really bad because they typically show up in one eye and not the other. We used a value of 0.75 on the Keloid example to remove all fireflies in a 1000 spp render.
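To illustrate why a lone one-eye firefly is easy to kill, here is a toy clamp filter: a pixel far brighter than all of its neighbours is replaced by the neighbourhood median. This is not Octane’s hotpixel removal algorithm, and the `ratio` parameter does not map onto Octane’s 0.75 setting.

```python
def remove_fireflies(img, ratio=3.0):
    """img: 2D list of luminance floats; returns a filtered copy."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neighbours = sorted(img[y + dy][x + dx]
                                for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                                if (dy, dx) != (0, 0))
            # A pixel much brighter than its brightest neighbour is a firefly.
            if img[y][x] > ratio * neighbours[-1]:
                out[y][x] = neighbours[len(neighbours) // 2]  # median
    return out
```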
Play with the IPD scale value if you want to give a bird’s-eye view of a macro object. The space station sample uses an IPD of 4 meters to give the effect that you are looking at a miniature. It also pops out all the contours vividly and is a worthwhile way to show off the details of a free-floating model suspended in space or air.
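The rule of thumb behind the space-station example follows from standard stereo geometry (this is not an Octane formula): scaling the IPD by a factor k makes the scene read as roughly 1/k of its real size.

```python
def apparent_scale(ipd_m, human_ipd_m=0.065):
    """How large the scene appears relative to reality (1.0 = life size)."""
    return human_ipd_m / ipd_m

# With a 4 m IPD the station reads as roughly 1/60 scale -- a miniature.
miniature_factor = 1.0 / apparent_scale(4.0)   # about 61.5x smaller than life
```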
Experiment often with subtle tone mapping and lower-contrast imager settings, and test multiple tone-mapping exports in the VR viewer app as you make WIP tests. You may find that harsh tone mapping and contrast settings that make a 2D image look great don’t work at all in VR: you have bright, high-contrast OLED pixels right in front of your eyes and no ambient light framing the render as there is with an image or video.
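For A/B-ing curves during WIP tests, it can help to compare a soft tone-mapping operator against a harsh one. These are textbook formulas, not Octane’s imager: a Reinhard-style curve compresses highlights gently, while a steep linear contrast boost clips shadows to black and highlights to white, which is exactly what hurts on an OLED HMD.

```python
def reinhard(l):
    """Soft curve: compresses highlights, never fully clips."""
    return l / (1.0 + l)

def harsh_contrast(l, contrast=2.0):
    """Steep response around mid-grey; crushes shadows and highlights."""
    return max(0.0, min(1.0, 0.5 + contrast * (l - 0.5)))

# Compare how each curve treats a shadow, mid-grey, and highlight value.
for lum in (0.1, 0.5, 2.0):
    print(round(reinhard(lum), 3), round(harsh_contrast(lum), 3))
```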