Friday, August 21, 2009

Fisheye lens distortion using MOSAIC and Aqsis

Greetings again Blender heads and RenderMan junkies :)

Well, I've been hard at work on MOSAIC and recently needed a good test case for the volume shader I'm currently working on. It just so happened that at that moment someone over at BlenderArtists inquired about lens distortion, so I decided to do a test project and developer blog post demonstrating lens distortion while also testing several areas of MOSAIC. I've been asked to copy that blog post to share with all the good people here as well....


From the original developer blog here http://sourceforge.net/apps/wordpress/ribmosaic/

I've recently had someone at BlenderArtists ask whether MOSAIC could do lens distortions. Well, since that's not a standard feature of Blender I have no built-in solution for it, but knowing it could be done I figured I'd tell him that RenderMan could do it easily and MOSAIC could set it up. As I was writing the reply I realized that I hadn't actually tried this myself, and I'm currently needing a good project to test the new volume shader, so... thus was born this fisheye project :) This post is not intended as a tutorial, just an overview of how I achieved this effect; I'll also include the blend if anyone wants to play with it.

Well, the first thing I did of course was to look around for examples of techniques, and as so often turns out there are several different ways to do this. If you're only after mild image distortion/displacement, the simplest solution is to use an image shader to process the rendered image (the same as a post-process filter in Blender's compositor). If you're after something more extreme, such as a 360 degree fisheye effect, then more radical steps are required. One approach is to place a warped plane in front of the camera and use raytraced refraction through its surface normals to fake the lens effect. Another technique, which I prefer, is to render the camera as a cube map in separate passes and then combine the maps into an empty beauty pass with an image shader. This approach can see distortions all the way around the camera, gives fine-tuned control over image quality, camera rotation and lens distortion, and can even be animated with the same maps as long as the camera's translation doesn't change!
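
To give a feel for that first, simplest option, here's a rough Python sketch of warping an already rendered frame in image space, which is basically all that approach amounts to. This isn't MOSAIC or Aqsis code, and the file names and strength value are placeholders:

    # Rough illustration of the "mild distortion" approach: warp an already
    # rendered frame in image space, much like a post-process/imager shader
    # or Blender's compositor filter would.
    import numpy as np
    from PIL import Image

    def radial_distort(img, strength=0.15):
        """Remap pixels radially from the image centre (simple barrel-style distortion)."""
        src = np.asarray(img)
        h, w = src.shape[:2]
        y, x = np.mgrid[0:h, 0:w]
        nx = (x - w / 2.0) / (w / 2.0)      # normalized coords in [-1, 1]
        ny = (y - h / 2.0) / (h / 2.0)
        r = np.sqrt(nx * nx + ny * ny)
        scale = 1.0 + strength * r * r      # push sample positions outward with radius
        sx = np.clip((nx * scale + 1.0) * w / 2.0, 0, w - 1).astype(int)
        sy = np.clip((ny * scale + 1.0) * h / 2.0, 0, h - 1).astype(int)
        return Image.fromarray(src[sy, sx])

    radial_distort(Image.open("beauty.tif")).save("beauty_distorted.tif")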

I found several example shaders, but was surprised to find one written by my friend Chris Foster in the Aqsis example folder! I only made a few small changes to the shader for creative control (sketched after this list):

  • I added a rather wide filter to the lens mask to blur its edge

  • I added the ability to flatten the distortion so the lensing effect can be pulled in and out

  • I added the ability to rotate the forward vector in the cube lookup on the x, y and z axes
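
Since these are easier to show than to describe, here's the math of the three tweaks sketched in plain Python rather than as the actual RSL edits (the parameter names below are mine, not the shader's):

    import math

    def smoothstep(a, b, x):
        """Hermite smoothstep, matching the behaviour of RSL's smoothstep()."""
        t = min(max((x - a) / (b - a), 0.0), 1.0)
        return t * t * (3.0 - 2.0 * t)

    def lens_mask(r, edge_width=0.1):
        """1 inside the lens circle, fading to 0 across a wide filtered edge
        (r is the normalized distance from the frame centre)."""
        return 1.0 - smoothstep(1.0 - edge_width, 1.0, r)

    def flatten_theta(theta, flatten):
        """Scale the fisheye angle toward the forward axis so the lensing
        effect can be pulled in and out (0 = full fisheye, 1 = no spread)."""
        return theta * (1.0 - flatten)

    def rotate_xyz(d, rx, ry, rz):
        """Rotate the cube lookup direction about the x, y and z axes (radians)."""
        x, y, z = d
        y, z = y * math.cos(rx) - z * math.sin(rx), y * math.sin(rx) + z * math.cos(rx)
        x, z = x * math.cos(ry) + z * math.sin(ry), -x * math.sin(ry) + z * math.cos(ry)
        x, y = x * math.cos(rz) - y * math.sin(rz), x * math.sin(rz) + y * math.cos(rz)
        return (x, y, z)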


The idea behind this technique is simple: use a cube map pass from the camera's position to generate cube faces, then in the beauty pass look up into those faces with the image shader to project the warped perspective onto the frame. The first step is to build a standard scene; I decided to use checker displacements on a ground plane and columns around the camera to emphasize the effect. I also decided to test faked soft shadows in a larger, more complex space, and to include my partially rewritten volume shader to produce diagonal light streaks for effect (the finished shader will be included in the next CVS update). Next I added the fisheye image shader and created a shader fragment.
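
To make that lookup concrete, here's a minimal Python sketch of the idea behind the image shader: turn each pixel's screen position into a fisheye ray direction, then work out which cube face (and where on it) that direction lands. This is just the idea, not Chris's actual shader, and the per-face u/v orientation depends on how the faces were rendered:

    import math

    def fisheye_direction(sx, sy, theta_max_deg):
        """sx, sy in [-1, 1] across the frame; returns a unit lookup direction,
        or None outside the lens circle."""
        r = math.sqrt(sx * sx + sy * sy)
        if r > 1.0:
            return None                                   # outside the lens mask
        theta = r * math.radians(theta_max_deg) / 2.0     # angle from the forward axis
        phi = math.atan2(sy, sx)                          # angle around the axis
        return (math.sin(theta) * math.cos(phi),
                math.sin(theta) * math.sin(phi),
                math.cos(theta))                          # +z is camera forward here

    def cube_face_lookup(d):
        """Pick the dominant axis of the direction, then project onto that face.
        The exact u/v flips per face depend on how each face was rendered."""
        x, y, z = d
        ax, ay, az = abs(x), abs(y), abs(z)
        if az >= ax and az >= ay:
            face, u, v = ("pz" if z > 0 else "nz"), x / az, y / az
        elif ax >= ay:
            face, u, v = ("px" if x > 0 else "nx"), z / ax, y / ax
        else:
            face, u, v = ("py" if y > 0 else "ny"), x / ay, z / ay
        return face, (u + 1) / 2, (v + 1) / 2             # map [-1, 1] to [0, 1]

    print(cube_face_lookup(fisheye_direction(0.5, 0.25, 360)))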

First off, there are several tricks I had to play on the beauty pass:

  • Enable an empty scene layer. This is so nothing is rendered except the image shader; otherwise you'll waste time rendering objects that are never seen.

  • Create and select a RIBset on the camera with the fisheye shader enabled. This is so the beauty pass uses the fisheye shader, while we can force the default RIBset with no fisheye effect for the other passes.


Next I created a User Autopass to use for the cube renders, with a few filter options applied:

  • Blank the "Layers Scene:" filter. This is so we can specify which layers to use in this pass; otherwise it will use the beauty pass's empty layer.

  • Set "RIBset Bypass:" to DEFAULT. This makes sure the camera used from the beauty pass is not using the fisheye lens.


An autopass is necessary here, instead of just using a global env pass, because better filtering can be achieved in Aqsis with textures than with an env map. In this custom pass I made several custom scene RIBsets named as numbers from 0000 to 0005. Then I set this pass in the Project tab to use "RIBSETPASSES" so MOSAIC will export the numbered RIBsets as separate passes and ignore the DEFAULT RIBset. This is so I can use the same scene setup for each cube perspective. Then for each RIBset I used "Show Autopass Settings" to set up the following:

  • For each RIBset I use one of the cube Camera: Perspectives (Object: nx, Object: px, Object: ny, Object: py, Object: nz, Object: pz). I use the Object instead of World perspectives so the cube faces are relative to the camera's orientation (the face orientations are sketched after this list).

  • Set up each "file" display to point to ./Maps/ and use the name of its cube face, such as "ny.tif", "py.tif", etc.

  • Set up the "Texture" dialog to convert each file display tif into a mipmapped tex ("ny.tif" to "ny.tex", etc.). This is not strictly necessary but produces much better results with the image shader ;)
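
For reference, the six face orientations work out to something like the following in the camera's own space. MOSAIC handles this internally through the Object: nx..pz perspectives, so take the exact axis and up conventions here as a guess rather than gospel:

    # The six per-RIBset camera orientations as look/up directions in camera space.
    CUBE_FACES = {
        # face : (look direction, up direction)
        "px": (( 1,  0,  0), (0, 1,  0)),
        "nx": ((-1,  0,  0), (0, 1,  0)),
        "py": (( 0,  1,  0), (0, 0, -1)),
        "ny": (( 0, -1,  0), (0, 0,  1)),
        "pz": (( 0,  0,  1), (0, 1,  0)),
        "nz": (( 0,  0, -1), (0, 1,  0)),
    }

    for face, (look, up) in CUBE_FACES.items():
        # Each face is rendered with a 90 degree field of view and written to
        # ./Maps/<face>.tif, then converted to a mipmapped <face>.tex.
        print(face, "-> look", look, "up", up, "-> ./Maps/" + face + ".tif")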


So at this point what's happening is that the user autopass exports each RIBset using the active camera from the beauty scene, one per cube face direction, and optimizes each into one of the 6 cube face images. Now all that has to be done is to pull up the fisheye shader in the shader editor, plug in each of the images from the cube pass, adjust "thetaMax" to the distortion angle and render the beauty pass.

However, since the cube faces can be reused in a simple rotational animation with minimal render time, I decided to take things further and do a 20 second animation. Also, since the cube faces are static, I decided to try really high quality occlusion maps, shadow maps with faked soft shadows, DOF and volume atmosphere shading. The additional time needed to calculate these passes on the first frame is more than made up for by the really fast render times of the rest of the animation in the imager pass (it only has to grab the cube faces and calculate the lookup direction and lensing). I also thought it would be interesting to synchronize settings across the pipeline from Blender to MOSAIC by adding animated compositing effects and by using the same camera controls to drive multiple shader parameters in MOSAIC. In particular, I'm animating the "lens" control on the camera and hooking that to the thetaMax shader control, but also grabbing the same lens data and modifying it in a python token to control the lens distortion parameter, and finally using the frame count in another python token to feed the y axis rotation of the cube lookup. As a finishing touch I've animated a spectral lens distortion effect in Blender's compositor; this could fairly easily be done in the image shader, but it gave me a chance to try synchronizing animation in Blender's compositor with RenderMan :) Anyway, here's the video, project file and a few frames of the animation from my gallery...
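
I won't reproduce the token expressions themselves here, but the sort of per-frame Python they evaluate looks roughly like this; the lens-to-thetaMax mapping and rotation speed below are made up for illustration, assuming Blender's default 32mm aperture width:

    import math

    def lens_to_theta_max(lens_mm, sensor_mm=32.0):
        """Turn Blender's camera "lens" value (focal length in mm) into a
        field-of-view angle in degrees that could drive the shader's thetaMax."""
        return math.degrees(2.0 * math.atan(sensor_mm / (2.0 * lens_mm)))

    def frame_to_y_rotation(frame, fps=25, degrees_per_second=18.0):
        """Spin the cube lookup around the y axis as the animation plays."""
        return (frame / fps) * degrees_per_second % 360.0

    print(lens_to_theta_max(16.0))    # wide lens -> large thetaMax (90 degrees here)
    print(frame_to_y_rotation(250))   # y rotation at frame 250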

Here's the YouTube video...


Here's a direct download of the mp4...
http://www.dreamscapearts.com/Public/fisheye.mp4

Here's a frame with a 100 degree lens at 0 degrees rotation...


Here's a frame with a 200 degree lens at 90 degrees rotation...


Here's a frame with a 360 degree lens at 180 degrees rotation...


And if anybody wants to play with the blend, here it is too.
NOTE: I embedded a modified version of MOSAIC that includes the Object: py-nz camera perspectives, which are not in CVS yet, so you'll need to run MOSAIC from the text editor!!
http://www.dreamscapearts.com/Public/fisheye.blend

That's it, thanks for reading :)
Eric Back (WHiTeRaBBiT)