Since I have several test renders, I thought it would be a good idea to blog another progress report :)
I've been hard at work building the mapping utilities and related improvements over the last few weeks. At this point the zdepth and deep shadow mapping utilities are done, and the occlusion rigging utility is almost done. To use them, all you have to do is select the light or object, click on the desired utility, and set a few options in a dialog; the scenes and passes are made automatically. Since the passes are built as Blender scenes, there's full manual control of every pass if you need it. I've also been experimenting with the zdepth SSS shader and intend to integrate it into the default shaders. I'm pretty sure I'll be able to have full built-in support for both raytraced and mapped techniques such as:
- occlusion and IBL occlusion (using a dupli hemisphere rig)
- soft shadow lights (including sun ortho lights, and point light cube maps)
- support for both standard and deep shadows depending on renderer
- fresnel reflection and refraction (using both cube and mirror maps)
- caustics (both real and depth mapped)
- sub-surface scattering (currently using zdepth, but looking into raytracing)
- support for all Blender's mapping channels
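As a rough illustration of the zdepth SSS idea in the list above: a z-depth map is rendered from the light, the shader estimates how much material the light passed through as the difference between a point's distance to the light and the depth stored in the map, and the light is then attenuated exponentially. A minimal sketch (illustrative Python only; the function name and absorption constant are my own, not MOSAIC's shader code):

```python
import math

# Rough illustration of z-depth (shadow-map) SSS: estimate the thickness of
# material the light travelled through, then attenuate it exponentially.
# Names and the absorption value are illustrative, not MOSAIC's shader code.

def sss_transmission(dist_to_light, zdepth_at_light, absorption):
    """Fraction of light transmitted after passing through the material.

    dist_to_light   -- distance from the shaded point to the light
    zdepth_at_light -- first-hit depth stored in the light's z-depth map
    """
    thickness = max(0.0, dist_to_light - zdepth_at_light)
    return math.exp(-absorption * thickness)

# A thin part of the surface lets most of the light through,
# a thick part lets almost none through:
print(sss_transmission(5.1, 5.0, 2.0))  # thickness 0.1, ~0.819
print(sss_transmission(7.0, 5.0, 2.0))  # thickness 2.0, ~0.018
```

The nice property (and why it was usable on film characters) is that it needs only one extra depth render per light, no raytracing.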
Ok... enough with the report, here are some simple test renders:
This is a test of the very simple SSS shader I'm messing with. In particular, I was trying to see how well it will work for lots of detailed grass or leaves, as well as heavy displacements (rendered in Pixie):
(I'd click on this one to look at it closer; it looks a lot better up close!)
Here's a very poor quality YouTube video showing a simple 30-frame animation with very heavy displacements, using mapped ambient occlusion and one shadow-mapped spotlight. When dealing with extreme displacements, shadow mapping really shines over raytracing as far as speed goes. This was rendered in 38 minutes, as opposed to 6 hours using raytracing!
The quality is so bad you can't see the smooth occlusion, but for my next release I plan on showing a high-quality video with all the mapping techniques in one animation ;)
I've also been playing around with Lucille some more :) Here's a quick render showing the previous scene using image-based lighting instead of just occlusion:
You can download this here: http://www.dreamscapearts.com/Public/lucille_ibl_killeroo.blend
This is another test to see how much data Blender, MOSAIC and Lucille can handle. This image uses high-poly displaced geometry and a lot of polygon tube grass. To be more precise, this scene has 1,987,512 faces! It was exported from MOSAIC in 20 seconds and rendered in Lucille in 8 hours:
I'm not offering this one for download because it's 320 megs :(
That's it for me; hopefully the mapping utilities will be done in a week or two.
Thanks for reading,
>WHiTeRaBBiT<
Hi white rabbit,
Is blender2aqsis still working?
I looked in the Aqsis Third Party Integrations and it's empty, like the developer had a massive falling out or something :D Plus that SourceForge site is 'server not found' LOL!
I'll try it again tomorrow. I really want to try it, as I'm pretty sick of YafRay and YafaRay with no SSS. LOL I think I've tried 6 different free renderers this week LOL!
LOL now I'm going to try MOSAIC,
I thought you were just some guy that tested a lot of renderers, but luckily I deduced that MOSAIC was involved with all the renders. The dinosaur gallery with all the rendering times looks great too!
Hi, very interesting project. I'm doing initial tests on Powua (Google for it) with your scenes. If you like, we can collaborate and provide CPU power for your tests.
Regards,
Marco Ghirlanda
marco ,dot, ghirlanda ,at, powua.com
Anonymous:
Sign your comments with your name, or the same ID you use on BlenderArtists, Blender.org and so on. In future, unidentified comments will be deleted.
Greetings all!
Anonymous:
MOSAIC is just a Blender exporter I'm writing to translate Blender scenes into RenderMan scenes (the technology developed by Pixar). Since RenderMan is an industry standard, that means there are potentially a lot of renderers that can use it ;)
escherians:
Right now I'm not doing any tests that my small renderfarm can't handle (I haven't even been using much of it). In the near future though, when I'm testing much larger and more complex animations, I may take you up on that! I think it would be good to highlight that using an industry standard like RenderMan opens a lot of resources, both commercial and open source, for the graphics community (maybe even a bridge from one to the other).
What renderers are you planning on supporting? I've been mostly using Pixie because of its speed, but I really like the community and potential of Aqsis. I've also done a lot of testing with demos of commercial renderers, but can't afford the licensing for them (especially not at $1500 x 10 on my farm). One irony is I haven't done a single test with PRMan because I don't have access to it :-/
Anyway thanks for the offer ;)
Cheers
Whenever you like, we can start testing; we'll wait for feedback from you on this. Overall we are really interested in testing Blender and DrQueue extensively (which are already set up in the Powua Desktop), so if any of you guys have an animation you would like to speed up, here we are.
As for the renderers we support, we do like to use free ones (YafRay, Aqsis, Pixie), but excluding YafRay we haven't had many requests for those. For larger installations we do support mental ray.
Anonymous:
I have already requested that comments be signed. Anonymous comments will not be published unless you include your name.
Furthermore, it is a good idea to check your spelling and grammar before posting.
Admin
I am sorry for me poor englishings ROFL! I didn't notice your first warning!
This is truly indeed the awesome program! I like how a lot of the settings are determined by Blender's own natural interface.
I was wondering, how are the displacement subdivision levels set? Say you wish to render the 5th subdivision of a model; I've looked and can't see it in the UI, and there's no displacement modifier attached to the dino model in Blender, but it still renders out sweet! And that mud displacement is crazy!
Also, I've been trying to get SSS working. Mainly I'm just fumbling through clicking buttons such as a fool would, but is SSS supported for Aqsis, say? I click the SSS button and nothing happens, and I was thinking maybe I need to get the CVS of the entire app and not just mosaic.py. Then again, maybe I have to do a separate render for the SSS layer? I like the idea of not having to edit XML for measly SSS :D
I've only been able to get that yellow, ugly candle-wax-ish SSS in Yaf, and was disheartened when I couldn't figure out how to apply a texture to it, let alone a displacement or normal map. LOL I just started last night and now I'm the master of rendermen!
truly yours,
Salvatore Larouche
http://www.filenanny.com/files/44f7b9c9f14e0/killeroo2.jpg
Salvatore:
I'm very glad you're liking RenderMan. It's a very complex technology, but once you start to figure it out it's just amazing :)
Concerning SDS, RenderMan doesn't really see or think about geometry as polygons. Instead it dices everything into micropolygons, which basically means the scene is always split to pixel-perfect levels. If you want to adjust the quality versus speed of the SDS and displacements, you can adjust the object's "Shading Rate" in MOSAIC's "Geometry Setup" tab.
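To illustrate the trade-off that "Shading Rate" controls (a sketch under simplified assumptions, not MOSAIC or renderer code): a REYES-style renderer dices each surface into micropolygons so that each one covers roughly ShadingRate pixels of screen area, so lowering the shading rate multiplies the shading work.

```python
# Rough sketch of how ShadingRate trades quality for speed: each
# micropolygon covers about ShadingRate pixels of screen area, so a
# smaller shading rate means more micropolygons and more shading work.
# (Illustrative only; real renderers dice per-bucket with many heuristics.)

def estimate_micropolys(screen_area_pixels, shading_rate):
    """Approximate micropolygon count for a surface covering
    screen_area_pixels at the given ShadingRate (pixels per micropoly)."""
    return max(1, round(screen_area_pixels / shading_rate))

# A surface covering 10,000 pixels on screen:
print(estimate_micropolys(10_000, 1.0))   # default quality -> 10000
print(estimate_micropolys(10_000, 0.25))  # finer dicing for displacement -> 40000
```

This is why fine displacement detail usually needs a shading rate below 1.0: the displacement is evaluated per micropolygon, so coarser dicing simply can't show it.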
The displacements on the ground and Killeroo are done by applying a texture to a "Disp" channel in Blender's material "Map To" tab. You can then use Blender's "Disp" slider to adjust the amount of displacement ;)
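In shader terms, the "Disp" slider ends up scaling how far each surface point is pushed along its normal by the texture value. A toy sketch (hypothetical names, not MOSAIC's actual displacement shader):

```python
# Toy model of displacement mapping: push a point along its unit normal
# by the texture value scaled by the displacement amount ("Disp" slider).
# Names are illustrative, not MOSAIC's shader code.

def displace(point, normal, tex_value, disp_amount):
    """Return point moved along its (unit) normal by tex_value * disp_amount."""
    return tuple(p + n * tex_value * disp_amount for p, n in zip(point, normal))

# A point on a flat surface (normal +Z), texture value 0.5, Disp 2.0:
print(displace((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), 0.5, 2.0))  # -> (0.0, 0.0, 1.0)
```

Because the renderer evaluates this per micropolygon rather than per vertex, even a low-poly mesh picks up all the texture's detail.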
I'm currently experimenting with several SSS techniques that I will add to MOSAIC's default shader in the coming weeks. The technique I used in the orange cube render uses shadow maps to simulate SSS (the same technique used on Gollum in The Lord of the Rings). It's very fast and should work in any RenderMan renderer (including Aqsis). I'll probably also add a raytracing version if I can find a technique that works on all the raytracing renderers :)
MOSAIC is still in beta and the current wiki documents are WAY behind the CVS right now. The mapping utilities I'm working on right now are the last heavy coding left, so when I'm done with the next release I'll concentrate on updating the wiki, examples and download package :D
Excellente! So the scale of the disp map would also work, I suppose, if it judges resolution per pixel...
I have also been looking at fake SSS nodes, which I am trying to implement in YafRay, but for animation I think the coding to automate and filter it would be far above me.
Thank you for your patience!
Nice work WHiTeRaBBiT!
A 1M-tri scene taking 8 hours in the Lucille case... Hmm, I think that's very slow. I have to optimize the raytracing core for Lucille!
Well, actually 1.9M, but who's counting ;)
I didn't think it was that bad, and I'm also not familiar with Lucille's setup, so I'm sure the scene and settings could be optimized :)
Currently I am in the process of setting up a UNIX 3D system that I am also using for sound... anyway, it is a 64-bit AMD with plenty of RAM, so on the software end I am getting it set up to be an effective Blender-to-RenderMan workstation. I do plan on working on this in my personal time as much as I can, testing everything I can with MOSAIC.
Oh and escherians... you should add Pixie to the list as well.
Hi white rabbit,
What version of Pixie did you use in that scene with grass? Can 2.2.4 render it? Could you put the scene somewhere for download? Okan doesn't provide RIB scenes as examples for learning and demonstration :(
I used BMRT but it was very slow, so I want to try Pixie (my last try of RenderMan-compliant renderers before returning to YafRay and V-Ray).
Thank you!