Wednesday, December 08, 2010

RIBMosaic : Now part of Aqsis



HUH!?!?!

I am sure some of you are thinking this. Yes, it is true: RIBMosaic is now part of the Aqsis project, which is the current official "home" of the new add-on.

The back story goes like this. Eric Back emailed a select few of us last month (around the time of the last post here) to say that he would no longer be developing the recoded Blender plugin. He was "orphaning" the code: it would stay on SourceForge and anyone could come along and pick it up. Knowing how important RIBMosaic is for Aqsis, the developers decided to adopt the code as it stood and continue development, with the intention of bringing Aqsis closer to Blender.

From the developer mailing list :

Eric has given his agreement that this should be the 'home' of RIB Mosaic now, and understands that we will focus on the integration of Aqsis specifically into Blender, while endeavouring to ensure that nothing we do will intentionally preclude support for other compliant renderers. We as a team will probably not be able to focus any effort on supporting other compliant renderers, beyond possibly testing regularly to ensure that existing functionality still works. Of course, we will assist and encourage anyone who wants to work on support for other renderers should they wish to do so within its new project space.

Cheers

Paul


In a way this becomes a "RIBMosaic for Aqsis", while other developers could make a RIBMosaic for, say, 3Delight, AIR or even PRMan.

So RIBMosaic is now part of Aqsis and will be packaged with the renderer from now on. A lot still has to be completed, and some serious testing is needed to accomplish this in a timely manner. For Project Widow to continue, the tools need to be upgraded as well; it was bound to happen, and now is the time.

http://sourceforge.net/apps/mediawiki/ribmosaic

Thank you, Eric, for bringing this idea up to full steam; without your efforts and help it is hard to imagine us being this far along by now.

Sunday, November 07, 2010

Pixar and Microsoft - Cloud rendering with Azure





 http://blog.seattlepi.com/microsoft/archives/226427.asp

This news is a little late, overshadowed by the Blender Conference this year, but it is interesting nonetheless. Pixar and Microsoft have paired up, with Renderman being shown as a proof of concept on Microsoft's cloud computing service "Azure". This is the first time a RiSpec renderer has been used in such an environment, which is actually pretty tough to do: rendering any given frame usually requires a large number of assets and files, whereas a Blender or Maya scene is usually a single file containing all the information needed to render.

For instance, BURP and Renderfarm.fi are possible because a Blender animation project can be packed into a single file; that file can be distributed over the internet to many render nodes, rendered, and the resulting images sent back to a single folder of frames.
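As a rough illustration (assuming Blender 2.5's Python API; the operator names are the ones I believe exist, so double-check them on your own build), packing everything into the .blend before handing it to a farm is only a couple of calls:

    # Minimal sketch: embed external data (textures, sounds, etc.) in the .blend
    # so a single self-contained file can be shipped to each render node.
    # Run with:  blender -b myproject.blend -P pack_and_save.py
    import bpy

    bpy.ops.file.pack_all()        # pack all external files into the .blend
    bpy.ops.wm.save_mainfile()     # save the packed file in place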

Renderman is a bit different: shaders, image maps, RIB archives, even header files and any on-the-fly processed imagery such as brick maps or point clouds can be scattered across multiple file paths, and even with relative paths things can get lost if you are not careful. The number of exported files is quite large, often reaching into the thousands, and the more data per frame, the higher that number climbs; shadow map passes alone can reach into the tens of thousands depending on the number of frames and lights. This makes distributed rendering over the internet very difficult with Renderman.
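To get a feel for just how scattered those dependencies are, here is a hypothetical helper (my own sketch, not part of any official tool, and real RIB parsing is far more involved) that scans a text RIB archive for file references and reports which ones are missing before a frame gets shipped off:

    # Hypothetical helper: list external files referenced in a text RIB archive
    # so they can be gathered before sending a frame to a remote machine.
    # Binary RIB, inline archives and "searchpath" options are ignored here.
    import os
    import re
    import sys

    # Quoted paths with extensions Renderman pipelines commonly depend on.
    FILE_REF = re.compile(r'"([^"]+\.(?:rib|tex|tx|tif|tiff|sl|slx|sdl|sm|ptc|bkm))"')

    def referenced_files(rib_path):
        """Yield every file path mentioned inside the RIB archive."""
        with open(rib_path, 'r', errors='ignore') as rib:
            for line in rib:
                for path in FILE_REF.findall(line):
                    yield path

    if __name__ == '__main__':
        for path in sorted(set(referenced_files(sys.argv[1]))):
            print('ok     ' if os.path.exists(path) else 'MISSING', path)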

Another factor that makes distributed rendering unfavorable compared to a renderfarm is CPU architecture. Differences between processors can alter the output of procedural texturing because of the way the shader math is compiled and evaluated: subtle changes creep into fractal pattern generation, noise, turbulence, anything that generates a procedural pattern. In other words, an image rendered with Aqsis on an AMD64 dual-core CPU will be slightly different from a render of the same file with Aqsis on a SPARC or MIPS processor. The difference is not really noticeable on still frames; even side by side the images can appear identical. However, when the frames play back at 30 fps, these differences show up and the patterns appear to flicker over time. This is why renderfarms are usually composed of identical hardware, or at the very least the same type of CPU, which eliminates the worry of those artifacts.
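A tiny illustration of where such differences come from (this is generic floating point behaviour, not anything specific to Aqsis' noise code): the same arithmetic grouped in a different order, which is exactly what a different compiler or CPU may do, already disagrees in the last bit, and a procedural shader performs thousands of such operations per sample:

    # Toy demonstration: the same sum grouped differently gives different bits.
    lhs = (0.1 + 0.2) + 0.3
    rhs = 0.1 + (0.2 + 0.3)
    print(lhs == rhs, lhs - rhs)   # False, about 1.1e-16

    # Accumulated over the many noise and turbulence calls a shader makes per
    # sample, two machines can land on different sides of a shading threshold
    # for some pixels, which reads as flicker once the frames are in motion.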

However it is not impossible and if anyone can prove it, it's Pixar and they did.

Pixar took to the stage at the Professional Developers Conference 2010 to demonstrate this potentially powerful ability to reduce the overhead cost of hardware and energy by using remote rendering instead. Cloud computing is considered the future of computing; whether that comes to pass is anyone's guess, but in many cases where massive-scale number crunching is needed, it could be one of the best options available to smaller studios that cannot afford the very large sums required to build an effective renderfarm. Spending a fraction of that on cloud computing services is attractive in that sense, so it is very welcome to see that it is possible with Renderman.




As you can see here, the payoff is incredible.




So this can open new doors for what can be done with Renderman. In the past I myself have argued that cloud computing is not really practical when rendering with something like Aqsis, whereas with Blender, Maya, 3DS Max and so on it actually is favorable if the ability is there. I admit I had not researched the subject fully; now I have, and my opinion has changed.

Of course this is Pixar's work, so there is a large degree of engineering involved; this is not just a bunch of college kids doing it for a class project or in the hope of making it big. To do the same yourself would require a fair bit of work, programming and patience. Still, this news of Renderman on Azure is a breath of fresh air and gives us hope and inspiration to replicate it ourselves. While it may require investing money in the service, be it Azure, Google or Amazon, in the end it may prove more cost effective than building your own renderfarm. On-demand rendering services are on the rise.
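As a purely hypothetical back-of-envelope (every number below is made up for illustration; substitute real quotes from whichever service you look at), the comparison that matters is core-hours bought on demand versus hardware bought up front:

    # Back-of-envelope with made-up numbers; replace them with real quotes.
    frames = 2000                      # hypothetical shot count for a short
    core_hours_per_frame = 8.0         # hypothetical render cost per frame
    cloud_rate = 0.12                  # hypothetical $ per core-hour

    cloud_cost = frames * core_hours_per_frame * cloud_rate
    farm_cost = 40000.0 + 8000.0       # hypothetical hardware + first-year power/admin

    print("cloud: $%.0f   own farm (year one): $%.0f" % (cloud_cost, farm_cost))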

It is funny, though, given that Pixar was once owned by Steve Jobs, that Microsoft was the one to demonstrate this. It also proves to the visual effects world that Microsoft is not as useless in the number crunching arena as it once was.

Monday, November 01, 2010

Aqsis new core prototype, interactive viewer!

So here it is. Words cannot describe what you are about to see; you have to watch it for yourselves.



From the Aqsis development blog :

"This blog has been pretty quiet for a while, but aqsis development has been coming along behind the scenes. During the aqsis-1.6 development last year I focussed a lot on making aqsis faster. After working on this for a while it became obvious that some major changes were needed for the code to be really fast. In particular, the aqsis sampler code is geared toward dealing with single micropolygons at a time, but it seems better for the unit of allocation and sampling to be the micropolygon grid as a whole. This was just one of several far-reaching code changes and cleanups which seemed like a good idea, so we decided that the time was right for a rewrite of the renderer core. Broadly speaking, the goals are the following:

* Speed. Simple operations should be fast, while complex operations should be possible. The presence of advanced features shouldn't cause undue slowdowns when they are disabled.
* Quality. Speed is good, but not at the cost of quality. Any speed/quality trade offs should be under user control, and default settings should avoid damaging quality in typical use cases.
* Simplicity. This is about the code - the old code has a lot of accumulated wisdom, but in many places it's complex and hard to follow. Hopefully hindsight will lead us toward a simpler implementation.

Fast forward to the better part of a year later - development has been steady and we've finally got something we think is worth showing. With Leon heading off to the Blender conference, I thought an interactive demo might even be doable and as a result I'm proud to present the following screencast.



There's several important features that I've yet to implement, including such basic things as transparency, but as the TODO file in the git repository indicates, I'm getting there. The next feature on the list is to fix depth of field and motion blur sampling which were temporarily disabled when implementing bucket rendering.

Edit: I realized I should have acknowledged Sascha Fricke for his blender-2.5 to RenderMan exporter script which was used by Paul Gregory in exporting the last example from blender. Thanks guys!"
Posted by Chris Foster

"Just to clarify, this is not a demonstration of an interactive viewer for RIB editing. This is the newly under development main core. So, what you’re seeing there is the actual renderer, rendering microplygons (Reyes) at 40 fps. We’re just displaying it in an interactive framebuffer, rather than bucket at a time, to show how fast it really is. It’s not using GPU, purely CPU, exactly what you’ll get when you render with Aqsis.
I should also point out that it’s not complete yet, this is the first demonstrable stage of the core re-engineer, there’s more still to go in before it’s even up to current Aqsis feature levels, but rest assured, when it’s finished, this is going to be fast."

~ Paul Gregory
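For anyone wondering what "the micropolygon grid as the unit of allocation and sampling" buys in practice, here is a very rough sketch of my own (not code from the new core) showing the simplest win: a whole diced grid can be bounded and rejected against a bucket with one test, instead of one test per micropolygon:

    # My own toy illustration, not Aqsis code: bound a whole diced grid and
    # cull it against a bucket in one test, rather than per micropolygon.

    def dice(n=16):
        """Dice a unit square into an (n+1) x (n+1) grid of vertex positions."""
        return [[(u / n, v / n) for u in range(n + 1)] for v in range(n + 1)]

    def grid_bound(grid):
        xs = [x for row in grid for x, _ in row]
        ys = [y for row in grid for _, y in row]
        return min(xs), min(ys), max(xs), max(ys)

    def overlaps(bound, bucket):
        x0, y0, x1, y1 = bound
        bx0, by0, bx1, by1 = bucket
        return x0 < bx1 and x1 > bx0 and y0 < by1 and y1 > by0

    grid = dice()
    bucket = (2.0, 2.0, 3.0, 3.0)      # a bucket nowhere near this grid
    if not overlaps(grid_bound(grid), bucket):
        print("whole grid culled with one test instead of", 16 * 16,
              "per-micropolygon tests")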

Wednesday, October 27, 2010

BlenderCon2010 "We have such sights to show you..."

All rights are reserved by Clive Barker and/or Bent Dress Productions.

The Aqsis team has been hard at work giving the renderer a reboot of sorts, building the new core and all, none of which has really been seen in the public eye... until now. Well, almost.
Leon Atkinson (Renderguy) will be at BlenderCon this year, showing off some of the latest exciting developments. Yes, showing it off on screen for everyone to witness, because there is no way to fully explain the details in words. The Aqsis team reports that a demo is under development, prepared specifically for BlenderCon, to announce the new plans for Aqsis and to show how beneficial they could be to Blender users.

The very core of Aqsis is being re-engineered with a focus on speed; it is at the prototype stage now, but is functional enough to form the basis of the BCon demo. There will also be interface changes, such as the migration from FLTK to Qt4, which is pretty neat since a lot of the pipeline tools are already written in Qt4 or are in the process of switching. Other changes, multithreading for instance, are very recent additions to the new core.

They are preparing a more detailed press release for after the conference, so keep your eyes open for that. If anyone at BlenderCon is able to record a video, please let us know so we can post it here.

This also coincides with a point release, Aqsis 1.6.1, which will mainly be a bug-fix release.

One of the rumors going around the underworld is that Larry Gritz is trying to get Ptex implemented in OIIO (OpenImageIO), which Chris Foster has expressed great interest in using for Aqsis; Aqsis would then get Ptex for free. That won't be until later though, possibly in version 1.8 or the fabled Aqsis 2.0.

Piece by piece the developers are building up towards a very powerful rendering application.

On the other end of the conference spectrum is the paper "Blender to Renderman : Building a Pipeline for Production Use", written by myself. I had originally planned on a speaker spot, but due to complications I had to back out and asked Ton if submitting a paper would be OK, since the topic was pretty much the same (possibly even worded better on paper than in spoken word, haha). So... my first publication of sorts, and it is appearing on this year's BlenderCon page.

Strangely enough, this year's Halloween falls during BlenderCon, so one can only imagine what will go on during the weekend. Wouldn't it be cool to see the BlenderCon attendees dressed up as zombies walking the streets of Amsterdam?