Sunday, November 07, 2010
Pixar and Microsoft - Cloud rendering with Azure
http://blog.seattlepi.com/microsoft/archives/226427.asp
This news arrived a little late, overshadowed by this year's Blender Conference, but it is interesting nonetheless. Pixar and Microsoft have paired up, with Renderman shown as a proof of concept for Microsoft's cloud computing service, Azure. This is the first time a RiSpec renderer has been used in such an environment, which is actually quite hard to do: rendering any given frame usually requires a large number of assets and files, as opposed to, say, Blender or Maya projects, which can be kept in a single file containing all the information needed to render.
For instance, BURP and Renderfarm.fi are possible because Blender animation projects can be packed into a single file. That file can be distributed across the internet to multiple render nodes, rendered, and the resulting frames sent back to a single folder.
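As a rough illustration of that single-file convenience, a Blender project can be made self-contained with a couple of lines of the Blender Python API before it is handed off to a render node. This is only a minimal sketch, run from inside Blender, and the output path is just an example:

# Minimal sketch (run inside Blender): embed all external data such as
# textures into the .blend itself, then save the packed file.
# The filepath below is only an example.
import bpy

bpy.ops.file.pack_all()  # pack every external asset into the .blend
bpy.ops.wm.save_as_mainfile(filepath="/tmp/packed_project.blend")

The resulting .blend is the single file a farm like BURP or Renderfarm.fi can ship around.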
Renderman is a bit different. Shaders, image maps, RIB archives, even header files and any other imagery processed on the fly, such as brick maps or point clouds, can be scattered across multiple file paths, and even with relative paths things can get lost if you are not careful. The number of exported files is quite large, often reaching into the thousands, and it only grows with the amount of data per frame; shadow map passes alone can reach into the tens of thousands depending on the number of frames and lights. This makes distributed rendering across the internet very difficult with Renderman.
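To get a feel for just how scattered those dependencies are, a rough script like the one below can walk a directory of exported RIB files and list every external asset path it finds. This is only a sketch: the directory name and the extension list are examples, and a simple regular expression is no substitute for a real RIB parser.

import os
import re

# Rough sketch: collect quoted paths with typical RenderMan asset
# extensions (textures, point clouds, brick maps, archives, shaders)
# from every .rib file under a directory.  Not a real RIB parser.
ASSET_RE = re.compile(r'"([^"]+\.(?:tex|tx|tif|tiff|exr|ptc|bkm|rib|sl|slo))"')

def collect_assets(rib_dir):
    assets = set()
    for root, _, files in os.walk(rib_dir):
        for name in files:
            if not name.endswith(".rib"):
                continue
            with open(os.path.join(root, name), errors="ignore") as f:
                for line in f:
                    assets.update(ASSET_RE.findall(line))
    return assets

if __name__ == "__main__":
    for path in sorted(collect_assets("exported_ribs")):  # example directory
        print(path)

Run over a real animation export, a list like this easily reaches thousands of entries, which is exactly what makes shipping a Renderman job across the internet so awkward.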
Another factor that makes distributed rendering unfavorable compared to a renderfarm is CPU architecture. Differences between processor types will alter the output of procedural texturing because of the way the shader is compiled, introducing subtle changes in fractal pattern generation, noise, turbulence, or anything else that generates a procedural pattern. In other words, an image rendered with Aqsis on an AMD64 dual-core CPU will be slightly different from a render of the same file with Aqsis on a SPARC or MIPS processor. The difference is hardly noticeable on still frames; even side by side the images can appear identical. But when the frames are played back at 30 fps, those differences become visible and the patterns appear to flicker over time. This is why renderfarms are usually built from identical hardware, or at the very least the same type of CPU, which eliminates the worry of those artifacts.
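A toy example of why the architecture matters (plain Python here, standing in for compiled shader math, with the magnitudes exaggerated so the effect is visible): floating-point arithmetic is not associative, so the same noise or turbulence expression compiled with a different instruction ordering or intermediate precision can land on a slightly different value for the same sample.

# Toy example: the same four numbers summed in two different groupings
# give two different floating-point results.  A compiler targeting
# another CPU may reassociate or widen the math in a similar way.
vals = [0.1, 1e16, -1e16, 0.3]

left_to_right = ((vals[0] + vals[1]) + vals[2]) + vals[3]
regrouped     = (vals[0] + vals[3]) + (vals[1] + vals[2])

print(left_to_right)  # 0.3 - the 0.1 is swallowed by the huge intermediate
print(regrouped)      # 0.4

In a real shader the drift per sample is tiny, but it differs from sample to sample and from frame to frame, and that is what shows up as the flickering described above.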
However, it is not impossible, and if anyone could prove it, it would be Pixar - and they did.
Pixar took the stage at the Professional Developers Conference 2010 to demonstrate this potentially powerful ability to reduce the overhead cost of hardware and energy by rendering remotely instead. Cloud computing is considered the future of computing; whether that pans out is anyone's guess, but in cases where massive-scale number crunching is needed, it could be one of the best options available to smaller studios that can't afford the very large sums required to build an effective renderfarm. Spending a fraction of that on cloud computing services is attractive in that sense, so it is very welcome to see that it is possible with Renderman.
As you can see here, the payoff is incredible.
So this can open new doors for what can be done with Renderman. In the past I myself argued that cloud computing is not really practical when rendering with something like Aqsis, whereas with Blender, Maya, 3DSMax and so on it is favorable if the ability is there. I admit I had not researched the subject fully; now I have, and my opinion has changed.
Of course this is Pixar's work, so there is a large degree of engineering involved; this is not just a bunch of college kids doing it for a class project or in hopes of making it big. Doing it yourself would require a fair bit of work, programming and patience, but this news of Renderman on Azure is a breath of fresh air that gives us hope and inspiration to replicate it ourselves. While it may require investing money in a service, be it Azure, Google or Amazon, in the end it may be more cost effective than building your own renderfarm. On-demand rendering services are on the rise.
It is funny, though, given that Pixar was owned by Steve Jobs, that Microsoft was the one used to demonstrate this. It also proves to the visual effects world that Microsoft is not as useless in the number-crunching arena as it once was.
Monday, November 01, 2010
Aqsis new core prototype, interactive viewer!
So here it is. Words cannot describe what you are about to see; you have to watch this for yourselves.
From the Aqsis development blog:
"This blog has been pretty quiet for a while, but aqsis development has been coming along behind the scenes. During the aqsis-1.6 development last year I focussed a lot on making aqsis faster. After working on this for a while it became obvious that some major changes were needed for the code to be really fast. In particular, the aqsis sampler code is geared toward dealing with single micropolygons at a time, but it seems better for the unit of allocation and sampling to be the micropolygon grid as a whole. This was just one of several far-reaching code changes and cleanups which seemed like a good idea, so we decided that the time was right for a rewrite of the renderer core. Broadly speaking, the goals are the following:
* Speed. Simple operations should be fast, while complex operations should be possible. The presence of advanced features shouldn't cause undue slowdowns when they are disabled.
* Quality. Speed is good, but not at the cost of quality. Any speed/quality trade offs should be under user control, and default settings should avoid damaging quality in typical use cases.
* Simplicity. This is about the code - the old code has a lot of accumulated wisdom, but in many places it's complex and hard to follow. Hopefully hindsight will lead us toward a simpler implementation.
Fast forward to the better part of a year later - development has been steady and we've finally got something we think is worth showing. With Leon heading off to the Blender conference, I thought an interactive demo might even be doable and as a result I'm proud to present the following screencast.
There's several important features that I've yet to implement, including such basic things as transparency, but as the TODO file in the git repository indicates, I'm getting there. The next feature on the list is to fix depth of field and motion blur sampling which were temporarily disabled when implementing bucket rendering.
Edit: I realized I should have acknowledged Sascha Fricke for his blender-2.5 to RenderMan exporter script which was used by Paul Gregory in exporting the last example from blender. Thanks guys!"
Posted by Chris Foster
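As a rough aside for anyone less familiar with Reyes terminology: the change Chris describes is about making the diced grid, rather than each individual micropolygon, the unit the sampler allocates and works on, so per-primitive bookkeeping is paid once per grid. The sketch below is purely illustrative and is not Aqsis code; the stub types and the samples_overlapping helper are invented for the example.

from dataclasses import dataclass
from typing import List, Tuple

Bound = Tuple[float, float, float, float]  # xmin, ymin, xmax, ymax

@dataclass
class MicroPoly:
    bound: Bound

@dataclass
class Grid:
    mps: List[MicroPoly]
    def bound(self) -> Bound:
        x0, y0, x1, y1 = zip(*(mp.bound for mp in self.mps))
        return (min(x0), min(y0), max(x1), max(y1))

def samples_overlapping(bound: Bound):
    # Stand-in for the sampler's bucket/sample lookup.
    x0, y0, x1, y1 = bound
    return [(x, y) for x in range(int(x0), int(x1) + 1)
                   for y in range(int(y0), int(y1) + 1)]

def sample_per_micropolygon(grids: List[Grid]):
    # Old shape: a bound and a sample lookup for every single micropolygon.
    for g in grids:
        for mp in g.mps:
            for s in samples_overlapping(mp.bound):
                pass  # a real renderer would hit-test s against mp here

def sample_per_grid(grids: List[Grid]):
    # New shape: bound and look up samples once per grid, then test its
    # micropolygons against that shared candidate list.
    for g in grids:
        candidates = samples_overlapping(g.bound())
        for mp in g.mps:
            for s in candidates:
                pass  # a real renderer would hit-test s against mp here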
"Just to clarify, this is not a demonstration of an interactive viewer for RIB editing. This is the newly under development main core. So, what you’re seeing there is the actual renderer, rendering microplygons (Reyes) at 40 fps. We’re just displaying it in an interactive framebuffer, rather than bucket at a time, to show how fast it really is. It’s not using GPU, purely CPU, exactly what you’ll get when you render with Aqsis.
I should also point out that it’s not complete yet, this is the first demonstrable stage of the core re-engineer, there’s more still to go in before it’s even up to current Aqsis feature levels, but rest assured, when it’s finished, this is going to be fast."
~ Paul Gregory