Thursday, January 28, 2010

A random collection of thoughts



This post is not really about anything in particular. It has been a month since the last post, so I want to put down some of the thoughts that have been sitting in my head over the past year.

Project Widow has become QUITE the task; I really do not think any of us realized just how difficult it was going to become. Considering that this is a small team, it is not surprising that the initial three-month deadline was completely blown. What started as a test became a short film, and it will even have its own soundtrack provided by a fairly well known artist (that is MY surprise!!), also released under a CC license. But back to the point.

When the idea started to form, it came from postings on websites asking: if this technology can do what we claim, why can't we re-render something like Big Buck Bunny? One reason was the technical implications; BBB was designed for Blender, not something like Aqsis. Another reason was simply: why redo something when we can create something new? I took the idea that Pixar perfected, using short films to showcase advances in their software, or more accurately, creating short films to take advantage of the advances in the software they developed. This project is, in part, one of those cases. Quite a bit of programming has been done because of it, and new ideas and methods are being developed as well. We are doing something never done before with the tools we use and, in some cases, develop.

Of course we have had more than our fair share of issues too; in many cases some of the original ideas have either been scrapped or reworked in the hope that this or that would work. All in all it is a huge learning experience for everyone, not to mention great demo reel material for all those involved, and at the very least something to be proud of! Hopefully this will be completed before the end of the year. We are chipping away at it, even if at times we want to throw in the towel (lol), but this is something that needs to be finished.

Something else that has happened over the past year or so is that there is a great deal of interest in this pipeline, not only from average users but also from studios, or at the very least certain people within them. I really can't say who, simply because I have not been authorized to release that kind of information, but to the best of my knowledge at least three studios are testing these tools in some form, and one of them is fairly well known in the CG industry. We have even gotten interest from a member of the Blender Foundation (other than the link on the website, that is). There have been tens of thousands of hits to this very site over the past couple of years, some of them from the big studios that we all know. I find this totally mind blowing. Regardless of how little I personally have done to develop actual code, this site itself has really grabbed the attention of these people, and it all started because I wanted to create a small community where we could somehow bridge the gap between Blender and Renderman.

Here is the best part: we have only just begun! Really, this has been the technological TEST that, in a sense, has proven that this WORKS, even though the integration of the tools is still not 100% compatible and some workarounds are required to achieve certain results. Project Widow proved this long before the current stage; Eric made a quick ten-second rendered test shot, so the proof that an animated video can be made with these tools has already been demonstrated time and time again. Project Widow is no longer a test, it is a short animation. When Blender 2.50 comes around and the next build of Mosaic is stable enough, who knows what is going to happen, but I can say for certain that things will only improve over time.

I promise the next posting (coming early next month) will be a bit more than personal ramblings...

Thursday, December 24, 2009

Happy Holidays!




Happy Holidays to all!

We want to wish everyone the best of times and safe travels! This post is a long time coming, considering that there has never been a holiday related post here before, but this year is different. Why is it so different from any other year? Well, Blender to Renderman has become a valid project this year, at least we like to think so. It has stood the test of time and criticism, has pushed development in other areas, and more people are becoming aware of these tools and using them for their own needs. Considering that Aqsis is one of three external rendering examples featured on the Blender website, I think the efforts made by everyone past and present have started to come to light. While this website is only one effort to bring a community together, the idea and practice of exporting Blender scenes to Renderman RIB files have been around for at least seven years. It is only in the past two years that things have really picked up, and only this past year that our efforts have been taken seriously rather than dismissed as "Yet Another Project".

Not to say that the previous year was bad, not at all; in fact it was a step in the right direction. However, even today this is still in its infancy, and we all have a ways to go before we see more startups using these tools to develop their own animations. As mentioned in the previous post, some already have. We applaud and encourage them, cheering them on, because the more people who use these tools without being directly involved in the development of Mosaic, Blender, Aqsis or Pixie, the more it means that we as a community have at least helped steer others in that direction. The tools we use and develop have also been used for research, in the form of RenderAnts, a GPU based REYES renderer.

2009 was also the year in which "Project Widow" began, something that we wish had been completed by now, but due to the complexity, and simply the fact that this has not been done before using these tools, production is still in progress. The project helped drive development in Aqsis, with the multilayer OpenEXR display driver, as well as in RIB Mosaic, with the addition of EXR driver presets. While production is slow, things are starting to pick up and we are chugging away, hoping that by next year's SIGGRAPH we will have something to present. Regardless of when it gets done, it will get done, some way, somehow.

With Blender 2.50 on the horizon things are taking a turn. If you have downloaded the Alpha release you will notice that the interface has gone through a massive transformation. While an interface is only the presentation of the underlying code, this one is a true departure from the Blender we have all been used to over the past ten years. What is more important, though, is the Render API, the one thing many have been waiting years to see come to light. At this point it is not known exactly when it will be added, but development is supposedly in the works according to one of the Blender devs. This is one of the most important steps towards a more solid link between Blender and Renderman; in fact it is better for ANY render engine, period! Be it Luxrender, MentalRay, VRay... whatever the case may be, it will offer the ability to choose which render engine one wants to use, with better access to the data that was previously blocked by the Python API.
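To give a rough idea of what hooking an external engine into that API could look like, here is a minimal sketch assuming the bpy.types.RenderEngine class exposed by the 2.5 series. The engine name and the flat grey placeholder image are purely illustrative, and the exact registration call varied between 2.5x builds, so treat this as a sketch rather than working exporter code:

import bpy

class RendermanRenderEngine(bpy.types.RenderEngine):
    bl_idname = "RENDERMAN_EXPORT"   # hypothetical identifier, not a real exporter
    bl_label = "Renderman (external)"

    def render(self, scene):
        # A real exporter would write the scene to RIB here, run an external
        # renderer such as Aqsis, and hand the resulting pixels back to Blender.
        scale = scene.render.resolution_percentage / 100.0
        size_x = int(scene.render.resolution_x * scale)
        size_y = int(scene.render.resolution_y * scale)

        result = self.begin_result(0, 0, size_x, size_y)
        layer = result.layers[0]
        # A flat grey image stands in for the renderer's output.
        layer.rect = [[0.5, 0.5, 0.5, 1.0]] * (size_x * size_y)
        self.end_result(result)

# Registration details differed across 2.5x builds; this is the later convention.
bpy.utils.register_class(RendermanRenderEngine)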

RIB Mosaic will also undergo a massive change, mainly because Eric is forced to in order to continue development: Blender 2.5 uses Python 3.x while Mosaic was developed for Python 2.5x, which means a substantial code break. Also, according to Eric in his recent post, there will be a change not only in how Mosaic functions but in how we interface with Renderman. The current Mosaic uses a single pane system in the Python script window; the shader system alone is a massive body of RSL code that tries to replicate the Blender materials, and the options and settings try to cover each feature of every well known RiSpec compliant render engine available. This has made development hard for Eric because it grew beyond what was expected of the project.

According to his latest post, RIB Mosaic will become more or less a framework for others to build upon, be it for Aqsis, Pixie, 3Delight or PRMan. This way he can concentrate primarily on Mosaic itself rather than on the minute details of each renderer. "Pipelines" are the term for these per-renderer functions; for example, a pipeline for Aqsis shadow maps will be different from a pipeline for Pixie point clouds. This approach will also allow others to develop pipelines for each rendering engine and then share them. The idea is to change the approach to how Blender and Renderman are used together: rather than replicating Blender's material and lighting system, it will encourage users to take advantage of the true power of Renderman from the start. This is very similar to how Maya and Houdini approach the Renderman interface: the 3D app is just the geometry and animation engine while Renderman is the primary rendering system, and anything built in the app is designed to be rendered as such. Not only will this encourage new users to really learn Renderman, it will also help more experienced users unlock the power of Renderman as has been done for many years by the likes of Pixar and Industrial Light and Magic.
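To make the "pipeline" idea a little more concrete, here is a purely hypothetical Python sketch. None of these class or method names come from RIB Mosaic itself; they only illustrate how per-renderer passes could be bundled up, registered and shared:

# Hypothetical sketch only: all names are invented to illustrate the idea,
# not RIB Mosaic's actual code.

class Pipeline:
    """A bundle of renderer-specific passes (shadow maps, point clouds, ...)."""
    renderer = None

    def passes(self, scene):
        raise NotImplementedError


class AqsisShadowMapPipeline(Pipeline):
    renderer = "aqsis"

    def passes(self, scene):
        # Depth shadow maps: one extra RIB pass per shadow-casting lamp.
        return ["shadowmap_%s.rib" % lamp for lamp in scene["lamps"]] + ["beauty.rib"]


class PixiePointCloudPipeline(Pipeline):
    renderer = "pixie"

    def passes(self, scene):
        # Point cloud baking: bake a point cloud, then render the beauty pass.
        return ["bake_pointcloud.rib", "beauty.rib"]


# Renderer developers could register and share pipelines like this:
PIPELINES = {p.renderer: p for p in (AqsisShadowMapPipeline(), PixiePointCloudPipeline())}

scene = {"lamps": ["key", "fill"]}
print(PIPELINES["aqsis"].passes(scene))
# ['shadowmap_key.rib', 'shadowmap_fill.rib', 'beauty.rib']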

So changes are happening all around this year, and even more are coming next year. Each year that I have been involved, to some degree or another, I get more excited about it, not because I am involved per se, but because seeing this happen at all has long been a desire of mine. My first taste of Renderman was in 2002, when I was using Lightwave and downloaded 3Delight and Aqsis to use with the export script Light-R. Since then I have not looked back, and I can tell you hands down that anything done now using these tools surpasses the Lightwave work I played with years ago. That is why I am excited about this: the efforts of many people have not only made this possible, the tools work very well together, and the reach extends all over the globe. We are not a massive corporation with millions in marketing funds; we are not even an animation studio. Most of us are just your average geek programmers or artists who have the desire to keep going.

So in 2010 I believe our efforts will be noticed more, not only by the Blender community but by the rest of the CG community, as well as the industry as a whole.

The image at the top was modeled by Jeremy Birn (yes, of Pixar) and was the Lighting Challenge around this time last year. I never managed to complete my render in time, until now. While it is not perfect (forgive me, it was done last minute), at least it is something festive! Not to mention the massive file size and poly count: the tree took up the majority of the render time, and rightly so, since every one of those needles is made of polygons! Post processing was done in GIMP.

See you next year!!

Thursday, December 17, 2009

KICHAVADI




Well, it looks like our influence and efforts are starting to pay off! Recently a small team of artists in India started working on an animated series, and they are using Blender and Aqsis, among other open source tools, to do it.

What a good start as well! They have been posting regular updates on their blog, mainly a lot of test renders, but they do look very nice! In the past few days alone there have been many posts just on tests of shadows and depth of field, so they seem to be getting the hang of Blender and Renderman quite well!

Hope to see more from these guys and best of luck gentlemen!

Tuesday, October 27, 2009

Update!! - RenderAnts - Interactive REYES Rendering on GPU

This has been floating around the net recently, and it is actually quite impressive, both for what it does and for what it could become: interactive REYES rendering on a GPU, something that a lot of people and studios have been looking for.




CG Society Link

BlenderArtist Link

Authors page:
http://www.kunzhou.net/

Paper:
http://www.kunzhou.net/2009/renderants.pdf

Video:
http://www.kunzhou.net/2009/renderants.wmv

Updated News!

According to a recent comment on this post, the source code(?) for this is located here: http://research.microsoft.com/en-us/downloads/283bb827-8669-4a9f-9b8c-e5777f48f77b/default.aspx

There have been other similar systems, such as Pixar's LPics, which was shown after Cars was released, as well as Lightspeed, which ILM developed during the Transformers production. The difference, however, is that these used GL shader equivalents of RSL shaders, so neither really used Renderman based rendering. Both were very impressive though.

Gelato was also designed for such a purpose but was discontinued after a few years; certain tools did have the ability to convert basic RSL shaders into its own shader language, so in a sense it was a start towards what could be. Gelato was developed by Larry Gritz, the same person who developed BMRT, the first Renderman-compliant renderer not written by Pixar. Maybe that was another reason Gelato was not REYES based, considering the legal issues between Gritz and Pixar in previous years.

RenderAnts is a GPU based REYES rendering system, using RIB and RSL to render the image on the GPU rather than with the traditional CPU software we currently use. The ability to get fast rendering feedback is always a great thing; right now the only way to get it is to render smaller images while turning down the detail features of REYES, or to use something like crop rendering, which renders only a certain region. That does make an image render faster, but if you are checking details or lighting changes, having to render a new image just to see whether something works is a painfully slow task. This is why RenderAnts is a huge deal. It is not because Elephants Dream was used to showcase the speed difference of CPU based rendering versus the GPU, though that was pretty cool to see; Elephants Dream was used mainly because it is Open Content, fully animated scenes that can be used for any purpose within legal bounds.
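As a small illustration of the preview tricks mentioned above, here is a hedged Python sketch that writes standard RiSpec options into a RIB header: CropWindow limits rendering to a region, while a coarser ShadingRate and lower PixelSamples trade detail for speed. The option names are from the RiSpec; the values and the file layout are only illustrative:

# Sketch of a RIB preview header, assuming a standard RiSpec compliant renderer
# (Aqsis, Pixie, PRMan, ...). Values are illustrative only.
def write_preview_header(path, crop=(0.25, 0.75, 0.25, 0.75)):
    with open(path, "w") as rib:
        rib.write('Format 320 240 1\n')                # smaller frame than final
        rib.write('CropWindow %g %g %g %g\n' % crop)   # render only this region
        rib.write('ShadingRate 16\n')                  # coarser shading, faster
        rib.write('PixelSamples 1 1\n')                # minimal antialiasing
        rib.write('Display "preview.tiff" "framebuffer" "rgba"\n')

write_preview_header("preview_header.rib")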

What makes it so interesting for us, the Blender to Renderman users and developers, is that Mosaic was used to export these scenes. This is why an open source Blender to Renderman pipeline is important: it can be used for research, not only production. It is far easier and cheaper to use Blender, Mosaic and Aqsis or Pixie to showcase new 3D research, where you have access to the source code, than it is with closed source commercial software. With commercial tools, at best you can write a plugin for Maya if, say, you were building a new form of fluid simulation that used a custom RSL volume shader, and you would only be able to run it on one licensed system, while with open source you can have several copies spread out over a network, even at home.

This is the first time Mosaic has been officially used and cited in a published research paper.

If you watch the video, note that this is NOT real time; it is fast, but it cannot render at even 1 fps. At best a frame takes a few seconds, and the shots that look faster are mostly camera placement or lighting changes; anything really drastic takes a bit longer to render. However, considering that the same frame with the same data would take PRMan a considerable amount of time, that still says quite a bit. It also means this is not going to replace current software for final frame rendering, at least not for a while. The best use for such a system is previewing during production, for the little changes that artists and TDs make. Something as tedious as shader development would see its turnaround time cut dramatically; making 30 renders of minute changes to a shader is a very time consuming task. It is not hard to imagine that this will be used by the big boys very soon, and it is only a matter of time before a commercial adaptation is released in the next few years.

We just have a nice warm feeling knowing that our work here helped make this possible; we were used first. THAT is something.