
Tuesday, August 07, 2012

Pixar releases OpenSubdiv

In a surprise move, Pixar has released an open source library called OpenSubdiv.

Directly from the website:

OpenSubdiv is a set of open source libraries that implement high performance subdivision surface (subdiv) evaluation on massively parallel CPU and GPU architectures. This codepath is optimized for drawing deforming subdivs with static topology at interactive framerates. The resulting limit surface matches Pixar’s Renderman to numerical precision.

OpenSubdiv is covered by the Microsoft Public License, and is free to use for commercial or non-commercial use. This is the same code that Pixar uses internally for animated film production. Our intent is to encourage high performance accurate subdiv drawing by giving away the “good stuff”.

The source code for OpenSubdiv is located on github and is entering open beta for SIGGRAPH 2012. Feel free to use it and let us know what you think through the github site.

https://github.com/PixarAnimationStudios/OpenSubdiv

Platforms supported: Windows, Linux, limited OSX.




While it is unlikely that this will end up in Blender, it is a very interesting project, coming from a company that few expected would ever release anything like this to the public. Though Pixar has released papers and some shader source code before, nothing was expected in the form of a software library, so this is a welcome surprise to say the least.

Monday, June 18, 2012

Pixar announces Renderman Studio 4.0

From the Pixar website....

EMERYVILLE, CA – (June 14th, 2012) Pixar Animation Studios today announced that effective immediately, RenderMan for Maya is to be combined with RenderMan Studio as a single premium software solution at the new price of $1,300 including a fully functional embedded renderer. This major product consolidation sets the stage for the impending release of RenderMan Studio 4.0, which will provide Maya artists and Technical Directors with the latest tools to setup scene data, lighting, and shader assets for film-quality final rendering.

RenderMan Studio 4.0 also introduces the latest rendering technology developed for the forthcoming RenderMan Pro Server 17.0, and showcases significant advancements in ray tracing for multi-bounce global illumination and ray-traced subsurface scattering, including a system of physically plausible shaders directly integrated into Maya and the Slim shader editor. With these new features, artists can maximize today's high performance multi-core architectures to create photorealistic images with minimal setup within the user interface of Maya. The process of shading and lighting setup has also been dramatically accelerated with new lighting tools, including the robust re-rendering technology used in "Cars 2" and "Brave" as well as progressive ray-traced re-rendering for rapid look development. With additional new capabilities such as Dynamic Shader Binding, expanded RIB archiving, and a new library of RenderMan materials for Maya, RenderMan Studio 4.0 is the result of the feedback from numerous VFX productions allowing Maya artists to easily create photorealistic images at the highest levels of cinematic quality in a comprehensive solution that can be configured to accommodate any VFX pipeline.

Upgrade Availability

RenderMan Studio 4.0 is compatible with Maya 2013 and earlier versions on Microsoft Windows, Linux, and Mac OS X. Upgrade pricing from RenderMan Studio 3.0 is available and existing RenderMan for Maya customers can upgrade to RenderMan Studio 4.0 for the same price as previous RenderMan for Maya upgrades. Student pricing is also available. Further details can be found on the new RenderMan website at https://renderman.pixar.com/. For direct assistance concerning sales, maintenance, operating system compatibility, evaluation licenses, or any other questions about Pixar’s RenderMan, please contact rendermansales@pixar.com.

About Pixar Animation Studios

Pixar Animation Studios, a wholly-owned subsidiary of The Walt Disney Company, is an Academy Award®-winning film studio with world-renowned technical, creative and production capabilities in the art of computer animation. Creator of some of the most successful and beloved animated films of all time, including "Toy Story," "Monsters, Inc.," "Finding Nemo," "The Incredibles," "Ratatouille," "WALL•E," "Up" and "Toy Story 3" the Northern California studio has won 29 Academy Awards and its 12 films have grossed more than $7.2 billion at the worldwide box office to date. Pixar's next adventure, "Brave" takes aim at theaters on June 22, 2012.

Wednesday, March 14, 2012

Aqsis 1.8 released

Aqsis 1.8 was released on Feb 29, 2012, a long-awaited build which brings some of the most exciting new features this renderer has seen in some time. Of these, the point-based global illumination functions are the most obvious and anticipated. The release was not without its problems, however: the Windows and Mac OS binaries were broken due to the move from FLTK to Qt4, and work is underway to fix this, so expect a 1.8.1 patch release soon. The Qt4 switch changed the GUI appearance only slightly, though in some tests Piqsl seems to respond much more slowly than its FLTK predecessor; the new point cloud viewer makes up for that. The PartIO library is a great addition as well, allowing particle data to be brought in from software such as Maya and Houdini. Despite the binary problems, this is one of the most exciting releases from the Aqsis team in years.




LONDON, UK - February 29, 2012 - Aqsis Team, the developers of professional open source rendering software, announced today the immediate release of Aqsis Renderer 1.8.0; its leading cross-platform 3D rendering solution adhering to the RenderMan standard.
This is the accumulative effort of many developers and community members around the world, resulting in an even more competitive solution.

Global illumination and software integration have been the primary focus for this release, with improvements including:
  • Point-based global illumination, providing bake3d(), indirectdiffuse(), occlusion() and texture3d() shadeop support.
  • Partio library integration, providing Houdini, Maya and PRMan compatible pointcloud support.
  • New pointcloud viewer application (ptview).
  • Qt library integration, providing native 64-bit support on all recommended platforms.
  • BSD licensing for all new code.

In addition, key feature enhancements have been made with improvements including:
  • Memory optimisations.
  • PNG read/write support.
  • Updated SLO interface, matching other renderer APIs.
  • Improved RIB parser, including precise syntax error reporting.
  • Reinstate command line support for frame selection using -frames and -framelist.
  • MinGW support.

Further information regarding the changes in this release can be found within the release notes distributed with the software.
Aqsis Renderer 1.8.0 is freely available to download from the Aqsis website, with installers for Windows, Linux and OS X:
www.aqsis.org

Sunday, January 29, 2012

Pixar and Greenbutton reveal RenderMan On Demand

In 2010 Pixar demonstrated RenderMan running on Microsoft's Azure platform, as reported here: http://www.blendertorenderman.org/2010/11/pixar-and-microsoft-cloud-rendering.html, but back then it was only a demo. As of Jan 19 this has become a real working service thanks to GreenButton: Pixar's RenderMan On Demand is now live and ready for all your rendering glory, for a price of course. To be fair, roughly 70 cents per core-hour is really cheap, though this figure is based on third-party information, so mileage may vary.

So it looks like cloud services are becoming more and more common in the 3D industry. Render farms in the traditional sense have existed for a long time; the difference is that GreenButton is a cloud-based service. The advantage of using a cloud service rather than an in-house farm is that there is no huge initial investment in hardware: you only pay for the use of someone else's machines. The obvious case for an in-house render farm is that it is tailored to the studio, you have complete priority over jobs, and it looks pretty impressive to the outside world. Smaller studios lack deep pockets though, so cloud rendering is far more valuable and attractive than investing the same amount of money in a few servers.


There is another way to render out frames without tying up our computers for hours or days on end: distributed computing. Distributed computing is also a way for Blender artists to make use of a render farm without spending a serious amount of money; in fact, with Renderfarm.fi this is possible for free. Many of the well-known distributed computing projects like SETI@home are built on the BOINC platform, a distributed server-client system that has connected millions of computers worldwide in the name of science. Why not take advantage of the same system for rendering? That is exactly what Renderfarm.fi does: it gives Blender users access to a large number of rendering nodes for free while also offering your computer as a rendering node for someone else's project.

This service is based on BURP, the technological framework for using BOINC as a distributed render farm, written by Janus Bager Kristensen. BURP started several years ago and works closely with Renderfarm.fi, yet the two are completely separate entities.

The question I have is: why not start something like this for Aqsis, or Pixie? Renderfarm.fi has done a very good job of marketing itself, and in reality it does not even handle the actual rendering; that is taken care of by us BOINC users. In theory this kind of service could be started for Aqsis as well. Could Aqsis and Pixie be added to Renderfarm.fi, or even get a new website devoted to this? Can open source Renderman be turned into a cloud rendering technology? I believe it can. However, I am not the most talented programmer in the world, so personally I would be a horrible choice for a developer. I have been looking at the code, and there has been some talk on the forums with the BURP and Renderfarm.fi guys about supporting other external rendering software, so it looks very possible to get at least Aqsis in. The wall, of course, is the ongoing development needed to support this: as Blender changes, these guys have to change their own software, keep BURP and Renderfarm.fi support up to date, and fix things when they break, so adding other render engines cuts into that time and energy. Not to mention the server itself needs to be pretty beefy, and someone has to fund the static IP and the cost of hosting, unless a donor steps up. Would there even be enough interest to work on such a project? It would obviously need to be a project outside of BURP and Renderfarm.fi, but in communication with them, so that if it works and tests well it could perhaps be added to the Renderfarm.fi service.

The reason for this post is primarily that I used to be one of the biggest naysayers against community-based distributed rendering, claiming that too many technical factors outweigh the benefits. In the past year I have come to realize that while this may have been true 5 or 10 years ago, it no longer appears to be the case. When I first heard of BURP many years ago I thought it was a neat project that would probably not work out in the end, and look at how wrong I was: not only has it evolved into one of the only community-based distributed render farms on the planet, it has given every single Blender artist access to it, for free. That is an amazing feat and probably one of the greatest additions to the Blender community, period, which is why this website has their logo on the sidebar. These guys are awesome!

Monday, January 02, 2012

Cinepaint Developments

One of the most overlooked software packages in the Blender to Renderman arsenal has been given a new breath of life by its developers: Cinepaint 1.0 was released at the end of November (sorry guys). What makes Cinepaint so powerful and unique compared to its parent GIMP is that it was designed for film work from the start; it is meant to handle 32-bit HDR images that are impossible to open in GIMP, which is one reason studios continue to use Photoshop among other things. After what seemed to be a very long stagnant period of few updates and an uncertain future, Cinepaint has exploded onto the scene again, even getting an article in 3D World, not bad for an open source fork of GIMP. At this time Cinepaint is only available as source for Linux and it has its bugs; in fact, version 1.1 is being delayed due to a nasty memory leak.

Aside from the issues, here are some of the things Cinepaint users might expect to see in the future, as described by Robin Rowe:

---

CinePaint Multi-bit Image Engine

CinePaint will continue to have a multi-bit engine. Some programs support deep paint by setting bittishness at compile time. You can have an 8-bit or a 16-bit core in ImageMagick, for example, but not both. It depends upon how ImageMagick was compiled. CinePaint has a true multi-bit engine where bittishness is chosen at runtime depending on the needs of an image. When you open a JPEG, CinePaint will allocate 8-bit channels for it because that’s what the image holds. When you open a 16-bit TIFF, it will allocate 16-bit channels.

At present, CinePaint supports unsigned 8-bit, binary fixed point 16-bit, half float 16-bit and float 32-bit. That’s a lot of flexibility in channel allocation, but not quite as much as we’d like. OpenEXR files may contain unsigned 32-bit channels and TIFF may have unsigned 64-bit channels. It would be nice to be able to open those in CinePaint without a loss of fidelity.

Comic Book, Anime and Fashion Illustration, Heads at Any Angle

Adding to ideas expressed in an earlier story, it would be nice to have an interface that supports pulling images from a library of art created by the artist, such as having a character’s head drawn at many different angles. This would enable an artist to quickly drag in previously drawn elements to quickly build an illustration. It would be nice to be able to mirror clone when half of a face is drawn in order to quickly draw the other half.

Architecture

There’s a lot that could be done to make architectural drawing easier. One would be a cross-hatch brush that would draw parallel lines at a fixed separation locked to the background position so that drawing later with the same brush will line up perfectly. The angle of the cross-hatch would be a setting on the brush. Another important brush for architecture would be a perspective brush that draws straight lines according to the horizon specified by the artist for the picture. At zero degrees it would become a railings brush and have no crossing lines. There could also be brushes that draw boxes with proper geometry per the background horizon. There could be a clone brush that copies with perspective, whether it’s drawing bricks or a window that needs to be repeated across a building.

Roto and Tracking

Adobe After Effects has an excellent auto-roto feature that will separate foreground objects from their background. Surprisingly, Photoshop does not. It would be nice if CinePaint had auto-roto. It’s not surprising that Photoshop doesn’t have tracking and image stabilization like After Effects. It would be nice if CinePaint did.

Colorization

There’s a plug-in for painting colors from a similar photo onto a B&W image. Would be nice to have that type of thing as a standard clone brush feature.

Slides and Sequences

CinePaint’s flipbook can be used as a PowerPoint-style slide presenter. That could be further developed. While you can load a sequence of numbered images in CinePaint, there’s no way to save that sequence as a sequence. It would be nice if it would.

Brushes

In addition to the architecture brushes already described, it would be nice to have brushes that draw borders and to paint Apple-style liquid buttons. It would be nice to have a bucket brush that bucket fills but will not seek out through gaps smaller than the size of the brush. It would be nice to have a “sloppy” setting on the brush that can exaggerate or reduce the jitter with which a line is hand drawn, something like in Smart Sketch. There could be brushes that draw flowers or any randomly repeating “image tube” as in PaintShop Pro. As the brush paints it lays down the next image (or flower) in the tube. Another nice brush would be a human pores brush that adds pores to portraits that have been magnified.

Magnifier

It would be nice to have a magnifying glass like Apple Aperture.

Meta-data

It would be nice to support such things as keeping accounting data for the time spent working on an image.

Vector Graphics and Type

Artists seem to agree that moving between Adobe Illustrator and Photoshop is inconvenient, that we’d rather go to one app to paint. However, mixing vector graphics and rasters gets messy. A solution is to put vector objects, including type, in a separate layer.

Adjustment Layers and Nodes

It would be nice to have adjustment layers, that is, layers that dynamically enhance the image below instead of changing the layer with a filter. Layers and nodes can be thought of like waves and particles in physics. They’re two ways of looking at the same thing and yet seem quite different. It would be nice if CinePaint displayed nodes.

Scripting

CinePaint has supported many scripting languages, but it hasn’t been a satisfactory user experience. Preferably, CinePaint would record macros/scripts implicitly and at all times like Apple Shake. That would be a better solution than the Adobe Photoshop macro recorder that requires the user to decide when to record a macro first. Taking scripts a step further, it would be nice to have a text-based way to create image files. For example, to be able to quickly snap together a color bars image from a text description of the sizes and colors of the bars.

Sound

CinePaint has a flipbook movie player, but no sound. It would be nice to have JACK support so external JACK-compatible sound tools like Traverso would play in sync.

High Fidelity Color

CinePaint has an RGBA color space by default. Work’s been done to support advanced color management and other color spaces such as CMYK in CinePaint. It requires a domain expert in the printing industry to really get this right. Enhancements provided by a German open source developer have been difficult for CinePaint users to comprehend. To advance in this area we first need a color expert who speaks English, someone who can explain to me what we really need in the color interface and who can test that we got it implemented right.

---

Sorry for the lack of updates in recent months, my personal life has been quite filled with obligations.

Tuesday, May 10, 2011

Pixar announces ProServer 16, Weta adds Deep Image Compositing to OpenEXR

Pixar issued a press release on Monday, May 9th, 2011 announcing that RenderMan Pro Server 16.0 is now available. What makes this version so important is the competition RenderMan has faced in recent years, primarily the Arnold renderer, and the fact that ray tracing in films is now actually quite affordable, and in some cases necessary to really achieve a photoreal look. Ray tracing in film is not new; Pixar and ILM were among the first houses able to do it, but such shots were few and far between. One of the first uses of ray tracing by Pixar was for a couple of shots in "A Bug's Life", followed by full use of ray tracing and global illumination in "Cars".

The problem is that while it is possible, ray tracing and global illumination have been horribly expensive: long render times, a huge load on CPU and memory, baking the illumination in a manner similar to traditional shadow maps, and in general slow to work with. Pixar's flagship product, the grandfather of the REYES family tree, has been considered slow compared to other renderers like Arnold, V-Ray and MentalRay when it comes to ray tracing. That very fact, incidentally, helped these competing products establish a foothold in the VFX and animation industry as viable production renderers.

From the press release :

"This latest release features a large number of innovative advancements in RenderMan's ray tracing technology, including a new ray tracing hider, a radiosity cache, and physically plausible shading. These milestones allow RenderMan to take full advantage of the ever-increasing processing power of multi-core architectures, while also delivering the tools to implement these new features with efficiency and elegance. Moreover, RenderMan's new progressive ray tracing provides interactive re-rendering for production shading and lighting."

While nobody can prove otherwise, it is a little suspicious that Pixar added a new interactive re-rendering method, possibly similar to the Aqsis interactive re-rendering announced last year; possibly it is just great timing. It is still cool nonetheless. Aqsis still holds the title of being the first REYES renderer that can re-render at near real time on the CPU.

"RenderMan Pro Server 16.0 fulfills every need on our feature request list. It lets us focus on the customizations we really care about for ray tracing physically-correct lighting in the cleanest way possible," said Dan Evans, Head of Shaders at Framestore, London. "The speed of the new radiosity cache makes ray-traced global illumination practical in a single render pass, and we can now refine our test frames live using the new progressive ray tracing. Multi-bounce glossy specular, importance sampling, area shadows, and direct lighting are now a breeze thanks to the renderer supporting them all directly. Better still, regular ray tracing is staggeringly faster at 8X on some of our more complex stereo renders.”

Of course the products are commercial, with Pro Server 16.0 at $2,000 a seat, so this is beyond the price range of most of the population. A full price list can be found on Pixar's site if you want to check them out. This is one reason for this website's existence: much of the population that would be interested in Renderman simply cannot afford the software these houses use, at best getting free limited-use versions, so the importance of open source Renderman is very real. While Aqsis is still not up to the level of features that PRMan offers, it is far beyond a hobbyist tool and, as seen over the past couple of years, capable of producing film-quality imagery.

Pixar always seems to kick it up a notch when it matters.

Weta also made news when it announced at FMX 2011 that it will be adding deep image compositing to OpenEXR. More is explained at Colin Doncaster's website. Colin was the original developer of Liquid, a Maya to Renderman exporter that first saw use at Weta (see a pattern?). The visual effects of "The Lord of the Rings" trilogy, for instance, were accomplished primarily with that tool.

Animal Logic first produced a paper describing what deep image compositing is and what it is for; it was supposed to appear at SIGGRAPH 2010 but was rejected. Why, nobody knows; SIGGRAPH paper judges are not known for their generosity. Since then several applications have added the technique, such as Side FX's Houdini. Now OpenEXR will get another very useful addition that will allow other applications to take advantage of deep image compositing, so one could expect to see this in Blender as well as Aqsis in time, though when that happens is uncertain and certainly not officially planned anywhere.

"The concept of a deep image isn’t brand new; ultimately it’s just the technique of encoding more than just the RGBA value in a pixel. Many applications and systems already store multiple channels of data to enhance the compositing workflow as well as re-using calculations already performed by the rendering engine. Side FX software’s Houdini is an example of one of the more recent applications to utilize this workflow via it’s custom camera image format.", from Colin's site.

Exciting times in all areas of visual effects and animation!

Thursday, May 05, 2011

Disney releases expression editor SeExpr as open source

The studios just keep surprising us with more and more goodies. Just a day after LAIKA released its SLIM templates, Disney has released an expression editor called SeExpr, a tool that can be useful in many areas of production.

"Arithmetic expressions appear in almost every animation system ever created. Being able to embed an expression language in a piece of custom software allows an amazing degree of artistic freedom. At Disney artists have enjoyed using expressions because they allow just enough flexibility without being overwhelming to non-programmer users. Developers have enjoyed them too for quick prototyping and deployment of fixes to production needs. SeExpr started as a language for our procedural geometry instancing tool, XGen. Work was done to generalize it into something that could be used in other contexts. Later it was integrated into paint3d, our texture painting facility, which opened the door to procedural synthesis. More recently, we have integrated it as a way of defining procedural controls to physical dynamical simulations and render time particle instancing."



"Expressions can be seen as a way of allowing customization of inner loops. This is contrast to scripting which is mostly aimed at glueing large parts of code base together. So in this sense, C++ forms the center of your application, python could be used to put pieces of it together, and SeExpr is used to customize tight inner loop."





As stated on the website, the in-house GUI at Disney is not being released; maybe it involves code that would need to be vetted before release under an open source license, or it could be proprietary. Or they could just enjoy watching us squirm with anticipation; either way it is doubtful we will ever see it outside Disney. What matters is that Disney understands the power of, and the reasons for, releasing production code to the public: adoption into other packages only helps spread it. Just look at how fast Ptex was added to other software, not to mention how quickly other projects like PartIO, Munki and Reposado have begun to collect followers and watchers. Who knows how many studios have adopted these projects into their own pipelines? While some of these projects may not be directly related to 3D production, their use in a pipeline can be invaluable.








Tuesday, May 03, 2011

LAIKA releases SLIM templates to the public

LAIKA, the Portland, Oregon based animation studio, has released production SLIM templates on the Pixar website in an effort to help boost the Renderman Studio community. For those of you who do not know what SLIM is, it is an application developed by Pixar for building shaders, somewhat like the way RIBMosaic makes "shader fragments" for Blender based on RSL shaders written in a shader editor such as Shrimp, SLer or Shaderman. SLIM, however, is far more advanced and has been a staple of any pipeline that uses Maya and Pixar's Renderman.

SLIM templates are readily available on the internet, mostly released for free by Renderman users; visual effects and animation studios, however, keep theirs hidden away just like any other production asset, so for LAIKA to release them to the public is a big deal.

You can grab these templates and example files at the Pixar Support Forum (free registration is required)

http://renderman.pixar.com/confluence/display/~laika/Home

LAIKA, Inc. is an animation company specializing in feature films, commercials, music videos, broadcast series, interactive content, broadcast graphics and short films. Owned by Nike co-founder and Chairman Philip H. Knight, the company is located in Portland, Oregon.

LAIKA has a 30-year animation history presenting the artistry of award-winning filmmakers, designers and animators. In addition to numerous international honors, the company has won two Academy Awards, 11 Emmy Awards, 11 Clio Awards, three London International Advertising and Design Awards, five Mobius Advertising Awards and two Cannes Lion International Advertising Festival awards.

Sunday, May 01, 2011

New Blender to Renderman Wiki and other site news

Just a short update on the site. The new wiki is in place, though much work has to be done in order for it to be viewed as somewhat complete.
The previous wiki was hosted on Wikidot, which provided a decent working environment but was not as visually pleasing as the new one, which is now hosted on Google Sites. Why Wikidot was chosen before Google Sites is still a mystery, but this is changed now.



Other sites related to Blender have also undergone some visual changes, such as Graphicall.org, a site started by Daniel Salazar, a friend of the Blender to Renderman community.



BlenderStorm also has undergone the same visual design treatment, as well as being an OpenID provider.

BlendSwap also has undergone some updates!



The Blender community is getting connected! Now the next step is for this site to follow suit; we will see what happens.

Sunday, December 26, 2010

Merry Christmas! A gift from us... to you.



Well, it is that time of the year, and this year I decided to release the Widow Pipeline as a present to the Blender and Renderman communities. Yes, it is true that you can get all of these tools from their own sites; however, some of them are not so well known. So now everyone can use the same tools we are using for the short film "The 10:15" (aka Project Widow).

Widow Pipeline I for Linux tar archive

(of course this was not without its own technical problems... torrents did not work, so now it is just a single tar archive... sigh)

This release is built for Linux, since much of the team uses that operating system; also, some of the tools are built for Linux and would require some work to build on Windows or Mac OS. Shrimp, for instance, is modified specifically for this short, and WidowShade is a heavily modified version of ShaderLink that took close to several months to complete before Shrimp was brought into the pipeline.

This release is similar in fashion to the BRAT release in 2009; the difference is that where BRAT was meant for a more general installation on multiple operating systems, with a wide variety of tools and example files, this release is more specific and designed to fulfill one role: to be the basis of an open source production pipeline. It also uses older tools rather than bleeding edge, simply for stability and because, at the time of this writing, there is still much to develop before there is a stable pipeline using Blender 2.5x and RIBMosaic. This is the result of over a year of work developing a stable pipeline so that “Project Widow” could be completed, and the hope is that people will be able to use it as a starting point for their own projects, or to learn Renderman with Blender.

Artist Tools

Aqsis
Blender 2.49b (both 32 and 64 bit)
RIBMosaic
DJV
Ramen
Cinepaint
Shrimp (Widow build)
WidowShade

Blender Scripts and files

Conspot
RIB_Lib Database (blend file with entire RSL LIB for linking)

Shaders

Entire shader collection used for “Project Widow” (as of Sep 2010)
Surface, Displacement, Imager, Volume, Light and Diagnostic shaders


Pipeline Tools

Shotgun API 3
WVTU (Widow Version Thumbnail Uploader)
WidVerTU 0.3 (dev version of GUI WVTU)
Postmosaic
BlenderAid
DrQueue

Libraries

RSL Lib 0.1
OpenColorIO 0.6.1 (Sony Imageworks color management library)



There is also something extra to find in this archive. It's your present. It is a fairly complete collection of pre-production images, as well as R&D renders and test beauty shots. There are also some screw-up images, some test blend files, and older images from the past to show just how much has changed in the past 5 years.

There is also the BlenderCon 2010 technical paper that I wrote, considering that this very pipeline is the one described in the paper.

Now to get back to the Star Wars marathon on Spike TV.

Merry Christmas!

Wednesday, December 08, 2010

RIBMosaic : Now part of Aqsis



HUH!?!?!

I am sure some of you are thinking this. Yes, it is true. RIBMosaic is now a part of the Aqsis project and the current official "home" of the new add on.

The back story goes like this. Eric Back emailed a certain select few last month (around the time of the last post here) and told us that he would no longer be developing the recoded Blender plugin. He was "orphaning" the code as it was on SourceForge, where anyone could come and pick it up. Knowing that RIBMosaic was important for Aqsis, it was decided that the developers would adopt the code as it stood and continue development, with the intention of bringing Aqsis closer to Blender.

From the developer mailing list :

Eric has given his agreement that this should be the 'home' of RIB Mosaic now, and understands that we will focus on the integration of Aqsis specifically into Blender, while endeavouring to ensure that nothing we do will intentionally preclude support for other compliant renderers. We as a team will probably not be able to focus any effort on supporting other compliant renderers, beyond possibly testing regularly to ensure that existing functionality still works. Of course, we will assist and encourage anyone who wants to work on support for other renderers should they wish to do so within its new project space.

Cheers

Paul


In a way this would be a "RIBMosaic for Aqsis", while other developers could make a RIBMosaic for say 3Delight, or AIR or even PRMan.

So RIBMosaic is now part of Aqsis and will be packaged with the renderer from now on. A lot still has to be completed and some serious testing needs to be done to accomplish this in a timely manner. For Project Widow to continue, the tools need to be upgraded as well; it was bound to happen and now is the time.

http://sourceforge.net/apps/mediawiki/ribmosaic

Thank you, Eric, for bringing this idea to full steam; without your efforts and help it is hard to imagine being this far along by now.

Monday, November 01, 2010

Aqsis new core prototype, interactive viewer!

So here it is - words cannot describe what you are about to see, you have to watch this for yourselves.



From the Aqsis development blog :

"This blog has been pretty quiet for a while, but aqsis development has been coming along behind the scenes. During the aqsis-1.6 development last year I focussed a lot on making aqsis faster. After working on this for a while it became obvious that some major changes were needed for the code to be really fast. In particular, the aqsis sampler code is geared toward dealing with single micropolygons at a time, but it seems better for the unit of allocation and sampling to be the micropolygon grid as a whole. This was just one of several far-reaching code changes and cleanups which seemed like a good idea, so we decided that the time was right for a rewrite of the renderer core. Broadly speaking, the goals are the following:

* Speed. Simple operations should be fast, while complex operations should be possible. The presence of advanced features shouldn't cause undue slowdowns when they are disabled.
* Quality. Speed is good, but not at the cost of quality. Any speed/quality trade offs should be under user control, and default settings should avoid damaging quality in typical use cases.
* Simplicity. This is about the code - the old code has a lot of accumulated wisdom, but in many places it's complex and hard to follow. Hopefully hindsight will lead us toward a simpler implementation.

Fast forward to the better part of a year later - development has been steady and we've finally got something we think is worth showing. With Leon heading off to the Blender conference, I thought an interactive demo might even be doable and as a result I'm proud to present the following screencast.



There's several important features that I've yet to implement, including such basic things as transparency, but as the TODO file in the git repository indicates, I'm getting there. The next feature on the list is to fix depth of field and motion blur sampling which were temporarily disabled when implementing bucket rendering.

Edit: I realized I should have acknowledged Sascha Fricke for his blender-2.5 to RenderMan exporter script which was used by Paul Gregory in exporting the last example from blender. Thanks guys!"
Posted by Chris Foster

"Just to clarify, this is not a demonstration of an interactive viewer for RIB editing. This is the newly under development main core. So, what you’re seeing there is the actual renderer, rendering microplygons (Reyes) at 40 fps. We’re just displaying it in an interactive framebuffer, rather than bucket at a time, to show how fast it really is. It’s not using GPU, purely CPU, exactly what you’ll get when you render with Aqsis.
I should also point out that it’s not complete yet, this is the first demonstrable stage of the core re-engineer, there’s more still to go in before it’s even up to current Aqsis feature levels, but rest assured, when it’s finished, this is going to be fast."

~ Paul Gregory

Wednesday, October 27, 2010

BlenderCon2010 "We have such sights to show you..."

All rights are reserved by Clive Barker and/or Bent Dress Productions.

The Aqsis team has been very hard at work giving the renderer a reboot of sorts with the building of the new core, which has not really been seen in the public eye... until now. Well, almost.
Leon Atkinson (Renderguy) will be at BlenderCon this year and will be showing off some of the latest exciting developments on screen for everyone to witness, because there is no way to fully explain the details in words. The Aqsis team reports that a demo is under development, being prepared specifically for BlenderCon to announce the new plans for Aqsis and to show how beneficial they could be to Blender users.

The very core of Aqsis is being re-engineered with a focus on speed. It is at the prototype stage now, but is functional enough to form the basis of the BCon demo. There will also be interface changes, such as the migration away from FLTK to Qt4, which is actually pretty neat since a lot of the pipeline tools are already in Qt4 as well, or in the process of switching. Other changes, like multithreading, are a very recent addition to the new core.

They are preparing a more detailed press release for after the conference, so keep your eyes open for that. If anyone happens to be at BlenderCon and is able to record a video, please let us know so we can post it here.

This also coincides with a point release, Aqsis 1.6.1, which will mainly be a bug-fix release.

One of the rumors going around the underworld is that Larry Gritz is trying to get Ptex implemented in OIIO (OpenImageIO), which Chris Foster has expressed great interest in using for Aqsis, so Aqsis would get Ptex for free. That won't happen until later though, possibly in version 1.8 or the fabled Aqsis 2.0.

Piece by piece the developers are building up towards a very powerful rendering application.

On the other end of the conference spectrum is the paper "Blender to Renderman: Building a Pipeline for Production Use", written by myself. I had originally been planned in for a speaker spot; however, due to complications I had to back out and asked Ton if submitting a paper would be OK, since the topic was pretty much the same (possibly even worded better on paper than in spoken word, haha). So... my first publication of sorts, and it is appearing on this year's BlenderCon page.

Strangely enough, this year's Halloween falls during BlenderCon, so one can only imagine what will go on during the weekend. Wouldn't it be cool to see the BlenderCon attendees dressed up as zombies walking the streets of Amsterdam?


Thursday, September 30, 2010

Sintel now available



Congrats to the Blender Foundation! I have been awaiting this film for quite some time and even had the chance to chat every now and then with some of the guys. During the course of their production, those of us working on "Project Widow" took note of some of the methods the BF used, in particular Blender-Aid, so in a way Sintel has been a good source of reference and information. We even used some of the models from the modeling sprint for testing with Renderman, and aside from a texture format change the export was pretty solid, even from 2.49. Nathan in particular has been a real joy to chat with, spending time in the Aqsis chat room, and in one case he showed Paul Gregory and me a video preview of his basic exporter using a rigged Sintel model. Of course there have also been a lot of behind-the-scenes talks between developers about the Render API, something we have been encouraging for some time now.

I guess it is kind of strange that a few of us have had a small connection to this film, strange but cool at the same time.

Grats guys! I enjoyed it greatly!

http://www.blendernation.com/2010/09/30/sintel-now-available-for-download/

Thursday, August 05, 2010

Ptex support added to Blender!

This is a brand new commit to the Blender 2.5x code, something that has not been formally announced nor fully worked out in functionality, and there are probably bugs and needed optimizations...
This is ptex, of course. The implementation isn't complete, but here's what's working for now:

  • Per-face ptex resolution. Each face gets a U resolution and a V resolution based on its area (relative to other faces) and how stretched it is (i.e. a thin tall face should have a lower U resolution and a higher V resolution.)
  • Automatic generation of ptexes. This step is somewhat analogous to unwrapping your mesh, except instead of choosing UV coordinates, it's setting the default ptex resolution for each face. There's a UI control for texel density.
  • "Vertex" painting. That's a bit of a misnomer now, of course, but you can paint more or less normally. (Naturally I've broken some vpaint features like Blur in the process, but it'll all be restored.)

Note: ptex is designed mainly to work on quads. Triangles and other faces are split up into quads in the same manner as Catmull-Clark. I've coded it so that both quads and tris work (although there are some mapping issues with vpaint still), however quads are the "fast case"; for this reason I've applied one level of subsurf to Suzanne in this example.

A partial TODO list:

  • Add UI for setting individual faces' resolutions
  • Integrate the open source ptex library for loading and saving ptex files
  • Add upsampling/downsampling so that changes aren't lost when changing ptex resolution
  • Change default ptex to a flat color. The random noise is just for testing, of course :)
Very, very cool to hear! Congrats to the Blender devs for adding this. Now all that is needed is for more render engines to support Ptex as well; I would imagine that within the year there will be a sweeping motion to implement this across the spectrum of graphics programming.
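For illustration, here is a small Python sketch of the per-face resolution heuristic described in the commit notes above; the exact weighting is my own guess, not Blender's actual implementation.

```python
import math

def face_ptex_resolution(face_area, mean_face_area, aspect, texels=16):
    """Pick per-face U and V resolutions from relative area and stretch.

    `aspect` is the face's width/height ratio, so a thin, tall face
    (aspect < 1) gets fewer U texels and more V texels, as described above.
    The weighting here is illustrative only.
    """
    base = texels * math.sqrt(face_area / mean_face_area)
    u_res = max(1, round(base * math.sqrt(aspect)))
    v_res = max(1, round(base / math.sqrt(aspect)))
    return u_res, v_res

# A square face and a thin, tall face (width:height = 1:4) of equal area.
print(face_ptex_resolution(1.0, 1.0, 1.0))   # (16, 16)
print(face_ptex_resolution(1.0, 1.0, 0.25))  # (8, 32): fewer U, more V
```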

Wednesday, July 28, 2010

ILM and Sony Imageworks release Alembic OSS

"Alembic is an open computer graphics interchange framework. Alembic distills complex, animated, scenes into non-procedural, application-independent, baked geometric results. This distillation of scenes into baked geometry is exactly analogous to the distillation of lighting and rendering scenes into rendered image data.

Alembic is focused on efficiently storing the computed results of complex procedural geometric constructions. It is very specifically NOT concerned with storing the complex dependency graph of procedural tools used to create the computed results. For example, Alembic will efficiently store the animated vertex positions and animated transforms that result from an arbitrarily complex animation and simulation process, one which could involve enveloping, corrective shapes, volume-preserving simulations, cloth and flesh simulations, and so on. Alembic will not attempt to store a representation of the network of computations (rigs, basically) which were required to produce the final, animated vertex positions and animated transforms. "
 



http://code.google.com/p/alembic/

- quoted from the Google code project page

ILM and Imageworks truly are our best friends in the professional visual effects industry. From what is posted, the intent is to bake scene data into a format that can be read later down the pipeline: for example a cloth simulation baked into a single file rather than thousands of small files of cloth-sim data, as is the case with Blender, or a baked animated character that will later be used for cloth simulation. The same format can then take those scenes and bake them out to be used later for lighting and rendering. There are limits though: it cannot store network representations, such as bone rigs, and it is not meant to replace applications' native scene files. It is another bake format intended to smooth out pipeline issues, like file formats between applications, which have been a problem in the 3D industry for as long as it has existed. Trying to get Maya files into Blender is a pain and usually involves exporting an object to the .OBJ format, which cannot store animation data, so having the ability to bake animated scenes that can be read in another application is a HUGE leap! Imagine the ability to use Maya scenes in Blender and then render in 3Delight, or to model something in Blender, animate it in Maya, use RealFlow to make a fluid simulation, then render it in Pixie.
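As a toy illustration of the baking idea (deliberately not the real Alembic API or file format), the concept boils down to evaluating the scene per frame and writing the resulting vertex positions into one cache file:

```python
import struct

def bake_point_cache(filename, frames, evaluate_mesh):
    """Write evaluated per-frame vertex positions into a single cache file.

    `evaluate_mesh(frame)` is assumed to return a flat list of floats
    (x, y, z per vertex) after rigs and simulations have been evaluated.
    The binary layout here is a stand-in, not what Alembic actually stores.
    """
    with open(filename, "wb") as f:
        f.write(struct.pack("<I", len(frames)))
        for frame in frames:
            points = evaluate_mesh(frame)
            f.write(struct.pack("<I", len(points) // 3))
            f.write(struct.pack("<%df" % len(points), *points))

# Example: a single triangle translating along X over 24 frames.
bake_point_cache(
    "tri.cache",
    list(range(1, 25)),
    lambda t: [t * 0.1, 0, 0,  1 + t * 0.1, 0, 0,  0.5 + t * 0.1, 1, 0],
)
```

The downstream application only needs to understand this one baked format, not the rig or simulation that produced it, which is the point Alembic is making.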

While it is too early to say where this will go, who will adopt it and how long it will take, I can say that the idea is great and I hope to see it evolve. Remember, in less than 5 years OpenEXR went from a small user base to being included by default on most Linux distros and in most 3D applications, open source and commercial alike, as well as proprietary in-house tools.

Sunday, July 25, 2010

Aqsis Licensing Changes to BSD

Aqsis Licensing Changes

As of 25th July 2010, the Aqsis project intends to move towards re-licensing under the BSD license (http://www.opensource.org/licenses/bsd-license.php). All new code contributed by any authorised developer must, by agreement of the developer, be licensed under the BSD license. Existing code will remain under the current GPL/LGPL combination in the short term, while efforts are made to obtain agreement to re-license under the BSD license. Where agreement is not forthcoming, the code in question will be removed from the project, and be replaced with new implementation, written in isolation and licensed under the BSD license. The intention is to ultimately provide the whole of Aqsis under the BSD license.

As reproduced from the mailing list.

Tuesday, July 20, 2010

Blender 2.5x RenderMan Export News

It has been some time since anything was written here; sorry for the lack of activity, my personal life has taken some unexpected twists and turns, for the better of course.

Now for the news!!

Blender 2.5x now has the ability to export to RIB! Actually, Blender has been able to do this for some time; it just has not been mature enough to really say anything about. However, this is not Eric Back's RIBMosaic; it appears to be a simpler script similar in spirit to the old Blenderman script, just with a lot more functionality. I have not had a chance to test anything on Blender 2.5x, simply because the Project Widow pipeline is built on 2.49, and in order to continue working on that project we need to keep the old production-stable version. Thus the only information I have is what is already on the Blenderartists site here: http://blenderartists.org/forum/showthread.php?t=187969&highlight=renderman


"Frigge" as he has been known as in the forums, made this as an exercise into deeper technical stuff, as mentioned in the forums, so this script is not to be considered something for large productions. It does seem to be the perfect tool for someone who does not know much about Renderman but is curious to know how it works. That's how I started, my first taste of Renderman came from a Lightwave plugin and 3Delight back in 2003. So if you are not a shader and rendering wizard you might find this useful, at the very least informative and IT WORKS.

Back in February I had been talking a lot with Nathan of the Durian team, and at one point he had made a basic export script for Renderman as well. It is kind of funny that he has done this considering that about 7 years ago he ranted about how much he hated Renderman shaders; I did some digging and found this post from that time: http://www.blender.org/forum/viewtopic.php?t=1497&start=30. I just hope he does not kill me when he reads this, heh. I did have the source for it at one point but it got lost, so I am unable to provide anything in terms of how it functions or what it looks like, though at the time he did do a simple 60-frame animation of the main character model, using his script to export and Aqsis to render it out. That too is lost in the abyss of the hard drive recycle bin (I was being stupid and did not realize that files and folders I deleted months ago were there). Nathan then became very busy with Sintel shortly afterwards, so I left him alone to complete the film. He did say that once Sintel is over he wants to become more involved in the Blender to Renderman project, so we are excited to have his presence here.

Matt Ebb also has been experimenting with a Blender 2.5 Renderman export script.


This script is also a sort of personal pet project for him, and no code seems to be available at the moment; however, from the screenshot he posted it looks good enough for people to play around with and learn more about what Renderman is and how it works, rather than being a full-on production-capable plugin like RIBMosaic, which of course can be daunting to grasp at first for someone who has never used a RiSpec renderer.


http://mke3.net/weblog/3delight-renderman-in-blender/

Finally, Eric has informed me that the GUI portion of the new RIBMosaic script is done; I cannot wait to see some screenshots!

The point of all this is that Renderman usage with Blender is gathering steam; more people are interested in it, and there is good effort coming from all over the Blender community, from the smaller basic scripts to the complex ones. This is a good thing: even in this little niche of the Blender community we have options, and that is the whole goal of this: OPTIONS. There was a recent thread on Blenderartists asking "Should the Blender Internal Engine be retired?" and in my honest opinion it should NOT. Why? Not only has Ton put a LOT of time and work into that renderer, but why should it be retired at all? Every 3D animation package has some sort of rendering engine, and while each may lack this or that rendering capability, you are still able to render out an animation sequence. Yes, Maya's internal engine cannot hold a candle to MentalRay or PRMan, but it is still functional and with enough effort can produce some good results. We started this site for a single purpose: to have the OPTION to use a RiSpec rendering engine, not to replace the internal engine. We know for a fact that not everyone is interested in rendering; they are called character animators, or modelers. While they may prefer one renderer over another for whatever reason, their desire is to create the model or animate it. Why should they be forced to learn a new rendering system when they are comfortable with the one supplied? Of course not everyone is like this, I just use that as a single example, but the point is not to take out the internal renderer of ANY application that is intended to be used for animation. That would be software suicide.

Thursday, April 01, 2010

April Fools! Pixar acquires the source to Aqsis and Pixie in stunning lawsuit


Update : Yes it was a joke ;)

It seems that history repeats itself, as we all remember the sad demise of BMRT and Entropy due to the lawsuit brought against Larry Gritz by Pixar over patent infringement, or something to that effect. Well, recently it has come to our attention that certain projects are being hit with the same lawsuits over graphics technology patents, and now Aqsis and Pixie are no more in the open source domain.
What this means is that the developers have to license the RiSpec from Pixar; despite the fact that the code was written from scratch, the patented technology described in papers is so close to the Pixar source code that lawyers are afraid of investment loss. "Aqsis and Pixie, while being open source, have code that infringes on the technology Pixar invented; clearly the need to take steps to protect the name and reputation, as well as financial investments, is required," says a source who wishes not to be named for legal reasons.
This seems to be a common trend in the software industry, where one technology company sues another over patent infringement; it seems the open source world is not immune to this legal battle either. This comes as a devastating blow to the community, since it means development on these projects comes to a halt and will require the developers to license the patent, sell the software in order to recoup the license cost, and make enough profit to pay for the subsequent years thereafter.
Already the developers of 3Delight, AIR and RenderDotC have adjusted their pricing to cover the fees from the same legal action, though it is far easier for them since they are commercial products with established footprints in the industry. The open source community of Renderman users and artists was just starting to establish a valid reason for such tools, and on the brink of that dawn the rug was pulled from under us; now we either use the old versions, which will remain as they are, or pay for the next-gen versions of our beloved rendering apps.
It is a sad time in our chapter as a whole, and we wish the developers of Aqsis and Pixie well as they adjust to the dealings of commercial development. We are only waiting to see if our site gets hit with the same lawsuit over the name RenderMan itself, something also spoken of around the net here and there, so only time will tell if this site exists in the future at all.

Thursday, December 24, 2009

Happy Holidays!




Happy Holidays to all!

We want to wish everyone the best of times and safe travels! This post is a long time coming, considering that there has never been a holiday-related post here before, but this year is different. Why is it so different than any other year? Well, Blender to Renderman has become a valid project this year, at least we like to think so. It has stood the test of time and criticism, has pushed development in other areas, and more people are becoming aware of and using these tools for their own needs. Considering that Aqsis is one of 3 external rendering examples featured on the Blender website, I think the efforts of everyone past and present have started to come to light. While this website is only one effort to bring a community together, the idea and practice of exporting Blender scenes to Renderman RIB files has been around for at least 7 years. It has only been in the past 2 years that things have really picked up, and only this past year that our efforts have been taken seriously and not dismissed as "Yet Another Project".

Not to say that the previous year was bad, not at all; in fact, that year was a step in the right direction. However, even today this is still in its infancy and we have a ways to go before we see more start-ups using these tools to develop their own animations. As mentioned in the previous post, some already have; we applaud and encourage them, cheering them on, because the more people who use these tools without being directly involved in the development of Mosaic, Blender, Aqsis or Pixie, the more it means that we as a community have at least helped steer others in that direction. The tools we use and developed have also been used for research, in the form of RenderAnts, a GPU-based Renderman renderer.

2009 was also the year in which "Project Widow" began, something we wish had been completed by now, but due to the complexity, and simply because this has not been done before with these tools, production is still in progress. The project helped drive development in Aqsis, with the multilayer OpenEXR display driver, as well as in RIB Mosaic, with the addition of the EXR driver presets. While the production is slow, things are starting to pick up and we are chugging away, hoping that by next year's SIGGRAPH we will have something to present. Regardless of when it gets done, it will get done, some way, somehow.

With Blender 2.50 on the horizon things are taking a turn. If anyone has downloaded the Alpha release you will notice that the interface has gone through a massive transformation. While an interface is only the nice presentation of the underlying code, this interface has truly changed from the Blender we all have been used to over the past 10 years. What is more important though is the Render API, the one thing that many have been waiting for years to come to light. At this point in time it is not known exactly when this will be added but development is supposedly in the works according to one of the Blender devs. This is one of the most important steps towards a more solid link between Blender and Renderman, in fact it is better for ANY render engine period! Be it Luxrender, MentalRay, VRay... whatever the case may be this will offer that ability to choose which render engine one wants to use, with better access to the data that was previously blocked by the Python API.

RIB Mosaic will also undergo a massive change, mainly because Eric is forced to in order to continue development: Blender is moving to Python 3.x while Mosaic was developed for Python 2.5x, a substantial break in the code. Also, according to Eric in a recent post, there will be a change not only in how Mosaic functions but in how we interface with Renderman. The current Mosaic uses a single-pane system in the Python script window; the shader system alone is a massive body of RSL code that tries to replicate the Blender materials, and the options and settings try to cover every feature of every well-known RiSpec render engine available. This has made development hard for Eric because it went beyond what was expected of the project. According to his latest post, RIB Mosaic will become more or less a framework for others to build upon, be it for Aqsis, Pixie, 3Delight or PRMan. This way he can concentrate on Mosaic itself rather than fixing the minute details for each renderer. "Pipelines" are described as these per-renderer functions; for example, a pipeline for Aqsis shadow maps will be different from a pipeline for Pixie point clouds. This approach will also allow others to develop pipelines for each rendering engine and then share them. The idea is to change the approach to how Blender and Renderman are used together: rather than replicating Blender's material and lighting system, it will encourage users to take advantage of the true power of Renderman from the start. This is very similar to how Maya and Houdini approach the Renderman interface: the 3D app is just the geometry and animation engine while Renderman is the primary rendering system, and anything built in the app is designed to be rendered as such. Not only will this encourage new users to really learn Renderman, it will also help more experienced users unlock its power, as has been done for many years by the likes of Pixar and Industrial Light and Magic (for instance).

So changes all around are happening this year, and even more so next year. Each year that I have been involved to some degree or another I get more excited about it, not because I am involved per se, but because this has been a desire of mine to see happen at all. My first taste of Renderman was in 2002, when I was using Lightwave and downloaded 3Delight and Aqsis to be used with the export script Light-R. Since then I have not looked back, and I can tell you hands down that anything done now using these tools has surpassed the Lightwave work I played with years ago. That is why I am excited about this: the efforts of many people have not only made this possible but made it work very well, and it has reached all over the globe. We are not a massive corporation with millions in marketing funds; we are not even an animation studio... most of us are just average geek programmers or artists with the desire to carry on.

So in 2010 I believe that our efforts will become noticed more, not only by the Blender community but by the rest of the CG community, as well as the industry as a whole.

The image at the top was modeled by Jeremy Birn (yes, of Pixar) and was the Lighting Challenge around this time last year. I never got to complete my render in time, until now. While it is not perfect (forgive me, it was done last minute), at least it is something festive! Not to mention the massive file size and poly count! The tree took up the majority of the render time and rightly so; each one of those needles is a polygon! Post processing was done in GIMP.

See you next year!!