Hello all,
I’ve uploaded binaries for the Mitsuba 0.4.4 release. This is mainly a bugfix release to address issues concerning the previous version. There is, however, one new feature:
Improved Python bindings for rendering animations
It’s a fairly common operation to render a turntable animation of an object to understand its shape a little better. So far, doing this in Mitsuba involved many separate invocations of the renderer (one for each frame). Not only is this a bit tedious, but it also wastes a considerable amount of CPU time by loading and preprocessing the same scene over and over again. Python to the rescue!
In Mitsuba 0.4.4, the Python bindings make this kind of thing straightforward: simply load the scene and render out frames in a for loop. The following piece of code does this, together with motion blur. The work can be spread over the local cores or those on networked machines. Some setup code is omitted for brevity (see the Python chapter in the documentation for all details).
# Render a turntable with 360 / 2 = 180 frames
stepSize = 2
for i in range(0, 360 / stepSize):
    # Compute the rotation at the beginning and the end of the frame
    rotationCur  = Transform.rotate(Vector(0, 0, 1), i*stepSize)
    rotationNext = Transform.rotate(Vector(0, 0, 1), (i+1)*stepSize)

    # Compute matching camera-to-world transformations
    trafoCur  = Transform.lookAt(rotationCur  * Point(0, -6, 10),
        Point(0), rotationCur  * Vector(0, 1, 0))
    trafoNext = Transform.lookAt(rotationNext * Point(0, -6, 10),
        Point(0), rotationNext * Vector(0, 1, 0))

    # Create an interpolating animated transformation
    atrafo = AnimatedTransform()
    atrafo.appendTransform(0, trafoCur)
    atrafo.appendTransform(1, trafoNext)
    atrafo.sortAndSimplify()

    # Assign the animated transformation to the sensor
    sensor.setWorldTransform(atrafo)

    # Submit the frame to the scheduler and wait for it to finish
    scene.setDestinationFile('frame_%03i.png' % i)
    job = RenderJob('job_%i' % i, scene, queue)
    job.start()
    queue.waitLeft(0)
    queue.join()
This is basically a 1:1 mapping of the C++ API. At this point, a good amount of the interfaces have been exposed, making it fun to prototype stuff while subjected to the amazing weightlessness of Python. Here, you can see an example of a video created this way (a turntable of the material test ball with a bumpy metal BSDF):
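To make the per-frame bookkeeping in the loop above concrete, here is a small pure-Python sketch (no Mitsuba required; the names step_size, frame_spans, and camera_position are illustrative, not part of the Mitsuba API). It shows how each frame covers an angular interval whose end is the next frame's start, which is exactly what the interpolating AnimatedTransform exploits to produce motion blur, and how rotating the initial camera offset about the Z axis keeps the camera on a fixed orbit around the object:

```python
import math

# Each frame i of the turntable covers the angular interval
# [i*step, (i+1)*step] in degrees; the animated transform interpolates
# the camera between the two endpoints within the frame.
step_size = 2                        # degrees of rotation per frame
num_frames = 360 // step_size        # 180 frames for a full turn

frame_spans = [(i * step_size, (i + 1) * step_size)
               for i in range(num_frames)]

# The camera orbits on a circle: rotating the initial offset (0, -6, 10)
# about the Z axis changes only x and y, so the distance to the origin
# stays constant.
def camera_position(angle_deg):
    a = math.radians(angle_deg)
    x, y, z = 0.0, -6.0, 10.0
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a),
            z)
```

Note how consecutive spans share an endpoint: the camera pose at the end of one frame is the pose at the start of the next, so the animation is seamless across frame boundaries.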
Other changes
- Photon mapper: In previous releases, the standard photon mapper could miss certain specular paths compared to the path tracer. These are now correctly accounted for.
- thindielectric: The thindielectric plugin computed incorrect transmittance values in certain situations; this is now fixed.
- Robustness: Improved numerical robustness when dealing with specular+diffuse materials, such as “plastic”.
- twosided: Fixed cases where the twosided plugin did not make a material two-sided as expected.
- Instancing: The computed shading frame was incorrect for non-rigid transformations.
- Cube shape: This recently added shape is now centered at the origin by default, to be consistent with the way that other shapes in Mitsuba work. This will require an extra translation in scenes that already use the cube shape.
- TLS cleanup logic: On some platforms, the mtssrv binary crashed with an exception after finishing a rendering job, due to issues with cleaning up thread-local storage.
- Other minor fixes and improvements, which are listed in the HG history.
Thanks again Wenzel!
Hi, great little renderer with very big potential! Can it be ported to Lightwave 3D?
Sure, it can! Will you do it? 😉
I would like a Lightwave exporter as well, but I prefer Blender.
FOSS is the priority; once we have full Blender integration, then
maybe we can add support for proprietary software.
I’m getting compilation errors with the dipole plugin when I build from tip. I’ll post the error message later.
Hello, I would just like to say that you are really impressive. Very good job on this render engine. Do you think it would be possible to integrate it into Blender, a little like Cycles, with some interactivity? Thank you for this very good renderer.
Greg.
Is the exporter project for Softimage stopped?
My adaptation of LuXSI is functional, although the code still needs a fair amount of cleanup.
At the moment I have no time to continue this project, but… would anybody like to pick it up and work on it a little?
https://bitbucket.org/povman/mitxsi/wiki/Home
No, it’s not stopped. It has just taken me a LOT longer to implement ICE RenderTree attributes than I expected.
I’m currently working on several features:
* ICE PointCloud to Mitsuba Heterogeneous Medium
* ICE RenderTree Attributes
* Animation Export with MotionBlur support
Initial Support for ICE RenderTree Attributes.
http://www.si-community.com/community/viewtopic.php?p=30849#p30849
Took me forever, but I finally got multi-material working in Softimage.
http://www.si-community.com/community/viewtopic.php?p=32685#p32685
VertexColors
http://www.si-community.com/community/viewtopic.php?p=32799#p32799
and TextureCoordinates working now.
http://www.si-community.com/community/viewtopic.php?p=32800#p32800
Wenzel, could OSL (Open Shading Language) be used in Mitsuba?
In principle, yes. Obviously that would be a very significant engineering effort.
How’s your thesis coming along?
Wenzel, the Blender camera shift settings are not working. Is there any way we can get the same viewpoint in Mitsuba as in Blender?
Thanks.
Do you plan to implement material nodes? Thank you!
Can you compile and run the CMake files for Mitsuba in debug mode in Visual Studio?
It always crashes for me: Windows reports a C0000005 ACCESS_VIOLATION … in boost_filesystem
when I build it in debug.
Or is it only designed for Release (which works fine)?
Hello,
I’m trying to install Mitsuba on Arch…
Every time I try to install it, it says that it needs Qt.
But Qt is installed on my system!
What should I do?
Here’s a possible feature target for my favorite open source renderer: GPU Subdivisions with PTEX support as a bonus.
http://graphics.pixar.com/opensubdiv/release_info.html
Is LGPL still on the roadmap, or will it remain GPL? I’m very interested in an LGPL license.
BTW, you are doing fantastic work!
Yes, it’s still planned.
Ahh awesome, this script is perfect for our http://arqspin.com importer! Thanks Wenzel.
I wish to use Mitsuba in a project that involves radiometry in an oven (I would like to use the irradiancemeter) to get realistic measurements of the temperature of food. Is Mitsuba a good choice for that purpose? Thanks for your work!
Hi Guillaume,
I think that you should be able to use Mitsuba for that purpose, assuming that you’re only modeling the radiative transfer part of heat transfer.
Cheers,
Wenzel
Hi Wenzel,
Yes, I read it (I didn’t know much about heat transfer before, so I have read a lot during the last month!). In the end, I would like to have a Blender plugin for modeling the oven, the food, and the sensor placement, and then feed that into Mitsuba to get measurements. Maybe I will share my project, if my university allows it.
I have another question: I’m not sure I’m representing a material at a given temperature correctly. Should I use the conductor BSDF for light interactions plus a blackbody emitter for the temperature?
Guillaume
Regarding the specifics of heat transfer, you’ll have to ask somebody who knows this stuff better than me.