Some tips, not written by me.
mental ray 3.6+ comes with importons and irradiance particles.
Here are some tests and explanations, and finally a geoshader to enable the new features.
Importons are a mechanism for computing importance-driven sampling maps.
They can be used to build importance-driven photon maps, acting as a merging function for photons.
That is, photons close to an importon are saved to the map, while the others are discarded. This leads to very light yet very precise photon maps, because we only discard photons that have little impact on the final image, so we get more photon density exactly where we need it (OK, photons are not really 'just' discarded: when a photon is discarded, its power is redistributed over nearby photons).
To drive photons with importons, set the 'merge' parameter in the Global Illumination section to zero and the 'merge' parameter in the Importons rollout to something non-zero. This tells mental ray to use importons for merging photons instead of a fixed merging distance (which is what a non-zero merge parameter in the GI rollout gives you).
Importons are shot in their own pass, which happens before photon shooting, and are discarded once the photon map is saved. In the verbose output you can see how many photons were merged and how many were saved to the map. The number of importons depends on the image size if you use the 'density' parameter, or it can be set explicitly with the 'emitted' parameter.
When using importons together with photon maps, the 'traverse' parameter should be checked. In this mode importons are not blocked by objects and keep evaluating importance at every further intersection.
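To make the merging idea concrete, here is a minimal Python sketch of the concept (purely illustrative, not mental ray's actual code; the importance radius, the brute-force nearest-neighbour search and the power redistribution rule are simplifications of mine):

import math

def merge_photons(photons, importons, merge_radius, importance_radius):
    """Toy sketch of importance-driven photon merging.

    photons:   list of (position, power) pairs from the photon tracing pass
    importons: list of hit positions recorded during the importon pre-pass
    A photon is stored only if an importon lies within importance_radius of it
    and no already stored photon is closer than merge_radius; otherwise its
    power is folded into the nearest stored photon, so flux is preserved
    while the map stays small.
    """
    stored = []  # each entry is [position, accumulated power]
    for pos, power in photons:
        # importance test: does any importon care about this region?
        important = any(math.dist(pos, ipos) <= importance_radius for ipos in importons)
        # nearest photon already in the map (brute force for clarity)
        nearest = min(stored, key=lambda s: math.dist(pos, s[0]), default=None)
        if important and (nearest is None or math.dist(pos, nearest[0]) > merge_radius):
            stored.append([pos, power])      # keep it as a new map entry
        elif nearest is not None:
            nearest[1] += power              # merge: redistribute power, do not store
        # else: nothing stored yet and not important -> dropped in this toy model
    return stored

In the real renderer the details are certainly more refined (the importance presumably weights the decision rather than acting as a binary test), but the keep-or-merge structure is the core idea.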
The practical way to use importons is to shoot a lot of photons. The problem with photons is not the shooting itself but the photon map: access, balancing, storage and network sharing.
Generally, on 32-bit, once more than about 15 million photons are stored in a map we start running into problems. First, the map can grow to around 200/300 MB or more. Then mental ray may simply crash before saving the map, while trying to optimize it prior to writing it out. So why would we want to shoot 15 million photons? :)
There are, generally, a couple of techniques to deal with photons.
Use a low-detail solution just to establish the overall lighting conditions and then recover detail with a final gather pass.
Use a high-detail photon map to capture the lighting detail more accurately and then get contact shadows and the like from a final gather pass.
Now we have a third solution.
We can use photons together with importons to get accurate light bouncing while still resolving very fine detail.
An example.
Here I shot 5 million photons in a Cornell box.
Photon shooting takes no more than 30 seconds.
I'm working on two dual quad-core machines in a distributed environment.
Once the map is saved, its size is around 102 MB. That's already enough to hurt my DBR rendering: I have to wait for the map to be broadcast to the slave, and since the scene is very light I end up waiting for the slave to receive its photon map while the master has already finished, i.e. wasted render time.
Also, the results are the classic ones we get with photon maps, i.e. very poor detail. Just look at the cube with its bottom part in the dark, and how unnatural the shadowing is there.
http://img377.imageshack.us/img377/485/impphotonysd6.png
Now we can try with importons: full density, a merging distance of 0.05 cm (while the photon radius is 1 cm) and traverse enabled.
http://img379.imageshack.us/img379/6163/impmergingverbosityzd1.png
The photon map is now 3861 KB.
From the verbose output we can see that only around 200,000 photons are necessary to cover the 'important' zones of our image. Since the photon distribution is importance based, we shoot a lot of photons to cover every zone with sufficient density, so that after merging via importons we are left with enough photons to bring out small details. Just compare the cube shadow as it was before, without importons, with how it looks now.
http://img74.imageshack.us/img74/6382/impphotimplc2.png
(Edited: since importons are only used to merge photons and are then discarded, only the photon map remains available. That means you can use the Maya photon map visualizer to see the photon distribution (with and without importons) directly in the viewport.) :)
===============================================================================================
We can also use importons without photons, with the help of Irradiance Particles.
Besides the fact that you need a minimum amount of importons to get a smooth image, the most important parameter here is Trace Depth (with photons it was the merge distance, which is irrelevant here). We need to let the importons bounce around the scene a lot to get an importon map suitable, for example, for a complete walkthrough. Start with a density of 0.2 and go up; for depth, start with something like 2.
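Conceptually the importon pre-pass works roughly like the Python sketch below (a toy illustration, not mental ray code; camera_ray and trace_ray are hypothetical callbacks standing in for the renderer). Density controls how many paths start per pixel, while trace depth controls how far each path is allowed to bounce, which is why walkthroughs need a higher depth:

import math
import random

def shoot_importons(width, height, density, trace_depth, camera_ray, trace_ray):
    """Toy sketch of importon shooting from the camera.

    camera_ray(px, py)            -> (origin, direction) for a pixel sample
    trace_ray(origin, direction)  -> (hit_point, hit_normal) or None
    Roughly density importons start per image pixel; each one is bounced
    diffusely up to trace_depth times and every hit position is recorded,
    so later passes know which regions of the scene matter for this view.
    """
    hits = []
    for _ in range(int(width * height * density)):
        px, py = random.uniform(0.0, width), random.uniform(0.0, height)
        origin, direction = camera_ray(px, py)
        for _ in range(trace_depth + 1):
            hit = trace_ray(origin, direction)
            if hit is None:
                break
            point, normal = hit
            hits.append(point)
            origin, direction = point, random_direction_about(normal)
    return hits

def random_direction_about(normal):
    """Crude uniform bounce direction in the hemisphere around normal."""
    while True:
        v = [random.uniform(-1.0, 1.0) for _ in range(3)]
        n = math.sqrt(sum(c * c for c in v))
        if 0.0 < n <= 1.0:
            v = [c / n for c in v]
            if sum(a * b for a, b in zip(v, normal)) > 0.0:
                return v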
Irradiance Particles.
Let me attach a mental images description of this 'novel' technique:
' A short description of the technique may be given as follows: before rendering,
importons are shot to the scene from the camera. Their hit positions with
information on the amount of direct (and possibly indirect) illumination coming at
their position (hence the name "irradiance particles") are combined into a map.
Optionally, one or more passes of indirect illumination can be computed. The
nature of the algorithm is that the computation is importance-driven. During
rendering, Irradiance Particles are used to estimate the irradiance for every
shading point; if only direct illumination is collected for irradiance particles, then
this is equivalent to one bounce of indirect lighting. Irradiance can also be
interpolated from precomputed values at particles' positions. '
The parameters are close to those of final gathering.
There's a 'rays' amount, i.e. the number of rays shot for each particle sample.
There's a way to have the calculation interpolate over particle positions (optionally only for secondary rays), and a mode that works like a brute-force approach, where no interpolation is used.
'Passes' means how many indirect bounces are considered when calculating the irradiance, i.e. something like FG diffuse bounces.
Finally, if you have an environment, like an mr sky, the irradiance particles implementation supports a separate set of sampling parameters just for that.
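As a rough illustration of the interpolated lookup, here's a Python sketch (my own simplification, not mental ray's actual interpolation scheme; the distance weighting and the fixed k nearest particles are assumptions):

import math

def irradiance_at(point, particles, k=32):
    """Estimate irradiance at a shading point from precomputed particles.

    particles: list of (position, irradiance) pairs, where irradiance is the
    value stored at the particle during the pre-pass (direct only, or direct
    plus N indirect passes). A distance-weighted average of the k nearest
    particles stands in for the real interpolation.
    """
    nearest = sorted(particles, key=lambda p: math.dist(point, p[0]))[:k]
    weights = [1.0 / (math.dist(point, pos) + 1e-6) for pos, _ in nearest]
    return sum(w * irr for w, (_, irr) in zip(weights, nearest)) / sum(weights)

In the brute-force mode, as I understand it, the estimate is instead computed at the shading point itself (shooting the 'rays' amount of rays) rather than interpolated from nearby particle positions, which is slower but sharper.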
Here are some images to demonstrate the new feature.
Only for the first image was a full importons + irradiance particles pre-pass computed; for all the others I just froze the map and went straight to rendering. The pre-pass took around 30 minutes; all the subsequent frames, around 3 minutes (at 1K).
http://img396.imageshack.us/img396/2710/shot001rg2.png
http://img120.imageshack.us/img120/6391/shot060hn3.png
http://img72.imageshack.us/img72/8105/shot0109ut1.png
(importons: density 0.5, depth 4; irradiance particles: 680 rays, 2 passes, interp 32, 480 env rays)
Here is also an animation to show that Irradiance Particles are flicker free.
The solution is a medium-detail one. There are still some blotches in the shadowed areas, but these blotches do not flicker, since they are not recomputed for every frame. (The banding comes from the web compression.)
http://rapidshare.com/files/107992908/output3.mov.html
===============================================================================================
Another feature that comes with mr 3.6+ is advanced framebuffer memory management: the cached mode. In cached mode you can render at any image size (even on 32-bit), because only a small fraction of the resulting image (or of the user framebuffers) is kept in memory: newly rendered tiles and recently accessed tiles.
This mode should be used only for batch rendering (it will crash Maya if used for the Render View, and it doesn't make much sense to render huge images into a viewer anyway). I just rendered a 20K image, in floating point, on a 32-bit system. It is slower than the other modes, so it should be used only when mental ray cannot create a framebuffer of such a huge size (generally images above 4K on 32-bit).
RC 0.2 info : option: fb mem management cached
RC 0.2 info : option: image type interpolate
RC 0.2 info : 0 rgba_fp yes
RC 0.2 info : camera: focal length 1.37795
RC 0.2 info : camera: aperture 1.41732
RC 0.2 info : camera: aspect 0.8
RC 0.2 info : camera: resolution 16000 20000
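The idea behind the cached mode can be illustrated with a small LRU tile cache in Python (a sketch of the general technique only, not mental ray's implementation; the tile size, the scratch file and the eviction policy are assumptions of mine):

import os
import tempfile
from collections import OrderedDict

class TileCache:
    """Keep only the most recently used framebuffer tiles in RAM; spill the
    rest to a scratch file on disk and fetch them back on demand."""

    def __init__(self, tile_bytes, max_tiles_in_ram=64):
        self.tile_bytes = tile_bytes
        self.max_tiles = max_tiles_in_ram
        self.ram = OrderedDict()                 # tile_id -> bytes, in LRU order
        self.scratch = tempfile.TemporaryFile()  # backing store for evicted tiles
        self.offsets = {}                        # tile_id -> offset in the scratch file

    def write_tile(self, tile_id, data):
        self.ram[tile_id] = data
        self.ram.move_to_end(tile_id)            # newly rendered tiles are hottest
        self._evict_if_needed()

    def read_tile(self, tile_id):
        if tile_id in self.ram:
            self.ram.move_to_end(tile_id)        # recently accessed, keep it hot
            return self.ram[tile_id]
        self.scratch.seek(self.offsets[tile_id]) # cold tile, fetch from disk
        data = self.scratch.read(self.tile_bytes)
        self.write_tile(tile_id, data)           # promote it back into RAM
        return data

    def _evict_if_needed(self):
        while len(self.ram) > self.max_tiles:
            tile_id, data = self.ram.popitem(last=False)   # least recently used
            if tile_id not in self.offsets:                # first eviction: append
                self.scratch.seek(0, os.SEEK_END)
                self.offsets[tile_id] = self.scratch.tell()
            else:                                          # later: overwrite in place
                self.scratch.seek(self.offsets[tile_id])
            self.scratch.write(data)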
===============================================================================================
A couple of things needed to use importons and Irradiance Particles in Maya 2008 SP1.
You should avoid any Maya shader:
'Suppress all Maya Shaders' should be checked, while 'Export with Shading Engine' should not be, in the shading engine node (tests were made with mia_material).
The same goes for lights: 'Suppress all Maya Shaders' needs to be checked, and a custom light shader should be supplied (mr sky portals work well).
Finally, in the Render Settings, go to Translation -> Customization and uncheck 'Export State Shader'.
Also read the description file that comes with the geoshader for a more detailed description of the parameters and such.
A 64-bit version is available (remove the .x64 postfix).
Have fun,
Max
You get the contact shadows with the sky portals, which are area lights passing the light and color information of the environment into the interior. And if you mix that with AO color bleed you get a kind of extra shadow: color bleed with contact shadows, which leads to the same result as importons and IP.
And it's faster.
I think you're right about the interpolated IP.
I want to say something more about this. I think the brute-force IP is a great improvement (it's much, much faster than brute-force FG or path tracing, even the puppet integration), but I'm also a bit worried about the future direction of this kind of feature.
OK, now we have a good starting point (importons); they are great for unbiased rendering (or something close to it), so why not think about progressive tracing (look at the light cache and PPT in V-Ray... and next year they'll have a new interactive/progressive engine based on that)?
What I'm trying to say is: we have no feedback channel with the mr developers, we can't see what direction the new features are heading in, and surely this is not good, because many times they don't take the right direction for us, IMHO.
mental images developers, you must listen to your users' needs if you want to progress!! Please!!
On the one hand we have Zap here, and we can talk about shaders; I'm really sure he takes into consideration all the feedback we give, and we can see the benefits of this: we have great shaders integrated in mray now, and that's the real mray flagship right now.
On the other hand we have the core development (and I could criticize many choices here, but I want to stay constructive): why can't they have a forum (c'mon, we're in 2008!!)? Even the mray website is very depressing (mray 3.3 information? oh my...),
and I can see the consequences of this... many features are needed, some features get developed, but none of them is really convincing...
OK, take the interpolated IP: I'm really hoping they can make it better in the next release, but hey, looking at past problems (I was complaining about the 3.5 FG interpolation problem from the first day I worked with it, and nothing has changed in... 3 years??) I'm a bit worried...
Right now, as far as I can understand (I'm not a technical guy), there is something wrong in the IP mechanism itself, I mean in the interpolated part of IP.
The strong point of photons (and the light cache, and all the good algorithms for secondary rays) is this: if you need an interpolated solution, you can have a smooth and fast calculation for the secondary diffuse bounces, and then add detail with a good algorithm for the primary rays (FG in mr, the irradiance cache in V-Ray, etc.).
Now with IP you shoot importons for both the primary and secondary rays, but you can't control the primary and secondary ray quality separately!
IMHO this is the big problem right now with interpolated IP: if you want good GI you have to shoot many importons; many importons = many rays to shoot; many rays to shoot + many importons = big render times. So you can get an almost perfect interpolated IP render, but the render times end up very close to the uninterpolated solution!
I'll do some comparisons in the next few weeks so you can see it much better.
So, what's the solution?
IMHO it's to separate the quality options for primary and secondary rays! And there are two ways to do that: there is the photons + FG way (the mray way), and there is the V-Ray (and Turtle and fR and Kray...) way, separating primary and secondary rays in the core!
I've been requesting this feature for at least 3 years now... I can't understand why they never listened! The Turtle developers are much more receptive! So, maybe the mray core has some technical limitation and you can't do that? Let us know, but look for some other solution!
I don't want to turn this thread into another whining thread, Max. I hope you can understand what I'm trying to say: I'm not here to criticize the mr developers, I want to be constructive, I want to give feedback, make feature requests, etc.
But if you don't like this kind of talk I can erase my words and open another thread for it...
thanx
mat
Irradiance Particles 3.7
This algorithm is a novel approach to computing global illumination based on importance sampling, which tends to converge to the desired quality much faster than existing solutions like global illumination photon tracing combined with final gathering.
Before rendering starts, importons are shot from the camera into the scene and collected as a new kind of particle, called an irradiance particle. They carry information about the amount of direct illumination coming in at their position (hence the name "irradiance") and, optionally, the amount of indirect irradiance incident at their position (if indirect passes are enabled). During rendering, the stored particles are used to estimate the irradiance at a shading point: if just direct illumination was collected for irradiance particles, this is equivalent to one bounce of indirect lighting.
The irradiance can also be interpolated from precomputed values at the particle positions.
The irradiance particle algorithm simulates some but not all of the indirect lighting interactions of the traditional global illumination algorithms in mental ray. For this reason, if irradiance particles are enabled then mental ray will turn off the global illumination photon tracing automatically if it was activated. This is a common situation when external applications are asked to generate mental ray scenes with photon shaders attached, which are needed for importons. Caustics can be used together with irradiance particles because they are used to capture indirect lighting effects that irradiance particles cannot simulate. If both final gathering and irradiance particles are enabled then final gathering is preferred and irradiance particles will be switched off automatically.
Irradiance particles support a special IBL-style functionality which can be enabled by setting the number of indirect passes to -1. In this case only the environment map lighting, not the diffuse bounces, is taken into account. If interpolation is disabled then only the environment presampling map is built and no further precomputation steps are required. If interpolation is enabled then particles are emitted in the precomputation pass in the usual way, but are used as interpolation points only.
The irradiance particles feature is controlled by scene options and command line arguments of a standalone mental ray.
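To make the 'indirect passes' idea above concrete, here is a toy Python sketch of the pre-pass (my own simplification, not mental ray's code; direct_at and gather_from are hypothetical callbacks, and the special passes = -1 environment-only mode is not modeled):

def precompute_particle_irradiance(particles, direct_at, gather_from, passes):
    """Accumulate irradiance at the particles before rendering.

    particles:   list of particle positions from the importon pre-pass
    direct_at:   callback returning direct irradiance at a position
    gather_from: callback estimating incoming indirect irradiance at a position
                 from a list of (position, value) pairs, e.g. by gather rays
    passes = 0 stores direct light only (which already gives one bounce of
    indirect light at render time); each extra pass re-gathers from the
    previous values, adding one more diffuse bounce to what the particles carry.
    """
    values = [direct_at(p) for p in particles]             # pass 0: direct only
    for _ in range(max(passes, 0)):
        snapshot = list(zip(particles, values))            # previous pass result
        values = [direct_at(p) + gather_from(p, snapshot) for p in particles]
    return list(zip(particles, values))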