Ray Bellis wrote:
>> Very nice images! Path tracing really gives very realistic
>> result with very little tweaking. Nice!
>
> What's a good place to discuss algorithms?
>
> I've written a simple stochastic tracer in Java and I'm struggling with
> doing diffuse inter-reflection, and getting enough rays to hit my ambient
> "light sources".
I think Off-topic is the best place, if you mean this news server. (You
can reply to this post and the follow-up will be posted in the correct
group.)
I have also done my own renderer:
http://www.saunalahti.fi/~sevesalm/ssRay.html
Still a lot of work to be done...
> I think Off-topic is the best place, if you mean this news server.
> (You can reply to this post and the follow-up will be posted in the
> correct group.)
>
> I have also done my own renderer:
>
> http://www.saunalahti.fi/~sevesalm/ssRay.html
>
> Still a lot of work to be done...
Yes - yours was the inspiration for my work.
So far mine only does spheres and planes, though.
However I've abstracted out a lot of stuff so that new shading algorithms
can be implemented as a standalone Java class.
Hence I have standard "Refract" and "Reflect" shaders, but two versions of
"Diffuse" - one that does standard light-source tracing and shadow
detection, and a Monte Carlo one which fires off rays in random directions,
hopefully hitting other objects. These shaders are hence a little like
RenderMan shaders.
It's also fully multithreaded :). Bizarrely the JRE on MacOSX seems
*really* fast compared to other O/S. On my MacBook Pro with a Core2Duo I
get better performance than on my 3.0 GHz Core2Quad XP machine, and only
slightly worse performance than on the 16 core Opteron server at work.
For anti-aliasing I do multiple passes, just throwing rays in at completely
random coordinates.
cheers,
Ray
From: Nicolas Alvarez
Subject: Re: stochastic (monte-carlo) tracing
Date: 11 Sep 2008 22:09:55
Message: <48c9cf73@news.povray.org>
Severi Salminen wrote:
> http://www.saunalahti.fi/~sevesalm/ssRay.html
Still waiting for you to release the code :) Or even a binary! I want to
play with it!
Ray Bellis wrote:
> Yes - yours was the inspiration for my work.
We both have to thank Fidos for path tracing inspiration! I immediately
got interested when he posted the images on this news server.
> So far mine only does spheres and planes, though.
I also have only a few object types. So much to do even with just those.
> However I've abstracted out a lot of stuff so that new shading algorithms
> can be implemented as a standalone Java class.
Sounds like a good approach. I started this project because I wanted to
learn some C++ (and OO programming), so you can imagine I have
restructured my code quite a bit.
> It's also fully multithreaded :). Bizarrely the JRE on MacOSX seems
> *really* fast compared to other O/S. On my MacBook Pro with a Core2Duo I
> get better performance than on my 3.0 GHz Core2Quad XP machine, and only
> slightly worse performance than on the 16 core Opteron server at work.
That kinda sucks if the differences are big. I don't know Java, but it
sounds off if multithreaded performance on XP is significantly slower.
> For anti-aliasing I do multiple passes, just throwing rays in at completely
> random coordinates.
Yeah. That works well. You can improve it (and many other random things)
by not choosing totally random samples but using some pseudo-random
sampling that gives more evenly spread samples.
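To illustrate the idea (a sketch in Java, not code from either tracer): jittered sampling subdivides the pixel into a grid and places one random sample in each cell, which spreads the samples far more evenly than picking them all independently.

```java
import java.util.Random;

public class JitteredSampling {
    // Return n*n sample points in [0,1)^2, one per grid cell, each
    // jittered randomly within its own cell. The samples are spread
    // much more evenly than n*n fully random points, which reduces
    // noise for the same sample count.
    public static double[][] jittered(int n, Random rng) {
        double[][] samples = new double[n * n][2];
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                samples[i * n + j][0] = (i + rng.nextDouble()) / n;
                samples[i * n + j][1] = (j + rng.nextDouble()) / n;
            }
        }
        return samples;
    }
}
```

A fancier quasi-random sequence does even better, but jittering is the simplest step up from pure random sampling.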
I'm now implementing direct lighting calculations. The results are
impressive. So instead of just letting rays bounce randomly and hoping
that they hit a light source, I now evaluate the direct lighting component
at each bounce. The result is that the noise decreases A LOT faster. The
smaller the light sources, the bigger the difference. If I used point
light sources, my original program would never hit them at all.
Still some nasty bugs I have to figure out, but this one is a keeper. The
downside is that I have no idea if I can implement this feature for light
sources of any shape. Right now it works for spheres, as the solid angle
is easy to calculate. We'll see...
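For what it's worth, that sphere solid-angle term is simple enough to sketch (in Java, to match the thread's language; the names here are mine, not from either renderer). The formula assumes the viewpoint is outside the sphere:

```java
public class SphereSolidAngle {
    // Solid angle (in steradians) subtended by a sphere of radius r,
    // seen from a point at distance d from its centre (d >= r).
    // Omega = 2*pi*(1 - sqrt(1 - (r/d)^2)); dividing by 2*pi gives
    // the fraction of the hemisphere the source covers.
    public static double solidAngle(double r, double d) {
        double sinSq = (r / d) * (r / d);
        return 2.0 * Math.PI * (1.0 - Math.sqrt(1.0 - sinSq));
    }
}
```

For a distant small sphere this approaches the familiar pi*r^2/d^2 disc approximation.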
Severi
From: Severi Salminen
Subject: Re: stochastic (monte-carlo) tracing
Date: 12 Sep 2008 11:37:27
Message: <48ca8cb7@news.povray.org>
Nicolas Alvarez wrote:
> Still waiting for you to release the code :) Or even a binary! I want to
> play with it!
Never...NEVER!! It's my precioussss..... :)
Severi Salminen wrote:
> Ray Bellis wrote:
>>> Very nice images! Path tracing really gives very realistic
>>> result with very little tweaking. Nice!
>> What's a good place to discuss algorithms?
>>
>> I've written a simple stochastic tracer in Java and I'm struggling with
>> doing diffuse inter-reflection, and getting enough rays to hit my ambient
>> "light sources".
>
> I think Off-topic is the best place, if you mean this news server. (You
> can reply to this post and the follow-up will be posted in the correct
> group.)
>
> I have also done my own renderer:
>
> http://www.saunalahti.fi/~sevesalm/ssRay.html
>
> Still a lot of work to be done...
Now if you want to get really realistic:
* Instead of shooting RGB rays, shoot photons of a particular wavelength,
selected randomly from the spectrum of the light source at the
emission point;
* at each surface of intersection, calculate the chances of refraction,
reflection, absorption, and possible emission, and either shoot the
photon off at another angle, or count it as absorbed and start a new photon;
* Remember that the angle of refraction is dependent on wavelength.
Voilà: rainbows are built in.
* Do constructive and destructive interference so that you can produce
realistic iridescence.
* A photon isn't drawn on the screen unless it hits the virtual film.
Then you calculate its RGB value, and add that to the pixel at that
point in the screen.
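That first point boils down to inverse-CDF sampling of a tabulated emission spectrum. A minimal sketch (in Java; the class and method names are mine, and a real renderer would tabulate actual wavelengths per bin):

```java
public class SpectrumSampler {
    // Pick a wavelength bin in proportion to its emitted power.
    // 'power' is the relative emission per bin; 'u' is a uniform
    // random number in [0,1). We walk the normalized cumulative
    // distribution until it exceeds u.
    public static int sampleBin(double[] power, double u) {
        double total = 0.0;
        for (double p : power) total += p;
        double cumulative = 0.0;
        for (int i = 0; i < power.length; i++) {
            cumulative += power[i] / total;
            if (u < cumulative) return i;
        }
        return power.length - 1; // guard against rounding when u is ~1
    }
}
```

Bins with three times the power get picked three times as often, so photon counts per wavelength follow the source spectrum automatically.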
Regards,
John
> Sounds a good approach. I started this project cause I wanted to learn
> some C++ (and OO programming). So you can imagine I have re-structured
> my code quite a bit.
I've done a few refactorings on mine too.
The biggest one was when I realised that the bulk of the work in the tracer
used a function with the same parameters as the shaders, so that itself
became a shader. :)
>> For anti-aliasing I do multiple passes, just throwing rays in at
>> completely random coordinates.
>
> Yeah. That works well. You can improve it (and many other random
> things) by not choosing totally random samples but using some
> pseudo-random sampling that gives more evenly spread samples.
Mine does seem to work pretty well as is.
> I'm now implementing direct lighting calculations. The results are
> impressive. So instead of just letting rays bounce randomly and hoping
> that they hit a light source, I now evaluate direct lighting component
> at each bounce. The result is that the noise reduces A LOT faster. If
> the light sources are small, the difference gets bigger. If I used
> point light sources, my original program would never hit them.
I've had this problem - can you please explain this direct lighting
algorithm in more detail?
> Still some nasty bugs I have to figure out but this one is a keeper.
> The downside is that I have no idea if I can implement this feature
> for any kind (shape) of light source. Now it works for spheres as it
> is easy to calculate the solid angle. We'll see...
This sounds promising.
BTW, I implemented simple "rough" reflections earlier today. That algorithm
was trivial:
d = original_reflection_direction
s = surface_smoothness
if (s > 0) {
    pt = random_vector();
    d += s * pt;
    normalize(d);
}
cast_new_ray(d);
I haven't checked that it's mathematically accurate (it's possible that the
distribution of the perturbed ray around the original direction isn't
uniform) but it looks nice enough.
Ray
> BTW, I implemented simple "rough" reflections earlier today. That
> algorithm was trivial:
>
> d = original_reflection_direction
> s = surface_smoothness
> if (s > 0) {
>     pt = random_vector();
>     d += s * pt;
>     normalize(d);
> }
> cast_new_ray(d);
>
> I haven't checked that it's mathematically accurate (it's possible
> that the distribution of the perturbed ray around the original
> direction isn't uniform) but it looks nice enough.
p.s. I've uploaded a sample image to:
http://ray.bellis.googlepages.com/mc-balls-01.jpg
A lot of the noise in that image is from the JPG compression. This is from
1.75 hours of rendering on a MacBook Pro (Core2Duo).
Ray
> * Instead of shooting RGB rays, shoot photons of a particular
> wavelength, selected randomly from the spectrum of the light source at
> the emission point;
Yes. This should be implemented at some point to enable accurate dispersion.
> * at each surface of intersection, calculate the chances of refraction,
> reflection, absorption, and possible emission, and either shoot the
> photon off at another angle, or count it as absorbed and start a new photon;
Done!
> * Do constructive and destructive interference so that you can produce
> realistic iridescence.
Too far away for now...
> * A photon isn't drawn on the screen unless it hits the virtual film.
> Then you calculate its RGB value, and add that to the pixel at that
> point in the screen.
I start the rays from the film plane, so all rays always hit the film. I'm not
sure if anyone does it the other way around or how it works in practice.
> I've had this problem - can you please explain this direct lighting
> algorithm in more detail?
Ok, I'll try. Let's assume we have found the first intersection point along the
ray. Instead of choosing a random new direction (if it is a diffuse surface)
you can do this:
1. Pick a random point on a light source.
2. Check whether it is visible or not.
3. If it is, add the radiance it contributes. You must take the cosine term into
account as well as the ratio of the solid angle of the source to that of the
hemisphere (the latter thus handles the distance of the light source).
4. After this you can continue tracing the path normally. But if the ray next
hits the same light source, ignore it, as otherwise you would count it
twice.
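Steps 1-3 can be sketched for a spherical light like this (a sketch in Java with my own naming, not Severi's actual code; a full renderer would do this per colour channel and fold in the surface's diffuse reflectance):

```java
public class DirectLight {
    // One direct-lighting sample at a diffuse hit point.
    // 'cosTheta' is the cosine between the surface normal and the
    // direction to the sampled light point; 'visible' is the result of
    // the shadow test; the solid-angle / hemisphere ratio handles the
    // distance of the light source.
    public static double directSample(double lightRadiance, double cosTheta,
                                      boolean visible, double lightRadius,
                                      double distance) {
        if (!visible || cosTheta <= 0.0) return 0.0;
        double sinSq = (lightRadius / distance) * (lightRadius / distance);
        double solidAngle = 2.0 * Math.PI * (1.0 - Math.sqrt(1.0 - sinSq));
        double hemisphereRatio = solidAngle / (2.0 * Math.PI);
        return lightRadiance * cosTheta * hemisphereRatio;
    }
}
```

The shadow test and the cosine are computed by the tracer itself; this just combines them with the solid-angle weighting described in step 3.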
Oh, and there are 2 ways to handle multiple light sources:
A. Sample only 1 source from N light sources and multiply the result by N. This
is unbiased and gives the correct result with many passes.
B. Sample all light sources and add the results together.
Example. We have 3 lights that give these values: 0.4, 0.6, 1.0. Let's do 100
passes.
B gives 100*(0.4+0.6+1.0) = 200.
A might give 40*3*0.4+30*3*0.6+30*3*1.0 = 192.
Increasing the number of passes decreases the error. A nice technique that
might give good results if you have many light sources.
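A toy comparison of the two strategies (a sketch; I replace the uniform random pick in A with a round-robin pick so the average comes out exactly, but in a real tracer it would be random and only converge to B's value):

```java
public class LightSampling {
    // Strategy B: per pass, sample every light and sum the contributions.
    public static double strategyB(double[] lights) {
        double sum = 0.0;
        for (double l : lights) sum += l;
        return sum;
    }

    // Strategy A, averaged over 'passes' passes: pick one light per pass
    // (round-robin here, as a deterministic stand-in for a uniform random
    // pick) and multiply its contribution by the number of lights N.
    public static double strategyA(double[] lights, int passes) {
        double total = 0.0;
        for (int pass = 0; pass < passes; pass++) {
            total += lights.length * lights[pass % lights.length];
        }
        return total / passes;
    }
}
```

With lights {0.4, 0.6, 1.0}, strategy B gives 2.0 every pass, and strategy A averages to the same 2.0 whenever each light is picked equally often — which is exactly why it is unbiased.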
Those are the basics. The devil is in the details, and I still have some nasty
bug(s) somewhere.
Severi