Hello,
I've recently begun using POV-Ray, and although I'm finding solutions to a lot of
problems in the documentation and the newsgroups, there is something I'm finding
difficult to understand. If I use radiosity with an emissive sphere of 1 mm
radius as the light source, then at a distance of 10 mm I measure the brightness
predicted by physics (or so I believe): 1% of the original. But if I deactivate
radiosity and instead use a point light with fade_distance 1 and fade_power 2,
the brightness doubles, while it should be the same.
I'm using ImageJ to measure the brightness in a 16-bit grayscale image
without gamma correction.
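As a sanity check on the arithmetic (plain Python, not part of the scene): naive inverse-square falloff predicts 1% at 10 mm, while the fade attenuation formula given in the POV-Ray documentation, 2/(1 + (d/fade_distance)^fade_power), predicts about 2% at the same distance, which is roughly the doubling I observe:

```python
# Expected brightness at the plane, in scene units (mm here).
d = 10.0             # distance from source to plane
fade_distance = 1.0
fade_power = 2.0

# Naive inverse-square falloff relative to the reference distance:
naive = (fade_distance / d) ** 2  # 0.01 -> 1%

# Fade attenuation as given in the POV-Ray documentation:
#   attenuation = 2 / (1 + (d / fade_distance)^fade_power)
povray = 2.0 / (1.0 + (d / fade_distance) ** fade_power)  # 2/101 -> ~2%

print(naive, povray, povray / naive)  # ratio is ~1.98, nearly double
```

So the factor of two seems built into the fade formula itself, but I don't understand why it is there.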
The code:

#version 3.7;
global_settings {
  ambient_light 0
  //radiosity {count 100000 max_sample -1 gray_threshold 0.0 brightness 1 normal on media on}
  assumed_gamma 1.0
}
camera {orthographic location <5,0,0> look_at <10,0,0> angle 90}
//Source (2 options)
//sphere {<0,0,0> 1 texture {pigment {color rgb <1,1,1>} finish {emission 1 ambient 0 reflection 0 diffuse 0}}}
light_source {<0,0,0> color rgb <1,1,1> fade_distance 1 fade_power 2}
//Surface
plane {<1,0,0> 10
  texture {pigment {color rgb <1,1,1>} finish {emission 0 ambient 0 reflection 0 diffuse 1}}
}
Thanks