I want to simulate a 3D laser scanner using a POV-Ray scene. To do this,
I declare the scene as a union and use trace() from the scanner's
position to generate points on the scene's surfaces.
Now I also want to evaluate the pigment of each point, but I can't find
a good way to do it. trace() itself doesn't return any colour information.
I tried to declare the scene as an array of objects instead of a union,
and then call trace() and eval_pigment() on each object separately.
However, then the trace function "sees through" objects since they are
tested separately. For example, I get points both on a foreground object
and the background behind it, which should be occluded.
Is there another way?
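For reference, the setup described above can be sketched roughly like
this (the objects, scanner position and scan direction are illustrative,
not taken from the actual scene):

```pov
// The whole scene as one union, so trace() handles occlusion.
#declare Scene = union {
  sphere { <0, 0, 2>, 0.5 pigment { rgb <1, 0, 0> } }
  box { <-2, -2, 4>, <2, 2, 5> pigment { rgb <0, 0, 1> } }
}

#declare ScannerPos = <0, 0, -5>;
#declare Norm = <0, 0, 0>;
#declare Dir = vnormalize(<0.1, 0.0, 1.0>);

// trace() returns the hit point and fills Norm with the surface
// normal; a zero-length Norm means the ray missed the scene.
#declare Hit = trace(Scene, ScannerPos, Dir, Norm);
#if (vlength(Norm) != 0)
  #debug concat("hit at ", vstr(3, Hit, ",", 0, 4), "\n")
#end
```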
From: Le Forgeron
Subject: Re: How to trace() and evaluate colour of CSG object?
Date: 2 Nov 2010 05:15:27
Message: <4ccfd6af$1@news.povray.org>
On 02/11/2010 09:05, none wrote:
> I want to simulate a 3D laser scanner using a POV-Ray scene. To do this,
> I declare the scene as a union and use trace() from the scanner's
> position to generate points on the scene's surfaces.
>
> Now I also want to evaluate the pigment of each point, but I can't find
> a good way to do it. trace() itself doesn't return any colour information.
>
> I tried to declare the scene as an array of objects instead of a union,
> and then call trace() and eval_pigment() on each object separately.
> However, then the trace function "sees through" objects since they are
> tested separately. For example, I get points both on a foreground object
> and the background behind it, which should be occluded.
>
> Is there another way?
What about simply rendering your scene with the camera at the scanner
position (using the right camera type for your scanner)?
But note that the "colour" a laser scanner would measure is in fact the
level of the reflected laser beam in the ray direction.
Since a laser beam is a single wavelength of the spectrum, at best you
would get a grey level (in the colour of the laser). In fact, any real
material is going to defeat the modelling unless you change its natural
pigments' colours to the values relevant for the laser's colour. A red
ball under a blue laser is just black (unless a small part of blue, the
part that matches the laser's wavelength, is also reflected by the red
of the ball).
You might want to reduce POV-Ray's quality setting (+Q) so that it
ignores lights and shadows (and maybe other things), unless the only
light source is also at the scanner location, assuming a fixed beam
swept by a moving mirror/prism to perform the scanning.
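As a sketch, such a render-based scan could look like this (the
spherical camera, the resolution and the scanner position are just one
possible choice, not confirmed details):

```pov
// Camera placed at the scanner position; a spherical camera gives
// one pixel per (azimuth, elevation) ray, which matches a rotating
// scanner head. Render with e.g. +W360 +H180 and a reduced +Q
// setting to skip lighting effects.
#declare ScannerPos = <0, 0, -5>;
camera {
  spherical
  angle 360 180          // horizontal and vertical field of view
  location ScannerPos
  look_at ScannerPos + z
}
```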
--
A good Manager will take you
through the forest, no matter what.
A Leader will take time to climb on a
Tree and say 'This is the wrong forest'.
2010-11-02 10:15, Le_Forgeron wrote:
> On 02/11/2010 09:05, none wrote:
>> I tried to declare the scene as an array of objects instead of a union,
>> and then call trace() and eval_pigment() on each object separately.
>> However, then the trace function "sees through" objects since they are
>> tested separately. For example, I get points both on a foreground object
>> and the background behind it, which should be occluded.
>>
>> Is there another way?
>
> What about simply rendering your scene with the camera at the scanner
> position (using the right camera type for your scanner)?
I suppose I could do it that way, although it seems to require a bit of
work. If I understand your idea correctly, it would go like this:
I would first run my POV-Ray macro to generate a text file with all the
"traced" points. Then I would render the scene (with +Q0) to get the
pigment at each pixel. I would first have to adjust the camera
parameters so that each pixel in the image corresponds to one of the
rays that I trace to get the 3D position of a point (not sure how to do
that, but it's probably possible). And then I would write another
program to match the traced points with the colour pixels.
> But note that the "colour" a laser scanner would measure is in fact
> the level of the reflected laser beam in the ray direction.
Yes, that's true. But in this case I actually do want the pigment of the
object that is hit by the ray, and not a simulation of the remission of
the laser beam, so you can actually disregard the application I
described. Sorry for the confusion.
From: Le Forgeron
Subject: Re: How to trace() and evaluate colour of CSG object?
Date: 2 Nov 2010 06:49:40
Message: <4ccfecc4$1@news.povray.org>
>
>> But note that the "colour" a laser scanner would measure is in fact
>> the level of the reflected laser beam in the ray direction.
>
> Yes, that's true. But in this case I actually do want the pigment of the
> object that is hit by the ray, and not a simulation of the remission of
> the laser beam, so you can actually disregard the application I
> described. Sorry for the confusion.
Do you want to know the pigment of the object as seen from the
scanner's camera, or really the pigment of the object (independently of
the lighting conditions)?
The first is an actual render, and in fact you just want the colour of
the ray as influenced by the pigment (and you want to stay away from
+Q0 in that case).
The second is a kind of ambient 1 / reflection 0 / diffuse 0 finish for
every object (dropping any other finish settings), so +Q0 should do it.
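A minimal sketch of the second variant, assuming it is acceptable to
override every object's finish globally:

```pov
// Make every object self-luminous and unaffected by lights, so the
// rendered colour is the raw pigment (POV-Ray 3.6-style finish).
#default {
  finish { ambient 1 diffuse 0 reflection 0 }
}
```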
2010-11-02 11:49, Le_Forgeron wrote:
>>> But note that the "colour" a laser scanner would measure is in fact
>>> the level of the reflected laser beam in the ray direction.
>>
>> Yes, that's true. But in this case I actually do want the pigment of the
>> object that is hit by the ray, and not a simulation of the remission of
>> the laser beam, so you can actually disregard the application I
>> described. Sorry for the confusion.
>
> Do you want to know the pigment of the object as seen from the
> scanner's camera, or really the pigment of the object (independently
> of the lighting conditions)?
I want to know the pigment of the object, independent of any lighting.
Actually, what I really want is to know which object the ray hit. I just
imagine that using different pigments for different objects and looking
at the pigment at the end of each ray would be an easy way to accomplish
that.
I just thought of another workaround to accomplish what I want. Given an
array of objects, I can trace() a ray to each of them and only consider
the closest hit. That would take care of occlusions.
Thanks a lot for your effort to help!
On 2010-11-02 07:43, none wrote:
> 2010-11-02 11:49, Le_Forgeron skrev:
>>>> But note that the "colour" a laser scanner would measure is in fact
>>>> the level of the reflected laser beam in the ray direction.
>>>
>>> Yes, that's true. But in this case I actually do want the pigment of the
>>> object that is hit by the ray, and not a simulation of the remission of
>>> the laser beam, so you can actually disregard the application I
>>> described. Sorry for the confusion.
>>
>> Do you want to know the pigment of the object as seen from the
>> scanner's camera, or really the pigment of the object (independently
>> of the lighting conditions)?
>
> I want to know the pigment of the object, independent of any lighting.
>
> Actually, what I really want is to know which object the ray hit. I just
> imagine that using different pigments for different objects and looking
> at the pigment at the end of each ray would be an easy way to accomplish
> that.
>
> I just thought of another workaround to accomplish what I want. Given an
> array of objects, I can trace() a ray to each of them and only consider
> the closest hit. That would take care of occlusions.
For that, you can use vlength() and retain the one with the smallest
value. Start with a large test value, like 1e9.
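Sketched in SDL, that closest-hit loop could look like this (the array
name, scanner position and ray direction are illustrative):

```pov
// Trace each object separately and keep only the nearest hit, so
// foreground objects correctly occlude the background.
#declare Best = 1e9;              // large initial distance
#declare BestHit = <0, 0, 0>;
#declare BestIndex = -1;
#declare I = 0;
#while (I < dimension_size(Objects, 1))
  #declare Norm = <0, 0, 0>;
  #declare Hit = trace(Objects[I], ScannerPos, Dir, Norm);
  #if (vlength(Norm) != 0)            // non-zero normal means a hit
    #declare D = vlength(Hit - ScannerPos);
    #if (D < Best)
      #declare Best = D;
      #declare BestHit = Hit;
      #declare BestIndex = I;
    #end
  #end
  #declare I = I + 1;
#end
// BestIndex now identifies the object hit first, or -1 for a miss.
```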
>
> Thanks a lot for your effort to help!
Do the trace against the union. Then do an eval_pigment() against the
pigment of the object, including any transform applied after the
texture.
That way, you find the first intersection point, then you evaluate the
texture at that point, ignoring any lighting and finish.
You may still need to test against the individual objects if the
various components have different textures, and check for the one that
returns the same coordinates and normal as the trace against the whole
union.
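A rough sketch of that two-step approach (the object and pigment arrays
and the matching tolerance are assumptions; eval_pigment() comes from
the standard include file functions.inc):

```pov
#include "functions.inc"   // provides eval_pigment()

// Step 1: trace against the whole union for correct occlusion.
#declare Norm = <0, 0, 0>;
#declare Hit = trace(Scene, ScannerPos, Dir, Norm);
#if (vlength(Norm) != 0)
  // Step 2: find the component that returns the same hit point and
  // evaluate its pigment there (pigments kept in a parallel array).
  #declare I = 0;
  #while (I < dimension_size(Objects, 1))
    #declare N2 = <0, 0, 0>;
    #declare H2 = trace(Objects[I], ScannerPos, Dir, N2);
    #if (vlength(N2) != 0 & vlength(H2 - Hit) < 1e-6)
      #declare Colour = eval_pigment(Pigments[I], Hit);
    #end
    #declare I = I + 1;
  #end
#end
```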
Alain