POV-Ray : Newsgroups : povray.newusers : povray output as a film/CCD
povray output as a film/CCD (Message 1 to 10 of 10)
From: spitz
Subject: povray output as a film/CCD
Date: 3 Jul 2012 15:15:01
Message: <web.4ff343d84f92bc799b0928190@news.povray.org>
I'm just wondering if there have been any updates to POV-Ray since this 2004
thread with regard to acting as a film/CCD screen...?

http://news.povray.org/povray.newusers/thread/%3C41ab944a@news.povray.org%3E/

Instead of using POV-Ray's predefined cameras, I would like to use my own stack
of lenses to play with focus and depth of field. Any examples would be extremely
helpful.



From: clipka
Subject: Re: povray output as a film/CCD
Date: 3 Jul 2012 17:58:51
Message: <4ff36b1b$1@news.povray.org>
On 03.07.2012 21:11, spitz wrote:
> I'm just wondering if there has been any updates to POV-ray since this 2004
> thread in regards to acting as a film/CCD screen...?
>
> http://news.povray.org/povray.newusers/thread/%3C41ab944a@news.povray.org%3E/

Have a look at the new mesh camera feature; it might be of use in such a 
context.



From: spitz
Subject: Re: povray output as a film/CCD
Date: 5 Jul 2012 10:30:00
Message: <web.4ff5a44cfaf6826b9b0928190@news.povray.org>
Are there any examples using this feature? Maybe a lens projecting the image of
an illuminated 3D scene onto the mesh (film/CCD), showing depth of field.


clipka <ano### [at] anonymousorg> wrote:
> On 03.07.2012 21:11, spitz wrote:
> > I'm just wondering if there has been any updates to POV-ray since this 2004
> > thread in regards to acting as a film/CCD screen...?
> >
> > http://news.povray.org/povray.newusers/thread/%3C41ab944a@news.povray.org%3E/
>
> Have a look at the new mesh camera feature; it might be of use in such a
> context.



From: clipka
Subject: Re: povray output as a film/CCD
Date: 6 Jul 2012 05:06:58
Message: <4ff6aab2$1@news.povray.org>
On 05.07.2012 16:28, spitz wrote:
 > [mesh camera]
> Are there any examples using this feature? Maybe a lens projecting the image of
> an illuminated 3D scene on the mesh(film/CCD) showing depth of field.

I have to hand that question over to the people who have more experience
with the feature; I only know that it exists, along with a gut feeling that
it might be relevant for your task.



From: Jaime Vives Piqueres
Subject: Re: povray output as a film/CCD
Date: 6 Jul 2012 07:32:43
Message: <4ff6ccdb$1@news.povray.org>
On 06/07/12 11:06, clipka wrote:
> On 05.07.2012 16:28, spitz wrote:
>> [mesh camera] Are there any examples using this feature? Maybe a
>> lens projecting the image of an illuminated 3D scene on the
>> mesh(film/CCD) showing depth of field.
>
> I have to hand that question over to the people who have more
> experience with the feature; I only know it does exist, with a gut
> feeling that it might be relevant for your task.

   As one of the few who have played with it, I'm not sure it can serve
his purpose, that is, to play with depth of field and focus; at least
not in a realistic way. In any case, here are the results of my tests
with this feature:

http://www.ignorancia.org/en/index.php?page=mesh-camera

   As you can see at the end, I'm guessing that some sort of "custom focal
blur" could be achieved, but I couldn't figure out a way to do it.

   I'm not at all an expert on optics (in fact I know next to nothing
about the topic), but I think that just modeling the lenses as meshes
and using them as mesh cameras will not do the trick... standard focal
blur doesn't work with mesh cameras (and in any case it's not related to
real-world f-stops).

   Again, just guessing, but I have the feeling that what he needs is
something like the camera implementation in the Lux renderer.

--
Jaime



From: spitz
Subject: Re: povray output as a film/CCD
Date: 6 Jul 2012 15:15:01
Message: <web.4ff73843faf6826b9b0928190@news.povray.org>
There was a nice thread on LuxRender about this:

http://www.luxrender.net/forum/viewtopic.php?f=14&t=4766

In this case the CCD sensor, or image plane, is simply a plane with a translucent
material, with transmission set to 1 and reflection set to 0. An orthographic
camera was used to sample this image plane.

Would this be possible in POV-Ray...?

Jaime Vives Piqueres <jai### [at] ignoranciaorg> wrote:
> On 06/07/12 11:06, clipka wrote:
> > [...]
>    As one of the few who played with it, I'm not sure it could serve
> for his purpose, that is, to play with depth of field and focus.
> [...]



From: clipka
Subject: Re: povray output as a film/CCD
Date: 6 Jul 2012 19:39:26
Message: <4ff7772e@news.povray.org>
On 06.07.2012 21:11, spitz wrote:
> There was a nice thread on LuxRender about this
>
> http://www.luxrender.net/forum/viewtopic.php?f=14&t=4766
>
> In this case the CCD sensor or image plane simply is a plane with translucent
> material with transmission set to 1 and reflection set to 0. An orthographic
> camera was used to sample this image plane.
>
> Would this be possible in POV-Ray...?

In this quality? Pretty likely so.

The key issue here is how to model the sensor; I can think of two 
approaches:


(A) Use a non-solid object (e.g. a disc, or a union of two triangles);
give it a perfectly white pigment, no highlights, no reflection, and
"diffuse 0.0, 1.0" (sic!). This syntax, introduced in POV-Ray 3.7,
specifies that 0% of incident illumination should be scattered /back/
diffusely (first parameter), while 100% should be scattered /through/
the surface instead (second parameter); this ensures your sensor does
not scatter light back into the scene.

To actually make use of this approach you'll need radiosity, with
special settings that ensure you get a lot of radiosity samples;
"maximum_reuse" is of particular importance in this context and should
be very low. Radiosity sample density will be a problem though (it
limits the resolution of your sensor), and you might end up with a high
memory footprint.
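A minimal sketch of such a sensor object, assuming POV-Ray 3.7; the geometry, colors, and radiosity values here are illustrative guesses, not tested settings:

```pov
// Hypothetical sensor plate for approach (A).
#declare Sensor =
  disc {
    <0, 0, 0>, -z, 0.5          // center, facing normal, radius
    pigment { rgb 1 }           // perfectly white
    finish {
      diffuse 0.0, 1.0          // 0% scattered back, 100% scattered through
      specular 0                // no highlights
      reflection 0              // no reflection
    }
  }

global_settings {
  radiosity {
    count 400                   // many samples per point (assumed value)
    maximum_reuse 0.005         // very low, as discussed above
  }
}
```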


(B) Actually, the "sensor" used in the LuxRender example does nothing
but apply fairly random perturbations to the rays passing through it. It
so happens that POV-Ray has a dedicated feature to perturb camera rays, 
aptly referred to as "Camera Ray Perturbation" in the docs, which works 
by adding a "normal" statement to the camera statement. If the specified 
perturbation is fine enough and has a suitable distribution, and you 
make generous use of anti-aliasing, it'll make the /perfect/ 
sensor-imitation screen for your camera: It'll cause no backscatter at 
all, you don't have to worry about positioning it relative to the 
camera, you don't need to use radiosity if you don't like it, and so on.


That said, I'm not yet sure what formula is used for the perturbation 
effect (whether it is equivalent to looking at a correspondingly 
perturbed mirror, or whether the ray directions are modified directly as 
if they were the normals), nor what kind of angular dependency is 
realistic for a CCD's response to incoming light (a Lambertian law might 
be a first approximation, but I guess it's not that simple), let alone 
what pattern would most faithfully model the scattering effect. But for 
starters I'd try with "normal { bumps 0.5 }" or something along these 
lines. There's always the possibility of changing the pattern later to 
make it more realistic once you're convinced that it's worth the effort 
and have figured out what actually /is/ realistic anyway.
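As a concrete starting point, the suggestion above might look like this in a scene file; the pattern, amount, and scale are guesses to experiment with, not known-good values:

```pov
// Approach (B): perturb the camera rays directly.
camera {
  perspective
  location <0, 1, -5>
  look_at  <0, 1, 0>
  // Fine-grained, roughly random perturbation of the camera rays;
  // both the bump amount and the scale are assumptions to tune.
  normal { bumps 0.5 scale 0.01 }
}
// Render with generous anti-aliasing, e.g.:  +A0.1 +R3
```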


As another alternative, you could try a very out-of-focus focal blur,
setting the focal point maybe halfway between the camera and the lens,
and "aperture" to the difference between the camera's diagonal size and
the lens' diameter. I don't know whether this is a suitable model for
the angular dependency of a CCD's response though, and you can't tweak
it as much as you can with the camera perturbation approach.
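A sketch of this focal-blur variant, with the lens assumed at the origin and all distances and values made up for illustration:

```pov
// Out-of-focus focal blur standing in for the sensor response.
camera {
  location    <0, 0, -4>        // "sensor" position (assumed)
  look_at     <0, 0, 0>         // lens assumed at the origin
  focal_point <0, 0, -2>        // halfway between camera and lens
  aperture 0.8                  // ~ sensor diagonal minus lens diameter (toy value)
  blur_samples 100              // more samples -> smoother blur, slower render
}
```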


Oh, and then I just read about the mesh camera: You /can/ use it for 
your purposes, using distribution type 0. Use multiple meshes, each 
defining the same points from which to shoot a ray, but use different 
randomized surface normals for each mesh to jitter the directions in 
which rays are shot (you can use the very same normal for all triangles 
in a mesh, or randomize them even within one mesh). This gives you more 
direct control over the sample directions as compared to the perturbed 
camera approach. See the docs or the wiki for more details on mesh cameras.
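A toy illustration of the multi-mesh idea, assuming the distribution type 0 convention that each triangle corresponds to a pixel and each mesh contributes one ray per pixel; a real setup would generate the full triangle grid with a macro rather than listing triangles by hand:

```pov
// Two meshes sharing the same points but with different smooth normals,
// so each pixel is sampled along two slightly different directions.
camera {
  mesh_camera {
    2                           // rays per pixel: one per mesh below
    0                           // distribution type 0
    mesh {                      // first sample direction: straight ahead
      smooth_triangle { <0,0,0>, z, <1,0,0>, z, <0,1,0>, z }
    }
    mesh {                      // second direction: slightly tilted normals
      smooth_triangle { <0,0,0>, <0.02,0,1>, <1,0,0>, <0.02,0,1>, <0,1,0>, <0.02,0,1> }
    }
  }
  location <0, 0, 0>
  direction z
}
```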



From: spitz
Subject: Re: povray output as a film/CCD
Date: 13 Jul 2012 18:15:01
Message: <web.50009d33faf6826b9b0928190@news.povray.org>
I'll look more into the mesh camera. My goal is to simulate the Lytro camera,
which consists of a lens and a micro-lens array in front of the CCD. Once I get
the simulated CCD image I plan to use MATLAB to perform digital refocusing.
Do you think this is feasible using POV-Ray...?


clipka <ano### [at] anonymousorg> wrote:
> On 06.07.2012 21:11, spitz wrote:
> > [...]
> In this quality? Pretty likely so.
>
> The key issue here is how to model the sensor; I can think of two
> approaches:
> [...]



From: clipka
Subject: Re: povray output as a film/CCD
Date: 13 Jul 2012 18:30:10
Message: <5000a172$1@news.povray.org>
On 14.07.2012 00:12, spitz wrote:
> I'll look more into the mesh camera. My goal is to simulate the Lytro camera
> which consists of a lens and a micro lens array in front of the CCD. Once I get
> the simulated CCD image I plan to use matlab for performing digital refocusing.
> Do you think this is feasible using POV-Ray...?

Quite possible.



From: Jaime Vives Piqueres
Subject: Re: Plenoptic camera (was: povray output as a film/CCD)
Date: 28 Dec 2012 11:30:23
Message: <50ddc91f$1@news.povray.org>

> I'll look more into the mesh camera. My goal is to simulate the Lytro camera
> which consists of a lens and a micro lens array in front of the CCD. Once I get
> the simulated CCD image I plan to use matlab for performing digital refocusing.
> Do you think this is feasible using POV-Ray...?

   Well, at the time I didn't know what the $%&$& you were talking
about... and surely I was too busy to get interested in "yet another
POV-Ray experiment" that sounded too difficult. :)

   But recently I received a personal request to help with this very
same subject, and the fact is that this time I got interested... I'm
not sure why. So I did a bit of my usual chaotic research, and finally
figured out something that looks like a solution, using the mesh
camera of course.

   See the result on p.b.images, which I think simulates the raw output
of a plenoptic camera (like the Lytro)... or so I think, because I can't
say for sure: the info on plenoptic cameras is either very dense or too
superficial, and there seems to be no software available with which to
try that "refocusing" thing on my own images.

   In case anyone is interested, here are the sources:

http://www.ignorancia.org/en/uploads/experiments/plenoptic/plenoptic-camera-2.zip

   I would appreciate it if someone with real knowledge of the matter
could tell me whether I got it right, or at least not totally wrong...

   Regards,

--
Jaime


