Hi, I was wondering if it is possible to speed up the raytraced focal
blur by using methods similar to those used with media.
--
Samuel Benge
E-Mail: STB### [at] aolcom
Visit my isosurface tutorial at http://members.aol.com/stbenge
SamuelT <STB### [at] aolcom> wrote:
: Hi, I was wondering if it is possible to speed up the raytraced focal
: blur by using methods similar to those used with media.
It is done already. That's why you get that grainy focal blur.
The only way I know to get non-grainy focal blur is to set variance to 0.
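Roughly, the idea behind the variance cutoff is something like the following
(a simplified sketch of my own, not the actual POV-Ray source;
sample_aperture() is just a placeholder for tracing one jittered ray through
the aperture):

#include <stdio.h>
#include <stdlib.h>

/* placeholder: luminance of one jittered aperture sample */
static double sample_aperture(void)
{
    return (double)rand() / RAND_MAX;
}

static double blurred_pixel(int blur_samples, double variance_threshold)
{
    double sum = 0.0, sum_sq = 0.0;
    int n = 0;

    while (n < blur_samples) {
        double s = sample_aperture();
        sum    += s;
        sum_sq += s * s;
        n++;

        if (n >= 4) {                         /* test only after a few samples */
            double mean = sum / n;
            double var  = sum_sq / n - mean * mean;
            if (var < 0.0) var = 0.0;         /* guard against rounding */
            if (var / n < variance_threshold) /* variance of the mean */
                break;                        /* looks converged: stop early */
        }
    }
    return sum / n;                           /* average of the rays traced */
}

int main(void)
{
    /* with variance 0 the test never triggers, so all 100 rays get traced */
    printf("%f\n", blurred_pixel(100, 0.0));
    return 0;
}

With a non-zero threshold the loop can bail out after only a handful of rays,
which is where the grain comes from; variance 0 forces every blur sample to be
traced.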
--
main(i,_){for(_?--i,main(i+2,"FhhQHFIJD|FQTITFN]zRFHhhTBFHhhTBFysdB"[i]
):5;i&&_>1;printf("%s",_-70?_&1?"[]":" ":(_=0,"\n")),_/=2);} /*- Warp -*/
Sorry, I didn't make my question clear. I meant: is there a way to apply the
MegaPOV media sampling methods (e.g. method 1, method 2) to focal blur?
Warp wrote:
> SamuelT <STB### [at] aolcom> wrote:
> : Hi, I was wondering if it is possible to speed up the raytraced focal
> : blur by using methods similar to those used with media.
>
> It is done already. That's why you get that grainy focal blur.
> The only way I know to get non-grainy focal blur is to set variance to 0.
>
> --
> main(i,_){for(_?--i,main(i+2,"FhhQHFIJD|FQTITFN]zRFHhhTBFHhhTBFysdB"[i]
> ):5;i&&_>1;printf("%s",_-70?_&1?"[]":" ":(_=0,"\n")),_/=2);} /*- Warp -*/
--
Samuel Benge
E-Mail: STB### [at] aolcom
Visit my isosurface tutorial at http://members.aol.com/stbenge
SamuelT <STB### [at] aolcom> wrote:
: Sorry, I didn't make my question clear. I meant: is there a way to apply the
: MegaPOV media sampling methods (e.g. method 1, method 2) to focal blur?
Media and focal blur are different things. Comparing it to antialiasing (which
has two methods) might be a better analogy.
Still, I think focal blur already uses some kind of antialiasing method.
--
main(i,_){for(_?--i,main(i+2,"FhhQHFIJD|FQTITFN]zRFHhhTBFHhhTBFysdB"[i]
):5;i&&_>1;printf("%s",_-70?_&1?"[]":" ":(_=0,"\n")),_/=2);} /*- Warp -*/
SamuelT wrote:
>
> Hi, I was wondering if it is possible to speed up the raytraced focal
> blur by using methods similar to those used with media.
>
I believe something similar could be applied to focal blur. In theory.
The mechanism would be more akin to adaptive antialiasing: like AA, focal blur
samples are distributed over an area, whereas media samples are distributed
along a single line.
Here are some thoughts (a.k.a. insomniac rambling):
The adaptive mechanism needs some initial set of samples which it can then
start to compare and subdivide: intervals for method 3 media, the pixel corners
for adaptive AA. If focal blur were to use adaptive sampling, the blur_samples
keyword should control this minimum number of samples.
A suitable method is needed for distributing the samples and selecting the
ones to compare. It should not be too complex or slow, but it should give a
reasonably even distribution of samples.
The seemingly most straightforward approach is the system used in adaptive AA:
take 4 samples, compare them, take subsamples between them if necessary, and
repeat recursively. One thing to consider is the greater dispersion of focal
blur samples: an initial 2x2 grid might be inadequate. The blur_samples value
could control the initial grid size, much like antialias_depth: blur_samples 5
would give a 5x5 initial grid. Sampling this grid would then be the equivalent
of adaptive AA sampling a 4x4 pixel image.
This method is not ideally suited for focal blur: mapping distortions cause
the results to be somewhat biased. This shouldn't have a severe effect, though,
except perhaps with extreme blurring. Some random jitter might help, too.
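To make that a bit more concrete, here is a rough C sketch of the
grid-plus-recursion idea, jitter included (throwaway code of my own, not
anything from POV-Ray or MegaPOV; trace_lens_ray() is a dummy stand-in for
shooting one ray through lens point (u,v)):

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

/* dummy stand-in: luminance of a ray through lens coordinates (u,v) */
static double trace_lens_ray(double u, double v)
{
    return 0.5 + 0.5 * sin(10.0 * u) * cos(10.0 * v);
}

/* small random offset, +/- a quarter of a cell */
static double jitter(double cell)
{
    return cell * ((double)rand() / RAND_MAX - 0.5) * 0.5;
}

/* Recursively subdivide a cell whose corner values differ too much. */
static double sample_cell(double u0, double v0, double size,
                          double c00, double c10, double c01, double c11,
                          double threshold, int depth)
{
    double lo = fmin(fmin(c00, c10), fmin(c01, c11));
    double hi = fmax(fmax(c00, c10), fmax(c01, c11));

    if (depth == 0 || hi - lo < threshold)
        return (c00 + c10 + c01 + c11) * 0.25;

    /* sample the centre and the four edge midpoints, recurse into subcells */
    double h  = size * 0.5;
    double cm = trace_lens_ray(u0 + h,    v0 + h);
    double ct = trace_lens_ray(u0 + h,    v0);
    double cb = trace_lens_ray(u0 + h,    v0 + size);
    double cl = trace_lens_ray(u0,        v0 + h);
    double cr = trace_lens_ray(u0 + size, v0 + h);

    return 0.25 *
        ( sample_cell(u0,     v0,     h, c00, ct, cl, cm, threshold, depth - 1)
        + sample_cell(u0 + h, v0,     h, ct, c10, cm, cr, threshold, depth - 1)
        + sample_cell(u0,     v0 + h, h, cl, cm, c01, cb, threshold, depth - 1)
        + sample_cell(u0 + h, v0 + h, h, cm, cr, cb, c11, threshold, depth - 1));
}

/* grid plays the role blur_samples would: an NxN initial grid over the lens */
static double blurred_pixel(int grid, double threshold, int max_depth)
{
    double cell = 1.0 / grid, total = 0.0;
    int i, j;

    for (j = 0; j < grid; j++)
        for (i = 0; i < grid; i++) {
            double u = i * cell + jitter(cell);
            double v = j * cell + jitter(cell);
            double c00 = trace_lens_ray(u,        v);
            double c10 = trace_lens_ray(u + cell, v);
            double c01 = trace_lens_ray(u,        v + cell);
            double c11 = trace_lens_ray(u + cell, v + cell);
            total += sample_cell(u, v, cell, c00, c10, c01, c11,
                                 threshold, max_depth);
        }
    return total / (grid * grid);
}

int main(void)
{
    printf("%f\n", blurred_pixel(5, 0.1, 3));  /* 5x5 initial grid */
    return 0;
}

Corner samples are not shared between neighbouring cells here, so it wastes
rays; a real implementation would cache them.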
Another possibility would be to distribute the samples over a spherical
surface. This should eliminate any bias, since the sample rays would have
constant spacing. But uniform coverage of a sphere is a bit tricky, as is
sorting the samples into sets for comparison and subdivision. I'm not sure the
results would be worth the effort.
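For what it's worth, strictly constant spacing would probably need something
like a geodesic arrangement of points; the quick-and-dirty alternative is a
uniform random density over the spherical cap, which is just the standard
formula (the sketch below is my own code, not anything from POV-Ray). Sorting
the resulting samples into sets for comparison is the part I have no good
answer for.

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

typedef struct { double x, y, z; } vec3;

/* direction distributed uniformly (by solid angle) in a cone around +Z */
static vec3 cone_direction(double max_theta)
{
    double u = (double)rand() / RAND_MAX;
    double v = (double)rand() / RAND_MAX;

    double cos_t = 1.0 - u * (1.0 - cos(max_theta));
    double sin_t = sqrt(1.0 - cos_t * cos_t);
    double phi   = 2.0 * M_PI * v;

    vec3 d = { sin_t * cos(phi), sin_t * sin(phi), cos_t };
    return d;
}

int main(void)
{
    int i;
    for (i = 0; i < 5; i++) {
        vec3 d = cone_direction(0.1);   /* ~5.7 degree half-angle */
        printf("%f %f %f\n", d.x, d.y, d.z);
    }
    return 0;
}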
Rejoice! You have wasted minutes of your life on this nonsense.
I will not be held answerable.
--
Margus Ramst
Personal e-mail: mar### [at] peakeduee
TAG (Team Assistance Group) e-mail: mar### [at] tagpovrayorg