WebGL (Message 11 to 20 of 28)
From: scott
Subject: Re: WebGL
Date: 10 Jun 2016 03:28:32
Message: <575a6c20$1@news.povray.org>
>> I'm sure you've all seen a white sphere with some coloured lights. :-P
>
> OK, here ya go:

Welcome to the dark side...   :-)



From: scott
Subject: Re: WebGL
Date: 10 Jun 2016 03:33:33
Message: <575a6d4d$1@news.povray.org>
> That sounds quite complex. (In particular, it seems to require me to
> actually design a real lens assembly, and implement real refraction.)

Yes, I realised that later too. Google didn't seem to throw up any real 
data on proper lens design (for pretty obvious reasons), only very 
simple examples. Still, it might be interesting, and you never know: if 
you made it slick enough, one of the big lens manufacturers might show 
some interest in you or your code.

> I was thinking more along the lines of a camera entity that fires rays
> in a pattern that matches a theoretical ideal lens. But I need to figure
> out the equations for that first...

Look through the POV source? :-)

> Yeah, not just a random number per pixel, but *multiple* random numbers!
> Multiple, statistically-independent numbers... This is not trivial.

Exactly, poor RNGs in these sorts of things can cause some bizarre 
artefacts.



From: scott
Subject: Re: WebGL
Date: 10 Jun 2016 03:50:04
Message: <575a712c$1@news.povray.org>
>> Obviously it's not, because there are plenty of shadertoy examples
>> showing it:
>>
>> https://www.shadertoy.com/view/4ssGWX
>
> Interesting. On my browser, that just crashes.

Odd. But then you did say half the examples on shadertoy didn't work on 
whatever browser you had. Tried Chrome/IE?

> The difficulty is figuring out what surface was hit just from the
> intersection coordinates.

You could have an intersection struct with surface normal and material 
ID in it as well.
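
Something like this, say (a rough sketch; the field names are just 
placeholders):

struct Isect
{
     float t;        // ray parameter at the hit (negative = miss)
     vec3 normal;    // surface normal at the hit point
     int material;   // material ID, e.g. 0 = ground, 1 = sphere
};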

> And perhaps a perfect reflection isn't so
> hard, but I wanted to have the colour change slightly on each
> reflection... but I can't figure out how to stack up the colour changes
> until you get to the end of the reflection chain, and then unstack them
> again.

You need to keep track of a "cumulative coefficient of reflection" value 
as you go (make it a vec3 if you want coloured reflections):

vec3 CCOR = vec3(1,1,1);
vec3 colour = vec3(0,0,0);
for(int i=0;i<MAX_TRACE_DEPTH;i++)
{
   isect = Raytrace();
   colour += CCOR * isect.diffuse_colour;
   CCOR *= isect.reflection_colour;
}
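
(Unrolled, that loop computes d0 + r0*d1 + r0*r1*d2 + ..., which is just 
the nested d0 + r0*(d1 + r1*(d2 + ...)) multiplied out, so there is no 
stack to unwind. Raytrace() is assumed to continue from the previous hit 
along the reflected direction each time round.)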



From: Orchid Win7 v1
Subject: Re: WebGL
Date: 10 Jun 2016 14:03:43
Message: <575b00ff$1@news.povray.org>
>> Interesting. On my browser, that just crashes.
>
> Odd. But then you did say half the examples on shadertoy didn't work on
> whatever browser you had. Tried Chrome/IE?

Yeah, it seems the smaller examples run just fine, but bigger ones freak 
it out. In particular, browsing the shaders seems to try to run too many 
shaders at once and break Opera. (And once it's broken, only restarting 
the browser will fix it. It just doesn't *try* any more.)

>> The difficulty is figuring out what surface was hit just from the
>> intersection coordinates.
>
> You could have an intersection struct with surface normal and material
> ID in it as well.

Material ID is something I hadn't thought of. (I'm trying to do a 
checkerboard for the ground, which makes calculating the colour... 
interesting.)
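
What I have so far is something like this (a rough sketch, assuming the 
hit point is in world space):

vec3 Checker(vec3 p)
{
     // 3-unit tiles, alternating grey and white
     float u = floor(p.x / 3.0);
     float v = floor(p.z / 3.0);
     return (mod(u + v, 2.0) == 0.0) ? vec3(0.4) : vec3(1.0);
}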

> You need to keep track of a "cumulative coefficient of reflection" value
> as you go (make it a vec3 if you want coloured reflections):
>
> vec3 CCOR = vec3(1,1,1);
> vec3 colour = vec3(0,0,0);
> for(int i=0;i<MAX_TRACE_DEPTH;i++)
> {
> isect = Raytrace();
> colour += CCOR * isect.diffuse_colour;
> CCOR *= isect.reflection_colour;
> }

I'm thinking about the associative/distributive property of the reals, 
though... If one object adds 4% blue and then reflects 95%, and the next 
object adds 6% yellow and reflects 50%, and the final object is green, 
you need

   ((green * 50%) + 6% yellow) * 95% + 4% blue

but the algorithm above gives

   ((4% blue * 95%) + 6% yellow) * 50% + green

which I don't think simplifies to the same result. You'd have to either 
trace all the rays backwards (i.e., from scene to camera), which is 
laughably inefficient, or somehow store a history (array?) of all the 
intermediate steps so you can reverse them...

...then again, I think I'll just give up on reflection, and see if I can 
make the lens work with a checkerboard. :-P



From: Orchid Win7 v1
Subject: Re: WebGL
Date: 10 Jun 2016 14:06:39
Message: <575b01af$1@news.povray.org>
On 10/06/2016 08:33 AM, scott wrote:
>> That sounds quite complex. (In particular, it seems to require me to
>> actually design a real lens assembly, and implement real refraction.)
>
> Yes I realised that later too, Google didn't seem to throw up any real
> data on proper lens design (for pretty obvious reasons), only very
> simple examples. Still it might be interesting, and you never know if
> you made it slick enough one of the big lens manufacturers may show some
> interest in you or your code.

Pfft. I doubt anything I'll ever do will be "slick". But maybe I can 
make something interesting.

(Would be nice if ShaderToy would let you add sliders to control your 
shader!)

>> I was thinking more along the lines of a camera entity that fires rays
>> in a pattern that matches a theoretical ideal lens. But I need to figure
>> out the equations for that first...
>
> Look through the POV source? :-)

Any idea where in the 25,000,000 LoC I should start looking?

(You never know, maybe there's a file named camera.c or something...)

>> Yeah, not just a random number per pixel, but *multiple* random numbers!
>> Multiple, statistically-independent numbers... This is not trivial.
>
> Exactly, poor RNGs in these sorts of things can cause some bizarre
> artefacts.

There are a couple of StackOverflow questions about this. It seems the 
best results are obtained by using an integer hash function... yet 
ShaderToy doesn't seem to support bitwise integer operations, so that's 
kind of not an option.

That said, there's a widely used sin-based method, which gives awful 
results. But if you iterate it a few times... it's not *too* bad.
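
The usual suspect, for reference:

float Rand(vec2 v)
{
     return fract(sin(dot(v.xy, vec2(12.9898, 78.233))) * 43758.5453);
}

"Iterating" just means feeding its output back in as part of the seed a 
few times.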

I'm still waiting to hear how you average consecutive frames.



From: Orchid Win7 v1
Subject: Re: WebGL
Date: 10 Jun 2016 14:10:01
Message: <575b0279$1@news.povray.org>
On 10/06/2016 07:06 PM, Orchid Win7 v1 wrote:
> (You never know, maybe there's a file named camera.c or something...)

There is, in fact, a camera.cpp file.



OMG, look at this:

/// POV-Ray is based on the popular DKB raytracer version 2.12.
/// DKBTrace was originally written by David K. Buck.
/// DKBTrace Ver 2.0-2.12 were written by David K. Buck & Aaron A. Collins.

I am *almost certain* that the man I bought my flat from was called 
Aaron Collins... o_O



From: Orchid Win7 v1
Subject: Re: WebGL
Date: 12 Jun 2016 08:02:44
Message: <575d4f64$1@news.povray.org>
On 10/06/2016 08:24 AM, Orchid Win7 v1 wrote:

> Still trying to work out how you "average over several frames".
> ShaderToy is great fun, but so *utterly* undocumented... it's quite hard
> to figure out how to do anything.

OK, I think I've cracked it:

If you click the "new tab" button, it gives you another shader that 
renders to Buffer A. If you then set iChannel0 to be Buffer A, you can 
do a texture2D(iChannel0, coords) to read the previous frame's pixel value.

Now you just need to set the main image shader to be a texture2D() 
lookup on Buffer A, and you're golden.
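
In skeleton form (a rough sketch; Scene() here is a stand-in for 
whatever you're actually rendering):

// Buffer A, with iChannel0 set to Buffer A itself:
void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
     vec4 prev = texture2D(iChannel0, fragCoord.xy / iResolution.xy);
     vec3 colour = Scene(fragCoord);   // this frame's sample
     float n = float(iFrame) + 1.0;    // number of frames so far
     fragColor = vec4(mix(prev.rgb, colour, 1.0/n), 1.0);
}

The main image shader then just becomes the texture2D() lookup on 
iChannel0.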

Not what you'd call "obvious"...



From: Orchid Win7 v1
Subject: Re: WebGL
Date: 12 Jun 2016 09:30:09
Message: <575d63e1$1@news.povray.org>
On 10/06/2016 07:10 PM, Orchid Win7 v1 wrote:
> On 10/06/2016 07:06 PM, Orchid Win7 v1 wrote:
>> (You never know, maybe there's a file named camera.c or something...)
>
> There is, in fact, a camera.cpp file.

Sadly, it contains the data structures for *describing* the camera, but 
nothing whatsoever to do with *casting rays*.

It seems *that* is in tracepixel.cpp. Looking into it further, it seems 
that TracePixel::CreateCameraRay() has a giant switch-block for every 
possible camera type, but they all end with JitterCameraRay(). This, 
seemingly, is where the focal blur stuff happens.

In full:

void TracePixel::JitterCameraRay(Ray& ray, DBL x, DBL y, size_t ray_number)
{
     DBL xjit, yjit, xlen, ylen, r;
     Vector3d temp_xperp, temp_yperp, deflection;

     r = camera.Aperture * 0.5;

     Jitter2d(x, y, xjit, yjit);
     xjit *= focalBlurData->Max_Jitter * 2.0;
     yjit *= focalBlurData->Max_Jitter * 2.0;

     xlen = r * (focalBlurData->Sample_Grid[ray_number].x() + xjit);
     ylen = r * (focalBlurData->Sample_Grid[ray_number].y() + yjit);

     // Deflect the position of the eye by the size of the aperture, and in
     // a direction perpendicular to the current direction of view.

     temp_xperp = focalBlurData->XPerp * xlen;
     temp_yperp = focalBlurData->YPerp * ylen;

     deflection = temp_xperp - temp_yperp;

     ray.Origin += deflection;

     // Deflect the direction of the ray in the opposite direction we deflected
     // the eye position.  This makes sure that we are looking at the same place
     // when the distance from the eye is equal to "Focal_Distance".

     ray.Direction *= focalBlurData->Focal_Distance;
     ray.Direction -= deflection;

     ray.Direction.normalize();
}

Good luck *ever* figuring out what the hell any of it means, of course...



From: Orchid Win7 v1
Subject: Re: WebGL
Date: 12 Jun 2016 09:48:33
Message: <575d6831$1@news.povray.org>
On 12/06/2016 02:30 PM, Orchid Win7 v1 wrote:
> In full:
>
> void TracePixel::JitterCameraRay(Ray& ray, DBL x, DBL y, size_t ray_number)
> {
> DBL xjit, yjit, xlen, ylen, r;
> Vector3d temp_xperp, temp_yperp, deflection;
>
> r = camera.Aperture * 0.5;
>
> Jitter2d(x, y, xjit, yjit);
> xjit *= focalBlurData->Max_Jitter * 2.0;
> yjit *= focalBlurData->Max_Jitter * 2.0;
>
> xlen = r * (focalBlurData->Sample_Grid[ray_number].x() + xjit);
> ylen = r * (focalBlurData->Sample_Grid[ray_number].y() + yjit);
>
> // Deflect the position of the eye by the size of the aperture, and in
> // a direction perpendicular to the current direction of view.
>
> temp_xperp = focalBlurData->XPerp * xlen;
> temp_yperp = focalBlurData->YPerp * ylen;
>
> deflection = temp_xperp - temp_yperp;
>
> ray.Origin += deflection;
>
> // Deflect the direction of the ray in the opposite direction we deflected
> // the eye position. This makes sure that we are looking at the same place
> // when the distance from the eye is equal to "Focal_Distance".
>
> ray.Direction *= focalBlurData->Focal_Distance;
> ray.Direction -= deflection;
>
> ray.Direction.normalize();
> }
>
> Good luck *ever* figuring out what the hell any of it means, of course...

I'm not entirely sure why that method is so complicated, but it 
*appears* the key part of the algorithm is this:

ray.Origin += deflection;
ray.Direction *= focus_distance;
ray.Direction -= deflection;
ray.Direction.normalize();

The aperture determines the maximum size of deflection, and the focus 
distance is mentioned above. Plugging these two into my shader, I seem 
to be able to get it to produce blurry images focused at a specific 
distance.
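
It makes sense if you write it out: the jittered ray starts at 
S + deflection with direction D*focus_distance - deflection, so at t = 1 
it reaches

     (S + deflection) + (D*focus_distance - deflection)
         = S + D*focus_distance

which doesn't depend on the deflection at all. Every jittered ray for a 
given pixel passes through the same point at the focus distance, so 
geometry there stays sharp while everything nearer or further gets 
smeared across the aperture.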



From: Orchid Win7 v1
Subject: Re: WebGL
Date: 12 Jun 2016 09:53:22
Message: <575d6952@news.povray.org>
On 12/06/2016 02:48 PM, Orchid Win7 v1 wrote:
> I'm not entirely sure why that method is so complicated, but it
> *appears* the key part of the algorithm is this:
>
> ray.Origin += deflection;
> ray.Direction *= focus_distance;
> ray.Direction -= deflection;
> ray.Direction.normalize();
>
> The aperture determines the maximum size of deflection, and the focus
> distance is mentioned above. Plugging these two into my shader, I seem
> to be able to get it to produce blurry images focused at a specific
> distance.

In case anybody cares:

// The classic "fract(sin(dot(...)))" hash: a pseudo-random float in
// [0,1) from a 2D seed.
float Rand(vec2 v)
{
     return fract(sin(dot(v.xy ,vec2(12.9898,78.233))) * 43758.5453);
}

// Iterating the hash a few times hides the worst of its artefacts.
float Rand(vec4 v)
{
     float a = Rand(v.xy);
     float b = Rand(v.zw);
     vec2 ab = vec2(a, b);
     float c = Rand(ab * v.xy);
     float d = Rand(ab * v.zw);
     vec2 cd = vec2(c, d);
     return Rand(ab * cd);
}



struct Ray
{
     vec3 S, D;
};

vec3 RayPoint(in Ray ray, in float t)
{
     return ray.S + ray.D*t;
}

Ray Camera(in vec2 uv)
{
     Ray ray;
     ray.S = vec3(0, 0, -5);
     ray.D = vec3(uv.x, uv.y, 1.0);

     const float Aperture = 0.2;
     const float FocusDistance = 18.0;

     // Two independent random numbers per pixel per frame (a square
     // aperture in the xy plane only needs two).
     float r1 = Rand(vec4(uv, 0, iGlobalTime));
     float r2 = Rand(vec4(uv, 1, iGlobalTime));
     vec3 deflection = Aperture * vec3(r1, r2, 0);

     ray.S += deflection;
     ray.D *= FocusDistance;
     ray.D -= deflection;
     ray.D = normalize(ray.D);

     return ray;
}

struct Plane
{
     vec3 N;
     float D;
};

float IsectPlane(in Ray ray, in Plane plane)
{
     // NP = d
     // NP - d = 0
     // N(Dt + S) - d = 0
     // ND t + NS - d = 0
     // ND t = d - NS
     // t = (d - NS)/ND

     return (plane.D - dot(plane.N, ray.S)) / dot(plane.N, ray.D);
}

struct Sphere
{
     vec3 C;
     float R;
     float R2;
};

Sphere MakeSphere(vec3 center, float radius)
{
     return Sphere(center, radius, radius*radius);
}

float IsectSphere(in Ray ray, in Sphere sphere)
{
     // (P - C)^2 = r^2
     // (P - C)^2 - r^2 = 0
     // ((Dt + S) - C)^2 - r^2 = 0
     // (Dt + S - C)^2 - r^2 = 0
     // (Dt + V)^2 - r^2 = 0
     // D^2 t^2 + 2DVt + V^2 - r^2 = 0

     vec3 V = ray.S - sphere.C;
     float a = dot(ray.D, ray.D);
     float b = 2.0*dot(V, ray.D);
     float c = dot(V, V) - sphere.R2;

     float det = b*b - 4.0*a*c;
     if (det >= 0.0)
     {
         return (0.0 - b - sqrt(det))/(2.0*a);
     }
     else
     {
         return -1.0;
     }
}

float Illuminate(vec3 light, vec3 surface, vec3 normal)
{
     vec3 d = light - surface;
     float i = dot(normalize(d), normalize(normal));
     if (i < 0.0)
     {
         return 0.0;
     }
     return i;
}

vec2 MapScreen(vec2 xy)
{
     return (xy - iResolution.xy/2.0) / iResolution.y;
}

vec2 MapScreenExact(vec2 xy)
{
     return (xy - iResolution.xy/2.0) / iResolution.xy;
}

vec3 ColourGround(in vec3 surface, in vec3 normal)
{
     float u = floor(surface.x / 3.0);
     float v = floor(surface.z / 3.0);
     if (mod(u+v, 2.0) == 0.0)
     {
         return vec3(0.4, 0.4, 0.4);
     }
     else
     {
         return vec3(1.0, 1.0, 1.0);
     }
}

vec3 TraceRay(in Ray ray)
{
     Plane ground = Plane(vec3(0, 1, 0), -5.0);
     Sphere sphere1 = MakeSphere(vec3(MapScreenExact(iMouse.xy)*4.0, 0), 1.0);

     float groundT  = IsectPlane(ray, ground);
     float sphere1T = IsectSphere(ray, sphere1);

     int object = 0;

     if (groundT < 0.0 && sphere1T < 0.0)
     {
         object = 0;
     }

     if (groundT > 0.0 && sphere1T < 0.0)
     {
         object = 1;
     }

     if (groundT < 0.0 && sphere1T > 0.0)
     {
         object = 2;
     }

     if (groundT > 0.0 && sphere1T > 0.0)
     {
         if (groundT < sphere1T)
         {
             object = 1;
         }
         else
         {
             object = 2;
         }
     }

     if (object == 0)
     {
         return vec3(0, 0, 0);
     }

     vec3 surface, normal, colour;

     if (object == 1)
     {
         surface = RayPoint(ray, groundT);
         normal = ground.N;
         colour = ColourGround(surface, normal);
     }

     if (object == 2)
     {
         surface = RayPoint(ray, sphere1T);
         normal = surface - sphere1.C;
         colour = vec3(1, 0, 0);
     }

     float b1 = Illuminate(vec3(0, +10, 0), surface, normal);
     return colour*vec3(b1, b1, b1);
}

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
     vec4 prev = texture2D(iChannel0, fragCoord.xy / iResolution.xy);

     Ray cr = Camera(MapScreen(fragCoord.xy));
     vec3 colour = TraceRay(cr);

     // Running average: weight this frame by 1/iFrame and the previous
     // average by (iFrame-1)/iFrame. (iFrame is 0 on the very first
     // frame, but that bogus value is weighted away again on frame 1.)
     fragColor = vec4(colour/float(iFrame), 1) + prev*(1.0 - 1.0/float(iFrame));
     fragColor = clamp(fragColor, vec4(0, 0, 0, 0), vec4(1, 1, 1, 1));
}



You'll need to configure this for frame averaging: set it up as the 
shader for Buffer A, with iChannel0 = Buffer A. Then set the main Image 
shader to just render Buffer A to the screen.



