Apache wrote:
> And I'm calculating a 12000 samples set. This time I'll get rid of the
> 255x255x255 problem. Maybe I'll try integers: 65535x65535x65535.
You could use two shorts instead of three, and compute the third component with z =
sqrt(1 - x^2 - y^2). Then the memory use would only go up by 33% per sample
(four bytes instead of the current three).
(Besides, it would probably be faster than vnormalizing the whole vector
every time.)
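For illustration, a minimal C sketch of the idea (not from the post; the names
are hypothetical, and it assumes the table holds hemisphere directions with
z >= 0, since the square root only recovers the magnitude of z):

#include <math.h>

/* Hypothetical packed entry: two 16-bit components instead of three. */
typedef struct { short x, y; } PackedDir;   /* 4 bytes vs. 6 for three shorts */

/* Rebuild the full unit vector from one table entry. */
static void unpack_dir(PackedDir p, double v[3])
{
    v[0] = p.x / 32767.0;
    v[1] = p.y / 32767.0;
    /* Third component from the unit-length constraint; the clamp guards
       against rounding pushing the sum slightly above 1. */
    v[2] = sqrt(fmax(0.0, 1.0 - v[0]*v[0] - v[1]*v[1]));
}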
If you get this working, by the way, you could reuse the same table for
focal blur samples, since it is after all a uniform distribution on the
disc.
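Reusing the same packed (x, y) pairs as focal blur offsets could then look
roughly like this (again hypothetical names, building on the PackedDir sketch
above, and taking the post's premise that the pairs are uniformly distributed
over the unit disc; the aperture scaling is an assumption):

/* Scale a packed (x, y) pair into an aperture offset for focal blur. */
static void focal_blur_offset(PackedDir p, double aperture,
                              double *dx, double *dy)
{
    *dx = aperture * (p.x / 32767.0);
    *dy = aperture * (p.y / 32767.0);
}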
Anders
--
#macro E(D)(#if(D<2)D#else#declare I=I+1;mod(pow(.5mod(I 6))*asc(substr(
"X0(1X([\\&Q@TV'YDGU`3F(-V[6Y4aL4XFUTD#N#F8\\A+F1BFO4`#bJN61EM8PFSbFA?C"
I/6 1))2)<1#end)#end#macro R(D,I,T,X,Y)#if(E(D))R(D-1I,T,Y/2X)R(D-1I,T+Y
/2Y/2X)#else box{T T+X+Y pigment{rgb E(2)*9}}#end#end R(10,5z*3-1v*2u*2)