As you have probably heard before, I have a POV-Ray renderfarm. When I
rendered the cups scene from the "waste of time!" thread on p.b.i, I had
some tiles take 3 seconds (pure plain floor; most of the time was spent
decompressing the photon map, not in POV-Ray itself), and others take 15
minutes. I don't like the wasted overhead with super-short tiles. But I
can't just increase the tile size, because then some tiles may be so
slow that they end up hanging around while lots of other computers sit idle.
So... I need a way to split tiles to get somewhat evenly distributed
render time. I know it's impossible to make them all take exactly the
same time, but at least I'd like them all within the same order of
magnitude ;)
I made a script that uses a quad tree. It reads a histogram image
generated by POV-Ray, splits it into four, and sums the pixel values in
each of the four areas. Then it checks which area has the highest sum
and splits that one once more. Then it calculates the sums again for
each area (7 areas at this level), takes the highest, and splits it. The
stop condition could be anything: the number of tiles, the smallest tile
size, or the speed of the slowest tile.
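A minimal sketch of that quad-tree splitting in Python (the function
names and the plain list-of-lists histogram representation are my own
assumptions, not taken from the actual script):

```python
import heapq

def split_tiles(hist, max_tiles, min_size=2):
    """Greedily split the tile with the highest histogram sum
    until max_tiles tiles exist (a sketch, not the real script)."""
    img_h, img_w = len(hist), len(hist[0])

    def rsum(x, y, w, h):
        # Sum of histogram values inside the rectangle.
        return sum(v for row in hist[y:y + h] for v in row[x:x + w])

    # heapq is a min-heap, so store the negated sum to pop the slowest tile.
    tiles = [(-rsum(0, 0, img_w, img_h), 0, 0, img_w, img_h)]
    while len(tiles) < max_tiles:
        neg, x, y, w, h = heapq.heappop(tiles)
        if w < 2 * min_size or h < 2 * min_size:
            # The slowest tile is too small to split further: stop.
            heapq.heappush(tiles, (neg, x, y, w, h))
            break
        hw, hh = w // 2, h // 2
        # Replace the tile with its four quadrants.
        for qx, qy, qw, qh in ((x, y, hw, hh),
                               (x + hw, y, w - hw, hh),
                               (x, y + hh, hw, h - hh),
                               (x + hw, y + hh, w - hw, h - hh)):
            heapq.heappush(tiles, (-rsum(qx, qy, qw, qh), qx, qy, qw, qh))
    return [(x, y, w, h) for _, x, y, w, h in tiles]
```

Each iteration replaces the current slowest tile with its four
quadrants, so the tile count grows by three per split until the stop
condition is hit.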
And uhm... Seemed to work fine.
http://stuff.povaddict.com.ar/split.png
http://stuff.povaddict.com.ar/split2.png
Sort of.
I noticed that this way I definitely get rid of all tiles slower than a
certain value, but I don't really get evenly distributed times. In one
test, the fastest tile had a sum of 3210 (the sum of pixels in the
histogram image, in some unspecified unit) and the slowest had 61467.
That's too much variation: it could very well mean 2 minutes and 40
minutes respectively.
The problem seems to be when splitting a slow area that contains one
fast area and one very slow area (relatively, in pixels per second; of
course it's faster than the big tile it belongs to!).
Any ideas? I'll post some more images later.
Post a reply to this message
You already have the histogram rendered with some quick settings or a
previous full render and want to use it to divide the work?
Deciding the perfect division is going to be a 'hard' problem. So you
need a 'near enough' heuristic.
Allowing any rectangular subdivision is also going to lead to difficulty
determining the best packing.
So how about this simple approach:
- Define minT and maxT as the minimum and maximum 'time', in those
arbitrary units, that you want a block to take. E.g. minT = 5000, maxT =
10000. You could probably come up with a rule of thumb for these based
on the total time estimate for the entire image and the number of
available processors.
- Consider scan lines and proceed left to right
- If T(line) < maxT
  - If (T(line) + T(line+1)) < maxT, then accumulate line, line+1 (etc.)
    into a single block as long as the total stays under maxT
  - Else output the line as a block
- Else split the line
- When splitting a line, accumulate T(pixel) until the part is > minT and <
maxT. You could even split it as close to proportionally as possible. For
example, if T(line) = 6500 then you would want parts such that T(part1) ~
3250 and T(part2) ~ 3250, rather than T(part1) = 5000 and T(part2) = 1500.
You could get some blocks that take less than minT but probably not
many. You'll only get blocks that take more than maxT if a single pixel
is the cause and there is nothing to be done about that.
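A sketch of this heuristic in Python, assuming per-pixel times are
available as a 2D list (the function name and the block representation
are mine; minT is implicit here because the proportional split already
avoids tiny leftover parts):

```python
def scanline_blocks(times, max_t):
    """Greedy scan-line blocking: merge whole lines into a block while
    the total stays under max_t; split a line heavier than max_t into
    roughly proportional parts (a sketch of the suggested heuristic)."""
    blocks = []                      # each block: (y, height, x, width)
    width = len(times[0])
    y = 0
    while y < len(times):
        line_t = sum(times[y])
        if line_t <= max_t:
            # Accumulate following lines while the block still fits.
            total, h = line_t, 1
            while y + h < len(times) and total + sum(times[y + h]) <= max_t:
                total += sum(times[y + h])
                h += 1
            blocks.append((y, h, 0, width))
            y += h
        else:
            # Split the line into parts of roughly equal time
            # (the proportional split suggested above).
            n_parts = -(-line_t // max_t)        # ceil(line_t / max_t)
            target = line_t / n_parts
            start, acc, made = 0, 0, 0
            for x, t in enumerate(times[y]):
                acc += t
                if acc >= target and made < n_parts - 1:
                    blocks.append((y, 1, start, x + 1 - start))
                    start, acc, made = x + 1, 0, made + 1
            blocks.append((y, 1, start, width - start))
            y += 1
    return blocks
```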
Also I'd suggest sorting the blocks in descending time and passing out
the heaviest ones first. That way your render won't wait for some large
block to finish after everything else is done.
You might also develop a measure Q of the efficiency of different
algorithms. If the Q of a simple algorithm is near enough to the
optimum and in the same ballpark as the Q of a fancy algorithm then it
is good enough.
It would be interesting to see those division images for different
approaches.
Post a reply to this message
> You already have the histogram rendered with some quick settings or a
> previous full render and want to use it to divide the work?
Correct.
> Deciding the perfect division is going to be a 'hard' problem. So you
> need a 'near enough' heuristic.
And correct. The histogram won't be anywhere near perfect anyway,
because I might disable effects or lower photon quality or lower
resolution etc. for generating the histogram. I just need it good enough
to differentiate a checkered plane and a glass with photons!
> Allowing any rectangular subdivision is also going to lead to difficulty
> determining the best packing.
>
> So how about this simple approach:
>
> - Define minT and maxT as the minimum and maximum 'time' in those
> arbitrary units that you want a block to take. Eg. minT = 5000, maxT =
> 10000.
POV-Ray histograms are normalized so that the brightest pixel is at the
highest possible value for that image depth. So I don't really know how
to map pixel values to an actual time unit, as the scale factor changes
with each histogram.
The exception is CSV histograms, which I should start using instead of
images... Those have a fixed unit.
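A tiny loader sketch for that, assuming each CSV row holds the
comma-separated counts for one scanline (the function name is mine, and
the layout should be checked against what POV-Ray actually writes):

```python
import csv

def load_histogram_csv(path):
    """Read a CSV histogram into a list of rows of integer counts.
    Assumes one scanline per CSV row; verify against the real format."""
    with open(path, newline="") as f:
        return [[int(v) for v in row] for row in csv.reader(f)]
```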
> You could probably come up with a rule of thumb for these based
> on the total time estimate for the entire image and the number of
> available processors
I don't really know how many processors will end up "participating". I
generate way more tiles than computers, then let each computer queue up
multiple tiles.
> - Consider scan lines and proceed left to right
> [...]
Okay, all this I will read and analyze tomorrow. It's already midnight
here :)
> It would be interesting to see those division images for different
> approaches.
From my current approach:
http://stuff.povaddict.com.ar/division/1/
Numbers are the sum of pixels inside that area. Red rectangles are the
"slowest" (highest number) of that image. It's that rectangle which gets
split in the next iteration. Green rectangles are the "fastest" (lowest
number). I added those colors mainly to notice how much the numbers vary.
There is also histgram.png, the histogram used for those images, and a
zip with all the images, for your downloading convenience.
Post a reply to this message
Nicolas Alvarez wrote:
> Any ideas? I'll post some more images later.
Can you tile by raster lines instead of by 8x8 squares? That should
give you much greater consistency between sections.
Regards,
John
Post a reply to this message
> The problem seems to be when splitting a slow area that contains one fast
> area and one very slow area (relatively, in pixels per second; of course
> it's faster than the big tile it belongs to!).
>
> Any ideas? I'll post some more images later.
Why not split the image repeatedly (alternating horizontal and vertical
splits), but for every split choose the split position so that the sum of
pixels is the same on either side of the split? When the sum of pixels
reaches some minimum value, or the rectangle reaches some minimum size,
stop splitting.
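A sketch of this balanced recursive split in Python (the recursion
scheme and all names are my own; the stop thresholds are the ones
suggested above):

```python
def median_split(hist, x, y, w, h, min_sum, min_size, vertical=True, out=None):
    """Recursively split a rectangle at the position that balances the
    pixel sum on both sides, alternating the split direction."""
    if out is None:
        out = []
    total = sum(v for row in hist[y:y + h] for v in row[x:x + w])
    if total <= min_sum or w <= min_size or h <= min_size:
        out.append((x, y, w, h))          # stop: cheap enough or too small
        return out
    if vertical:
        # Find the column where the cumulative sum first reaches half.
        acc = 0
        for cut in range(1, w):
            acc += sum(hist[r][x + cut - 1] for r in range(y, y + h))
            if acc * 2 >= total:
                break
        median_split(hist, x, y, cut, h, min_sum, min_size, False, out)
        median_split(hist, x + cut, y, w - cut, h, min_sum, min_size, False, out)
    else:
        # Same idea, scanning rows instead of columns.
        acc = 0
        for cut in range(1, h):
            acc += sum(hist[y + cut - 1][x:x + w])
            if acc * 2 >= total:
                break
        median_split(hist, x, y, w, cut, min_sum, min_size, True, out)
        median_split(hist, x, y + cut, w, h - cut, min_sum, min_size, True, out)
    return out
```

Because every split balances the sums, sibling tiles should end up with
similar estimated render times, which is exactly the property the
quad-tree approach was missing.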
Post a reply to this message
> Nicolas Alvarez wrote:
>> Any ideas? I'll post some more images later.
>
> Can you tile by raster lines instead of by 8x8 squares? That should
> give you much greater consistency between sections.
Each tile can have any size. Even 1x1 if I wanted to (although that
would waste time if antialiasing is involved; afaik POV-Ray would
actually render four pixels).
I'm not using "8x8" anywhere. It's just that my current script always
splits the image in half, so yeah, at that depth the borders are
probably all multiples of 8 :)
Post a reply to this message
> http://stuff.povaddict.com.ar/split.png
> http://stuff.povaddict.com.ar/split2.png
> From my current approach:
> http://stuff.povaddict.com.ar/division/1/
The images were down all night; the links are working again now.
Post a reply to this message
On Thu, 27 Mar 2008 12:23:32 -0300, Nicolas Alvarez
<nic### [at] gmailisthebestcom> wrote:
>> http://stuff.povaddict.com.ar/split.png
>> http://stuff.povaddict.com.ar/split2.png
>
>> From my current approach:
>> http://stuff.povaddict.com.ar/division/1/
>
>Images were down all night; links now back working.
Sorry, still not for me:
"Internet Explorer cannot display the webpage"
--
Regards
Stephen
Post a reply to this message