POV-Ray : Newsgroups : povray.advanced-users : Problem rendering large image
From: Jos leys
Subject: Problem rendering large image
Date: 18 Oct 2012 07:15:01
Message: <web.507fe48eb1b5e2779a9ee9f50@news.povray.org>
When attempting to render a 6000x8000 pixel image (v3.7.0 RC6 msvc10.win64), I
get the following message:

Intermediate image storage backing file write failed at creation.
Failed to start render: Cannot access data in file.

Smaller size images render without any problem.
The quickres.ini entry is:

[6000x8000, AA 0.3]
Width=6000
Height=8000
Antialias=On
Antialias_Threshold=0.3

Anything I can do about this?



From: Warp
Subject: Re: Problem rendering large image
Date: 18 Oct 2012 13:40:36
Message: <50803f14@news.povray.org>
Jos leys <jos### [at] pandorabe> wrote:
> When attempting to render a 6000x8000 pixel image (v3.7.0 RC6 msvc10.win64), I
> get the following message:

> Intermediate image storage backing file write failed at creation.
> Failed to start render: Cannot access data in file.

I'm guessing that the file is hitting the magic 2-gigabyte line, but
I'm not sure why that should matter, especially since you are using
the 64-bit version of POV-Ray. (Which filesystem are you using? The
only common one with a file size limit that small is FAT32 (strictly,
its per-file limit is 4 GB), and it would be really strange if your
system were using that. Could still be a possibility.)

Someone with more detailed knowledge of the internal workings of
the intermediate file should be able to give a better answer.
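
A quick way to check is to query the filesystem name of the drive that
holds the temp directory. A minimal Win32 sketch (not from POV-Ray;
the "C:\" drive root is an assumption, adjust as needed):

#include <windows.h>
#include <stdio.h>

int main(void)
{
    char fsname[MAX_PATH + 1] = "";
    /* We only need the filesystem name, so the other output
       parameters are left as NULL. */
    if (GetVolumeInformationA("C:\\", NULL, 0, NULL, NULL, NULL,
                              fsname, sizeof fsname))
        printf("filesystem: %s\n", fsname);  /* e.g. "NTFS" or "FAT32" */
    else
        fprintf(stderr, "GetVolumeInformationA failed: %lu\n",
                GetLastError());
    return 0;
}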

-- 
                                                          - Warp



From: Jos leys
Subject: Re: Problem rendering large image
Date: 18 Oct 2012 14:00:01
Message: <web.5080426d8203a6b9a9ee9f50@news.povray.org>
Warp <war### [at] tagpovrayorg> wrote:

> I'm guessing that the file is hitting the magic 2-gigabyte line, .....

It's strange, because I rendered an 8000x8000 pixel image before, but
that was with an earlier version of 3.7, so I wonder if something
changed in this area in the last release...



From: Jos leys
Subject: Re: Problem rendering large image
Date: 18 Oct 2012 14:05:01
Message: <web.508044468203a6b9a9ee9f50@news.povray.org>
"Jos leys" <jos### [at] pandorabe> wrote:

If some good soul with the same POV-Ray version installed on a 64-bit
Windows system could try to render a large (say 8000x8000) image, it
would become clearer whether the culprit is my computer or the program...



From: Christian Froeschlin
Subject: Re: Problem rendering large image
Date: 18 Oct 2012 14:33:17
Message: <50804b6d$1@news.povray.org>
Jos leys wrote:

> If some good soul with the same POV-Ray version installed on a 64-bit
> Windows system could try to render a large (say 8000x8000) image, it
> would become clearer whether the culprit is my computer or the program...

Tried this with a simple sphere and a checkered plane; memory use went
up to about 1 GB during the first 20 seconds (which were also reported
as render time). The reported render state was about 20%; memory use
then slowly trickled down over the next minutes as the render completed.
I assume that last part is actually transfer time, data shoveling
and image encoding.

I would expect the extra memory requirement for high resolution
renders to be mostly scene-independent. But I could be wrong.



From: Le Forgeron
Subject: Re: Problem rendering large image
Date: 19 Oct 2012 05:30:11
Message: <50811da3$1@news.povray.org>
On 18/10/2012 19:40, Warp wrote:
> Jos leys <jos### [at] pandorabe> wrote:
>> When attempting to render a 6000x8000 pixel image (v3.7.0 RC6 msvc10.win64), I
>> get the following message:
> 
>> Intermediate image storage backing file write failed at creation.
>> Failed to start render: Cannot access data in file.
> 
> I'm guessing that the file is hitting the magic 2-gigabyte line, but
> I'm not sure why that should matter, especially since you are using
> the 64-bit version of POV-Ray. (Which filesystem are you using? The
> only common one with a file size limit that small is FAT32 (strictly,
> its per-file limit is 4 GB), and it would be really strange if your
> system were using that. Could still be a possibility.)
> 
> Someone with more detailed knowledge of the internal workings of
> the intermediate file should be able to give a better answer.
> 

The 2-gigabyte line should normally be met with:

Intermediate image storage backing file write/seek failed at creation.

(That is the code path doing the lseek64/SEEK_SET call to reach the tail
of the file at creation: combined with the write that follows, it
allocates the full file on disk at creation time; you do not want the
write to fail later because the disk filled up after 72 hours of
rendering ;-) (unless the OS handles it with a sparse ("hollow") file,
which Windows does not, IIRC).)

The message "Intermediate image storage backing file write failed at
creation." is emitted when the actual write of 3 integers (well, they
are wider than "int"; currently each is a "long") fails, or rather: the
call should return the number of bytes actually written, and that does
not match the requested size.
To sum up (oversimplifying the actual code):

long long last_position;
// The value is about height * width * 5 * sizeof(COLC);
// it may be a bit more due to rounding to the blocking factor.
// It is about 2 400 000 000 in this case.
lseek64(the_file_descriptor, last_position, SEEK_SET);

long info[3];
info[0] = something; // the data structure size for a single pixel in the cache
info[1] = more;      // the width
info[2] = yet;       // the height
if (write(the_file_descriptor, &info[0], 3 * sizeof(long)) != 3 * sizeof(long))
{
    // => the "write failed at creation" message above
}

Any clue why this might fail on Windows?
Is the lseek deferred until actual use (which the following write would
then trigger)?
A packing issue with the array of long? (But even if the written data
were bogus, we asked for X bytes to be written and should get X bytes
written, no more, no less. At worst, X could be smaller than the span
of info[]; or do you see a way to access bytes outside of info[]?)
Or was the call to write interrupted? Windows actually has to allocate
the whole file on disk right now: if the allocation mechanism is slow,
that can take a lot of time, because all the needed sectors must be
allocated and written; that is a 2.4 GB file here. Windows is not known
to handle such a request (for a nearly empty file) as lightly as Linux:
depending on the filesystem, Linux might just fake it with a sparse
("hollow") file, whereas Windows will actually materialize the full
file right away. With disk writes at 50 MB/s (my reference speed for
hard disk I/O), that is about 45 seconds of optimal I/O, and maybe
more, as the underlying filesystem structures also have to be updated.

Is there a limit on Windows to the file size a user can create? (The
man page of write mentions the RLIMIT_FSIZE resource.)
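
On the sparse file question: NTFS does support sparse ("hollow") files,
but only when the file is explicitly marked as such before the
seek-and-write; otherwise writing at a far offset forces the zero-fill
described above. A minimal sketch of the idea (not POV-Ray's code; the
path and the header values are placeholders):

#include <windows.h>
#include <winioctl.h>
#include <stdio.h>

int main(void)
{
    HANDLE h = CreateFileA("backing.tmp", GENERIC_READ | GENERIC_WRITE,
                           0, NULL, CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL,
                           NULL);
    if (h == INVALID_HANDLE_VALUE)
        return 1;

    /* Mark the file sparse; without this, extending the file with a
       write at a far offset makes NTFS physically zero-fill everything
       before that offset. */
    DWORD bytes = 0;
    DeviceIoControl(h, FSCTL_SET_SPARSE, NULL, 0, NULL, 0, &bytes, NULL);

    LARGE_INTEGER pos;
    pos.QuadPart = 2400000000LL;        /* ~2.4 GB, as in this thread */
    SetFilePointerEx(h, pos, NULL, FILE_BEGIN);

    long info[3] = { 0, 6000, 8000 };   /* placeholder header values */
    DWORD written = 0;
    if (!WriteFile(h, info, sizeof info, &written, NULL) ||
        written != sizeof info)
        fprintf(stderr, "write failed at creation\n");

    CloseHandle(h);
    return 0;
}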



From: Jos leys
Subject: Re: Problem rendering large image
Date: 19 Oct 2012 13:00:01
Message: <web.508185eb8203a6b9a9ee9f50@news.povray.org>
Well, after this problem, I turned on my PC this morning and tried again
to render the 8000x8000 file.
Off it went, without any complaints this time!

No error messages, just smooth rendering. I don't understand it at all.
Was it because I had some other program open yesterday?

Anyway, thanks to all who tried to help.



From: Ive
Subject: Re: Problem rendering large image
Date: 19 Oct 2012 13:20:32
Message: <50818be0$1@news.povray.org>
On 19.10.2012 11:30, Le_Forgeron wrote:

> Any clue why this might fail on Windows?
> Is the lseek deferred until actual use (which the following write would
> then trigger)?
> A packing issue with the array of long? (But even if the written data
> were bogus, we asked for X bytes to be written and should get X bytes
> written, no more, no less. At worst, X could be smaller than the span
> of info[]; or do you see a way to access bytes outside of info[]?)
> Or was the call to write interrupted? Windows actually has to allocate
> the whole file on disk right now: if the allocation mechanism is slow,
> that can take a lot of time, because all the needed sectors must be
> allocated and written; that is a 2.4 GB file here. Windows is not known
> to handle such a request (for a nearly empty file) as lightly as Linux:
> depending on the filesystem, Linux might just fake it with a sparse
> ("hollow") file, whereas Windows will actually materialize the full
> file right away. With disk writes at 50 MB/s (my reference speed for
> hard disk I/O), that is about 45 seconds of optimal I/O, and maybe
> more, as the underlying filesystem structures also have to be updated.
>
> Is there a limit on Windows to the file size a user can create? (The
> man page of write mentions the RLIMIT_FSIZE resource.)
>

FWIW, I've just created a 16000x9800 pixel file (an OpenEXR, but PNG
works the same) with 3.7 RC6 msvc10.win64 on Windows 7, without any problems.
The delay between parsing (not much to parse) and the render start
was about 3 seconds.

-Ive



From: Le Forgeron
Subject: Re: Problem rendering large image
Date: 19 Oct 2012 18:52:45
Message: <5081d9bd$1@news.povray.org>
On 19/10/2012 19:18, Ive wrote:
> 
> FWIW, I've just created a 16000x9800 pixel file (an OpenEXR, but PNG
> works the same) with 3.7 RC6 msvc10.win64 on Windows 7, without any
> problems.
> The delay between parsing (not much to parse) and the render start
> was about 3 seconds.

Traditional hard disk or SSD?
Should I revise my reference speed?
Or maybe they make a better OS now, with more file caching or better
handling of sparse files.



From: clipka
Subject: Re: Problem rendering large image
Date: 24 Oct 2012 05:12:38
Message: <5087b106$1@news.povray.org>
On 18.10.2012 20:33, Christian Froeschlin wrote:

> Tried this with a simple sphere and a checkered plane; memory use went
> up to about 1 GB during the first 20 seconds (which were also reported
> as render time). The reported render state was about 20%; memory use
> then slowly trickled down over the next minutes as the render completed.
> I assume that last part is actually transfer time, data shoveling
> and image encoding.
>
> I would expect the extra memory requirement for high resolution
> renders to be mostly scene-independent. But I could be wrong.

As a matter of fact, on Windows it isn't: due to some quirks of the
communication between the render threads, simple scenes eat /more/
memory than complex ones (because the render threads generate image data
far faster than it can be written to the interim file).
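
To illustrate the mechanism (a minimal sketch, not POV-Ray's actual
code): with an unbounded queue between the render threads and the file
writer, the backlog of unwritten blocks grows with the speed gap;
bounding the queue with a counting semaphore, as below, would trade
that memory for render-thread stalls.

/* Build with: cc -pthread sketch.c */
#include <pthread.h>
#include <semaphore.h>
#include <stdio.h>
#include <unistd.h>

#define MAX_PENDING 8              /* cap on blocks awaiting the writer */
#define NUM_BLOCKS  32

static sem_t free_slots, used_slots;
static int   blocks[MAX_PENDING];  /* ring buffer of "rendered" blocks */
static int   head, tail;
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void *renderer(void *arg)   /* fast producer */
{
    for (int i = 0; i < NUM_BLOCKS; i++) {
        sem_wait(&free_slots);     /* stalls once the writer lags behind */
        pthread_mutex_lock(&lock);
        blocks[tail % MAX_PENDING] = i;
        tail++;
        pthread_mutex_unlock(&lock);
        sem_post(&used_slots);
    }
    return NULL;
}

static void *writer(void *arg)     /* slow consumer */
{
    for (int i = 0; i < NUM_BLOCKS; i++) {
        sem_wait(&used_slots);
        pthread_mutex_lock(&lock);
        int block = blocks[head % MAX_PENDING];
        head++;
        pthread_mutex_unlock(&lock);
        sem_post(&free_slots);
        usleep(10000);             /* simulate slow disk I/O */
        printf("wrote block %d\n", block);
    }
    return NULL;
}

int main(void)
{
    sem_init(&free_slots, 0, MAX_PENDING);
    sem_init(&used_slots, 0, 0);
    pthread_t p, c;
    pthread_create(&p, NULL, renderer, NULL);
    pthread_create(&c, NULL, writer, NULL);
    pthread_join(p, NULL);
    pthread_join(c, NULL);
    return 0;
}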


