POV-Ray : Newsgroups : povray.off-topic : Largest POV image?
  Largest POV image? (Message 7 to 16 of 56)
From: Invisible
Subject: Re: Largest POV image?
Date: 22 Oct 2009 10:33:07
Message: <4ae06d23$1@news.povray.org>
>> Any theories on how many GB of RAM would be required to render that?
>> (Assuming you turn off the display preview - apparently some people
>> don't know you can do this...)
> 
> In theory, POV-Ray needs to store (width*2) pixels. libpng may need to store
> a bunch more rows at once while compressing.

libpng only needs the current and previous row to run the pixel filter. 
I have no idea what the DEFLATE compressor needs. (Other than a complete 
redesign...) All assuming you're not going to generate interlaced PNG, 
that is.
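For concreteness, here's a minimal Python sketch (not libpng's actual code) of PNG's "Up" filter, which illustrates why only the current and the previous row are ever needed:

```python
def up_filter(prev_row: bytes, cur_row: bytes) -> bytes:
    """PNG "Up" filter (type 2): store each byte as the difference from
    the byte directly above it, modulo 256."""
    return bytes((c - p) & 0xFF for c, p in zip(cur_row, prev_row))

def up_unfilter(prev_row: bytes, filtered: bytes) -> bytes:
    """Invert up_filter: reconstruct a row from the filtered bytes and
    the already-reconstructed previous row."""
    return bytes((f + p) & 0xFF for f, p in zip(filtered, prev_row))
```

A vertical gradient filters down to a constant run of small deltas, which is exactly the kind of input DEFLATE compresses well.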

I note that to store (2^31 - 1) x (2^31 - 1) pixels, where each pixel 
requires exactly 4 bytes, requires about 18 exabytes of storage. (!!)

http://www.wolframalpha.com/input/?i=(2^31+-+1)^2+*+4+bytes

Note: 1 exabyte = 1,000 petabytes = 1,000,000 terabytes (or thereabouts).
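The arithmetic, spelled out (a trivial check, nothing POV-Ray-specific):

```python
pixels = (2**31 - 1) ** 2   # a (2^31 - 1) x (2^31 - 1) image
size = pixels * 4           # at exactly 4 bytes per pixel
print(size / 1e18)          # ~18.4 exabytes (decimal)
```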

Nobody has anything approaching that amount of disk space, never mind RAM.

Also, if POV-Ray achieves a million pixels per second (highly unlikely), 
that gives us...

http://www.wolframalpha.com/input/?i=(2^31+-+1)^2+%2F+1000000+*+1+second

...almost 150 millennia. So it seems pretty moot to me. :-P
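(The same back-of-the-envelope figure in code, using the hypothetical one million pixels per second:)

```python
pixels = (2**31 - 1) ** 2
seconds = pixels / 1_000_000            # at 1 million pixels per second
years = seconds / (365.25 * 24 * 3600)
print(round(years))                     # ~146,000 years
```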



From: Nicolas Alvarez
Subject: Re: Largest POV image?
Date: 22 Oct 2009 11:06:03
Message: <4ae074db@news.povray.org>
Invisible wrote:
> libpng only needs the current and previous row to run the pixel filter.
> I have no idea what the DEFLATE compressor needs. (Other than a complete
> redesign...) All assuming you're not going to generate interlaced PNG,
> that is.

Well, with antialiasing off, it should be possible (in theory) to produce an
interlaced PNG with O(1) memory too. You'd have to modify POV-Ray to
compute the pixels in the same order they're stored in the PNG file.
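The order in question is PNG's Adam7 interlacing: seven passes over progressively denser sub-grids of the image. A hypothetical modified POV-Ray would have to trace pixels in this traversal order; a sketch (the pass offsets and steps are from the PNG specification):

```python
# Adam7 interlacing: (x_start, y_start, x_step, y_step) for each of the
# seven passes, per the PNG specification.
ADAM7 = [
    (0, 0, 8, 8), (4, 0, 8, 8), (0, 4, 4, 8), (2, 0, 4, 4),
    (0, 2, 2, 4), (1, 0, 2, 2), (0, 1, 1, 2),
]

def adam7_order(width, height):
    """Yield (x, y) in the order pixels appear in an interlaced PNG."""
    for x0, y0, dx, dy in ADAM7:
        for y in range(y0, height, dy):
            for x in range(x0, width, dx):
                yield (x, y)
```

Each pass is still emitted row by row, so the only-a-couple-of-rows-in-memory property would survive on the PNG side.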



From: clipka
Subject: Re: Largest POV image?
Date: 22 Oct 2009 11:11:35
Message: <4ae07627$1@news.povray.org>
Warp schrieb:

>   As for POV-Ray itself, I bet the limit is 2147483648x2147483648 pixels
> (or maybe even 4294967296x4294967296 pixels).

On contemporary x64 machines, the limit will be much smaller - some 4e6 
x 4e6 pixels; more than that, and you'll crash through the 256 TeraByte 
address limit (contrary to legend, contemporary x64 processors only 
support 48-bit addresses, despite the architecture being designed for 
future extensibility to full-fledged 64-bit addresses).

Windows will not allow more than some 7e5 x 7e5 pixels (8 TB), while 
Linux will not go beyond some 3e6 x 3e6 pixels (128 TB) due to 
per-process address space limits.
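Those figures are consistent with assuming roughly 16 bytes per pixel in memory (e.g. four single-precision floats; the per-pixel size is my assumption, not something stated above):

```python
import math

BYTES_PER_PIXEL = 16  # assumption: e.g. four 32-bit floats per pixel

def max_square_side(address_space_bytes: int) -> int:
    """Side length of the largest square image fitting in the given
    address space at BYTES_PER_PIXEL bytes per pixel."""
    return math.isqrt(address_space_bytes // BYTES_PER_PIXEL)

print(max_square_side(2**48))        # 48-bit addressing (256 TB) -> ~4e6
print(max_square_side(8 * 2**40))    # Windows user space (8 TB)  -> ~7e5
print(max_square_side(128 * 2**40))  # Linux user space (128 TB)  -> ~3e6
```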



From: clipka
Subject: Re: Largest POV image?
Date: 22 Oct 2009 11:13:16
Message: <4ae0768c$1@news.povray.org>
Nicolas Alvarez schrieb:

> In theory, POV-Ray needs to store (width*2) pixels. libpng may need to store
> a bunch more rows at once while compressing.

Hum... yes, for POV-Ray 3.6 this is true. I was thinking of POV-Ray 3.7 
only.



From: clipka
Subject: Re: Largest POV image?
Date: 22 Oct 2009 11:14:44
Message: <4ae076e4$1@news.povray.org>
Invisible schrieb:

> So it seems pretty moot to me. :-P

A bit, yes. :-)



From: Warp
Subject: Re: Largest POV image?
Date: 22 Oct 2009 11:21:12
Message: <4ae07867@news.povray.org>
Invisible <voi### [at] devnull> wrote:
> > In theory, POV-Ray needs to store (width*2) pixels.

  Is that so even if you don't use antialiasing?

> > libpng may need to store
> > a bunch more rows at once while compressing.

> libpng only needs the current and previous row to run the pixel filter. 
> I have no idea what the DEFLATE compressor needs.

  Compression algorithms almost invariably use a buffer for the data, and
this buffer is often surprisingly small (like some tens of kilobytes).
(This means that most compression algorithms will *not* find and compress
repetitions which are gigabytes apart from each other. This is for the
simple reason that the compression routine has to use only a limited amount
of memory and be relatively fast.)

  Some modern algorithms might use larger buffers (like some megabytes),
but they nevertheless don't achieve enormously better compression ratios
(for some reason there seems to be some kind of cutoff point after which
enlarging the compression buffer does not improve the compression of average
data significantly, but slows down the compression more than it's worth).
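DEFLATE (the algorithm inside zlib, gzip and PNG) is a concrete example: its sliding window is 32 KB, so a repetition further back than that is simply invisible to it. A small demonstration with Python's zlib:

```python
import random
import zlib

rnd = random.Random(0)
block = bytes(rnd.randrange(256) for _ in range(1024))    # incompressible 1 KB
filler = bytes(rnd.randrange(256) for _ in range(40000))  # incompressible padding

# Second copy of `block` still inside DEFLATE's 32 KB window: matched cheaply.
near = zlib.compress(block + filler[:20000] + block, 9)
# Second copy more than 32 KB back: it must be stored again in full.
far = zlib.compress(block + filler + block, 9)

print(len(far) - len(near))  # ~21 KB: 20 KB extra filler plus ~1 KB of unmatched block
```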

  The problem with compressing exabytes of data is not the compression
algorithm itself, but the file format into which the compressed data is
stored. (If the data is compressed as a stream, there shouldn't be any
theoretical limit to how much data you can compress. However, if the
file format has headers or other fields recording the amount of
compressed data, you may run into limitations.)

> I note that to store (2^31 - 1) x (2^31 - 1) pixels, where each pixel 
> requires exactly 4 bytes, requires about 18 exabytes of storage. (!!)

  Depends on what those pixels contain, and how smart your compression
algorithm is. If the image is completely filled with black pixels, using
a smart compression algorithm could compress it down to some tens of
bytes. (Actual compression algorithms, however, often fail to do this
because of the limited compression buffer. Some of the most naive
algorithms achieve surprisingly poor compression ratios when the input
is full of the same byte value. Other smarter algorithms are more capable
of taking advantage of this.)
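Both halves of that claim are easy to check with zlib: a megabyte of zero bytes compresses by roughly a factor of a thousand (DEFLATE's maximum match length is 258 bytes, so it needs thousands of match tokens), which is excellent, yet still nowhere near the tens of bytes a truly run-length-aware encoder could manage:

```python
import zlib

black = b"\x00" * 1_000_000  # a megabyte of "black pixels"
out = zlib.compress(black, 9)
print(len(out))              # on the order of 1 KB, not tens of bytes
```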

  Of course if your image actually contains something meaningful, the
best you can hope for (losslessly) is compression to something like
1/10th of the original size.

  I don't think many file systems even support files which are exabytes
large.

-- 
                                                          - Warp



From: Warp
Subject: Re: Largest POV image?
Date: 22 Oct 2009 11:24:51
Message: <4ae07943@news.povray.org>
clipka <ano### [at] anonymousorg> wrote:
> Warp schrieb:

> >   As for POV-Ray itself, I bet the limit is 2147483648x2147483648 pixels
> > (or maybe even 4294967296x4294967296 pixels).

> On contemporary x64 machines, the limit will be much smaller - some 4e6 
> x 4e6 pixels; more than that, and you'll crash through the 256 TeraByte 
> address limit (contrary to legend, contemporary x64 processors only 
> support 48-bit addresses, despite the architecture being designed for 
> future extensibility to full-fledged 64-bit addresses).

> Windows will not allow more than some 7e5 x 7e5 pixels (8 TB), while 
> Linux will not go beyond some 3e6 x 3e6 pixels (128 TB) due to 
> per-process address space limits.

  But those are not limitations of POV-Ray, they are limitations of the
hardware and/or operating system.

  If POV-Ray were to be compiled on a system without those limitations,
it could probably achieve images of those sizes (in theory, of course,
because in practice it would take millennia to render such an image, as
noted elsewhere).

-- 
                                                          - Warp



From: Warp
Subject: Re: Largest POV image?
Date: 22 Oct 2009 11:27:54
Message: <4ae079fa@news.povray.org>
Warp <war### [at] tagpovrayorg> wrote:
> clipka <ano### [at] anonymousorg> wrote:
> > Warp schrieb:

> > >   As for POV-Ray itself, I bet the limit is 2147483648x2147483648 pixels
> > > (or maybe even 4294967296x4294967296 pixels).

> > On contemporary x64 machines, the limit will be much smaller - some 4e6 
> > x 4e6 pixels; more than that, and you'll crash through the 256 TeraByte 
> > address limit (contrary to legend, contemporary x64 processors only 
> > support 48-bit addresses, despite the architecture being designed for 
> > future extensibility to full-fledged 64-bit addresses).

> > Windows will not allow more than some 7e5 x 7e5 pixels (8 TB), while 
> > Linux will not go beyond some 3e6 x 3e6 pixels (128 TB) due to 
> > per-process address space limits.

>   But those are not limitations of POV-Ray, they are limitations of the
> hardware and/or operating system.

>   If POV-Ray were to be compiled on a system without those limitations,
> it could probably achieve images of those sizes (in theory, of course,
> because in practice it would take millennia to render such an image, as
> noted elsewhere).

  Actually, thinking about it, you confused me.

  POV-Ray doesn't need to keep the entire image in memory in order to render
it (after all, POV-Ray was developed on systems with a limited amount of memory,
yet was able to render images larger than any conceivable RAM size back then).

  Thus it's perfectly possible for POV-Ray to render a 2^31 x 2^31 pixel
image even on a system which limits memory addresses to values smaller
than that.

  (Of course you won't be able to *save* that image anywhere because you
would encounter limitations in the file system. But that doesn't stop
POV-Ray from *rendering* the image.)

-- 
                                                          - Warp



From: Invisible
Subject: Re: Largest POV image?
Date: 22 Oct 2009 11:31:38
Message: <4ae07ada$1@news.povray.org>
>> libpng only needs the current and previous row to run the pixel filter.
>> I have no idea what the DEFLATE compressor needs. (Other than a complete
>> redesign...) All assuming you're not going to generate interlaced PNG,
>> that is.
> 
> Well, with antialiasing off, it should be possible (in theory) to produce an
> interlaced PNG with O(1) memory too. You'd have to modify POV-Ray to
> compute the pixels in the same order they're stored in the PNG file.

If you generate a PNG image without any fancy filtering, then you don't 
need any extra memory except for the DEFLATE compressor. But usually PNG 
files use filtering to increase compression. (I don't know, but I'd 
*hope* libpng provides a way to control this...) If filtering, you may 
need the previous row of pixels [only].
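As an illustration (a hand-rolled sketch in Python, not libpng), a PNG whose rows all use filter type 0 can be streamed out one row at a time, holding nothing between rows except the DEFLATE compressor's state:

```python
import struct
import zlib

def png_chunk(tag: bytes, payload: bytes) -> bytes:
    """Length + tag + payload + CRC, per the PNG chunk layout."""
    return (struct.pack(">I", len(payload)) + tag + payload
            + struct.pack(">I", zlib.crc32(tag + payload)))

def write_grayscale_png(out, width, height, rows):
    """Stream 8-bit grayscale rows (each `width` bytes) into file object `out`."""
    out.write(b"\x89PNG\r\n\x1a\n")
    # bit depth 8, color type 0 (grayscale), no interlacing
    out.write(png_chunk(b"IHDR",
                        struct.pack(">IIBBBBB", width, height, 8, 0, 0, 0, 0)))
    comp = zlib.compressobj()
    for row in rows:                          # one row at a time: O(row) memory
        piece = comp.compress(b"\x00" + row)  # leading 0 = "no filter" for this row
        if piece:
            out.write(png_chunk(b"IDAT", piece))  # multiple IDAT chunks are legal
    out.write(png_chunk(b"IDAT", comp.flush()))
    out.write(png_chunk(b"IEND", b""))
```

(To answer the parenthetical: libpng does let you control this, via png_set_filter().)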



From: Warp
Subject: Re: Largest POV image?
Date: 22 Oct 2009 11:34:32
Message: <4ae07b88@news.povray.org>
Invisible <voi### [at] devnull> wrote:
> If you generate a PNG image without any fancy filtering, then you don't 
> need any extra memory except for the DEFLATE compressor. But usually PNG 
> files use filtering to increase compression. (I don't know, but I'd 
> *hope* libpng provides a way to control this...) If filtering, you may 
> need the previous row of pixels [only].

  I think PNG has options to filter with the previous row, the previous
column, or both. I assume that if you choose to filter with the previous
column only, you don't need to store the previous row of pixels in memory
at all. (I don't know if libpng optimizes in this way, but at least in
theory I think it could be possible to implement PNG compression like that.)
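For what it's worth, the previous-column option is PNG's "Sub" filter (type 1). A sketch shows it consults nothing outside the current row (the bytes-per-pixel handling is simplified to one channel here):

```python
def sub_filter(row: bytes, bpp: int = 1) -> bytes:
    """PNG "Sub" filter (type 1): each byte minus the byte one pixel to
    its left, modulo 256; no other row is consulted."""
    out = bytearray(row[:bpp])  # leftmost pixel has no left neighbour
    for i in range(bpp, len(row)):
        out.append((row[i] - row[i - bpp]) & 0xFF)
    return bytes(out)
```

A horizontal gradient turns into a constant run of small deltas, so the no-row-buffer variant still compresses gradients well.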

-- 
                                                          - Warp




Copyright 2003-2023 Persistence of Vision Raytracer Pty. Ltd.