>> libpng only needs the current and previous row to run the pixel filter.
>> I have no idea what the DEFLATE compressor needs.
>
> Compression algorithms almost invariably use a buffer for the data, and
> this buffer is often surprisingly small (like some tens of kilobytes).
> (This means that most compression algorithms will *not* find and compress
> repetitions which are gigabytes apart from each other. This is for the
> simple reason that the compression routine has to use only a limited amount
> of memory and be relatively fast.)
>
> Some modern algorithms might use larger buffers (like some megabytes),
> but they nevertheless don't achieve enormously better compression ratios
> (for some reason there seems to be some kind of cutoff point after which
> enlarging the compression buffer does not improve the compression of average
> data significantly, but slows down the compression more than it's worth).
DEFLATE is ancient. I doubt it needs a lot of space. Most particularly,
the space required by DEFLATE is *unrelated* to the dimensions of the
PNG image. The PNG spec says the pixel data is optionally "filtered",
and then fed to DEFLATE as raw data, which DEFLATE then compresses in
the usual way.
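A quick way to see the window limit in action (a rough sketch in Python, using the standard zlib module, which implements DEFLATE; the 32 KiB figure is the maximum DEFLATE window size):

```python
import os
import zlib

# DEFLATE's sliding window is at most 32 KiB, so back-references can
# never reach data farther away than that -- no matter how large the
# input (or the PNG image) is.
block = os.urandom(4096)                  # 4 KiB of incompressible data

near = block + block                      # second copy is 4 KiB back
far = block + bytes(64 * 1024) + block    # second copy is ~68 KiB back

# The nearby repeat gets matched; the distant one must be stored again.
print(len(zlib.compress(near, 9)) < len(zlib.compress(far, 9)))  # True
```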
> The problem with compressing exabytes of data is not the compression
> algorithm itself, but the file format into which the compressed data is
> stored.
>> I note that to store (2^31 - 1) x (2^31 - 1) pixels, where each pixel
>> requires exactly 4 bytes, requires about 18 exabytes of storage. (!!)
>
> Depends on what those pixels contain, and how smart your compression
> algorithm is.
I was thinking more of the amount of RAM required if you were to store
the entire thing uncompressed. I would expect any half-decent
compression algorithm to improve on this significantly.
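For what it's worth, that uncompressed figure is easy to check:

```python
# Raw size of a (2^31 - 1) x (2^31 - 1) image at 4 bytes per pixel.
side = 2**31 - 1
raw_bytes = side * side * 4

print(raw_bytes / 10**18)   # ~18.4 (decimal) exabytes
print(raw_bytes / 2**60)    # just under 16 exbibytes
```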
> I don't think many file systems even support files which are exabytes
> large.
I agree.
(Maybe Google BigFile or whatever the hell it's called?)
Warp wrote:
> Thus it's perfectly possible for POV-Ray to render a 2^31 x 2^31 pixels
> large image even on a system which limits memory addresses to be smaller
> than that.
>
> (Of course you won't be able to *save* that image anywhere because you
> would encounter limitations in the file system. But that doesn't stop
> POV-Ray from *rendering* the image.)
No - geological time stops you from rendering it. ;-)
Warp wrote:
> I think PNG has options to filter with the previous row, the previous
> column, or both.
Yes.
Each row can be filtered in one of these ways (or not filtered at all).
The choice is independent per row.
> I assume that if you choose to filter with the previous
> column only, you don't need to store the previous row of pixels in memory
> at all.
Yes.
> (I don't know if libpng optimizes in this way, but at least in
> theory I think it could be possible to implement png compression like that.)
Indeed - I have no idea if libpng has an option to control this...
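For reference, the per-row filters are just simple byte predictors. A rough Python sketch of the "Sub" (previous column) and "Up" (previous row) filter types from the PNG spec, illustrating why only the latter needs the previous row in memory:

```python
# Sketch of two of PNG's per-row filter types.
# "Sub" (type 1) predicts each byte from the byte one pixel to the
# left, so it needs no memory of the previous row at all.
def sub_filter(row, bpp):
    # bpp = bytes per pixel; the first pixel has no left neighbour
    return bytes((row[i] - (row[i - bpp] if i >= bpp else 0)) & 0xFF
                 for i in range(len(row)))

# "Up" (type 2) predicts from the byte directly above, which is why
# the previous row has to be kept around.
def up_filter(row, prev_row):
    return bytes((r - p) & 0xFF for r, p in zip(row, prev_row))

# A run of identical pixels becomes mostly zeros under Sub -- which is
# exactly the kind of data DEFLATE compresses well.
print(sub_filter(bytes([10, 20, 30, 10, 20, 30]), 3))
```

(The other filter types, "Average" and "Paeth", use both neighbours, so they also need the previous row.)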
Warp wrote:
> I don't think many file systems even support files which are exabytes
> large.
There are a number that support 2^64 clusters, I think. People are beginning
to realize that if you actually need more than 2^64 clusters, you're not
going to want to copy that all over to a new file system when you run out of
space. :-)
Both NTFS and VFS in theory support up to 8 exabytes on a partition.
--
Darren New, San Diego CA, USA (PST)
I ordered stamps from Zazzle that read "Place Stamp Here".
Darren New <dne### [at] sanrrcom> wrote:
> Both NTFS and VFS in theory support up to 8 exabytes on a partition.
Actually, Wikipedia claims that NTFS supports partitions and files of
16 exabytes. That's a bit better than ReiserFS, which supports "only"
8 terabytes.
--
- Warp
Warp wrote:
> Invisible <voi### [at] devnull> wrote:
>> If you generate a PNG image without any fancy filtering, then you don't
>> need any extra memory except for the DEFLATE compressor. But usually PNG
>> files use filtering to increase compression. (I don't know, but I'd
>> *hope* libpng provides a way to control this...) If filtering, you may
>> need the previous row of pixels [only].
>
> I think PNG has options to filter with the previous row, the previous
> column, or both. I assume that if you choose to filter with the previous
> column only, you don't need to store the previous row of pixels in memory
> at all. (I don't know if libpng optimizes in this way, but at least in
> theory I think it could be possible to implement png compression like
> that.)
You need to store the previous row in POV-Ray anyway, to know if you need to
antialias the current pixel :)
Thank you all for your posts. Quite an insight.
> (BTW, why did you post this in povray.off-topic?)
Because it was just an idle thought. I saw some pov-rendered posters. I then
pondered upon resolution and pixel-sizes for an A0-poster and wondered how
far you can realistically go. And how long it would take.
I realize of course that render-time is dependent on the scene to be
rendered. I just hoped to find out whether render time behaves proportionally
to the number of pixels to be rendered, or whether it increases
exponentially with the number of pixels at really high resolutions.
Do you have any experience?
Furthermore I did not wish to bother anybody with this who might be occupied
with productive work. Whoever is browsing through povray.off-topic is
probably somewhat bored and wishes to pass time (probably until the current
render finishes) ;-)
Nicolas Alvarez <nic### [at] gmailcom> wrote:
> You need to store the previous row in POV-Ray anyway, to know if you need to
> antialias the current pixel :)
Not in 3.7, which renders one small square at a time.
--
- Warp
Invisible wrote:
> TC wrote:
>
>> What is the largest image you have rendered and how long did it take?
>
> I've rendered stuff for Zazzle at silly resolutions. (It takes a *long*
> time with a 32-bit CPU.)
>
> Let me go check... Yeah, that was 8,000 x 6,000 pixels.
That's truly insane! Some fractal, I guess?
How about render time? Days, weeks or months? :)
--
a game sig: http://tinyurl.com/d3rxz9
Nicolas Alvarez wrote:
> You need to store the previous row in POV-Ray anyway, to know if you need to
> antialias the current pixel :)
I was under the impression that with AA enabled, *all* pixels are
supersampled. (But with adaptive supersampling, it shoots 4 rays before
deciding whether to supersample further...)
--
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*