  Re: Haskell raving  
From: Orchid XP v7
Date: 31 Oct 2007 16:15:52
Message: <4728f088$1@news.povray.org>
Warp wrote:

>   However, I'm wondering what happens with huge files (which won't fit
> in memory) and which you read in full. It could be, for example, a
> video file (say, 3 gigabytes in size, while your computer has only,
> e.g., 1 gigabyte of RAM).
>   The usual way of doing this is to read a bunch of data from the file
> (for example some megabytes), process it, discard it, read a new bunch
> of data from the file, etc.
> 
>   What I'm wondering is what a language like Haskell will do in this
> case, even though it has lazy evaluation. In fact, lazy evaluation is
> of no help here because you are, after all, reading the entire file.
> If you use the completely abstract approach, the Haskell program could
> end up trying to read the entire file into memory, thus running out of it.

Not so.

>   You have to somehow be able to tell it that you are only going to need
> small chunks at a time, and that after they have been processed they can
> be discarded. I wonder if there's a really abstract way of saying this
> in Haskell, or whether you need to go to ugly low-level specifics.

There is. It's called a garbage collector. ;-)

Simply let lazy evaluation read the file as required, and the GC will 
transparently reclaim the data you've already processed behind you.

In this way, processing a file linearly in constant memory is pretty 
trivial in Haskell.
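
Something like this, for instance - a minimal sketch, assuming a 
hypothetical file "big.log" and line counting as the stand-in task:

import qualified Data.ByteString.Lazy.Char8 as L

-- Count the lines of an arbitrarily large file. L.readFile pulls the
-- file in lazily, one chunk at a time; once L.count has consumed a
-- chunk, nothing references it any more and the GC reclaims it.
main :: IO ()
main = do
    contents <- L.readFile "big.log"
    print (L.count '\n' contents)

Heap usage stays flat regardless of the file's size, because at any 
given moment only the chunk currently being counted is live.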

Now, *random* access... *that* presents a bit more of a problem. (For 
the love of God, don't accidentally hold on to references you no longer 
need! You could end up retaining the entire file in RAM - and, 
obviously, that would be "bad".)
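
To illustrate the trap - again just a sketch, with the same 
hypothetical "big.log" - merely keeping a second name for the start of 
the lazy contents is enough to pin the whole file in memory:

import qualified Data.ByteString.Lazy.Char8 as L

main :: IO ()
main = do
    contents <- L.readFile "big.log"
    -- An unevaluated thunk that holds a reference to the head chunk:
    let firstLine = L.takeWhile (/= '\n') contents
    print (L.count '\n' contents)  -- streams through every chunk...
    print firstLine  -- ...but the thunk kept them all reachable,
                     -- so none of them could be collected

Force firstLine *before* the big traversal (or recompute it afterwards) 
and the leak disappears.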

