Found the dynamic optimization things...
From: Warp
Subject: Re: Found the dynamic optimization things...
Date: 23 Sep 2008 17:58:38
Message: <48d9668e@news.povray.org>
Orchid XP v8 <voi### [at] devnull> wrote:
> GHC [currently the best Haskell compiler] used to be able to compile 
> DLLs. At some point that code path bitrotted and stopped working; I'm 
> not sure what the current status is.

  I faintly remember asking about this subject in the other group once.
I think it was about how Haskell manages (or if it manages at all) to
create precompiled (possibly dynamically-loadable) libraries which
nevertheless work with any user-defined type and user-defined functions.

  In object-oriented programming this is typically achieved with dynamic
binding (i.e. virtual functions): The precompiled library doesn't need
to know the user-defined type as long as it has been inherited from a
base class defined in the library. The virtual table attached to objects
of that class allows the precompiled library to call the proper user-defined
function. (In a way you could say that the user code is telling the library
which function to call for each of the virtual functions in the base class.)

  However, how does Haskell manage to do this? Suppose you have something
like this precompiled into a dynamically loadable library:

foldl1 (+) someList

  Also assume the user has defined his own element type for the list and
the (+) function for that element type, and then calls the dynamically
loaded library by giving it a list with elements of that type. How does
the library know which (+) function to call? How does it know that a (+)
function exists for that element type in the first place?

-- 
                                                          - Warp


From: pan
Subject: Re: Found the dynamic optimization things...
Date: 23 Sep 2008 18:01:07
Message: <48d96723@news.povray.org>
"Warp" <war### [at] tagpovrayorg> wrote in message 
news:48d92c0d@news.povray.org...
> Darren New <dne### [at] sanrrcom> wrote:
>> http://steve-yegge.blogspot.com/2008/05/dynamic-languages-strike-back.html
>
> With all the talk about fancy new languages which are so super-efficient
> and so super-fast and so super-secure and whatever, I've not seen too much
> discussion about one thing: What about dynamically loadable libraries?

DLLs make portable software hard.


From: Darren New
Subject: Re: Found the dynamic optimization things...
Date: 23 Sep 2008 21:42:35
Message: <48d99b0b$1@news.povray.org>
Warp wrote:
> Darren New <dne### [at] sanrrcom> wrote:
>> Each process does have its own garbage collector, but that's good 
>> because each process can use a different garbage collector, depending on 
>> the types of garbage it collects.
> 
>   But doesn't that make each program selfish? In other words, it only takes
> care of itself not running out of memory but completely disregards any other
> program running in the system at the same time?

Yes? And how is that different from, say, Windows or Linux? Maybe I'm 
not understanding your point.

I don't think I've ever written a program for Windows or Linux and 
worried about how much memory other processes in the system might be 
using at the moment. I think I did exactly one program like that on the 
Amiga. Not that I intentionally used more memory than I needed, mind.

What I was describing in that quote is this: The real-time programs use 
a real-time GC, the interactive programs use an interactive GC. It 
really doesn't have too much to do with how *much* memory they use.

Obviously if you have your memory management hardware turned off, you're 
going to have a hard time doing demand paging of stuff off the disk, but 
systems worked with that restriction for years and years. And they have 
mechanisms in place to let you turn that on for particular programs that 
want it, IIRC.

-- 
Darren New / San Diego, CA, USA (PST)


From: Darren New
Subject: Re: Found the dynamic optimization things...
Date: 23 Sep 2008 21:47:49
Message: <48d99c45$1@news.povray.org>
Warp wrote:
>   In a typical modern OS there may be hundreds of processes running even
> if the system has just booted up and the user has not started any program
> of his own. There are all kinds of drivers, services, task managers,
> window managers, firewalls... you name it. Every single one of them uses
> the same system libraries (e.g. typically libc plus a few others in Linux).

I understand that. How much of libc is actually used in common by a 
majority of those programs? What sorts of things do you think are in 
libc that are used by all those programs, other than (let's see) I/O and 
perhaps the floating-point stuff (which probably isn't used by too many 
device drivers, task managers, or firewalls :-)?

In any case, as I said, that system leaves open the possibility of 
making libraries like that available at certain shared physical 
addresses if you wanted, or of setting them up as services in their own 
process for large discrete-functionality packages. Indeed, this is 
exactly how the kernel itself is set up. They just don't do it for 
other packages yet, as far as I know.

>   Just because the user has not started any program doesn't mean there
> isn't a big bunch of programs running.

Yes. And far more in Singularity, because all those things actually are 
separate processes.

-- 
Darren New / San Diego, CA, USA (PST)


From: Invisible
Subject: Re: Found the dynamic optimization things...
Date: 24 Sep 2008 04:55:07
Message: <48da006b$1@news.povray.org>
Warp wrote:

>   I faintly remember asking about this subject in the other group once.
> I think it was about how Haskell manages (or if it manages at all) to
> create precompiled (possibly dynamically-loadable) libraries which
> nevertheless work with any user-defined type and user-defined functions.

For statically-linked libraries, GHC needs both the object code and an 
"interface file" which tells it everything it could need to know about 
the stuff exported from the module (e.g., how many bytes of space does 
this exported type take up?) For dynamically-linked libraries, it's more 
tricky.

>   In object-oriented programming this is typically achieved with dynamic
> binding (i.e. virtual functions): The precompiled library doesn't need
> to know the user-defined type as long as it has been inherited from a
> base class defined in the library. The virtual table attached to objects
> of that class allows the precompiled library to call the proper user-defined
> function. (In a way you could say that the user code is telling the library
> which function to call for each of the virtual functions in the base class.)
> 
>   However, how does Haskell manage to do this? Suppose you have something
> like this precompiled into a dynamically loadable library:
> 
> foldl1 (+) someList
> 
>   Also assume the user has defined his own element type for the list and
> the (+) function for that element type, and then calls the dynamically
> loaded library by giving it a list with elements of that type. How does
> the library know which (+) function to call? How does it know that a (+)
> function exists for that element type in the first place?

The foldl1 function expects a list and a function that can operate on 
the elements of that list. In this case, all that happens is that the 
*caller* passes foldl1 a pointer to the appropriate implementation of 
(+) for the data type in question. The foldl1 function itself knows 
nothing about this; it's up to the caller. Since the caller presumably 
"knows about" this user-defined data type, that's no problem.

A more interesting example might be if somebody does

   sum someList

Now the library function "sum" is being passed a list. The type checker 
will ensure that the list can be summed, but how does the precompiled 
"sum" function know what the hell function to sum it with?

The answer is that under the covers, the compiled "sum" function 
actually takes an extra parameter pointing to a virtual table - exactly 
like in an OOP language. The caller has to provide a pointer to the 
correct table for the type in question.

And what if the caller doesn't know the type either? Well then the 
caller also takes a pointer as an extra hidden argument. And so on and 
so forth until we reach a point in the code where the type *is* 
statically known.

To summarise: if a type is statically known, the correct function is 
looked up at compile time. If a type is unknown at compile time, a 
virtual table pointer is secretly passed in.
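
In other words, the compiled code is morally something like this - a 
sketch only; NumDict, sumD and intDict are names I've just invented, 
and GHC's real dictionaries differ in detail:

   -- The "virtual table" is just a record of functions:
   data NumDict a = NumDict
       { plus :: a -> a -> a
       , zero :: a
       }

   -- The compiled "sum" takes the table as an ordinary extra argument:
   sumD :: NumDict a -> [a] -> a
   sumD d = foldl (plus d) (zero d)

   -- Where the type *is* statically known, the compiler supplies the
   -- right table at the call site:
   intDict :: NumDict Int
   intDict = NumDict { plus = (+), zero = 0 }

   example :: Int
   example = sumD intDict [1, 2, 3]   -- 6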

(Actually, "sum" is a tiny little function, so it's rather likely to be 
inlined. If the function it's immediately inlined into knows the type 
statically, all the vtable lookups get optimised out. Alternatively, 
"sum" is still optimised to perform only 1 vtable lookup, not 1 on each 
iteration.)


From: Invisible
Subject: Re: Found the dynamic optimization things...
Date: 24 Sep 2008 05:57:16
Message: <48da0efc$1@news.povray.org>
Orchid XP v8 wrote:

> I'll see if I can figure out what the current status of this feature 
> is... I'm curious myself.

Building libraries as DLLs used to work, but is currently broken.

You can still build a whole Haskell program as a DLL rather than an EXE, 
but that's all. (And why the hell would you want to? Load up 5 Haskell 
DLLs and you have 5 copies of the RTS and several copies of the Haskell 
libraries statically linked in... urgh!)

It was working until roughly GHC 6.4 (~2005), and then it broke. 
Apparently it's due to be back (on all platforms) in the next version of 
GHC - which is now actually overdue for release. (There was supposed to 
be an RC out by now.) The various developer docs are unclear as to 
whether this feature really will be present or not.


From: Warp
Subject: Re: Found the dynamic optimization things...
Date: 24 Sep 2008 09:22:33
Message: <48da3f19@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> >   But doesn't that make each program selfish? In other words, it only takes
> > care of itself not running out of memory but completely disregards any other
> > program running in the system at the same time?

> Yes? And how is that different from, say, Windows or Linux? Maybe I'm 
> not understanding your point.

  When a C/C++ program frees at least a certain amount of memory in Windows
and Linux, that memory is also returned to the system, and becomes available
to other programs.

  If a GC'd program never runs the GC, it will keep all that memory reserved
even though it doesn't use it. Moreover, a GC'd system often allocates a
lot more memory than it really needs (because "freed" memory cannot be
reused until the GC is run).

-- 
                                                          - Warp


From: Warp
Subject: Re: Found the dynamic optimization things...
Date: 24 Sep 2008 09:28:43
Message: <48da408a@news.povray.org>
Invisible <voi### [at] devnull> wrote:
> >   However, how does Haskell manage to do this? Suppose you have something
> > like this precompiled into a dynamically loadable library:
> > 
> > foldl1 (+) someList
> > 
> >   Also assume the user has defined his own element type for the list and
> > the (+) function for that element type, and then calls the dynamically
> > loaded library by giving it a list with elements of that type. How does
> > the library know which (+) function to call? How does it know that a (+)
> > function exists for that element type in the first place?

> The foldl1 function expects a list and a function that can operate on 
> the elements of that list. In this case, all that happens is that the 
> *caller* passes foldl1 a pointer to the appropriate implementation of 
> (+) for the data type in question. The foldl1 function itself knows 
> nothing about this; it's up to the caller. Since the caller presumably 
> "knows about" this user-defined data type, that's no problem.

  You didn't understand me. The line "foldl1 (+) someList" is *in* the
precompiled library. It was not written by the user.

  When you are compiling the library, you don't know what the type of the
list elements is, nor which (+) function is the correct one to call for
that type.

> To summarise: if a type is statically known, the correct function is 
> looked up at compile time. If a type is unknown at compile time, a 
> virtual table pointer is secretly passed in.

  Exactly where is this virtual table pointer stored?

-- 
                                                          - Warp


From: Invisible
Subject: Re: Found the dynamic optimization things...
Date: 24 Sep 2008 09:44:49
Message: <48da4451$1@news.povray.org>
Warp wrote:

>   You didn't understand me. The line "foldl1 (+) someList" is *in* the
> precompiled library. It was not written by the user.

Ah, I see.

Well "foldl1 (+)" is the implementation of "sum", which is indeed in the 
Haskell standard library (and hence precompiled). So what you're asking 
is, "what happens if I call 'sum' on some custom datatype that I just 
wrote?"

>   Exactly where is this virtual table pointer stored?

Each time you create a datatype that supports (+), (-), etc., the 
compiler generates a table pointing to the appropriate implementations 
of these functions for that datatype.

Each time you compile a function that accepts an arbitrary numeric type, 
the compiler secretly adds an extra pointer argument to that function. 
When a caller calls that function, the compiler secretly adds a pointer 
to the correct table into the function call. (And as I noted, if the 
caller doesn't know the type, it must have received a pointer itself in 
the same way, so it just passes that on.)

Does that make sense?
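
Concretely, here's the shape of it, continuing the invented NumDict 
sketch from my earlier post (Vec2 is another made-up type, and NumDict 
is restated so this sketch stands alone):

   data Vec2 = Vec2 Double Double deriving (Eq, Show)

   data NumDict a = NumDict { plus :: a -> a -> a, zero :: a }

   -- The table the compiler generates, in effect, from the user's
   -- "instance Num Vec2" declaration:
   vec2Dict :: NumDict Vec2
   vec2Dict = NumDict
       { plus = \(Vec2 a b) (Vec2 c d) -> Vec2 (a + c) (b + d)
       , zero = Vec2 0 0
       }

   -- ...so "sum someVecs" compiles to roughly "sumD vec2Dict someVecs",
   -- with the table passed along by the caller.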


From: Darren New
Subject: Re: Found the dynamic optimization things...
Date: 24 Sep 2008 11:48:34
Message: <48da6152$1@news.povray.org>
Warp wrote:
>   When a C/C++ program frees at least a certain amount of memory in Windows
> and Linux, that memory is also returned to the system, and becomes available
> to other programs.

True. On the other hand, if you have a lot of small allocations and you 
can't condense them, you can wind up wasting a lot of memory because you 
have a few bytes allocated on each of dozens of pages. With a compacting 
GC, this situation doesn't occur.

>   If a GC'd program never runs the GC, it will keep all that memory reserved
> even though it doesn't use it. 

I imagine in this case you'd need GC policies that trigger collections 
sensibly. For example, you might force a GC after every N new pages are 
allocated.
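
(For illustration, GHC Haskell exposes System.Mem.performGC for forcing 
a collection by hand; the every-N trigger below is an invented policy, 
just a sketch, not how any real runtime decides:)

   import Control.Monad (forM_, when)
   import System.Mem (performGC)

   -- Allocate in a loop, forcing a collection every 100 iterations.
   main :: IO ()
   main = forM_ [1 :: Int .. 1000] $ \i -> do
       let chunk = replicate 4096 i          -- throwaway allocation
       print (length chunk)                  -- use it, so it's really built
       when (i `mod` 100 == 0) performGC     -- "force a GC after every N"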

Now you have me curious - I'll have to look at the code to see when the 
GC gets triggered. It'll be interesting to see how easy that is to find.

> Moreover, a GC'd system often allocates a
> lot more memory than it really needs (because "freed" memory cannot be
> reused until the GC is run).

On the other hand, a compacting GC doesn't have wasted space in the 
pages where the data is stored after a GC runs.

Certainly, a GCed program that doesn't run the GC often enough will 
waste memory, just like a Windows program that doesn't deallocate its 
memory resources when it's finished will use more memory than it needs 
to.  If your video game doesn't clean up the structures for dead aliens 
until you get to the end of the level, you're likely using up more 
memory than you need to.

Doctor, Doctor, it hurts when I do this.

-- 
Darren New / San Diego, CA, USA (PST)

