POV-Ray : Newsgroups : povray.off-topic : The ups and downs of computing
The ups and downs of computing (Message 1 to 8 of 8)
From: Invisible
Subject: The ups and downs of computing
Date: 3 Dec 2010 11:34:16
Message: <4cf91c08$1@news.povray.org>
Some while ago, they added a feature to the Haskell compiler where you 
can link Haskell libraries dynamically rather than statically.

However, this feature works only on Unix, because Haskell hates Windows.

*sigh*


But wait! What's this? The latest and greatest version of GHC, just 
released, now supports Windows DLLs too. Cool!

...well, partly cool anyway.

Good things:

- Compiling the Hello World program with static libraries gives me a 
518KB file, but with dynamic libraries that goes down to 13KB.
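(For reference, this is presumably the sort of program being measured — the actual source isn't shown in the post, so this is a minimal sketch. The ~500KB difference is the statically linked RTS and base library moving out into the shared libraries.)

```haskell
-- Minimal Hello World of the kind used for the size comparison.
-- Statically linked, it carries the whole RTS and base library;
-- dynamically linked, it's little more than a stub plus DLL references.
greeting :: String
greeting = "Hello, world!"

main :: IO ()
main = putStrLn greeting
```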

Bad things:

- The program doesn't actually /run/ any more. It can't find the 
necessary DLLs. It seems that /you/ have to manually copy them to 
somewhere sensible and/or alter the search path.

- Finding the DLL files is non-trivial. Rather than putting them all in 
one folder, the installer scatters them across the filesystem in a 
pseudo-random manner. (This is quite probably why the DLL loader can't 
find them either.)

- Hello World requires the DLLs for the Haskell run-time system, Haskell 
compiler primitives, foreign function interface, arbitrary-precision 
integers, and of course the Haskell "base" library. That's 5 DLLs you 
have to have in your search path.

- These five DLLs add up to 8MB. (!)

03/12/2010  03:55 PM            12,814 HelloWorld.exe
13/11/2010  12:16 AM         7,015,961 libHSbase-4.3.0.0-ghc7.0.1.dll
13/11/2010  12:18 AM            35,854 libHSffi-ghc7.0.1.dll
13/11/2010  12:16 AM           676,886 libHSghc-prim-0.2.0.0-ghc7.0.1.dll
13/11/2010  12:16 AM           394,520 libHSinteger-gmp-0.2.0.2-ghc7.0.1.dll
13/11/2010  12:18 AM           266,240 libHSrts-ghc7.0.1.dll
                6 File(s)      8,402,275 bytes

And you thought MSVBVM50.DLL was a pain! >_<

(I especially love the way that I'm not actually /using/ FFI or 
arbitrary-precision integers, and yet I must have the DLLs installed. 
But then, *I* am not using FFI, but the entire Haskell I/O system is 
based on it. Similarly, file sizes are returned as arbitrary-precision 
integers, so the DLL for that is required.)
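(The Integer point is visible directly in the types: hFileSize in System.IO returns an Integer, so even a program that never mentions arbitrary precision pulls that library in transitively. A minimal sketch — the file name is made up for the example:)

```haskell
-- Even trivial I/O drags in arbitrary-precision integers:
-- hFileSize returns Integer, not Int.
import System.IO

-- a helper, so the Integer dependency is visible in a signature
fileSizeOf :: FilePath -> IO Integer
fileSizeOf path = withFile path ReadMode hFileSize

main :: IO ()
main = do
  writeFile "hello.txt" "Hello, world!\n"  -- create our own input file
  n <- fileSizeOf "hello.txt"
  print n  -- 14 on Unix; newline translation may differ on Windows
```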

In summary, I don't think I'll rush out and use this feature anytime soon.



Also entertaining is the way that several new features of the compiler 
aren't accurately reflected in all of the documentation. (E.g., some 
chapters claim that shared libraries don't work on Windows, while others 
assert that they do...)

PS. I just tried compiling with dynamic libraries using an older version 
of the compiler (i.e., one which supports it on Unix but not on 
Windows). Rather than saying "sorry, this is not supported on Windows", 
or even just "unrecognised flag name", instead it crashes with a 
file-not-found error from GCC. Graceful failure, anyone?



From: Darren New
Subject: Re: The ups and downs of computing
Date: 3 Dec 2010 12:16:02
Message: <4cf925d2$1@news.povray.org>
Invisible wrote:
> - The program doesn't actually /run/ any more. It can't find the 
> necessary DLLs. It seems that /you/ have to manually copy them to 
> somewhere sensible and/or alter the search path.

DLLs are searched the same place that .exes are searched, except with the 
addition of the directory that the .exe is in, if there's also a magic file 
that says "search this directory too" to circumvent DLL hell. :-)
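(For anyone hitting the missing-DLL error, the practical workaround is getting the GHC lib directories onto the search path. A rough sketch of inspecting the path from Haskell — the splitter is hand-rolled so the example needs only base; the separator is ';' on Windows, ':' elsewhere:)

```haskell
import System.Environment (lookupEnv)

-- minimal splitter so the example depends only on base
splitOn :: Char -> String -> [String]
splitOn c s = case break (== c) s of
  (chunk, [])     -> [chunk]
  (chunk, _:rest) -> chunk : splitOn c rest

main :: IO ()
main = do
  mpath <- lookupEnv "PATH"
  case mpath of
    Nothing   -> putStrLn "PATH is not set"
    Just path -> mapM_ putStrLn (splitOn ';' path)  -- ';' on Windows, ':' on Unix
```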

> - These five DLLs add up to 8MB. (!)

Well, sure. Because you're not just selecting at link time the routines you 
use.  If you have 30 Haskell programs installed, that's a net win.

> In summary, I don't think I'll rush out and use this feature anytime soon.

You should, so you can report on success and failure, and improve the 
product. You *are* one of only 38 beta testers, after all.

> Also entertaining is the way that several new features of the compiler 
> aren't accurately reflected in all of the documentation. (E.g., some 
> chapters claim that shared libraries don't work on Windows, while others 
> assert that they do...)

Welcome to open source, where by the time you're competent enough to fix the 
documentation, you no longer need the documentation!

-- 
Darren New, San Diego CA, USA (PST)
   Serving Suggestion:
     "Don't serve this any more. It's awful."



From: Orchid XP v8
Subject: Re: The ups and downs of computing
Date: 3 Dec 2010 16:53:41
Message: <4cf966e5$1@news.povray.org>
>> - The program doesn't actually /run/ any more. It can't find the
>> necessary DLLs. It seems that /you/ have to manually copy them to
>> somewhere sensible and/or alter the search path.
>
> DLLs are searched the same place that .exes are searched, except with
> the addition of the directory that the .exe is in, if there's also a
> magic file that says "search this directory too" to circumvent DLL hell.
> :-)

I'm just loving the fact that installing the compiler and telling it to 
compile something is broken out of the box. The fact that it does 
eventually work proves that somebody actually tested it, but one 
wonders what the hell kind of testing they did while managing to miss 
this one...

Also: What about DLLs registered in the... registry?

And didn't "DLL hell" get fixed, like, 10 years ago?

>> - These five DLLs add up to 8MB. (!)
>
> Well, sure. Because you're not just selecting at link time the routines
> you use. If you have 30 Haskell programs installed, that's a net win.

Apparently the compiler's dead code elimination works quite well. But 
anyway, I guess you'd have to have quite a lot of Haskell executables 
before the disk space and setup complexity of DLLs makes it even vaguely 
worth the effort.

>> In summary, I don't think I'll rush out and use this feature anytime
>> soon.
>
> You should, so you can report on success and failure, and improve the
> product. You *are* one of only 38 beta testers, after all.

I'm sure the number of people testing on Windows is nearer to 2. Hell, 
even Mac OS seems to get better support than Windows...

> Welcome to open source, where by the time you're competent enough to fix
> the documentation, you no longer need the documentation!

Yeah, tell me about it.

-- 
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*



From: Darren New
Subject: Re: The ups and downs of computing
Date: 3 Dec 2010 18:19:13
Message: <4cf97af1$1@news.povray.org>
Orchid XP v8 wrote:
> wonders what the hell kind of testing they did while managing to miss 
> this one...

Yeah, the same was true of the Eiffel compiler when I bought it. Eiffel, the 
language for reliable software!

Anyway, chances are the person who tested it had already installed it and 
wound up running it against a different file. It's the sort of thing that 
happens when you've got a small shop instead of someone whose job it is to 
spend extra time to make sure it works on a vanilla machine.

> Also: What about DLLs registered in the... registry?

That's not DLLs. That's COM classes stored in DLLs.

> Apparently the compiler's dead code elimination works quite well.

It's not hard if every function is in a separate .o blob.

-- 
Darren New, San Diego CA, USA (PST)
   Serving Suggestion:
     "Don't serve this any more. It's awful."



From: scott
Subject: Re: The ups and downs of computing
Date: 6 Dec 2010 04:56:42
Message: <4cfcb35a@news.povray.org>
> Apparently the compiler's dead code elimination works quite well. But 
> anyway, I guess you'd have to have quite a lot of Haskell executables 
> before the disk space and setup complexity of DLLs makes it even vaguely 
> worth the effort.

It's a huge win if you're repeatedly transferring the same programs to the 
same machine (eg rolling out updates).  Also given that people downloading 
your Haskell programs might also be looking at other Haskell programs, it's 
probably worth offering two versions to download (one that requires the 
DLLs, one that doesn't).



From: Invisible
Subject: Re: The ups and downs of computing
Date: 10 Dec 2010 04:23:28
Message: <4d01f190@news.povray.org>
On 06/12/2010 09:56 AM, scott wrote:
>> Apparently the compiler's dead code elimination works quite well. But
>> anyway, I guess you'd have to have quite a lot of Haskell executables
>> before the disk space and setup complexity of DLLs makes it even
>> vaguely worth the effort.
>
> It's a huge win if you're repeatedly transferring the same programs to
> the same machine (eg rolling out updates). Also given that people
> downloading your Haskell programs might also be looking at other Haskell
> programs, it's probably worth offering two versions to download (one that
> requires the DLLs, one that doesn't).

I haven't tested this (actually I /can't/ test this), but I have a 
sinking feeling that the DLLs have to be generated by the same compiler 
version as the executable using them.

That means that if you have two Haskell programs compiled with different 
versions of GHC, they /still/ can't actually share DLLs.

(OTOH, since GHC 7.0.1 is currently the /only/ version of GHC in 
existence that can create DLLs this way, currently it's a non-issue. 
This is also the reason why I can't test it...)

Of course, if you're distributing a bunch of related Haskell programs 
together, they will (presumably) all be compiled together, and they can 
share DLLs.

Still, the idea of making a shared library that needs to be recompiled 
for every patch-level release of the compiler is... uh... unappealing?



From: Invisible
Subject: Re: The ups and downs of computing
Date: 10 Dec 2010 04:25:51
Message: <4d01f21f$1@news.povray.org>
>> wonders what the hell kind of testing they did while managing to miss
>> this one...
>
> Yeah, the same was true of the Eiffel compiler when I bought it. Eiffel,
> the language for reliable software!

Heh, yeah. (Although... I thought it was actually Eiffel, the language 
for *reusable* software?)

> Anyway, chances are the person who tested it had already installed it
> and wound up running it against a different file. It's the sort of thing
> that happens when you've got a small shop instead of someone whose job
> it is to spend extra time to make sure it works on a vanilla machine.

Sounds about right.

>> Also: What about DLLs registered in the... registry?
>
> That's not DLLs. That's COM classes stored in DLLs.

Oh, right.

>> Apparently the compiler's dead code elimination works quite well.
>
> It's not hard if every function is in a separate .o blob.

Yeah, there's an option to do that. It tends to produce thousands of 
tiny object files, which sometimes freaks the linker out. (Apparently.) 
It also takes /forever/ to compile and link.

Even with this option not selected though, the program still ends up 
rather small. Certainly much, much smaller than 8MB. And I'm not sure how...



From: Darren New
Subject: Re: The ups and downs of computing
Date: 10 Dec 2010 11:04:59
Message: <4d024fab$1@news.povray.org>
Invisible wrote:
> Heh, yeah. (Although... I thought it was actually Eiffel, the language 
> for *reusable* software?)

Both. But mainly for reliable software.

>>> Apparently the compiler's dead code elimination works quite well.
>>
>> It's not hard if every function is in a separate .o blob.
> 
> Yeah, there's an option to do that. It tends to produce thousands of 
> tiny object files, which sometimes freaks the linker out. (Apparently.) 
> It also takes /forever/ to compile and link.

No. Then you concatenate them all into a .a file. A .a file is 
essentially just an archive full of .o files.

With a .so, there's no way of knowing which functions the caller will call, 
because linking happens after you've already distributed the .so file.

-- 
Darren New, San Diego CA, USA (PST)
   Serving Suggestion:
     "Don't serve this any more. It's awful."



Copyright 2003-2023 Persistence of Vision Raytracer Pty. Ltd.