Warp wrote:
> Manually closing the file handle when it's not needed anymore is not
> any better than manual memory management: There's always the danger of
> a leak. Functions may be exited at unexpected places, etc.
Of course, all these objections are fairly irrelevant when the OS is
actually designed to help the GC system.
For example, nobody complains "UNIX signals are bad, because if you
don't catch them, your process might have files open when it exits." No,
the OS closes the files for you. It doesn't delete temp files, and it
doesn't roll back half-completed transactional changes to the file
system (e.g., it won't let you rename three files atomically, so that
either all three change or none do, even if you dump core in the
middle), but people don't seem to complain about this aspect for some reason.
In a system that actually actively supports GC, a program trying to
access a file handle that someone else has open (and locked) will cause
a GC in that program to finalize those resources, freeing up the lock,
in exactly the same way that a modern desktop OS will page out inactive
portions of one process to free up memory for other processes.
And again, there are pretty trivial techniques to prevent this from
being a problem. At the top of your main loop, you do a quick GC to
scavenge anything you accidentally left unreleased.
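That top-of-loop scavenge can be sketched in Python (the Resource class and its finalizer are hypothetical stand-ins for a locked file handle, not any real API):

```python
import gc

class Resource:
    """Hypothetical handle whose finalizer releases an OS-level lock."""
    released = []

    def __init__(self, name):
        self.name = name
        self.peer = self  # an accidental reference cycle keeps it alive

    def __del__(self):
        Resource.released.append(self.name)  # stand-in for unlock/close

def one_iteration():
    r = Resource("data.lock")
    # ... do work, then drop the reference without an explicit close ...

one_iteration()
assert "data.lock" not in Resource.released  # cycle: not yet finalized

# Top of the main loop: a quick collection scavenges anything
# accidentally left unreleased, running finalizers and freeing the lock.
gc.collect()
assert "data.lock" in Resource.released
```

The self-cycle defeats CPython's immediate refcounting, which is exactly the "leaked handle" scenario; the explicit `gc.collect()` is what mops it up.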
And of course anything that allocates on the heap has this sort of
problem anyway, unless you wind up essentially writing your own GC.
> So, basically, garbage collection takes care of memory management,
> often at the cost of reducing the possibility of the user implementing
> automatic management of resources other than memory. These resources may
> be system resources, or resources in the program itself. GC seems to
> completely disregard them.
Only naive GC. Of course, sophisticated GC needs support throughout the
system. Some systems have this.
Other systems don't have this sort of problem at all, in that files are
memory resources just like "in-memory" structures, so going out of scope
deletes the file just like you'd expect.
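CPython's tempfile module behaves roughly this way: the temp file is a memory-managed object, and dropping the last reference removes the file on disk. A small sketch:

```python
import os
import tempfile

# A temp file whose on-disk data lives exactly as long as the object.
f = tempfile.NamedTemporaryFile(delete=True)
path = f.name
f.write(b"scratch data")
assert os.path.exists(path)

# Dropping the last reference finalizes the handle; in CPython the
# refcount hits zero immediately, the handle closes, and the file is gone.
del f
assert not os.path.exists(path)
```

The deletion piggybacks on finalization, so "going out of scope" and "file deleted" coincide just as described above.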
> C# may be slightly better in this regard, though.
Syntactically, it's a bit easier (the using statement calls Dispose for
you), but it's essentially the same mechanism under the covers. It's
just easier to invoke.
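Python's with statement makes the same point: the sugar and the hand-written try/finally invoke the identical cleanup hook. A sketch (the file p is a throwaway created just for the demo):

```python
import os
import tempfile

fd, p = tempfile.mkstemp()
os.write(fd, b"hello")
os.close(fd)

# Sugared form: the runtime calls the cleanup hook for you.
with open(p) as f:
    data1 = f.read()

# What it roughly desugars to: the same mechanism, invoked by hand.
f = open(p)
try:
    data2 = f.read()
finally:
    f.close()

assert data1 == data2 == "hello"
os.unlink(p)
```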
> Another problem with GC, which admittedly is often not very relevant,
> but in some cases may be, is that GC is very detrimental with regard
> to swapping.
I don't think I've demand-paged in ages. Certainly not any process I've
let run to completion.
I've noticed that every program I use that actually uses too much data
to conveniently fit in memory (e.g., photoshop, mysql, various other
database-like engines) all manage their own memory-to-disk paging.
I bet you could get a speed boost by leaving out the demand-paging bits
on modern chips and just go back to swapping and overlays in the rare
cases where people care, and let them implement it themselves. You'd
still probably want virtual addressing, so I don't know how much you'd
actually save if you did this.
> If the system was running low on memory to begin with, this can be
> quite detrimental. The system may start swapping like mad, all for
> absolutely no benefit.
I wouldn't say "no benefit". If the system is smart, it'll do a GC
*before* it swaps out the task. That's the sort of thing I'm talking
about: it'll save swapping time and swap space.
Erlang has a primitive called "hibernate", which does a GC on the
process then swaps it out until it gets the next message. Normally you'd
only do this if you expect to be idle for a while (say, the backup
process waiting for the next scheduled backup to roll around).
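The same collect-then-idle pattern can be sketched outside Erlang. Here a hypothetical backup worker (names are illustrative) runs a full GC just before blocking on its queue, so the process is at its smallest when the OS swaps it out:

```python
import gc
import queue
import threading

def backup_worker(inbox: queue.Queue, done: list):
    while True:
        # "Hibernate": compact the heap before going idle, so whatever
        # the OS pages out to disk is as small as possible.
        gc.collect()
        msg = inbox.get()  # block until the next scheduled backup
        if msg == "stop":
            break
        done.append(msg)

inbox, done = queue.Queue(), []
t = threading.Thread(target=backup_worker, args=(inbox, done))
t.start()
inbox.put("nightly-backup")
inbox.put("stop")
t.join()
assert done == ["nightly-backup"]
```

Python's `gc.collect()` only reclaims garbage rather than compacting the heap the way Erlang's hibernate does, so this is an approximation of the idea, not an equivalent.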
> So the question is: If you are making such an application, can you
> tell C# to *not* to use garbage collection? Is GC optional?
Not really.
--
Darren New / San Diego, CA, USA (PST)
"That's pretty. Where's that?"
"It's the Age of Channelwood."
"We should go there on vacation some time."