On 3/16/2012 23:29, Warp wrote:
> Darren New <dne### [at] san rr com> wrote:
>> On 3/16/2012 12:38, Warp wrote:
>>> I think you are talking about system resources. A program may use other
>>> resources than simply system resources.
>
>> Like what?
>
> "When the last reference to this sprite dies, remove it from the screen."
I will point you to Smalltalk, which had no trouble doing things like this.
> "When the last reference to this timer dies, remove it from the runloop."
I will point you at Smalltalk, which had no trouble doing things like this. :-)
> And so on. (Actual real-life examples.)
Again, you seem to be assuming that the OS isn't garbage collected. If you
just GCed the timer, why would it still be firing events? If you just GCed
the sprite, why would it still be drawing on the screen? What's going to
refresh it?
Now, granted, if you have hardware sprites that don't actually stop drawing
when you write over them, you'd need something in the OS to handle that, but
that's again the OS's job to regulate shared hardware.
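The timer case is easy to sketch. Here's a minimal illustration (my own hypothetical `RunLoop` and `Timer`, not any real framework's API) of a runloop that holds only weak references, so dropping the last strong reference to a timer silently removes it from the loop - no finalizer needed:

```python
import weakref

class Timer:
    def __init__(self, interval):
        self.interval = interval

    def fire(self):
        print("tick")

class RunLoop:
    """Holds only weak references to timers, so a timer whose last
    strong reference dies simply disappears from the loop."""
    def __init__(self):
        self._timers = weakref.WeakSet()

    def add(self, timer):
        self._timers.add(timer)

    def tick(self):
        # Only timers still strongly referenced elsewhere will fire.
        for t in list(self._timers):
            t.fire()

loop = RunLoop()
t = Timer(1.0)
loop.add(t)
loop.tick()   # fires: t is still alive
del t         # last strong reference gone
loop.tick()   # in CPython the timer has vanished; nothing fires
```

(In CPython the reference count drops to zero immediately; under a tracing GC the timer would stop firing only after the next collection, which is the usual caveat with this pattern.)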
> A system cannot offer *everything* that a programmer might ever want.
> At some level a feature has to be implemented. If you want CoW, that has
> to be implemented somewhere. It cannot just magically work out of nowhere.
Sure. And my point is that if you're writing an OS designed for languages
that are GCed, then that sort of thing belongs in either the compiler or the
OS. Just like if you're writing a database designed for multiple
applications to access it at once, you don't just go "well, leave the
locking out and let the applications themselves worry about that, because
you have to implement it somewhere."
No, if you don't implement GC in the OS, you need finalizers to tell the OS
that you're done with a resource. If you *do* implement it in the OS, the OS
knows you're done with the resource because it knows there aren't any more
references to the resource.
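To make the "finalizers tell the OS" half concrete, here's a sketch of that pattern in Python, wrapping a raw file descriptor (my own hypothetical `RawFile` class; `weakref.finalize` and `os.open`/`os.close` are real APIs):

```python
import os
import tempfile
import weakref

class RawFile:
    """Wraps an OS file descriptor. The finalizer is the program
    telling the OS "I'm done": it runs when the wrapper's last
    reference is collected."""
    def __init__(self, path):
        self.fd = os.open(path, os.O_RDONLY)
        self._finalizer = weakref.finalize(self, os.close, self.fd)

with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"hello")

f = RawFile(tmp.name)
fin = f._finalizer
del f            # last reference gone; in CPython os.close runs right here
print(fin.alive) # False: the fd has already been returned to the OS
os.unlink(tmp.name)
```

If the OS itself did the GC, none of this glue would exist: the kernel would reclaim the descriptor when it saw no remaining references, which is exactly the distinction above.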
That said, your idea of COW is an interesting case. I remember you talking
about it before. And I'll grant that it's not the sort of thing that's
trivial to do without keeping track of how many references there are to the
object. But I'd rather see this as something like a different type of class,
rather than taking advantage of a more global functionality designed to
bypass limitations in the OS. In other words, your COW doesn't really need
finalizers. It needs a way of knowing how many references there are to your
writable block, or at least whether it's already shared. Clearly the modern
OSes already support copy-on-write semantics (leading to the OOM killer, for
example), so it may well be that we're solving this particular problem at
the wrong level of the architecture.
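That "needs to know whether it's already shared" point fits in a few lines. Here's a hypothetical `CowBuffer` sketch (my own toy class, not how any real OS does COW pages) that keeps an explicit share count and copies only when a shared block is written:

```python
class CowBuffer:
    """Copy-on-write byte buffer with an explicit share count.
    Shares one [storage, count] cell until someone writes."""
    def __init__(self, data):
        self._store = [bytearray(data), 1]  # shared bytes + share count

    def share(self):
        # No copy yet: the clone points at the same storage cell.
        clone = CowBuffer.__new__(CowBuffer)
        clone._store = self._store
        self._store[1] += 1
        return clone

    def write(self, i, value):
        buf, count = self._store
        if count > 1:
            # Someone else still sees the old bytes: copy before writing.
            self._store[1] -= 1
            self._store = [bytearray(buf), 1]
        self._store[0][i] = value

    def read(self, i):
        return self._store[0][i]

a = CowBuffer(b"abc")
b = a.share()          # no copy; both see the same bytes
b.write(0, ord("z"))   # b copies first, because the count is 2
print(a.read(0), b.read(0))  # 97 122: a is untouched, b is modified
```

Note that nothing here needs a finalizer or global GC hooks - just the share count, which is the narrower facility the paragraph above is arguing for.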
Now, granted, I think a mechanism whereby you can mark a particular class as
(say) having no circular references and needing reference-counted GC could
be worthwhile, and maybe that would buy you something in various cases like
your COW, or in other circumstances like network sockets, where you're
necessarily talking to something that can't be garbage collected and you
want it released as soon as possible. But mostly it's still the sort of
thing that should go in the compiler so everyone can use it.
--
Darren New, San Diego CA, USA (PST)
"Oh no! We're out of code juice!"
"Don't panic. There's beans and filters
in the cabinet."