Darren New <dne### [at] sanrrcom> wrote:
> On 3/16/2012 12:38, Warp wrote:
> > I think you are talking about system resources. A program may use other
> > resources than simply system resources.
> Like what?
"When the last reference to this sprite dies, remove it from the screen."
"When the last reference to this timer dies, remove it from the runloop."
And so on. (Actual real-life examples.)
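These read naturally as finalizers in a language with deterministic destruction. A minimal sketch, assuming Python, with invented `Screen`/`Sprite` names standing in for the real thing; under CPython's reference counting, `weakref.finalize` runs the moment the last reference dies:

```python
import weakref

class Screen:
    def __init__(self):
        self.sprites = []   # what the renderer draws each frame

class Sprite:
    def __init__(self, screen):
        self._draw_list = screen.sprites
        self._draw_list.append(id(self))
        # "When the last reference to this sprite dies, remove it from
        # the screen" -- the callback must not capture self, or the
        # sprite could never be collected.
        weakref.finalize(self, self._draw_list.remove, id(self))

screen = Screen()
s = Sprite(screen)
print(len(screen.sprites))  # 1
del s                       # last reference dies
print(len(screen.sprites))  # 0 -- the finalizer pulled it off the screen
```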
> > The quintessential example would be if you wanted to implement a
> > copy-on-write mechanism.
> That hasn't anything to do with finalizers. In advanced systems, that sort
> of stuff isn't something you write in the application code, either, any more
> than worrying about taking things out of the B-tree is something a SQL
> programmer worries about when deleting a row.
Sure. If that's the principle, then every program you could ever want
to write is rather simple: Just something like "do_what_i_want();"
A system cannot offer *everything* that a programmer might ever want.
At some level a feature has to be implemented. If you want CoW, that has
to be implemented somewhere. It cannot just magically work out of nowhere.
--
- Warp
On 3/16/2012 23:29, Warp wrote:
> Darren New<dne### [at] sanrrcom> wrote:
>> On 3/16/2012 12:38, Warp wrote:
>>> I think you are talking about system resources. A program may use other
>>> resources than simply system resources.
>
>> Like what?
>
> "When the last reference to this sprite dies, remove it from the screen."
I will point you to Smalltalk, which had no trouble doing things like this.
> "When the last reference to this timer dies, remove it from the runloop."
I will point you at Smalltalk, which had no trouble doing things like this. :-)
> And so on. (Actual real-life examples.)
Again, you seem to be assuming that the OS isn't garbage collected. If you
just GCed the timer, why would it still be firing events? If you just GCed
the sprite, why would it be still drawing on the screen? What's going to
refresh it?
Now, granted, if you have hardware sprites that don't actually stop drawing
when you write over them, you'd need something in the OS to handle that, but
that's again the OS's job to regulate shared hardware.
> A system cannot offer *everything* that a programmer might ever want.
> At some level a feature has to be implemented. If you want CoW, that has
> to be implemented somewhere. It cannot just magically work out of nowhere.
Sure. And my point is that if you're writing an OS designed for languages
that are GCed, then that sort of thing belongs in either the compiler or the
OS. Just like if you're writing a database designed for multiple
applications to access it at once, you don't just go "well, leave the
locking out and let the applications themselves worry about that, because
you have to implement it somewhere."
No, if you don't implement GC in the OS, you need finalizers to tell the OS
that you're done with a resource. If you *do* implement it in the OS, the OS
knows you're done with the resource because it knows there aren't any more
references to the resource.
That said, your idea of COW is an interesting case. I remember you talking
about it before. And I'll grant that it's not the sort of thing that's
trivial to do without keeping track of how many references there are to the
object. But I'd rather see this as something like a different type of class,
rather than taking advantage of a more global functionality designed to
bypass limitations in the OS. In other words, your COW doesn't really need
finalizers. It needs a way of knowing how many references there are to your
writable block, whether it's already shared. Clearly the modern OSes already
support copy-on-write semantics (leading to the OOM killer, for example), so
it's not really obvious that we're not solving this particular problem at
the wrong level of architecture.
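As a sketch of what that "different type of class" might look like: assuming CPython, whose `sys.getrefcount` exposes the payload's reference count directly, a copy-on-write wrapper needs only that count, not a finalizer. The `CowBuffer` name and API are invented for illustration:

```python
import sys

class CowBuffer:
    """Copy-on-write wrapper: sharing is cheap, writes copy only if shared."""

    def __init__(self, data):
        self._data = data             # possibly shared payload

    def share(self):
        return CowBuffer(self._data)  # both wrappers now alias one list

    def write(self, index, value):
        # A count above 2 (one wrapper attribute + getrefcount's own
        # argument) means the payload is shared: copy before mutating.
        if sys.getrefcount(self._data) > 2:
            self._data = list(self._data)
        self._data[index] = value

a = CowBuffer([1, 2, 3])
b = a.share()
b.write(0, 99)
print(a._data[0], b._data[0])  # 1 99 -- b copied first; a is untouched
```

`getrefcount` is CPython-specific; a runtime with a tracing GC would have to maintain an explicit count for such classes, which is exactly the point at issue.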
Now, granted, I can imagine a mechanism whereby you mark a particular
class as (say) having no circular references and needing reference-counted
GC, and maybe that would buy you something in various cases like your COW,
or in other circumstances like network sockets, where you're necessarily
talking to something that can't be garbage collected and you want it
released as soon as possible. But mostly it's still the sort of thing that
should go in the compiler so everyone can use it.
--
Darren New, San Diego CA, USA (PST)
"Oh no! We're out of code juice!"
"Don't panic. There's beans and filters
in the cabinet."
Darren New <dne### [at] sanrrcom> wrote:
> > "When the last reference to this sprite dies, remove it from the screen."
> I will point you to Smalltalk, which had no trouble doing things like this.
> > "When the last reference to this timer dies, remove it from the runloop."
> I will point you at Smalltalk, which had no trouble doing things like this. :-)
And exactly how does Smalltalk know what to do if you don't tell it?
You have to be able to tell it somehow. It cannot guess it by magic.
> > And so on. (Actual real-life examples.)
> Again, you seem to be assuming that the OS isn't garbage collected. If you
> just GCed the timer, why would it still be firing events?
Because you have told the runloop to fire events with that timer on
regular intervals. You have to tell the timer/runloop to stop doing that.
The runloop owns the timer, so there's at least one pointer pointing to
it until you explicitly tell the runtime to stop it.
Same goes for sprites in environments where you tell the runtime
(which might be e.g. a custom game engine) "this object is placed here":
The runtime owns the object, and it will be there until you tell it to
drop it.
How do you tell it to drop them when the last reference to them dies?
With a destructor/finalizer.
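The ownership problem here is easy to see in a garbage-collected setting. A minimal Python sketch (invented `RunLoop`/`Timer` names): the runloop's strong reference alone keeps the timer alive, so collection never happens unless something explicitly stops it:

```python
class RunLoop:
    def __init__(self):
        self.timers = []

    def add(self, timer):
        self.timers.append(timer)   # the runloop owns a strong reference

class Timer:
    pass

loop = RunLoop()
t = Timer()
loop.add(t)
del t                    # the user's last reference dies...
print(len(loop.timers))  # 1 -- the runloop still keeps the timer alive
```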
> No, if you don't implement GC in the OS, you need finalizers to tell the OS
> that you're done with a resource. If you *do* implement it in the OS, the OS
> knows you're done with the resource because it knows there aren't any more
> references to the resource.
But not everything is a system resource.
> That said, your idea of COW is an interesting case. I remember you talking
> about it before. And I'll grant that it's not the sort of thing that's
> trivial to do without keeping track of how many references there are to the
> object. But I'd rather see this as something like a different type of class,
> rather than taking advantage of a more global functionality designed to
> bypass limitations in the OS. In other words, your COW doesn't really need
> finalizers. It needs a way of knowing how many references there are to your
> writable block, whether it's already shared.
I can't think of any other way of knowing whether an object is being
shared than either using deterministic scope-bound reference counting, or
running a GC sweep, which would presumably be extremely heavy if done too
often.
Why can't RAII *and* automatic GC be supported in the same language?
--
- Warp
On 3/17/2012 10:40, Warp wrote:
>>> "When the last reference to this timer dies, remove it from the runloop."
>> I will point you at Smalltalk, which had no trouble doing things like this. :-)
> And exactly how does Smalltalk know what to do if you don't tell it?
> You have to be able to tell it somehow. It cannot guess it by magic.
The timer doesn't fire if you've garbage-collected it.
> Because you have told the runloop to fire events with that timer on
> regular intervals. You have to tell the timer/runloop to stop doing that.
> The runloop owns the timer, so there's at least one pointer pointing to
> it until you explicitly tell the runtime to stop it.
Well, sure. So? I'm not following why this is a problem. If you want the
timer to stop, you stop it.
> Same goes for sprites in environments where you tell to the runtime
> (which might be eg. a custom game engine) "this object is placed here":
> The runtime owns the object and it will be there until you tell it to
> drop it.
>
> How do you tell it to drop them when the last reference to them dies?
> With a destructor/finalizer.
You just told me the last reference will not go away, because it's in the
runloop or the game engine, right? I'm not following.
It sounds like what you're saying is you want a type of object to keep
reference counts, so when you dispose of an object, it can have an action
other than finalizing other objects? I.e., you don't want the timer garbage
collected, but you want the timer to keep track of how many references to it
exist from objects other than the owner, so it can be stopped when the last
user gets collected?
In that case, you use weak references. The runloop would hold a weak
reference to the timer, and when the last user of that timer gets collected,
the timer gets collected out from under the runloop. Now, granted, you might
want the timer or sprite to disappear before the next GC, if that's what
you're talking about, but again that's not an appropriate task for a
finalizer that only runs during GC in the first place.
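The weak-reference scheme might be sketched like this in Python (invented `RunLoop`/`Timer` names; under CPython's reference counting, the weak reference goes dead as soon as the last user's reference dies):

```python
import weakref

class RunLoop:
    def __init__(self):
        self._timers = []

    def add(self, timer):
        # Hold the timer weakly: the runloop alone won't keep it alive.
        self._timers.append(weakref.ref(timer))

    def fire(self):
        # Drop references whose targets were collected; fire the rest.
        self._timers = [r for r in self._timers if r() is not None]
        return len(self._timers)

class Timer:
    pass

loop = RunLoop()
t = Timer()
loop.add(t)
print(loop.fire())  # 1 -- the timer's user still holds it
del t               # last user reference dies
print(loop.fire())  # 0 -- the weak ref went dead; no explicit stop needed
```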
The other question is whether you want the timer to keep alive the objects
whose methods it invokes when the timer fires. In C++, you can delete the
invoked object out from under the timer, but you can't do that in a GCed
language. If you said "Run xyz.pdq() every ten seconds", then the xyz
instance is going to hang around so it can be run, so asking how to stop the
timer when nothing remains to be run by it doesn't even make sense, from
that point of view.
Do you want the fact that the sprite is on the screen to keep the sprite
object alive? Or are you really saying "I want to use scope to keep track of
when to start and stop various processes"?
> Why can't RAII *and* automatic GC be supported in the same language?
I think not so much RAII as reference counting. I fully support having weak
references as well as reference-counted objects. (I think actually that
Python supports both.) Reference-counted objects are high overhead compared
to GCed objects, tho, so unless there's really a reason you promptly need
them to free their resources, you probably want to avoid declaring your
class that way. And if you manage to get a circular loop of
reference-counted objects, your reference counting is going to be screwed up
anyway, so all the more reason to mark reference-counted classes as special
- you can have the compiler check that no reference-counted class can
transitively point to an instance of itself.
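The cycle problem is easy to demonstrate in CPython, which happens to combine reference counting with a tracing cycle collector; disabling the collector approximates a class reclaimed by pure reference counting:

```python
import gc
import weakref

gc.disable()   # approximate pure reference counting: no cycle collector

class Node:
    def __init__(self):
        self.next = None

a, b = Node(), Node()
a.next, b.next = b, a     # a transitively self-pointing pair
probe = weakref.ref(a)
del a, b                  # the refcounts never reach zero
print(probe() is None)    # False -- the cycle leaks under pure refcounting

gc.enable()
gc.collect()              # a tracing collector has no such trouble
print(probe() is None)    # True
```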
--
Darren New, San Diego CA, USA (PST)
"Oh no! We're out of code juice!"
"Don't panic. There's beans and filters
in the cabinet."
Now that's interesting. I had a go with NetBeans on my laptop while I
was in Switzerland, and it didn't keep giving me random build failures
for no apparent reason. And while it was still slow, it wasn't
unacceptably unresponsive. All of which is interesting, because when I
originally tried it out, it was running on a more powerful PC.
(Admittedly in a VM, but it's using hardware virtualisation, and no
other applications seemed unduly slow.)
Also, it appears that NetBeans has wired-in support for Git, Mercurial
and Subversion. Obviously my source control system of choice is not
supported, largely because nobody has ever heard of it. I did try to use
Git though. I /presume/ it's recording my changes, because damned if I
can find any way of, you know, /looking at/ the change history. :-P Just
to be confusing, NetBeans keeps its own session history as well, so if
you just accidentally edited the wrong file or something, you can
quickly pull up the last few diffs and revert them.
Is there some kind of tool you can use to /actually see/ what's in a Git
repository? Because NetBeans isn't being very helpful here.
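Assuming a stock Git installation, the plain command line can answer this even when the IDE can't. A few standard commands (run from inside the working copy):

```shell
git log --oneline --graph --decorate --all   # compact history of every branch
git show HEAD                                # the latest commit, with its diff
git diff HEAD~1 HEAD                         # what changed in the last commit
gitk --all                                   # Tcl/Tk history browser shipped with git
```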