POV-Ray : Newsgroups : povray.off-topic : GOTO
From: Warp
Subject: Re: GOTO
Date: 17 Oct 2010 15:39:16
Message: <4cbb50e4@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> >   On a different note, one example where I think the RAII mechanism is
> > better than the Java-style GC mechanism is that RAII allows you to
> > implement copy-on-write semantics (ie. "lazy copying") for objects.

> I understand what you're saying, but this is a bad example for Java because 
> that's exactly how it would work in Java, because strings are immutable, so 
> your "+=" returns a brand new string. :-)  That's why Java has a 
> StringBuilder as well as a String.

  The problem is that it will make a copy of the string data every time
it's modified, which is inefficient.

  For example, assume that instead of appending something to the string
you want to modify one of the characters, such as:

    strings[250][0] = 'H';

  If the string data is not shared, this is very efficient to do because
it can be done in-place, without any kind of data copying. Imagine the
string containing 10 megabytes of data, and you wanting to modify just
the first character.

> >   This is possible transparently in C++ because of RAII: When objects are
> > created, copied, assigned and destroyed, you can specify what happens.

> I think it's more because you can overload the assignment operator, not the 
> RAII as such. Maybe you count that as part of RAII.

  Well, construction (including copy construction) and destruction are
integral parts of CoW, because without them you wouldn't be able to
perform reference counting (and this is exactly one of the aspects of
RAII: resource acquisition and release).
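
  (Just to sketch the idea, not as a definitive implementation: a minimal
CoW string-like class where std::shared_ptr does the reference counting in
its copy constructor, assignment operator and destructor, and the only
hand-written part is the deep copy before a mutation. The names 'CowString',
'set' and 'detach' are purely illustrative.)

    #include <cstddef>
    #include <memory>
    #include <string>

    class CowString
    {
     public:
        CowString(const std::string& s):
            data(std::make_shared<std::string>(s)) {}

        // Copying, assigning or destroying a CowString only touches the
        // counted pointer; the character data itself is shared, not copied.

        char operator[](std::size_t i) const { return (*data)[i]; }

        void set(std::size_t i, char c)
        {
            detach();        // deep-copy, but only if the data is shared
            (*data)[i] = c;
        }

     private:
        std::shared_ptr<std::string> data;

        void detach()
        {
            if(data.use_count() > 1)
                data = std::make_shared<std::string>(*data);
        }
    };

  With something like that, copying the 10-megabyte string is O(1), and
setting its first character to 'H' copies the data at most once, and not
at all if nobody else is sharing it.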

-- 
                                                          - Warp



From: Darren New
Subject: Re: GOTO
Date: 17 Oct 2010 16:30:50
Message: <4cbb5cfa@news.povray.org>
Warp wrote:
>   The problem is that it will make a copy of the string data every time
> it's modified, which is inefficient.

Yes. I understand that. I'm just saying that when you have this discussion, 
don't discuss it using strings if Java is your target language. Java strings 
are immutable. Discuss it using arrays of bytes, or something like that.

>   If the string data is not shared, this is very efficient to do because
> it can be done in-place, without any kind of data copying. Imagine the
> string containing 10 megabytes of data, and you wanting to modify just
> the first character.

Yeah. You *can* do that sort of thing in Java. You just have to do it the 
same way you do it in C: manually and error-prone. :-)

>   Well, construction (including copy construction) and destruction are
> integral parts of CoW, because without them you wouldn't be able to
> perform reference counting (and this is exactly one of the aspects of
> RAII: resource acquisition and release).

OK. I just wasn't sure whether the assignment operator and such was 
considered part of the support for RAII, but in retrospect, I guess it 
wouldn't make sense for it not to be.
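
(For what it's worth, a rough sketch of why: if the reference count were
maintained by hand instead of via std::shared_ptr, the copy constructor,
the assignment operator and the destructor are exactly where the
bookkeeping has to live. The 'SharedBuffer' name and its members are
purely illustrative.)

    #include <cstddef>
    #include <utility>

    // Hand-rolled reference-counted buffer, to show where the RAII hooks go.
    class SharedBuffer
    {
     public:
        explicit SharedBuffer(std::size_t size):
            size(size), data(new char[size]()), refCount(new std::size_t(1)) {}

        SharedBuffer(const SharedBuffer& other):             // copy ctor
            size(other.size), data(other.data), refCount(other.refCount)
        { ++*refCount; }

        SharedBuffer& operator=(const SharedBuffer& other)   // assignment
        {
            SharedBuffer copy(other);   // share other's data...
            swap(copy);                 // ...and let 'copy' release ours
            return *this;
        }

        ~SharedBuffer()                                       // destructor
        {
            if(--*refCount == 0) { delete[] data; delete refCount; }
        }

     private:
        std::size_t size;
        char* data;
        std::size_t* refCount;

        void swap(SharedBuffer& other)
        {
            std::swap(size, other.size);
            std::swap(data, other.data);
            std::swap(refCount, other.refCount);
        }
    };

The copy-and-swap assignment is just one way to write it; the point is only
that assignment is another place where one reference has to be dropped and
another acquired.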

-- 
Darren New, San Diego CA, USA (PST)
   Serving Suggestion:
     "Don't serve this any more. It's awful."



From: Warp
Subject: Re: GOTO
Date: 17 Oct 2010 17:44:00
Message: <4cbb6e20@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> I understand what you're saying, but this is a bad example for Java because 
> that's exactly how it would work in Java, because strings are immutable, so 
> your "+=" returns a brand new string. :-)  That's why Java has a 
> StringBuilder as well as a String.

  Maybe this is a much better example, and one which is actually closer to
an actual, realistic situation:

    // An array of 100 unit matrices of size 1000x1000:
    std::vector<Matrix> matrices(100, Matrix(1000, 1000));

  The 'matrices' vector will contain 100 unit matrices. Since these matrices
are really large (a million elements each) yet all identical, there's a very
clear memory-saving advantage if they all share the same data (as will be the
case if the 'Matrix' class uses copy-on-write). This is especially so if it's
expected that not all of them will ever be modified.

  Of course if you modify one of them, eg:

    matrices[25] *= 2;

you want only the 26th matrix to be modified, rather than all of them.
(Hence it needs to perform a deep copy before the modification, but only
if its data is currently being shared. If it's not currently being shared,
as may be the case if you modify it again after the above, the deep copy
should not be performed, because doing it every time would be horribly
inefficient.)
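
  (Again just a sketch, under the assumption that std::shared_ptr is used
for the reference counting; 'Matrix' here is an illustrative stand-in for
the real class, storing its elements in one shared buffer and deep-copying
it only when a modification finds the buffer shared.)

    #include <cstddef>
    #include <memory>
    #include <vector>

    class Matrix
    {
     public:
        // Unit matrix of the given size (1 on the diagonal, 0 elsewhere).
        Matrix(std::size_t rows, std::size_t cols):
            rows(rows), cols(cols),
            data(std::make_shared<std::vector<double>>(rows*cols, 0.0))
        {
            for(std::size_t i = 0; i < rows && i < cols; ++i)
                (*data)[i*cols + i] = 1.0;
        }

        // Copies of a Matrix share 'data' until one of them is modified.

        Matrix& operator*=(double factor)
        {
            detach();                    // deep-copy only if shared
            for(double& value: *data) value *= factor;
            return *this;
        }

        double operator()(std::size_t r, std::size_t c) const
        { return (*data)[r*cols + c]; }

     private:
        std::size_t rows, cols;
        std::shared_ptr<std::vector<double>> data;

        void detach()
        {
            if(data.use_count() > 1)
                data = std::make_shared<std::vector<double>>(*data);
        }
    };

  With that, the 100-element vector holds 100 references to a single
1000x1000-element buffer, 'matrices[25] *= 2;' deep-copies that buffer
exactly once, and a second modification of the same matrix finds it
unshared and works in place.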

-- 
                                                          - Warp



From: Invisible
Subject: Re: GOTO
Date: 18 Oct 2010 04:09:44
Message: <4cbc00c8$1@news.povray.org>
On 16/10/2010 04:03 AM, nemesis wrote:

> in other words:  "I am your father!" :D

NOOOOOOOOO!!! >_<



From: Invisible
Subject: Re: GOTO
Date: 18 Oct 2010 04:23:15
Message: <4cbc03f3@news.povray.org>
>> I also note, with some interest, what looks suspiciously like Haskell
>> syntax, in a letter typed 42 years ago. Obviously it's not after
>> Haskell, but rather whatever mathematical formalism Haskell borrowed the
>> notation from. Still, interesting none the less...)
>
> What syntax specifically?

case [i] of (A1, A2 ... An)

It's almost valid Haskell as it stands. The next line down has

(B1 -> E1, B2 -> E2 ... Bn -> En)

which is also strikingly similar.

> I think (but I was not yet part of the scene then) that people were
> looking for good notation. They knew that programming and mathematics
> had a lot in common. Yet things that were obvious in von Neumann
> machines (assignments and control flow statements in particular) did not
> have a direct counterpart in maths.

I think it's more that mathematics is capable of expressing 
transformations far more complex than what the hardware can actually 
implement. Computer hardware is, more or less, restricted to mutating 
individual hardware registers one at a time. Which, if you think about 
it, is a very slow way to make large-scale changes. Mathematics has a 
lot of language for describing complex transformations, and is much less 
focused on how to (say) multiply two matrices one floating-point value 
at a time.

> A problem that is not entirely solved even today. Haskell and other
> languages try to deal with it by restriction to a paradigm that is
> closer to math than the imperative way of thinking. The downside of that
> is that problems that are easier solved in another paradigm become more
> complicated. Perhaps there are also people working on extending maths to
> include time-dependent behaviour.

I think that "maths" has "included time-dependent behaviour" since at 
least when Sir Isaac Newton formulated his Laws of Motion. :-P

> Fact is that we as humans solve problems using a lot of techniques.
> Choosing whatever seems appropriate at the time. That is why I think a
> good programmer needs to be familiar with at least three or four
> different languages.

Well, I'd say I've tried more than most...



From: andrel
Subject: Re: GOTO
Date: 19 Oct 2010 18:10:47
Message: <4CBE176A.3030002@gmail.com>
On 18-10-2010 10:23, Invisible wrote:

>> A problem that is not entirely solved even today. Haskell and other
>> languages try to deal with it by restriction to a paradigm that is
>> closer to math than the imperative way of thinking. The downside of that
>> is that problems that are easier solved in another paradigm become more
>> complicated. Perhaps there are also people working on extending maths to
>> include time-dependent behaviour.
>
> I think that "maths" has "included time-dependent behaviour" since at
> least when Sir Isaac Newton formulated his Laws of Motion. :-P

No, but you knew that.



