This is the sort of brokenness...
From: Darren New
Subject: Re: This is the sort of brokenness...
Date: 19 Mar 2009 13:44:49
Message: <49c28491$1@news.povray.org>
Warp wrote:
>   The question here is: Are compiler-enforced private member variables a
> good thing in programming or not?

My argument, again, very specifically, is that there's a bunch of 
possibilities, not two:

1) They're enforced 100%. Nobody outside the class is capable of writing to 
or reading from a private variable. This provides maximum isolation, at the 
expense of not having available at runtime a lot of the information the 
compiler knew at compile time that you might want (e.g., reflection-type 
stuff). The benefit is you can debug modules in isolation.

2) They're enforced 100%, but people outside the class can read but not 
write them. This is a little more fragile to change, but still capable of 
being debugged in isolation, because other parts of the code can't break 
your invariants. Eiffel does this, but it's the same syntax (x.y) to 
reference the y member of x as it is to invoke the nullary member function y 
on x, so you don't need to do more than recompile the client code. (I think 
you can also say a variable is completely private, IIRC.)

3) They're enforced, but there are explicit mechanisms to bypass them, such 
as reflection. This is my personal preference, because it's easy to find the 
places in the code that might be breaking your invariants or might rely on 
your internal implementation, while providing the power of metadata. This is 
the C# and Java model.

4) They're enforced by the compiler but not by the runtime. This removes 
*both* the ability to debug your module in isolation *and* the ability to do 
metaprogramming that eliminates boilerplate by using the information the 
compiler has already calculated.

5) They're not enforced by the compiler or the runtime, but there are 
conventions and/or standards that make it obvious to everyone when you're 
breaking the data encapsulation, and the runtime ensures that you can only 
do this "on purpose". That is, the unenforced conventions (or enforced but 
bypassable mechanisms) ensure that the only way of breaking encapsulation is 
on purpose. This is almost as good as #3, except it may be harder to track 
down who is violating your invariants.

6) There's no convention, and all you have is documentation saying which 
bits are supposed to be private and which aren't. Worst of all possible 
worlds. Not very modular at all.

>   I think that your problem is that you have some kind of holy war against
> C++, and every single discussion about programming is always somehow turned
> to bashing C++.

Nope. Someone brought up C++ by talking about the compiler enforcing 
"private:". I was just pointing out that the compiler only enforces it in 
some ways, and not in the ways that I thought were also important for modularity.

I.e., there are two good reasons for modularity: future-proofing your code, 
and bug-proofing your code. You only talked about the former.

>   I defended data hiding in general, as a programming paradigm (and I made
> that pretty clear). Your very first reply to me was a reference to (and
> attack against) C++.

Err, no it wasn't. I simply said "LISP can use macros to simulate OO just 
like C++ uses constructors and destructors to do resource management."

> You certainly didn't wait. 

Sure I did. Go back and look at what I said about C++ again. Really.

> access rights in C++ and why everything is better in all other languages.

I haven't said better. I said different. I'm talking about the things above, 
which have little to do with C++ except it's the only unsafe OO language I 
know of.

>   I'm honestly getting tired of your C++ tirades. Every single subject
> related to programming must somehow include C++ bashing, regardless of
> the subject. That's getting tiresome.

I'm really not trying to bash C++ here. If pointing out that C++ allows you 
to corrupt member variables with a bug is "bashing", then I guess I'm bashing.

>> So you're saying having non-portable ways of bypassing the typing is better 
>> than having portable ways of bypassing the typing?
> 
>   How do you even manage to twist my words to almost the exact opposite of
> what I'm saying?
> 
>   Where exactly do you see the word "better", or any kind of synonym or
> reference to it? That's completely your twisted invention.

OK. I'm just trying to communicate here. That's why I'm asking the question. 
It seemed to me that you preferred a language with unsafe behavior and 
compiler-enforced checks to one with safe behavior and no compiler-enforced 
checks.  I was asking whether that was the case. The right answer would be 
"No, that's worse" or "No, that's different but just as bad because..." or 
something like that.

I've never heard you call C++ a kludge OO language. I assumed you were 
excluding C++ from that criticism when you said a language that allows 
access to private members is a kludge OO language.

AFAIK, using lambdas to implement OO (which you can do in Python or LISP) is 
the only mechanism I've seen that doesn't expose private instance variables 
outside the class. Everything else provides either reflection or undefined 
(or even well-defined) behavior to do so.
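
The Python or LISP version is the natural one, but you can sketch the shape 
of it even in C++ with lambdas -- a made-up counter where the only handles 
on the state are the closures themselves. (Of course, in C++ the captured 
int is still sitting in memory where a wild pointer can find it, which is 
rather my point. Untested sketch:)

    #include <functional>
    #include <iostream>
    #include <memory>

    // A made-up closure-based "object": the captured integer has no member
    // name a client could reach for, only the two functions returned.
    struct Counter {
        std::function<void()> increment;
        std::function<int()>  value;
    };

    Counter make_counter() {
        auto n = std::make_shared<int>(0);   // the "private" state
        return Counter{
            [n] { ++*n; },
            [n] { return *n; }
        };
    }

    int main() {
        Counter c = make_counter();
        c.increment();
        c.increment();
        std::cout << c.value() << "\n";      // prints 2
    }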

-- 
   Darren New, San Diego CA, USA (PST)
   My fortune cookie said, "You will soon be
   unable to read this, even at arm's length."



From: Darren New
Subject: Re: This is the sort of brokenness...
Date: 19 Mar 2009 14:02:52
Message: <49c288cc$1@news.povray.org>
Warp wrote:
>   I still can't comprehend what's so bad in the compiler doing the check
> that you don't access the private members.

If it's actually enforced (as in an error rather than a warning), it makes 
lots of things harder than they need to be, because the running code winds 
up with less information than the compiler had. You're throwing away half 
the work the compiler did once you generate the code, so you wind up with 
lots of boilerplate restating things the compiler already knows.
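
A trivial made-up example of what I mean (C++, untested): the compiler knows 
exactly which fields there are and what their types are, but none of that 
survives into the generated code, so a "dump every field" routine has to 
restate it all by hand.

    #include <iostream>
    #include <string>

    // Hypothetical struct: the compiler knows Config has exactly these
    // three fields and their types, but none of that is available at
    // runtime, so every "walk the fields" facility is hand-written.
    struct Config {
        std::string host;
        int         port;
        bool        verbose;
    };

    // Boilerplate that merely restates what the compiler already knew.
    void dump(const Config& c) {
        std::cout << "host="    << c.host    << "\n"
                  << "port="    << c.port    << "\n"
                  << "verbose=" << c.verbose << "\n";
    }

    int main() {
        dump(Config{"example.org", 8080, true});
    }

With reflection you'd write dump() once for all types; without it, that 
boilerplate gets repeated for every struct you have.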

If it's not enforced (i.e., it's a warning), then I don't see it as an 
improvement over a naming convention. People don't accidentally use x._y in 
their Python code any more than they accidentally name something 
__builtin_printf in their C code. In CLOS, you actually have to go read the 
source code of the implementation of the class to see what member variables 
you can access, so it's not like you don't know you're violating encapsulation.

If it's *really* enforced (as in you actually can't get to private 
variables), it lets you do things like prove your program is correct, and it 
lets you know you've got a bug in your module when an instance variable 
winds up with the wrong value.

If it's half-enforced, as in the compiler complains and won't compile the 
code, but there's ways to get around it anyway (on purpose or by mistake), 
then it's IMO the worst of all possible worlds. You'll spend hours or days 
trying to debug code that's already right because the client is convinced 
the other code they've written is bug-free and it's easier to blame you than 
to find the wild pointer in their own code. The whole idea of class 
invariants goes out the window.

Incidentally, I see little wrong with breaking encapsulation if you maintain 
the invariants. It makes it harder to upgrade in the future without changing 
the client, but that's the price you pay for it. If you can automate the 
access to the point where it doesn't hurt to change the client, it seems like a 
win-win to me.

-- 
   Darren New, San Diego CA, USA (PST)
   My fortune cookie said, "You will soon be
   unable to read this, even at arm's length."



From: Darren New
Subject: Re: This is the sort of brokenness...
Date: 19 Mar 2009 14:03:10
Message: <49c288de$1@news.povray.org>
Warp wrote:
>   You could, of course, take that pointer and start trashing the memory
> it's pointing to, but that would be rather pointless. Certainly not useful.

Sure. And that's how C does its OO design pattern. :-)

-- 
   Darren New, San Diego CA, USA (PST)
   My fortune cookie said, "You will soon be
   unable to read this, even at arm's length."



From: Darren New
Subject: Re: This is the sort of brokenness...
Date: 19 Mar 2009 14:26:37
Message: <49c28e5d$1@news.povray.org>
Warp wrote:
> Darren New <dne### [at] sanrrcom> wrote:
>> He's saying that putting "private:" before the variables in a class 
>> declaration is equivalent to naming private variables with an underscore. In 
>> the first case, the compiler warns you if you use a private variable by 
>> mistake.
> 
>   Could you give a concrete example of a compiler, of any OO language,
> which will only *warn* (rather than give an error message) if you try to
> access a member variable specifically marked as private from the outside?

Sure. C# or Java. You get this warning that says "this won't compile."  So 
you use the reflection libraries instead, and it compiles. The compiler 
hasn't stopped you from accessing the private variable. It just stopped you 
from trivially accidentally accessing the private variable. (Python doesn't 
even prevent you from trivially accessing the private variable, but in 
practice it isn't a problem.)

I know what you're asking. I'm asking you to look at it from a slightly 
different point of view. Take a more general approach to what it means to 
get "a warning" than the usual "the compiler issues warnings and errors, and 
nothing else is meaningful."

Another example is, as I've been saying, C++. The compiler warns (and won't 
compile) if you access the private variable by name, but not if you 
intentionally or accidentally access the private variable by address. Again, 
I understand what you're saying, and I'm asking you to also try to 
understand what I'm saying, instead of thinking I'm only saying it to annoy 
you. This is really how I think about programming - what do I know, and what 
might be broken. That's why I dislike unsafe languages with undefined behavior.
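
To make that concrete, here's a made-up, untested sketch of the two cases:

    #include <iostream>

    class Account {
    public:
        int check() const { return balance; }
    private:
        int owner_id = 1;
        int balance  = 100;   // the invariant: never negative
    };

    int main() {
        Account a;

        // a.balance = -1;    // by name: this line simply won't compile

        // By address: compiles without complaint. Formally it's undefined
        // behavior, but on a typical implementation it just scribbles over
        // 'balance' at runtime, with no diagnostic anywhere.
        int* p = reinterpret_cast<int*>(&a);
        p[1] = -1;             // assumes the obvious layout: owner_id, then balance

        std::cout << a.check() << "\n";   // very likely prints -1
    }

The first access is the one the compiler catches; the second is the one it 
can't, whether you did it deliberately or via a wild pointer.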

-- 
   Darren New, San Diego CA, USA (PST)
   My fortune cookie said, "You will soon be
   unable to read this, even at arm's length."



From: Darren New
Subject: Re: This is the sort of brokenness...
Date: 19 Mar 2009 14:27:20
Message: <49c28e88$1@news.povray.org>
Warp wrote:
> Darren New <dne### [at] sanrrcom> wrote:
>> Warp wrote:
>>> Darren New <dne### [at] sanrrcom> wrote:
>>>>> (Well, at least not without some serious hacking.)
>>>> Running off the end of an array is "serious hacking"? I wouldn't think so.
>>>   Now I'm completely puzzled. We are talking about public interfaces and
>>> private members, and you start suddenly talking running off the end of an
>>> array? I can't see *any* connection whatsoever between these two things.
> 
>> I'm saying that C++ does not enforce that you don't change private instance 
>> variables from outside the class.
> 
>   And this has exactly what to do with whether (compiler-enforced) data
> hiding is a good thing or not?

Because there are two reasons for data hiding: (1) allow more flexibility in 
implementation, (2) allow easier debugging.

If you can violate my encapsulation, I don't see the data as especially 
encapsulated, is all.

>   Every single discussion about programming with you turns into C++ bashing,
> no matter what the subject.

I'm happy to bash any other unsafe OO language you might want to name. :-)

>   Could that perhaps change 'y'? Maybe. The standard allows the compiler
> to do whatever it wants. If the compiler wants to change 'y', it has the
> standard's blessing. If the compiler gives you an error message saying that
> you are a moron, that's ok as well.

Yep. Can you name a C++ implementation that doesn't change "y" in that 
sample? I'm 100% serious here, because I know of C interpreters, but I've 
never seen a C++ interpreter. Or is there a tool that would catch that at 
runtime?  It would make some sorts of debugging a lot easier. I don't think 
Purify handles that sort of thing, does it? It seems like it would be 
difficult, because the size of pointers would have to change or the shape of 
the structure would have to be incompatible with C.
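
(Your sample isn't quoted above, so for anyone following along, here's my 
own made-up stand-in for the kind of thing we're talking about -- untested, 
and the layout is of course implementation-specific:)

    #include <cstring>
    #include <iostream>

    // Hypothetical class: a public array sitting in front of a private 'y'.
    class Thing {
    public:
        int buf[4];
        int get_y() const { return y; }
    private:
        int y = 42;
    };

    int main() {
        Thing t;

        // One element too many. Undefined behavior; depending on the
        // compiler and flags you may or may not get a warning, but it
        // compiles, and at runtime it typically overwrites the private 'y'.
        std::memset(t.buf, 0, 5 * sizeof(int));

        std::cout << t.get_y() << "\n";   // very likely 0, not 42
    }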

>>>>  The compiler doesn't enforce the 
>>>> encapsulation in C++ - it just gives warnings when you use a variable name 
>>>> that's private.
>>>   Warnings? Trying to access a private variable gives a compiler error.
> 
>> Only if you do it in the obvious way. Or are you saying there's a 
>> compile-time warning for calling memset() with too large a size?
> 
>   I'm just baffled about how you twist words. Exactly where am I saying
> anything like that? I don't even understand how you can come with such a
> conclusion. It doesn't make any sense.

I'm used to thinking about the semantics of a program. It's easy to 
miscommunicate when the semantics of a language include "there are no 
semantics for this valid piece of code that compiles cleanly."

It's untrue that "trying to access a private variable gives a compiler 
error." It's only true if you try to access the private variable in the 
obvious way. It's not true if you explicitly or accidentally circumvent the 
runtime system.

> "The compiler doesn't enforce encapsulation, it just gives warnings."
> "Actually the compiler gives an error if you try to break encapsulation."
> "Are you saying there's a compile-time warning for calling memset wrongly?"
> "???"
> 
>   I just can't follow your logic.

I said "the compiler warns you in some cases that you're accessing private 
variables of other classes, and it doesn't warn you in other cases that 
you're accessing private variables of other classes." You want to make that 
into a discussion about whether it's a "warning" or an "error".

The compiler only gives you error messages if you break encapsulation in 
*some* ways, not in other ways.  I.e., only when you break encapsulation on 
purpose does it give you a warning. When you break encapsulation by mistake, 
it doesn't. If you access a private variable by name, it won't compile. If 
you access a private variable by address, it compiles fine.

When you break encapsulation in CLOS on purpose, you don't get a warning, 
but you know you're doing it because you're groping around in the source 
code of the class you're violating. I don't see how making that uncompilable 
can help.

Explain that to me? Really, without treating it as an attack, explain to me 
how a design that requires a programmer to read the source code (not the 
header) of the module to discover the names of its private variables isn't 
sufficiently encapsulated, compared to one enforced at compile time but not 
at runtime? It's discussions like this that lead me to a better and deeper 
understanding of the craft. :-)

>> Sure you do. You have to put "private:" in front of the private bits and 
>> "public:" in front of the public bits.
> 
>   Exactly how is that "documentation"? That's source code.

And in CLOS, it's also source code. The only way to break encapsulation is 
to go read the source code to see what the private variables are. It's not 
something you do accidentally, any more than you accidentally
#define private public.
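
(For anyone who hasn't seen that trick, it looks like the following -- 
technically ill-formed, since you're not allowed to #define a keyword, but 
plenty of compilers will take it, and it's obviously not something you type 
by accident. In real life Widget would live in a header you don't control; 
it's inlined here just so the sketch is self-contained.)

    #include <iostream>

    #define private public   // ill-formed, but widely accepted in practice
    class Widget {
    public:
        int api() const { return secret; }
    private:
        int secret = 42;
    };
    #undef private

    int main() {
        Widget w;
        w.secret = 7;                    // compiles: 'private' got rewritten
        std::cout << w.api() << "\n";    // prints 7
    }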

-- 
   Darren New, San Diego CA, USA (PST)
   My fortune cookie said, "You will soon be
   unable to read this, even at arm's length."



From: Darren New
Subject: Re: This is the sort of brokenness...
Date: 19 Mar 2009 14:36:35
Message: <49c290b3$1@news.povray.org>
Warp wrote:
> nemesis <nam### [at] gmailcom> wrote:
>>>   Ok, "written agreement" then, if you want to nitpick. Not much difference.
>>> Still not enforced nor sanity-checked by anyone or anything.
> 
>> Here is where Larry Wall's quote really applies. ;)
> 
>   Since when have code sanity checks become a bad thing?

When they become something to work around, rather than a help to the 
programmer. If you need to go read the source code of the module you're 
accessing to figure out how to get to the private variables, it's no longer 
a "code sanity check" IMO.

-- 
   Darren New, San Diego CA, USA (PST)
   My fortune cookie said, "You will soon be
   unable to read this, even at arm's length."



From: nemesis
Subject: Re: This is the sort of brokenness...
Date: 19 Mar 2009 15:00:54
Message: <49c29666@news.povray.org>
Darren New escreveu:
> Warp wrote:
>> nemesis <nam### [at] gmailcom> wrote:
>>>>   Ok, "written agreement" then, if you want to nitpick. Not much 
>>>> difference.
>>>> Still not enforced nor sanity-checked by anyone or anything.
>>
>>> Here is where Larry Wall's quote really applies. ;)
>>
>>   Since when have code sanity checks become a bad thing?
> 
> When they become something to work around, rather than a help to the 
> programmer. If you need to go read the source code of the module you're 
> accessing to figure out how to get to the private variables, it's no 
> longer a "code sanity check" IMO.

Err... I think here I'll have to agree with Warp.  Why would you try to 
bypass the "code sanity checks" and all the paranoid safety devices to 
access something you shouldn't be accessing in the first place?

This thread has long since derailed into something quasi-metaphysical... :P

My final word here is:  to have modularity you don't need all the traps, 
contraption devices and alarmist chainballs with all the baroque syntax 
associated.  All you need is to agree to use only what a module provides 
you.  It's your choice to be dumb enough or a smartass h4x0r and rip the 
contract just to show you can.



From: Darren New
Subject: Re: This is the sort of brokenness...
Date: 19 Mar 2009 15:26:04
Message: <49c29c4c$1@news.povray.org>
nemesis wrote:
> Darren New escreveu:
>> Warp wrote:
>>> nemesis <nam### [at] gmailcom> wrote:
>>>>>   Ok, "written agreement" then, if you want to nitpick. Not much 
>>>>> difference.
>>>>> Still not enforced nor sanity-checked by anyone or anything.
>>>
>>>> Here is where Larry Wall's quote really applies. ;)
>>>
>>>   Since when have code sanity checks become a bad thing?
>>
>> When they become something to work around, rather than a help to the 
>> programmer. If you need to go read the source code of the module 
>> you're accessing to figure out how to get to the private variables, 
>> it's no longer a "code sanity check" IMO.
> 
> Err... I think here I'll have to agree with Warp.  Why would you try to 
> bypass the "code sanity checks" and all the paranoid safety devices to 
> access something you shouldn't be accessing in the first place?



Because, every once in a while, the provider of the library wasn't omniscient.

It's pretty simple. It's a practical thing. Say you have (just for an 
example) a video codec. It does all kinds of nice things with timestamps and 
such, but one thing it doesn't provide is telling you the current frame 
number. However, that is stored in a private variable (which you can see 
from reading the .h file), and for some reason you need that information to 
implement your player in an environment the original author didn't intend 
his codec to be used in.

You have three choices:
1) Violate encapsulation, and pay the price if you ever need to use a new 
version of the decoder that doesn't have the frame number in the same place.

2) Rewrite the entire video decoder from scratch.

3) Find the owner of the original code and get him to change it and agree to 
support it indefinitely.

It's a cost vs benefit thing.


You "shouldn't" be accessing the internal data structures NTFS stores on the 
disk. It makes it kind of hard to make an open source NTFS driver without 
doing so. If your systems *really* prevented that access, such drivers would 
be impossible to write. When MS changes the NTFS drivers, Linux maintainers 
have to go in, reverse engineer things again, and fix stuff up to stay 
compatible. That's a cost that wouldn't be borne if Microsoft's NTFS was 
open source and there was only one code base.

Sure, if you have the source code to the module, providing public access to 
the private variable is pretty easy. I'd not recommend bypassing the 
protection mechanisms in that case. I thought that would be obvious to 
people, tho, so I assumed everyone was on the same page with me there. 
(Which is not to say anyone disagrees or has disagreed with that here.)



> This thread has long derailed into something quasi-metaphysical... :P

I'm thinking it's something very practical. I'm not advocating regularly 
accessing the internals of other objects. I'm just saying that preventing it 
by convention or by hiding information from the humans isn't any worse IMO 
than implementing it in the compiler.

> My final word here is:  to have modularity you don't need all the traps, 
> contraption devices and alarmist chainballs with all the baroque syntax 
> associated.  All you need is to agree to use only what a module provides 
> you.  

And to be able to tell easily when you're violating it. If you have to read 
prose documentation to know whether some routine or variable is private or 
public, *then* you lack modularity.

-- 
   Darren New, San Diego CA, USA (PST)
   My fortune cookie said, "You will soon be
   unable to read this, even at arm's length."



From: nemesis
Subject: Re: This is the sort of brokenness...
Date: 19 Mar 2009 15:42:15
Message: <49c2a017$1@news.povray.org>
Darren New escreveu:
> nemesis wrote:
>> Err... I think here I'll have to agree with Warp.  Why would you try 
>> to bypass the "code sanity checks" and all the paranoid safety devices 
>> to access something you shouldn't be accessing in the first place?
> 
> Because, every once in a while, the provider of the library wasn't 
> omniscient.

You can also try another library. ;)

> And to be able to tell easily when you're violating it. If you have to 
> read prose documentation to know whether some routine or variable is 
> private or public, *then* you lack modularity.

Speaking purely from the point of view of Scheme and functional 
programming, there's no such fuss: all the module provides are public 
functions.  You just read their names and parameters and use them.

You can see from this thread that while it began with CLOS, it ended in 
C++ because of its tons of private, protected, friend and other 
schizophrenic access mechanisms leading to complicated interfaces and 
behavior for modules.



From: Darren New
Subject: Re: This is the sort of brokenness...
Date: 19 Mar 2009 15:58:29
Message: <49c2a3e5$1@news.povray.org>
nemesis wrote:
>> Because, every once in a while, the provider of the library wasn't 
>> omniscient.
> 
> You can also try another library. ;)

Well, yeah.

> You can see from this thread that while it began with CLOS, it ended in 
> C++ because of its tons of private, protected, friend and other 
> schizophrenic access mechanisms leading to complicated interfaces and 
> behavior for modules.

Actually, I thought it ended in C++ because that's the only unsafe OOP 
language I know of. It's the only language I know where the compiler 
attempts to enforce modularity but in practice fails to do so, by declaring 
violation of modularity "undefined" or "compiler-specific" rather than 
catching all cases of broken modularity at compile time or runtime.

I was trying to figure out why well-documented modularity breaking is a 
"kludge" but poorly documented, unsupported modularity breaking isn't.

-- 
   Darren New, San Diego CA, USA (PST)
   My fortune cookie said, "You will soon be
   unable to read this, even at arm's length."

