  Re: This is the sort of brokenness...  
From: Darren New
Date: 19 Mar 2009 17:00:33
Message: <49c2b271$1@news.povray.org>
Warp wrote:
> Darren New <dne### [at] sanrrcom> wrote:
>>>   Could you give a concrete example of a compiler, of any OO language,
>>> which will only *warn* (rather than give an error message) if you try to
>>> access a member variable specifically marked as private from the outside?
> 
>> Sure. C# or Java. You get this warning that says "this won't compile."
> 
>   Wait. The compiler gives a *warning* which says "this won't compile"?
> If it's just a warning, and not an error, it means that it will compile
> it anyways? Isn't that contradictory?

No. That's why I'm saying you have to look at it from a slightly different 
perspective. It's a compiler error to access the private variable the *easy* 
way, just like in C++. It's not a compiler error to access the private 
variable the *difficult* way. Hence, when you access the private variable, 
you are *warned* that you have to go out of your way and do it differently. 
The modularity isn't enforced at compile time.

>> Another example is, as I've been saying, C++. The compiler warns (and won't 
>> compile) if you access the private variable by name
> 
>   I get no warning if I try to access a private member. I get an error.
> There's a big categorical difference.

To you. Not to me. There are compiler flags (GCC and Clang's -Werror, for 
instance) that change warnings into errors. Does that mean they're not 
warnings any more?

>> , but not if you 
>> intentionally or accidentally access the private variable by address.
> 
>   You can't access the private variable by address. Not according to the
> standard. If you can, then that can be considered a non-standard compiler
> extension.

// So....
#include <cstring>

class xyz {
  int i;          // private, since members of a class default to private
} pdq;

int main() {
  memset(&pdq, 23, sizeof(pdq));  // scribble over the private member by address
}
// That's illegal C++ code?? I thought C++ is compatible with C?
// Surely the compiler has enough info there to refuse to compile
// it if it's illegal? Doesn't a class with no virtual members have
// to have the same memory layout as in C? How do you pass a struct
// to read(2) or write(2) if you can't take its address?

>   And trashing memory, triggering UB, is not the same thing as "accessing
> a member variable".

I'm not saying I'm "trashing memory". I agree it's triggering UB, but that, 
IMO, means C++'s encapsulation is as much of a "kludge" as CLOS's is. I 
mean, if you're agreeing that C++'s OOP is also a kludge due to its lack of 
enforced encapsulation, then sure, every OO language I know of is a kludge 
under those definitions. They're just kludgey in different ways.

I'd rather have the kind where you have to intentionally violate 
encapsulation in a way that is portable and makes it easy to see who is 
doing it, than the kind where a simple bug can violate encapsulation in 
a module utterly unrelated to where the bug is.

And it's not about C++ vs other things, it's about safe vs unsafe languages. 
The only reason I brought up C++ is that it's the only language I know that 
*doesn't* give you a well-defined way to work around the modularity in times 
of need.

What OO language besides C++ do you think doesn't have a kludgey modularity 
system? Any?

> The effect may be the same as accessing in one compiler,
> but might not be in another. You are not *really* accessing it. You are
> just abusing compiler and system specific undefined behavior.

And using reflection isn't *really* accessing it? Or is it? (That's a 
question of your opinion, not a rebuttal.)

As I said, I see what you're saying. I'm trying to generalize a bit more. 
Unsafe languages don't provide encapsulation of modules, IMO. The best they 
do is prevent you from accessing private variables in a straightforward, 
easy-to-understand way, without preventing you from accessing them in a 
contorted and non-portable way. And since you're accessing the internals 
of someone else's code anyway, you've already decided to take on the problem 
of changing your access routines when the module changes, so the fact that 
it isn't portable when you change compilers doesn't seem like much of a 
drawback to me, if the need arises.

It seems to be perfectly valid to use memcpy to copy the contents of an 
Alpha class (at least a trivially copyable one) into an array of characters 
of size sizeof(Alpha), and then pull out of that the values of any variables 
I need, without violating the standard. It's non-portable, because I need to 
change the access routines when I change compilers, but ... I know that.
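
Something like this minimal sketch is what I have in mind. (Alpha and its 
member here are hypothetical, and the assumption that the member sits at 
offset 0 is compiler-specific, not something I'd rely on portably.)

#include <cstring>
#include <cstdio>

class Alpha {
public:
  Alpha(int s) : secret(s) {}
private:
  int secret;                       // the member we want to peek at
};

int peek_secret(const Alpha& a) {
  unsigned char raw[sizeof(Alpha)];
  std::memcpy(raw, &a, sizeof(Alpha));      // copy the object's bytes out
  int value;
  std::memcpy(&value, raw, sizeof value);   // assume 'secret' is at offset 0
                                            // on this particular compiler
  return value;
}

int main() {
  Alpha a(42);
  std::printf("%d\n", peek_secret(a));      // prints 42 on typical compilers
}

Change compilers and that offset might move, which is exactly the "I need to 
change the access routines" part.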

I wouldn't do that sort of thing if I had a choice. Certainly I wouldn't 
publish source code that intentionally takes advantage of this sort of 
undefined behavior. :-)  But then, I wouldn't publish source code that 
depends on the internals of someone else's CLOS code either.



Maybe here's something we can both agree on: violating modularity can be a 
kludge. That doesn't mean the mechanisms that allow it make for a kludgey 
language. I'll grant that a language that makes it hard to know you're
violating encapsulation is a kludgey language in that respect. I'll grant 
that any program that accesses private C++ variables or private CLOS 
variables from outside their class is a kludge. I think we'll have to agree 
to disagree on whether the mechanisms that *allow* such access (such as 
reflection) are inherently kludges.


IMO, it also doesn't mean that every violation of modularity is a kludge. If 
your violation of modularity doesn't violate the class invariant, and it 
automatically adjusts to new releases of the class you're violating without 
further programming, I don't consider that a kludge. (Otherwise, for 
example, debuggers would be "kludges" by definition, and I don't think 
that's a reasonable assessment.)

-- 
   Darren New, San Diego CA, USA (PST)
   My fortune cookie said, "You will soon be
   unable to read this, even at arm's length."

