  This is the sort of brokenness... (Message 95 to 104 of 164)  
From: Warp
Subject: Re: This is the sort of brokenness...
Date: 19 Mar 2009 16:31:09
Message: <49c2ab8d@news.povray.org>
nemesis <nam### [at] gmailcom> wrote:
> You can see from this thread that while it began with CLOS, it ended in 
> C++ because of its tons of private, protected, friend and other 
> schizophrenic access mechanisms leading to complicated interfaces and 
> behavior for modules.

  No. You just listed the excuses, not the real reason. The real reason
is that some people here really like to bash C++ at every possible
opportunity they can get. It's like a hobby to them. Or an obsession.

  I certainly didn't bring up C++. It's only after the bashing started
that I got dragged into the discussion about it.

-- 
                                                          - Warp



From: Warp
Subject: Re: This is the sort of brokenness...
Date: 19 Mar 2009 16:38:03
Message: <49c2ad2b@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> >> I'm saying that C++ does not enforce that you don't change private instance 
> >> variables from outside the class.
> > 
> >   And this has exactly what to do with whether (compiler-enforced) data
> > hiding is a good thing or not?

> Because there's two reasons for data hiding: (1) allow more flexibility in 
> implementation, (2) allow easier debugging.

> If you can violate my encapsulation, I don't see the data as especially 
> encapsulated, is all.

  You didn't answer my question.

  "Is enforced data hiding a good thing or a bad thing?"
  "In C++ you can bypass the encapsulation, so it's a bad thing."

  That doesn't make any sense. Exactly how is C++ related to the subject?
You are badly derailing.

  But of course. As long as you get to bash C++...

> >   Every single discussion about programming with you turns into C++ bashing,
> > no matter what the subject.

> I'm happy to bash any other unsafe OO language you might want to name. :-)

  It's getting really tiresome and old.

-- 
                                                          - Warp



From: Warp
Subject: Re: This is the sort of brokenness...
Date: 19 Mar 2009 16:50:32
Message: <49c2b017@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> Actually, let me ask this. Since you seem to be saying that reflection means 
> that an OO system is a "kludge" because it breaks modularity

  In fact I said that trying to emulate OOP in a language which has no
direct support for OOP is a kludge. The only thing I said about reflection
is that IMO it breaks modularity and thus you lose many of the benefits of
modular design.

-- 
                                                          - Warp



From: Warp
Subject: Re: This is the sort of brokenness...
Date: 19 Mar 2009 16:54:00
Message: <49c2b0e7@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> Actually, I thought it ended in C++ because that's the only unsafe OOP 
> language I know of.

  I can't understand if that means "I haven't studied unsafe OOP languages
other than C++" or "to my knowledge there are no other unsafe OOP languages".

> It's the only language I know where the compiler 
> attempts to enforce modularity but in practice fails to do so, by declaring 
> violation of modularity "undefined" or "compiler-specific" rather than 
> catching all cases of broken modularity at compile time or runtime.

  Then you don't know much, do you.

> I was trying to figure out how well-documented modularity breaking is a 
> "kludge" but poorly documented unsupported modularity breaking isn't.

  The connection between my usage of the word "kludge" and your usage
of the words "unsafe" and "reflection" is purely your invention.

-- 
                                                          - Warp



From: Darren New
Subject: Re: This is the sort of brokenness...
Date: 19 Mar 2009 17:00:33
Message: <49c2b271$1@news.povray.org>
Warp wrote:
> Darren New <dne### [at] sanrrcom> wrote:
>>>   Could you give a concrete example of a compiler, of any OO language,
>>> which will only *warn* (rather than give an error message) if you try to
>>> access a member variable specifically marked as private from the outside?
> 
>> Sure. C# or Java. You get this warning that says "this won't compile."
> 
>   Wait. The compiler gives a *warning* which says "this won't compile"?
> If it's just a warning, and not an error, it means that it will compile
> it anyways? Isn't that contradictory?

No. That's why I'm saying you have to look at it from a slightly different 
perspective. It's a compiler error to access the private variable the *easy* 
way, just like in C++. It's not a compiler error to access the private 
variable in the *difficult* way. Hence, when you try to access the private
variable you are *warned* that you have to go out of your way and do it
differently.  The modularity isn't enforced at compile time.
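
For concreteness, a minimal Java sketch of that distinction (the Counter class
and its count field are made up for illustration): the "easy" access is a
compile-time error, while the reflective "difficult" access compiles and runs
(on a plain classpath; Java module boundaries can refuse setAccessible).

import java.lang.reflect.Field;

class Counter {
    private int count = 0;                  // private: direct outside access won't compile
    public int getCount() { return count; }
}

public class ReflectionBypass {
    public static void main(String[] args) throws Exception {
        Counter c = new Counter();
        // c.count = 42;                    // compile error: count has private access in Counter

        // The "difficult" way: reflection has to be asked for explicitly.
        Field f = Counter.class.getDeclaredField("count");
        f.setAccessible(true);              // deliberately switch off the access check
        f.setInt(c, 42);

        System.out.println(c.getCount());   // prints 42
    }
}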

>> Another example is, as I've been saying, C++. The compiler warns (and won't 
>> compile) if you access the private variable by name
> 
>   I get no warning if I try to access a private member. I get an error.
> There's a big categorical difference.

To you. Not to me. There are compiler flags that change warnings to errors. 
Does that mean they're not warnings any more?

>> , but not if you 
>> intentionally or accidentally access the private variable by address.
> 
>   You can't access the private variable by address. Not according to the
> standard. If you can, then that can be considered a non-standard compiler
> extension.

// So....
#include <cstring>   // for memset

class xyz {
  int i;             // no access specifier in a class: implicitly private
} pdq;

int main() {
  // Clobber the private member through the object's address.
  memset(&pdq, 23, sizeof(int));
}

// That's illegal C++ code?? I thought C++ was compatible with C?
// Surely the compiler has enough info there to refuse to compile
// it if it's illegal? Doesn't a class with no virtual members have
// to have the same memory layout as in C? How do you pass a struct
// to read(2) or write(2) if you can't take its address?

>   And trashing memory, triggering UB, is not the same thing as "accessing
> a member variable".

I'm not saying I'm "trashing memory". I agree it's triggering UB, but that, 
IMO, means C++'s encapsulation is as much of a "kludge" as CLOS's is. I 
mean, if you're agreeing that C++'s OOP is also a kludge due to its lack of 
encapsulation, then sure, every OOP language I know of is a kludge under those 
definitions. They're just kludgey in different ways.

I'd rather have the kind where you have to intentionally violate 
encapsulation in a way that is portable and makes it easy to see who is 
doing it, than have the kind where a simple bug can violate encapsulation in 
a module utterly unrelated to where the bug is.

And it's not about C++ vs other things, it's about safe vs unsafe languages. 
The only reason I brought up C++ is that it's the only language I know that 
*doesn't* give you a well-defined way to work around the modularity in times 
of need.

What OO language besides C++ do you think doesn't have a kludgey modularity 
system? Any?

> The effect may be the same as accessing in one compiler,
> but might not be in another. You are not *really* accessing it. You are
> just abusing compiler and system specific undefined behavior.

And using reflection isn't *really* accessing it? Or is it? (That's a 
question of your opinion, not a rebuttal.)

As I said, I see what you're saying. I'm trying to generalize a bit more. 
Unsafe languages don't provide encapsulation of modules, IMO. The best they 
do is to prevent you from accessing private variables in a straightforward, 
easy-to-understand way, without preventing you from accessing them in a 
contorted and non-portable way. And since you're accessing the internals 
of someone else's code anyway, you've already decided to take on the problem 
of changing your access routines when the module changes, so the fact that 
it isn't portable when you change compilers doesn't seem like that much of a 
drawback to me, if the need arises.

It seems to be perfectly valid to use memcpy to copy the contents of an 
Alpha class into an array of characters that's sizeof(Alpha) and then pull 
out from that the values of any variables I need without violating the 
standard. It's non-portable, because I need to change the access routines 
when I change compilers, but ... I know that.

I wouldn't do that sort of thing if I had a choice. Certainly I wouldn't 
publish source code that intentionally takes advantage of this sort of 
undefined behavior. :-)  But then, I wouldn't publish source code that 
depends on the internals of someone else's CLOS code either.



Maybe here's something we can both agree on: Violating modularity can be a 
kludge. That doesn't mean that mechanisms to allow that give you a kludgey 
language. I'll grant that a language that makes it hard to know you're 
violating encapsulation is a kludgey language in that respect. I'll grant 
that any program that accesses private C++ variables or private CLOS 
variables from outside their class is a kludge. I think we'll have to agree 
to disagree on whether the mechanisms that *allow* such access (such as 
reflection) are inherently kludges.


IMO, it also doesn't mean that every violation of modularity is a kludge. If 
your violation of modularity doesn't violate the class invariant and it 
automatically adjusts to new releases of the class you're violating without 
further programming, I don't consider that a kludge. (Otherwise, for 
example, debuggers would be "kludges" by definition, and I don't think 
that's a reasonable assessment.)

-- 
   Darren New, San Diego CA, USA (PST)
   My fortune cookie said, "You will soon be
   unable to read this, even at arm's length."



From: Darren New
Subject: Re: This is the sort of brokenness...
Date: 19 Mar 2009 17:23:43
Message: <49c2b7df$1@news.povray.org>
Warp wrote:
> Darren New <dne### [at] sanrrcom> wrote:
>> Warp wrote:
>>>   I still can't comprehend what's so bad in the compiler doing the check
>>> that you don't access the private members.
> 
>> If it's actually enforced (as in an error rather than a warning), it makes 
>> lots of things harder than they need to be, because the running code winds 
>> up with less information than the compiler had. You're throwing away half 
>> the work the compiler did once you generate the code, so you wind up with 
>> lots of boilerplate to enter into code things the compiler already knows.
> 
>   That's the whole idea in modular thinking: You don't know what's inside
> the module, and you *don't care*. You don't need to care.

It also prevents code from *inside* the module from doing certain things in 
an easy way.

>   As soon as you start worrying what's inside the module, you are already
> breaking module boundaries and abstraction.

Not if you're doing it from inside the module.

>> If it's half-enforced, as in the compiler complains and won't compile the 
>> code, but there's ways to get around it anyway (on purpose or by mistake), 
>> then it's IMO the worst of all possible worlds. You'll spend hours or days 
>> trying to debug code that's already right because the client is convinced 
>> the other code they've written is bugfree and it's easier to blame you than 
>> to find the wild pointer in their own code. The whole idea of class 
>> invariants goes out the window.
> 
>   No you'll explain to me how a naming convention of public variables helps
> this problem.

The naming convention doesn't help that. The safety of the language helps 
that. In other words, what helps it is not the naming convention, but the 
fact that code that accesses private variables *must* use the naming 
convention to do so.

Then you can take the Python code you're running, and look for the string 
"._framenumber". If you don't find that, nobody outside your class is 
changing the framenumber, even by accident. (Simplified of course, but 
that's the idea.)

In the CLOS case, you can just look for the name of the private variable 
that you suspect is getting changed from outside. Or you can change the 
source code you have to consistently rename that private variable and run it 
again and get a useful indication of exactly where in the outside code it's 
violating your encapsulation.

In an unsafe language, I can have code that was compiled before I even wrote 
my class accidentally violate my class's encapsulation, and it'll be almost 
impossible to figure out what is going on if the program's big enough or if 
I don't have the source to that code.

I'm just less worried about intentionally violating modularity than 
accidentally doing so. When I do it on purpose, I know the cost/benefit 
trade-offs. Doing so may be a kludge, but there's reasons kludges get 
implemented in the commercial world.

(I've had server code written in unsafe languages where it would dump core 
on a segfault, and the providers wanted to blame it on the fact that the Xen 
kernel was on the machine. We weren't booting Xen, but they wouldn't even 
look at their code until we uninstalled that kernel. Just as an example of 
how people can avoid admitting that their own bugs might be part of the problem.

Another time, the "calculation" part of the program did its work, clobbered 
memory, wrote on the screen "now generating report", tried to create the 
data file for the report generator, and crashed out in a way that locked up 
the machine. As the author of the report generator, I got to spend several 
*weeks* trying to figure out what it was before I discovered it was just the 
calc routines not knowing their own calling structure and hence returning to 
the wrong place on the stack at some point. This included printing out *all* 
of the code I'd written, about a 5" stack, and going thru line by line 
looking for bugs. (Found one, which wasn't the problem. :-) Another problem 
that wouldn't have happened in a safe language.)

If you're in a particularly hostile environment, the best way is to have 
well-defined class contracts in a safe environment that prevents the abusers 
from bypassing those contracts. If you're not in a hostile environment, you 
just get the author to agree to support revealing the private information 
you need.

>   IMO if anyone feels the need to break your interface and access private
> members directly, then your class design sucks and should be redesigned.

I agree.  That's why I say that CLOS allowing you to get to private members 
from outside the class *if* you read the source code to the class and know 
what they're called isn't, in practice, a problem.  The only time someone 
would do that is if for some reason they can't *change* the source (because, 
perhaps, they're expecting an update), but they can't get the author to make 
an update that provides the interface that someone needs. At which point the 
only choice is (1) toss the entire module (rewrite, buy different version, 
etc), or (2) take on the burden of staying compatible with updates (which 
may be infrequent enough that it's in practice not a problem). It's a 
cost/benefit sort of thing.

>   If your class design is good, nobody will need to access anything
> directly.

I also agree. Or, at least, nobody will access anything directly without the 
author's "permission". If I say "it's OK to serialize this object, send it 
over the wire, and reconstitute it in another address space", I'd still say 
that's a good class design. If I'm worried about it getting stored for 
longer than the lifetime of the declaration of the class (i.e., the "stored 
in database" example), I'll write code to deal with it, or to convert the 
old versions to new versions.

-- 
   Darren New, San Diego CA, USA (PST)
   My fortune cookie said, "You will soon be
   unable to read this, even at arm's length."



From: Warp
Subject: Re: This is the sort of brokenness...
Date: 19 Mar 2009 17:38:56
Message: <49c2bb70@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> >   That's the whole idea in modular thinking: You don't know what's inside
> > the module, and you *don't care*. You don't need to care.

> It also prevents code from *inside* the module from doing certain things in 
> an easy way.

  Everything in the class is public to the class itself. I don't understand
your point.

> >> If it's half-enforced, as in the compiler complains and won't compile the 
> >> code, but there's ways to get around it anyway (on purpose or by mistake), 
> >> then it's IMO the worst of all possible worlds. You'll spend hours or days 
> >> trying to debug code that's already right because the client is convinced 
> >> the other code they've written is bugfree and it's easier to blame you than 
> >> to find the wild pointer in their own code. The whole idea of class 
> >> invariants goes out the window.
> > 
> >   No you'll explain to me how a naming convention of public variables helps
> > this problem.

> The naming convention doesn't help that. The safety of the language helps 
> that. In other words, what helps it is not the naming convention, but the 
> fact that code that accesses private variables *must* use the naming 
> convention to do so.

  So your solution to the problem of being able to modify private members
by accident because of the language being unsafe is to make the language
safe... and to make everything public (with a naming convention for
variables).

  I don't see how the latter follows.

> I'm just less worried about intentionally violating modularity than 
> accidentally doing so.

  Problems with programming errors in unsafe languages have nothing to
do with the subject of modularity.

  Accessing something out of bounds by accident is always a bug
regardless of what effects that has. This has nothing to do with whether
modularity is a good thing or not.

> (I've had server code written in unsafe languages where it would dump core 
> on a segfault, and the providers wanted to blame it on the fact that the Xen 
> kernel was on the machine. We weren't booting Xen, but they wouldn't even 
> look at their code until we uninstalled that kernel. Just as an example of 
> how people can avoid admitting that their own bugs might be part of the problem.

  And this has what to do with the original subject?

  You always succeed in returning to your favorite subject: Bashing
"unsafe" languages. And in your world there's only one such language: C++.

-- 
                                                          - Warp



From: Darren New
Subject: Re: This is the sort of brokenness...
Date: 19 Mar 2009 17:56:43
Message: <49c2bf9b@news.povray.org>
Warp wrote:
> Darren New <dne### [at] sanrrcom> wrote:
>> 1) They're enforced 100%. Nobody outside the class is capable of writing to 
>> or reading from a private variable. This provides maximum isolation at the 
>> expense of not having available at runtime lots of stuff the compiler knows 
>> at compile time that you might want to know (e.g., reflection type stuff). 
> 
>   If you need to know the private members of the class from the outside,
> then that class has been badly designed.
> 
>   With a well-designed class you don't *need* to know, nor care.

Not all uses of reflection come from outside the class, and not all uses of 
reflection violate modularity. For example, the ability to instantiate an 
instance of a class whose name you have in a string is technically 
"reflection". That doesn't break modularity.

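As a minimal Java sketch of that (Plugin and EchoPlugin are made-up names):
the class to construct is chosen by a string at runtime, yet only its public
interface is ever touched, so no internals are exposed.

interface Plugin {
    void run();
}

class EchoPlugin implements Plugin {
    public void run() { System.out.println("EchoPlugin running"); }
}

public class InstantiateByName {
    public static void main(String[] args) throws Exception {
        String className = "EchoPlugin";    // could just as well come from a config file
        Plugin p = (Plugin) Class.forName(className)
                                 .getDeclaredConstructor()
                                 .newInstance();
        p.run();                            // no private state is read or written
    }
}
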
>> 2) They're enforced 100%, but people outside the class can read but not 
>> write them. This is a little more fragile to change
> 
>   It breaks modularity badly. The whole idea of data hiding is that the
> private part should be completely opaque to the outside. The second it
> isn't, outside code will start making assumptions, and you can't change
> the implementation of the class anymore.

Right. That's what I meant by "it's more fragile to change."

>   Some people argue that accessors to all private members, especially if
> they are automatically generated, is almost as bad as having all the
> members public.

It seems semantically identical to me. :-) Of course, if you can override 
the private accessors, you might be able to keep some sorts of compatibility.

> It exposes the internal implementation, and thus outside
> code will start making assumptions about it, making it harder to change
> the implementation later.
> 
>   Some accessors may be justifiable, but you should not go overboard.

Agreed. I'm not advocating that people access private members without 
knowing what they're doing and why. I'm not advocating they access private 
members when they can change the module to reveal what they need revealed.


>> 3) They're enforced, but there are explicit mechanisms to bypass them, such 
>> as reflection. This is my personal preference, because it's easy to find the 
>> places in the code that might be breaking your invariants or might rely on 
>> your internal implementation, while providing the power of metadata. This is 
>> the C# and Java model.
> 
>   If you don't have access to the code which is making assumptions about
> your class' internal structure, you can't change your class without breaking
> that existing code.

It depends what assumptions are being made by the code you don't have.  It's 
also the case that it's not uncommon for things like reflection to be used 
internally to the class or the module to replace what would otherwise be 
fragile boilerplate code.

If the code that violates modularity adjusts automatically when you change 
the class being violated, it's in practice capable of handling far more 
kinds of changes than one might expect.
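
A sketch of what "adjusts automatically" can look like in Java (Account and
FieldDumper are made-up names): a reflection-based debug dumper that prints
every field of an object. It deliberately bypasses "private", but adding or
renaming fields in Account requires no change to the dumper.

import java.lang.reflect.Field;

class Account {
    private String owner = "alice";
    private int balance = 100;
    // A later "private String currency" field would be picked up automatically.
}

public class FieldDumper {
    static void dump(Object obj) throws IllegalAccessException {
        for (Field f : obj.getClass().getDeclaredFields()) {
            f.setAccessible(true);                         // bypass "private" on purpose
            System.out.println(f.getName() + " = " + f.get(obj));
        }
    }

    public static void main(String[] args) throws Exception {
        dump(new Account());    // prints: owner = alice, balance = 100
    }
}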

>   Sure, this situation might not be extremely common, but it can happen.
> And it's something which might have been avoided with a good interface
> design and a completely opaque private implementation.

Agreed. The problem comes when you're stuck with a bad interface design and 
an opaque implementation. :-)  Surely, you don't imagine that reading the 
source code of a class implementation and then using internal functions to 
get the values of private variables is a normal way of programming LISP objects?

>> 4) They're enforced by the compiler but not by the runtime. This removes 
>> *both* the ability to debug your module in isolation *and* the ability to do 
>> metaprogramming that eliminates boilerplate by using the information the 
>> compiler has already calculated.
> 
>   Why would you want runtime checks if access rights could be fully checked
> at compile time?

I wouldn't. I don't know any languages that fully check access rights at 
compile time that don't provide well-defined standard mechanisms for 
bypassing them at need.

>> 5) They're not enforced by the compiler or the runtime, but there are 
>> conventions and/or standards that make it obvious to everyone when you're 
>> breaking the data encapsulation, and the runtime ensures that you can only 
>> do this "on purpose". That is, the unenforced conventions (or enforced but 
>> bypassable mechanisms) ensure that the only way of breaking encapsulation is 
>> on purpose. This is almost as good as #3, except it may be harder to track 
>> down who is violating your invariants.
> 
>   Again, if you must keep "reverse compatibility", for a lack of a better
> term (in other words, you as a library developer must maintain compatibility
> with existing code which uses your library, and you don't have access to
> this existing code nor can change it), it can be a problem if the existing
> code can make assumptions about your library and does so.

Agreed. But I don't know of any language that doesn't let you get into that 
situation. It's just a question of how easy or hard it is, and how obvious 
it is that it's happening.

>>>   I think that your problem is that you have some kind of holy war against
>>> C++, and every single discussion about programming is always somehow turned
>>> to bashing C++.
> 
>> Nope. Someone brought up C++ by talking about the compiler enforcing 
>> "private:". I was just pointing out that the compiler only enforces it in 
>> some ways, and not in the ways that I thought was also important for modularity.
> 
>   You immediately assumed that it must be talking about C++, of course,
> and immediately jumped at the opportunity of bashing it.

Well, the first thing I objected to is the idea that a language is kludgey 
if you can access private variables whose names only appear in the source 
code of the class implementation.

My first mention of C++ (in the same paragraph where I mentioned another 
half-dozen OO languages) was that it's trivial to bypass the modularity of 
it, meaning that you can read and write private members of a class by using 
well-defined if non-portable operations. I then asked you if you didn't 
think that meant modularity was lacking in C++.

Actually, you mentioned "private:" first, and I admit I assumed you were 
talking about C++, since the only other languages I know that use that 
syntax are ones you'd already said are kludgey.

I think we're just off on the wrong foot here, perhaps. You said "data 
hiding is an integral part of OO" and criticized CLOS for not having it. Yet 
the only way to unhide the private data in CLOS is to read the source code 
for the implementation of the class. Hence, it seems you were saying that 
any mechanism that allows the access of private members, even with complete 
access to the source, is a bad thing (assuming you think kludges are bad). 
Yet I can get access to the private members in *every* language that has the 
"private:" syntax. C++ is the only language I know that uses that syntax but 
doesn't have a standard way of accessing private variables. All C++ has is 
non-portable (but standard) ways of accessing private variables. So I talked 
about C++.

>   In a discussion about the modular programming paradigm and data hiding you
> succeeded in creating a lengthy thread about how you can trash memory in C++.
> It's not exactly like you avoided the subject.

No, I didn't avoid it. There was an implication that C++ had better 
modularity and data hiding than languages where reflection is a standard 
feature. I'm pointing out that it doesn't. You can dump an instance into 
an array of characters and rebuild whatever you need, if you accept that 
you'll have non-portable knowledge of how the class is laid out.

>> OK. I'm just trying to communicate here. That's why I'm asking the question. 
>> It seemed to me that you preferred a language with unsafe behavior and 
>> compiler-enforced checks to one with safe behavior and no compiler-enforced 
>> checks.
> 
>   That "you preferred a language with unsafe behavior" is 100% your own
> invention. I have nowhere said that.

Are you missing the "it seemed to me" bit there? Do you not understand what 
that phrase means? It means "Hey, we seem to have a miscommunication going 
on. This is what I received. Is it what you meant?"  That doesn't mean I'm 
"twisting" your words. It means I'm asking you to clarify.

>   What I have said is that I prefer enforced modularity over non-enforced
> one (which IMO is not modularity at all). That "unsafe" bit is all your
> own twisting.

I prefer enforced modularity over non-enforced modularity myself. I know of 
virtually no language that supports enforced modularity. They all have 
specific trap doors just to get around the modularity restrictions. (Except 
perhaps Ada and Eiffel, neither of which is really a language I use enough 
to be sure of.)

>> I've never heard you call C++ a kludge OO language. I assumed you were 
>> excluding C++ from that criticism when you said a language that allows 
>> access to private members is a kludge OO.
> 
>   At least you admit you are making assumptions.

Yeah. Sure. That's a basic part of communication. That's why my comments are 
full of things like "it seems to me..." and "are you saying that..." 
They're indicators that I'm making assumptions that need to be validated. It 
doesn't mean I'm saying you said that. It means it sounds to me like you're 
implying that, and I'm asking if I'm interpreting you correctly.

-- 
   Darren New, San Diego CA, USA (PST)
   My fortune cookie said, "You will soon be
   unable to read this, even at arm's length."



From: Darren New
Subject: Re: This is the sort of brokenness...
Date: 19 Mar 2009 17:59:09
Message: <49c2c02d$1@news.povray.org>
Warp wrote:
> Darren New <dne### [at] sanrrcom> wrote:
>> Actually, let me ask this. Since you seem to be saying that reflection means 
>> that an OO system is a "kludge" because it breaks modularity
> 
>   In fact I said that trying to emulate OOP in a language which has no
> direct support for OOP is a kludge. The only thing I said about reflection
> is that IMO it breaks modularity and thus you lose many of the benefits of
> modular design.

Fair enough. I misremembered precisely what you said, and went with the 
logical implication that seemed to follow.

I'll still disagree that emulating OO with closures in LISP is a kludge, 
because other features of LISP provide direct support for changing the 
language itself to support new features. Once you hit that meta-level of 
support, it's a whole new ball of wax.

-- 
   Darren New, San Diego CA, USA (PST)
   My fortune cookie said, "You will soon be
   unable to read this, even at arm's length."



From: Darren New
Subject: Re: This is the sort of brokenness...
Date: 19 Mar 2009 18:31:54
Message: <49c2c7da$1@news.povray.org>
Warp wrote:
> Darren New <dne### [at] sanrrcom> wrote:
>> Actually, I thought it ended in C++ because that's the only unsafe OOP 
>> language I know of.
> 
>   I can't understand if that means "I haven't studied unsafe OOP languages
> other than C++" or "to my knowledge there are no other unsafe OOP languages".

There are few other unsafe OOP languages I know enough about to discuss. Ada 
and C++ are the only unsafe OOP languages I know, and I don't know Ada well 
enough to know what bad code is erroneous and what bad code is simply UB in 
Ada.

>> It's the only language I know where the compiler 
>> attempts to enforce modularity but in practice fails to do so, by declaring 
>> violation of modularity "undefined" or "compiler-specific" rather than 
>> catching all cases of broken modularity at compile time or runtime.
> 
>   Then you don't know much, do you.

Possibly. What OO languages do you know of where the compiler attempts to catch 
all violations of modularity at compile time but in practice fails to do so?

>> I was trying to figure out how well-documented modularity breaking is a 
>> "kludge" but poorly documented unsupported modularity breaking isn't.
> 
>   The connection between my usage of the word "kludge" and your usage
> of the words "unsafe" and "reflection" is purely your invention.

OK. I was confused by the fact that your post had
1) a paragraph about how CLOS allows access to the private variables,
2) a comment that languages lacking data hiding aren't really OO, and
3) a claim that a lack of specific support for hiding means the OO is kludgey.

I leapt to the conclusion that in 2 and 3 you were actually talking about 
the same thing as you were talking about in 1. That was a bad assumption, 
and started me arguing that CLOS has a reasonable degree of modularity built 
in.

If you weren't talking about CLOS in 2 and 3, I'm not really sure why you 
quoted (1) there.

If you *were* talking about CLOS, I'm arguing that there isn't to my 
knowledge any OO language that does a better job. And none that I know of 
that use "private:".

-- 
   Darren New, San Diego CA, USA (PST)
   My fortune cookie said, "You will soon be
   unable to read this, even at arm's length."



