POV-Ray : Newsgroups : povray.off-topic : Tell me it isn't so!
  Tell me it isn't so! (Message 144 to 153 of 473)
From: clipka
Subject: Re: Tell me it isn't so!
Date: 23 Jul 2009 15:15:01
Message: <web.4a68b5a7ac52dfd4aca5323b0@news.povray.org>
Warp <war### [at] tagpovrayorg> wrote:
> > "WTF - what does *this* crap have to do with *programming*?!"
>
>   It has to do with program design. The larger the program is, the more
> important it is for it to have been designed properly. A large program
> without proper design easily becomes unmaintainable and incomprehensible.

You didn't get my point.

*Now* you're talking about something OOP- and programming-related:
Modularization. But what on earth does your previous post's talk of "concepts",
pens, cars, dogs and cats have to do with this?

Well, with some experience with OOP concepts, *I* do know how to map them to the
constructs prevalent in this type of blurb, but it's still blurb and not the
real thing.


>   One of the most basic methods for keeping a large program manageable is
> to subdivide it into smaller parts, hierarchically. When you write something
> inside the large program, you shouldn't have to be keeping in mind the
> *entire* program in order to be able to write that small part. You should
> be able to keep in mind only the *relevant* parts of the rest of the program
> necessary to make that small part work.

BTW, note how the blurb may help to confuse: You're now talking about modular
hierarchies (going back to cats, they're comprised of a body, a head, four legs
and a tail); in your previous post, you were talking about conceptual
hierarchies (cats and dogs both being animals).

You *need* the modular hierarchies for large projects; but the blurb focuses on
the conceptual hierarchies.

>   The solution presented (although not originally invented) by object
> orientedness to this problem is the concept of modules: Modules can have
> both functionality and data, and they enclose both in a way that makes
> managing the data easier in a large project. Modules can also form a
> hierarchy (so modules can define sub-modules inside them, or own instances
> of other modules, etc).

Note that encapsulation in OO goes a step beyond what would typically be
considered modularization: A module is typically thought of as a collection of
code, typically coming with various data structures and a bit of module-global
data as an aside, and a project would typically have one instance of each
module. In OO, the focus is more on the data and the interface, with the code
taking on the role of an aside, and the whole project would be dealing with
multiple instances at once.
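To make the contrast concrete, here is a minimal C++ sketch (all names are invented for illustration): a namespace plays the role of a classic one-instance module with module-global data, while a class bundles data with its interface and is instantiated as many times as needed.

```cpp
#include <string>
#include <vector>

// Classic "module": free functions plus one shared blob of module-global data.
// The whole program sees exactly one instance of this.
namespace logger {
    inline std::vector<std::string> messages;  // single, program-wide instance
    inline void log(const std::string& m) { messages.push_back(m); }
}

// OO-style unit: data and interface travel together, and any number of
// independent instances can coexist, each owning its own state.
class Counter {
public:
    void increment() { ++count_; }
    int value() const { return count_; }
private:
    int count_ = 0;  // per-instance data, not module-global
};
```

A caller can then create `Counter a, b;` and advance them independently, whereas `logger::messages` is shared by everything, which is exactly the one-instance-per-module situation described above.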


> > Despite all claims, I'm convinced this has virtually *nothing* to do with how
> > normal people *really* think (at least when faced with the task of explaining to
> > some dumb box how it should *do* something), and has even less to do with OO
> > *programming*.
>
>   Oh, it has a lot to do with how normal people think, and how other things
> (such as big companies) are organized.

Read again: "(at least when faced with the task of explaining to some dumb box how
it should *do* something)"

And notice that I'm talking about the blurb here, the cats and dogs and pens BS,
*not* OOP. I *do* agree that OO is indeed a very natural way of thinking when
analyzing or planning complex systems. It also very closely resembles the
design of e.g. complex machines, btw. But the typical OOP introduction blurb
tends to be very far from that.

And yes, in a sense this may further some "elite" thinking: With OOP being
introduced this way, students may get the impression that it is something
totally different from older programming paradigms, when in reality it is just
adding a few not really complicated concepts.


>   I'm sorry, but I think that the one writing bullshit is you.

Well, sorry if I got you a bit upset - I didn't do full justice to your post, as
it's rather on the humane side of blurb; but it's still blurb-ish enough,
and I know you have a thick skin.


Post a reply to this message

From: David H. Burns
Subject: Re: Tell me it isn't so!
Date: 23 Jul 2009 15:21:19
Message: <4a68b82f$1@news.povray.org>
Warp wrote:
> David H. Burns <dhb### [at] cherokeetelnet> wrote:
>>>   You clearly don't understand what "object-oriented programming" means.
>>> Then you come here to tell us that you don't want it, whatever it might be.
>>>
>> I have seen a number of object oriented programs.
> 
>   Ironically, by saying that you are only *confirming* what I just wrote.
> 

OK, I have seen OOP programs but I don't know what Object Oriented
Programming means. Object Oriented Programming does produce Object
Oriented Programs, but looking at them doesn't tell me what Object
Oriented Programming means. I'll accept that. Now, I ask again:
tell me what Object Oriented Programming means.

David



From: Darren New
Subject: Re: Tell me it isn't C
Date: 23 Jul 2009 15:32:09
Message: <4a68bab9$1@news.povray.org>
Warp wrote:
>   There aren't many programming languages out there which would allow the
> same kind of "controlled compiler abuse" as C, which is precisely why it's
> so popular to write kernels and device drivers.

True, but there are a number that do. :-)  Most people don't need controlled 
abuse.

> but right now I can't say what other languages have been used to write
> kernels.

Well, Ada is high on the list, as is FORTH. Algol was used for the B-series 
from Burroughs. That's why I mentioned them.

>   Sometimes you just have to. No programming language can account for every
> single feature a kernel needs. Sometimes you just *must* bypass the standard
> language and poke the hardware directly.

Sure. But, for example, take a look at the list of things that Ada defines 
how to do that C doesn't that would be useful for kernels:

Interrupt hooking, interrupt prioritization, interrupt disabling while 
running a higher-priority interrupt. Packed structures where you can define 
what and where every bit goes. Switching stacks (as in, context switching). 
Atomically writing to memory-mapped hardware. Test-and-set. Dynamically 
loading code and then executing it. Mapping data structures to particular 
addresses. Well-defined language structures for inline assembler. Plus 
everything C++ does (including interfacing to other calling conventions) 
except maybe turing complete templates. :-)

Oh, and I think it handles segmented memory, but I might be mistaken there.

Sure, if you need to invoke some magic opcode to blink the front panel 
lights, it's going to be really hard to make that portable. But you can do a 
lot better than C does.  Of course, Ada has another 25 years of experience 
on C, so it's not really a fair comparison.
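For flavor, here is the kind of thing C and C++ programmers do by convention rather than by language-defined guarantee (the register layout, bit meanings, and base address below are entirely made up for illustration): a struct of volatile members, a pointer conjured from a fixed address, and bit masks maintained by hand. Ada's representation clauses let the language check this layout; in C/C++ it is on the programmer.

```cpp
#include <cstdint>

// Hypothetical UART register block. C/C++ only promise that volatile
// accesses are not elided or reordered by the optimizer; the placement
// at a physical address and the bit layout are pure convention.
struct UartRegs {
    volatile std::uint32_t data;    // offset 0x0
    volatile std::uint32_t status;  // offset 0x4; bit 0 = "TX ready" (our convention)
};

constexpr std::uintptr_t kUartBase = 0x40000000u;  // fictitious address

inline UartRegs* uart() {
    // Implementation-defined cast; nothing in the language verifies
    // that a UART actually lives there.
    return reinterpret_cast<UartRegs*>(kUartBase);
}

inline bool tx_ready(const UartRegs* r) {
    return (r->status & 0x1u) != 0;  // hand-maintained bit mask
}
```

Nothing here would fail to compile if the hardware layout changed; that silence is precisely the gap that Ada's language-defined mapping features close.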

I really think C got off the ground just by being used for the first 
portable kernel. The same as FORTRAN still being tops in numeric analysis 
because it had the first portable numeric analysis libraries.

>   The only other alternative would be to write the kernel in assembly, which
> would be enormously less portable and less manageable than C.

As I said, lots of kernels are written in FORTH (for small machines) or Ada 
(for dangerous machines).

-- 
   Darren New, San Diego CA, USA (PST)
   "We'd like you to back-port all the changes in 2.0
    back to version 1.0."
   "We've done that already. We call it 2.0."



From: David H. Burns
Subject: Re: Tell me it isn't so!:Apparently it is!
Date: 23 Jul 2009 15:44:48
Message: <4a68bdb0$1@news.povray.org>
Warp wrote:
> David H. Burns <dhb### [at] cherokeetelnet> wrote:
>> Neeum Zawan wrote:
> 
>>>     Try lying in front of the house when the bulldozers come. And if you 
>>> see someone coming along hugging a towel, become friends with him. 
>>> Otherwise your life will be in danger.
>>>
>>>     And not because of the bulldozers.
>>>
>> I'm not at all clear what this means. Submit to the destructors (or to 
>> your fate)?
>> Do you mean OOP is a bulldozing machine predestined to level all. That 
>> I'm an
>> obstructor of the progress man and should be removed? That my life is
>> in danger from irate OOP programmers? What did you mean?
> 
>   Can I ask you how old are you?
> 
>   You don't understand the concept of smileys, you don't understand the
> concept of object-oriented programming, and you can't even get a hitchhiker's
> reference.
> 
True, I didn't recognize the reference to the Hitchhiker's Guide and
wouldn't have for a million years without your hint. I have read it, saw
the PBS version, and had successfully forgotten them. And I didn't know
how to make smileys, if that constitutes not understanding the concept.
And I don't know much about OOP programming, but these exchanges have
taught me a lot about OOP devotees.


I'm 63, over the hill and senile. Just an uncultured, ignorant old retard
who needs to get out of the way and let you young folk take over.

Can I ask how old you are? :)

David



From: clipka
Subject: Re: Tell me it isn't C
Date: 23 Jul 2009 15:45:01
Message: <web.4a68bd776bcf74aeaca5323b0@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> Sure. But, for example, take a look at the list of things that Ada defines
> how to do that C doesn't that would be useful for kernels:
>
> Interrupt hooking, interrupt prioritization, interrupt disabling while
> running a higher-priority interrupt. Packed structures where you can define
> what and where every bit goes. Switching stacks (as in, context switching).
> Atomically writing to memory-mapped hardware. Test-and-set. Dynamically
> loading code and then executing it. Mapping data structures to particular
> addresses. Well-defined language structures for inline assembler. Plus
> everything C++ does (including interfacing to other calling conventions)
> except maybe turing complete templates. :-)

An interesting thought here:

How much influence do the most popular programming languages have on the
further development of hardware design?

> As I said, lots of kernels are written in FORTH (for small machines) or Ada
> (for dangerous machines).

Maybe the latter is somewhat comforting. Not that the pure *existence* of
dangerous machines would be.



From: Mike Raiford
Subject: Re: Tell me it isn't so!:Apparently it is!
Date: 23 Jul 2009 15:47:34
Message: <4a68be56@news.povray.org>
scott wrote:

> 
> Even if POV4 does change to an OOP model that you don't like, a) that is 
> not going to be released for decades, and b) you are free to still use 
> the latest stable 3.x release.  I wouldn't worry about it.

Crap. I'm gonna be old by the time it's released.


-- 
~Mike



From: Warp
Subject: Re: Tell me it isn't so!
Date: 23 Jul 2009 16:27:21
Message: <4a68c7a9@news.povray.org>
David H. Burns <dhb### [at] cherokeetelnet> wrote:
> Now, I ask again
> tell me what Object Oriented Programming means.

  I have already written extensively about it in several posts in this
thread.

-- 
                                                          - Warp



From: Warp
Subject: Re: Tell me it isn't so!
Date: 23 Jul 2009 16:40:24
Message: <4a68cab7@news.povray.org>
clipka <nomail@nomail> wrote:
> *Now* you're talking about something OOP- and programming-related:
> Modularization. But what on earth does your previous post's talk of "concepts",
> pens, cars, dogs and cats have to do with this?

  In object-oriented programming a class is basically a user-defined type,
and a user-defined type is a concept. For example, "a string" is a concept,
and a string class is the implementation of that concept.

  A concept is basically something which behaves in a certain way. By writing
a class you are defining how it behaves.

  Inheritance is related to a hierarchy of concepts: More concrete concepts
inherit from more abstract concepts. For example, a file input stream may
inherit from a more generic input stream, both of which are concepts.

  The animal-dog example is not at all far-fetched. If you were to write
some kind of life simulation program with different plants and animals,
the classical inheritance hierarchy would, in fact, be basically perfect
for designing all the entities in that program.
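A minimal C++ sketch of that simulation idea (class and method names are invented for illustration): the concept "animal" becomes an abstract base class that promises certain behavior, and the concrete species implement it.

```cpp
#include <memory>
#include <string>

// The abstract concept: anything that behaves like an animal.
class Animal {
public:
    virtual ~Animal() = default;
    virtual std::string sound() const = 0;  // behavior the concept promises
};

// More concrete concepts inherit from the more abstract one.
class Dog : public Animal {
public:
    std::string sound() const override { return "woof"; }
};

class Cat : public Animal {
public:
    std::string sound() const override { return "meow"; }
};
```

A simulation can then keep its entities as, say, a container of `std::unique_ptr<Animal>` and call `sound()` on each without caring which species it is dealing with; the same mechanics carry over unchanged to streams, strings, or database entries.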

> >   One of the most basic methods for keeping a large program manageable is
> > to subdivide it into smaller parts, hierarchically. When you write something
> > inside the large program, you shouldn't have to be keeping in mind the
> > *entire* program in order to be able to write that small part. You should
> > be able to keep in mind only the *relevant* parts of the rest of the program
> > necessary to make that small part work.

> BTW, note how the blurb may help to confuse: You're now talking about modular
> hierarchies (going back to cats, they're comprised of a body, a head, four legs
> and a tail); in your previous post, you were talking about conceptual
> hierarchies (cats and dogs both being animals).

  I don't see how those are mutually exclusive. A dog may be composed of
several parts, and a dog may be an animal, both at the same time. I really
don't see the problem.

  (In fact, inheritance and composition are two concepts of OOP.)
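Both relationships can indeed live in the same class at once, as in this minimal sketch (all names invented): a dog *is an* animal by inheritance and *has* legs and a tail by composition.

```cpp
#include <array>
#include <cstddef>
#include <string>

class Leg  { /* body parts elided */ };
class Tail { /* body parts elided */ };

class Animal {
public:
    virtual ~Animal() = default;
    virtual std::string kind() const = 0;
};

// "is-an" Animal (conceptual hierarchy) and "has" legs and a tail
// (modular hierarchy), simultaneously.
class Dog : public Animal {
public:
    std::string kind() const override { return "dog"; }
    std::size_t legCount() const { return legs_.size(); }
private:
    std::array<Leg, 4> legs_;  // composition
    Tail tail_;                // composition
};
```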

> You *need* the modular hierarchies for large projects; but the blurb focuses on
> the conceptual hierarchies.

  In many programs inheritance is best used to describe conceptual hierarchies.
It may not be about dogs and cats, but it may be e.g. about file or socket
handles, strings, database entries, or basically anything. The mechanics
are still the same.

> >   The solution presented (although not originally invented) by object
> > orientedness to this problem is the concept of modules: Modules can have
> > both functionality and data, and they enclose both in a way that makes
> > managing the data easier in a large project. Modules can also form a
> > hierarchy (so modules can define sub-modules inside them, or own instances
> > of other modules, etc).

> Note that encapsulation in OO goes a step beyond what would typically be
> considered modularization: A module is typically thought of as a collection of
> code, typically coming with various data structures and a bit of module-global
> data as an aside, and a project would typically have one instance of each
> module. In OO, the focus is more on the data and the interface, with the code
> taking on the role of an aside, and the whole project would be dealing with
> multiple instances at once.

  Modular programming does know the concept of instantiating modules. For
example, the Modula programming language (which is not an OOP language) has
modules with public and private interfaces, and these can be instantiated
and referenced. (What makes it non-OOP is that it supports neither
inheritance nor, consequently, dynamic binding.)

-- 
                                                          - Warp



From: Warp
Subject: Re: Tell me it isn't C
Date: 23 Jul 2009 16:45:24
Message: <4a68cbe3@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> As I said, lots of kernels are written in FORTH (for small machines) or Ada 
> (for dangerous machines).

  I wonder why Ada never got much popularity, even though it was really
pushed at some point (especially by the US government, AFAIK). Was it
lack of compilers for common hardware or something?

-- 
                                                          - Warp



From: Darren New
Subject: Re: Tell me it isn't C
Date: 23 Jul 2009 17:49:47
Message: <4a68dafb$1@news.povray.org>
clipka wrote:
> How high is the influence of the most favorite programming languages on the
> further development hardware design?

Exactly. The Intel line of x86s was originally designed for Pascal. You can
tell by looking at the four segments (which map exactly to Pascal data
spaces) and the "ret n" instruction, even if there weren't historical
documentation backing it up.

> Maybe the latter is somewhat comforting. Not that the pure *existence* of
> dangerous machines would be.

I figure if you're writing any code that kills >100 people when it crashes, 
you probably shouldn't be writing it in a language that wasn't designed for 
that. :-)

-- 
   Darren New, San Diego CA, USA (PST)
   "We'd like you to back-port all the changes in 2.0
    back to version 1.0."
   "We've done that already. We call it 2.0."




Copyright 2003-2023 Persistence of Vision Raytracer Pty. Ltd.