POV-Ray : Newsgroups : povray.off-topic : Tell me it isn't so!
  Tell me it isn't so! (Message 131 to 140 of 473)
From: Warp
Subject: Re: Tell me it isn't so!
Date: 23 Jul 2009 13:53:19
Message: <4a68a38f@news.povray.org>
clipka <nomail@nomail> wrote:
> Ah, yes - here we enter the realm of blah that scared *me* away from OOP when I
> first came into contact with it:

> "WTF - what does *this* crap have to do with *programming*?!"

  It has to do with program design. The larger the program is, the more
important it is for it to have been designed properly. A large program
without proper design easily becomes unmaintainable and incomprehensible.

  One of the most basic methods for keeping a large program manageable is
to subdivide it into smaller parts, hierarchically. When you write something
inside the large program, you shouldn't have to keep the *entire* program
in mind in order to write that small part. You should only need to keep in
mind the parts of the rest of the program that are *relevant* to making
that small part work.

  It's the same in almost everything that is big and complex, not just
programming. The CEO of a large company doesn't have to worry about what
each and every one of the ten thousand employees is doing. That would be
impossible; he simply can't manage ten thousand things at once. It becomes
even more complicated when there are sub-contractors and other such
companies to deal with. Instead, there's a *hierarchy* of management in the
company: the CEO directs a dozen or so executives, who direct a dozen or so
managers, who direct individual employees, or however a big company happens
to be subdivided.

  The functionality of a large program can largely be divided into a
hierarchy by distributing the code logically into functions. However,
functions are not enough when you need to handle enormous amounts of
different types of *data*. Data management also has to be divided into
logical parts, often hierarchically.

  The solution presented (although not originally invented) by object
orientation for this problem is the concept of modules: modules can contain
both functionality and data, and they encapsulate both in a way that makes
managing the data easier in a large project. Modules can also form a
hierarchy (so modules can define sub-modules inside them, own instances
of other modules, etc.).

  Most concepts can be naturally expressed as modules. If you have, for
example, the concept of a "string", you can write a module which represents
such a string, with all the necessary data and functionality related to
strings. When such a string module has been properly designed, handling
strings in the program becomes easier and more manageable. The rest of
the code doesn't have to worry about how strings are handled; it just
uses the functionality provided by the module. This also makes it easier
to *change* the implementation of the string module without breaking
existing code.
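  The string-module idea above can be sketched in a few lines of C++. The
class below (MyString, a name invented for this sketch, not from any actual
codebase) keeps the data and the operations on it behind one interface, so
callers never see the representation:

```cpp
#include <cstddef>
#include <cstring>

// A minimal sketch of the "string module" described above: the data
// (buffer, length) and the functionality (append, length) live together,
// and client code touches only the public interface.
class MyString {
public:
    MyString() : len_(0) { buf_[0] = '\0'; }

    void append(const char* s) {
        std::size_t n = std::strlen(s);
        if (len_ + n >= sizeof(buf_))          // clip to available space
            n = sizeof(buf_) - 1 - len_;
        std::memcpy(buf_ + len_, s, n);
        len_ += n;
        buf_[len_] = '\0';
    }

    std::size_t length() const { return len_; }
    const char* c_str() const { return buf_; }

private:
    // Implementation detail: a fixed buffer today, heap allocation
    // tomorrow -- callers never know, so it can change freely.
    char buf_[256];
    std::size_t len_;
};
```

  Because the buffer and length are private, swapping the fixed array for
heap allocation later would not break any caller, which is exactly the
"change the implementation without breaking existing code" point.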

> Despite all claims, I'm convinced this has virtually *nothing* to do with how
> normal people *really* think (at least when faced with the task of explaining to
> some dumb box how it should *do* something), and has even less to do with OO
> *programming*.

  Oh, it has a lot to do with how normal people think, and how other things
(such as big companies) are organized.

  When you go to a grocery store to buy food, you don't have to care how
the grocery store works internally. You just use its "interface" to buy the
food, and that's it. The store takes care of its own inner functionality.
That's modularity and object orientation in action, in real life.

> David, you hear me? *This* is *not* OOP. This is indeed BS originally from
> people leaning on the shallow theoretical side, trying to sell "OO"-labelled
> products (compilers, tools, training, consulting, whatever) to people who
> haven't experienced the benefits of OOP in practice yet.

  I'm sorry, but I think that the one writing bullshit is you.

-- 
                                                          - Warp



From: Warp
Subject: Re: Tell me it isn't C
Date: 23 Jul 2009 14:05:01
Message: <4a68a64d@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> Warp wrote:
> >   Because C allows many types of low-level optimization that are very
> > difficult, if not even outright impossible, in higher-level languages.

> >   For example, you know how a C struct will map onto memory locations,
> > on every possible platform which your kernel can be compiled for.

> Not in a portable way. You don't know what kinds of padding happen, or what 
> order the integers are in.

  Yes, you do. Maybe you didn't understand the "on every possible platform
which your kernel can be compiled for" part?

  Ok, maybe I expressed myself poorly; I should have written "on every
possible platform your kernel has been ported to".
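  The point about knowing the layout on a given platform can be made
concrete with offsetof (the struct and field names below are invented for
illustration): padding is implementation-defined *across* platforms, but on
any one platform/ABI it is fixed and inspectable.

```cpp
#include <cstddef>
#include <cstdint>

// Hypothetical struct for illustration.  The compiler may insert padding
// between 'type' and 'value' (typically 3 bytes on common ABIs), but for
// any one platform the layout is fixed and can be checked with offsetof.
struct Packet {
    std::uint8_t  type;
    std::uint32_t value;
};

constexpr std::size_t type_off  = offsetof(Packet, type);
constexpr std::size_t value_off = offsetof(Packet, value);
```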

> > You know
> > exactly what assigning one struct instance to another entails as machine
> > code. 

> That's true of lots of low-level languages too. FORTH and Ada both spring to 
> mind, for example. Lots of languages of the same ilk have instructions for 
> laying out structures at a per-byte level. (Erlang? Modula-whatever?)

  The difference is probably that neither FORTH nor Ada have the same amount
of libraries, platform support or optimizing compilers, nor are they nearly
as popular.

>  > If you need a pointer which should point to a raw memory address,
> > you can create one. 

> Not in C. Only in some language that looks like C but invoked undefined 
> behavior.

  Of course in C. And "undefined behavior" can also mean "works as desired
on this platform". When you know what the compiler is doing, and you are
writing platform-specific code, C allows you to do a whole lot of things
you can't do in other languages.

  Most DOS demos written in C used raw pointers (e.g. to the VGA memory
buffer). They worked just fine on that platform.
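  The raw-pointer trick being described rests on C's implementation-defined
integer-to-pointer conversion: a DOS demo would forge a pointer straight
from the fixed VGA address. The sketch below (function name invented) only
round-trips the address of a real object, so it can actually be run
anywhere:

```cpp
#include <cstdint>

// Forge a pointer from a raw integer address and back.  The conversions
// are implementation-defined, not portable C -- exactly the "works as
// desired on this platform" situation discussed above.  A DOS demo would
// have used a fixed hardware address instead of an object's address.
int read_via_raw_address(int* p) {
    std::uintptr_t addr = reinterpret_cast<std::uintptr_t>(p);
    int* raw = reinterpret_cast<int*>(addr);  // pointer rebuilt from integer
    return *raw;
}
```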

> >   And of course C allows for programmatical optimizations which can often
> > be difficult in higher-level languages. Memory usage optimization would be
> > the main example. 

> I'm not sure what that means. If I have an array of 12-bit values in C, 
> that's a PITA to implement compared to a lot of other languages (including C++).

  It means that many, if not most, of the "high-level" languages pay zero
attention to memory usage. They freely and carelessly allocate memory as if
it were candy because, you know, all computers have gigazillion bytes of
RAM, and any program written in that language will run alone, so it
doesn't have to worry about other programs which might also want some of
the memory.
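  As a concrete instance of the memory-usage point (and of the 12-bit array
Darren mentions), here is a hand-rolled packing scheme that stores two
12-bit values in every three bytes instead of wasting a 16- or 32-bit slot
per value; the function names are invented for this sketch:

```cpp
#include <cstddef>
#include <cstdint>

// Store value i (0..4095) into a byte buffer at bit offset i*12.
void put12(std::uint8_t* buf, std::size_t i, std::uint16_t v) {
    std::size_t byte = (i * 12) / 8;
    if (i % 2 == 0) {                 // value starts on a byte boundary
        buf[byte] = v & 0xFF;
        buf[byte + 1] = (buf[byte + 1] & 0xF0) | ((v >> 8) & 0x0F);
    } else {                          // value starts mid-byte
        buf[byte] = (buf[byte] & 0x0F) | ((v & 0x0F) << 4);
        buf[byte + 1] = (v >> 4) & 0xFF;
    }
}

// Read value i back out of the packed buffer.
std::uint16_t get12(const std::uint8_t* buf, std::size_t i) {
    std::size_t byte = (i * 12) / 8;
    if (i % 2 == 0)
        return buf[byte] | ((buf[byte + 1] & 0x0F) << 8);
    else
        return ((buf[byte] >> 4) & 0x0F) | (buf[byte + 1] << 4);
}
```

  This is exactly the kind of thing that is a PITA to write by hand but
cuts memory use by a quarter versus 16-bit slots, which is the trade-off
being argued about.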

-- 
                                                          - Warp



From: Warp
Subject: Re: Tell me it isn't C
Date: 23 Jul 2009 14:07:44
Message: <4a68a6f0@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> Warp wrote:
> > Darren New <dne### [at] sanrrcom> wrote:
> >> That's an IDE issue, not a language issue. See, for example, Eiffel's IDEs.
> > 
> >   IMO a language shouldn't rely on IDE or text editor features in order to
> > be readable.

> So you'd rather rely on the programmer doing a job the IDE ought to be 
> automating, just to insist it's there?

  Yes, because IDEs are necessarily quite platform-specific.

> We're discussing basically manually built .h files vs automatically built .h 
> files, and you're suggesting that manually built .h files are better because 
> the compiler can force someone to provide them, whereas with an IDE they 
> might not have a nice summary?  I don't think you really mean that.

  You said "That's an IDE issue, not a language issue." I read that to mean
that a language (like C# in this case) doesn't need to be designed to be
readable because an IDE can be used to make it readable.

-- 
                                                          - Warp



From: Darren New
Subject: Re: Tell me it isn't so!
Date: 23 Jul 2009 14:10:56
Message: <4a68a7b0$1@news.povray.org>
Warp wrote:
>   When you go to a grocery store to buy food, you don't have to care how
> the grocery store works internally. You just use its "interface" to buy the
> food and that's it. The store takes care of its own inner functionality.

And remember that the store also has food ordering at the back end,
accounting, paying employees, filing taxes, cleaning the floors, predicting
how much food needs to be ordered, etc. *That* is where the modularity comes
in, and the OO, even more than just "buying food". A program that only deals
with one kind of interaction is like a database with only one primary key.

-- 
   Darren New, San Diego CA, USA (PST)
   "We'd like you to back-port all the changes in 2.0
    back to version 1.0."
   "We've done that already. We call it 2.0."



From: David H. Burns
Subject: Re: Tell me it isn't so!
Date: 23 Jul 2009 14:11:50
Message: <4a68a7e6$1@news.povray.org>
Warp wrote:
> David H. Burns <dhb### [at] cherokeetelnet> wrote:
>> *I* don't think, naturally or otherwise, like any OOP programming I have 
>> seen!! You may be trying to
>> read your own programming philosophy onto the world, or we may be of 
>> different genera. :)
> 
>   You clearly don't understand what "object-oriented programming" means.
> Then you come here to tell us that you don't want it, whatever it might be.
> 
I have seen a number of object oriented programs, and I did not like what
I saw. They are pretty nigh incomprehensible to me. I know they were
produced (or claimed to be produced) by object oriented programming. Now
it's true that the process of programming is different from the program
produced, but the product that I see gives little indication that it was
produced by thought processes like mine. I don't presume to say that you
don't naturally think like object oriented programming, but since I have
more experience with *my* mind than you do, I can assure you that I don't!
I definitely would not like to see the POV-Ray scripting language turn
into something like those examples of OOP I have seen.

BTW, I forgot one other possibility: you might be a genius. I once worked
with an old fellow who was a near genius (at least) at programming. His
mind seemed to work in a way almost diametrically opposed to mine. Any
time we tried to work together, we started with an argument but ended up
accomplishing our task. He's dead now; I wonder what he would have thought
of OOP. It might have been his native language. :)

David



From: Warp
Subject: Re: Tell me it isn't so!
Date: 23 Jul 2009 14:13:21
Message: <4a68a840@news.povray.org>
David H. Burns <dhb### [at] cherokeetelnet> wrote:
> >   You clearly don't understand what "object-oriented programming" means.
> > Then you come here to tell us that you don't want it, whatever it might be.
> > 
> I have seen a number of object oriented programs.

  Ironically, by saying that you are only *confirming* what I just wrote.

-- 
                                                          - Warp



From: David H. Burns
Subject: Re: Tell me it isn't so!
Date: 23 Jul 2009 14:15:10
Message: <4a68a8ae@news.povray.org>
Warp wrote:
> David H. Burns <dhb### [at] cherokeetelnet> wrote:
>> What I have seen of OOP
> 
>   Clearly you haven't. You don't even understand what "object-oriented
> programming" means.
> 
OK so I don't. Tell me. :)



From: Darren New
Subject: Re: Tell me it isn't C
Date: 23 Jul 2009 14:16:16
Message: <4a68a8f0$1@news.povray.org>
Warp wrote:
>   Yes, you do. Maybe you didn't understand the "on every possible platform
> which your kernel can be compiled for" part?

I'm just saying that's not really well-defined in *C*. It's 
compiler-specific, not C as such.

>   The difference is probably that neither FORTH nor Ada have the same amount
> of libraries, platform support or optimizing compilers, nor are they nearly
> as popular.

True. And those things feed on each other.

>   Of course in C. And "undefined behavior" can also mean "works as desired
> in this platform".

That's what I'm distinguishing. That's why I say C, as such, isn't very good 
for that sort of thing. You can do it, but only because you look at what 
your particular compiler generated for some piece of otherwise undefined 
code, and say "yes, that's what I'd like."

> When you know what the compiler is doing, and you are
> writing platform-specific code, C allows you to do a whole lot of things
> you can't do with other languages.

I'd phrase that as "the compiler will generate code for programs without 
defined semantics that's often what you want."

>   Most DOS demos written in C used raw pointers (eg. to the VGA memory
> buffer). They worked just fine on that platform.

Maybe I'm more of a theoreticist, but "it works for me" isn't how I like to 
write kernels. ;-)

>   It means that many if not most of the "high-level" languages pay zero
> attention to memory usage.

I'm not saying every programming language is appropriate for writing 
kernels. I'm saying there are good high-level languages, both more powerful 
and safer than C, and I'm not sure why they're not more popular.

-- 
   Darren New, San Diego CA, USA (PST)
   "We'd like you to back-port all the changes in 2.0
    back to version 1.0."
   "We've done that already. We call it 2.0."



From: David H. Burns
Subject: Re: Tell me it isn't so!:Apparently it is!
Date: 23 Jul 2009 14:22:24
Message: <4a68aa60$1@news.povray.org>
scott wrote:
>> That's what it sounds like, the Pov-Ray world anyway. You've greatly 
>> increased my paranoia.
>> Any day now I expect the carpenter's union to redesign and rebuild my 
>> house not for
>> convenience or comfort but to teach me the current accepted practice 
>> in carpentry.
> 
> Even if POV4 does change to an OOP model that you don't like, a) that is 
> not going to be released for decades, and b) you are free to still use 
> the latest stable 3.x release.  I wouldn't worry about it.
> 
> 
If that be the case, shall we postpone this discussion for
twenty years or so? :-) Did that produce a wink?

David



From: Warp
Subject: Re: Tell me it isn't C
Date: 23 Jul 2009 14:23:59
Message: <4a68aabf@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> >   Of course in C. And "undefined behavior" can also mean "works as desired
> > in this platform".

> That's what I'm distinguishing. That's why I say C, as such, isn't very good 
> for that sort of thing. You can do it, but only because you look at what 
> your particular compiler generated for some piece of otherwise undefined 
> code, and say "yes, that's what I'd like."

  There aren't many programming languages out there which allow the
same kind of "controlled compiler abuse" as C, which is precisely why it's
so popular for writing kernels and device drivers.

  Of course not all kernels in existence have been written in C (or asm),
but offhand I can't say what other languages have been used to write
kernels.

> >   Most DOS demos written in C used raw pointers (eg. to the VGA memory
> > buffer). They worked just fine on that platform.

> Maybe I'm more of a theoreticist, but "it works for me" isn't how I like to 
> write kernels. ;-)

  Sometimes you just have to. No programming language can account for every
single feature a kernel needs. Sometimes you simply *must* bypass the
standard language and poke the hardware directly.

  The only real alternative would be to write the kernel in assembly, which
would be enormously less portable and less manageable than C.

-- 
                                                          - Warp




Copyright 2003-2023 Persistence of Vision Raytracer Pty. Ltd.