POV-Ray : Newsgroups : povray.off-topic : Tell me it isn't so!
  Tell me it isn't so! (Message 121 to 130 of 473)
From: Warp
Subject: Re: Tell me it isn't so!
Date: 23 Jul 2009 12:46:08
Message: <4a6893d0@news.povray.org>
David H. Burns <dhb### [at] cherokeetelnet> wrote:
> *I* don't think, naturally or otherwise, like any OOP programming I have 
> seen!! You may be trying to
> read your own programming philosophy onto the world, or we may be of 
> different genera. :)

  You clearly don't understand what "object-oriented programming" means.
Then you come here to tell us that you don't want it, whatever it might be.

-- 
                                                          - Warp



From: Warp
Subject: Re: Tell me it isn't so!
Date: 23 Jul 2009 12:46:56
Message: <4a689400@news.povray.org>
David H. Burns <dhb### [at] cherokeetelnet> wrote:
> What I have seen of OOP

  Clearly you haven't. You don't even understand what "object-oriented
programming" means.

-- 
                                                          - Warp



From: Darren New
Subject: Re: Tell me it isn't so!
Date: 23 Jul 2009 12:55:32
Message: <4a689604$1@news.povray.org>
Warp wrote:
>   Object-oriented programming closely matches the thought process of people.

I think it's more like how people categorize real-world stuff. Since a lot 
of programming is essentially simulating the real world, OO makes sense.

There's a bunch of things that go on in the real world that don't make sense 
to do with OO: expert systems, board game AIs, financial analysis, 
accounting, etc.

There's a bunch of things that make sense with OO: Windowing system, 
interactions between businesses and customers, etc.

To say "people think in OO" is to miss that in many situations people don't 
think in OO.

>   This is the way people think naturally,

I would say this is *A* way people think naturally.  Something as simple as 
a decision table or decision tree is another way people think naturally.

-- 
   Darren New, San Diego CA, USA (PST)
   "We'd like you to back-port all the changes in 2.0
    back to version 1.0."
   "We've done that already. We call it 2.0."



From: Darren New
Subject: Re: Tell me it isn't C
Date: 23 Jul 2009 13:05:46
Message: <4a68986a$1@news.povray.org>
Invisible wrote:
> One assumes that if you don't need foo(), you wouldn't bother including 
> the header file. 

You may use printf() without using sprintf(), fprintf(), and vprintf(). But 
they're all in the same header file.

In any case, I'm telling you what's *correct*, regardless of whether you 
think it's good style. :-)

>> Sort of. That's called "linking." Then there's "loading", which is 
>> when you put it into memory and *again* adjust a bunch of addresses.
> 
> Right. So what you're saying is that there's actually a second linking 
> stage each time the final executable is run?

No, there's loading. :-) If the software does both steps, then it's a 
"linking loader".

But yes, there can be multiple linking phases, especially with DLLs and such.

If you have code compiled to run at 0x100 and you copy it from disk to 
memory at 0x300, you need to add 0x200 to every address that points into the 
program. That's loading. Often not needed nowadays with CPUs that can handle 
code that uses only relative offsets.

> In summary... you can't call the OS from C. You can only write C wrapper 
> functions around the assembly code that calls the OS. (And the wrapper 
> then of course looks like a normal C function...)

Right. Generally speaking.

I've always wondered why people think C is good for writing kernel-level 
code, as the only facility in C that actually deals with the sorts of things 
you do in a kernel is "volatile".

>> It's an instruction that invokes an indirect branch.
> 
> I see. So that's the mechanism on the IA32 platform, is it? (I thought 
> only the BIOS uses this method...)

I believe so. I stopped paying attention around the 286 era.

> Interestingly enough, the Motorola 68000 does in fact have two modes of 
> operation: "user mode" and "supervisor mode". I have no idea what the 
> distinction is.

Some opcodes in user mode will cause a software interrupt instead of doing 
what they do in supervisor mode. For example, trying to turn off the 
interrupts will "trap" instead of turning off the interrupts. (Really? 
68000? Not 68020?)

> the MMU-enabled variant of the CPU could support memory protection if 
> you wanted.

Yes. There's actually two things you need for demand-paged VM. You need 
virtual addressing (which the 68010 supported, I think), and you need 
restartable instructions (which the 68020 supported). If you try to store 
four bytes, and the first two bytes get stored and the second two hit a page 
that has to be swapped in, you're kind of hosed if your processor doesn't 
deal with that properly. (Say, by checking before you store anything that 
all the pages are available, or "unwriting" the failed write, or something.)

> And finally, it's perfectly possible to make a multiuser OS without 
> memory protection. It just won't have any memory protection.

You can do the memory protection in software, tho. Not *too* uncommon.

-- 
   Darren New, San Diego CA, USA (PST)
   "We'd like you to back-port all the changes in 2.0
    back to version 1.0."
   "We've done that already. We call it 2.0."



From: Darren New
Subject: Re: Tell me it isn't C
Date: 23 Jul 2009 13:07:05
Message: <4a6898b9$1@news.povray.org>
Chambers wrote:
> In fact, one of my biggest complaints with C# is the lack of class 
> prototypes.  I miss having a public interface which fits entirely on one 
> (maybe two) screen(s). 

That's an IDE issue, not a language issue. See, for example, Eiffel's IDEs.

-- 
   Darren New, San Diego CA, USA (PST)
   "We'd like you to back-port all the changes in 2.0
    back to version 1.0."
   "We've done that already. We call it 2.0."



From: clipka
Subject: Re: Tell me it isn't so!
Date: 23 Jul 2009 13:25:00
Message: <web.4a689c94ac52dfd4aca5323b0@news.povray.org>
Warp <war### [at] tagpovrayorg> wrote:
>   Object-oriented programming closely matches the thought process of people.
> OOP can be deconstructed into its two most basic elements: concepts and algorithms.
>
>   People think about things conceptually. For example, you can have one pen,
> one car, one dog, and so on.
>
>   Moreover, people use hierarchies of concepts. Some concepts are more
> abstract while other concepts are more concrete. For example, the concept
> of "animal" is more abstract than the concept of "dog" or "cat", which are
> more concrete. Moreover, there's a hierarchical relationship between these
> concepts: A dog is an animal, and a cat is an animal (but a dog is not a cat).

Ah, yes - here we enter the realm of blah that scared *me* away from OOP when I
first came into contact with it:

"WTF - what does *this* crap have to do with *programming*?!"

Honestly, even as a professional SW developer for over a decade now who
definitely prefers OO concepts, I think this is perfect BS.

Despite all claims, I'm convinced this has virtually *nothing* to do with how
normal people *really* think (at least when faced with the task of explaining to
some dumb box how it should *do* something), and has even less to do with OO
*programming*.


David, you hear me? *This* is *not* OOP. This is indeed BS originally from
people leaning on the shallow theoretical side, trying to sell "OO"-labelled
products (compilers, tools, training, consulting, whatever) to people who
haven't experienced the benefits of OOP in practice yet.


*True* OOP is primarily about encapsulation: You highly integrate data
structures and the algorithms operating on them into so-called "objects"; you
precisely define what operations there should be to manipulate the data (for
instance, on a data structure to be used as a stack, you'd primarily want a
"push" and a "pop" operation, plus maybe a few more), and you hide the data
structure from any other code (to the best extent possible with the chosen
language), to prevent the data from being manipulated in any different way.
This is a very powerful tool for keeping track of how and where the data
structures are actually manipulated, so you can more easily change the inner
workings of an object if need be.

Second, OOP is about "polymorphism": You define operations that should be
possible on various different types of objects (for instance, you might define
a "compute bounding box" operation for all geometric primitives as well as CSG
aggregates) without necessarily doing it the same way internally, so they can be
easily processed alongside each other by the same calling code for some purpose
(e.g. get the bounding boxes of all objects) despite any differences in how the
particular operation is actually performed for each type of object.

These, in my eyes, are the most important aspects of OOP. There are others, like
"inheritance" (you might implement part of a new object type by simply referring
to - aka "inheriting from" - a similar object's existing implementation where
applicable, allowing you to easily re-use the existing code) that are of
practical value, too, but I wouldn't rank them as important as encapsulation
and polymorphism.


Strangely enough, inheritance is what the typical blah seems to focus on; on the
other hand, maybe it just feels that way because it's the only thing *really*
unfamiliar to a programmer: Encapsulation, for instance, could be regarded as
modularization driven to the extreme; and as for polymorphism, most programmers
will at least know the underlying problem of how to manage a collection of
elements with different data types. But inheritance? There's really nothing
like this in the world of classical imperative programming, nor does there seem
to be any need for it. Indeed, it is a solution for a problem you won't even
encounter unless you have already entered the realm of OOP. So it's somewhat
ridiculous trying to *introduce* OOP with this.

(I must confess that without this discussion, I would probably try the same
approach to introduce OOP to someone - even though I should know better,
originally having been deterred by this myself; never gave much thought to it
though until now.)



From: Warp
Subject: Re: Tell me it isn't C
Date: 23 Jul 2009 13:28:55
Message: <4a689dd7@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> I've always wondered why people think C is good for writing kernel-level 
> code, as the only facility in C that actually deals with the sorts of things 
> you do in a kernel is "volatile".

  Because C allows many types of low-level optimization that are very
difficult, if not even outright impossible, in higher-level languages.

  For example, you know how a C struct will map onto memory locations,
on every possible platform which your kernel can be compiled for. You know
exactly what assigning one struct instance to another entails as machine
code. If you need a pointer which should point to a raw memory address,
you can create one. You can be sure that something like garbage collection
will *not* suddenly and unexpectedly kick in during a highly-critical
operation. If needed, you can even write inline-asm for things which cannot
be done in C directly (eg. write or read from ports, issue some exotic CPU
commands such as disabling interrupts, etc).

  And of course C allows for programmatic optimizations which can often
be difficult in higher-level languages. Memory usage optimization would be
the main example. In something like a kernel you really don't want to waste
too much memory if you don't have to.

-- 
                                                          - Warp



From: Warp
Subject: Re: Tell me it isn't C
Date: 23 Jul 2009 13:29:51
Message: <4a689e0f@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> That's an IDE issue, not a language issue. See, for example, Eiffel's IDEs.

  IMO a language shouldn't rely on IDE or text editor features in order to
be readable.

-- 
                                                          - Warp



From: Darren New
Subject: Re: Tell me it isn't C
Date: 23 Jul 2009 13:51:06
Message: <4a68a30a$1@news.povray.org>
Warp wrote:
>   Because C allows many types of low-level optimization that are very
> difficult, if not even outright impossible, in higher-level languages.

>   For example, you know how a C struct will map onto memory locations,
> on every possible platform which your kernel can be compiled for.

Not in a portable way. You don't know what kinds of padding happen, or what 
order the integers are in.

> You know
> exactly what assigning one struct instance to another entails as machine
> code. 

That's true of lots of low-level languages too. FORTH and Ada both spring to 
mind, for example. Lots of languages of the same ilk have instructions for 
laying out structures at a per-byte level. (Erlang? Modula-whatever?)

 > If you need a pointer which should point to a raw memory address,
> you can create one. 

Not in C. Only in some language that looks like C but invokes undefined 
behavior.

> You can be sure that something like garbage collection
> will *not* suddenly and unexpectedly kick in during a highly-critical
> operation. 

True, but there are lots of languages I consider better than C (as in, more 
powerful) that don't have GC. Anything with managed memory is going to be a 
mess for writing kernel code in, I'll grant you. :-) At least for a general 
kernel where you can't control what else runs.

 > If needed, you can even write inline-asm for things which cannot
> be done in C directly (eg. write or read from ports, issue some exotic CPU
> commands such as disabling interrupts, etc).

But not in C. That's kind of my point. There's whole bunches of stuff that 
happen in kernels that C just doesn't define, just like this.

>   And of course C allows for programmatic optimizations which can often
> be difficult in higher-level languages. Memory usage optimization would be
> the main example. 

I'm not sure what that means. If I have an array of 12-bit values in C, 
that's a PITA to implement compared to a lot of other languages (including C++).

-- 
   Darren New, San Diego CA, USA (PST)
   "We'd like you to back-port all the changes in 2.0
    back to version 1.0."
   "We've done that already. We call it 2.0."



From: Darren New
Subject: Re: Tell me it isn't C
Date: 23 Jul 2009 13:52:35
Message: <4a68a363$1@news.povray.org>
Warp wrote:
> Darren New <dne### [at] sanrrcom> wrote:
>> That's an IDE issue, not a language issue. See, for example, Eiffel's IDEs.
> 
>   IMO a language shouldn't rely on IDE or text editor features in order to
> be readable.

So you'd rather rely on the programmer doing a job the IDE ought to be 
automating, just to insist it's there?

We're discussing basically manually built .h files vs automatically built .h 
files, and you're suggesting that manually built .h files are better because 
the compiler can force someone to provide them, whereas with an IDE they 
might not have a nice summary?  I don't think you really mean that.

-- 
   Darren New, San Diego CA, USA (PST)
   "We'd like you to back-port all the changes in 2.0
    back to version 1.0."
   "We've done that already. We call it 2.0."




Copyright 2003-2023 Persistence of Vision Raytracer Pty. Ltd.