  Re: Days 1-5  
From: Invisible
Date: 17 Apr 2012 04:52:32
Message: <4f8d2f50$1@news.povray.org>
>> On one point it is clear: Should I learn C before learning C++? The
>> answer comes back as "no". Which seems quite correct to me.
>
> Well, there is nothing really to learn in C, once you have learned
> assembly programming of course.

Except that assembly generally follows a simple and consistent syntax, 
whereas C is written in C. :-P

>> On the subject of Java, it made me laugh.
>
>> "In any case, C++ and Java are so similar that learning one involves
>> learning 90% of the other."
>
> Right & wrong. Classes, references, and direct inheritance are similar;
> complex inheritance, exception handling, and pointers are not.
>
> Traditional Java C++ war... not interesting.

War nothing. The two languages bear a superficial resemblance to each 
other, but beyond that they're /radically different/. To the degree that 
learning any OO language helps you learn any other OO language, C++ and 
Java are similar. Beyond that, they're two totally different languages. 
Learning Java teaches you maybe 10% of C++, not 90%. And vice versa.

>> Then we are told that "more than any other programming language, C++
>> demands that you /design/ a program before you build it".
>
> Well, they should look at some Ada or Smalltalk... but it's right. Think
> before you start coding.

Try Haskell. A typical coding session involves coming up with a 
breakthrough mathematical insight, followed by some ingenious 
programming. Certainly you can't just hack on it until it works.

> I.e.: the author has no clue for you.

Yeah, that's pretty much the conclusion I reached too.

It says in the preface that the author is the president of some company 
that supplies C++ training. It doesn't say anywhere that he's actually a 
C++ programmer. :-P

>> Anyway, apparently main() must always return int. It is against the ANSI
>> standard for it to return void.
>
> The problem is explaining Unix: the return value of main() is used as
> the exit status of the program... and utilities like "make" use that
> result to check success or failure.

Not just make; a whole host of programs make use of a program's return 
code. (And not just on Unix either; Windows does the same.) But this is 
apparently an "obscure feature" which you don't need to know about. :-P
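
For instance, a minimal sketch of how that return value surfaces (prog.cpp is a made-up name):

   // prog.cpp
   int main()
   {
     return 42;   // becomes the program's exit status
   }

Then in a shell, "./prog; echo $?" prints 42, and make aborts a build as soon as a command exits with anything non-zero.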

>> Then there's some mumbo-jumbo about how you can use cout "to do
>> addition", and how when you do count<<  8+5, then "8+5 is passed to
>> cout, but 13 is what is actually printed". Nice, clear explanation,
>> that. :-P
>
> Oh great, it's not cout that does the addition anyway! So you now have
> a false explanation.

With this level of conceptual muddling about something of fundamental 
importance, you can see why my opinion of the book is so low. :-P
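
To spell out what actually happens: the compiler evaluates 8+5 first, and only then calls cout's operator<< with the result. A sketch of the two equivalent spellings:

   #include <iostream>

   int main()
   {
     std::cout << 8 + 5;            // 8+5 evaluates to 13 first...
     std::cout.operator<<(8 + 5);   // ...then operator<< is called with 13
   }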

>> "comments should not explain /what/ is happening, but
>> /why/ it is happening".
>
> At least a sensible sentence.

Yeah. The purpose of comments is to explain stuff which /isn't/ obvious 
from reading the source code itself. (I'm also of the opinion that less 
is more. I dislike code that's so littered with comments that you can't 
see the code properly. Then again, maybe that's because I don't have a 
syntax highlighter...)

>> In an example demonstrating all this, cin is casually dropped in but not
>> mentioned anywhere. Presumably we're supposed to /guess/ this detail.
>
> Guess what: it's an input stream (whereas cout is an output stream).

Oh, *I* know that. But no thanks to the book that's supposed to be 
teaching me. :-P
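
For anyone else left guessing, it's just cout with the arrows reversed:

   #include <iostream>

   int main()
   {
     int age = 0;
     std::cout << "How old are you? ";
     std::cin >> age;                        // reads an int from standard input
     std::cout << "You said " << age << "\n";
   }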

> The definitions of the types are inherited from C (and they f****d it
> up big time). There was no bool in C.
> A char is enough bits to store a native glyph, and you can assume that
> the number of bits is at least 8.
> It can be 9. It can be 16. It can even be 32. (9-bit chars are met only
> on architectures with 9 bits per byte on the hardware bus.)

That's pretty messed up.

> One warning: sizeof() returns the size in chars, not bytes.

Oh, that's fun. Let us hope that char *is* one byte, otherwise... damn!

I guess this is why autoconf exists. :-P
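
If you want to know what your particular compiler actually does, something like this (CHAR_BIT comes from <climits>) will tell you:

   #include <climits>
   #include <iostream>

   int main()
   {
     std::cout << "bits per char: " << CHAR_BIT << "\n";           // at least 8
     std::cout << "sizeof(int):   " << sizeof(int) << " chars\n";  // not necessarily 4
   }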

> Second warning: the signedness of char is up to the compiler.

Really? The book seems to say that all integer types default to signed, 
unless you request unsigned. (Though it turns out you *can* write signed 
explicitly, if you want to be sure.)

> In 1999, there were only 3 integer types to worry about. It was the 32-bit era.

Well, the book still talks about 16-bit code.

> On today's 64-bit systems, there is also a new one: "long long int"!

long long int? Woah, that's special...

> As inherited from C, it's total garbage; you can only assume:
>
> char <= short <= int <= long <= long long
>
> Notice that it is possible that char = long.

So much for C and C++ being for system-level programming. :-P
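
The escape hatch, for what it's worth, is the <cstdint> header (standardised in C++11, though widely shipped before that), which gives you types with pinned-down widths:

   #include <cstdint>

   std::int32_t a;         // exactly 32 bits, or a compile error if unsupported
   std::uint64_t b;        // exactly 64 bits, unsigned
   std::int_least16_t c;   // smallest type with at least 16 bits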

>> But hey, at least double and float work in a sane way, right?
>
> Do not rely on that. I'm not sure C mandates that double & float
> obey the IEEE-754 rules, or even fixes their sizes, their
> representation, or their mantissa/exponent bit counts.

Well, no, it'll be whatever the underlying platform supports. Which on 
any sane x86 system is going to be IEEE-754. If I was trying to program 
a microcontroller, I might be worried. On a desktop PC, I'm not too 
concerned.
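
And if you ever do want to check, std::numeric_limits will apparently tell you whether the platform claims IEEE-754 conformance (IEC 559 is the same standard under its ISO name):

   #include <iostream>
   #include <limits>

   int main()
   {
     std::cout << std::boolalpha
               << std::numeric_limits<double>::is_iec559 << "\n";  // true on any sane x86
   }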

> Usually, float is 32 bits and double is 64 bits. But it is known that
> some compilers perform the computations in 80-bit registers...

Is there a way to explicitly request 80 bits?

> Yep, variables are allocated, but whatever was in that memory at the
> time is their value. You'd better set them yourself.

Right. So if I request a 2GB array, it won't actually sit there trying 
to zero all 2GB of RAM, right before my for-loop rolls over it to 
initialise it /again/.
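
(Though if you ever do want the zeroing, you can ask for it; the empty parentheses make all the difference:)

   #include <cstddef>

   int main()
   {
     const std::size_t n = 1000;
     int* a = new int[n];     // elements left uninitialised: no pass over the memory
     int* b = new int[n]();   // value-initialised: this one DOES zero every element
     delete[] a;
     delete[] b;
   }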

>> Apparently assigning a double to an int is only a /warning/, not a
>> compile-time error. The book helpfully fails to specify how the
>> conversion is actually done.
>
> Issue: it might be a local architecture/compiler choice.

Oh God.
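
(For what it's worth, as far as I can tell the standard does pin this down: the fractional part is simply discarded, i.e. truncation toward zero, and it's only undefined if the result doesn't fit in the target type.)

   int a = static_cast<int>(2.9);    // a == 2, not 3
   int b = static_cast<int>(-2.9);   // b == -2: truncation, not rounding down
   int c = static_cast<int>(1e30);   // undefined behaviour: doesn't fit in an int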

>>    while (x[i] = y[i--]) ;
>>
>> is perfectly valid. You sick, sick people.
>
> It came from C. Put the blame on C.

C, the PDP assembly language that thinks it's a high-level language. :-P

(I forgot who wrote that...)

>> In most programming languages, performing division promotes integers to
>> reals. But not in C++, apparently. (Again, it is unspecified exactly how
>> integer division works. Whether the author actually doesn't /know/ how
>> it works or merely thought it unimportant is unclear.)
>
> Operations on integers are performed with integers.
> Integer division is performed as usual: 14/5 is 2.

So, it always rounds downwards?
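
(Apparently not quite: it truncates toward zero, which only looks like rounding down while everything is positive.)

   14 / 5    // 2
   -14 / 5   // -2 on any modern compiler (truncation), not -3 (rounding down)
   14 % 5    // 4, so that (a/b)*b + a%b == a always holds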

>> Next we learn that false=0, and true /= 0. (Christ, I'm /never/ going to
>> remember that!) Apparently there's a new bool type, but details are thin
>> as to what the exact significance of this is.
>
> Once again: C did not have bool; C++ does.

I notice that "true" and "false" seem to be valid names now, which is 
useful...

> And now for a mad trick: if you follow an expression with a , (comma),
> the whole thing takes the value of the expression after the comma...

Oh that's sick. So comma really /is/ an operator? Wow, that's evil.

I wonder if brackets are operators too? Perhaps you can do some crazy 
code obfuscation with that as well. :-P
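
It really is an operator, with a value and everything. A small sketch of the sickness, assuming nothing beyond standard C++:

   int a = (1, 2, 3);       // evaluates 1, then 2, then 3; a == 3
   int i = 0;
   int b = (i++, i++, i);   // side effects happen left to right; b == 2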

>> Apparently every function must be declared before it can be used. (This
>> constantly trips me up...) It seems a function prototype doesn't need to
>> include argument names, just their types. (If only Java was like this!)
>> Bizarrely, if you don't specify a return type, it defaults to int. Not,
>> say, void. :-o
>
> Yep. C++ is maniacal: you must provide the prototype before using the
> function.
> You can include the parameters' names in the prototype if you like.

I was referring more to the fact that it's legal to not specify the 
return type, and if you don't, it defaults to something absurd.

>> There's a side-box that shows you the syntax for a function prototype
>> [even though the text already explained this]. But, for reasons unknown,
>> the diagram /actually/ shows a normal function definition, but with a
>> stray semicolon at the end of the prototype... In short, somebody seems
>> to have got their stuff mixed up.
>
> Nope.
>
> int main(int,char**); // is enough for the compiler
> int main(int argc, char** argv); // is more similar to the definition
>
> Both are ok.

To be clear: The diagram claims that a "function prototype" looks like this:

   double foo(int, int, int) ;
   {
     statement1;
     statement2;
     statement3;
   }

This is neither a valid prototype nor a valid definition. It's garbage.

If you take out the function body, it becomes a valid prototype. If you 
take out the semicolon and add some argument names, it becomes a valid 
definition. But as written, it's garbage.
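
For the record, the two valid forms would be:

   double foo(int, int, int);       // prototype: semicolon, no body

   double foo(int a, int b, int c)  // definition: body, no semicolon after the signature
   {
     statement1;
     statement2;
     statement3;
   }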

>> Then we learn about global variables. Apparently if you write a variable
>> outside of any block, it's global. No word on exactly when it's
>> initialised, or precisely what "global" actually means. For example, I'm
>> /guessing/ the variable is only in-scope /below/ the line where it's
>> defined. That's how functions work, after all.
>
> Global variables are either scoped to the file they're defined in
> (static prefix), or to the whole program.
> (You haven't seen namespaces yet.)
> Their position is irrelevant, but it's like prototypes and definitions:
> you need a declaration before using the variable. The definition of a
> global variable also serves as a declaration, but you can only have
> one definition.
>
> declaration (at the head of the file, or in a header file):
> extern int foo;
>
> definition (anywhere, usually at the head so it doubles as the
> declaration):
> int foo = 5;

You just explained more in a handful of sentences than the book 
explained in an entire chapter. :-P

>> Next, we learn that function arguments can have default values. But get
>> this:
>>
>>    "Any or all of the function's parameters can be assigned default
>> values. The one restriction is this: If any of the parameters does not
>> have a default value, no previous parameter may have a default value."
>>
>> ...um, what?
>
> At some point in the argument list you may start providing default
> values; from then on, every argument to the right must have one too.

...which makes infinitely more sense than what the book said. :-P
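
In other words, once one parameter has a default, everything to its right needs one too (draw here is just an illustrative name):

   void draw(int x, int y, int colour = 0, int width = 1);   // fine
   // void draw(int x = 0, int y, int colour = 0);           // error: y lacks a default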

