> Day #1: Compiling, linking and running.
> Day #2: Program structure. Functions.
> Day #3: Variables and constants. Assignments. Printing results.
> Day #4: Statements, blocks and expressions. Branching.
> Day #5: Writing functions.
Well, I wasn't expecting great things. And I'm not /finding/ great
things. :-P
Day 1 contains almost no code. Which is fair enough. It begins by
explaining how all complex programs inherently have to be written using
OO techniques or they will be too difficult. It continues by explaining
how somebody took C, the best programming language in the world, and
added OO features to it to create C++, the new best programming language
in the world.
On one point it is clear: Should I learn C before learning C++? The
answer comes back as "no". Which seems quite correct to me.
On the subject of Java, it made me laugh. (Remember when this is
published: 1999.) Apparently "all the C++ programmers who left for Java
are now coming back to C++". That made me chuckle. As if C++ and Java
target the same problem domain or something. But the real giggler was this:
"In any case, C++ and Java are so similar that learning one involves
learning 90% of the other."
If that doesn't make you laugh out loud, I don't know what will. ;-)
Then we are told that "more than any other programming language, C++
demands that you /design/ a program before you build it". Followed by
some mumbo-jumbo about how programmer time is more expensive than
machine time, and how design mistakes are expensive, etc.
Halfway through, we get to type in Hello World and run it. As you'd expect.
Then there's a cryptic reference to the fact that iostream and
iostream.h aren't the same, but the differences are "subtle, exotic, and
beyond the scope of an introductory primer". Yes, the 879-page monolith
of a book is only an "introductory primer".
(Incidentally, the table of contents runs to 21 pages. I always feel
that if the table of contents /itself/ merits another table of contents,
you're doing it wrong.)
Anyway, the book decides that it's going to stick with iostream.h,
possibly for compatibility reasons - it isn't really elaborated on much.
The chapter closes out with "You should feel confident that learning C++
is the right decision for anyone interested in programming in the next
decade."
Because, let's face it, C++ is the only programming language in use,
isn't it?
(Some comment about "interested in programming" versus "interested in
getting paid to program" could perhaps be made here...)
Day 2 continues to supply the laughs, opening with such gems as
"Every time you run the compiler, the preprocessor runs. The
preprocessor is discussed on day 21."
Right. So you've just brought up the fact that this thing called a
preprocessor exists, and in the next breath told me you're not going to
actually explain that at all until /the final chapter of the book/?
The entire book is riddled with stuff like this. Basically sentences
saying "you won't understand this for another two weeks". Yeah, great,
thanks for that. Mind you, if the order wasn't quite so scrambled up, it
would be easier. And I'm fairly sure you don't need an entire "day" just
to comprehend Hello World. (Does somebody have pages to fill?)
Anyway, apparently main() must always return int. It is against the ANSI
standard for it to return void. (No mention of it having parameters.
Indeed, even the return value is described as an "obscure feature which
we won't make use of".)
In keeping with the above, we are told that the notation "\n" won't be
explained until day 17 (although actually it's explained on day 3, in
complete contradiction to this statement), that cout is an object and we
won't find out what the heck that means until day 6, and cout is also a
stream and that isn't explained until day 16. (Why are you bothering to
tell me about all the things you aren't going to tell me about? Can't
you /summarise/ the salient points quickly or something?)
Then there's some mumbo-jumbo about how you can use cout "to do
addition", and how when you do cout << 8+5, then "8+5 is passed to
cout, but 13 is what is actually printed". Nice, clear explanation,
that. :-P
There's a little bit about comments and how they work. The author
suggests that "comments should not explain /what/ is happening, but
/why/ it is happening".
And then the book insists that we have to learn about functions,
"because they're used constantly". So we learn how to define new
functions, how to call them, how to pass arguments into them and return
values out of them. Notice that we haven't learned about what variables
are or how to use them yet. We also haven't learned about types yet. But
the book insists that functions come first. It then goes on to not
define any function except main() for the next several chapters. :-P
In an example demonstrating all this, cin is casually dropped in but not
mentioned anywhere. Presumably we're supposed to /guess/ this detail.
"The difficulty of programming is that so much of what you have to learn
depends on everything else." Pfft. I think you just suck at explaining
it. :-P
Day 3: ...and /now/ we get to learn what variables are, how to assign to
them, and what types there are and how to use them. Because, hey, we
couldn't possibly have done that /before/ learning about passing
arguments to functions, no?
There's some garbled mumbo about how "memory" and "RAM" are different.
It doesn't really make sense.
And then it starts talking about types. Apparently C++ adds a new "bool"
type, which is supposed to be 1 byte. Apparently "char" doesn't mean
character at all, it means a 1-byte integer (and by default, a signed one).
Now, this I did not know, but: There is a long int, and a short int. And
then there's just int. I had always thought these were three different
sizes of integer. But it appears that actually, long int is one size,
short int is another size, and plain int refers to whichever one the
compiler writer chose on a whim. So there's only actually two integer
sizes, and plain int means "I don't care".
The book helpfully points out that on a "Pentium" system, int = long
int, whereas on older PCs int = short int. (Remember when this was
published? 1999?)
The book suggests "never use int, always use short int or long int". If
the table is to be believed, long int is 32 bits. Christ knows what you
do if you want more bits than that...
But hey, at least double and float work in a sane way, right?
The book casually mentions something which /seems/ to be claiming that
variables are not initialised to anything in particular unless you
specifically request this. That's interesting; I didn't know that.
We are shown how to do a typedef. And then we get a nice little program
to print out (the printable parts of) the ASCII character set. (Notice
that we haven't covered for-loops yet, and the program consists of a
single for-loop.)
On page 50, I get a chuckle when next to an example listing, they feel
the need to point out that "*" denotes multiplication. Even though we've
already used it to perform multiplication in a dozen examples so far.
Hmm, might wanna put that earlier in the book if you feel it's
necessary. :-P
It tells you how to use enum. Looks simple enough.
Apparently assigning a double to an int is only a /warning/, not a
compile-time error. The book helpfully fails to specify how the
conversion is actually done. (From the examples, it appears to truncate
towards zero... but it would be kinda useful to have that
/stated/.)
Alright, Day 4! And the first thing we learn is this: Every C++
statement is also an expression.
This is obviously an extremely sick and twisted idea, and whoever
thought of it was either mentally disturbed or merely failed to
comprehend what a horrifying thing he just did.
On one hand, it means that
x = y = z;
is a perfectly valid statement in C++. On the other hand, it also means that
while (x[i] = y[i--]) ;
is perfectly valid. You sick, sick people.
Apparently every L-value is an R-value, but not vice versa. Which is
fair enough, I guess.
In most programming languages, performing division promotes integers to
reals. But not in C++, apparently. (Again, it is unspecified exactly how
integer division works. Whether the author actually doesn't /know/ how
it works or merely thought it unimportant is unclear.) At this point, we
are told that there are two ways to convert something to a double:
double x = (double)y;
double x = static_cast<double>(y);
Apparently the former is bad, and the latter is good. No indication as
to why, it just is.
More giggles: There's a list of the self-modification operators. Check
it out:
+= Self-addition.
= Self-subtraction.
*= Self-multiplication.
/= Self-division.
Notice something missing there?
Then we get the ++ and -- operators explained to us. (Even though we've
already seen them used several times.) This being C++, some insane wacko
thought that having c++ and ++c would be a good idea, and wouldn't be
confusing in any way.
Next we learn that false is 0, and true is anything non-zero. (Christ,
I'm /never/ going to remember that!) Apparently there's a new bool type,
but details are thin as to what the exact significance of this is.
Then we learn about if/then, if/then/else, and finally ?: is
demonstrated, without once mentioning how it's different from
if/then/else. (I.e., it only works on expressions, not statements. Oh,
but wait! Silly me, statements /are/ expressions...)
Day 5 begins by explaining, all over again, what functions are, why
that's useful, and how you use them. We learn how to pass values in, and
return results out. So... /why/ did we need this back on day 2, in half
as much detail, never to be used since?
Apparently every function must be declared before it can be used. (This
constantly trips me up...) It seems a function prototype doesn't need to
include argument names, just their types. (If only Java were like this!)
Bizarrely, if you don't specify a return type, it defaults to int. Not,
say, void. :-o
We've got a source code listing where one line of text is randomly
indented by a different amount than the rest of the surrounding code.
Clearly this book has been through some tough QA. I'm only 92 pages into
it - that's about 10% of the way through.
There's a side-box that shows you the syntax for a function prototype
[even though the text already explained this]. But, for reasons unknown,
the diagram /actually/ shows a normal function definition, with the
prototype's stray semicolon left on the end... In short, somebody seems
to have got their stuff mixed up.
It appears that in C++ it is legal for a function to overwrite its
arguments. In fact, there's an example of implementing a swap()
function, to demonstrate that it completely fails to swap its arguments
in the caller. Having done an extensive demonstration to call our
attention to this fact, we learn that we won't be told how to write
swap() correctly until day 8. Thanks for that. Couldn't you have brought
this example up, say, on day 8?
It seems that C++ allows variables to be declared anywhere inside a
block, and that variable is then local to the inner-most block. (Not
that the text says it this clearly.) And it appears that nested
functions are not allowed. Which is fine.
Then we learn about global variables. Apparently if you write a variable
outside of any block, it's global. No word on exactly when it's
initialised, or precisely what "global" actually means. For example, I'm
/guessing/ the variable is only in-scope /below/ the line where it's
defined. That's how functions work, after all.
The book then warns about the dangers of using global variables. To
quote the book itself:
"In C++, global variables are legal, but they are almost never used.
C++ grew out of C, and in C global variables are a dangerous but
necessary tool."
"Globals are dangerous because they are shared data, and one function
can change a global variable in a way that is invisible to another
function. This can and does create bugs that are very difficult to find."
"On Day 14 you'll see a powerful alternative to global variables that
C++ offers, but that is unavailable in C."
So now you understand why global variables are bad, right?
No, I didn't think so.
As somebody who first learned to program in BASIC, where /all/ variables
are global variables, I know all about how unsafe global variables are.
But this book makes no real attempt to explain what the problem is,
other than that "a function could change a global variable in a way that
is invisible to another function". WTF is /that/ supposed to even /mean/?!
Seriously. It's /slightly important/ to understand why using a global
variable is a bad idea. This is basic bread-and-butter programming
knowledge. Way to completely fail to explain it. :-P
Next, we learn that function arguments can have default values. But get
this:
"Any or all of the function's parameters can be assigned default
values. The one restriction is this: If any of the parameters does not
have a default value, no previous parameter may have a default value."
...um, what?
So you're saying that you can freely choose which ones have defaults,
except that if a parameter doesn't have a default, nothing to the left
can either?
So, aren't you basically saying all of the non-default parameters have
to come before all of the default parameters? That doesn't sound like
"any or all of the function's parameters can be assigned default
values". That sounds like the parameters are split into two chunks. Why
not just /say/ that? :-P
We are now on page 108, and that's where I stopped reading. So far none
of the burning questions I want answered have been addressed. But the
next chapter looks promising. I just hope it doesn't try to explain that
metaphor about an object being like a spark plug again... >_<