povray.off-topic : Unix shell (Messages 19 to 28 of 88)
From: Darren New
Subject: Re: Unix shell
Date: 29 Jan 2011 15:49:15
Message: <4d447d4b$1@news.povray.org>
Warp wrote:
> Darren New <dne### [at] sanrrcom> wrote:
>> Warp wrote:
>>> Darren New <dne### [at] sanrrcom> wrote:
>>>> Actually, the primary problem with makefiles is C's bizarre 
>>>> #include rules, where source files depend on other source files which aren't 
>>>> evident from looking at the source files.
>>>   Never miss a chance to belittle C-like languages.
> 
>> How come C is the only language from the 1970s whose technology-induced 
>> limitations you get upset at me pointing out?
> 
>   It's not that you belittle it. It's the frequency with which you do it,
> and the situations where you do it.

It's the only language from that time that's still in popular use. :-)

Besides, I thought you don't like C either.

>   Just count your posts belittling eg. COBOL and the ones belittling C or
> C++. I bet the difference is something like two orders of magnitude.

Sure. Now count the number of posts talking about COBOL vs talking about 
C++. Heck, we even have a whole newsgroup about C++. ;-)

>   Also, in this particular case, what was the point? It did not add anything
> relevant or interesting to the thread, nor did it answer the original
> question.

Sorry you didn't understand the point.  The point is that most of the other 
languages out there don't have the sort of nested include behavior that C 
has, and hence don't need makefiles that are anything more than simply the 
list of source files that compile into each object library or executable.

In C# for example, the equivalent of the makefile says "here's the source 
files to compile. Here's the name of the DLL to create. Here's the global 
DLLs to link against (specified by content, not path). Here's the list of 
makefiles for other possibly-uncompiled sources I depend on. Here's the 
#define symbols to define and other compiler options." There's no list of 
chunks of source code unrelated to the object code that this compilation 
depends on. There's no big long list of which libraries link against other 
libraries, or what order to search them in, or what order to compile things 
in, or which source files use which types, or a 5000-character long compile 
line specifying the source code locations of every declaration used anywhere 
in your program.  (And yes, I've had individual makefile compilation lines 
that spanned several pages.)

And it deals with stuff that isn't compiled yet (since the reference to some 
other object file that might need compilation actually points to the 
equivalent of the *makefile* and not the equivalent of the DLL), and it 
deals with object code that has changed but still has an older date (like 
reverting a subversion repository), and it deals with changes in compilation 
(like changing #define values between compilations), and it deals with 
standard object files being updated or moved, all of which need external 
tools for make.

Basically, the *interesting* problems, Make can't handle, which is why you 
see qmake cmake ant autoconf and so on. The problem of specifying the list 
of dependencies is trivial in compiled languages that don't have nested 
source files: the object file depends on the source, the library depends on 
all the object files.

The problem that Make solves is letting you manually specify which nested 
#include files that might change but aren't easily found affect the 
compilation of this program. Every other change that affects the compilation 
of this program, Make can't handle. Hence, Make really only makes sense 
for languages whose dependencies aren't trivially deducible by looking at 
the code.

 > (this is, afaik, what gcc does when you run it with the -M option)

BTW, why not just build this into the compiler? Then you wouldn't need 
makefiles. You'd have a three-line shell script: "compile everything that 
needs compiling. If the library is older than any of the object files, put 
the object files into a temp library file. Replace the existing library file 
if it's different from the temp library file."
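
Something like this, say; an untested sketch, where "libfoo" is a made-up 
name and --skip-up-to-date is a hypothetical flag standing in for a 
compiler that does its own dependency tracking:

    #!/bin/sh
    # 1. compile everything that needs compiling (hypothetical flag,
    #    standing in for a compiler that tracks dependencies itself)
    cc -c --skip-up-to-date *.c
    # 2. if the library is missing or older than any object file,
    #    pack the objects into a temp library
    if [ ! -f libfoo.a ] || [ -n "$(find . -name '*.o' -newer libfoo.a)" ]
    then
        ar rcs libfoo.tmp.a *.o
        # 3. replace the real library only if the contents differ
        cmp -s libfoo.tmp.a libfoo.a || mv libfoo.tmp.a libfoo.a
    fi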

>   I hate Java. Do you see me taking every opportunity to bash it?
> I sometimes do it, but I don't remember doing it at every possible
> opportunity.

I don't do it in every possible situation either.  I do it when people are 
talking about the difficulty of using a tool designed to compensate for 
limitations built into old programming languages.

>> When someone asked why the C 
>> standard limited global names to six characters and I said it was because of 
>> FORTRAN-based linkers, you didn't seem to mind.
> 
>   Perhaps because it was a relevant answer to a presented question?

Sure. As was this.

>> You don't seem to object 
>> when people make fun of COBOL's wordiness.
> 
>   How many times has that happened, exactly?

At least a couple times. :-)


-- 
Darren New, San Diego CA, USA (PST)
  "How did he die?"   "He got shot in the hand."
     "That was fatal?"
          "He was holding a live grenade at the time."



From: Warp
Subject: Re: Unix shell
Date: 29 Jan 2011 16:12:19
Message: <4d4482b2@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> If you change environment variables, it fails.

  I'm not exactly sure which environment variables a compiler depends on
which would affect the compilation process. Usually command-line arguments
(to the compiler) are used to affect the compilation.

  Well, I suppose you could make the makefile use some environment variables
to affect the compilation. Seems rather contrived and counter-productive
(because you are now putting part of your project settings outside of your
"project file", which is usually were they should be).

  If you want changes to the makefile itself to cause everything to be
recompiled, you could add the makefile to the dependency rules.
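
  For example (a sketch; recipe lines must start with a tab):

    # rebuild objects whenever the makefile itself changes
    %.o: %.c Makefile
            $(CC) $(CFLAGS) -c $< -o $@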

> If you restore an 
> older version of an include file, it fails.

  I'm pretty certain most IDEs don't eg. calculate checksums of every single
source file to see if they have changed since the last time they were
compiled. In a project with thousands of files it could take a while to
calculate checksums for all of them.

> If you change the source file 
> then accidentally change the object file it fails.

  Yeah, and if you accidentally delete the makefile, it also fails. Duh.
There are like a million scenarios where you accidentally do something to
your files which will cause a malfunction (regardless of your IDE).

> If you change a file your 
> compilation depends on that isn't listed as a dependency (like stdio.h say) 
> it fails.

  And if you don't specify any dependencies at all, it will also fail, duh.

  Of course the makefile assumes that you have specified all the necessary
dependencies. It cannot read your mind.

  If you are using a tool to create the dependency lists and that tool
fails for some reason, blame the tool, not make.

> >> The fact that all the 
> >> stuff in the makefile is something you can't automate,
> > 
> >   What do you mean you can't automate? Of course you can. There are tons of
> > tools to automatically create makefiles (and makefile rules).

> The existence of tools to generate a makefile is exactly the point I'm 
> making. At that point, you're no longer using Make. You have a makefile as 
> an intermediate step in your compilation. That's not automating something 
> with make. That's automating something with a makefile generator.

  I'm sure you are pretty well aware of the unix philosophy that tasks are
often distributed among separate tools, rather than having one single program
that tries to do everything.

  There's nothing wrong in using two tools to perform a task rather than
one. Do you also blame 'ls' for not supporting everything that 'sed'
supports?

  'make' is not used only to compile C/C++ programs. It can be used for
quite many tasks, including things that have nothing to do with compiling
a program. 'make' is designed to be a versatile tool for these tasks.

  Many IDEs also support this, but they agglomerate everything into one
single program. That's just not the typical unix ideology.

> >> and which in practice 
> >> means you wind up actually having to compile everything twice, the first time just 
> >> to create the makefile.
> > 
> >   You don't need to "compile" anything to create the makefile rules. You
> > simply need to run a subset of the C preprocessor in order to see what
> > includes what (this is, afaik, what gcc does when you run it with the -M
> > option). It's not like it's an extremely heavy operation.

> So you invoke the compiler to create the makefile. You're splitting hairs 
> here.

  I'm not. gcc does not "compile" the program when I run it with the -M
parameter. It just runs a simplified C preprocessor to track all the
included files. The major difference is, obviously, speed.
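
  For illustration (made-up file names, system header list abbreviated):

    $ gcc -M main.c
    main.o: main.c util.h config.h /usr/include/stdio.h \
      /usr/include/features.h ...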

> If I can't easily create the makefile by looking at the source code 
> I'm trying to compile, it's baroque. :-)

  I just can't see what's so wrong in using a tool to create the makefile
dependency rules. Do you really *want* to be writing makefiles by hand?

> >   Besides, you only have to create the rules once. If the dependencies
> > don't change, you don't need to create them every time you want to compile.

> You have to create the rules every time you change anything that might 
> affect the compilation, whether that includes the .c file you're compiling 
> or not.

  Generating the rules is pretty fast, and the need doesn't present itself
very often, and re-generating the rules is quite easy. I just can't see
the problem here.

-- 
                                                          - Warp



From: Warp
Subject: Re: Unix shell
Date: 29 Jan 2011 16:34:42
Message: <4d4487f1@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> Besides, I thought you don't like C either.

  The issue was the #include scheme, which you seem to hate and I really
can't understand why.

> >   Also, in this particular case, what was the point? It did not add anything
> > relevant or interesting to the thread, nor did it answer the original
> > question.

> Sorry you didn't understand the point.  The point is that most of the other 
> languages out there don't have the sort of nested include behavior that C 
> has, and hence don't need makefiles that are anything more than simply the 
> list of source files that compile into each object library or executable.

  Firstly, that doesn't answer the original question. It's not like Andrew
was making a choice of programming language for his next project. He just
wanted an easy way to compile a bunch of programs. Your answer doesn't help
his problem in any way.

  Secondly, what is it exactly that allows other languages to compile only
what has changed, rather than everything every time? Right: The compiler
keeps track of changed files and compiles only what has changed.

  In a unix system this tracking has been separated into its own set of
tools, rather than being part of the compiler. *That* is essentially what
you are complaining about. In other words, the C/C++ compilers don't do
everything out-of-the-box, but instead you need additional tools.

  Well, if you don't like it, then don't. It's just how the unix ideology is.
You might disagree with the ideology and argue how it's antiquated, but it's
not something that bothers everybody. Some people like one tool doing one
thing, rather than one program trying to do everything.

  You are, hence, barking up the wrong tree. It's not the programming
language that is the "problem". It's the separation of compiler and
dependency file system that irks you. You are just blaming the programming
language for it because you love to hate it.

> In C# for example, the equivalent of the makefile says "here's the source 
> files to compile. Here's the name of the DLL to create. Here's the global 
> DLLs to link against (specified by content, not path). Here's the list of 
> makefiles for other possibly-uncompiled sources I depend on. Here's the 
> #define symbols to define and other compiler options." There's no list of 
> chunks of source code unrelated to the object code that this compilation 
> depends on. There's no big long list of which libraries link against other 
> libraries, or what order to search them in, or what order to compile things 
> in, or which source files use which types, or a 5000-character long compile 
> line specifying the source code locations of every declaration used anywhere 
> in your program.  (And yes, I've had individual makefile compilation lines 
> that spanned several pages.)

  Can you guess how many source code dependency definitions I have ever
written when I have used Microsoft's Visual C++ or Apple's Xcode?

  That's right. You are putting the blame on the programming language,
when what really irks you are the unix compiling tools. You just love
to hate C, so you just can't miss an opportunity to bash it.

  Yes, I hate C too, but I don't blame it for flaws that it doesn't have.

>  > (this is, afaik, what gcc does when you run it with the -M option)

> BTW, why not just build this into the compiler? Then you wouldn't need 
> makefiles.

  Sometimes the dependencies are more complicated. For example you could
have a program in your project which generates code, which then gets
compiled into the actual executable. That code-generation program in turn
may depend on some input files from which it generates the code.

  You only want to compile the code generating program if its source files
change. You only want to run that program if the data files change. You
only want to recompile your actual executable if either of those (or its
own source files) change.

  Obviously a compiler cannot know all this if you simply give it a list
of all the source code files. It needs slightly more elaborate rules.
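
  In makefile terms the chain would look something like this (a sketch
with made-up names):

    # the executable is recompiled if its sources (hand-written or
    # generated) change
    prog: main.c tables.c
            $(CC) -o prog main.c tables.c

    # the generated code is re-made only if the generator or the data change
    tables.c: gen input.dat
            ./gen input.dat > tables.c

    # the generator itself is recompiled only if its own source changes
    gen: gen.c
            $(CC) -o gen gen.c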

  Most IDEs support creating these types of dependencies (eg. Xcode does,
Visual Studio probably does too, even though I have so far not needed to
do something like that with it). Obviously you can achieve this with a
makefile as well (after all, it's exactly what it is designed to do).
And yes, I have needed to do this, several times.

  A makefile / project file can be quite useful beyond that too. For example,
you could have a rule to create a distribution package from your program.
The creation of this distribution package ought to have dependencies on
all the files which need to be included. Hence if you eg. change one of
those data files and then build the distribution package, the executable
binary will be compiled first, so that it's up to date. You can't do this
with a simple shell script. (And yes, I also have needed to do this.)
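
  Sketched with the same made-up names as above:

    # building the package first brings everything it contains up to date
    dist: prog data1.dat data2.dat
            tar czf prog-dist.tar.gz prog data1.dat data2.dat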

> >> When someone asked why the C 
> >> standard limited global names to six characters and I said it was because of 
> >> FORTRAN-based linkers, you didn't seem to mind.
> > 
> >   Perhaps because it was a relevant answer to a presented question?

> Sure. As was this.

  How did it answer the original question?

-- 
                                                          - Warp



From: Darren New
Subject: Re: Unix shell
Date: 29 Jan 2011 17:22:42
Message: <4d449332$1@news.povray.org>
Warp wrote:
> Darren New <dne### [at] sanrrcom> wrote:
>> If you change environment variables, it fails.
> 
>   I'm not exactly sure which environment variables a compiler depends on
> which would affect the compilation process. Usually command-line arguments
> (to the compiler) are used to affect the compilation.

http://www.gnu.org/software/hello/manual/make/Implicit-Variables.html#Implicit-Variables

Stuff like CFLAGS and LDFLAGS and LIBS and INCLUDEPATH, all of which tend 
to come in from the environment, outside the makefile. Sure, you *can* get 
around it, but in 
practice people don't.
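
For instance, with the usual implicit rules, nothing notices that the flags 
changed between two runs (an illustrative transcript; foo.c is made up):

    $ make CFLAGS=-O2 foo.o     # compiles foo.c with -O2
    $ make CFLAGS=-g foo.o      # you'd want a debug build now...
    make: 'foo.o' is up to date.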

>   Well, I suppose you could make the makefile use some environment variables
> to affect the compilation. Seems rather contrived and counter-productive
> (because you are now putting part of your project settings outside of your
> "project file", which is usually were they should be).
> 
>   If you want changes to the makefile itself to cause everything to be
> recompiled, you could add the makefile to the dependency rules.

Unless it's in a different project, sure. If you want your kernel module to 
recompile any time the C runtime library or the different bits of the kernel 
it depends on changes, you have a bit of tracking down to do.

Besides, have you *ever* seen someone actually do this in a hand-generated 
makefile?

>> If you restore an 
>> older version of an include file, it fails.
> 
>   I'm pretty certain most IDEs don't eg. calculate checksums of every single
> source file to see if they have changed since the last time they were
> compiled. 

In languages where it's important to be correct, like Ada, yes they do. In 
C#, yes they do, because they don't *look* at include files, they look at 
object files. C# doesn't *have* include files. That's the point. If you say 
"use the types from *that* assembly" and then later you replace the assembly 
with an incompatible version, your code will get recompiled (or will refuse 
to start if you try to run it without recompiling).

In Ada, you don't calculate the checksums of every source. You calculate the 
checksum of the file you're compiling when you compile it. In C#, your file 
is either associated with a project (i.e., with the equivalent of a 
makefile) or it's installed in the global assembly cache with a version 
number and a cryptographic checksum on it you can check against.

> In a project with thousands of files it could take a while to
> calculate checksums for all of them.

When it's important, you compile the header file separately, like in Ada. 
That's my problem with the way C does #include. Ada header files aren't 
chunks of arbitrary source code that you include nested in other chunks of 
source code. Ada external type declarations aren't a byproduct of compiling 
something unrelated to the type you're declaring. They're a separate piece 
of code that can be version controlled and dependency analyzed on its own. 
If you recompile a header file and don't recompile everything that depends 
on it in Ada, your code won't link, regardless of date stamps.

>> If you change the source file 
>> then accidentally change the object file it fails.
> 
>   Yeah, and if you accidentally delete the makefile, it also fails. Duh.

No, if you delete the makefile you get an error. If you touch the object 
code after changing the source, you get a silent failure. There's a big 
difference there.

> There are like a million scenarios where you accidentally do something to
> your files which will cause a malfunction (regardless of your IDE).

Sure. You're apparently trying to miss my point here, which is that basing 
dependency information and content-change information solely on file 
timestamps is iffy.  Yes, there are a lot of ways to break code, but 
those aren't relevant to the discussion we're having about Make.  There 
aren't a whole lot of scenarios where you accidentally do something to your 
Ada code and wind up with an executable that doesn't match the sources you 
compiled it from.

>   Of course the makefile assumes that you have specified all the necessary
> dependencies. It cannot read your mind.

I have never seen a makefile that lists as a dependency of my code all the 
things that stdio.h includes.

>   If you are using a tool to create the dependency lists and that tool
> fails for some reason, blame the tool, not make.

If I'm using a tool to create makefiles, then sure, but that's just saying 
makefiles are so useless that I have to automate their creation. Why would I 
use a tool to create makefiles instead of a tool that just does the job properly?

>   There's nothing wrong in using two tools to perform a task rather than
> one. Do you also blame 'ls' for not supporting everything that 'sed'
> supports?

There's really only one task going on here - dependency checking. That's the 
one and only thing Make does.

>   'make' is not used only to compile C/C++ programs. It can be used for
> quite many tasks, including things that have nothing to do with compiling
> a program. 'make' is designed to be a versatile tool for these tasks.

That's what I asked earlier. I just never found anything that *actually* 
uses the dependency analysis of make to build something other than C-like 
code or libraries therefrom.

Do you have things in your makefile for processing the graphics for your 
games? Does it reprocess them only when the source graphics change?

It's not like "make test" only runs if you recompiled something (or, better, 
only tests the modules you changed), or "make clean" doesn't delete *.o 
instead of going through each file to see if it's there first.
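
The typical rules are just unconditional commands; a sketch, with 
"run_tests" made up:

    .PHONY: test clean
    test: all
            ./run_tests         # runs every time, tests everything
    clean:
            rm -f *.o           # deletes blindly; no dependency analysis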

>   Many IDEs also support this, but they agglomerate everything into one 
> single program. 

Well, no, not really. In simple situations (like a directory full of Java 
source), they just give it to the compiler to figure out, because the 
compiler already knows all the dependencies. In complex situations, they 
have to do more than just look at timestamps.

>> So you invoke the compiler to create the makefile. You're splitting hairs 
>> here.
> 
>   I'm not. gcc does not "compile" the program when I run it with the -M
> parameter. It just runs a simplified C preprocessor to track all the
> included files. The major difference is, obviously, speed.

Sure. But you're invoking the compiler, and with the same sets of 
command-line options and environment variable expansions as when you do the 
full compilation.

I'm sorry, but if anyone in the world asked me "how do I invoke the gcc 
compiler", I'd say "use /usr/bin/gcc" or something. I would *not* add "but 
be careful not to pass the -M flag, because that doesn't invoke the compiler."

The point is that you're using C-specific code written by the same people 
who wrote the compiler to figure out what the C-specific dependencies are. 
The speed is irrelevant. What's relevant is that the C nested includes are 
sufficiently complex that you need a tool to generate the input to the next 
tool, where the "next tool" was supposed to stand alone on its own in 
simpler times.  In more complex times, the stand-alone tool is mostly 
useless. We have gotten to the point where any time "make" saves you enough 
compilation time to be worthwhile, generating the makefiles is too complex 
to be easily maintainable.  Make lives on not because it's good, but because 
it's as ubiquitous as C. But environments that don't use C don't use make 
either, because it's just not a very good tool.

>> If I can't easily create the makefile by looking at the source code 
>> I'm trying to compile, it's baroque. :-)
> 
>   I just can't see what's so wrong in using a tool to create the makefile
> dependency rules. Do you really *want* to be writing makefiles by hand?

I have no trouble at all writing makefile rules by hand when it's *not* C 
that I'm compiling. That's exactly my point. C's dependencies are 
sufficiently opaque that it's actually problematic just to figure out what 
they are, to the point where one feels the need to actually write a number 
of tools to figure it out, including modifying the compiler to help you out.

But even back in the MS-DOS 3 days, I was writing editor macros to suck 
#include declarations out of C sources to generate makefiles, simply because 
of the need for *correctness* rather than complexity.

>>>   Besides, you only have to create the rules once. If the dependencies
>>> don't change, you don't need to create them every time you want to compile.
> 
>> You have to create the rules every time you change anything that might 
>> affect the compilation, whether that includes the .c file you're compiling 
>> or not.
> 
>   Generating the rules is pretty fast, and the need doesn't present itself
> very often, and re-generating the rules is quite easy. I just can't see
> the problem here.

Regenerating the rules is only quite easy because someone added a flag to 
GCC to make it regenerate the rules. Now, go regenerate the rules without 
the -M flag. I'm not saying that generating the dependencies, *today*, isn't 
easy. I'm saying that the job Make was created to handle, namely recording 
and evaluating dependencies, is obsolete. Makefile dependency lists are, for 
the most part, created by other tools, and they're inadequate to actually 
capture all the dependencies that people have in modern code. It was a fine 
tool back when you had a dozen C files and a half-dozen include files in 
your project, and your choice was whether you wanted the compiler to 
optimize or not. Nowadays, just look at:
http://gcc.gnu.org/onlinedocs/gcc/Optimize-Options.html#Optimize-Options
http://www.boost.org/doc/libs/1_45_0/libs/config/doc/html/boost_config/rationale.html#boost_config.rationale.the_problem

You're arguing that the dependency rules for C's #include directive aren't 
complicated because the compiler can figure it out for you. In your words, 
"well, duh."  That doesn't make the rules easy. It just means that the 
complexity is encapsulated in the compiler. And that means if you're 
compiling a different language with different dependency rules, the sorts of 
things Make does just aren't the best way of doing it. C# needs something 
more than timestamps to determine dependencies. Java doesn't need makefiles 
at all for the code itself. Ada needs a completely different process, not 
infrequently involving a database. All the interpreted languages have their 
own thing going on.

(It also ignores the fact that you have to figure such crap out when you 
write the code, too. I mean, really, three #include files to invoke "open"? 
How would you even guess what include files the various #defines were in if 
it wasn't a system library documented all over the place? It's entirely 
possible for someone to give you a blob of source code and you have no way 
of even knowing how to correctly generate the dependencies from it: for 
example, it wasn't unusual at the last place to have a bunch of different 
include files with different values for the same #define in different 
directories, and spend a couple days trying to figure out which include path 
to use to get it to link correctly against the pre-compiled libraries also 
included. But that's also due to the nature of not putting type information 
into object code files as much as #include per se.)

-- 
Darren New, San Diego CA, USA (PST)
  "How did he die?"   "He got shot in the hand."
     "That was fatal?"
          "He was holding a live grenade at the time."



From: Darren New
Subject: Re: Unix shell
Date: 29 Jan 2011 17:56:40
Message: <4d449b28$1@news.povray.org>
Warp wrote:
> Darren New <dne### [at] sanrrcom> wrote:
>> Besides, I thought you don't like C either.
> 
>   The issue was the #include scheme, which you seem to hate and I really
> can't understand why.

Experience! :-)

I have explained it numerous times, with specific examples and general 
descriptions, so I'm going to assume it's one of those things you just won't 
ever understand. I think sometimes you fail to understand things that you 
disagree with.

>   Firstly, that doesn't answer the original question. It's not like Andrew
> was making a choice of programming language for his next project. He just
> wanted an easy way to compile a bunch of programs. Your answer doesn't help
> his problem in any way.

It does.  "Make is not appropriate for your task." :-)

Granted, the conversation has progressed since then.  But the basic comment 
was "by the time you need make, make will be inadequate to the task." Then I 
explained why, which is what we're arguing about now.

>   Secondly, what is it exactly that allows other languages to compile only
> what has changed, rather than everything every time? Right: The compiler
> keeps track of changed files and compiles only what has changed.

Sometimes. Other times, not. When it *is* the compiler doing it, it's using 
a dependency analysis more sophisticated than just file timestamps, which is 
all that make supports.

>   In a unix system this tracking has been separated into its own set of
> tools, rather than being part of the compiler. 

Except for, you know, that pesky -M switch. That's what I'm talking about. 
The tracking *is* built into the compiler. It would be a pain in the butt to 
use make to track dependencies without it being built into the compiler.

> *That* is essentially what
> you are complaining about. In other words, the C/C++ compilers don't do
> everything out-of-the-box, but instead you need additional tools.

Nope. Other languages have separate tools too. It's just built into the 
compiler for simple cases.

>   Well, if you don't like it, then don't. 

And *that*, my friend, is how this discussion started. Andrew asked about 
make, I said "You won't like it, so use something else", and you got upset 
that I explained why he won't like it. :-)

>   You are, hence, barking up the wrong tree. It's not the programming
> language that is the "problem". It's the separation of compiler and
> dependency file system that irks you.

No, it's the inability to correctly figure out dependencies based on the 
source files that's really the cause of the troubles.

>   That's right. You are putting the blame on the programming language,
> when what really irks you are the unix compiling tools. 

We were talking about make. I said "make was designed to deal with C's 
#include mess." The fact that there are other tools that deal with C's 
#include mess better than make does doesn't mean C's #include mess is less 
messy, nor does it mean that make is the right tool for dealing with other 
dependency issues. Which is what we're discussing.

 > You just love
> to hate C, so you just can't miss an opportunity to bash it.

I actually like C. It just has a number of legacy flaws, one of which is 
using #include instead of having actual compiled declarations stored anywhere.

>   Obviously a compiler cannot know all this if you simply give it a list
> of all the source code files. It needs slightly more elaborate rules.

Yet, oddly enough, that's precisely what the XNA content pipeline does, and 
you just give it a list of all the source files, including the graphics and 
audio and stuff. Sure, you mention the name of the code used to compile it, 
but there's enough information from that to track down where the executable 
and such is. It also makes it possible for the thing that's compiling your 
collection of frames into an animation to say "the resultant animation 
depends on this list of frames as well as the text file specifying the list 
of frames", which you can't do with a makefile. (At least, not without 
writing the -M equivalent for your personal processors and remembering to 
run it when you change the list of frames.)
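
The nearest GNU-make workaround I know of reads the list when the makefile 
is parsed, which is exactly the sort of extra machinery I mean. A sketch, 
with made-up names:

    FRAMES := $(shell cat frames.txt)
    anim.avi: frames.txt $(FRAMES)
            ./assemble -o anim.avi $(FRAMES)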

You can achieve it with a makefile, yes, but primarily not by having the 
dependency stuff going on, but by building them as different projects. That 
is, one tends to have a top-level makefile that says "first run make on the 
content generator, then run make on the content, then run make on the 
packaging directory."
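
I.e. something like this (a sketch; directory names made up):

    all:
            $(MAKE) -C generator
            $(MAKE) -C content
            $(MAKE) -C packaging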

Maybe it's just because every single makefile I've ever seen in my entire 
life really, really sucks and doesn't work right either that I'm down on make.

-- 
Darren New, San Diego CA, USA (PST)
  "How did he die?"   "He got shot in the hand."
     "That was fatal?"
          "He was holding a live grenade at the time."



From: Darren New
Subject: Re: Unix shell
Date: 29 Jan 2011 18:23:06
Message: <4d44a15a$1@news.povray.org>
Warp wrote:
>   The issue was the #include scheme, which you seem to hate and I really
> can't understand why.

Let's see if I can express it simply and clearly enough:

1) Every source file should generate object code when compiled. In other 
words, declarations should generate output when you pass them to the 
compiler. Not necessarily *executable* statements, mind, but some sort of .o 
file.

2) It should be possible to deduce from object code what types that object 
code implements, without reference to the source code, well enough to 
determine if two object files can inter-operate (i.e., can be correctly 
linked together). This implies that linking incompatible compilations of the 
same type can also be noticed - i.e., if the caller passes packed structures 
to a callee that expects unpacked structures, this should be caught.

When you're lacking property #1, there's no clear and obvious "type" in your 
system. All you have is declarations which may or may not have been changed 
and which may or may not match your definitions. (Otherwise, people would 
not need to be encouraged to include xyz.h in their xyz.c file.)


C lacks property #1, which leads to #include files because manually 
repeating the code is too error-prone. This leads to the need to track 
dependencies between source files that come from different authors.

Without property #1, a consumer of a C-enabled library has to track 
dependencies on *source* code written by the provider of the library, and 
that *source* code is (by the time it's on the consumer's machine) unrelated 
in any way to the library code the consumer is trying to consume. I.e., 
there's no practical way for the consumer to use the library without a .h 
file, but that .h file is not part of the actual library code that gets 
deployed. It's entirely possible that the .h source code doesn't match the 
.o/.lib code you wind up linking against.

When you're lacking property #2, you not only need to have the object code 
library, but you need to know how to compile your code to make it work with 
that object code library. So now you have three things to harmonize: the .h 
file for the library, the .so file for the library, and the compile-time 
arguments for the library, and you have to pick compile-time arguments for 
your own compilation that are compatible.

Most implementations of C also lack property #2, which means you actually 
have to track down the correct library source code and compilation options 
to use an already-compiled library. Interestingly enough, one of the 
criticisms of C++ is that each compiler tends to implement (to the extent it 
does) property #2 somewhat differently.

In other words, with #includes, using a library means you have to figure out 
how to correctly compile someone else's code, in an environment that the 
library author never tried compiling in, just to invoke "already compiled" code.

Also:

In *shitty* code, the fact that #include's aren't restricted to type 
declarations but rather can arbitrarily change the syntax of the language 
you're using is also a problem. Languages should strive to make it difficult 
to write shitty code, *especially* shitty code that makes people who write 
*good* code have a harder job of it. Again, this isn't a problem where one 
does not need to compile someone else's code in order to use pre-compiled code.



I hope that answers it. You may *disagree* it's a problem, but I hope you 
understand it.

-- 
Darren New, San Diego CA, USA (PST)
  "How did he die?"   "He got shot in the hand."
     "That was fatal?"
          "He was holding a live grenade at the time."



From: Patrick Elliott
Subject: Re: Unix shell
Date: 29 Jan 2011 23:30:52
Message: <4d44e97c$1@news.povray.org>
On 1/29/2011 1:21 PM, Darren New wrote:
> Warp wrote:
>>> I'm not. I'm just pointing out why Make is so baroque.
>>
>> That claim doesn't make any sense. All IDE project files are like
>> makefiles
>> (just with a different format). It's not like a makefile is somewhat
>> special
>> and different from any project file.
>
> There are lots of ways in which make is inferior to most of the other
> build tools. For one, the only out-dated-ness information it tracks is
> file timestamps. If you change environment variables, it fails. If you
> restore an older version of an include file, it fails. If you change the
> source file then accidentally change the object file it fails. If you
> change a file your compilation depends on that isn't listed as a
> dependency (like stdio.h say) it fails.
>
>>> The fact that all the stuff in the makefile is something you can't
>>> automate,
>>
>> What do you mean you can't automate? Of course you can. There are tons of
>> tools to automatically create makefiles (and makefile rules).
>
> The existence of tools to generate a makefile is exactly the point I'm
> making. At that point, you're no longer using Make. You have a makefile
> as an intermediate step in your compilation. That's not automating
> something with make. That's automating something with a makefile generator.
>
Yeesh... I remember a damn mess with some MUD I was trying to run at one 
point. The only versions I could find pre-compiled for Windows all had ANSI 
disabled, and, worse, unlike some newer ones, anything you added had 
to be compiled into the core. The makefile was for something like VC++ 4.0, 
which VC++ 5.0 didn't like, but I didn't even know that initially, because I 
first had to find a copy of Bison for Windows, because the makefile 
made a makefile, as one of its steps, along with a bunch of 
libraries/changes, to compile under Windows instead of *nix, and... 
well, things just went increasingly bad from there.. lol

Trying to work out what went wrong, and why, from the damn code, and the 
code that built the code, and the code that ran the code, that built the 
code... o.O

-- 
void main () {
   If Schrödingers_cat is alive or version > 98 {
     if version = "Vista" {
       call slow_by_half();
       call DRM_everything();
     }
     call functional_code();
   }
   else
     call crash_windows();
}




From: Darren New
Subject: Re: Unix shell
Date: 30 Jan 2011 00:31:41
Message: <4d44f7bd$1@news.povray.org>
Patrick Elliott wrote:
> Trying to work out what went wrong, and why, from the damn code, and the 
> code that built the code, and the code that ran the code, that built the 
> code... o.O

Precisely. When the configuration file to create the build tools is based on 
turing-complete macro processing, you're gonna just be screwed if there's 
anything wrong.

-- 
Darren New, San Diego CA, USA (PST)
  "How did he die?"   "He got shot in the hand."
     "That was fatal?"
          "He was holding a live grenade at the time."



From: Warp
Subject: Re: Unix shell
Date: 30 Jan 2011 03:26:48
Message: <4d4520c8@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> >   I'm pretty certain most IDEs don't eg. calculate checksums of every single
> > source file to see if they have changed since the last time they were
> > compiled. 

> In languages where it's important to be correct, like Ada, yes they do. In 
> C#, yes they do, because they don't *look* at include files, they look at 
> object files. C# doesn't *have* include files. That's the point. If you say 
> "use the types from *that* assembly" and then later you replace the assembly 
> with an incompatible version, your code will get recompiled (or will refuse 
> to start if you try to run it without recompiling).

> In Ada, you don't calculate the checksums of every source. You calculate the 
> checksum of the file you're compiling when you compile it. In C#, your file 
> is either associated with a project (i.e., with the equivalent of a 
> makefile) or it's installed in the global assembly cache with a version 
> number and a cryptographic checksum on it you can check against.

  So if you have thousands of files in your project, the compiler will
calculate the checksums of every single one of them every time you want
to compile? And it was you who complained about how creating makefile rules for
C files is inefficient... Right.

> >> If you change the source file 
> >> then accidentally change the object file it fails.
> > 
> >   Yeah, and if you accidentally delete the makefile, it also fails. Duh.

> No, if you delete the makefile you get an error. If you touch the object 
> code after changing the source, you get a silent failure. There's a big 
> difference there.

  Yeah, and if your hard drive dies, you also get a failure. Also a big
difference. I still don't get your point.

  You talk as if accidentally touching an object file is a relatively
common happenstance. Well, in the fifteen or so years I have been using
makefiles, guess how many times that has happened to me. (In fact, an HD
dying has happened to me more often than that.)

> > There are like a million scenarios where you accidentally do something to
> > your files which will cause a malfunction (regardless of your IDE).

> Sure. You're apparently trying to miss my point here, which is that basing 
> dependency information and content-change information solely on file 
> timestamps is iffy.

  The situations where it causes problems are so extremely rare that it just
isn't worth the price of the compiler calculating checksums of every single
file in the project every time you want to do a quick compile. Imagine that
every time you want to compile your project, you had to wait a minute for
the compiler to check all the checksums, when with the current scheme it
can do it in a few seconds.

  Nevertheless, you are still blaming the programming language for a
defect you see in the compiling tools. All this has nothing to do with C.

> >   Of course the makefile assumes that you have specified all the necessary
> > dependencies. It cannot read your mind.

> I have never seen a makefile that lists as a dependency of my code all the 
> things that stdio.h includes.

  And that makes it impossible to create such a makefile (especially if you
are using a dependency-generation tool)?

  Anyways, exactly why would you want to recompile your program if stdio.h
changes? You see, stdio.h is standardized, and if it were changed in such a
way that programs would need to be recompiled in order to work, it would
mean that the C standard has been changed to be backwards-incompatible and
it would break basically all programs in existence.

  Yeah, very likely to happen.

  Even if stdio.h were changed in a backwards-compatible way (eg. a
new non-standard function declaration added), you would not need to
recompile any existing program. It would not affect them.

  If you are making an extremely rare program which truly needs to be
recompiled every time stdio.h changes, the solution to this is rather
trivial.
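
  E.g. list the header explicitly for the one file that cares (a sketch
with made-up names):

    special.o: special.c /usr/include/stdio.h
            $(CC) $(CFLAGS) -c special.c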

> >   If you are using a tool to create the dependency lists and that tool
> > fails for some reason, blame the tool, not make.

> If I'm using a tool to create makefiles, then sure, but that's just saying 
> makefiles are so useless that I have to automate their creation. Why would I 
> use a tool to create makefiles instead of a tool that just does the job properly?

  You could make the same argument for any program that doesn't do everything,
and needs to work in conjunction with another program in order to perform
some task. Your argument is moot.

  As for makefiles, not all things that can be done with makefiles can be
automated.

> >   There's nothing wrong in using two tools to perform a task rather than
> > one. Do you also blame 'ls' for not supporting everything that 'sed'
> > supports?

> There's really only one task going on here - dependency checking. That's the 
> one and only thing Make does.

  Make can't know if eg. a code-generating program depends on some data
files unless you tell it.

> >   'make' is not used only to compile C/C++ programs. It can be used for
> > quite many tasks, including things that have nothing to do with compiling
> > a program. 'make' is designed to be a versatile tool for these tasks.

> That's what I asked earlier. I just never found anything that *actually* 
> uses the dependency analysis of make to build something other than C-like 
> code or libraries therefrom.

> Do you have things in your makefile for processing the graphics for your 
> games? Does it reprocess them only when the source graphics change?

  I gave an example in my other post of an actual project I have been
working on, where makefiles are useful for something other than purely
compiling C++.

> >> So you invoke the compiler to create the makefile. You're splitting hairs 
> >> here.
> > 
> >   I'm not. gcc does not "compile" the program when I run it with the -M
> > parameter. It just runs a simplified C preprocessor to track all the
> > included files. The major difference is, obviously, speed.

> Sure. But you're invoking the compiler, and with the same sets of 
> command-line options and environment variable expansions as when you do the 
> full compilation.

> I'm sorry, but if anyone in the world asked me "how do I invoke the gcc 
> compiler", I'd say "use /usr/bin/gcc" or something. I would *not* add "but 
> be careful not to pass the -M flag, because that doesn't invoke the compiler."

  Actually I'm invoking the C preprocessor. I could call it directly, but
gcc does it automatically when you specify the -M parameter. The calls
"gcc -M test.cc" and "cpp -M test.cc" produce the exact same result.
I could simply use the latter if I wanted to be pedantic.

> The point is that you're using C-specific code written by the same people 
> who wrote the compiler to figure out what the C-specific dependencies are. 
> The speed is irrelevant. What's relevant is that the C nested includes are 
> sufficiently complex that you need a tool to generate the input to the next 
> tool, where the "next tool" was supposed to stand alone on its own in 
> simpler times.  In more complex times, the stand-alone tool is mostly 
> useless. We have gotten to the point where any time "make" saves you enough 
> compilation time to be worthwhile, generating the makefiles is too complex 
> to be easily maintainable.  Make lives on not because it's good, but because 
> it's as ubiquitous as C. But environments that don't use C don't use make 
> either, because it's just not a very good tool.

  Instead, we have "project files" in IDEs which do... what? Well, exactly
the same thing as makefiles. In fact, "project files" are just makefiles,
using some other IDE-specific syntax. (They might have features that 'make'
doesn't support, but in principle they are the same thing.)

  Yes, you can use project files for things other than purely compiling
the program, such as running command-line tools to build data files and
such. Exactly what makefiles do. (And yes, I have done this at least with
Xcode project files.)

  The only difference is that most IDEs generate the dependency rules for
source code files automatically so you don't have to write the few magic
lines into the project file yourself. However, again, in principle it's
no different from a generic makefile that generates dependency rules
automatically (like the one I pasted in an earlier post).
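
  Along the lines of the usual gcc auto-dependency idiom (a sketch, not
necessarily the exact makefile from that post):

    SRCS := $(wildcard *.cc)
    OBJS := $(SRCS:.cc=.o)

    prog: $(OBJS)
            $(CXX) -o prog $(OBJS)

    # -MMD writes a .d dependency file as a side effect of each compile
    %.o: %.cc
            $(CXX) $(CXXFLAGS) -MMD -MP -c $< -o $@

    -include $(OBJS:.o=.d)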

> >> If I can't easily create the makefile by looking at the source code 
> >> I'm trying to compile, it's baroque. :-)
> > 
> >   I just can't see what's so wrong in using a tool to create the makefile
> > dependency rules. Do you really *want* to be writing makefiles by hand?

> I have no trouble at all writing makefile rules by hand when it's *not* C 
> that I'm compiling. That's exactly my point. C's dependencies are 
> sufficiently opaque that it's actually problematic just to figure out what 
> they are, to the point where one feels the need to actually write a number 
> of tools to figure it out, including modifying the compiler to help you out.

  You still can't get past the fact that in unix different tasks are
separated into different tools. Fine, you hate that. Move on.

> >   Generating the rules is pretty fast, and the need doesn't present itself
> > very often, and re-generating the rules is quite easy. I just can't see
> > the problem here.

> Regenerating the rules is only quite easy because someone added a flag to 
> GCC to make it regenerate the rules.

  Actually to the C preprocessor.

> Now, go regenerate the rules without 
> the -M flag.

  And why exactly would I want to do that? You might as well ask "compile
the program without using the compiler".

> I'm not saying that generating the dependencies, *today*, isn't 
> easy. I'm saying that the job Make was created to handle, namely recording 
> and evaluating dependencies, is obsolete.

  And how do you propose to handle dependencies on things that the compiler
does not handle, such as creating data files from some input files using
some tools (and perhaps recompile the program only if those data files
change)?

> Makefile dependency lists are, for 
> the most part, created by other tools

  Only program source code dependencies. If you have other types of
dependencies you still need to specify them by hand because no automated
tool can read your mind.

-- 
                                                          - Warp



From: Warp
Subject: Re: Unix shell
Date: 30 Jan 2011 03:47:08
Message: <4d45258c@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> Warp wrote:
> > Darren New <dne### [at] sanrrcom> wrote:
> >> Besides, I thought you don't like C either.
> > 
> >   The issue was the #include scheme, which you seem to hate and I really
> > can't understand why.

> Experience! :-)

  Or lack thereof, it seems.

> I have explained it numerous times, with specific examples and general 
> descriptions, so I'm going to assume it's one of those things you just won't 
> ever understand. I think sometimes you fail to understand things that you 
> disagree with.

  It just looks to me that whenever there is a problem with something,
you just love to blame it on C/C++ if it's somehow involved.

> >   Firstly, that doesn't answer the original question. It's not like Andrew
> > was making a choice of programming language for his next project. He just
> > wanted an easy way to compile a bunch of programs. Your answer doesn't help
> > his problem in any way.

> It does.  "Make is not appropriate for your task." :-)

> Granted, the conversation has progressed since then.  But the basic comment 
> was "by the time you need make, make will be inadequate to the task." Then I 
> explained why, which is what we're arguing about now.

  And which tool, exactly, did you recommend instead? I remember you
mentioning shell scripts. Yeah, those can track dependencies just fine.

> >   In a unix system this tracking has been separated into its own set of
> > tools, rather than being part of the compiler. 

> Except for, you know, that pesky -M switch. That's what I'm talking about. 
> The tracking *is* built into the compiler. It would be a pain in the butt to 
> use make to track dependencies without it being built into the compiler.

  It's actually in the C preprocessor, which is a separate tool.

  You *could* have a separate dependency rule creation tool that does
nothing else. On the other hand, it was easier to add that to the C
preprocessor, so they did that.

> > *That* is essentially what
> > you are complaining about. In other words, the C/C++ compilers don't do
> > everything out-of-the-box, but instead you need additional tools.

> Nope. Other languages have separate tools too. It's just built into the 
> compiler for simple cases.

  So when other languages have dependency rule generation built in, it's ok,
but when the C compiler has it, it's somehow not ok.

> >   Well, if you don't like it, then don't. 

> And *that*, my friend, is how this discussion started.

  In other words, you didn't answer his question, which is my point.

> Andrew asked about 
> make, I said "You won't like it, so use something else", and you got upset 
> that I explained why he won't like it. :-)

  And what exactly is that "something else"? How was your answer helpful
in any way?

> >   You are, hence, barking up the wrong tree. It's not the programming
> > language that is the "problem". It's the separation of compiler and
> > dependency file system that irks you.

> No, it's the inability to correctly figure out dependencies based on the 
> source files that's really the cause of the troubles.

  So now you are changing the argument. First the problem was that you
need a separate tool to generate the dependency rules, but now the problem
is "the inability to *correctly* figure out dependencies", whatever that
means.

  Which tool incorrectly figures out the dependencies?

  Why does it feel like you are desperately trying to pile more and more
perceived problems onto the issue?

>  > You just love
> > to hate C, so you just can't miss an opportunity to bash it.

> I actually like C.

  You are certainly doing a very good job of hiding it.

> >   Obviously a compiler cannot know all this if you simply give it a list
> > of all the source code files. It needs slightly more elaborate rules.

> Yet, oddly enough, that's precisely what the XNA content pipeline does, and 
> you just give it a list of all the source files, including the graphics and 
> audio and stuff. Sure, you mention the name of the code used to compile it, 
> but there's enough information from that to track down where the executable 
> and such is. It also makes it possible for the thing that's compiling your 
> collection of frames into an animation to say "the resultant animation 
> depends on this list of frames as well as the text file specifying the list 
> of frames", which you can't do with a makefile.

  I don't get it. That's exactly what you do with make, which was exactly
my point (and exactly why I'm comparing it to the "project files" of other
IDEs).

  You seem to be desperately trying to find flaws that just aren't there.

  Maybe what irks you is that with makefiles you need to write a bit more
syntax to get the same effect.

> You can achieve it with a makefile, yes, but primarily not by having the 
> dependency stuff going on, but by building them as different projects. That 
> is, one tends to have a top-level makefile that says "first run make on the 
> content generator, then run make on the content, then run make on the 
> packaging directory."

  I suppose you could do that, but I don't think it's necessary.

-- 
                                                          - Warp



