Subject: Re: Unix shell
From: Darren New
Date: 29 Jan 2011 17:22:42
Message: <4d449332$1@news.povray.org>
Warp wrote:
> Darren New <dne### [at] sanrrcom> wrote:
>> If you change environment variables, it fails.
> 
>   I'm not exactly sure which environment variables a compiler depends on
> which would affect the compilation process. Usually command-line arguments
> (to the compiler) are used to affect the compilation.

http://www.gnu.org/software/hello/manual/make/Implicit-Variables.html#Implicit-Variables

Stuff like CFLAGS and LDFLAGS and LIBS and INCLUDEPATH, all of which tend 
to come in from the environment, outside the makefile. Sure, you *can* get 
around it, but in practice people don't.
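
To make that concrete, here's a minimal sketch (file names invented) that 
leans entirely on make's built-in rules, which expand those variables 
straight out of the environment:

    # hello.o is built by the implicit rule, which runs
    # $(CC) $(CPPFLAGS) $(CFLAGS) -c; all of those are environment-sensitive.
    hello: hello.o

    $ make                # compiles with whatever CFLAGS happens to be set
    $ CFLAGS=-O0 make     # "make: 'hello' is up to date."; new flag ignored

Make never notices the flags changed; timestamps are all it looks at.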

>   Well, I suppose you could make the makefile use some environment variables
> to affect the compilation. Seems rather contrived and counter-productive
> (because you are now putting part of your project settings outside of your
> "project file", which is usually were they should be).
> 
>   If you want changes to the makefile itself to cause everything to be
> recompiled, you could add the makefile to the dependency rules.

Unless it's in a different project, sure. If you want your kernel module to 
recompile any time the C runtime library or the different bits of the kernel 
it depends on change, you have a bit of tracking down to do.

Besides, have you *ever* seen someone actually do this in a hand-generated 
makefile?
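
(For reference, the trick Warp means is a one-liner; I've just never seen 
it in a hand-written makefile:)

    # List the makefile itself as a prerequisite, so editing it
    # forces a recompile of anything that depends on it.
    hello.o: hello.c Makefile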

>> If you restore an 
>> older version of an include file, it fails.
> 
>   I'm pretty certain most IDEs don't, e.g., calculate checksums of every
> single source file to see if they have changed since the last time they
> were compiled.

In languages where it's important to be correct, like Ada, yes they do. In 
C#, yes they do, because they don't *look* at include files, they look at 
object files. C# doesn't *have* include files. That's the point. If you say 
"use the types from *that* assembly" and then later you replace the assembly 
with an incompatible version, your code will get recompiled (or will refuse 
to start if you try to run it without recompiling).

In Ada, you don't calculate the checksums of every source. You calculate the 
checksum of the file you're compiling when you compile it. In C#, your file 
is either associated with a project (i.e., with the equivalent of a 
makefile) or it's installed in the global assembly cache with a version 
number and a cryptographic checksum that you can check against.

> In a project with thousands of files it could take a while to
> calculate checksums for all of them.

When it's important, you compile the header file separately, like in Ada. 
That's my problem with the way C does #include. Ada header files aren't 
chunks of arbitrary source code that you include nested in other chunks of 
source code. Ada external type declarations aren't a byproduct of compiling 
something unrelated to the type you're declaring. They're a separate piece 
of code that can be version-controlled and dependency-analyzed on its own. 
If you recompile a header file and don't recompile everything that depends 
on it in Ada, your code won't link, regardless of date stamps.

>> If you change the source file 
>> then accidentally change the object file it fails.
> 
>   Yeah, and if you accidentally delete the makefile, it also fails. Duh.

No, if you delete the makefile you get an error. If you touch the object 
code after changing the source, you get a silent failure. There's a big 
difference there.
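
Spelled out (file names invented, same as above):

    $ vi hello.c      # change the source
    $ touch hello.o   # object is now newer than the source
    $ make            # relinks from the stale object; no recompile, no error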

> There are like a million scenarios where you accidentally do something to
> your files which will cause a malfunction (regardless of your IDE).

Sure. You're apparently trying to miss my point here, which is that basing 
dependency information and content-change information solely on file 
timestamps is iffy.  Yes, there are a lot of ways to break code, but those 
aren't relevant to the discussion we're having about Make.  There aren't a 
whole lot of scenarios where you accidentally do something to your Ada code 
and wind up with an executable that doesn't match the sources you compiled 
it from.

>   Of course the makefile assumes that you have specified all the necessary
> dependencies. It cannot read your mind.

I have never seen a makefile that lists as a dependency of my code all the 
things that stdio.h includes.
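
You can watch the compiler enumerate them, though; the header list below is 
illustrative, since the exact set varies by system:

    $ gcc -M hello.c
    hello.o: hello.c /usr/include/stdio.h /usr/include/features.h \
      /usr/include/bits/types.h ...
    # (truncated; typically a dozen or more nested headers)

Nobody types that into a makefile by hand, and nobody keeps it up to date 
by hand either.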

>   If you are using a tool to create the dependency lists and that tool
> fails for some reason, blame the tool, not make.

If I'm using a tool to create makefiles, then sure, but that's just saying 
makefiles are so useless that I have to automate their creation. Why would I 
use a tool to create makefiles rather than a tool that just does the whole 
job properly?

>   There's nothing wrong in using two tools to perform a task rather than
> one. Do you also blame 'ls' for not supporting everything that 'sed'
> supports?

There's really only one task going on here - dependency checking. That's the 
one and only thing Make does.
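
Make's entire value proposition fits in three lines ("summarize" here is an 
invented script, just for illustration):

    # target: prerequisites; the recipe runs when any prerequisite is newer
    report.txt: data.txt
            ./summarize data.txt > report.txt   # recipe lines start with a tab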

>   'make' is not used only to compile C/C++ programs. It can be used for
> a great many tasks, including things that have nothing to do with compiling
> a program. 'make' is designed to be a versatile tool for these tasks.

That's what I asked earlier. I just never found anything that *actually* 
uses the dependency analysis of make to build something other than C-like 
code or libraries therefrom.

Do you have things in your makefile for processing the graphics for your 
games? Does it reprocess them only when the source graphics change?

It's not like "make test" runs only if you recompiled something (or, 
better, tests only the modules you changed), or like "make clean" checks 
which files are actually there instead of just deleting *.o wholesale.
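
Such rules are perfectly writable, to be fair. A hypothetical sketch, where 
"texc" is an invented converter standing in for whatever asset tool you use:

    # Rebuild each .tex texture only when its .png source changes.
    TEXTURES := $(patsubst %.png,%.tex,$(wildcard art/*.png))

    textures: $(TEXTURES)

    art/%.tex: art/%.png
            texc $< -o $@

    clean:
            rm -f art/*.tex    # unconditional; no dependency check at all

I just don't see people doing it.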

>   Many IDEs also support this, but they agglomerate everything into one
> single program. 

Well, no, not really. In simple situations (like a directory full of Java 
source), they just give it to the compiler to figure out, because the 
compiler already knows all the dependencies. In complex situations, they 
have to do more than just look at timestamps.

>> So you invoke the compiler to create the makefile. You're splitting hairs 
>> here.
> 
>   I'm not. gcc does not "compile" the program when I run it with the -M
> parameter. It just runs a simplified C preprocessor to track all the
> included files. The major difference is, obviously, speed.

Sure. But you're invoking the compiler, and with the same sets of 
command-line options and environment variable expansions as when you do the 
full compilation.
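
That is, something along these lines, where the two invocations have to 
agree or the dependency list lies:

    $ gcc -DUSE_FOO -Iother/include -M foo.c > foo.d   # dependency scan
    $ gcc -DUSE_FOO -Iother/include -c foo.c           # the real compile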

I'm sorry, but if anyone in the world asked me "how do I invoke the gcc 
compiler", I'd say "use /usr/bin/gcc" or something. I would *not* add "but 
be careful not to pass the -M flag, because that doesn't invoke the compiler."

The point is that you're using C-specific code written by the same people 
who wrote the compiler to figure out what the C-specific dependencies are. 
The speed is irrelevant. What's relevant is that the C nested includes are 
sufficiently complex that you need a tool to generate the input to the next 
tool, where the "next tool" was supposed to stand on its own in simpler 
times.  In more complex times, the stand-alone tool is mostly 
useless. We have gotten to the point where any time "make" saves you enough 
compilation time to be worthwhile, generating the makefiles is too complex 
to be easily maintainable.  Make lives on not because it's good, but because 
it's as ubiquitous as C. But environments that don't use C don't use make 
either, because it's just not a very good tool.

>> If I can't easily create the makefile by looking at the source code 
>> I'm trying to compile, it's baroque. :-)
> 
>   I just can't see what's so wrong in using a tool to create the makefile
> dependency rules. Do you really *want* to be writing makefiles by hand?

I have no trouble at all writing makefile rules by hand when it's *not* C 
that I'm compiling. That's exactly my point. C's dependencies are 
sufficiently opaque that it's genuinely problematic just to figure out what 
they are, to the point where one feels the need to write a number of tools 
to figure it out, up to and including modifying the compiler to help you out.
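
Contrast a hand-written rule for something that isn't C (a sketch with 
made-up file names), where every dependency is visible at a glance:

    manual.pdf: manual.tex figures/diagram.eps
            pdflatex manual.tex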

But even back in the MS-DOS 3 days, I was writing editor macros to suck 
#include declarations out of C sources to generate makefiles, simply out of 
the need for *correctness*, not because the projects were complex.

>>>   Besides, you only have to create the rules once. If the dependencies
>>> don't change, you don't need to create them every time you want to compile.
> 
>> You have to create the rules every time you change anything that might 
>> affect the compilation, whether that includes the .c file you're compiling 
>> or not.
> 
>   Generating the rules is pretty fast, and the need doesn't present itself
> very often, and re-generating the rules is quite easy. I just can't see
> the problem here.

Regenerating the rules is only quite easy because someone added a flag to 
GCC to make it regenerate the rules. Now, go regenerate the rules without 
the -M flag. I'm not saying that generating the dependencies, *today*, isn't 
easy. I'm saying that the job Make was created to handle, namely recording 
and evaluating dependencies, is obsolete. Makefile dependency lists are, for 
the most part, created by other tools, and they're inadequate to actually 
capture all the dependencies that people have in modern code. It was a fine 
tool back when you had a dozen C files and a half-dozen include files in 
your project, and your choice was whether you wanted the compiler to 
optimize or not. Nowadays, consider:
http://gcc.gnu.org/onlinedocs/gcc/Optimize-Options.html#Optimize-Options
http://www.boost.org/doc/libs/1_45_0/libs/config/doc/html/boost_config/rationale.html#boost_config.rationale.the_problem
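
(The canonical workaround, for the record, is the auto-dependency idiom 
built on that same -M machinery; a simplified GNU make sketch:

    # Have the compiler write one .d rule fragment per source file,
    # then splice those fragments back into the makefile.
    SRCS := $(wildcard *.c)
    DEPS := $(SRCS:.c=.d)

    %.d: %.c
            $(CC) -MM $(CPPFLAGS) $< > $@

    -include $(DEPS)   # leading '-' ignores .d files that don't exist yet

Which rather proves the point: the "stand-alone" tool can't stand alone.)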

You're arguing that the dependency rules for C's #include directive aren't 
complicated because the compiler can figure them out for you. In your words, 
"well, duh."  That doesn't make the rules easy. It just means that the 
complexity is encapsulated in the compiler. And that means if you're 
compiling a different language with different dependency rules, the sorts of 
things Make does just aren't the best way of doing it. C# needs something 
more than timestamps to determine dependencies. Java doesn't need makefiles 
at all for the code itself. Ada needs a completely different process, not 
infrequently involving a database. All the interpreted languages have their 
own thing going on.

(It also ignores the fact that you have to figure such crap out when you 
write the code, too. I mean, really, three #include files to invoke "open"? 
How would you even guess which include files the various #defines were in 
if it weren't a system library documented all over the place? It's entirely 
possible for someone to hand you a blob of source code that gives you no 
way of even knowing how to correctly generate the dependencies from it: for 
example, it wasn't unusual at the last place I worked to have a bunch of 
different include files with different values for the same #define in 
different directories, and to spend a couple of days figuring out which 
include path to use to get the code to link correctly against the 
pre-compiled libraries that came with it. But that's also due to the nature 
of not putting type information into object code files, as much as to 
#include per se.)

-- 
Darren New, San Diego CA, USA (PST)
  "How did he die?"   "He got shot in the hand."
     "That was fatal?"
          "He was holding a live grenade at the time."

