Unix shell (Message 31 to 40 of 88)
From: Darren New
Subject: Re: Unix shell
Date: 30 Jan 2011 15:10:31
Message: <4d45c5b7$1@news.povray.org>
Warp wrote:
> Darren New <dne### [at] sanrrcom> wrote:
>> Patrick Elliott wrote:
>>> Trying to work out what went wrong, and why, from the damn code, and the 
>>> code that built the code, and the code that ran the code, that built the 
>>> code... o.O
> 
>> Precisely. When the configuration file to create the build tools is based on 
>> turing-complete macro processing, you're gonna just be screwed if there's 
>> anything wrong.
> 
> Yet this is still somehow a problem with makefiles.

Where did I say makefiles in there?

I'm railing against arbitrarily complex build systems where you can't look 
at the configuration file and figure out what it's going to compile. It's 
the same thing I have against C include files: if they were limited to 
declarations, it would be far, far less of a problem.

-- 
Darren New, San Diego CA, USA (PST)
  "How did he die?"   "He got shot in the hand."
     "That was fatal?"
          "He was holding a live grenade at the time."



From: Darren New
Subject: Re: Unix shell
Date: 30 Jan 2011 15:42:44
Message: <4d45cd44$1@news.povray.org>
Warp wrote:
> Darren New <dne### [at] sanrrcom> wrote:
>> Warp wrote:
>>> Darren New <dne### [at] sanrrcom> wrote:
>>>> Besides, I thought you don't like C either.
>>>   The issue was the #include scheme, which you seem to hate and I really
>>> can't understand why.
> 
>> Experience! :-)
> 
>   Or lack thereof, it seems.

Nope nope. Shell shock.  Zeno's Debugging.

I had a function that looked something like this:

RMint open(RMchar* name)
{
    RMerr err;
    RMthing = RMDoThing1(name);
    RMother = RMDoOther(name, RMthing);
    RMstuff = RMStuff(RMthing);
    .....
    return (RMint) 0;
failure:
    return err;
}

And what was coming back from that function was 0x19382. And I had to figure 
out which function was doing that.

Nowhere in the function do we actually have a goto, nor do we assign to err. 
Nowhere in any source file anywhere was there the number 19382 or its 
decimal or octal equivalent.

I attempted to use the debugger, but there were calls to precompiled 
libraries without debugging information that called back into this 
routine, so that didn't work.

After I figured out that RMDoThing1 and its ilk were actually macros that 
assigned to err and did "goto failure" if it wasn't zero, I then had to do a 
full recompile of all the code with stdout redirected to a file, then track 
down the name of the .c file this code appeared in, then look thru the 
command line arguments to gcc to see which include directories appeared in 
what order, then look at each of the dozen RMblah.h files in those 
directories in the right order (because each of them appeared in several 
directories), chasing nested #includes recursively, to find the file that 
actually defined each of those macros. (In hindsight, I probably should have 
used -E to see if I could figure it out, but that probably would have failed 
the way my luck was going.)

*Then* I had to figure out that somewhere in the dozens and dozens of 
include files was a line that said
    #define char }}}}
because they didn't want you using a char, they wanted you using an RMui8 or 
some such nonsense. Spend half a day trying to figure out why
    extern int write(int,char*,int);
won't compile at the top of your file and see how much you like C #includes.
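
For the curious, that particular failure is easy to reproduce. A minimal 
sketch (file and header names invented):

# Recreate the poisoned header and a victim file:
cat > rmtypes.h <<'EOF'
#define char }}}}
EOF
cat > demo.c <<'EOF'
#include "rmtypes.h"
extern int write(int,char*,int);
EOF

# Run just the preprocessor to see what the compiler actually receives:
cc -E demo.c
# The declaration comes out as:  extern int }}}} (int, }}}}*, int);
# hence the baffling complaints about braces.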

Once I tracked that down, I had to figure out what the string was for that 
error message. I.e., what the defined symbol was. But the actual value 
didn't appear in any of the files. Indeed, the #include files were full of 
crap like

#define ERR_NOTFOUND (ERRCURR+1)
#define ERRCURR (ERRCURR+1)

which means if you included the include files in a different order, or added 
one or left it out, your error numbers would change.
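
(Those two #defines are a from-memory paraphrase; as literally written, the 
self-referential ERRCURR would never expand to a number at all. A sketch of 
a working scheme with the same order-dependent sickness, all names 
hypothetical:)

cat > err_file.h <<'EOF'
enum { ERR_FILE_BASE = ERR_BASE, ERR_NOTFOUND, ERR_FILE_LAST };
#undef  ERR_BASE
#define ERR_BASE ERR_FILE_LAST
EOF
cat > err_net.h <<'EOF'
enum { ERR_NET_BASE = ERR_BASE, ERR_TIMEOUT, ERR_NET_LAST };
#undef  ERR_BASE
#define ERR_BASE ERR_NET_LAST
EOF
cat > errdemo.c <<'EOF'
#include <stdio.h>
#define ERR_BASE 0
#include "err_file.h"   /* swap these two #includes and both */
#include "err_net.h"    /* error numbers silently change     */
int main(void) {
    printf("ERR_NOTFOUND=%d ERR_TIMEOUT=%d\n", ERR_NOTFOUND, ERR_TIMEOUT);
    return 0;
}
EOF
cc errdemo.c -o errdemo && ./errdemo   # prints ERR_NOTFOUND=1 ERR_TIMEOUT=3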

OK, so finally add -E.  Nope, sorry, the makefile depends on environment 
variables passed down from several makefiles above, *and* parts of it get 
recreated as part of that process, so changes you make to this makefile get 
clobbered. You have to go track down the source code for the makefile that's 
actually compiling the code. Good thing you saved the complete log of stdout 
and stderr while you were compiling so you could see which makefile in the 
process clobbers the makefile you're trying to modify, and see where it's 
coming from, assuming they didn't put an @ in front of that bit.


Over a week of work to track down which function call was returning the 
error and what it meant, due directly to #include, #define, and makefiles. 
And, admittedly, really really *bad* programming of those things.



So, in summary, no, not everyone who disagrees with your conclusions is a 
fool with no knowledge of the process.


>   It just looks to me that whenever there is a problem with something,
> you just love to blame it on C/C++ if it's somehow involved.

I believe that's because you don't look at what I'm saying objectively, and 
instead you simply say "Oh, Darren's on about C again, and since I don't 
have problems with it, he must just not know what he's talking about."

>   And which tool, exactly, did you recommend instead? I remember you
> mentioning shell scripts. Yeah, those can track dependencies just fine.

Indeed they can, when you don't have nested dependencies.  The original 
question wasn't about dependencies at all. His original question was how to 
compile a bunch of .c files into corresponding executables without getting 
a.out as the result. That has nothing to do with dependencies, and telling 
him to use make when a simple one-line shell script works is overkill.
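
For the record, the sort of one-liner I mean (the bare cc invocation is a 
stand-in for whatever flags he actually wants):

# Build every foo.c in the directory into an executable named foo:
for src in *.c; do cc -o "${src%.c}" "$src"; done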

Really. Go back and read the first post. Nowhere in there is a question 
about dependency tracking.

>   You *could* have a separate dependency rule creation tool that does
> nothing else. On the other hand, it was easier to add that to the C
> preprocessor, so they did that.

Or, you could have the tools generate dependency information as they compile 
the first time.
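
gcc can, in fact, already do this as a side effect of compiling; a minimal 
sketch (foo.c standing in for a real source file):

gcc -MMD -MP -c foo.c -o foo.o   # also writes foo.d as it compiles
cat foo.d   # a ready-made make fragment: foo.o: foo.c foo.h bar.h ...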

>>> *That* is essentially what
>>> you are complaining about. In other words, the C/C++ compilers doesn't do
>>> everything out-of-the-box, but instead you need additional tools.
> 
>> Nope. Other languages have separate tools too. It's just built into the 
>> compiler for simple cases.
> 
>   So when other languages have dependency rule generation built in, it's ok,
> but when the C compiler has, it's somehow not ok.

You're completely ignoring the point I'm making, apparently solely to be 
right.

>>>   Well, if you don't like it, then don't. 
> 
>> And *that*, my friend, is how this discussion started.
> 
>   In other words, you didn't answer his question, which is my point.

I *did* answer his question. I provided him a shell script to recompile the 
programs he wants compiled. He didn't ask about make. He asked about how to 
name the files. Someone else said "use make". I said "make is inappropriate 
for that."

>> Andrew asked about 
>> make, I said "You won't like it, so use something else", and you got upset 
>> that I explained why he won't like it. :-)
> 
>   And what exactly is that "something else"? How was your answer helpful
> in any way?

You said "if you find yourself compiling the same programs over and over, 
that's what Make is for."  No, make is for tracking dependencies.  I 
suggested he hit the up-arrow key, or use !!, or put the command line to 
compile all his programs into a shell script, all of which would answer his 
original question of "how do I take a .c file and turn it into an object 
file with the same name as the .c file without the extension."

Yes, Make can do that.  But a shell script is easier for someone who doesn't 
want to learn make.

>>>   You are, hence, barking up the wrong tree. It's not the programming
>>> language that is the "problem". It's the separation of compiler and
>>> dependency file system that irks you.
> 
>> No, it's the inability to correctly figure out dependencies based on the 
>> source files that's really the cause of the troubles.
> 
>   So now you are changing the argument. 

No. The programming language is such that it is impossible to figure out by 
looking at the source code what the dependencies are on other source code. 
Most other systems don't have that feature.  In Java, the .class file 
depends on the .java file, for example, and nothing else.

> First the problem was that you
> need a separate tool to generate the dependency rules, but now the problem
> is "the inability to *correctly* figure out dependencies", whatever that
> means.

If I give you a collection of source files, you can't tell by looking at 
them what the dependencies are. That information isn't stored in or with the 
program itself.

>   Which tool incorrectly figures out the dependencies?

It's non-trivial to figure out the correct dependencies, to the point where 
you (a) need a tool to do it and (b) very often need to run the entire build 
system to figure out what gets included where.
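
The only reliable narrator here is the compiler itself; with gcc, for 
instance (foo.c standing in for the real file):

gcc -H -c foo.c -o foo.o 2>&1 | head
# -H prints every header actually opened, in include order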

>   Why does it feel that you are desperately trying to pile up more and more
> perceived problems to the issue?

Because you keep dismissing the problems I present as "not a problem", so I 
keep backing farther and farther up the chain of causality until you go 
"yes, I agree that happens."

>>  > You just love
>>> to hate C, so you just can't miss an opportunity to bash it.
> 
>> I actually like C.
> 
>   You are certainly making a very good job at hiding it.

There are things with C that cause problems. This is one of them. It's 
something you have to be very careful about. Building big systems in C is a 
difficult thing to do and takes great care, and I have been burned many 
times in my life working on big systems in C that weren't built with that care.

>> Yet, oddly enough, that's precisely what the XNA content pipeline does, and 
>> you just give it a list of all the source files, including the graphics and 
>> audio and stuff. Sure, you mention the name of the code used to compile it, 
>> but there's enough information from that to track down where the executable 
>> and such is. It also makes it possible for the thing that's compiling your 
>> collection of frames into an animation to say "the resultant animation 
>> depends on this list of frames as well as the text file specifying the list 
>> of frames", which you can't do with a makefile.
> 
>   I don't get it. That's exactly what you do with make,

Nope. It's not what you do with Make, any more than writing C++ code is the 
same thing you do when writing C code.

> Why does it feel that you are desperately trying to pile up more and more
> perceived problems to the issue?

Why does it seem you're off-hand dismissing significant differences without 
even thinking about what I'm saying?


>   Maybe what irks you is that with makefiles you need to write a bit more
> syntax to get the same effect.

No, it is, in part, that the information in the makefiles is separate from 
the actual build system, so you have to duplicate the information.

>> You can achieve it with a makefile, yes, but primarily not by having the 
>> dependency stuff going on, but by building them as different projects. That 
>> is, one tends to have a top-level makefile that says "first run make on the 
>> content generator, then run make on the content, then run make on the 
>> packaging directory."
> 
>   I suppose you could do that, but I don't think it's necessary.

I've never seen it done differently. I suppose you could have one giant 
makefile for a complex project, but most times I've seen it done with 
recursive makefile invocation. Indeed, there are entire sections of the make 
manual, and options and makefile variables, that provide for this usage, so 
it's not exactly uncommon.

-- 
Darren New, San Diego CA, USA (PST)
  "How did he die?"   "He got shot in the hand."
     "That was fatal?"
          "He was holding a live grenade at the time."



From: Darren New
Subject: Re: Unix shell
Date: 30 Jan 2011 15:56:34
Message: <4d45d082@news.povray.org>
Darren New wrote:
> Warp wrote:
>>> In Ada, you don't calculate the checksums of every source. You 
>>> calculate the checksum of the file you're compiling when you compile 
>>> it. In C#, your file is either associated with a project (i.e., with 
>>> the equivalent of a makefile) or it's installed in the global 
>>> assembly cache with a version number and a cryptographic checksum on 
>>> it you can check against.
>>
>>   So if you have thousands of files in your project, the compiler will
>> calculate the checksums of every single one of them every time you want
>> to compile? And it was you who complained how creating makefile rules for
>> C files is inefficient... Right.
> 
> Read the second sentence again. You calculate the checksum of the file 
> when you compile it. You then store that in the object code.

Oh, and just as an aside, GNAT did it this way for a while, and then they 
found it really was faster to actually re-read the header file and generate 
the checksum and (essentially) recompile it than it was to track a separate 
object file for the header file. So, yeah, in practice, not a problem. Read 
their documentation for details. :-)

-- 
Darren New, San Diego CA, USA (PST)
  "How did he die?"   "He got shot in the hand."
     "That was fatal?"
          "He was holding a live grenade at the time."



From: Patrick Elliott
Subject: Re: Unix shell
Date: 30 Jan 2011 17:35:03
Message: <4d45e797$1@news.povray.org>
On 1/30/2011 1:54 AM, Warp wrote:
> Darren New<dne### [at] sanrrcom>  wrote:
>> Patrick Elliott wrote:
>>> Trying to work out what went wrong, and why, from the damn code, and the
>>> code that built the code, and the code that ran the code, that built the
>>> code... o.O
>
>> Precisely. When the configuration file to create the build tools is based on
>> turing-complete macro processing, you're gonna just be screwed if there's
>> anything wrong.
>
>    So let me get this straight: He was trying to compile a program with
> Visual C++, the project did not have a Visual C++ project file, he was
> trying to compile it in Windows which is not Unix and hence does not use
> the same core tools, and consequently he had big problems in compiling
> the program. Yet this is still somehow a problem with makefiles.
>
>    It seems to me that anybody could make a post with any random problem
> they had, and if a makefile was somehow involved, you would immediately
> rush to agree with a "precisely!" answer regardless of what was the real
> cause of the problem.
>
It did have a VC++ project file; unfortunately, that project file 
included calls to Unix tools to generate the changes needed to make it 
compile. A project file is just another name for a MAKE file (at least 
on Windows): it has processing logic, can call externals, etc.

-- 
void main () {

     if version = "Vista" {
       call slow_by_half();
       call DRM_everything();
     }
     call functional_code();
   }
   else
     call crash_windows();
}

Get 3D Models, 3D Content, and 3D Software at DAZ3D!
<http://www.daz3d.com/index.php?refid=16130551>



From: Invisible
Subject: Re: Unix shell
Date: 31 Jan 2011 04:44:07
Message: <4d468467$1@news.povray.org>
On 28/01/2011 05:28 PM, Darren New wrote:

> But here's a question: Does it only regenerate the pages of
> documentation for the files where the source of the documentation has
> changed?

There are two kinds of documentation to consider.

GHC comes with a User Guide and a handful of other miscellaneous files, 
which are written in DocBook. GHC also comes with a small handful of 
Haskell libraries, which each have API reference documentation embedded 
in the source code. This is extracted and turned into HTML by a tool 
called Haddock.

Due to the amount of cross-referencing possible, editing any single 
source file can potentially affect an arbitrary number of documentation 
files. Short of a tool to actually analyse a source tree and figure out 
all the dependencies, it's difficult to handle this properly. You would 
need a tool to read the entire source and compute the dependencies.

GHC doesn't bother. If any library source file changes, the whole of the API 
documentation is rebuilt. Notice, however, that if the DocBook source for 
document X changes, only document X is rebuilt. So some level of dependency 
analysis *is* being done. (Especially since the documentation comes in more 
than one form, the build has to know how to rebuild each kind correctly.)

> Does it only re-run test cases for code that has been
> recompiled since it last ran test cases?

AFAIK, all of the test cases work by running the final, finished 
compiler. I don't think there are any "unit tests" that test just one 
program module or something. Thus, if any compiler source code changes, 
the entire compiler is recompiled, and the entire test suite rerun. Same 
for the benchmarks. (Benchmarks of course depend on almost every part of 
the compiler, so it would be infeasible to do it any other way.)

> Or is it more like "this would
> be a shell script, but we already have a makefile so we might as well
> put it in there"?

Well, to some extent, yes. Although Make is selecting the correct shell 
script to run in each case...



From: Warp
Subject: Re: Unix shell
Date: 31 Jan 2011 13:17:09
Message: <4d46fca5@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> >   Anyways, exactly why would you want to recompile your program if stdio.h
> > changes? You see, stdio.h is standardized, and if it was changed in such
> > way that programs would need to be recompiled for them to work, it would
> > mean that the C standard has been changed to be backwards-incompatible and
> > it would break basically all programs in existence.

> Nope. All I need to do is redefine O_RDWR to have a different value or 
> something like that. (OK, not stdio.h, because that's very modularized.)

> Or, if you want, errno.h or any of the other less-well-thought-out include 
> files.

  And you would immediately break like every single program in your system.
Way to go. (And please don't claim that you don't know why they would break.)

> Of course, if someone comes along and does a #define in some include file 
> that conflicts with something in stdio.h, then you're screwed too. Ever 
> spend three days trying to figure out why putting

>     extern int write(int,char*,int);

> at the top of your source code so you could print some debugging info throws 
> a compiler error saying "too many close braces"?

  Never miss a chance to bash C.

> >   Make can't know if eg. a code-generating program depends on some data
> > files unless you tell it.

> Yep! And there's the problem.

  The problem is that 'make' cannot read your mind, it seems.

  Of course it cannot know which files something depends on unless you
tell it. Seems rather obvious to me.

> Right now, for example, I have an XML file I use to specify the levels for 
> my game. In those XML files I include the lists of animations going on in 
> different places, the textures, etc etc etc.

> Know what? Make doesn't handle that.

  Sounds like something 'make' was specifically designed for.

> Know what else? The tool I'm using 
> does, because as I compile that XML file into the internal format, I'm 
> telling the build system "By the way, the results of compiling this XML also 
> depends on that texture, that sound effect, and those two animations."

> If I were to try to do this with Make, I'd either be repeating all the 
> information manually, or I'd have to write a program to parse that XML file 
> and pull out the appropriate dependencies, except output them in a different 
> way.

  You are constantly shifting the goalposts, and don't seem to be able to
decide if you want to bash makefiles or C, whether what you want is
impossible to do with makefiles or simply more laborious, whether the
problem is that what you want to do is not possible solely with 'make' but
instead you need a third-party tool, or what.

  Sometimes it's "make sucks because of the include file system in C",
sometimes it's "make sucks because it can't do this", sometimes it's
"make sucks because while you can do this, it's more laborious than with
this other fancy tool I have", sometimes it's "make sucks because you
need to use the compiler to generate dependencies".

  You don't seem to understand that 'make' is a generic tool. Or more
precisely, you don't seem to understand what the word "generic" means.
You seem to think that it should mean "supports all programming and
markup languages in existence out-of-the-box, and is able to read your
mind". That's not what "generic" means.

  You have made it abundantly clear that you don't like the fact that
you need, oh gosh, third-party tools if you want to automatically create
dependency lists for some specific file formats. "Hey, I have these files
in this format, and I have this specialized tool that tracks dependencies
on them automatically, and 'make' cannot do that automatically, hence
'make' sucks." Way to miss the point.

  And the best part of all this? So far you haven't given even a single
alternative that would do everything that 'make' does and is better by
your definition (whichever that might be). And preferably one that can
be run in most systems.

-- 
                                                          - Warp



From: Warp
Subject: Re: Unix shell
Date: 31 Jan 2011 13:18:31
Message: <4d46fcf7@news.povray.org>
Patrick Elliott <sel### [at] npgcablecom> wrote:
> A project file is just another name for a MAKE file (at least 
> on Windows).

  Not according to Darren.

-- 
                                                          - Warp



From: Warp
Subject: Re: Unix shell
Date: 31 Jan 2011 13:46:37
Message: <4d47038d@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> >>>   The issue was the #include scheme, which you seem to hate and I really
> >>> can't understand why.
> > 
> >> Experience! :-)
> > 
> >   Or lack thereof, it seems.

> Nope nope. Shell shock.  Zeno's Debugging.

  You have seen a lot of bad C code. I understand that. That's not what
I'm arguing.

> >   And which tool, exactly, did you recommend instead? I remember you
> > mentioning shell scripts. Yeah, those can track dependencies just fine.

> Indeed they can, when you don't have nested dependencies.

  "Indeed thay can"? Shell scripts don't have any kind of dependency
tracking whatsoever. I think you are somehow splitting hairs with
terminology.

>  The original 
> question wasn't about dependencies at all. His original question was how to 
> compile a bunch of .c files into corresponding executables without getting 
> a.out as the result. That has nothing to do with dependencies, and telling 
> him to use make when a simple one-line shell script works is overkill.

  Exactly how do you compile a bunch of .c files into corresponding
executables with "a simple one-line shell script"?

  Ok, you could write a 'for' loop which goes through a list of .c files
and compiles them (resolving the name of the executable by removing the
".c" part) and put all that in one line, but putting all commands into
one single long line is not what is usually meant by a "one-liner"
(except perhaps in the IOCCC). "One-liner" usually refers to being able
to express the task with one single command (a 'for' and its body are
usually considered separate commands).

  Perhaps what you meant was one line per program. That could probably
work for what he wants to do, if he doesn't mind that *every* program
will be recompiled every time he wants to recompile just one of them.

  This is exactly where a makefile would be useful: It would compile
only the program that needs recompiling, skipping the others. And even
if you wanted to go the easy way and just not have to specify any
dependencies, you can still use the makefile to compile only what you
want to compile (with something like "make program3").

  The advantage of that over having a separate shell script for every
single program? You only need to specify compiler options (eg. warnings
and optimizations) once, rather than duplicating it in each script.
(Of course you could have each script call a common script that has the
options, but that starts already being a lot more complicated than the
makefile would be, for no discernible benefit.)

  If you want to compile only one single program, or automatically compile
only the programs that need to be compiled, the *easiest* way is to use
a makefile. A shell script would be significantly more complicated. After
all, that's precisely what makefiles are for.
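
  To be concrete: with GNU make the makefile needs almost nothing, because
'make' already has a built-in "%: %.c" rule (the flags are just an example):

cat > Makefile <<'EOF'
CFLAGS = -Wall -O2
PROGS  = $(patsubst %.c,%,$(wildcard *.c))
all: $(PROGS)
EOF
make            # recompiles only the programs whose .c files have changed
make program3   # or build just the one you want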

> I *did* answer his question. I provided him a shell script to recompile the 
> programs he wants compiled. He didn't ask about make. He asked about how to 
> name the files. Someone else said "use make". I said "make is inappropriate 
> for that."

  Which is BS. You seem to think that you can't create a makefile that
compiles programs without the dependency lists. It's not like you are
forced to provide them.

  Creating a shell script for this is the hard way.

> >>>   You are, hence, barking up the wrong tree. It's not the programming
> >>> language that is the "problem". It's the separation of compiler and
> >>> dependency file system that irks you.
> > 
> >> No, it's the inability to correctly figure out dependencies based on the 
> >> source files that's really the cause of the troubles.
> > 
> >   So now you are changing the argument. 

> No. The programming language is such that it is impossible to figure out by 
> looking at the source code what the dependencies are on other source code. 

  If that were true, then it would be impossible to compile the program.
After all, it's impossible for the compiler to figure out which files are
being included just by looking at the source code.

  I suppose compilers do it by magic, then.

  Besides, you *are* changing the argument. First it was "make cannot figure
it out on its own", then it was "it's not possible to correctly figure out
the dependencies" (whatever that means, as I still don't get it).

  Let's see where will you move the goalposts next.

> >   Which tool incorrectly figures out the dependencies?

> It's non-trivial to figure out the correct dependencies, to the point where 
> you (a) need a tool to do it and (b) very often need to run the entire build 
> system to figure out what gets included where.

  Oh, now "inability to correctly figure out dependencies" has been changed
to "it's non-trivial to figure out the correct dependencies".

  Keep moving the goalposts. Which fancy term will you use next?

  Why does it feel like you are desperately trying to find problems and
just can't seem to be able to decide what the actual problem is?

> >> Yet, oddly enough, that's precisely what the XNA content pipeline does, and 
> >> you just give it a list of all the source files, including the graphics and 
> >> audio and stuff. Sure, you mention the name of the code used to compile it, 
> >> but there's enough information from that to track down where the executable 
> >> and such is. It also makes it possible for the thing that's compiling your 
> >> collection of frames into an animation to say "the resultant animation 
> >> depends on this list of frames as well as the text file specifying the list 
> >> of frames", which you can't do with a makefile.
> > 
> >   I don't get it. That's exactly what you do with make,

> Nope. It's not what you do with Make, any more than writing C++ code is the 
> same thing you do when writing C code.

  Maybe *you* don't use 'make' for that, but I do.

  Then, perhaps your preferences are "right" and mine are "wrong".

> >   Maybe what irks you is that with makefiles you need to write a bit more
> > syntax to get the same effect.

> No, it is, in part, that the information in the makefiles is separate from 
> the actual build system, so you have to duplicate the information.

  But it's what allows you to write build rules for things that don't
have any automated tools. (And yes, the same thing can be done with the
'project' files of most IDEs. They are, after all, doing basically the same
task.)

  I wrote an example earlier: Suppose you have written a program that
generates code, and it generates the code from some data files. Obviously
you would want the program to be run if (and only if) some of those data
files change. Exactly which automated tool would you use to do this?

-- 
                                                          - Warp



From: Darren New
Subject: Re: Unix shell
Date: 31 Jan 2011 14:18:01
Message: <4d470ae9$1@news.povray.org>
Invisible wrote:
> Short of a tool to actually analyse a source tree and figure out 
> all the dependencies, it's difficult to handle this properly. 

And that's why I've been saying that Make isn't really a very good tool for 
maintaining that sort of thing.

> You would 
> need a tool to read the entire source and compute the dependencies.

You already have that tool. It just doesn't output its results in a form 
that Make can consume. Certainly the tools that generate the documentation 
files know which output files depend on which other files.

> Thus, if any compiler source code changes, 

Sure. I was asking whether if you change a test case, does "make test" just 
run that one test case?  I've never seen that done.

-- 
Darren New, San Diego CA, USA (PST)
  "How did he die?"   "He got shot in the hand."
     "That was fatal?"
          "He was holding a live grenade at the time."



From: Darren New
Subject: Re: Unix shell
Date: 31 Jan 2011 14:25:33
Message: <4d470cad$1@news.povray.org>
Warp wrote:
>   Never miss a chance to bash C.

I don't understand how me describing a problem that caused me not to like 
certain features of C is "never missing a chance to bash C."

>   Of course it cannot know which files a something depends on unless you
> tell it. Seems rather obvious to me.

OK, at this point, you're just refusing to understand what I'm saying.

>> Right now, for example, I have an XML file I use to specify the levels for 
>> my game. In those XML files I include the lists of animations going on in 
>> different places, the textures, etc etc etc.
> 
>> Know what? Make doesn't handle that.
> 
>   Sounds like something 'make' was specifically designed for.

Not really. Make only handles it if I manually duplicate that information 
into a different format.  If I add something to the XML and not to the 
makefile, it's not going to work any more.
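
To make the duplication concrete, suppose (hypothetically) that level.xml 
lists its assets as <asset src="grass.png"/>. To keep make honest, I'd have 
to scrape the same information into a second format, something like:

# Pull the asset names out of the XML and emit a makefile fragment:
deps=$(sed -n 's/.*<asset[^>]*src="\([^"]*\)".*/\1/p' level.xml | tr '\n' ' ')
printf 'level.bin: level.xml %s\n' "$deps" > level.d
# ...then have the makefile -include level.d. Same information, twice.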

Are you telling me that the need for duplicating that sort of information is 
a *good* idea? Or do you admit that needing that information in two 
different places in two different forms is indeed a limitation that it would 
be desirable to eliminate?

>   You don't seem to understand that 'make' is a generic tool. Or more
> precisely, you don't seem to understand what the word "generic" means.

I'm fully aware of what "generic" means. You ask me why I don't like A, and 
I say "it leads to problems with B", and you say "make up your mind whether 
it's A or B you're going to bash."

>   And the best part of all this? So far you haven't given even a single
> alternative that would do everything that 'make' does and is better by
> your definition (whichever that might be). And preferably one that can
> be run in most systems.

And you know what? There will never be such an alternative from folks who 
think that anyone pointing out a limitation in their tools is simply 
bashing for the sake of bashing.

That said, there are many alternatives to make, some of which even generate 
makefiles as output.

-- 
Darren New, San Diego CA, USA (PST)
  "How did he die?"   "He got shot in the hand."
     "That was fatal?"
          "He was holding a live grenade at the time."



