POV-Ray : Newsgroups : povray.off-topic : Unix shell
  Unix shell (Message 61 to 70 of 88)  
From: Warp
Subject: Re: Unix shell
Date: 1 Feb 2011 09:29:09
Message: <4d4818b5@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> Or, look at it a different way, if you're *really* interested in pursuing 
> understanding what I'm saying:  Why do you #include stdio in your program?
> As you said, it's part of the standard. You can't change it without breaking 
> lots of programs. Also, you can't pick a different function to be called 
> "printf" without breaking any library that expects "printf" to mean the one 
> from stdio. (I.e., C's linker has a flat namespace, so if *anyone* includes 
> stdio, then the symbols from stdio will preclude using that name anywhere 
> else to mean something different.)  So why does one have to recompile 
> stdio.h every time you compile a program?

> I know the *technical* reason, as in "because that's how C works." I'm 
> asking you to try to understand why working that way is not as good as 
> simply telling the compiler you're using the standard library and symbols 
> come from there. You see the fact that you're recompiling bits and pieces of 
> the standard-mandated standard libraries (that you correctly acknowledge 
> can't be changed anyway) every single time you compile your program, yes? 
> And that doesn't sound like a strange idea?

  Let's take a better example: Every time I want to use std::vector,
I have to #include <vector> in the source file where I want to do so.

  Is there any advantage to this?

  You could move all standard (non-templated) functions into the core
features of the language rather than them being declared in header files,
and it would still work ok. OTOH, is there a huge disadvantage in having
to include the files other than the trouble of having to write the
#include lines?

-- 
                                                          - Warp



From: Warp
Subject: Re: Unix shell
Date: 1 Feb 2011 09:34:44
Message: <4d481a03@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> >   So essentially you want the compiler to know what to compile without
> > you specifying what to compile.

> Nope. I specified that you want to compile "main".

  If your program consists of one single file named "main", then it could
work.

> Look at my statement: "It's impossible to figure out from the source code 
> what other source files it depends on."

> You're intentionally misinterpreting exactly what I'm saying, which is why I 
> put a *trivial* example in a separate post. You're standing here agreeing 
> that it's impossible to look at the source code and tell what depends on 
> what, because there's not enough information in the source code to determine 
> that. And now you're mocking me for pointing it out, in spite of the fact 
> that other systems don't seem to have this problem, because other systems 
> don't have bits and pieces of library source code compiled as part of the 
> process of compiling your own source code.

  Don't forget the context. You were talking about 'make' requiring overly
complicated steps in order to automatically track the dependencies of
include files in C programs, and how you don't like that. Then, suddenly,
you come up with an "it's impossible to track dependencies by looking at
the source code only" argument. It was clear from the context that you
were talking about C in particular, as if the same (quite trivial) problem
didn't happen in more "advanced" languages.

  Of course the argument is stupid. If you don't tell the compiler which
files to compile, it cannot know which files to compile. Seems rather
self-evident. If you have the same file (with possibly different
implementations) in more than one place, you need to tell the compiler
which of those files it needs to use. Again, rather self-evident, and
completely unrelated to a specific programming language.

  So what's the point?

-- 
                                                          - Warp



From: Warp
Subject: Re: Unix shell
Date: 1 Feb 2011 11:48:13
Message: <4d48394d@news.povray.org>
Warp <war### [at] tagpovrayorg> wrote:
> Invisible <voi### [at] devnull> wrote:
> > The solution, as I just discovered, is to name the files *.cpp rather 
> > than *.c++; this apparently enables Make's implicit rules to fire.

>   Not a good solution because you are not getting warnings nor
> optimizations.

  Btw, if you haven't followed the ensuing flamewar in this thread, let me
repeat a nice trick with gnu make here. If you want to be able to just
write "make programname" and have 'make' compile the program, but you want
it to use some compiler options such as warnings and optimizations, create
a makefile in the directory where the source files are and put this line
in it (nothing else is needed):

CXXFLAGS=-Wall -O3

  Now when you write "make programname", it will give those options to the
compiler.

  If you want a simple "make" to compile all the programs, make the
makefile like this:

CXXFLAGS=-Wall -O3
all: program1 program2 program3

  Or if you don't want to always have to add the name of a new program
to the makefile, you can use this instead:

CXXFLAGS=-Wall -O3
SOURCES=$(wildcard *.cpp)
all: $(SOURCES:.cpp=)
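
  (In case you're wondering what actually gets run: with that CXXFLAGS line,
gnu make's built-in rule for building an executable from a .cpp file expands
to roughly the following command, assuming the default CXX of g++:

g++ -Wall -O3 programname.cpp -o programname

Any CPPFLAGS or LDFLAGS you set in the makefile are passed along the same way.)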

-- 
                                                          - Warp



From: Invisible
Subject: Re: Unix shell
Date: 1 Feb 2011 11:55:45
Message: <4d483b11$1@news.povray.org>
>>> The solution, as I just discovered, is to name the files *.cpp rather
>>> than *.c++; this apparently enables Make's implicit rules to fire.
>
>>    Not a good solution because you are not getting warnings nor
>> optimizations.
>
>    Btw, if you haven't followed the ensuing flamewar in this thread, let me
> repeat a nice trick with gnu make here. If you want to be able to just
> write "make programname" and have 'make' compile the program, but you want
> it to use some compiler options such as warnings and optimizations, create
> a makefile in the directory where the source files are and put this line
> in it (nothing else is needed):
>
> CXXFLAGS=-Wall -O3
>
>    Now when you write "make programname", it will give those options to the
> compiler.

OK. Thanks for the tip.

I'm curious to see if turning on optimisations actually makes any 
noticeable performance difference... I guess I'll find out tomorrow.



From: Warp
Subject: Re: Unix shell
Date: 1 Feb 2011 12:16:16
Message: <4d483fe0@news.povray.org>
Invisible <voi### [at] devnull> wrote:
> I'm curious to see if turning on optimisations actually makes any 
> noticeable performance difference... I guess I'll find out tomorrow.

  Well, it depends on what those programs are actually doing. If they are
doing some heavy-duty calculations that take seconds or more, then there
is usually quite a drastic difference between using -O3 and not using it.
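
  (A quick way to check from the shell, if you want to measure it, is
something like

time ./programname

run once on a binary built with -O3 and once on one built without it.)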

-- 
                                                          - Warp



From: Darren New
Subject: Re: Unix shell
Date: 1 Feb 2011 15:06:03
Message: <4d4867ab$1@news.povray.org>
Warp wrote:
>   You mentioned some XML file format used by XNA. You want built-in support
> in 'make' for that format?

Nope. Go back and read what I wrote about it.

>> I've given several examples of how to do this.
> 
>   By adding built-in support for every single such format in existence to
> 'make'?

No, go back and read what I wrote about it.

> You have not given any *generic* answer to the question, only
> very *specific* ones.

You skipped over the theoretical strace-based mechanism. You also are 
refusing to consider the possibility of generalizing the mechanism. I don't 
want to waste time trying to discuss possibilities with someone who won't or 
can't look at several examples and deduce a generality from them.

>   I don't remember a single one.

Then you're not even reading the answers I take time to write, so ... OK.

I noticed there were a couple of posts where I said "here, instead of a big
long rambling question-and-answer, is a clear summary of why I hold the
stance I do, well organized and cogent," and you seem to have skipped over
all of those.

>> Oddly enough, so are many of the other build systems that nevertheless don't 
>> require you to list dependencies in multiple places. Several examples of 
>> which I have already given.
> 
>   You seem to be obsessed now with the "dependencies in multiple places"
> thing, yet when I ask where exactly is the duplication in the example
> I gave, you don't answer.
> 

Yet, two questions up, you say
"So you do and don't oppose the idea of using third-party tools to generate
the dependencies."

How does your third party tool generate dependencies? Does it not take those 
dependencies out of the input you give to *that* tool and store them in 
makefiles?

I already answered this. There's a dependency in the makefile that goes
prog.o : prog.c abc.h def.h

There's a dependency in the C file that goes
prog.c:
#include "abc.h"
#include "def.h"

That's the last time I'll answer that question for you, tho.

-- 
Darren New, San Diego CA, USA (PST)
  "How did he die?"   "He got shot in the hand."
     "That was fatal?"
          "He was holding a live grenade at the time."



From: Darren New
Subject: Re: Unix shell
Date: 1 Feb 2011 15:22:31
Message: <4d486b87$1@news.povray.org>
Warp wrote:
> Darren New <dne### [at] sanrrcom> wrote:
>>>   Yes, it's much more likely that he would learn to use the shell's 'for'
>>> construct more easily than he would learn how to write a makefile.
>>>
>>>   You write as if there was something wrong with learning how to use
>>> makefiles. (Well, by now it wouldn't really surprise me.)
> 
>> Nope. But using someone else's makefile is often problematic.
> 
>   Now you are changing your argument once again.

My argument is multi-faceted, as most complex stances are. When you only look 
at the messages I write that are answers to your specific questions, it 
looks like I'm changing what I'm saying, because you're looking through a 
pinhole at the entire argument.  When I write down in a handful of 
paragraphs a complete and cogent description of my point, you apparently 
don't read it: you never reply to it or ask about it, and then later claim that I 
never gave the examples I used in those posts.

>> I'm against (in principle) many technologies that are simple on the surface 
>> and get abused out the whazoo instead of improving the technology.
> 
>   And shell scripts, which is what you advocated instead of makefiles,
> cannot be abused nor can they be impossible to understand? I don't even
> understand your point anymore.

No, clearly not.

>   All things considered, I don't think that this simple case is all that
> complicated. (The only complicated thing is figuring this out.)

I agree. Not sure why you're making such a big thing of me pointing out that 
the simple system works fine with shell scripts.

>>>   (And granted, with a "standard" non-gnu makefile you would need more
>>> than this.)
> 
>> Another problem with make (and similar tools that get abused).
> 
>   I honestly cannot understand why you call this "abuse". 

Your reading comprehension is poor. Where did I say what you wrote is an 
abuse of make?  Answer: I didn't. I said that having a bunch of different 
versions is indicative of a program whose capabilities have been expanded 
over time bit by bit without a coherent plan, in such a way that such 
extensions tend to get abused.

And in another post, I explained why such a thing gets abused, which answers 
the first question you had.

>>>   Well, the compiler can't look at the source code if you don't tell it
>>> where it is in the first place. 
> 
>> Ding ding ding!
> 
>   Your point being?
> 
>>> That seems rather obvious, and is true for any language. 
> 
>> Incorrectamundo!
> 
>   So there exists a compiler which indeed can read your mind so that you
> don't have to explicitly specify what it needs to compile. 

Nope. It doesn't have to read your mind.  Of course it has to know what to 
compile. The problem is not that you have to tell the compiler what to 
compile. The problem is you have to (in C) compile much more than your own 
source code in order to compile your program.

I have no problem telling a C compiler what and where my program is. I just 
don't think it's a good idea to have to tell a C compiler where everyone 
*else's* source code for libraries that are already compiled is. For some 
reason, you insist on ignoring the distinction. I suspect you do so in order 
to be able to feel you're still right.

(As far as my bashing goes, I find it amusing that when I bash a feature in C 
that's improved in C++, you don't get upset, but if I bash a feature in C 
that is carried over to C++, you justify it out the wazoo.)

>>> You are making it sound like most other languages
>>> don't need to be told where the libraries and source files are.)
> 
>> Generally, other languages don't need to be told where the sources for a 
>> particular library are stored, no.
> 
>   Bullshit. Specific compilers might have *default* paths for the libraries,
> and so does a typical gcc installation in linux. That doesn't mean that
> the compiler is not being told where the libraries are.

You don't understand the word "sources" in that sentence?  Reading 
comprehension, dear.

>   You really are desperately trying to find flaws that just aren't there.

You're just ignoring my complaint and tilting at strawmen.

>> No. No no no. I'm saying I can build dependencies in XNA *without* 
>> specifying files and such in the project file. You're just disregarding 
>> that.  I already listed what's in a project file for XNA C#. It 
>> conspicuously lacks mention of dependencies of source files on other source 
>> files.
> 
>   So XNA employs this new marvelous mind-reading technology, as it is able
> to know what to include in the project without you having to explicitly
> tell it.
> 
>   How does it work? Does it scan brainwaves?

No. I already wrote an extensive post describing all the sections of an XNA 
project file, as well as the mechanisms whereby the dependencies get 
described implicitly. Which you have apparently either not read or disregarded.

>> Oh, I see what you're saying. You're thinking of something like YACC, where 
>> the source files are passed on the command line. Yes, you're right, the 
>> makefile can be the sole repository of the list of dependencies, in that case.
> 
>   That's one of the most typical ways of building complex projects in unix,
> and it's precisely what makefiles are designed for (and is typically why
> command-line tools take things as command line parameters or, at worst,
> as data they read from the standard input stream).

Yep. Of course, if your YACC grammar has something equivalent to an #include 
in it, you're back to listing dependencies in multiple places.

>> But look at the C compiler. You have the list of dependencies in the 
>> makefile for invoking the compiler, and then inside the source code there's 
>> the list of dependencies in the form of #include files.
> 
>   Which is why you can *automate* the creation of the dependency rules
> rather than having to do it by hand.

The fact that you can automate (in some cases) the copying of dependencies 
from one place to another does not mean you're not copying dependencies from 
one place to another.

>   The #include file mechanism in itself is useful for other reasons,
> especially in C++, which is why it has its advantages.

Yes. In C and C++, #include is useful. It allows a number of things (like 
really powerful templates) that would be difficult (but not impossible, as 
seen in LISP and Erlang for example) to do in other ways.

>>>   Your complaints have now shifted to duplication of file names. Exactly
>>> where is the duplication in this particular example?
> 
>> Do you deny that your makefile is going to have a list of files your main C 
>> code includes, and your main C code will also have that list of files? 
>> Isn't the -M option indeed specifically intended to duplicate that 
>> information in two different formats?
> 
>   I was talking about my example of a code generator creating code from
> some input files.

Right. And in *my* example, which is what I was talking about since you 
never gave a specific example, the dependencies are in the data file (like 
#includes would be) rather than on the command line.

>   Nevertheless, why should I care if some #include file information is
> duplicated, as long as it happens automatically?

Because now you need a third-party tool to automate dependency generation.

-- 
Darren New, San Diego CA, USA (PST)
  "How did he die?"   "He got shot in the hand."
     "That was fatal?"
          "He was holding a live grenade at the time."



From: Warp
Subject: Re: Unix shell
Date: 1 Feb 2011 16:49:53
Message: <4d488001@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> >   All things considered, I don't think that this simple case is all that
> > complicated. (The only complicated thing is figuring this out.)

> I agree. Not sure why you're making such a big thing of me pointing out that 
> the simple system works fine with shell scripts.

  When talking about a project building system, a shell script is a
significantly more primitive tool than a makefile because shell scripts
are not designed specifically to deal with dependencies and building
projects, while makefiles are. You could say that shell scripts are
(significantly) *more* generic than makefiles. Going to the more primitive
tool is going in the wrong direction. You should go towards the tool that
is better suited to the task at hand.

  Your initial argument in favor of shell scripts was that, according to you,
they are simpler to create for the original task in this thread than a
makefile, which, again according to you, is the "wrong" tool.

  The problem is that you didn't know that you can actually implement what
Andrew wanted with just a couple of simple and straightforward lines with
a makefile (at least when using gnu make, which is the default for all
linux distros). You thought that you need to write a lot more. Hence you
mistakenly advocated shell scripts as the "simpler" solution, when they
really aren't. (Just compare your shell script solution to the makefile
examples I posted earlier and consider honestly which one is simpler.)

  'make' is not the wrong tool for what he wants to do, unlike what you
claimed. A shell script will not make the whole process any simpler.

  Could there be any better tools than gnu make for this task? Well, if
you don't need to track dependencies (as Andrew clearly didn't have to),
then no: gnu make performs the task perfectly with just a couple of simple
lines. I don't think any other tool could do a better job at it.

  If he wanted to automatically track all source file dependencies (in his
linux environment), then yes, there could be (and are) better tools for
this, tools which are even more specialized than 'make' and can automate
the process with fewer magic keywords and commands. However, such tools are
not necessarily included in many standard linux distributions, while gnu
make is (at least if you have installed any compiler, and probably even
regardless). 'make' is more "standard" (in a sense) than those other more
specific tools. In this particular case, however, there would be little
benefit in using those.

  It seems to me that you are opposing 'make' just as a matter of principle
rather than as a matter of practicality. You'd rather have him use shell
scripts than makefiles, for whatever reason.

> >>>   (And granted, with a "standard" non-gnu makefile you would need more
> >>> than this.)
> > 
> >> Another problem with make (and similar tools that get abused).
> > 
> >   I honestly cannot understand why you call this "abuse". 

> Your reading comprehension is poor. Where did I say what you wrote is an 
> abuse of make?  Answer: I didn't. I said that having a bunch of different 
> versions is indicative of a program whose capabilities have been expanded 
> over time bit by bit without a coherent plan, in such a way that such 
> extensions tend to get abused.

  I think it's you who didn't understand. What exactly is this 'make'
abuse you are talking about?

  gnu make is a pretty standardized tool in linux distros (and many other
unixes). How do you "abuse" a standard tool when using it for its intended
purpose?

> Nope. It doesn't have to read your mind.  Of course it has to know what to 
> compile.

  Well, that was what I said, and you said that was incorrect.

> The problem is not that you have to tell the compiler what to 
> compile. The problem is you have to (in C) compile much more than your own 
> source code in order to compile your program.

  You have odd problems. I haven't encountered such a problem as far as
I remember.

  Maybe compilation times would get faster if the source files didn't depend
on gigantic header files? Probably. However, I haven't personally encountered
this as a problem, as I have never had to deal with gigantic projects with
millions of lines of code. (I do understand that it might be a nuisance in
such huge projects, though.)

> I have no problem telling a C compiler what and where my program is. I just 
> don't think it's a good idea to have to tell a C compiler where everyone 
> *else's* source code for libraries that are already compiled is.

  I don't have to tell it either. I don't even remember where exactly eg.
the C++ standard header files are in my OpenSuse system. Probably somewhere
inside /usr/include/something. The compiler is already configured to find
them so I don't have to care.

  If I install a new library from an OpenSuse repository, I still don't have
to care where it's being installed. In fact, I don't even *know* where the
files end up, and I don't need to. They will still work.

  If I install a third-party library *locally* somewhere in my home directory
(rather than systemwide), *then* I may need to add a couple of compiler
options to tell the compiler where to find it. However, that seems pretty
obvious. After all, I have installed the library in a nonstandard location.
The compiler cannot guess where if I don't tell it.
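
  (Concretely, for a hypothetical library installed under ~/local, those
options would look something like

g++ -I$HOME/local/include -L$HOME/local/lib myprog.cpp -o myprog -lfoo

where -I and -L point the compiler and linker at the nonstandard locations
and -lfoo names the library; the paths and the name "foo" are just placeholders.)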

  Can this scheme sometimes cause *more* problems than when dealing with
C# on Windows using Microsoft's IDE? Probably. It hasn't bothered me too
much so far, though. (Although I would be surprised if, when you installed
a C# library in a nonstandard location, you didn't have to explicitly
tell the compiler where to find it. Well, unless the library installer
tells it automatically, but that's just a technicality.)

> (As far as my bashing goes, I find it amusing that when I bash a feature in C 
> that's improved in C++, you don't get upset, but if I bash a feature in C 
> that is carried over to C++, you justify it out the wazoo.)

  No, I oppose bashing C (and C++) for flaws that it doesn't really have.

> >>> You are making it sound like most other languages
> >>> don't need to be told where the libraries and source files are.)
> > 
> >> Generally, other languages don't need to be told where the sources for a 
> >> particular library are stored, no.
> > 
> >   Bullshit. Specific compilers might have *default* paths for the libraries,
> > and so does a typical gcc installation in linux. That doesn't mean that
> > the compiler is not being told where the libraries are.

> You don't understand the word "sources" in that sentence?  Reading 
> comprehension, dear.

  I don't think I have the sources eg. for libc.so installed in this
system, hence I don't have to tell the compiler where they might be.

  I don't get your point.

> The fact that you can automate (in some cases) the copying of dependencies 
> from one place to another does not mean you're not copying dependencies from 
> one place to another.

  And I asked why should I care if it happens automatically behind the
scenes.

> >   Nevertheless, why should I care if some #include file information is
> > duplicated, as long as it happens automatically?

> Because now you need a third-party tool to automate dependency generation.

  If you consider the C preprocessor a "third-party tool".

  Even if you do, so what?

-- 
                                                          - Warp



From: Darren New
Subject: Re: Unix shell
Date: 1 Feb 2011 19:09:32
Message: <4d48a0bc@news.povray.org>
Warp wrote:
>   The problem is that you didn't know that you can actually implement what
> Andrew wanted with just a couple of simple and straightforward lines with
> a makefile 

Of course I knew that. Don't be insulting.

>>>>>   (And granted, with a "standard" non-gnu makefile you would need more
>>>>> than this.)
>>>> Another problem with make (and similar tools that get abused).
>>>   I honestly cannot understand why you call this "abuse". 
> 
>> Your reading comprehension is poor. Where did I say what you wrote is an 
>> abuse of make?  Answer: I didn't. I said that having a bunch of different 
>> versions is indicative of a program whose capabilities have been expanded 
>> over time bit by bit without a coherent plan, in such a way that such 
>> extensions tend to get abused.
> 
>   I think it's you who didn't understand. What exactly is this 'make'
> abuse you are talking about?

If you've never seen abuse of makefiles, then more power to you. I think anyone 
who has worked on an ugly system knows what I'm talking about. Otherwise, go 
look up the "buildroot" system for compiling kernels.  I mean, hell, 
buildroot was so makefile-abusive that people capable of working on the 
linux kernel said "screw this, let's throw it away and start over."

>> The problem is not that you have to tell the compiler what to 
>> compile. The problem is you have to (in C) compile much more than your own 
>> source code in order to compile your program.
> 
>   You have odd problems. I haven't encountered such a problem as far as
> I remember.

Well, good for you. The fact that you've not encountered that problem 
doesn't mean that the problem isn't real.

>   Maybe compilation times would get faster if the source files didn't depend
> on gigantic header files? Probably. However, I haven't personally encountered
> this as a problem, as I have never had to deal with gigantic projects with
> millions of lines of code. (I do understand that it might be a nuisance in
> such huge projects, though.)

Well, let me know when you've taken the source code for an entire toolchain, 
kernel, and application stack and cross-compiled it from scratch. Let me 
know how easy it is to figure out where everything goes when your makefiles 
download other makefiles from FTP sites and then run them thru sed to patch 
up the errors in them before invoking them, and then come back and tell me 
it isn't a problem.

>   If I install a new library from an OpenSuse repository, I still don't have
> to care where it's being installed. In fact, I don't even *know* where the
> files end up, and I don't need to. They will still work.

Here you're arguing that someone else has already solved the problem for you.

>   Can this scheme sometimes cause *more* problems than when dealing with
> C# on Windows using Microsoft's IDE? Probably. It hasn't bothered me too
> much so far, though. 

Great. Install compilers for three different versions of Linux for three 
different architectures on the same machine. Make sure that you can use all 
three different versions of boost you need on each of those three compilers. 
Let me know how simple your makefiles are.

(And you're giving *me* shit about not knowing how makefiles work? Really??)


> (Although I would be surprised if, when you installed
> a C# library in a nonstandard location, you didn't have to explicitly
> tell the compiler where to find it. Well, unless the library installer
> tells it automatically, but that's just a technicality.)

I could describe how it works, but you're not interested enough to read it.

>   No, I oppose bashing C (and C++) for flaws that it doesn't really have.

You mean "flaws you haven't encountered".  Good for you.  Maybe I've used C 
more than you have even.

>> You don't understand the word "sources" in that sentence?  Reading 
>> comprehension, dear.
> 
>   I don't think I have the sources eg. for libc.so installed in this
> system, hence I don't have to tell the compiler where they might be.

Um, yes, you do. You have the sources for some of the openssl libraries 
installed, don't you?

>   I don't get your point.

Yes you do.

>> The fact that you can automate (in some cases) the copying of dependencies 
>> from one place to another does not mean you're not copying dependencies from 
>> one place to another.
> 
>   And I asked why should I care if it happens automatically behind the
> scenes.

And I answered this repeatedly, and you picked nits instead about whether 
invoking the C preprocessor is invoking the compiler or not.

>> Because now you need a third-party tool to automate dependency generation.
>   Even if you do, so what?

I already answered that.

-- 
Darren New, San Diego CA, USA (PST)
  "How did he die?"   "He got shot in the hand."
     "That was fatal?"
          "He was holding a live grenade at the time."



From: Darren New
Subject: Re: Unix shell
Date: 1 Feb 2011 19:20:06
Message: <4d48a336$1@news.povray.org>
Warp wrote:
> Darren New <dne### [at] sanrrcom> wrote:
>> Warp wrote:
>>>   How do you resolve this? That's right: You tell the compiler which files
>>> it needs to compile.
> 
>> Nope. You tell the compiler which *object* file goes with the appropriate 
>> *source* file.
> 
>   Yeah, huge difference.

And *this* is exactly why these arguments go on for so long.

I can't tell whether you're being a truly masterful troll, or whether your 
misplaced egoism has completely blinded you.


I make a comment. You strawman an argument against it.  I point out that 
you're putting words in my mouth. Instead of saying "OK, I was wrong in my 
assertion, I don't see the difference, please explain again why it matters", 
you say "I wasn't wrong, because even tho I was *factually* incorrect, it 
doesn't make any difference, because there couldn't possibly be any 
difference, so I'm really right anyway."


Yes, there *is* a huge difference. I spent an entire post several dozen 
lines long pointing out exactly why there's a difference and the 
implications it has for this, simply so you could stop picking nits one 
sentence at a time and maybe comprehend what I was trying to say. But you 
either didn't read it, or you read it and decided you couldn't pick nits 
with it, so you asked the same question again and again, ignoring the answer 
each time by equating it to something different and then dismissing the 
differences.


And you know, for someone who distinguishes "invoking the compiler 
preprocessor" from "invoking the compiler", saying there's no significant 
difference between source code and object code just doesn't ring true.



>   Are you saying that it's impossible to have a C# source file with the
> same name in two different directories and which implement a class with
> the same name?

I'm saying that if you do, it doesn't matter, because your compilation isn't 
looking at that source code.

>> Do you really not see the difference between saying "each time I compile 
>> something, I have to pick which source file to compile" and saying "each 
>> time I compile something, I have to pick the right source files that go with 
>> the corresponding libraries of every library I'll eventually link this with"?
> 
>   Not really.

OK. Then I'm afraid I can't explain it to you.

> You have to specify which source files the program is
> composed of. 

It's not my program. I've never written a crypto library, nor have I written 
a video codec. But I've compiled the source code for parts of those 
libraries into every C program I've ever written that used them.

> Your example of the same C file being in two different directories is simply bogus.

Why?

> It has nothing to do with C particularly.

It is much more common in C than in other languages, I've found. YMMV, but 
that doesn't make me wrong.

>> Indeed, when I use the C# equivalent of stdio (or any other third-party 
>> library), I don't need any source code I didn't write myself, nor do I need 
>> to specify where to find the libraries. I merely say which version of what 
>> implementation I want.
> 
>   And when I write C++, same thing. So what?

So, when you use boost, you don't need to compile any source code for any 
software you didn't write yourself? Sorry, that's just a lie. I've looked at 
the boost code. It's mostly source code.  And you know what? I've worked on 
projects where I've needed two different versions of boost in the same code, 
neither version of which was in the repository. It sucks. It's very 
difficult to get right.

>   "I have seen the same C file in multiple places, but I have never seen
> the same C# file in multiple places, hence C is clearly the inferior
> language." Yes, very convincing argumentation.

It's only unconvincing because it's a straw man you refuse to acknowledge 
isn't what I'm arguing.

-- 
Darren New, San Diego CA, USA (PST)
  "How did he die?"   "He got shot in the hand."
     "That was fatal?"
          "He was holding a live grenade at the time."



