compiling huge C++ files
From: Kevin Wampler
Subject: compiling huge C++ files
Date: 3 Nov 2008 16:15:51
Message: <490f6a07$1@news.povray.org>
I'm attempting to compile some auto-generated code, and am running into 
problems with turning on optimization flags due to the large size of the 
files generated (the .cpp file is currently 25MB, and could be as large as 
200MB later).  There is no natural way to break this code into smaller 
pieces, and such large auto-generated files really are necessary, so I'm 
stuck with having to compile some really huge C++ files.

I'm using g++ for the compilation, and it seems that if I turn on 
optimization flags (even -O1) it will quickly eat through all 16 GB of 
ram when I attempt to compile it.

Since I'd like to have the resulting executable run as quickly as 
possible, I am wondering if anyone has advice on any or all of the 
following options:

1) Is there any way to tell g++ to use only those optimizations which 
do not require a significant amount of memory to compute?

2) Is there another compiler that works under Linux (Fedora 9) that 
might handle this better?  I should be able to find a version of the 
Intel compiler to use if someone has reason to think it'll perform better.

3) Can I change the format of the source code so that using g++ 
optimizations isn't as necessary?  Most of the execution time is spent 
in my auto-generated code, which looks something like this:

	...
	double t75 = t73 - t74;
	double t76 = t66 - t67;
	double t77 = t74 + t73;
	double t78 = t55 * t51;
	double t79 = t78 + t58;
	double t80 = t64 * -0.2;
	....

Only it goes on for hundreds of thousands of lines, with the occasional 
division, sqrt, trigonometric function, etc. thrown in.  Since I write 
the code generation functions myself, I can do some optimizations at the 
source-code level.

As an example of such an optimization, I could implement a variable 
allocation optimizer that would, for instance, determine that t74 isn't 
used after the third line above, and then reuse that stack space to 
store the value of t77 instead of allocating a new double for it.
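
For concreteness, that transformation would turn the snippet above into 
something roughly like this (just a sketch; the real version would fall 
out of a proper liveness analysis):

	double t75 = t73 - t74;
	double t76 = t66 - t67;
	t74 = t74 + t73;   // last use of the old t74, so its slot is
	                   // reused for what was t77; later references
	                   // to t77 become references to t74
	double t78 = t55 * t51;
	double t79 = t78 + t58;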

The problem is that I don't know which (if any) of these things g++ 
does already when no optimization flags are specified, or which 
source-level optimizations are likely to make the biggest improvement. 
Any suggestions?



From: Warp
Subject: Re: compiling huge C++ files
Date: 3 Nov 2008 16:29:35
Message: <490f6d3f@news.povray.org>
Kevin Wampler <wampler+pov### [at] uwashingtonedu> wrote:
> I'm attempting to compile some auto-generated code, and am running into 
> problems with turning on optimization flags due to the large size of the 
> files generated (the .cpp file is currently 25MB, and could be as large as 
> 200MB later).  There is no natural way to break this code into smaller 
> pieces, and such large auto-generated files really are necessary, so I'm 
> stuck with having to compile some really huge C++ files.

  That's rather large. Are you sure it's the only way?

> 1) Is there any way to tell g++ to use only those optimizations which 
> do not require a significant amount of memory to compute?

  If you run "info gcc" and then choose Invoking GCC > Optimize Options, 
you will get detailed documentation of all the optimization flags gcc 
supports. You can try turning them on one by one and see which ones 
require significant amounts of memory and which ones don't.
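
  For instance, something along these lines (a sketch only, with huge.cpp 
standing in for your generated file; whether -fno-gcse or the ggc 
parameters actually help here is an educated guess on my part, not 
something I have tested):

	g++ -Q -O1 --help=optimizers          # list the flags -O1 enables (recent gcc)
	g++ -O1 -fno-gcse -c huge.cpp         # drop one pass suspected of hogging memory
	g++ -O1 --param ggc-min-expand=0 \
	        --param ggc-min-heapsize=65536 \
	        -c huge.cpp                   # collect gcc's internal garbage more often

  The ggc parameters trade longer compilation time for a lower peak 
memory footprint, which sounds like the trade you want.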

> 2) Is there another compiler that works under Linux (Fedora 9) that 
> might handle this better?  I should be able to find a version of the 
> Intel compiler to use if someone has reason to think it'll perform better.

  I have no idea if it will perform better, but it doesn't hurt to try.

> 3) Can I change the format of the source code so that using g++ 
> optimizations isn't as necessary?  Most of the execution time is spent 
> in my auto-generated code, which looks something like this:

  Are you sure you can't precalculate things in your code generator?

-- 
                                                          - Warp



From: Kevin Wampler
Subject: Re: compiling huge C++ files
Date: 3 Nov 2008 16:44:54
Message: <490f70d6@news.povray.org>
Warp wrote:
> Kevin Wampler <wampler+pov### [at] uwashingtonedu> wrote:
>> I'm attempting to compile some auto-generated code, and am running into 
>> problems with turning on optimization flags due to the large size of the 
>> files generated (the .cpp file is currently 25MB, and could be as large as 
>> 200MB later).  There is no natural way to break this code into smaller 
>> pieces, and such large auto-generated files really are necessary, so I'm 
>> stuck with having to compile some really huge C++ files.
> 
>   That's rather large. Are you sure it's the only way?

It may be possible to re-formulate my problem to generate less code, but 
unfortunately there's no way to do it that's particularly easy, or that 
I'm entirely sure would work.

The auto-generated code itself is necessary because I do some 
computations which involve complete knowledge and manipulation of the 
function-composition graph, which I eventually write out as C++.  It's 
technically possible to avoid this, but it would come at a large cost in 
code complexity and would probably result in significantly slower code 
(defeating the purpose of such a change).

I can compile things perfectly well if I don't specify any optimization 
flags, so it's not a disaster if I can't get g++ to cooperate -- it will 
just make things somewhat less convenient.

>> 1) Is there any way to tell g++ to use only those optimizations which 
>> do not require a significant amount of memory to compute?
> 
>   If you run "info gcc" and then choose Invoking GCC > Optimize Options, 
> you will get detailed documentation of all the optimization flags gcc 
> supports. You can try turning them on one by one and see which ones 
> require significant amounts of memory and which ones don't.

Ahhh, I had tried to google for that information, but with little 
success.  That is extremely helpful, many thanks!

>> 2) Is there another compiler that works under Linux (Fedora 9) that 
>> might handle this better?  I should be able to find a version of the 
>> Intel compiler to use if someone has reason to think it'll perform better.
> 
>   I have no idea if it will perform better, but it doesn't hurt to try.

Fair enough.  The main reason I asked is that it'll probably take some 
effort to track down a version I can use (since it's not free), but 
you're probably right that it's a relatively easy thing to try.

>> 3) Can I change the format of the source code so that using g++ 
>> optimizations isn't as necessary?  Most of the execution time is spent 
>> in my auto-generated code, which looks something like this:
> 
>   Are you sure you can't precalculate things in your code generator?

I already do this to the degree possible, including removing any 
computations which don't influence the result and precomputing anything 
which involves only constants.

It's possible I could do some algebraic optimizations (factoring out 
common multipliers, etc.) to reduce the code size, but that's starting 
to become a rather more difficult process than I'd like.
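
If I do go down that road, the easiest first step is probably plain 
common-subexpression elimination at generation time, along these lines 
(a rough sketch only -- the real generator works on its own expression 
graph, and all the names here are made up):

	#include <iostream>
	#include <map>
	#include <sstream>
	#include <string>
	
	// Emit "double tN = a op b;" the first time a given (op, a, b)
	// triple is requested; afterwards hand back the temporary that
	// already holds that value instead of emitting a duplicate.
	std::map<std::string, std::string> seen;
	int counter = 0;
	
	std::string emit(char op, const std::string& a, const std::string& b) {
	    std::ostringstream key;
	    key << a << ' ' << op << ' ' << b;
	    std::map<std::string, std::string>::iterator it = seen.find(key.str());
	    if (it != seen.end())
	        return it->second;            // reuse the earlier temporary
	    std::ostringstream name;
	    name << 't' << counter++;
	    std::cout << "double " << name.str() << " = " << key.str() << ";\n";
	    seen[key.str()] = name.str();
	    return name.str();
	}
	
	int main() {
	    std::string x = emit('*', "t55", "t51");
	    std::string y = emit('*', "t55", "t51");  // nothing new emitted; y == x
	    return 0;
	}

For commutative operators I'd also sort the operand pair before the 
lookup, so that t55 * t51 and t51 * t55 end up sharing one temporary.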

I'll start by experimenting with the g++ optimization flags from the 
info page you pointed out, since that seems easiest.



From: John VanSickle
Subject: Re: compiling huge C++ files
Date: 4 Nov 2008 17:46:24
Message: <4910d0c0@news.povray.org>
Kevin Wampler wrote:
> I'm attempting to compile some auto-generated code, and am running into 
> problems with turning on optimization flags due to the large size of the 
> files generated (the .cpp file is currently 25MB, and could be as large as 
> 200MB later).  There is no natural way to break this code into smaller 
> pieces, and such large auto-generated files really are necessary, so I'm 
> stuck with having to compile some really huge C++ files.

So the source code for each *function* is megabytes in size?  That would 
be the only reason you couldn't divvy up the source into separate files.

Regards,
John



From: Kevin Wampler
Subject: Re: compiling huge C++ files
Date: 4 Nov 2008 18:24:32
Message: <4910d9b0$1@news.povray.org>
John VanSickle wrote:
> So the source code for each *function* is megabytes in size?  That would 
> be the only reason you couldn't divvy up the source into separate files.


Yes, that is correct.  Although strictly speaking there are several 
functions in the file, the lion's share (maybe 60%) of the size is taken 
up by a single function.  So breaking the file into multiple files would 
help a little, but not enough to really address the problem.


