* [patch 0/3] Header file reduction.
@ 2015-10-02 2:22 Andrew MacLeod
2015-10-02 2:33 ` [patch 3/3] Header file reduction - FE files Andrew MacLeod
` (3 more replies)
0 siblings, 4 replies; 65+ messages in thread
From: Andrew MacLeod @ 2015-10-02 2:22 UTC (permalink / raw)
To: gcc-patches
OK, newly regenerated patches to remove header files from the latest
version of the tools.
The patches are generated by a pair of tools.
* gcc-order-includes goes through the headers and canonically reorders
some of our more common/troublesome headers and removes any duplicates.
This includes headers which are included by other headers. (ie,
obstack.h can be removed as a duplicate if bitmap.h is included already.)
* remove-includes is the tool which tries to remove each non-conditional
header file and does the real work.
I'll have a patch shortly to add these and some other useful tools to a
header-tools directory in contrib.
There are 3 patches which follow: backend.a files, fe files, and config
files.. Effecting 547 files total
These were generated from a trunk snapshot taken about 2 weeks ago. The
tools ran, and once I finished some minor tweaking, I reapplied them to
a 9/28 branch. Anything which didn't apply cleanly due to intervening
changes, was simply re-reduced. I then reapplied them to a snapshot
from this morning for these patches.
The tool also monitor what macros are defined and conditionally
consumed, and wont remove a header which still compiles if it may define
a macro which is used in conditional compilation (tm.h is frequently
affected by this). The tool spits out messages like these:
* Passed host and target builds, but must keep target.h because it
provides ASM_OUTPUT_DEF Possibly required by ipa-icf.c
* Passed host and target builds, but must keep insn-attr.h because it
provides DELAY_SLOTS Possibly required by toplev.c
note my host arch doesn't define DELAY_SLOTS in insn-attr-common.h
(included by insn-attr.h) , but some other target archs do, and this is
caught this during some of the target builds. (toplev.c has the
following lines:)
#ifndef DELAY_SLOTS
if (flag_delayed_branch)
warning (0, "this target machine does not have delayed branches");
#endif
So as far as I can tell I'm catching all those conditional compilation
cases. The only ones I might have missed would be macros which some
host build defines on the command line and which doesn't show up in a
config-list.mk target build.. I guess.
Everything bootstraps on x86_64-pc-linux-gnu and
powerpc64le-unknown-linux-gnu. All targets in config-list.mk still
build. Regressions tests also came up clean.
OK for trunk?
I will make minor tweaks as needed when applying to trunk for
check-in.. I know that the new requirement for cgraph.h in builtins.c
is no longer required in trunk.
(https://gcc.gnu.org/ml/gcc-patches/2015-10/msg00102.html) Going
forward, we'll know when we are adding new dependencies, as this thread
so timely shows :-).
Andrew
PS. Then keep an eye for anything funny. I'm not expecting much, but if
a file causes an issue, simple reverting the change for that one file
should be sufficient until we figure out why it is an issue,. No change
in any of these files is dependent on any other.. it simple include
reduction, and there should be no functional change in code.
^ permalink raw reply [flat|nested] 65+ messages in thread
* [patch 3/3] Header file reduction - FE files.
2015-10-02 2:22 [patch 0/3] Header file reduction Andrew MacLeod
@ 2015-10-02 2:33 ` Andrew MacLeod
2015-10-02 2:33 ` [patch 2/3] Header file reduction - config files Andrew MacLeod
` (2 subsequent siblings)
3 siblings, 0 replies; 65+ messages in thread
From: Andrew MacLeod @ 2015-10-02 2:33 UTC (permalink / raw)
To: gcc-patches
[-- Attachment #1: Type: text/plain, Size: 312 bytes --]
141 front end files.. all in subdirectories of gcc.
Everything bootstraps an a x86_64-pc-linux-gnu and
powerpc64le-unknown-linux-gnu with
--enable-languages=all,ada,go,obj-c++,jit --enable-host-shared. All
targets in config-list.mk still build. Regressions tests also came up
clean.
OK for trunk?
Andrew
[-- Attachment #2: FE.patch.bz2 --]
[-- Type: application/x-bzip, Size: 10885 bytes --]
^ permalink raw reply [flat|nested] 65+ messages in thread
* [patch 1/3] Header file reduction - backend files.
2015-10-02 2:22 [patch 0/3] Header file reduction Andrew MacLeod
2015-10-02 2:33 ` [patch 3/3] Header file reduction - FE files Andrew MacLeod
2015-10-02 2:33 ` [patch 2/3] Header file reduction - config files Andrew MacLeod
@ 2015-10-02 2:33 ` Andrew MacLeod
2015-10-07 22:02 ` Jeff Law
2015-10-22 22:33 ` [patch 1/3] Header file reduction - backend files Jeff Law
2015-10-05 13:55 ` [patch 0/3] Header file reduction Bernd Schmidt
3 siblings, 2 replies; 65+ messages in thread
From: Andrew MacLeod @ 2015-10-02 2:33 UTC (permalink / raw)
To: gcc-patches
[-- Attachment #1: Type: text/plain, Size: 247 bytes --]
these are all in the main gcc directory. 297 files total.
Everything bootstraps on x86_64-pc-linux-gnu and
powerpc64le-unknown-linux-gnu. All targets in config-list.mk still
build. Regressions tests also came up clean.
OK for trunk?
Andrew
[-- Attachment #2: backend.patch.bz2 --]
[-- Type: application/x-bzip, Size: 23694 bytes --]
^ permalink raw reply [flat|nested] 65+ messages in thread
* [patch 2/3] Header file reduction - config files.
2015-10-02 2:22 [patch 0/3] Header file reduction Andrew MacLeod
2015-10-02 2:33 ` [patch 3/3] Header file reduction - FE files Andrew MacLeod
@ 2015-10-02 2:33 ` Andrew MacLeod
2015-10-02 2:33 ` [patch 1/3] Header file reduction - backend files Andrew MacLeod
2015-10-05 13:55 ` [patch 0/3] Header file reduction Bernd Schmidt
3 siblings, 0 replies; 65+ messages in thread
From: Andrew MacLeod @ 2015-10-02 2:33 UTC (permalink / raw)
To: gcc-patches
[-- Attachment #1: Type: text/plain, Size: 439 bytes --]
there are most of the config files... 109 files total.
I built every target in config-list.mk. The reduction tool reduced
each source file by examining *every* target object directory for the
resulting object, and testing the reduction each of those targets.
This took a long time to run :-)
all targets in config-list.mk still build, and x86 was bootstrapped
regression tested on x86_64-pc-linux-gnu. OK for trunk?
Andrew
[-- Attachment #2: config.patch.bz2 --]
[-- Type: application/x-bzip, Size: 8919 bytes --]
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 0/3] Header file reduction.
2015-10-02 2:22 [patch 0/3] Header file reduction Andrew MacLeod
` (2 preceding siblings ...)
2015-10-02 2:33 ` [patch 1/3] Header file reduction - backend files Andrew MacLeod
@ 2015-10-05 13:55 ` Bernd Schmidt
2015-10-05 14:10 ` Richard Biener
` (2 more replies)
3 siblings, 3 replies; 65+ messages in thread
From: Bernd Schmidt @ 2015-10-05 13:55 UTC (permalink / raw)
To: Andrew MacLeod, gcc-patches
On 10/02/2015 04:22 AM, Andrew MacLeod wrote:
> The patches are generated by a pair of tools.
> * gcc-order-includes goes through the headers and canonically reorders some of our more common/troublesome headers and removes any duplicates. This includes headers which are included by other headers. (ie, obstack.h can be removed as a duplicate if bitmap.h is included already.)
> * remove-includes is the tool which tries to remove each non-conditional header file and does the real work.
Is the bitmap/obstack example really one of a change that is desirable?
I think if a file uses obstacks then an include of obstack.h is
perfectly fine, giving us freedom to e.g. change bitmaps not to use
obstacks. Given that multiple headers include obstack.h, and pretty much
everything seems to indirectly include bitmap.h anyway, maybe a better
change would be to just include it always in system.h.
> I'll have a patch shortly to add these and some other useful tools to a
> header-tools directory in contrib.
How soon? It's difficult to meaningfully comment on these patches
without looking at how they were generated. Two points:
* diff -c is somewhat unusual and I find diff -u much more readable.
* Maybe the patches for reordering and removing should be split, also
for readability and for easier future identification of problems.
Bernd
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 0/3] Header file reduction.
2015-10-05 13:55 ` [patch 0/3] Header file reduction Bernd Schmidt
@ 2015-10-05 14:10 ` Richard Biener
2015-10-05 20:10 ` Andrew MacLeod
2015-10-05 21:18 ` [patch 4/3] Header file reduction - Tools for contrib Andrew MacLeod
2 siblings, 0 replies; 65+ messages in thread
From: Richard Biener @ 2015-10-05 14:10 UTC (permalink / raw)
To: Bernd Schmidt; +Cc: Andrew MacLeod, gcc-patches
On Mon, Oct 5, 2015 at 3:27 PM, Bernd Schmidt <bschmidt@redhat.com> wrote:
> On 10/02/2015 04:22 AM, Andrew MacLeod wrote:
>>
>> The patches are generated by a pair of tools.
>> * gcc-order-includes goes through the headers and canonically reorders
>> some of our more common/troublesome headers and removes any duplicates.
>> This includes headers which are included by other headers. (ie, obstack.h
>> can be removed as a duplicate if bitmap.h is included already.)
>> * remove-includes is the tool which tries to remove each non-conditional
>> header file and does the real work.
>
>
> Is the bitmap/obstack example really one of a change that is desirable? I
> think if a file uses obstacks then an include of obstack.h is perfectly
> fine, giving us freedom to e.g. change bitmaps not to use obstacks. Given
> that multiple headers include obstack.h, and pretty much everything seems to
> indirectly include bitmap.h anyway, maybe a better change would be to just
> include it always in system.h.
Not system.h please - use coretypes.h if really necessary.
Richard.
>> I'll have a patch shortly to add these and some other useful tools to a
>> header-tools directory in contrib.
>
>
> How soon? It's difficult to meaningfully comment on these patches without
> looking at how they were generated. Two points:
> * diff -c is somewhat unusual and I find diff -u much more readable.
> * Maybe the patches for reordering and removing should be split, also
> for readability and for easier future identification of problems.
>
>
> Bernd
>
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 0/3] Header file reduction.
2015-10-05 13:55 ` [patch 0/3] Header file reduction Bernd Schmidt
2015-10-05 14:10 ` Richard Biener
@ 2015-10-05 20:10 ` Andrew MacLeod
2015-10-05 20:37 ` Bernd Schmidt
2015-10-06 21:44 ` Jeff Law
2015-10-05 21:18 ` [patch 4/3] Header file reduction - Tools for contrib Andrew MacLeod
2 siblings, 2 replies; 65+ messages in thread
From: Andrew MacLeod @ 2015-10-05 20:10 UTC (permalink / raw)
To: Bernd Schmidt, gcc-patches
On 10/05/2015 09:27 AM, Bernd Schmidt wrote:
> On 10/02/2015 04:22 AM, Andrew MacLeod wrote:
>> The patches are generated by a pair of tools.
>> * gcc-order-includes goes through the headers and canonically
>> reorders some of our more common/troublesome headers and removes any
>> duplicates. This includes headers which are included by other
>> headers. (ie, obstack.h can be removed as a duplicate if bitmap.h is
>> included already.)
>> * remove-includes is the tool which tries to remove each
>> non-conditional header file and does the real work.
>
> Is the bitmap/obstack example really one of a change that is
> desirable? I think if a file uses obstacks then an include of
> obstack.h is perfectly fine, giving us freedom to e.g. change bitmaps
> not to use obstacks. Given that multiple headers include obstack.h,
> and pretty much everything seems to indirectly include bitmap.h
> anyway, maybe a better change would be to just include it always in
> system.h.
Its just an example of the sort of redundant includes the tool removes.
And your assertion turns out to be incorrect... bitmap.h is barely used
outside the backend, thus it is included in the backend.h aggregator
(This is the only header now which includes bitmap.h... Most of this
many-month effort was to untangle all those indirect includes.)
There are only 6 remaining uses of bitmap.h in all the front end
files. ( Most files can get obstack.h from tree.h. (it comes from
symtab.h in tree-core.) If they don't get it there, it often comes from
diagnostics-core.h.)
I don't see the point in leaving redundant #includes in the source
code because of direct uses from that header in the source. I'm not
even sure how I could automate detecting that accurately.. Going
forward, If anyone ever makes a change which removes a header from an
include file, they just have to correct the fallout. heh. Thats kinda
all I've done for 4 months :-) At least we'll have grasp of the
ramifications..
>> I'll have a patch shortly to add these and some other useful tools to a
>> header-tools directory in contrib.
>
> How soon? It's difficult to meaningfully comment on these patches
> without looking at how they were generated. Two points:
within the next day, im just cobbling together some minimal
documentation. Dunno how much that will help reviewing the patches
tho. the include reduction process was described in more detail earlier
in this project.
> * diff -c is somewhat unusual and I find diff -u much more readable.
unsual? I've been using -cp for the past 2 decades and no one has ever
mentioned it before... poking around the wiki I see it mentions you
can use either -up or -cp.
I guess I could repackage things using -up... I don't even know where
my script is to change it :-). is -u what everyone uses now? no one
has mentioned it before that I am aware of.
> * Maybe the patches for reordering and removing should be split, also
> for readability and for easier future identification of problems.
>
I was trying to avoid too much churn on 550ish files... I didn't think
each one needed 2 sets of check-ins. It could be done, but it will
take a while. The reordering patch can be quickly generated, but the
reduction on all those files will take the better part of a week.
My theory is it perfectly safe to back out any single file from the
patch set if we discover it has an issue and then examine what the root
of the problem is..
tool patch coming shortly... probably tomorrow now.
Andrew
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 0/3] Header file reduction.
2015-10-05 20:10 ` Andrew MacLeod
@ 2015-10-05 20:37 ` Bernd Schmidt
2015-10-05 21:11 ` Andrew MacLeod
2015-10-06 21:44 ` Jeff Law
1 sibling, 1 reply; 65+ messages in thread
From: Bernd Schmidt @ 2015-10-05 20:37 UTC (permalink / raw)
To: Andrew MacLeod, gcc-patches
On 10/05/2015 10:10 PM, Andrew MacLeod wrote:
> Its just an example of the sort of redundant includes the tool removes.
> And your assertion turns out to be incorrect... bitmap.h is barely used
> outside the backend, thus it is included in the backend.h aggregator
> (This is the only header now which includes bitmap.h... Most of this
> many-month effort was to untangle all those indirect includes.)
I said a few headers include obstack.h, not bitmap.h, and that's true in
my (maybe a week old) checkout. My suggestion was to move the include of
the former to (as Richi corrected) coretypes.h.
And it's one example, but it does point out a problem with this sort of
automated approach: realistically no one is going to check the whole
patch, and it may contain changes that could be done better.
>> * diff -c is somewhat unusual and I find diff -u much more readable.
>
> unsual? I've been using -cp for the past 2 decades and no one has ever
> mentioned it before... poking around the wiki I see it mentions you
> can use either -up or -cp.
>
> I guess I could repackage things using -up... I don't even know where
> my script is to change it :-). is -u what everyone uses now? no one
> has mentioned it before that I am aware of.
I'm pretty much used to seeing diff -u, whenever I get a -c diff things
become harder to work out, because the region in the diff you're looking
at never tells you the full story. In this case in particular, the
existence of both reordering and removing changes makes it very hard to
mentally keep track of what's going on.
Let's take this example:
Index: attribs.c
===================================================================
*** attribs.c (revision 228331)
--- attribs.c (working copy)
*************** along with GCC; see the file COPYING3.
*** 20,36 ****
#include "config.h"
#include "system.h"
#include "coretypes.h"
! #include "tm.h"
#include "tree.h"
- #include "alias.h"
#include "stringpool.h"
#include "attribs.h"
#include "stor-layout.h"
- #include "flags.h"
- #include "diagnostic-core.h"
- #include "tm_p.h"
- #include "cpplib.h"
- #include "target.h"
#include "langhooks.h"
#include "plugin.h"
--- 20,31 ----
#include "config.h"
#include "system.h"
#include "coretypes.h"
! #include "target.h"
#include "tree.h"
#include "stringpool.h"
+ #include "diagnostic-core.h"
#include "attribs.h"
#include "stor-layout.h"
#include "langhooks.h"
#include "plugin.h"
You could be misled into thinking that diagnostic-core.h and target.h
are removed, and the algorithm confuses the issue by showing that lines
"changed" rather than getting removed or added. With a unified diff, the
size is cut in half which makes things more readable to begin with:
Index: attribs.c
===================================================================
--- attribs.c (revision 228331)
+++ attribs.c (working copy)
@@ -20,17 +20,12 @@
#include "config.h"
#include "system.h"
#include "coretypes.h"
-#include "tm.h"
+#include "target.h"
#include "tree.h"
-#include "alias.h"
#include "stringpool.h"
+#include "diagnostic-core.h"
#include "attribs.h"
#include "stor-layout.h"
-#include "flags.h"
-#include "diagnostic-core.h"
-#include "tm_p.h"
-#include "cpplib.h"
-#include "target.h"
#include "langhooks.h"
#include "plugin.h"
and with things closer together it's easier to follow what's actually
going on. This is a smaller example, many instances in your patch are
actually about a page long and you have to scroll back and forth to work
things out, getting confused because everything in the 410k of text
looks the same.
This was actually one of the reasons I proposed splitting the patch into
reordering and removal phases, it would alleviate the diff -c disadvantages.
> the reduction on all those files will take the better part of a week.
That's a little concerning due to the possibility of intervening
commits. I'd like to make one requirement for checkin, that you take the
revision at which you're committing and then run the script again,
verifying that the process produces the same changes as the patch you
committed. (Or do things in smaller chunks.).
Bernd
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 0/3] Header file reduction.
2015-10-05 20:37 ` Bernd Schmidt
@ 2015-10-05 21:11 ` Andrew MacLeod
2015-10-06 3:03 ` [patch 0/3] Header file reduction. - unified patches Andrew MacLeod
2015-10-06 21:55 ` [patch 0/3] Header file reduction Jeff Law
0 siblings, 2 replies; 65+ messages in thread
From: Andrew MacLeod @ 2015-10-05 21:11 UTC (permalink / raw)
To: Bernd Schmidt, gcc-patches
On 10/05/2015 04:37 PM, Bernd Schmidt wrote:
> On 10/05/2015 10:10 PM, Andrew MacLeod wrote:
>> Its just an example of the sort of redundant includes the tool removes.
>> And your assertion turns out to be incorrect... bitmap.h is barely used
>> outside the backend, thus it is included in the backend.h aggregator
>> (This is the only header now which includes bitmap.h... Most of this
>> many-month effort was to untangle all those indirect includes.)
>
> I said a few headers include obstack.h, not bitmap.h, and that's true
> in my (maybe a week old) checkout. My suggestion was to move the
> include of the former to (as Richi corrected) coretypes.h.
>
Ah, sorry the parsing was non-deterministic and I parsed it the other
way. My comments refer to the "true" dependencies in the code after
all the un-needed headers have been trimmed out... i've little doubt
mainline shows obstack.h and bitmap.h being included everywhere.
In any case, a direct include of obstack.h in coretypes.h was considered
earlier in the aggregation process and it didn't show up as something
that would be a win. It is included a couple of common places that we
have no control over.. in particular libcpp/include/symtab.h includes
obstack.h and is included by tree-core.h. A very significant number of
files bring that in. If we included obstack.h in coretypes.h then those
files would be including it again for a second time for no particularly
good reason. So I made the judgement call to not put it in coretypes.h.
> And it's one example, but it does point out a problem with this sort
> of automated approach: realistically no one is going to check the
> whole patch, and it may contain changes that could be done better.
The point being that the aggregation *wasn't* automated... and has
nothing to do with this patch set. I analyzed and preformed all that
sort of thing earlier. Sure judgment calls were made, but it wasn't
automated in the slightest. There are certainly further aggregation
improvements that could be made... and either I or someone else could do
more down the road., The heavy lifting has all been done now.
So the *only* thing that is automated is removing include files which
are not needed so that we can get an idea of what the true dependencies
in the source base are.
>>> * diff -c is somewhat unusual and I find diff -u much more readable.
>>
>> unsual? I've been using -cp for the past 2 decades and no one has ever
>> mentioned it before... poking around the wiki I see it mentions you
>> can use either -up or -cp.
>>
>> I guess I could repackage things using -up... I don't even know where
>> my script is to change it :-). is -u what everyone uses now? no one
>> has mentioned it before that I am aware of.
>
> I'm pretty much used to seeing diff -u, whenever I get a -c diff
> things become harder to work out, because the region in the diff
> you're looking at never tells you the full story. In this case in
> particular, the existence of both reordering and removing changes
> makes it very hard to mentally keep track of what's going on.
>
I can switch to -u.. I've just never seen the request before.
I can regenerate the patches with -u if you want.
>
>> the reduction on all those files will take the better part of a week.
>
> That's a little concerning due to the possibility of intervening
> commits. I'd like to make one requirement for checkin, that you take
> the revision at which you're committing and then run the script again,
> verifying that the process produces the same changes as the patch you
> committed. (Or do things in smaller chunks.).
>
Well, sure there are intervening commits.. the only ones that actually
matter are the ones which fail to compile because someone made a code
change which now requires a header that wasn't needed before. which is
really a state we're looking for I think. I fix those up before
committing. Its *possible* a conditional compilation issue could creep
in, but highly unlikely.
I can rerun everything on that revision from just before I committed and
see if everything is the same. It'll take a week to find out :-) but
that seems like a reasonable sanity check.
Andrew
^ permalink raw reply [flat|nested] 65+ messages in thread
* [patch 4/3] Header file reduction - Tools for contrib
2015-10-05 13:55 ` [patch 0/3] Header file reduction Bernd Schmidt
2015-10-05 14:10 ` Richard Biener
2015-10-05 20:10 ` Andrew MacLeod
@ 2015-10-05 21:18 ` Andrew MacLeod
2015-10-06 10:27 ` Bernd Schmidt
2 siblings, 1 reply; 65+ messages in thread
From: Andrew MacLeod @ 2015-10-05 21:18 UTC (permalink / raw)
To: Bernd Schmidt, gcc-patches
[-- Attachment #1: Type: text/plain, Size: 1633 bytes --]
Here's the patch to add all the tools to contrib/headers.
There are 9 tools I used over the run of the project. They were
developed in various stages and iterations, but I tried to at least have
some common interface things, and I tried some cleaning up and
documentation. No commenting on the quality of python code... :-) I was
learning python on the fly. Im sure some things are QUITE awful.,
There is a readme file which gives a common use cases for each tool
Some of the tools are for analysis, aggregation, or flattening, some for
visualization, and some are for the include reduction. I would have just
filed them away somewhere, but Jeff suggested I contribute them in case
someone wants to do something with them down the road... which
presumably also includes me :-) Less chance of losing them this way.
They need more polishing, but I'm tired of looking at them. I will
return to them down the road and see about cleaning them up a bit more.
They still aren't perfect by any means, but should do their job safely.
when used properly. Comments in the code vary from good to absent,
depending on how irritable I was at the time I was working on itl
I will soon also provide a modified config-list.mk which still works
like the current one, but allows for easy overrides of certain things
the include reducer requires.. until now I've just made a copy of
config-list.mk and modified it for my own means.
The 2 tools for include reduction are gcc-order-headers and
reduce-headers
what the process/conditions for checking things into contrib? I've
never had to do it before :-)
Andrew
[-- Attachment #2: contrib.patch.bz2 --]
[-- Type: application/x-bzip, Size: 18444 bytes --]
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 0/3] Header file reduction. - unified patches
2015-10-05 21:11 ` Andrew MacLeod
@ 2015-10-06 3:03 ` Andrew MacLeod
2015-10-06 21:55 ` [patch 0/3] Header file reduction Jeff Law
1 sibling, 0 replies; 65+ messages in thread
From: Andrew MacLeod @ 2015-10-06 3:03 UTC (permalink / raw)
To: gcc-patches, Bernd Schmidt
[-- Attachment #1: Type: text/plain, Size: 306 bytes --]
On 10/05/2015 05:11 PM, Andrew MacLeod wrote:
>
> I can switch to -u.. I've just never seen the request before.
>
> I can regenerate the patches with -u if you want.
You are right, the patches are significantly easier to read with -u..
I've changed my svn diff script. here's all 3 patches:
Andrew
[-- Attachment #2: backend.patch.bz2 --]
[-- Type: application/x-bzip, Size: 19751 bytes --]
[-- Attachment #3: FE.patch.bz2 --]
[-- Type: application/x-bzip, Size: 9513 bytes --]
[-- Attachment #4: config.patch.bz2 --]
[-- Type: application/x-bzip, Size: 7709 bytes --]
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 4/3] Header file reduction - Tools for contrib
2015-10-05 21:18 ` [patch 4/3] Header file reduction - Tools for contrib Andrew MacLeod
@ 2015-10-06 10:27 ` Bernd Schmidt
2015-10-06 12:02 ` Bernd Schmidt
0 siblings, 1 reply; 65+ messages in thread
From: Bernd Schmidt @ 2015-10-06 10:27 UTC (permalink / raw)
To: Andrew MacLeod, Bernd Schmidt, gcc-patches
[-- Attachment #1: Type: text/plain, Size: 1928 bytes --]
On 10/05/2015 11:18 PM, Andrew MacLeod wrote:
> Here's the patch to add all the tools to contrib/headers.
Small patches should not be sent in compressed form, it makes reading
and quoting them harder. This message is only intended to contain the
patch in plain text so that I can quote it in further replies.
> There are 9 tools I used over the run of the project. They were
> developed in various stages and iterations, but I tried to at least have
> some common interface things, and I tried some cleaning up and
> documentation. No commenting on the quality of python code... :-) I was
> learning python on the fly. Im sure some things are QUITE awful.,
>
> There is a readme file which gives a common use cases for each tool
>
> Some of the tools are for analysis, aggregation, or flattening, some for
> visualization, and some are for the include reduction. I would have just
> filed them away somewhere, but Jeff suggested I contribute them in case
> someone wants to do something with them down the road... which
> presumably also includes me :-) Less chance of losing them this way.
>
> They need more polishing, but I'm tired of looking at them. I will
> return to them down the road and see about cleaning them up a bit more.
> They still aren't perfect by any means, but should do their job safely.
> when used properly. Comments in the code vary from good to absent,
> depending on how irritable I was at the time I was working on itl
>
> I will soon also provide a modified config-list.mk which still works
> like the current one, but allows for easy overrides of certain things
> the include reducer requires.. until now I've just made a copy of
> config-list.mk and modified it for my own means.
>
> The 2 tools for include reduction are gcc-order-headers and
> reduce-headers
>
> what the process/conditions for checking things into contrib? I've
> never had to do it before :-)
>
> Andrew
>
[-- Warning: decoded text below may be mangled, UTF-8 assumed --]
[-- Attachment #2: contrib.patch --]
[-- Type: text/x-patch; name="contrib.patch", Size: 77046 bytes --]
Index: contrib/headers/ChangeLog
===================================================================
*** contrib/headers/ChangeLog (revision 0)
--- contrib/headers/ChangeLog (working copy)
***************
*** 0 ****
--- 1,12 ----
+ 2015-10-06 Andrew MacLeod <amacleod@redhat.com>
+
+ * README : New File.
+ * count-headers : New File.
+ * gcc-order-headers : New File.
+ * graph-header-logs : New File.
+ * graph-include-web : New File.
+ * headerutils.py : New File.
+ * included-by : New File.
+ * reduce-headers : New File.
+ * replace-header : New File.
+ * show-headers : New File.
Index: contrib/headers/README
===================================================================
*** contrib/headers/README (revision 0)
--- contrib/headers/README (working copy)
***************
*** 0 ****
--- 1,282 ----
+ Quick start documentation for the header file utilities.
+
+ This isn't a full breakdown of the tools, just they typical use scenarios.
+
+ - Each tool accepts -h to show its usage. usually no parameters will also
+ trigger the help message. Help may specify additonal functionality to what is
+ listed here.
+
+ - For *all* tools, option format for specifying filenames must have no spaces
+ between the option and filename.
+ ie.: tool -lfilename.h target.h
+
+ - Many of the tools are required to be run from the core gcc source directory
+ containing coretypes.h typically that is in gcc/gcc from a source checkout.
+ For these tools to work on files not in this directory, their path needs to be
+ specified on the command line,
+ ie.: tool c/c-decl.c lto/lto.c
+
+ - options can be intermixed with filenames anywhere on the command line
+ ie. tool ssa.h rtl.h -a is equivalent to
+ tool ssa.h -a rtl.h
+
+
+
+
+
+ gcc-order-headers
+ -----------------
+ This will reorder any primary backend headers files into a canonical order
+ which will resolve any hidden dependencies they may have. Any unknown
+ headers will simply be occur after the recognized core files, and retain the
+ same relative ordering they had.
+
+ Must be run in the core gcc source directory
+
+ simply execute the command listing any files you wish to process on the
+ command line.
+
+ Any files which are changed are output, and the original is saved with a
+ .bak extention.
+
+ ex.: gcc-order-headers tree-ssa.c c/c-decl.c
+
+ -s will list all of the known headers in their canonical order. It does not
+ show which of those headers include other headers, just the final canonical
+ ordering.
+
+ if any header files are included within a conditional code block, the tool
+ will issue a message and not change the file. When this happens, you can
+ manually inspect the file, and if reorder it will be fine, rerun the command
+ with -i on the files. This will ignore the conditional error condition
+ and perform the re-ordering anyway.
+
+ If any #include line has the beginning of a multi-line comment, it will also
+ refuse to process the file until that is resolved.
+
+
+
+
+ show-headers
+ ------------
+ This will show the include structure for any given file. Each level of nesting
+ is indented, and when any duplicate headers are seen, they have their
+ duplicate number shown
+
+ -i may be used to specify alternate search directories for headers to parse.
+
+ Must be run in the core gcc source directory
+
+ ex.: show-headers -i../../build/gcc -i../libcpp tree-ssa.c
+ tree-ssa.c
+ config.h
+ auto-host.h
+ ansidecl.h (1)
+ system.h
+ safe-ctype.h
+ filenames.h
+ hashtab.h (1)
+ ansidecl.h (2)
+ libiberty.h
+ ansidecl.h (3)
+ hwint.h
+ coretypes.h
+ machmode.h (1)
+ insn-modes.h (1)
+ signop.h
+ <...>
+
+
+
+
+ count-headers
+ -------------
+ simply count all the headers found in the specified files. A summary is
+ printed showing occurrences from high to low.
+
+ ex.: count-headers tree*.c
+ 86 : coretypes.h
+ 86 : config.h
+ 86 : system.h
+ 86 : tree.h
+ 82 : backend.h
+ 80 : gimple.h
+ 72 : gimple-iterator.h
+ 70 : ssa.h
+ 68 : fold-const.h
+ <...>
+
+
+
+ included-by
+ -----------
+ This tool will search all the .c,.cc and .h files and output a list of files
+ which include the specified header(s).
+
+ It does a 4 level deep find of all source files from the current directory
+ and look in each of those for a #include of the specified headers. So expect
+ a little bit of slowness.
+
+ -i limits the search to only other header files.
+ -c limits the search to .c and .cc files.
+ -a shows only source files which include *all* specified headers.
+ -f allows you to specify a file which contains a list of source files to
+ check rather than performing the much slower find command.
+
+ ex: included-by tree-vectorizer.h
+ config/aarch64/aarch64.c
+ config/i386/i386.c
+ config/rs6000/rs6000.c
+ tree-loop-distribution.c
+ tree-parloops.c
+ tree-ssa-loop-ivopts.c
+ tree-ssa-loop.c
+
+
+
+
+ replace-header
+ --------------
+ This tool simply replaces a single header file with one or more other headers.
+ -r specifies the include to replace, and one or more -f options specify the
+ replacement headers, in the order they occur.
+
+ This is commonly used in conjunction with 'included-by' to change all
+ occurrences of a header file to something else, or to insert new headers
+ before or after.
+
+ ex: to insert #include "before.h" before every occurence of tree.h in all
+ .c and .cc source files:
+
+ replace-header -rtree.h -fbefore.h -ftree.h `included-by -c tree.h`
+
+
+
+
+ reduce-headers
+ --------------
+
+ This tool removes any header files which are not needed from a source file.
+
+ This tool must be run for the core gcc source directory, and requires either
+ a native build and sometimes target builds, depending on what you are trying
+ to reduce.
+
+ it is good practive to run 'gcc-order-headers' on a source file before trying
+ to reduce it. This removes duplicates and performs some simplifications
+ which reduce the chances of the reduction tool missing things.
+
+ start with a completely bootstrapped native compiler.
+
+ Any desired target builds should be built in one directory using a modified
+ config-list.mk file which doesnt delete the build directory when its done.
+ any target directories which do not successfully complete a 'make all-gcc'
+ may cause the tool to not reduce anything.
+ (todo - provide a config-list.mk that leaves successful target builds, but
+ deletes ones which do not compile)
+
+ The tool will examine all the target builds to determine which targets build
+ the file, and include those targets in the testing.
+
+
+
+ The tool will analyze a source file and attempt to remove each non-conditional
+ header from last to first in the file.:
+ It will first attempt to build the native all-gcc target.
+ If that succeeds, it will attempt to build any target build .o files
+ If that suceeds, it will check to see if there are any conditional
+ compilation dependencies between this header file and the source file or
+ any header whihch have already been determined as non-removable.
+ If all these tests are passed, the header file is determined to be removable
+ and is removed from the source file.
+ This continues until all headers have been checked.
+ At this point, the a bootstrap is attempted in the native build, and if that
+ passes the file is considered reduced.
+
+ Any files from the config subdirectory require target builds to be present
+ in order to proceed.
+
+ A small subset of targets has been determined to provide excellent coverage,
+ at least as of Aug 31/15 . A fullset of targets reduced all of the files
+ making up libbackend.a. All of the features which requires target testing
+ were found to be triggered by one or more of these targets. They are
+ actually known to the tool, and when checkiong target, it will check those
+ targets first, then the rest. It is mostly safe to do a reduction with just
+ these targets, at least until some new whacky target comes along.
+ building config-list.mk with :
+ LIST="aarch64-linux-gnu arm-netbsdelf avr-rtems c6x-elf epiphany-elf hppa2.0-hpux10.1 i686-mingw32crt i686-pc-msdosdjgpp mipsel-elf powerpc-eabisimaltivec rs6000-ibm-aix5.1.0 sh-superh-elf sparc64-elf spu-elf"
+
+ -b specifies the native bootstrapped build root directory
+ -t specifies a target build root directory that config-list.mk was run from
+ -f is used to limit the headers for consideration.
+
+ example:
+
+ mkdir gcc // checkout gcc in subdir gcc
+ mdsir build // boostrap gcc in subdir build
+ mkdir target // create target directory and run config-list.mk
+ cd gcc/gcc
+
+ reduce-headers -b../../build -t../../targets -falias.h -fexpr.h tree*.c (1)
+ # This will attempt to remove only alias.h and expr.h from tree*.c
+
+ reduce-headers -b../../build -t../../targets tree-ssa-live.c
+ # This will attempt to remove all header files from tree-ssa-live.c
+
+
+ the tool will generate a number of log files:
+
+ reduce-headers.log : All the compilation failure output that tool tried.
+ reduce-headers.sum : One line summary of what happened to each source file.
+
+ (All the remaining logs are appended to, so if you run the tool multiple times
+ these files are just added to. You must physically remove them yourself.)
+
+ reduce-headers-kept.log: List of all the successful compiles that were
+ ignored because of conditional macro dependencies
+ and why it thinks taht is the case
+ $src.c.log : for each failed header removal, the compilation
+ messages as to why it failed.
+ $header.h.log: The same log is put into the relevent header log as well.
+
+
+ a sample output from ira.c.log:
+
+ Compilation failed:
+ for shrink-wrap.h:
+
+ ============================================
+ /gcc/2015-09-09/gcc/gcc/ira.c: In function âbool split_live_ranges_for_shrink_wrap()â:
+ /gcc/2015-09-09/gcc/gcc/ira.c:4839:8: error: âSHRINK_WRAPPING_ENABLEDâ was not declared in this scope
+ if (!SHRINK_WRAPPING_ENABLED)
+ ^
+ make: *** [ira.o] Error 1
+
+
+ the same message would be put into shrink-wrap.h.log.
+
+
+
+ graph-header-logs
+ -----------------
+ This tool will parse all the messages from the .C files, looking for failures
+ that show up in other headers... meaning there is a compilation dependency
+ between the 2 header files.
+
+ The tool will aggregate all these and generate a graph of the dependencies
+ exposed during compilation. red lines indicate dependecies that are
+ presednt because a head file physically includes another header. Black lines
+ represent data dependencies causing compilation if the header isnt present.
+
+ ex.: graph-header-logs *.c.log
+
+
+
+ graph-include-web
+ -----------------
+ This tool can be used to visualize the include structure in files. It is
+ rapidly turned useless if you specify too many things, but it can be
+ useful for finding cycles and redundancies, or simply to see what a single
+ file looks like.
+
+ ex.: graph-include-web tree.c
Index: contrib/headers/count-headers
===================================================================
*** contrib/headers/count-headers (revision 0)
--- contrib/headers/count-headers (working copy)
***************
*** 0 ****
--- 1,63 ----
+ #! /usr/bin/python2
+ import os.path
+ import sys
+ import shlex
+ import re
+
+ from headerutils import *
+
+
+
+ usage = False
+ src = list()
+ flist = { }
+ process_h = True
+ process_c = True
+ verbose = False
+ all_inc = True
+ level = 0
+
+ only_use_list = list()
+
+
+
+ for x in sys.argv[1:]:
+ if x[0:2] == "-h":
+ usage = True
+ else:
+ src.append (x)
+
+
+ if not usage and len(src) > 0:
+
+ incl = { }
+ for fn in src:
+ src = readwholefile (fn)
+ dup = { }
+ for line in src:
+ d = find_pound_include (line, True, True)
+ if d != "" and d[-2:] ==".h":
+ if dup.get(d) == None:
+ if incl.get(d) == None:
+ incl[d] = 1
+ else:
+ incl[d] = incl[d]+ 1
+ dup[d] = 1
+
+ l = list()
+ for i in incl:
+ l.append ((incl[i], i))
+ l.sort(key=lambda tup:tup[0], reverse=True)
+
+ for f in l:
+ print str(f[0]) + " : " + f[1]
+
+ else:
+ print "count-headers file1 [filen]"
+ print "Count the number of occurrences of all includes across all listed files"
+
+
+
+
+
+
Property changes on: contrib/headers/count-headers
___________________________________________________________________
Added: svn:executable
## -0,0 +1 ##
+*
\ No newline at end of property
Index: contrib/headers/gcc-order-headers
===================================================================
*** contrib/headers/gcc-order-headers (revision 0)
--- contrib/headers/gcc-order-headers (working copy)
***************
*** 0 ****
--- 1,366 ----
+ #! /usr/bin/python2
+ import os
+ import sys
+ import shlex
+ import re
+
+ from headerutils import *
+ import Queue
+
+ file_list = list ()
+ usage = False
+
+ ignore_conditional = False
+
+ order = [
+ "system.h",
+ "coretypes.h",
+ "backend.h",
+ "target.h",
+ "rtl.h",
+ "c-family/c-target.h",
+ "c-family/c-target-def.h",
+ "tree.h",
+ "cp/cp-tree.h",
+ "c-family/c-common.h", # these must come before diagnostic.h
+ "c/c-tree.h",
+ "fortran/gfortran.h",
+ "gimple.h",
+ "cfghooks.h",
+ "df.h",
+ "tm_p.h",
+ "gimple-iterators.h",
+ "ssa.h",
+ "expmed.h",
+ "optabs.h",
+ "regs.h",
+ "ira.h",
+ "ira-int.h",
+ "gimple-streamer.h"
+
+ ]
+
+ exclude_special = [ "bversion.h", "obstack.h", "insn-codes.h", "hooks.h" ]
+ includes = { }
+ dups = { }
+ exclude_processing = [ "tree-vectorizer.h" , "c-target.h", "c-target-def.h", "cp-tree.h", "c-common.h", "c-tree.h", "gfortran.h" ]
+
+ master_list = list()
+ # where include file comes from in src
+ h_from = { }
+
+ # create the master ordering list... this is the desired order of headers
+ def create_master_list (fn, verbose):
+ if fn not in exclude_processing:
+ for x in includes[fn][1]:
+ create_master_list (x, verbose)
+ if not fn in master_list:
+ # Don't put diagnostic*.h into the ordering list, its special since
+ # various front ends have to set GCC_DIAG_STYLE before including it
+ # for each file, we'll tailor where it belongs by looking at the dup
+ # list and seeing which file is included, and position it appropriately.
+ if fn != "diagnostic.h" and fn != "diagnostic-core.h":
+ master_list.append (fn)
+ if (verbose):
+ print fn + " included by: " + includes[fn][0]
+
+
+
+ def print_dups ():
+ if dups:
+ print "\nduplicated includes"
+ for i in dups:
+ string = "dup : " + i + " : "
+ string += includes[i][0]
+ for i2 in dups[i]:
+ string += ", "+i2
+ print string
+
+
+ def process_known_dups ():
+ # rtl.h gets tagged as a duplicate includer for all of coretypes, but thats
+ # really for only generator files
+ rtl_remove = includes["coretypes.h"][1] + ["statistics.h", "vec.h"]
+ for i in rtl_remove:
+ if dups[i] and "rtl.h" in dups[i]:
+ dups[i].remove("rtl.h")
+ if not dups[i]:
+ dups.pop (i, None)
+
+ # make sure diagnostic.h is the owner of diagnostic-core.h
+ if includes["diagnostic-core.h"][0] != "diagnostic.h":
+ dups["diagnostic-core.h"].append (includes["diagnostic-core.h"][0])
+ includes["diagnostic-core.h"] = ("diagnostic.h", includes["diagnostic-core.h"][1])
+
+ def indirectly_included (header, header_list):
+ nm = os.path.basename (header)
+ while nm and includes.get(nm):
+ if includes[nm][0] in header_list:
+ return includes[nm][0]
+ nm = includes[nm][0]
+
+ if header == "diagnostic-core.h":
+ if dups.get("diagnostic-core.h"):
+ for f in dups["diagnostic-core.h"]:
+ if f in header_list:
+ return f
+ else:
+ if header in header_list:
+ return header
+ # Now check if diagnostics is included indirectly anywhere
+ header = "diagnostic.h"
+
+ if header == "diagnostic.h":
+ if dups.get("diagnostic.h"):
+ for f in dups["diagnostic.h"]:
+ if f in header_list:
+ return f
+ else:
+ if header in header_list:
+ return header
+
+ return ""
+
+
+ def get_new_order (src_h, desired_order):
+ new_order = list()
+ for h in desired_order:
+ if h in master_list:
+ # find what included this
+ iclist = list()
+ ib = includes[h][0]
+ while ib:
+ iclist.insert(0, ib)
+ ib = includes[ib][0]
+ if iclist:
+ for x in iclist:
+ if x in src_h and x not in exclude_processing:
+ if x not in new_order and x[:10] != "diagnostic" and h not in exclude_special:
+ new_order.append (x)
+ break;
+ else:
+ if h not in new_order:
+ new_order.append (h)
+
+ f = ""
+ if "diagnostic.h" in src_h:
+ f = "diagnostic.h"
+ elif "diagnostic-core.h" in src_h:
+ f = "diagnostic-core.h"
+
+
+ if f:
+ ii = indirectly_included (f, src_h)
+ if not ii or ii == f:
+ new_order.append (f)
+
+ return new_order
+
+
+
+ # stack of files to process
+ process_stack = list()
+
+ def process_one (info):
+ i = info[0]
+ owner = info[1]
+ name = os.path.basename(i)
+ if os.path.exists (i):
+ if includes.get(name) == None:
+ l = find_unique_include_list (i)
+ # create a list which has just basenames in it
+ new_list = list()
+ for x in l:
+ new_list.append (os.path.basename (x))
+ process_stack.append((x, name))
+ includes[name] = (owner, new_list)
+ elif owner:
+ if dups.get(name) == None:
+ dups[name] = [ owner ]
+ else:
+ dups[name].append (owner)
+ else:
+ # seed tm.h with options.h since its a build file and won't be seen.
+ if not includes.get(name):
+ if name == "tm.h":
+ includes[name] = (owner, [ "options.h" ])
+ includes["options.h"] = ("tm.h", list())
+ else:
+ includes[name] = (owner, list())
+
+
+ show_master = False
+
+ for arg in sys.argv[1:]:
+ if arg[0:1] == "-":
+ if arg[0:2] == "-h":
+ usage = True
+ elif arg[0:2] == "-i":
+ ignore_conditional = True
+ elif arg[0:2] == "-s":
+ show_master = True
+ else:
+ print "Error: unrecognized option " + arg
+ elif os.path.exists(arg):
+ file_list.append (arg)
+ else:
+ print "Error: file " + arg + " Does not exist."
+ usage = True
+
+ if not file_list and not show_master:
+ usage = True
+
+ if not usage and not os.path.exists ("coretypes.h"):
+ usage = True
+ print "Error: Must run command in main gcc source directory containing coretypes.h\n"
+
+ # process diagnostic.h first.. it's special since GCC_DIAG_STYLE can be
+ # overridden by languages, but must be done so by a file included BEFORE it.
+ # so make sure it isn't seen as inclujded by one of those files by making it
+ # appear to be included by the src file.
+ process_stack.insert (0, ("diagnostic.h", ""))
+
+ # Add the list of files in reverse order since it is processed as a stack later
+ for i in order:
+ process_stack.insert (0, (i, "") )
+
+ # build up the library of what header files include what other files.
+ while process_stack:
+ info = process_stack.pop ()
+ process_one (info)
+
+ # Now crate the master ordering list
+ for i in order:
+ create_master_list (os.path.basename (i), False)
+
+ # handle warts in the duplicate list
+ process_known_dups ()
+ desired_order = master_list
+
+ if show_master:
+ print " Canonical order of gcc include files: "
+ for x in master_list:
+ print x
+ print " "
+
+ if usage:
+ print "gcc-order-headers [-i] [-s] file1 [filen]"
+ print " Ensures gcc's headers files are included in a normalized form with"
+ print " redundant headers removed. The original files are saved in filename.bak"
+ print " Outputs a list of files which changed."
+ print " -i ignore conditional compilation."
+ print " Use after examining the file to be sure includes within #ifs are safe"
+ print " Any headers within conditional sections will be ignored."
+ print " -s Show the cananoical order of known includes"
+ sys.exit(0)
+
+
+ didnt_do = list ()
+
+ for fn in file_list:
+ nest = 0
+ src_h = list()
+ src_line = { }
+
+ master_list = list()
+ includes = { }
+ dups = { }
+
+ iinfo = process_ii_src (fn)
+ src = ii_src (iinfo)
+ include_list = ii_include_list (iinfo)
+
+ if ii_include_list_cond (iinfo):
+ if not ignore_conditional:
+ print fn + ": Cannot process due to conditional compilation of includes"
+ didnt_do.append (fn)
+ src = list ()
+
+ if not src:
+ continue
+
+ process_stack = list()
+ # prime the stack with headers in the main ordering list so we get them in
+ # this order.
+ for d in order:
+ if d in include_list:
+ process_stack.insert (0, (d, ""))
+
+ for d in include_list:
+ nm = os.path.basename(d)
+ src_h.append (nm)
+ iname = d
+ iname2 = os.path.dirname (fn) + "/" + d
+ if not os.path.exists (d) and os.path.exists (iname2):
+ iname = iname2
+ if iname not in process_stack:
+ process_stack.insert (0, (iname, ""))
+ src_line[nm] = ii_src_line(iinfo)[d]
+ if src_line[nm].find("/*") != -1 and src_line[nm].find("*/") == -1:
+ # this means we have a multi line comment, abort!'
+ print fn + ": Cannot process due to a multi-line comment :"
+ print " " + src_line[nm]
+ if fn not in didnt_do:
+ didnt_do.append (fn)
+ src = list ()
+
+ if not src:
+ continue
+
+ # Now create the list of includes as seen by the source file.
+ while process_stack:
+ info = process_stack.pop ()
+ process_one (info)
+
+ for i in include_list:
+ create_master_list (os.path.basename (i), False)
+
+ new_src = list()
+ header_added = list()
+ new_order = list()
+ for line in src:
+ d = find_pound_include (line, True, True)
+ if not d or d[-2:] != ".h":
+ new_src.append (line)
+ else:
+ if d == order[0] and not new_order:
+ new_order = get_new_order (src_h, desired_order)
+ for i in new_order:
+ new_src.append (src_line[i])
+ # if not seen, add it.
+ if i not in header_added:
+ header_added.append (i)
+ else:
+ nm = os.path.basename(d)
+ if nm not in header_added:
+ iby = indirectly_included (nm, src_h)
+ if not iby:
+ new_src.append (line)
+ header_added.append (nm)
+
+ if src != new_src:
+ os.rename (fn, fn + ".bak")
+ fl = open(fn,"w")
+ for line in new_src:
+ fl.write (line)
+ fl.close ()
+ print fn
+
+
+ if didnt_do:
+ print "\n\n Did not process the following files due to conditional dependencies:"
+ str = ""
+ for x in didnt_do:
+ str += x + " "
+ print str
+ print "\n"
+ print "Please examine to see if they are safe to process, and re-try with -i. "
+ print "Safeness is determined by checking whether any of the reordered headers are"
+ print "within a conditional and could be hauled out of the conditional, thus changing"
+ print "what the compiler will see."
+ print "Multi-line comments after a #include can also cause failuer, they must be turned"
+ print "into single line comments or removed."
+
+
+
+
Property changes on: contrib/headers/gcc-order-headers
___________________________________________________________________
Added: svn:executable
## -0,0 +1 ##
+*
\ No newline at end of property
Index: contrib/headers/graph-header-logs
===================================================================
*** contrib/headers/graph-header-logs (revision 0)
--- contrib/headers/graph-header-logs (working copy)
***************
*** 0 ****
--- 1,226 ----
+ #! /usr/bin/python2
+ import os.path
+ import sys
+ import shlex
+ import re
+
+ from headerutils import *
+
+ header_roots = { }
+ extra_edges = list()
+ verbose = False
+ verbosity = 0
+ nodes = list()
+
+ def unpretty (name):
+ if name[-2:] == "_h":
+ name = name[:-2] + ".h"
+ return name.replace("_", "-")
+
+ def pretty_name (name):
+ name = os.path.basename (name)
+ return name.replace(".","_").replace("-","_").replace("/","_").replace("+","_");
+
+ depstring = ("In file included from", " from")
+
+ ignore = [ "coretypes_h",
+ "machmode_h",
+ "signop_h",
+ "wide_int_h",
+ "double_int_h",
+ "real_h",
+ "fixed_value_h",
+ "hash_table_h",
+ "statistics_h",
+ "ggc_h",
+ "vec_h",
+ "hashtab_h",
+ "inchash_h",
+ "mem_stats_traits_h",
+ "hash_map_traits_h",
+ "mem_stats_h",
+ "hash_map_h",
+ "hash_set_h",
+ "input_h",
+ "line_map_h",
+ "is_a_h",
+ "system_h",
+ "config_h" ]
+
+ def process_log_file (header, logfile):
+ if header_roots.get (header) != None:
+ print "Error: already processed log file: " + header + ".log"
+ return
+ hname = pretty_name (header)
+ header_roots[hname] = { }
+
+ sline = list();
+ incfrom = list()
+ newinc = True
+ for line in logfile:
+ if len (line) > 21 and line[:21] in depstring:
+ if newinc:
+ incfrom = list()
+ newinc = False
+ fn = re.findall(ur".*/(.*?):", line)
+ if len(fn) != 1:
+ continue
+ if fn[0][-2:] != ".h":
+ continue
+ n = pretty_name (fn[0])
+ if n not in ignore:
+ incfrom.append (n)
+ continue
+ newinc = True
+ note = re.findall (ur"^.*note: (.*)", line)
+ if len(note) > 0:
+ sline.append (("note", note[0]))
+ else:
+ err_msg = re.findall (ur"^.*: error: (.*)", line)
+ if len(err_msg) == 1:
+ msg = err_msg[0]
+ if (len (re.findall("error: forward declaration", line))) != 0:
+ continue
+ path = re.findall (ur"^(.*?):.*error: ", line)
+ if len(path) != 1:
+ continue
+ if path[0][-2:] != ".h":
+ continue
+ fname = pretty_name (path[0])
+ if fname in ignore or fname[0:3] == "gt_":
+ continue
+ sline.append (("error", msg, fname, incfrom))
+
+ print str(len(sline)) + " lines to process"
+ lastline = "note"
+ for line in sline:
+ if line[0] != "note" and lastline[0] == "error":
+ fname = lastline[2]
+ msg = lastline[1]
+ incfrom = lastline[3]
+ string = ""
+ ofname = fname
+ if len(incfrom) != 0:
+ for t in incfrom:
+ string = string + t + " : "
+ ee = (fname, t)
+ if ee not in extra_edges:
+ extra_edges.append (ee)
+ fname = t
+ print string
+
+ if hname not in nodes:
+ nodes.append(hname)
+ if fname not in nodes:
+ nodes.append (ofname)
+ for y in incfrom:
+ if y not in nodes:
+ nodes.append (y)
+
+
+ if header_roots[hname].get(fname) == None:
+ header_roots[hname][fname] = list()
+ if msg not in header_roots[hname][fname]:
+ print string + ofname + " : " +msg
+ header_roots[hname][fname].append (msg)
+ lastline = line;
+
+
+ dotname = "graph.dot"
+ graphname = "graph.png"
+
+
+ def build_dot_file (file_list):
+ output = open(dotname, "w")
+ output.write ("digraph incweb {\n");
+ for x in file_list:
+ if os.path.exists (x) and x[-4:] == ".log":
+ header = x[:-4]
+ logfile = open(x).read().splitlines()
+ process_log_file (header, logfile)
+ elif os.path.exists (x + ".log"):
+ logfile = open(x + ".log").read().splitlines()
+ process_log_file (x, logfile)
+
+ for n in nodes:
+ fn = unpretty(n)
+ label = n + " [ label = \"" + fn + "\" ];"
+ output.write (label + "\n")
+ if os.path.exists (fn):
+ h = open(fn).read().splitlines()
+ for l in h:
+ t = find_pound_include (l, True, False)
+ if t != "":
+ t = pretty_name (t)
+ if t in ignore or t[-2:] != "_h":
+ continue
+ if t not in nodes:
+ nodes.append (t)
+ ee = (t, n)
+ if ee not in extra_edges:
+ extra_edges.append (ee)
+
+ depcount = list()
+ for h in header_roots:
+ for dep in header_roots[h]:
+ label = " [ label = "+ str(len(header_roots[h][dep])) + " ];"
+ string = h + " -> " + dep + label
+ output.write (string + "\n");
+ if verbose:
+ depcount.append ((h, dep, len(header_roots[h][dep])))
+
+ for ee in extra_edges:
+ string = ee[0] + " -> " + ee[1] + "[ color=red ];"
+ output.write (string + "\n");
+
+
+ if verbose:
+ depcount.sort(key=lambda tup:tup[2])
+ for x in depcount:
+ print " ("+str(x[2])+ ") : " + x[0] + " -> " + x[1]
+ if (x[2] <= verbosity):
+ for l in header_roots[x[0]][x[1]]:
+ print " " + l
+
+ output.write ("}\n");
+
+
+ files = list()
+ dohelp = False
+ edge_thresh = 0
+ for arg in sys.argv[1:]:
+ if arg[0:2] == "-o":
+ dotname = arg[2:]+".dot"
+ graphname = arg[2:]+".png"
+ elif arg[0:2] == "-h":
+ dohelp = True
+ elif arg[0:2] == "-v":
+ verbose = True
+ if len(arg) > 2:
+ verbosity = int (arg[2:])
+ if (verbosity == 9):
+ verbosity = 9999
+ elif arg[0:1] == "-":
+ print "Unrecognized option " + arg
+ dohelp = True
+ else:
+ files.append (arg)
+
+ if len(sys.argv) == 1:
+ dohelp = True
+
+ if dohelp:
+ print "Parses the log files from remove-include processes to generate"
+ print " dependency graphs for the include web for specified files."
+ print "Usage: [-nnum] [-h] [-v[n]] [-ooutput] file1 [[file2] ... [filen]]"
+ print " -ooutput : Specifies output to output.dot and output.png"
+ print " Defaults to 'graph.dot and graph.png"
+ print " -vn : verbose mode, shows the number of connections, and if n"
+ print " is specifies, show the messages if # < n. 9 is infinity"
+ print " -h : help"
+ else:
+ print files
+ build_dot_file (files)
+ os.system ("dot -Tpng " + dotname + " -o" + graphname)
+
+
Property changes on: contrib/headers/graph-header-logs
___________________________________________________________________
Added: svn:executable
## -0,0 +1 ##
+*
\ No newline at end of property
Index: contrib/headers/graph-include-web
===================================================================
*** contrib/headers/graph-include-web (revision 0)
--- contrib/headers/graph-include-web (working copy)
***************
*** 0 ****
--- 1,122 ----
+ #! /usr/bin/python2
+ import os.path
+ import sys
+ import shlex
+ import re
+
+ from headerutils import *
+
+ def pretty_name (name):
+ return name.replace(".","_").replace("-","_").replace("/","_").replace("+","_");
+
+
+ include_files = list()
+ edges = 0
+ one_c = False
+ clink = list()
+ noterm = False
+
+ def build_inclist (output, filen):
+ global edges
+ global one_c
+ global clink
+ global noterm
+ inc = build_include_list (filen)
+ if one_c and filen[-2:] == ".c":
+ pn = "all_c"
+ else:
+ pn = pretty_name(filen)
+ for nm in inc:
+ if pn == "all_c":
+ if nm not in clink:
+ if len(build_include_list(nm)) != 0 or not noterm:
+ output.write (pretty_name(nm) + " -> " + pn + ";\n")
+ edges = edges + 1
+ if nm not in include_files:
+ include_files.append(nm)
+ clink.append (nm)
+ else:
+ output.write (pretty_name(nm) + " -> " + pn + ";\n")
+ edges = edges + 1
+ if nm not in include_files:
+ include_files.append(nm)
+ return len(inc) == 0
+
+ dotname = "graph.dot"
+ graphname = "graph.png"
+
+ def build_dot_file (file_list):
+ global one_c
+ output = open(dotname, "w")
+ output.write ("digraph incweb {\n");
+ if one_c:
+ output.write ("all_c [shape=box];\n");
+ for x in file_list:
+ if x[-2:] == ".h":
+ include_files.append (x)
+ elif os.path.exists (x):
+ build_inclist (output, x)
+ if not one_c:
+ output.write (pretty_name (x) + "[shape=box];\n")
+
+ for x in include_files:
+ term = build_inclist (output, x)
+ if term:
+ output.write (pretty_name(x) + " [style=filled];\n")
+
+ output.write ("}\n");
+
+
+ files = list()
+ dohelp = False
+ edge_thresh = 0
+ for arg in sys.argv[1:]:
+ if arg[0:2] == "-o":
+ dotname = arg[2:]+".dot"
+ graphname = arg[2:]+".png"
+ elif arg[0:2] == "-h":
+ dohelp = True
+ elif arg[0:2] == "-a":
+ one_c = True
+ if arg[0:3] == "-at":
+ noterm = True
+ elif arg[0:2] == "-f":
+ if not os.path.exists (arg[2:]):
+ print "Option " + arg +" doesn't specify a proper file"
+ dohelp = True
+ else:
+ sfile = open (arg[2:], "r")
+ srcdata = sfile.readlines()
+ sfile.close()
+ for x in srcdata:
+ files.append(x.rstrip())
+ elif arg[0:2] == "-n":
+ edge_thresh = int (arg[2:])
+ elif arg[0:1] == "-":
+ print "Unrecognized option " + arg
+ dohelp = True
+ else:
+ files.append (arg)
+
+ if len(sys.argv) == 1:
+ dohelp = True
+
+ if dohelp:
+ print "Generates a graph of the include web for specified files."
+ print "Usage: [-finput_file] [-h] [-ooutput] [file1 ... [filen]]"
+ print " -finput_file : Input file is file containing a list of files"
+ print " -ooutput : Specifies output to output.dot and output.png"
+ print " defaults to graph.dot and graph.png"
+ print " -nnum : specifies the # of edges beyond which sfdp is invoked. def=0"
+ print " -a : Aggregate all .c files to 1 file. Shows only include web."
+ print " -at : Aggregate, but don't include terminal.h to .c links. "
+ print " -h : help"
+ else:
+ print files
+ build_dot_file (files)
+ if edges > edge_thresh:
+ os.system ("sfdp -Tpng " + dotname + " -o" + graphname)
+ else:
+ os.system ("dot -Tpng " + dotname + " -o" + graphname)
+
+
Property changes on: contrib/headers/graph-include-web
___________________________________________________________________
Added: svn:executable
## -0,0 +1 ##
+*
\ No newline at end of property
Index: contrib/headers/headerutils.py
===================================================================
*** contrib/headers/headerutils.py (revision 0)
--- contrib/headers/headerutils.py (working copy)
***************
*** 0 ****
--- 1,500 ----
+ #! /usr/bin/python2
+ import os.path
+ import sys
+ import shlex
+ import re
+ import subprocess
+ import shutil
+ import pickle
+
+ import multiprocessing
+
+ def find_pound_include (line, use_outside, use_slash):
+ inc = re.findall (ur"^\s*#\s*include\s*\"(.+?)\"", line)
+ if len(inc) == 1:
+ nm = inc[0]
+ if use_outside or os.path.exists (nm):
+ if use_slash or '/' not in nm:
+ return nm
+ return ""
+
+ def find_system_include (line):
+ inc = re.findall (ur"^\s*#\s*include\s*<(.+?)>", line)
+ if len(inc) == 1:
+ return inc[0]
+ return ""
+
+ def find_pound_define (line):
+ inc = re.findall (ur"^\s*#\s*define ([A-Za-z0-9_]+)", line)
+ if len(inc) != 0:
+ if len(inc) > 1:
+ print "What? more than 1 match in #define??"
+ print inc
+ sys.exit(5)
+ return inc[0];
+ return ""
+
+ def is_pound_if (line):
+ inc = re.findall ("^\s*#\s*if\s", line)
+ if not inc:
+ inc = re.findall ("^\s*#\s*if[n]?def\s", line)
+ if inc:
+ return True
+ return False
+
+ def is_pound_endif (line):
+ inc = re.findall ("^\s*#\s*endif", line)
+ if inc:
+ return True
+ return False
+
+ def find_pound_if (line):
+ inc = re.findall (ur"^\s*#\s*if\s+(.*)", line)
+ if len(inc) == 0:
+ inc = re.findall (ur"^\s*#\s*elif\s+(.*)", line)
+ if len(inc) > 0:
+ # inc2 = re.findall (ur"defined *\((.+?)\)", inc[0])
+ inc2 = re.findall (ur"defined\s*\((.+?)\)", inc[0])
+ inc3 = re.findall (ur"defined\s+([a-zA-Z0-9_]+)", inc[0])
+ for yy in inc3:
+ inc2.append (yy)
+ return inc2
+ else:
+ inc = re.findall (ur"^\s*#\s*ifdef\s(.*)", line)
+ if len(inc) == 0:
+ inc = re.findall (ur"^\s*#\s*ifndef\s(.*)", line)
+ if len(inc) > 0:
+ inc2 = re.findall ("[A-Za-z_][A-Za-z_0-9]*", inc[0])
+ return inc2
+ if len(inc) == 0:
+ return list ()
+ print "WTF. more than one line returned for find_pound_if"
+ print inc
+ sys.exit(5)
+
+ empty_iinfo = ("", "", list(), list(), list(), list(), list())
+
+ # find all relevant include data.
+ def process_include_info (filen, do_macros, keep_src):
+ header = False
+ if not os.path.exists (filen):
+ return empty_iinfo
+
+ sfile = open (filen, "r");
+ data = sfile.readlines()
+ sfile.close()
+
+ # Ignore the initial #ifdef HEADER_H in header files
+ if filen[-2:] == ".h":
+ nest = -1
+ header = True
+ else:
+ nest = 0
+
+ macout = list ()
+ macin = list()
+ incl = list()
+ cond_incl = list()
+ src_line = { }
+ guard = ""
+
+ for line in (data):
+ if is_pound_if (line):
+ nest += 1
+ elif is_pound_endif (line):
+ nest -= 1
+
+ nm = find_pound_include (line, True, True)
+ if nm != "" and nm not in incl and nm[-2:] == ".h":
+ incl.append (nm)
+ if nest > 0:
+ cond_incl.append (nm)
+ if keep_src:
+ src_line[nm] = line
+ continue
+
+ if do_macros:
+ d = find_pound_define (line)
+ if d:
+ if d not in macout:
+ macout.append (d);
+ continue
+
+ d = find_pound_if (line)
+ if d:
+ # The first #if in a header file should be the guard
+ if header and len (d) == 1 and guard == "":
+ if d[0][-2:] == "_H":
+ guard = d
+ else:
+ guard = "Guess there was no guard..."
+ else:
+ for mac in d:
+ if mac != "defined" and mac not in macin:
+ macin.append (mac);
+
+ if not keep_src:
+ data = list()
+
+ return (os.path.basename (filen), os.path.dirname (filen), incl, cond_incl,
+ macin, macout, data, src_line)
+
+ def process_ii (filen):
+ return process_include_info (filen, False, False)
+
+ def process_ii_macro (filen):
+ return process_include_info (filen, True, False)
+
+ def process_ii_src (filen):
+ return process_include_info (filen, False, True)
+
+ def process_ii_macro_src (filen):
+ return process_include_info (filen, True, True)
+
+ def ii_base (iinfo):
+ return iinfo[0]
+
+ def ii_path (iinfo):
+ return iinfo[1]
+
+ def ii_include_list (iinfo):
+ return iinfo[2]
+
+ def ii_include_list_cond (iinfo):
+ return iinfo[3]
+
+ def ii_include_list_non_cond (iinfo):
+ l = ii_include_list (iinfo)
+ for n in ii_include_list_cond (iinfo):
+ l.remove (n)
+ return l
+
+ def ii_macro_consume (iinfo):
+ return iinfo[4]
+
+ def ii_macro_define (iinfo):
+ return iinfo[5]
+
+ def ii_src (iinfo):
+ return iinfo[6]
+
+ def ii_src_line (iinfo):
+ return iinfo[7]
+
+ def ii_read (fname):
+ f = open (fname, 'rb')
+ incl = pickle.load (f)
+ consumes = pickle.load (f)
+ defines = pickle.load (f)
+ obj = (fname,fname,incl,list(), list(), consumes, defines, list(), list())
+ return obj
+
+ def ii_write (fname, obj):
+ f = open (fname, 'wb')
+ pickle.dump (obj[2], f)
+ pickle.dump (obj[4], f)
+ pickle.dump (obj[5], f)
+ f.close ()
+
+
+ # Find files matching pattern NAME, return in a list.
+ # CURRENT is True if you want to include the current directory
+ # DEEPER is True if you want to search 3 levels below the current directory
+ # any files with testsuite diurectories are ignored
+
+ def find_gcc_files (name, current, deeper):
+ files = list()
+ command = ""
+ if current:
+ if not deeper:
+ command = "find -maxdepth 1 -name " + name + " -not -path \"./testsuite/*\""
+ else:
+ command = "find -maxdepth 4 -name " + name + " -not -path \"./testsuite/*\""
+ else:
+ if deeper:
+ command = "find -maxdepth 4 -mindepth 2 -name " + name + " -not -path \"./testsuite/*\""
+
+ if command != "":
+ f = os.popen (command)
+ for x in f:
+ if x[0] == ".":
+ fn = x.rstrip()[2:]
+ else:
+ fn = x
+ files.append(fn)
+
+ return files
+
+ # find the list of unique include names found in a file.
+ def find_unique_include_list_src (data):
+ found = list ()
+ for line in data:
+ d = find_pound_include (line, True, True)
+ if d and d not in found and d[-2:] == ".h":
+ found.append (d)
+ return found
+
+ # find the list of unique include names found in a file.
+ def find_unique_include_list (filen):
+ data = open (filen).read().splitlines()
+ return find_unique_include_list_src (data)
+
+
+ # Create the macin, macout, and incl vectors for a file FILEN.
+ # macin are the macros that are used in #if* conditional expressions
+ # macout are the macros which are #defined
+ # incl is the list of incluide files encountered
+ # returned as a tuple of the filename followed by the triplet of lists
+ # (filen, macin, macout, incl)
+
+ def create_macro_in_out (filen):
+ sfile = open (filen, "r");
+ data = sfile.readlines()
+ sfile.close()
+
+ macout = list ()
+ macin = list()
+ incl = list()
+
+ for line in (data):
+ d = find_pound_define (line)
+ if d != "":
+ if d not in macout:
+ macout.append (d);
+ continue
+
+ d = find_pound_if (line)
+ if len(d) != 0:
+ for mac in d:
+ if mac != "defined" and mac not in macin:
+ macin.append (mac);
+ continue
+
+ nm = find_pound_include (line, True, True)
+ if nm != "" and nm not in incl:
+ incl.append (nm)
+
+ return (filen, macin, macout, incl)
+
+ # create the macro information for filen, and create .macin, .macout, and .incl
+ # files. Return the created macro tuple.
+ def create_include_data_files (filen):
+
+ macros = create_macro_in_out (filen)
+ depends = macros[1]
+ defines = macros[2]
+ incls = macros[3]
+
+ disp_message = filen
+ if len (defines) > 0:
+ disp_message = disp_message + " " + str(len (defines)) + " #defines"
+ dfile = open (filen + ".macout", "w")
+ for x in defines:
+ dfile.write (x + "\n")
+ dfile.close ()
+
+ if len (depends) > 0:
+ disp_message = disp_message + " " + str(len (depends)) + " #if dependencies"
+ dfile = open (filen + ".macin", "w")
+ for x in depends:
+ dfile.write (x + "\n")
+ dfile.close ()
+
+ if len (incls) > 0:
+ disp_message = disp_message + " " + str(len (incls)) + " #includes"
+ dfile = open (filen + ".incl", "w")
+ for x in incls:
+ dfile.write (x + "\n")
+ dfile.close ()
+
+ return macros
+
+
+
+ # extract data for include file name_h and enter it into the dictionary.
+ # this doesnt change once read in. use_requies is True if you want to
+ # prime the values with already created .requires and .provides files.
+ def get_include_data (name_h, use_requires):
+ macin = list()
+ macout = list()
+ incl = list ()
+ if use_requires and os.path.exists (name_h + ".requires"):
+ macin = open (name_h + ".requires").read().splitlines()
+ elif os.path.exists (name_h + ".macin"):
+ macin = open (name_h + ".macin").read().splitlines()
+
+ if use_requires and os.path.exists (name_h + ".provides"):
+ macout = open (name_h + ".provides").read().splitlines()
+ elif os.path.exists (name_h + ".macout"):
+ macout = open (name_h + ".macout").read().splitlines()
+
+ if os.path.exists (name_h + ".incl"):
+ incl = open (name_h + ".incl").read().splitlines()
+
+ if len(macin) == 0 and len(macout) == 0 and len(incl) == 0:
+ return ()
+ data = ( name_h, macin, macout, incl )
+ return data
+
+ # find FIND in src, and replace it with the list of includes in REPLACE
+ # remove any duplicates of find or replace, and if some of hte replace
+ # includes occur earlier in the inlude chain, leave them.
+ # return the new SRC only if anything changed.
+ def find_replace_include (find, replace, src):
+ res = list()
+ seen = { }
+ anything = False
+ for line in src:
+ inc = find_pound_include (line, True, True)
+ if inc == find:
+ for y in replace:
+ if seen.get(y) == None:
+ res.append("#include \""+y+"\"\n")
+ seen[y] = True
+ if y != find:
+ anything = True
+ # if find isnt in the replacement list, then we are deleting FIND, so changes.
+ if find not in replace:
+ anything = True
+ else:
+ if inc in replace:
+ if seen.get(inc) == None:
+ res.append (line)
+ seen[inc] = True
+ else:
+ res.append (line)
+
+ if (anything):
+ return res
+ else:
+ return list()
+
+
+ # pass in a require and provide dictionary to be read in.
+ def read_require_provides (require, provide):
+ if not os.path.exists ("require-provide.master"):
+ print "require-provide.master file is not available. please run data collection."
+ sys.exit(1)
+ incl_list = open("require-provide.master").read().splitlines()
+ for f in incl_list:
+ if os.path.exists (f+".requires"):
+ require[os.path.basename (f)] = open (f + ".requires").read().splitlines()
+ else:
+ require[os.path.basename (f)] = list ()
+ if os.path.exists (f+".provides"):
+ provide[os.path.basename (f)] = open (f + ".provides").read().splitlines()
+ else:
+ provide [os.path.basename (f)] = list ()
+
+
+ def build_include_list (filen):
+ include_files = list()
+ sfile = open (filen, "r")
+ data = sfile.readlines()
+ sfile.close()
+ for line in data:
+ nm = find_pound_include (line, False, False)
+ if nm != "" and nm[-2:] == ".h":
+ if nm not in include_files:
+ include_files.append(nm)
+ return include_files
+
+ def build_reverse_include_list (filen):
+ include_files = list()
+ sfile = open (filen, "r")
+ data = sfile.readlines()
+ sfile.close()
+ for line in reversed(data):
+ nm = find_pound_include (line, False, False)
+ if nm != "":
+ if nm not in include_files:
+ include_files.append(nm)
+ return include_files
+
+ # compensate for this stupid warning that should be an error for
+ # inlined templates
+ def get_make_rc (rc, output):
+ rc = rc % 1280
+ if rc == 0:
+ # This is not considered a fatal error for a build! /me rolls eyes
+ h = re.findall ("warning: inline function.*used but never defined", output)
+ if len(h) != 0:
+ rc = 1
+ return rc;
+
+ def get_make_output (build_dir, make_opt):
+ devnull = open('/dev/null', 'w')
+ at_a_time = multiprocessing.cpu_count() * 2
+ make = "make -j"+str(at_a_time)+ " "
+ if build_dir != "":
+ command = "cd " + build_dir +"; " + make + make_opt
+ else:
+ command = make + make_opt
+ process = subprocess.Popen(command, stdout=devnull, stderr=subprocess.PIPE, shell=True)
+ output = process.communicate();
+ rc = get_make_rc (process.returncode, output[1])
+ return (rc , output[1])
+
+ def spawn_makes (command_list):
+ devnull = open('/dev/null', 'w')
+ rc = (0,"", "")
+ proc_res = list()
+ text = " Trying target builds : "
+ for command_pair in command_list:
+ tname = command_pair[0]
+ command = command_pair[1]
+ text += tname + ", "
+ c = subprocess.Popen(command, bufsize=-1, stdout=devnull, stderr=subprocess.PIPE, shell=True)
+ proc_res.append ((c, tname))
+
+ print text[:-2]
+
+ for p in proc_res:
+ output = p[0].communicate()
+ ret = (get_make_rc (p[0].returncode, output[1]), output[1], p[1])
+ if (ret[0] != 0):
+ # Just record the first one.
+ if rc[0] == 0:
+ rc = ret;
+ return rc
+
+ def get_make_output_parallel (targ_list, make_opt, at_a_time):
+ command = list()
+ targname = list()
+ if at_a_time == 0:
+ at_a_time = multiprocessing.cpu_count() * 2
+ proc_res = [0] * at_a_time
+ for x in targ_list:
+ if make_opt[-2:] == ".o":
+ s = "cd " + x[1] + "/gcc/; make " + make_opt
+ else:
+ s = "cd " + x[1] +"; make " + make_opt
+ command.append ((x[0],s))
+
+ num = len(command)
+ rc = (0,"", "")
+ loops = num // at_a_time
+
+ if (loops > 0):
+ for idx in range (loops):
+ ret = spawn_makes (command[idx*at_a_time:(idx+1)*at_a_time])
+ if ret[0] != 0:
+ rc = ret
+ break
+
+ if (rc[0] == 0):
+ leftover = num % at_a_time
+ if (leftover > 0):
+ ret = spawn_makes (command[-leftover:])
+ if ret[0] != 0:
+ rc = ret
+
+ return rc
+
+
+ def readwholefile (src_file):
+ sfile = open (src_file, "r")
+ src_data = sfile.readlines()
+ sfile.close()
+ return src_data
+
Property changes on: contrib/headers/headerutils.py
___________________________________________________________________
Added: svn:executable
## -0,0 +1 ##
+*
\ No newline at end of property
Index: contrib/headers/included-by
===================================================================
*** contrib/headers/included-by (revision 0)
--- contrib/headers/included-by (working copy)
***************
*** 0 ****
--- 1,112 ----
+ #! /usr/bin/python2
+ import os.path
+ import sys
+ import shlex
+ import re
+
+ from headerutils import *
+
+
+
+ usage = False
+ src = list()
+ flist = { }
+ process_h = False
+ process_c = False
+ verbose = False
+ level = 0
+ match_all = False
+ num_match = 1
+
+ file_list = list()
+ current = True
+ deeper = True
+ scanfiles = True
+ for x in sys.argv[1:]:
+ if x[0:2] == "-h":
+ usage = True
+ elif x[0:2] == "-i":
+ process_h = True
+ elif x[0:2] == "-s" or x[0:2] == "-c":
+ process_c = True
+ elif x[0:2] == "-v":
+ verbose = True
+ elif x[0:2] == "-a":
+ match_all = True
+ elif x[0:2] == "-n":
+ num_match = int(x[2:])
+ elif x[0:2] == "-1":
+ deeper = False
+ elif x[0:2] == "-2":
+ current = False
+ elif x[0:2] == "-f":
+ file_list = open (x[2:]).read().splitlines()
+ scanfiles = False
+ elif x[0] == "-":
+ print "Error: Unknown option " + x
+ usage = True
+ else:
+ src.append (x)
+
+ if match_all:
+ num_match = len (src)
+
+ if not process_h and not process_c:
+ process_h = True
+ process_c = True
+
+ if len(src) == 0:
+ usage = True
+
+ if not usage:
+ if scanfiles:
+ if process_h:
+ file_list = find_gcc_files ("\*.h", current, deeper)
+ if process_c:
+ file_list = file_list + find_gcc_files ("\*.c", current, deeper)
+ file_list = file_list + find_gcc_files ("\*.cc", current, deeper)
+ else:
+ newlist = list()
+ for x in file_list:
+ if process_h and x[-2:] == ".h":
+ newlist.append (x)
+ elif process_c and (x[-2:] == ".c" or x[-3:] == ".cc"):
+ newlist.append (x)
+ file_list = newlist;
+
+ file_list.sort()
+ for fn in file_list:
+ found = find_unique_include_list (fn)
+ careabout = list()
+ output = ""
+ for inc in found:
+ if inc in src:
+ careabout.append (inc)
+ if output == "":
+ output = fn
+ if verbose:
+ output = output + " [" + inc +"]"
+ if len (careabout) < num_match:
+ output = ""
+ if output != "":
+ print output
+ else:
+ print "included-by [-h] [-i] [-c] [-v] [-a] [-nx] file1 [file2] ... [filen]"
+ print "find the list of all files in subdirectories that include any of "
+ print "the listed files. processed to a depth of 3 subdirs"
+ print " -h : Show this message"
+ print " -i : process only header files (*.h) for #include"
+ print " -c : process only source files (*.c *.cc) for #include"
+ print " If nothing is specified, defaults to -i -c"
+ print " -s : Same as -c."
+ print " -v : Show which include(s) were found"
+ print " -nx : Only list files which have at least x different matches. Default = 1"
+ print " -a : Show only files which *all* listed files are included"
+ print " This is equivilent to -nT where T == # of items in list"
+ print " -flistfile : Show only files contained in the list of files"
+
+
+
+
+
+
Property changes on: contrib/headers/included-by
___________________________________________________________________
Added: svn:executable
## -0,0 +1 ##
+*
\ No newline at end of property
Index: contrib/headers/reduce-headers
===================================================================
*** contrib/headers/reduce-headers (revision 0)
--- contrib/headers/reduce-headers (working copy)
***************
*** 0 ****
--- 1,596 ----
+ #! /usr/bin/python2
+ import os.path
+ import sys
+ import shlex
+ import re
+ import tempfile
+ import copy
+
+ from headerutils import *
+
+ requires = { }
+ provides = { }
+
+ no_remove = [ "system.h", "coretypes.h", "config.h" , "bconfig.h", "backend.h" ]
+
+ # These targets are the ones which provide "coverage". Typically, if any
+ # target is going to fail compilation, it's one of these. This was determined
+ # during the initial runs of reduce-headers... On a full set of target builds,
+ # every failure which occured was triggered by one of these.
+ # This list is used during target-list construction simply to put any of these
+ # *first* in the candidate list, increasing the probability that a failure is
+ # found quickly.
+ target_priority = [
+ "aarch64-linux-gnu",
+ "arm-netbsdelf",
+ "avr-rtems",
+ "c6x-elf",
+ "epiphany-elf",
+ "hppa2.0-hpux10.1",
+ "i686-mingw32crt",
+ "i686-pc-msdosdjgpp",
+ "mipsel-elf",
+ "powerpc-eabisimaltivec",
+ "rs6000-ibm-aix5.1.0",
+ "sh-superh-elf",
+ "sparc64-elf",
+ "spu-elf"
+ ]
+
+
+ target_dir = ""
+ build_dir = ""
+ ignore_list = list()
+ target_builds = list()
+
+ target_dict = { }
+ header_dict = { }
+ search_path = [ ".", "../include", "../libcpp/include" ]
+
+ remove_count = { }
+
+
+ # given a header name, normalize it. ie cp/cp-tree.h could be in gcc, while
+ # the same header could be referenecd from within the cp subdirectory as
+ # just cp-tree.h
+ # for now, just assume basenames are unique
+
+ def normalize_header (header):
+ return os.path.basename (header)
+
+
+ # Adds a header file and it's sub includes to the global dictionary if they
+ # aren't already there. SPecify s_path since different build directories may
+ # append themselves on demand to the global list.
+ # return entry for the specified header, knowing all sub entries are completed
+
+ def get_header_info (header, s_path):
+ global header_dict
+ global empty_iinfo
+ process_list = list ()
+ location = ""
+ bname = ""
+ bname_iinfo = empty_iinfo
+ for path in s_path:
+ if os.path.exists (path + "/" + header):
+ location = path + "/" + header
+ break
+
+ if location:
+ bname = normalize_header (location)
+ if header_dict.get (bname):
+ bname_iinfo = header_dict[bname]
+ loc2 = ii_path (bname_iinfo)+ "/" + bname
+ if loc2[:2] == "./":
+ loc2 = loc2[2:]
+ if location[:2] == "./":
+ location = location[2:]
+ if loc2 != location:
+ # Don't use the cache if it isnt the right one.
+ bname_iinfo = process_ii_macro (location)
+ return bname_iinfo
+
+ bname_iinfo = process_ii_macro (location)
+ header_dict[bname] = bname_iinfo
+ # now decend into the include tree
+ for i in ii_include_list (bname_iinfo):
+ get_header_info (i, s_path)
+ else:
+ # if the file isnt in the source directories, look in the build and target
+ # directories. If its here, then aggregate all the versions.
+ location = build_dir + "/gcc/" + header
+ build_inc = target_inc = False
+ if os.path.exists (location):
+ build_inc = True
+ for x in target_dict:
+ location = target_dict[x] + "/gcc/" + header
+ if os.path.exists (location):
+ target_inc = True
+ break
+
+ if (build_inc or target_inc):
+ bname = normalize_header(header)
+ defines = set()
+ consumes = set()
+ incl = set()
+ if build_inc:
+ iinfo = process_ii_macro (build_dir + "/gcc/" + header)
+ defines = set (ii_macro_define (iinfo))
+ consumes = set (ii_macro_consume (iinfo))
+ incl = set (ii_include_list (iinfo))
+
+ if (target_inc):
+ for x in target_dict:
+ location = target_dict[x] + "/gcc/" + header
+ if os.path.exists (location):
+ iinfo = process_ii_macro (location)
+ defines.update (ii_macro_define (iinfo))
+ consumes.update (ii_macro_consume (iinfo))
+ incl.update (ii_include_list (iinfo))
+
+ bname_iinfo = (header, "build", list(incl), list(), list(consumes), list(defines), list(), list())
+
+ header_dict[bname] = bname_iinfo
+ for i in incl:
+ get_header_info (i, s_path)
+
+ return bname_iinfo
+
+
+ # return a list of all headers brought in by this header
+ def all_headers (fname):
+ global header_dict
+ headers_stack = list()
+ headers_list = list()
+ if header_dict.get (fname) == None:
+ return list ()
+ for y in ii_include_list (header_dict[fname]):
+ headers_stack.append (y)
+
+ while headers_stack:
+ h = headers_stack.pop ()
+ hn = normalize_header (h)
+ if hn not in headers_list:
+ headers_list.append (hn)
+ if header_dict.get(hn):
+ for y in ii_include_list (header_dict[hn]):
+ if normalize_header (y) not in headers_list:
+ headers_stack.append (y)
+
+ return headers_list
+
+
+
+
+ # Search bld_dir for all target tuples, confirm that they have a build path with
+ # bld_dir/target-tuple/gcc, and build a dictionary of build paths indexed by
+ # target tuple..
+
+ def build_target_dict (bld_dir, just_these):
+ global target_dict
+ target_doct = { }
+ error = False
+ if os.path.exists (bld_dir):
+ if just_these:
+ ls = just_these
+ else:
+ ls = os.listdir(bld_dir)
+ for t in ls:
+ if t.find("-") != -1:
+ target = t.strip()
+ tpath = bld_dir + "/" + target
+ if not os.path.exists (tpath + "/gcc"):
+ print "Error: gcc build directory for target " + t + " Does not exist: " + tpath + "/gcc"
+ error = True
+ else:
+ target_dict[target] = tpath
+
+ if error:
+ target_dict = { }
+
+ def get_obj_name (src_file):
+ if src_file[-2:] == ".c":
+ return src_file.replace (".c", ".o")
+ elif src_file[-3:] == ".cc":
+ return src_file.replace (".cc", ".o")
+ return ""
+
+ def target_obj_exists (target, obj_name):
+ global target_dict
+ # look in a subdir if src has a subdir, then check gcc base directory.
+ if target_dict.get(target):
+ obj = target_dict[target] + "/gcc/" + obj_name
+ if not os.path.exists (obj):
+ obj = target_dict[target] + "/gcc/" + os.path.basename(obj_name)
+ if os.path.exists (obj):
+ return True
+ return False
+
+ # Given a src file, return a list of targets which may build this file.
+ def find_targets (src_file):
+ global target_dict
+ targ_list = list()
+ obj_name = get_obj_name (src_file)
+ if not obj_name:
+ print "Error: " + src_file + " - Cannot determine object name."
+ return list()
+
+ # Put the high priority targets which tend to trigger failures first
+ for target in target_priority:
+ if target_obj_exists (target, obj_name):
+ targ_list.append ((target, target_dict[target]))
+
+ for target in target_dict:
+ if target not in target_priority and target_obj_exists (target, obj_name):
+ targ_list.append ((target, target_dict[target]))
+
+ return targ_list
+
+
+ def try_to_remove (src_file, h_list, verbose):
+ global target_dict
+ global header_dict
+ global build_dir
+
+ # build from scratch each time
+ header_dict = { }
+ summary = ""
+ rmcount = 0
+
+ because = { }
+ src_info = process_ii_macro_src (src_file)
+ src_data = ii_src (src_info)
+ if src_data:
+ inclist = ii_include_list_non_cond (src_info)
+ # work is done if there are no includes to check
+ if not inclist:
+ return src_file + ": No include files to attempt to remove"
+
+ # work on the include list in reverse.
+ inclist.reverse()
+
+ # Get the target list
+ targ_list = list()
+ targ_list = find_targets (src_file)
+
+ spath = search_path
+ if os.path.dirname (src_file):
+ spath.append (os.path.dirname (src_file))
+
+ hostbuild = True
+ if src_file.find("config/") != -1:
+ # config files dont usually build on the host
+ hostbuild = False
+ obn = get_obj_name (os.path.basename (src_file))
+ if obn and os.path.exists (build_dir + "/gcc/" + obn):
+ hostbuild = True
+ if not target_dict:
+ summary = src_file + ": Target builds are required for config files. None found."
+ print summary
+ return summary
+ if not targ_list:
+ summary =src_file + ": Cannot find any targets which build this file."
+ print summary
+ return summary
+
+ if hostbuild:
+ # confirm it actually builds before we do anything
+ print "Confirming source file builds"
+ res = get_make_output (build_dir + "/gcc", "all")
+ if res[0] != 0:
+ message = "Error: " + src_file + " does not build currently."
+ summary = src_file + " does not build on host."
+ print message
+ print res[1]
+ if verbose:
+ verbose.write (message + "\n")
+ verbose.write (res[1]+ "\n")
+ return summary
+
+ src_requires = set (ii_macro_consume (src_info))
+ for macro in src_requires:
+ because[macro] = src_file
+ header_seen = list ()
+
+ os.rename (src_file, src_file + ".bak")
+ src_orig = copy.deepcopy (src_data)
+ src_tmp = copy.deepcopy (src_data)
+
+ try:
+ # process the includes from bottom to top. This is because we know that
+ # later includes have are known to be needed, so any dependency from this
+ # header is a true dependency
+ for inc_file in inclist:
+ inc_file_norm = normalize_header (inc_file)
+
+ if inc_file in no_remove:
+ continue
+ if len (h_list) != 0 and inc_file_norm not in h_list:
+ continue
+ if inc_file_norm[0:3] == "gt-":
+ continue
+ if inc_file_norm[0:6] == "gtype-":
+ continue
+ if inc_file_norm.replace(".h",".c") == os.path.basename(src_file):
+ continue
+
+ lookfor = ii_src_line(src_info)[inc_file]
+ src_tmp.remove (lookfor)
+ message = "Trying " + src_file + " without " + inc_file
+ print message
+ if verbose:
+ verbose.write (message + "\n")
+ out = open(src_file, "w")
+ for line in src_tmp:
+ out.write (line)
+ out.close()
+
+ keep = False
+ if hostbuild:
+ res = get_make_output (build_dir + "/gcc", "all")
+ else:
+ res = (0, "")
+
+ rc = res[0]
+ message = "Passed Host build"
+ if (rc != 0):
+ # host build failed
+ message = "Compilation failed:\n";
+ keep = True
+ else:
+ if targ_list:
+ objfile = get_obj_name (src_file)
+ t1 = targ_list[0]
+ if objfile and os.path.exists(t1[1] +"/gcc/"+objfile):
+ res = get_make_output_parallel (targ_list, objfile, 0)
+ else:
+ res = get_make_output_parallel (targ_list, "all-gcc", 0)
+ rc = res[0]
+ if rc != 0:
+ message = "Compilation failed on TARGET : " + res[2]
+ keep = True
+ else:
+ message = "Passed host and target builds"
+
+ if keep:
+ print message + "\n"
+
+ if (rc != 0):
+ if verbose:
+ verbose.write (message + "\n");
+ verbose.write (res[1])
+ verbose.write ("\n");
+ if os.path.exists (inc_file):
+ ilog = open(inc_file+".log","a")
+ ilog.write (message + " for " + src_file + ":\n\n");
+ ilog.write ("============================================\n");
+ ilog.write (res[1])
+ ilog.write ("\n");
+ ilog.close()
+ if os.path.exists (src_file):
+ ilog = open(src_file+".log","a")
+ ilog.write (message + " for " +inc_file + ":\n\n");
+ ilog.write ("============================================\n");
+ ilog.write (res[1])
+ ilog.write ("\n");
+ ilog.close()
+
+ # Given a sequence where :
+ # #include "tm.h"
+ # #include "target.h" // includes tm.h
+
+ # target.h was required, and when attempting to remove tm.h we'd see that
+ # all the macro defintions are "required" since they all look like:
+ # #ifndef HAVE_blah
+ # #define HAVE_blah
+ # endif
+
+ # when target.h was found to be required, tm.h will be tagged as included.
+ # so when we get this far, we know we dont have to check the macros for
+ # tm.h since we know its already been included.
+
+ if inc_file_norm not in header_seen:
+ iinfo = get_header_info (inc_file, spath)
+ newlist = all_headers (inc_file_norm)
+ if ii_path(iinfo) == "build" and not target_dict:
+ keep = True
+ text = message + " : Will not remove a build file without some targets."
+ print text
+ ilog = open(src_file+".log","a")
+ ilog.write (text +"\n")
+ ilog.write ("============================================\n");
+ ilog.close()
+ ilog = open("reduce-headers-kept.log","a")
+ ilog.write (src_file + " " + text +"\n")
+ ilog.close()
+ else:
+ newlist = list()
+ if not keep and inc_file_norm not in header_seen:
+ # now look for any macro requirements.
+ for h in newlist:
+ if not h in header_seen:
+ if header_dict.get(h):
+ defined = ii_macro_define (header_dict[h])
+ for dep in defined:
+ if dep in src_requires and dep not in ignore_list:
+ keep = True;
+ text = message + ", but must keep " + inc_file + " because it provides " + dep
+ if because.get(dep) != None:
+ text = text + " Possibly required by " + because[dep]
+ print text
+ ilog = open(inc_file+".log","a")
+ ilog.write (because[dep]+": Requires [dep] in "+src_file+"\n")
+ ilog.write ("============================================\n");
+ ilog.close()
+ ilog = open(src_file+".log","a")
+ ilog.write (text +"\n")
+ ilog.write ("============================================\n");
+ ilog.close()
+ ilog = open("reduce-headers-kept.log","a")
+ ilog.write (src_file + " " + text +"\n")
+ ilog.close()
+ if verbose:
+ verbose.write (text + "\n")
+
+ if keep:
+ # add all headers 'consumes' to src_requires list, and mark as seen
+ for h in newlist:
+ if not h in header_seen:
+ header_seen.append (h)
+ if header_dict.get(h):
+ consume = ii_macro_consume (header_dict[h])
+ for dep in consume:
+ if dep not in src_requires:
+ src_requires.add (dep)
+ if because.get(dep) == None:
+ because[dep] = inc_file
+
+ src_tmp = copy.deepcopy (src_data)
+ else:
+ print message + " --> removing " + inc_file + "\n"
+ rmcount += 1
+ if verbose:
+ verbose.write (message + " --> removing " + inc_file + "\n")
+ if remove_count.get(inc_file) == None:
+ remove_count[inc_file] = 1
+ else:
+ remove_count[inc_file] += 1
+ src_data = copy.deepcopy (src_tmp)
+ except:
+ print "Interuption: restoring original file"
+ out = open(src_file, "w")
+ for line in src_orig:
+ out.write (line)
+ out.close()
+ raise
+
+ # copy current version, since its the "right" one now.
+ out = open(src_file, "w")
+ for line in src_data:
+ out.write (line)
+ out.close()
+
+ # Try a final host bootstrap build to make sure everything is kosher.
+ if hostbuild:
+ res = get_make_output (build_dir, "all")
+ rc = res[0]
+ if (rc != 0):
+ # host build failed! return to original version
+ print "Error: " + src_file + " Failed to bootstrap at end!!! restoring."
+ print " Bad version at " + src_file + ".bad"
+ os.rename (src_file, src_file + ".bad")
+ out = open(src_file, "w")
+ for line in src_orig:
+ out.write (line)
+ out.close()
+ return src_file + ": failed to build after reduction. Restored original"
+
+ if src_data == src_orig:
+ summary = src_file + ": No change."
+ else:
+ summary = src_file + ": Reduction performed, "+str(rmcount)+" includes removed."
+ print summary
+ return summary
+
+ only_h = list ()
+ ignore_cond = False
+
+ usage = False
+ src = list()
+ only_targs = list ()
+ for x in sys.argv[1:]:
+ if x[0:2] == "-b":
+ build_dir = x[2:]
+ elif x[0:2] == "-f":
+ fn = normalize_header (x[2:])
+ if fn not in only_h:
+ only_h.append (fn)
+ elif x[0:2] == "-h":
+ usage = True
+ elif x[0:2] == "-d":
+ ignore_cond = True
+ elif x[0:2] == "-D":
+ ignore_list.append(x[2:])
+ elif x[0:2] == "-T":
+ only_targs.append(x[2:])
+ elif x[0:2] == "-t":
+ target_dir = x[2:]
+ elif x[0] == "-":
+ print "Error: Unrecognized option " + x
+ usgae = True
+ else:
+ if not os.path.exists (x):
+ print "Error: specified file " + x + " does not exist."
+ usage = True
+ else:
+ src.append (x)
+
+ if target_dir:
+ build_target_dict (target_dir, only_targs)
+
+ if build_dir == "" and target_dir == "":
+ print "Error: Must specify a build directory, and/or a target directory."
+ usage = True
+
+ if build_dir and not os.path.exists (build_dir):
+ print "Error: specified build directory does not exist : " + build_dir
+ usage = True
+
+ if target_dir and not os.path.exists (target_dir):
+ print "Error: specified target directory does not exist : " + target_dir
+ usage = True
+
+ if usage:
+ print "Attempts to remove extraneous include files from source files. "
+ print " "
+ print "Should be run from the main gcc source directory, and works on a target"
+ print "directory, as we attempt to make the 'all' target."
+ print " "
+ print "By default, gcc-reorder-includes is run on each file before attempting"
+ print "to remove includes. this removes duplicates and puts some headers in a"
+ print "canonical ordering"
+ print " "
+ print "The build directory should be ready to compile via make. Time is saved "
+ print "if the build is already complete, so that only changes need to be built."
+ print " "
+ print "Usage: [options] file1.c [file2.c] ... [filen.c]"
+ print " -bdir : the root build directory to attempt buiding .o files."
+ print " -tdir : the target build directory"
+ print " -d : Ignore conditional macro dependencies."
+ print " "
+ print " -Dmacro : Ignore a specific macro for dependencies"
+ print " -Ttarget : Only consider target in target directory."
+ print " -fheader : Specifies a specific .h file to be considered."
+ print " "
+ print " -D, -T, and -f can be specified mulitple times and are aggregated."
+ print " "
+ print " The original file will be in filen.bak"
+ print " "
+ sys.exit (0)
+
+ if only_h:
+ print "Attempting to remove only these files:"
+ for x in only_h:
+ print x
+ print " "
+
+ logfile = open("reduce-headers.log","w")
+
+ for x in src:
+ msg = try_to_remove (x, only_h, logfile)
+ ilog = open("reduce-headers.sum","a")
+ ilog.write (msg + "\n")
+ ilog.close()
+
+ ilog = open("reduce-headers.sum","a")
+ ilog.write ("===============================================================\n")
+ for x in remove_count:
+ msg = x + ": Removed " + str(remove_count[x]) + " times."
+ print msg
+ logfile.write (msg + "\n")
+ ilog.write (msg + "\n")
+
+
+
+
+
Property changes on: contrib/headers/reduce-headers
___________________________________________________________________
Added: svn:executable
## -0,0 +1 ##
+*
\ No newline at end of property
Index: contrib/headers/replace-header
===================================================================
*** contrib/headers/replace-header (revision 0)
--- contrib/headers/replace-header (working copy)
***************
*** 0 ****
--- 1,53 ----
+ #! /usr/bin/python2
+ import os.path
+ import sys
+ import shlex
+ import re
+
+ from headerutils import *
+
+
+ files = list()
+ replace = list()
+ find = ""
+ usage = False
+
+ for x in sys.argv[1:]:
+ if x[0:2] == "-h":
+ usage = True
+ elif x[0:2] == "-f" and find == "":
+ find = x[2:]
+ elif x[0:2] == "-r":
+ replace.append (x[2:])
+ elif x[0:1] == "-":
+ print "Error: unrecognized option " + x
+ usage = True
+ else:
+ files.append (x)
+
+ if find == "":
+ usage = True
+
+ if usage:
+ print "replace-header -fheader -rheader [-rheader] file1 [filen.]"
+ sys.exit(0)
+
+ string = ""
+ for x in replace:
+ string = string + " '"+x+"'"
+ print "Replacing '"+find+"' with"+string
+
+ for x in files:
+ src = readwholefile (x)
+ src = find_replace_include (find, replace, src)
+ if (len(src) > 0):
+ print x + ": Changed"
+ out = open(x, "w")
+ for line in src:
+ out.write (line);
+ out.close ()
+ else:
+ print x
+
+
+
Property changes on: contrib/headers/replace-header
___________________________________________________________________
Added: svn:executable
## -0,0 +1 ##
+*
\ No newline at end of property
Index: contrib/headers/show-headers
===================================================================
*** contrib/headers/show-headers (revision 0)
--- contrib/headers/show-headers (working copy)
***************
*** 0 ****
--- 1,108 ----
+ #! /usr/bin/python2
+ import os.path
+ import sys
+ import shlex
+ import re
+
+ from headerutils import *
+
+
+ tabstop = 2
+ padding = " "
+ seen = { }
+ output = list()
+ sawcore = False
+
+ incl_dirs = [".", "../include", "../../build/gcc", "../libcpp/include" ]
+
+ def append_1 (output, inc):
+ for n,t in enumerate (output):
+ if t.find(inc) != -1:
+ t += " (1)"
+ output[n] = t
+ return
+
+ rtl_core = [ "machmode.h" , "signop.h" , "wide-int.h" , "double-int.h" , "real.h" , "fixed-value.h" , "statistics.h" , "vec.h" , "hash-table.h" , "hash-set.h" , "input.h" , "is-a.h" ]
+
+ def find_include_data (inc):
+ global sawcore
+ for x in incl_dirs:
+ nm = x+"/"+inc
+ if os.path.exists (nm):
+ info = find_unique_include_list (nm)
+ # rtl.h mimics coretypes for GENERATOR FILES, remove if coretypes.h seen.
+ if inc == "coretypes.h":
+ sawcore = True
+ elif inc == "rtl.h" and sawcore:
+ for i in rtl_core:
+ if i in info:
+ info.remove (i)
+ return info
+ return list()
+
+ def process_include (inc, indent):
+ if inc[-2:] != ".h":
+ return
+ if seen.get(inc) == None:
+ seen[inc] = 1
+ output.append (padding[:indent*tabstop] + os.path.basename (inc))
+ info = find_include_data (inc)
+ for y in info:
+ process_include (y, indent+1)
+ else:
+ seen[inc] += 1
+ if (seen[inc] == 2):
+ append_1(output, inc)
+ output.append (padding[:indent*tabstop] + os.path.basename (inc) + " ("+str(seen[inc])+")")
+
+
+
+ blddir = [ "." ]
+ usage = False
+ src = list()
+
+ for x in sys.argv[1:]:
+ if x[0:2] == "-i":
+ bld = x[2:]
+ print "Build dir : " + bld
+ blddir.append (bld)
+ elif x[0:2] == "-h":
+ usage = True
+ else:
+ src.append (x)
+
+ if len(src) != 1:
+ usage = True
+
+ if usage:
+ print "show-headers [-idir] file1 "
+ print " "
+ print " show in a hierarchical visual format how many times each header file"
+ print " is included ina source file. Should be run from the source directory"
+ print " files from find-include-depends"
+ print " -i : Specifies 1 or more directories to search for includes."
+ print " defaults to looking in :"
+ print " . , ../include, ../libcpp/include, and ../../build/gcc"
+ print " specifying anything else creates a new list starting with '.'"
+ sys.exit(0)
+
+
+ if len(blddir) > 1:
+ incl_dirs = blddir
+
+ x = src[0]
+ # if source is in a subdirectory, add the subdirectory to the search list
+ srcpath = os.path.dirname(x)
+ if srcpath:
+ incl_dirs.append (srcpath)
+
+ output = list()
+ sawcore = False
+ incl = find_unique_include_list (x)
+ for inc in incl:
+ process_include (inc, 1)
+ print "\n" + x
+ for line in output:
+ print line
+
+
Property changes on: contrib/headers/show-headers
___________________________________________________________________
Added: svn:executable
## -0,0 +1 ##
+*
\ No newline at end of property
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 4/3] Header file reduction - Tools for contrib
2015-10-06 10:27 ` Bernd Schmidt
@ 2015-10-06 12:02 ` Bernd Schmidt
2015-10-06 14:04 ` Andrew MacLeod
` (4 more replies)
0 siblings, 5 replies; 65+ messages in thread
From: Bernd Schmidt @ 2015-10-06 12:02 UTC (permalink / raw)
To: Bernd Schmidt, Andrew MacLeod, gcc-patches
>> There are 9 tools I used over the run of the project. They were
>> developed in various stages and iterations, but I tried to at least have
>> some common interface things, and I tried some cleaning up and
>> documentation.
I'll probably have to make multiple passes over this. A disclaimer
first, I have done enough Python programming to develop a dislike for
the language, but not enough to call myself an expert.
General comments first. Where applicable, I think we should apply the
same coding standards to Python as we do for C/C++. That means things
like function comments documenting parameters. They are absent for the
most part in this patch, and I won't point out individual instances.
Also, I think the documentation should follow our usual rules. There are
spelling and grammar problems. I will point out what I find (only the
first instance for recurring problems), but please proofread the whole
thing for the next submission. The Thunderbird spellchecker actually is
pointing out a lot of these. Capitalize starts of sentences, write full
sentences and terminate with punctuation.
> No commenting on the quality of python code... :-) I was
> learning python on the fly. Im sure some things are QUITE awful.,
Yeah, the general impression is of fairly ad-hoc code. Not sure how much
can be done about this.
> + trigger the help message. Help may specify additonal functionality to what is
"additional"
> + - For*all* tools, option format for specifying filenames must have no spaces
Space after "For", remove double space. This pattern occurs very often -
something your editor does maybe?
> + - Many of the tools are required to be run from the core gcc source directory
> + containing coretypes.h typically that is in gcc/gcc from a source checkout.
Odd whitespace, and probably lack of punctuation before "typically".
> + gcc-order-headers
> + -----------------
> + This will reorder any primary backend headers files into a canonical order
> + which will resolve any hidden dependencies they may have. Any unknown
> + headers will simply be occur after the recognized core files, and retain the
> + same relative ordering they had.
Grammar ("be occur").
This sounds like the intention is to move recognized core files (I
assume these are the ones in the "order" array in the tool) to the
start, and leaving everything alone? I was a bit confused about this at
first; I see for example "timevar.h" moving around without being present
in the list, but it looks like it gets added implicitly through being
included by df.h. (Incidentally, this looks like another case like the
obstack one - a file using timevars should include timevar.h IMO, even
if it also includes df.h).
> +
> + Must be run in the core gcc source directory
"This tool must be run in the core gcc source directory." (Punctuation
and grammar).
> + Any files which are changed are output, and the original is saved with a
> + .bak extention.
> +
> + ex.: gcc-order-headers tree-ssa.c c/c-decl.c
It might be more useful to produce a diff rather than modify files in-place.
> + if any header files are included within a conditional code block, the tool
> + will issue a message and not change the file. When this happens, you can
> + manually inspect the file, and if reorder it will be fine, rerun the command
"if reorder it will be fine"?
> + It does a 4 level deep find of all source files from the current directory
> + and look in each of those for a #include of the specified headers. So expect
> + a little bit of slowness.
"looks"?
> +
> + -i limits the search to only other header files.
> + -c limits the search to .c and .cc files.
> + -a shows only source files which include*all* specified headers.
Whitespace.
> + it is good practive to run 'gcc-order-headers' on a source file before trying
"practice"
> + Any desired target builds should be built in one directory using a modified
> + config-list.mk file which doesnt delete the build directory when its done.
"doesn't", or more probably "does not" in documentation.
> + The tool will analyze a source file and attempt to remove each non-conditional
> + header from last to first in the file.:
> + It will first attempt to build the native all-gcc target.
> + If that succeeds, it will attempt to build any target build .o files
> + If that suceeds, it will check to see if there are any conditional
"succeeds"
> + compilation dependencies between this header file and the source file or
> + any header whihch have already been determined as non-removable.
"whihch"
> + If all these tests are passed, the header file is determined to be removable
> + and is removed from the source file.
> + This continues until all headers have been checked.
One thing I've wondered about - have you tried checking for object file
differences?
As far as I can tell the dependency checking does not check for undefs.
Is that correct? I think that needs to be added.
> + At this point, the a bootstrap is attempted in the native build, and if that
"the a"
> + A small subset of targets has been determined to provide excellent coverage,
> + at least as of Aug 31/15 . A fullset of targets reduced all of the files
"fullset", and whitespace. Determined how?
> + making up libbackend.a. All of the features which requires target testing
> + were found to be triggered by one or more of these targets. They are
> + actually known to the tool, and when checkiong target, it will check those
"checkiong".
> + targets first, then the rest. It is mostly safe to do a reduction with just
> + these targets, at least until some new whacky target comes along.
> + building config-list.mk with :
> + LIST="aarch64-linux-gnu arm-netbsdelf avr-rtems c6x-elf epiphany-elf hppa2.0-hpux10.1 i686-mingw32crt i686-pc-msdosdjgpp mipsel-elf powerpc-eabisimaltivec rs6000-ibm-aix5.1.0 sh-superh-elf sparc64-elf spu-elf"
I think I get what you're trying to say, but the whole paragraph could
be rewritten for clarity.
> + reduce-headers.log : All the compilation failure output that tool tried.
"the tool"?
> + and why it thinks taht is the case
"taht".
> + $src.c.log : for each failed header removal, the compilation
> + messages as to why it failed.
> + $header.h.log: The same log is put into the relevent header log as well.
"relevant"
> + The tool will aggregate all these and generate a graph of the dependencies
> + exposed during compilation. red lines indicate dependecies that are
"depndecies"
> + presednt because a head file physically includes another header. Black lines
"presednt"
> + represent data dependencies causing compilation if the header isnt present.
"is not"
> + for x in sys.argv[1:]:
> + if x[0:2] == "-h":
> + usage = True
> + else:
> + src.append (x)
There are getopt and argparse libraries available for python. I seem to
recall fighting them at some point because they didn't quite do what I
expected from a C getopt, so it may not be worth it trying to use them.
> + if not usage and len(src) > 0:
> +
> + incl = { }
Watch the extra blank lines.
> + if dup.get(d) == None:
I think we want to be consistent with our C style? I.e., extra space
before parentheses.
> + l.sort(key=lambda tup:tup[0], reverse=True)
And spaces around things like = operators.
> + # Don't put diagnostic*.h into the ordering list, its special since
"it is". Many instances, please grep for "its" and fix them all.
> + # various front ends have to set GCC_DIAG_STYLE before including it
> + # for each file, we'll tailor where it belongs by looking at the dup
> + # list and seeing which file is included, and position it appropriately.
From that comment it's not entirely clear how they are handled. Please
expand documentation of this mechanism.
> + # rtl.h gets tagged as a duplicate includer for all of coretypes, but thats
"that's"
> + # process diagnostic.h first.. it's special since GCC_DIAG_STYLE can be
> + # overridden by languages, but must be done so by a file included BEFORE it.
> + # so make sure it isn't seen as inclujded by one of those files by making it
"inclujded"
> + # Now crate the master ordering list
"create".
> + for i in order:
> + create_master_list (os.path.basename (i), False)
I found myself wanting to pass True. The tool could use a "-v" flag.
> + print " -s Show the cananoical order of known includes"
"canonical"
> + print "Multi-line comments after a #include can also cause failuer, they must be turned"
"failuer"
> + ignore = [ "coretypes_h",
> + "machmode_h",
> + "signop_h",
> + "wide_int_h",
> + "double_int_h",
> + "real_h",
> + "fixed_value_h",
> + "hash_table_h",
> + "statistics_h",
> + "ggc_h",
> + "vec_h",
> + "hashtab_h",
> + "inchash_h",
> + "mem_stats_traits_h",
> + "hash_map_traits_h",
> + "mem_stats_h",
> + "hash_map_h",
> + "hash_set_h",
> + "input_h",
> + "line_map_h",
> + "is_a_h",
> + "system_h",
> + "config_h" ]
Is the random indentation indicating some kind of nesting? If not,
please fix.
> + for line in logfile:
> + if len (line) > 21 and line[:21] in depstring:
> + if newinc:
> + incfrom = list()
> + newinc = False
It looks like you are mixing tab and space indentation. For a language
like Python, that is absolutely scary. Please fix throughout (I think
only spaces is probably best).
> + if dohelp:
> + print "Generates a graph of the include web for specified files."
> + print "Usage: [-finput_file] [-h] [-ooutput] [file1 ... [filen]]"
> + print " -finput_file : Input file is file containing a list of files"
> + print " -ooutput : Specifies output to output.dot and output.png"
> + print " defaults to graph.dot and graph.png"
> + print " -nnum : specifies the # of edges beyond which sfdp is invoked. def=0"
> + print " -a : Aggregate all .c files to 1 file. Shows only include web."
> + print " -at : Aggregate, but don't include terminal.h to .c links. "
> + print " -h : help"
The formatting of the help output seems somewhat random. Also "is a file"?
> + if len(inc) > 0:
> + # inc2 = re.findall (ur"defined *\((.+?)\)", inc[0])
> + inc2 = re.findall (ur"defined\s*\((.+?)\)", inc[0])
Intentionally commented out?
> +
> + def process_ii (filen):
> + return process_include_info (filen, False, False)
> +
> + def process_ii_macro (filen):
> + return process_include_info (filen, True, False)
> +
> + def process_ii_src (filen):
> + return process_include_info (filen, False, True)
> +
> + def process_ii_macro_src (filen):
> + return process_include_info (filen, True, True)
> +
> + def ii_base (iinfo):
> + return iinfo[0]
> +
> + def ii_path (iinfo):
> + return iinfo[1]
> +
> + def ii_include_list (iinfo):
> + return iinfo[2]
> +
> + def ii_include_list_cond (iinfo):
> + return iinfo[3]
> +
> + def ii_include_list_non_cond (iinfo):
> + l = ii_include_list (iinfo)
> + for n in ii_include_list_cond (iinfo):
> + l.remove (n)
> + return l
> +
> + def ii_macro_consume (iinfo):
> + return iinfo[4]
> +
> + def ii_macro_define (iinfo):
> + return iinfo[5]
> +
> + def ii_src (iinfo):
> + return iinfo[6]
> +
> + def ii_src_line (iinfo):
> + return iinfo[7]
That's a lot of little functions with pretty much no clue for the reader
what's going on. It looks like maybe there's an array where a struct
should have been used?
> + # extract data for include file name_h and enter it into the dictionary.
> + # this doesnt change once read in. use_requies is True if you want to
"does not", "use_requies"
> + # find FIND in src, and replace it with the list of includes in REPLACE
> + # remove any duplicates of find or replace, and if some of hte replace
"hte"
> + # includes occur earlier in the inlude chain, leave them.
"inlude"
> + # compensate for this stupid warning that should be an error for
> + # inlined templates
> + def get_make_rc (rc, output):
> + rc = rc % 1280
> + if rc == 0:
> + # This is not considered a fatal error for a build! /me rolls eyes
> + h = re.findall ("warning: inline function.*used but never defined", output)
> + if len(h) != 0:
> + rc = 1
> + return rc;
What's this about?
> + print " -a : Show only files which*all* listed files are included"
Whitespace around *all*. Seems to happen quite often.
> + # given a header name, normalize it. ie cp/cp-tree.h could be in gcc, while
Formatting, capitalization.
> + # the same header could be referenecd from within the cp subdirectory as
"referenced"
> + # Adds a header file and it's sub includes to the global dictionary if they
This time, "its".
> + # aren't already there. SPecify s_path since different build directories may
"SPecify"
> + if usage:
> + print "Attempts to remove extraneous include files from source files. "
> + print " "
> + print "Should be run from the main gcc source directory, and works on a target"
> + print "directory, as we attempt to make the 'all' target."
> + print " "
> + print "By default, gcc-reorder-includes is run on each file before attempting"
> + print "to remove includes. this removes duplicates and puts some headers in a"
> + print "canonical ordering"
> + print " "
> + print "The build directory should be ready to compile via make. Time is saved "
Space at the end of the line (two cases in this block).
> + print " "
> + print " show in a hierarchical visual format how many times each header file"
> + print " is included ina source file. Should be run from the source directory"
"ina".
Bernd
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 4/3] Header file reduction - Tools for contrib
2015-10-06 12:02 ` Bernd Schmidt
@ 2015-10-06 14:04 ` Andrew MacLeod
2015-10-06 14:57 ` Bernd Schmidt
2015-10-06 21:27 ` Jeff Law
2015-10-06 16:32 ` Joseph Myers
` (3 subsequent siblings)
4 siblings, 2 replies; 65+ messages in thread
From: Andrew MacLeod @ 2015-10-06 14:04 UTC (permalink / raw)
To: gcc-patches, Bernd Schmidt
On 10/06/2015 08:02 AM, Bernd Schmidt wrote:
>
>>> There are 9 tools I used over the run of the project. They were
>>> developed in various stages and iterations, but I tried to at least
>>> have
>>> some common interface things, and I tried some cleaning up and
>>> documentation.
>
> I'll probably have to make multiple passes over this. A disclaimer
> first, I have done enough Python programming to develop a dislike for
> the language, but not enough to call myself an expert.
>
> General comments first. Where applicable, I think we should apply the
> same coding standards to Python as we do for C/C++. That means things
> like function comments documenting parameters. They are absent for the
> most part in this patch, and I won't point out individual instances.
> Also, I think the documentation should follow our usual rules. There
> are spelling and grammar problems. I will point out what I find (only
> the first instance for recurring problems), but please proofread the
> whole thing for the next submission. The Thunderbird spellchecker
> actually is pointing out a lot of these. Capitalize starts of
> sentences, write full sentences and terminate with punctuation.
>
I primarily submitted it early because you wanted to look at the tools
before the code patch, which is the one I care about since the longer it
goes, the more effort it is to update the patch to mainline. I
apologize for not proofreading it as much as I usually do. My longer
term intention was to polish the readme stuff and put it into each tool
as well.
however, none of the other tools or scripts in contrib subscribe to
commenting every function the same as we do for c/c++. I did put
comments in many places where it wasn't obvious what was going on to
help with readability, but other cases it seemed obvious enough not to
bother. I don't mind adding missing ones that are important, but I do
not see why every function needs to have the full c/c++ coding standard
applied to it when no other tool does. These certainly appear as good
to me if not better than the existing scripts...
>> No commenting on the quality of python code... :-) I was
>> learning python on the fly. Im sure some things are QUITE awful.,
>
> Yeah, the general impression is of fairly ad-hoc code. Not sure how
> much can be done about this.
they were never intended as general purpose tools, they were developed
over multiple iterations and bugfixing and never properly designed..
they were never originally intended for public submission, so they
suffer... and I'm not interested in rewriting them yet again
Andrew
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 4/3] Header file reduction - Tools for contrib
2015-10-06 14:04 ` Andrew MacLeod
@ 2015-10-06 14:57 ` Bernd Schmidt
2015-10-06 19:19 ` Andrew MacLeod
2015-10-06 21:27 ` Jeff Law
1 sibling, 1 reply; 65+ messages in thread
From: Bernd Schmidt @ 2015-10-06 14:57 UTC (permalink / raw)
To: Andrew MacLeod, gcc-patches
On 10/06/2015 04:04 PM, Andrew MacLeod wrote:
> I primarily submitted it early because you wanted to look at the tools
> before the code patch, which is the one I care about since the longer it
> goes, the more effort it is to update the patch to mainline.
The problem is that the generated patch is impossible to review on its
own. It's just a half a megabyte dump of changes that can't
realistically be verified for correctness. Reading it can throw up some
interesting questions which can then (hopefully) be answered by
reference to the tools, such as "why does timevar.h move?" For that to
work, the tools need at least to have a minimum level of readability.
They are the important part here, not the generated patch. (Unless you
find a reviewer who's less risk-averse than me and is willing to approve
the whole set and hope for the best.)
I suspect you'll have to regenerate the includes patch anyway, because
of the missing #undef tracking I mentioned.
Let's consider the timevar.h example a bit more. Does the include have
to move? I don't see anything in that file that looks like a dependency,
and include files that need it are already including it. Is the fact
that df.h includes it in any way material for generating an order of
headers? IMO, no, it's an unnecessary change indicating a bug in the
script, and any kind of unnecessary change in a patch like this makes it
so much harder to verify. I think the canonical order that's produced
should probably ignore files included from other headers so that these
are left alone in their original order.
I'd still like more explanations of special cases in the tools like the
diagnostic.h area as well as
# seed tm.h with options.h since its a build file and won't be seen.
and I think we need to understand what makes them special in a way that
makes the rest of the algorithm not handle them correctly (so that we
don't overlook any other such cases).
Bernd
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 4/3] Header file reduction - Tools for contrib
2015-10-06 12:02 ` Bernd Schmidt
2015-10-06 14:04 ` Andrew MacLeod
@ 2015-10-06 16:32 ` Joseph Myers
2015-10-06 19:18 ` Andrew MacLeod
` (2 subsequent siblings)
4 siblings, 0 replies; 65+ messages in thread
From: Joseph Myers @ 2015-10-06 16:32 UTC (permalink / raw)
To: Bernd Schmidt; +Cc: Bernd Schmidt, Andrew MacLeod, gcc-patches
On Tue, 6 Oct 2015, Bernd Schmidt wrote:
> General comments first. Where applicable, I think we should apply the same
> coding standards to Python as we do for C/C++. That means things like function
FWIW, glibc's rule is to follow PEP 8 formatting for Python code.
https://sourceware.org/glibc/wiki/Style_and_Conventions#Code_formatting_in_python_sources
--
Joseph S. Myers
joseph@codesourcery.com
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 4/3] Header file reduction - Tools for contrib
2015-10-06 12:02 ` Bernd Schmidt
2015-10-06 14:04 ` Andrew MacLeod
2015-10-06 16:32 ` Joseph Myers
@ 2015-10-06 19:18 ` Andrew MacLeod
2015-10-07 16:35 ` Andrew MacLeod
2015-10-08 16:31 ` [patch 4/3] Header file reduction - Tools for contrib David Malcolm
4 siblings, 0 replies; 65+ messages in thread
From: Andrew MacLeod @ 2015-10-06 19:18 UTC (permalink / raw)
To: Bernd Schmidt, Bernd Schmidt, gcc-patches
On 10/06/2015 08:02 AM, Bernd Schmidt wrote:
>
>
> This sounds like the intention is to move recognized core files (I
> assume these are the ones in the "order" array in the tool) to the
> start, and leaving everything alone? I was a bit confused about this
> at first; I see for example "timevar.h" moving around without being
> present in the list, but it looks like it gets added implicitly
> through being included by df.h. (Incidentally, this looks like another
> case like the obstack one - a file using timevars should include
> timevar.h IMO, even if it also includes df.h).
>
Ordering the includes is perhaps more complex than you realize. It more
complex than I realized when I first started it. it took a long and very
frustrating period to get it working properly.
There are implicit dependencies between some include files. The primary
ordering list is to provide a canonical order for key files so that
those dependencies are automatically taken care of. Until now we've
managed it by hand. The problem is that the dependencies are not
necessary always from the main header file.. they may come from one of
the headers that were included in it. There are lots of dependencies on
symtab.h for instance, which comes from tree.h Some other source files
don't need tree.h, but they do need symtab.h. If symtab.h isn't in the
ordering list and the header which uses it is (like cgraph.h) , the tool
would move cgraph.h above symtab.h and the result doesn't work.
The solution is to take that initial canonical list, and fully expand it
to include everything that those headers include. This gives a linear
canonical list of close to 100 files. It means things like timevar.h
(which is included by df.h) are in this "ordering":
<...>
regset.h
alloc-pool.h
timevar.h
df.h
tm_p.h
gimple-iterator
<...>
A source file which does not include df.h but includes timevar.h muist
keep it in this same relative ordering, or some other header from the
ordering list which uses timevar.h may no longer compile. (timevar.h
would end up after everything in the canonical list instead of in fromt
of the other file)
This means the any of those 100 headers files which occur in a source
file should occur in this order. The original version of the tool tried
to spell out this exact order, but I realized that was not maintainable
as headers change, and it was actually far simply to specify the core
ones In the tool, and let it do the expansion based on what is in the
current tree.
This also means that taken as a snapshot, you are going to see things
like timevar.h move around in apparently random fashion... but it is not
random. It will be in front of any and all headers listed after it in
the ordering. Any headers which don't appear in the canonical list will
simply retain their current order in the source file, but AFTER all the
ones in the canonical list.
This also made it fairly easy to remove redundant includes that have
been seen already by including some other header... I just build the
list of headers that have been seen already
There are a couple of specialty cases that are handled..
The 'exclude processing' list are headers which shouldn't be expanded
like above. They can cause irreconcilable problems when expanded ,
especially the front end file files. They do need to be ordered since
diagnostics require them to be included first in order to satisfy the
requirement that GCC_DIAG_STYLE be defined before diagnostic.h is
included. Plus most of them include tree.h and/or diagnostic.h
themselves, but we don't want them to impact the ordering for the
backend files.
That list puts those core files in an appropriate place canoncailly, but
doesn't expand into the file because the order we get for the different
front ends would be different . Finally diagnostic*.h and friends are
removed from the list and put at the end to ensure eveything that might
be needed by them is available. Again, the front end files would have
made it much earlier than we wanted for the backend files.
I also disagree with the assertion that " a file using timevars should
include timevar.h IMO, even if it also includes df.h" It could, but I
don't see the value, and I doubt anyone really cares much. If someone
ever removes the only thing that does bring timevar.h, you simply add it
then. That is just part of updating headers. I'm sure before I run
this patch not every file which uses timevar.h actually physically
includes it. This process will set us to a somewhat consistent state.
Its simple enough to remove the ones that are redundant in an
automated way, and very difficult to determine whether they not
required, but contain content that is used.
The fully expanded canonical list looks something like this:
safe-ctype.h
filenames.h
libiberty.h
hwint.h
system.h
insn-modes.h
machmode.h
signop.h
wide-int.h
double-int.h
real.h
fixed-value.h
statistics.h
gtype-desc.h
ggc.h
vec.h
hashtab.h
inchash.h
mem-stats-traits.h
hash-traits.h
hash-map-traits.h
mem-stats.h
hash-map.h
hash-table.h
hash-set.h
line-map.h
input.h
is-a.h
memory-block.h
coretypes.h
options.h
tm.h
function.h
obstack.h
bitmap.h
sbitmap.h
basic-block.h
dominance.h
cfg.h
backend.h
insn-codes.h
hard-reg-set.h
target.h
genrtl.h
rtl.h
c-target.h
c-target-def.h
symtab.h
tree-core.h
tree-check.h
tree.h
cp-tree.h
c-common.h
c-tree.h
gfortran.h
tree-ssa-alias.h
gimple-expr.h
gimple.h
predict.h
cfghooks.h
regset.h
alloc-pool.h
timevar.hdf.h
tm_p.h
gimple-iterators.h
stringpool.h
tree-ssa-operands.h
gimple-ssa.h
tree-ssanames.h
tree-phinodes.h
ssa-iterators.h
ssa.h
expmed.h
insn-opinit.h
optabs-query.h
optabs-libfuncs.h
insn-config.h
optabs.h
regs.h
emit-rtl.h
ira.h
recog.h
ira-int.h
streamer-hooks.h
plugin-api.h
gcov-iov.h
gcov-io.h
wide-int-print.h
pretty-print.h
bversion.h
lto-streamer.h
data-streamer.h
tree-streamer.h
gimple-streamer.h
>
> Intentionally commented out?
>
>> +
>> + def process_ii (filen):
>> + return process_include_info (filen, False, False)
>> +
>> + def process_ii_macro (filen):
>> + return process_include_info (filen, True, False)
>> +
>> + def process_ii_src (filen):
>> + return process_include_info (filen, False, True)
>> +
>> + def process_ii_macro_src (filen):
>> + return process_include_info (filen, True, True)
>> +
>> + def ii_base (iinfo):
>> + return iinfo[0]
>> +
>> + def ii_path (iinfo):
>> + return iinfo[1]
>> +
>> + def ii_include_list (iinfo):
>> + return iinfo[2]
>> +
>> + def ii_include_list_cond (iinfo):
>> + return iinfo[3]
>> +
>> + def ii_include_list_non_cond (iinfo):
>> + l = ii_include_list (iinfo)
>> + for n in ii_include_list_cond (iinfo):
>> + l.remove (n)
>> + return l
>> +
>> + def ii_macro_consume (iinfo):
>> + return iinfo[4]
>> +
>> + def ii_macro_define (iinfo):
>> + return iinfo[5]
>> +
>> + def ii_src (iinfo):
>> + return iinfo[6]
>> +
>> + def ii_src_line (iinfo):
>> + return iinfo[7]
>
> That's a lot of little functions with pretty much no clue for the
> reader what's going on. It looks like maybe there's an array where a
> struct should have been used?
>
there once was a large comment at the start of process_include_info
describing the return value vactor... they simply access it. Im not
sure where it went. I will find and put the big comment back in.
Andrew
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 4/3] Header file reduction - Tools for contrib
2015-10-06 14:57 ` Bernd Schmidt
@ 2015-10-06 19:19 ` Andrew MacLeod
2015-10-06 20:37 ` Bernd Schmidt
0 siblings, 1 reply; 65+ messages in thread
From: Andrew MacLeod @ 2015-10-06 19:19 UTC (permalink / raw)
To: Bernd Schmidt, gcc-patches
On 10/06/2015 10:56 AM, Bernd Schmidt wrote:
> On 10/06/2015 04:04 PM, Andrew MacLeod wrote:
>
>> I primarily submitted it early because you wanted to look at the tools
>> before the code patch, which is the one I care about since the longer it
>> goes, the more effort it is to update the patch to mainline.
>
> The problem is that the generated patch is impossible to review on its
> own. It's just a half a megabyte dump of changes that can't
> realistically be verified for correctness. Reading it can throw up
> some interesting questions which can then (hopefully) be answered by
> reference to the tools, such as "why does timevar.h move?" For that to
> work, the tools need at least to have a minimum level of readability.
> They are the important part here, not the generated patch. (Unless you
> find a reviewer who's less risk-averse than me and is willing to
> approve the whole set and hope for the best.)
I dont get your fear. I could have created that patch by hand, it would
just take a long time, and would likely be less complete, but just as large.
I'm not changing functionality. ALL the tool is doing is removing
header files which aren't needed to compile. It goes to great pains to
make sure it doesn't remove a silent dependency that conditional
compilation might introduce. Other than that, the sanity check is that
everything compiles on every target and regression tests show nothing.
Since we're doing this with just include files, and not changing
functionality, Im not sure what your primary concern is? You are
unlikely to ever be able to read the patch and decide for yourself
whether removing expr.h from the header list is correct or not. Much
like if I proposed the same thing by hand.
Yes, I added the other tool in which reorders the headers and removes
duplicates, and perhaps that is what is causing you the angst. The
canonical ordering was developed by taking current practice and adding
in other core files which had ordering issues that showed up during the
reduction process. Reorderiing all files to this order should actually
resolve more issues than it causes. I can generate and provide that as
a patch if you want to look at it separately... I dont know what that
buys you. you could match the includes to the master list to make sure
the tool did its job by itself I guess.
The tools are unlikely to ever be used again... Jeff suggested I provide
them to contrib just in case someone decided to do something with them
someday, they wouldn't be lost,or at least they wouldn't have to track
me down to get them.
IF we discover that one or more of the tools does continue to have some
life, well then maybe at that point its worth putting some time into
refining it a bit better.
> I suspect you'll have to regenerate the includes patch anyway, because
> of the missing #undef tracking I mentioned.
I dont see that #undef is relevant at all. All the conditional
dependencies care about is "MAY DEFINE" Its conservative in that if
something could be defined, we'll assume it is and not remove any file
which may depend on it. to undefine something in a MAY DEFINE world
doesnt mean anything.
>
> Let's consider the timevar.h example a bit more. Does the include have
> to move? I don't see anything in that file that looks like a
> dependency, and include files that need it are already including it.
> Is the fact that df.h includes it in any way material for generating
> an order of headers? IMO, no, it's an unnecessary change indicating a
> bug in the script, and any kind of unnecessary change in a patch like
> this makes it so much harder to verify. I think the canonical order
> that's produced should probably ignore files included from other
> headers so that these are left alone in their original order.
>
I covered this in the last note. Pretty much every file is going to
have a "core" of up to 95 files reordered into the canonical form, which
taken as a snapshot of any given file, may look arbitrary but is in fact
a specific subset of the canonical ordering. You cant only order some
parts of it because there are subtle dependencies between the files
which force you to look at them all. Trust me, I didnt start by
reordering all of them this way... it developed over time.
> I'd still like more explanations of special cases in the tools like
> the diagnostic.h area as well as
> # seed tm.h with options.h since its a build file and won't be seen.
> and I think we need to understand what makes them special in a way
> that makes the rest of the algorithm not handle them correctly (so
> that we don't overlook any other such cases).
>
See the other note, its because of the front end files/diagnostic
dependencies or irreconcilable cycles because of what a header
includes. Any other case would have shown up the way those did
during development.
Andrew
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 4/3] Header file reduction - Tools for contrib
2015-10-06 19:19 ` Andrew MacLeod
@ 2015-10-06 20:37 ` Bernd Schmidt
2015-10-06 21:30 ` Jeff Law
2015-10-06 22:43 ` Andrew MacLeod
0 siblings, 2 replies; 65+ messages in thread
From: Bernd Schmidt @ 2015-10-06 20:37 UTC (permalink / raw)
To: Andrew MacLeod, gcc-patches
On 10/06/2015 09:19 PM, Andrew MacLeod wrote:
> I dont get your fear. I could have created that patch by hand, it would
> just take a long time, and would likely be less complete, but just as
> large.
>
> I'm not changing functionality. ALL the tool is doing is removing
> header files which aren't needed to compile. It goes to great pains to
> make sure it doesn't remove a silent dependency that conditional
> compilation might introduce. Other than that, the sanity check is that
> everything compiles on every target and regression tests show nothing.
> Since we're doing this with just include files, and not changing
> functionality, Im not sure what your primary concern is?
My concern is that I've seen occasions in the past where "harmless
cleanups" that were not intended to alter functionality introduced
severe and subtle bugs that went unnoticed for a significant amount of
time. If a change does not alter functionality, then there is a valid
question of "why apply it then?", and the question of correctness
becomes very important (to me anyway). The patch was produced by a
fairly complex process, and I'd want to at least be able to convince
myself that the process is correct.
Anyhow, I'll step back from this, you're probably better served by
someone else reviewing the patch.
Bernd
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 4/3] Header file reduction - Tools for contrib
2015-10-06 14:04 ` Andrew MacLeod
2015-10-06 14:57 ` Bernd Schmidt
@ 2015-10-06 21:27 ` Jeff Law
1 sibling, 0 replies; 65+ messages in thread
From: Jeff Law @ 2015-10-06 21:27 UTC (permalink / raw)
To: Andrew MacLeod, gcc-patches, Bernd Schmidt
On 10/06/2015 08:04 AM, Andrew MacLeod wrote:
>>> No commenting on the quality of python code... :-) I was
>>> learning python on the fly. Im sure some things are QUITE awful.,
>>
>> Yeah, the general impression is of fairly ad-hoc code. Not sure how
>> much can be done about this.
> they were never intended as general purpose tools, they were developed
> over multiple iterations and bugfixing and never properly designed..
> they were never originally intended for public submission, so they
> suffer... and I'm not interested in rewriting them yet again
So a little background for Bernd.
The tangled mess that our header files has been makes it extremely
difficult to do something introduce a new classes/interfaces to improve
the separation of various parts of GCC. Consider the case if we
wanted to drop trees from gimple onward by initially wrapping trees in a
trivially compatible class then converting files one by one to use the
new representation.
We'd want to be able to do the conversion, then ensure ourselves that
the old interfaces couldn't sneak in. Getting there required some
significant header file deconstruction, then reconstruction.
So Andrew set forth to try and untangle the mess of dependencies, remove
unnecessary includes, etc etc. He had the good sense to write some
scripts to help :-0
A few months ago as this stage of refactoring header files as nearing
completion, I asked Andrew how we were going to prevent things from
getting into the sorry shape we were in last year. From that discussion
the suggestion that he should polish up his scripts and submit them for
inclusion into the contrib/ subdirectory for future reference/use.
Ideally we'd occasionally run those scripts to ensure that we don't muck
things up too badly again in the future.
Anyway, that's how we got here. The scripts are just helper tools, but
I wouldn't consider them a core part of GCC. Obviously the cleaner and
easier to run, the better.
It's interesting that a lot of work done by Andrew has ended up
mirroring stuff I'm reading these days in Feathers' book.
Jeff
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 4/3] Header file reduction - Tools for contrib
2015-10-06 20:37 ` Bernd Schmidt
@ 2015-10-06 21:30 ` Jeff Law
2015-10-06 22:43 ` Andrew MacLeod
1 sibling, 0 replies; 65+ messages in thread
From: Jeff Law @ 2015-10-06 21:30 UTC (permalink / raw)
To: Bernd Schmidt, Andrew MacLeod, gcc-patches
On 10/06/2015 02:37 PM, Bernd Schmidt wrote:
> On 10/06/2015 09:19 PM, Andrew MacLeod wrote:
>> I dont get your fear. I could have created that patch by hand, it would
>> just take a long time, and would likely be less complete, but just as
>> large.
>>
>> I'm not changing functionality. ALL the tool is doing is removing
>> header files which aren't needed to compile. It goes to great pains to
>> make sure it doesn't remove a silent dependency that conditional
>> compilation might introduce. Other than that, the sanity check is that
>> everything compiles on every target and regression tests show nothing.
>> Since we're doing this with just include files, and not changing
>> functionality, Im not sure what your primary concern is?
>
> My concern is that I've seen occasions in the past where "harmless
> cleanups" that were not intended to alter functionality introduced
> severe and subtle bugs that went unnoticed for a significant amount of
> time. If a change does not alter functionality, then there is a valid
> question of "why apply it then?", and the question of correctness
> becomes very important (to me anyway). The patch was produced by a
> fairly complex process, and I'd want to at least be able to convince
> myself that the process is correct.
A very valid concern. In fact, one could argue that one of the long
term problems we're likely to face as a project is the inability to do
this kind of refactoring with high degrees of confidence that we're not
breaking things.
>
> Anyhow, I'll step back from this, you're probably better served by
> someone else reviewing the patch.
That's fine. I don't mind covering this.
jeff
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 0/3] Header file reduction.
2015-10-05 20:10 ` Andrew MacLeod
2015-10-05 20:37 ` Bernd Schmidt
@ 2015-10-06 21:44 ` Jeff Law
2015-10-07 8:16 ` Richard Biener
1 sibling, 1 reply; 65+ messages in thread
From: Jeff Law @ 2015-10-06 21:44 UTC (permalink / raw)
To: Andrew MacLeod, Bernd Schmidt, gcc-patches
On 10/05/2015 02:10 PM, Andrew MacLeod wrote:
>>
>> Is the bitmap/obstack example really one of a change that is
>> desirable? I think if a file uses obstacks then an include of
>> obstack.h is perfectly fine, giving us freedom to e.g. change bitmaps
>> not to use obstacks. Given that multiple headers include obstack.h,
>> and pretty much everything seems to indirectly include bitmap.h
>> anyway, maybe a better change would be to just include it always in
>> system.h.
>
> Its just an example of the sort of redundant includes the tool removes.
It may not be the best example. The tools don't treat obstack specially
(nor should they IMHO). So let's pretend it's not obstack.h which has
been arguably a core part of GCC for a long time.
>
> I don't see the point in leaving redundant #includes in the source
> code because of direct uses from that header in the source. I'm not
> even sure how I could automate detecting that accurately.. Going
> forward, If anyone ever makes a change which removes a header from an
> include file, they just have to correct the fallout. heh. Thats kinda
> all I've done for 4 months :-) At least we'll have grasp of the
> ramifications..
And the last sentence is the key here. We're trying to get to a point
where we can make certain kinds of changes, then have the compiler spit
out errors, fix the errors and have a high degree of confidence that the
final change is correct and that we've found all the places that need to
change.
The change could be as simple as moving a function declaration to its
natural place, collecting interfaces & data into classes, or something
more ambitious like removing trees from the backend. Folks will note
that these are all refactorings that we don't want to change any
observable behaviour.
>> * diff -c is somewhat unusual and I find diff -u much more readable.
>
> unsual? I've been using -cp for the past 2 decades and no one has ever
> mentioned it before... poking around the wiki I see it mentions you
> can use either -up or -cp.
>
> I guess I could repackage things using -up... I don't even know where
> my script is to change it :-). is -u what everyone uses now? no one
> has mentioned it before that I am aware of.
I'm probably the last person in the world that still generally prefers
-cp :-) I'm getting to the point where I can tolerate -u.
>
>
>> * Maybe the patches for reordering and removing should be split, also
>> for readability and for easier future identification of problems.
>>
> I was trying to avoid too much churn on 550ish files... I didn't think
> each one needed 2 sets of check-ins. It could be done, but it will
> take a while. The reordering patch can be quickly generated, but the
> reduction on all those files will take the better part of a week.
>
> My theory is it perfectly safe to back out any single file from the
> patch set if we discover it has an issue and then examine what the root
> of the problem is..
>
> tool patch coming shortly... probably tomorrow now.
I haven't looked at the 3 patches in detail yet. Given my familiarity
with the overall process/goal, I can probably handle them as-is.
They're just big mechanical changes.
jeff
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 0/3] Header file reduction.
2015-10-05 21:11 ` Andrew MacLeod
2015-10-06 3:03 ` [patch 0/3] Header file reduction. - unified patches Andrew MacLeod
@ 2015-10-06 21:55 ` Jeff Law
1 sibling, 0 replies; 65+ messages in thread
From: Jeff Law @ 2015-10-06 21:55 UTC (permalink / raw)
To: Andrew MacLeod, Bernd Schmidt, gcc-patches
On 10/05/2015 03:11 PM, Andrew MacLeod wrote:
>
> In any case, a direct include of obstack.h in coretypes.h was considered
> earlier in the aggregation process and it didn't show up as something
> that would be a win. It is included a couple of common places that we
> have no control over.. in particular libcpp/include/symtab.h includes
> obstack.h and is included by tree-core.h. A very significant number of
> files bring that in. If we included obstack.h in coretypes.h then those
> files would be including it again for a second time for no particularly
> good reason. So I made the judgement call to not put it in coretypes.h.
And just as important, we can revisit the aggregators and when we do so,
we ought to be able to answer the question, "if obstack.h is put into
coretypes.h" does that clean things up elsewhere and re-run the tools to
clean things up.
>
>> And it's one example, but it does point out a problem with this sort
>> of automated approach: realistically no one is going to check the
>> whole patch, and it may contain changes that could be done better.
>
> The point being that the aggregation *wasn't* automated... and has
> nothing to do with this patch set. I analyzed and preformed all that
> sort of thing earlier. Sure judgment calls were made, but it wasn't
> automated in the slightest. There are certainly further aggregation
> improvements that could be made... and either I or someone else could do
> more down the road., The heavy lifting has all been done now.
Agreed.
>
> So the *only* thing that is automated is removing include files which
> are not needed so that we can get an idea of what the true dependencies
> in the source base are.
Also agreed.
>>> the reduction on all those files will take the better part of a week.
>>
>> That's a little concerning due to the possibility of intervening
>> commits. I'd like to make one requirement for checkin, that you take
>> the revision at which you're committing and then run the script again,
>> verifying that the process produces the same changes as the patch you
>> committed. (Or do things in smaller chunks.).
>>
>
> Well, sure there are intervening commits.. the only ones that actually
> matter are the ones which fail to compile because someone made a code
> change which now requires a header that wasn't needed before. which is
> really a state we're looking for I think. I fix those up before
> committing. Its *possible* a conditional compilation issue could creep
> in, but highly unlikely.
More likely is conditional compilation will be removed :-) We're trying
to get away from conditional compilation as a general direction.
Intervening commits are always a problem with this kind of large patch
that hits many places. But IMHO, they're an easily managed problem.
jeff
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 4/3] Header file reduction - Tools for contrib
2015-10-06 20:37 ` Bernd Schmidt
2015-10-06 21:30 ` Jeff Law
@ 2015-10-06 22:43 ` Andrew MacLeod
1 sibling, 0 replies; 65+ messages in thread
From: Andrew MacLeod @ 2015-10-06 22:43 UTC (permalink / raw)
To: Bernd Schmidt, gcc-patches
On 10/06/2015 04:37 PM, Bernd Schmidt wrote:
> On 10/06/2015 09:19 PM, Andrew MacLeod wrote:
>> I dont get your fear. I could have created that patch by hand, it would
>> just take a long time, and would likely be less complete, but just as
>> large.
>>
>> I'm not changing functionality. ALL the tool is doing is removing
>> header files which aren't needed to compile. It goes to great pains to
>> make sure it doesn't remove a silent dependency that conditional
>> compilation might introduce. Other than that, the sanity check is that
>> everything compiles on every target and regression tests show nothing.
>> Since we're doing this with just include files, and not changing
>> functionality, Im not sure what your primary concern is?
>
> My concern is that I've seen occasions in the past where "harmless
> cleanups" that were not intended to alter functionality introduced
> severe and subtle bugs that went unnoticed for a significant amount of
> time. If a change does not alter functionality, then there is a valid
> question of "why apply it then?", and the question of correctness
> becomes very important (to me anyway). The patch was produced by a
> fairly complex process, and I'd want to at least be able to convince
> myself that the process is correct.
>
> Anyhow, I'll step back from this, you're probably better served by
> someone else reviewing the patch.
>
>
> Bernd
I do get it. And I have spent a lot of time trying to make sure none of
those sort of bugs come in, and ultimately have tried to be
conservative.. after all, its better to have the tool leave an include
than remove one that may be required.
Ultimately, these changes are unlikely to introduce an issue, but there
is a very slight possibility. Any issues that do surface should be of
the "not using a pattern" kind because a conditional compilation code
case was somehow missed. I'm hoping for none of those obviously.
Anyway, the tool does seem to work on all the tests I have looked at.
If any bugs are uncovered by this, then they are also latent issues we
didn't know about that should be exposed and fixed anyway.
I am fine if we'd like to separate the patches into the reordering, and
the deleting. Its not a lot of effort on my part, just a lot of time
compiling for the reducer in the background.. and we can do them as 2
commits if that is helpful.
What I don't want to do is spend a lot more time massaging the tools for
contrib because I am sick of looking at them right now, and no one is in
a hurry to use them anyway... if anyone ever does.:-) The documentation
grammer should certainly be fixed up and I will add some comments around
the questions you had.
we could also do a small scale submission on half a dozen files, provide
the reorder patch, and then the reduction patch with the logs if that
helps whoever is reviewing get comfortable with what the tool is doing,
then its easier to simply acknowledge the mechanical nature of the large
commit.
Perhaps it would be educational anyway.
I'll do it however you guys want... i just want to get it done :-)
Andrew
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 0/3] Header file reduction.
2015-10-06 21:44 ` Jeff Law
@ 2015-10-07 8:16 ` Richard Biener
2015-10-08 15:48 ` Michael Matz
0 siblings, 1 reply; 65+ messages in thread
From: Richard Biener @ 2015-10-07 8:16 UTC (permalink / raw)
To: Jeff Law; +Cc: Andrew MacLeod, Bernd Schmidt, gcc-patches
On Tue, Oct 6, 2015 at 11:43 PM, Jeff Law <law@redhat.com> wrote:
> On 10/05/2015 02:10 PM, Andrew MacLeod wrote:
>>>
>>>
>>> Is the bitmap/obstack example really one of a change that is
>>> desirable? I think if a file uses obstacks then an include of
>>> obstack.h is perfectly fine, giving us freedom to e.g. change bitmaps
>>> not to use obstacks. Given that multiple headers include obstack.h,
>>> and pretty much everything seems to indirectly include bitmap.h
>>> anyway, maybe a better change would be to just include it always in
>>> system.h.
>>
>>
>> Its just an example of the sort of redundant includes the tool removes.
>
> It may not be the best example. The tools don't treat obstack specially
> (nor should they IMHO). So let's pretend it's not obstack.h which has been
> arguably a core part of GCC for a long time.
>
>>
>> I don't see the point in leaving redundant #includes in the source
>> code because of direct uses from that header in the source. I'm not
>> even sure how I could automate detecting that accurately.. Going
>> forward, If anyone ever makes a change which removes a header from an
>> include file, they just have to correct the fallout. heh. Thats kinda
>> all I've done for 4 months :-) At least we'll have grasp of the
>> ramifications..
>
> And the last sentence is the key here. We're trying to get to a point where
> we can make certain kinds of changes, then have the compiler spit out
> errors, fix the errors and have a high degree of confidence that the final
> change is correct and that we've found all the places that need to change.
>
> The change could be as simple as moving a function declaration to its
> natural place, collecting interfaces & data into classes, or something more
> ambitious like removing trees from the backend. Folks will note that these
> are all refactorings that we don't want to change any observable behaviour.
>
>
>
>
>
>>> * diff -c is somewhat unusual and I find diff -u much more readable.
>>
>>
>> unsual? I've been using -cp for the past 2 decades and no one has ever
>> mentioned it before... poking around the wiki I see it mentions you
>> can use either -up or -cp.
>>
>> I guess I could repackage things using -up... I don't even know where
>> my script is to change it :-). is -u what everyone uses now? no one
>> has mentioned it before that I am aware of.
>
> I'm probably the last person in the world that still generally prefers -cp
> :-) I'm getting to the point where I can tolerate -u.
No, I prefer -cp too - diff just too easily makes a mess out of diffs with -u,
esp. if you have re-indenting going on as well.
Richard.
>
>>
>>
>>> * Maybe the patches for reordering and removing should be split, also
>>> for readability and for easier future identification of problems.
>>>
>> I was trying to avoid too much churn on 550ish files... I didn't think
>> each one needed 2 sets of check-ins. It could be done, but it will
>> take a while. The reordering patch can be quickly generated, but the
>> reduction on all those files will take the better part of a week.
>>
>> My theory is it perfectly safe to back out any single file from the
>> patch set if we discover it has an issue and then examine what the root
>> of the problem is..
>>
>> tool patch coming shortly... probably tomorrow now.
>
> I haven't looked at the 3 patches in detail yet. Given my familiarity with
> the overall process/goal, I can probably handle them as-is. They're just big
> mechanical changes.
>
> jeff
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 4/3] Header file reduction - Tools for contrib
2015-10-06 12:02 ` Bernd Schmidt
` (2 preceding siblings ...)
2015-10-06 19:18 ` Andrew MacLeod
@ 2015-10-07 16:35 ` Andrew MacLeod
2015-10-14 15:14 ` [patch 4/3] Header file reduction - Tools for contrib - second cut Andrew MacLeod
2015-10-08 16:31 ` [patch 4/3] Header file reduction - Tools for contrib David Malcolm
4 siblings, 1 reply; 65+ messages in thread
From: Andrew MacLeod @ 2015-10-07 16:35 UTC (permalink / raw)
To: Bernd Schmidt, Bernd Schmidt, gcc-patches, law >> Jeff Law
I went through and addressed the comments.. Just for info, a few replies:
>> + # various front ends have to set GCC_DIAG_STYLE before
>> including it
>> + # for each file, we'll tailor where it belongs by looking at
>> the dup
>> + # list and seeing which file is included, and position it
>> appropriately.
>
> From that comment it's not entirely clear how they are handled. Please
> expand documentation of this mechanism.
I modified the comments in a couple of places to hopefully make it clearer.
>> + for i in order:
>> + create_master_list (os.path.basename (i), False)
>
> I found myself wanting to pass True. The tool could use a "-v" flag.
>
I changed the existing -s flag to -v, and simply passed the value
here... Now you see the final list, as well as the list of where each
one came from.
>> + for line in logfile:
>> + if len (line) > 21 and line[:21] in depstring:
>> + if newinc:
>> + incfrom = list()
>> + newinc = False
>
> It looks like you are mixing tab and space indentation. For a language
> like Python, that is absolutely scary. Please fix throughout (I think
> only spaces is probably best).
>
vi is doing that automatically for me.. I will expandtabs everything.
>> + # compensate for this stupid warning that should be an error for
>> + # inlined templates
>> + def get_make_rc (rc, output):
>> + rc = rc % 1280
>> + if rc == 0:
>> + # This is not considered a fatal error for a build! /me rolls
>> eyes
>> + h = re.findall ("warning: inline function.*used but never
>> defined", output)
>> + if len(h) != 0:
>> + rc = 1
>> + return rc;
>
> What's this about?
I've updated the comment to be clearer. Apparently its only a warning
to use a template inline function with no definition. I suspect this is
some oddball C++ thing :-). Maybe it can be resolved at link time
somehow? Anyway, what I found is that the return code from this is 0
since its just a warning. SO the tool would remove the header file, and
when I later try to link and build and object, it becomes a fatal link
error with the function used but undefined.
It shows up when checking target builds since I only try to build the .o
file there rather than build and link. So the tool checks the output
from the compilation, and if it sees this error, decided to be
conservative and report it as a build error, and thus it will leave the
header file in the source.
>
>> + print " -a : Show only files which*all* listed files are included"
>
> Whitespace around *all*. Seems to happen quite often.
Yeah, that is very odd. In the code here, there is a space in front of
every single one of those. I simply changed all these to 'all' instead
of '*all*'.
I'm also going to add a few more comments to functions in
gcc-order-headers and reduce-headers, as well as utils.py
Andrew
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 1/3] Header file reduction - backend files.
2015-10-02 2:33 ` [patch 1/3] Header file reduction - backend files Andrew MacLeod
@ 2015-10-07 22:02 ` Jeff Law
2015-10-07 23:09 ` Andrew MacLeod
2015-10-08 13:37 ` [patch] header file re-ordering Andrew MacLeod
2015-10-22 22:33 ` [patch 1/3] Header file reduction - backend files Jeff Law
1 sibling, 2 replies; 65+ messages in thread
From: Jeff Law @ 2015-10-07 22:02 UTC (permalink / raw)
To: Andrew MacLeod, gcc-patches
On 10/01/2015 08:33 PM, Andrew MacLeod wrote:
> these are all in the main gcc directory. 297 files total.
>
> Everything bootstraps on x86_64-pc-linux-gnu and
> powerpc64le-unknown-linux-gnu. All targets in config-list.mk still
> build. Regressions tests also came up clean.
>
> OK for trunk?
So as I look at this and make various spot checks, what really stands
out is how often something like alias.h gets included, often in places
that have absolutely no business/need to be looking at that file.
Cut-n-paste at its worst. It happens to many others, but alias.h seems
to have gotten its grubby self into just about everywhere for reasons
unkonwn.
I find myself also wondering if a two step approach would make this
easier. Step #1 being ordering the headers, step #2 being removal of
the duplicates. As you note, the downside is two checkins that would
affect most files in the tree. I guess I'll keep slogging through the
patch as is...
jeff
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 1/3] Header file reduction - backend files.
2015-10-07 22:02 ` Jeff Law
@ 2015-10-07 23:09 ` Andrew MacLeod
2015-10-08 13:37 ` [patch] header file re-ordering Andrew MacLeod
1 sibling, 0 replies; 65+ messages in thread
From: Andrew MacLeod @ 2015-10-07 23:09 UTC (permalink / raw)
To: Jeff Law, gcc-patches
On 10/07/2015 06:02 PM, Jeff Law wrote:
> On 10/01/2015 08:33 PM, Andrew MacLeod wrote:
>> these are all in the main gcc directory. 297 files total.
>>
>> Everything bootstraps on x86_64-pc-linux-gnu and
>> powerpc64le-unknown-linux-gnu. All targets in config-list.mk still
>> build. Regressions tests also came up clean.
>>
>> OK for trunk?
> So as I look at this and make various spot checks, what really stands
> out is how often something like alias.h gets included, often in places
> that have absolutely no business/need to be looking at that file.
> Cut-n-paste at its worst. It happens to many others, but alias.h
> seems to have gotten its grubby self into just about everywhere for
> reasons unkonwn.
>
> I find myself also wondering if a two step approach would make this
> easier. Step #1 being ordering the headers, step #2 being removal of
> the duplicates. As you note, the downside is two checkins that would
> affect most files in the tree. I guess I'll keep slogging through the
> patch as is...
>
> jeff
No problem... I can generate the header reordering patch for you to
look at. gotta run right now, but either later tonight or first thingin
the morning.
alias.h is particularly bad because there were some headers which had
stupid dependencies. I broke those dependencies a couple of months ago,
now it doesn't need to be everywhere anymore. I had also noticed it was
the one which was removed the most frequently now :-)
Andrew
^ permalink raw reply [flat|nested] 65+ messages in thread
* [patch] header file re-ordering.
2015-10-07 22:02 ` Jeff Law
2015-10-07 23:09 ` Andrew MacLeod
@ 2015-10-08 13:37 ` Andrew MacLeod
2015-10-08 15:29 ` Jeff Law
` (4 more replies)
1 sibling, 5 replies; 65+ messages in thread
From: Andrew MacLeod @ 2015-10-08 13:37 UTC (permalink / raw)
To: Jeff Law, gcc-patches
[-- Attachment #1: Type: text/plain, Size: 2194 bytes --]
On 10/07/2015 06:02 PM, Jeff Law wrote:
> On 10/01/2015 08:33 PM, Andrew MacLeod wrote:
>> these are all in the main gcc directory. 297 files total.
>>
>> Everything bootstraps on x86_64-pc-linux-gnu and
>> powerpc64le-unknown-linux-gnu. All targets in config-list.mk still
>> build. Regressions tests also came up clean.
>>
>> OK for trunk?
> So as I look at this and make various spot checks, what really stands
> out is how often something like alias.h gets included, often in places
> that have absolutely no business/need to be looking at that file.
> Cut-n-paste at its worst. It happens to many others, but alias.h
> seems to have gotten its grubby self into just about everywhere for
> reasons unkonwn.
>
> I find myself also wondering if a two step approach would make this
> easier. Step #1 being ordering the headers, step #2 being removal of
> the duplicates. As you note, the downside is two checkins that would
> affect most files in the tree. I guess I'll keep slogging through the
> patch as is...
>
> jeff
Heres the patch for reordered headers. Building as we speak. Hard to
fully verify since Ada doesn't seem to bootstrap on trunk at the moment:
+===========================GNAT BUG DETECTED==============================+
| 6.0.0 20151008 (experimental) (x86_64-pc-linux-gnu) GCC error: |
| in gen_lowpart_common, at emit-rtl.c:1399 |
| Error detected around s-regpat.adb:1029:22 |
<...>
raised TYPES.UNRECOVERABLE_ERROR : comperr.adb:423
../gcc-interface/Makefile:311: recipe for target 's-regpat.o' failed
However, the tool has been run, and I've made the minor adjustments
required to the source files to make it work. (ie, a few multi-line
comments and the fact that mul-tables.c is generated on the tile* targets.
So this is what it should look like. I used -cp. Other languages are
bootstrapping, and I have yet to build all the targets... that'll just
take a day. Be nice if ada worked tho.
I can run the reduction tool over the weekend (its a long weekend here
:-) on this if you want... the other patch is a couple of weeks out of
date anyway now.
Andrew
[-- Attachment #2: backend-order.patch.bz2 --]
[-- Type: application/x-bzip, Size: 21703 bytes --]
[-- Attachment #3: FE-order.patch.bz2 --]
[-- Type: application/x-bzip, Size: 10166 bytes --]
[-- Attachment #4: config-order.patch.bz2 --]
[-- Type: application/x-bzip, Size: 8283 bytes --]
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch] header file re-ordering.
2015-10-08 13:37 ` [patch] header file re-ordering Andrew MacLeod
@ 2015-10-08 15:29 ` Jeff Law
2015-10-11 20:58 ` [BUILDROBOT] Bootstrap broken in Ada (was: [patch] header file re-ordering.) Jan-Benedict Glaw
` (3 subsequent siblings)
4 siblings, 0 replies; 65+ messages in thread
From: Jeff Law @ 2015-10-08 15:29 UTC (permalink / raw)
To: Andrew MacLeod, gcc-patches
On 10/08/2015 07:37 AM, Andrew MacLeod wrote:
> On 10/07/2015 06:02 PM, Jeff Law wrote:
>> On 10/01/2015 08:33 PM, Andrew MacLeod wrote:
>>> these are all in the main gcc directory. 297 files total.
>>>
>>> Everything bootstraps on x86_64-pc-linux-gnu and
>>> powerpc64le-unknown-linux-gnu. All targets in config-list.mk still
>>> build. Regressions tests also came up clean.
>>>
>>> OK for trunk?
>> So as I look at this and make various spot checks, what really stands
>> out is how often something like alias.h gets included, often in places
>> that have absolutely no business/need to be looking at that file.
>> Cut-n-paste at its worst. It happens to many others, but alias.h
>> seems to have gotten its grubby self into just about everywhere for
>> reasons unkonwn.
>>
>> I find myself also wondering if a two step approach would make this
>> easier. Step #1 being ordering the headers, step #2 being removal of
>> the duplicates. As you note, the downside is two checkins that would
>> affect most files in the tree. I guess I'll keep slogging through the
>> patch as is...
>>
>> jeff
> Heres the patch for reordered headers. Building as we speak. Hard to
> fully verify since Ada doesn't seem to bootstrap on trunk at the moment:
Saw in IRC it was Jan's patch that broke Ada bootstrap. So you might
consider reverting that bit locally to restore bootstrapping for Ada.
>
> However, the tool has been run, and I've made the minor adjustments
> required to the source files to make it work. (ie, a few multi-line
> comments and the fact that mul-tables.c is generated on the tile* targets.
>
> So this is what it should look like. I used -cp. Other languages are
> bootstrapping, and I have yet to build all the targets... that'll just
> take a day. Be nice if ada worked tho.
OK. I'll take a look at this version and I think running the reducer
over the weekend sounds good.
Jeff
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 0/3] Header file reduction.
2015-10-07 8:16 ` Richard Biener
@ 2015-10-08 15:48 ` Michael Matz
0 siblings, 0 replies; 65+ messages in thread
From: Michael Matz @ 2015-10-08 15:48 UTC (permalink / raw)
To: Richard Biener; +Cc: Jeff Law, Andrew MacLeod, Bernd Schmidt, gcc-patches
Hi,
On Wed, 7 Oct 2015, Richard Biener wrote:
> > I'm probably the last person in the world that still generally prefers
> > -cp :-) I'm getting to the point where I can tolerate -u.
>
> No, I prefer -cp too - diff just too easily makes a mess out of diffs
> with -u, esp. if you have re-indenting going on as well.
Actually -c was the recommended form of sending patches for many years
even in our own guidelines. It only got changed to -up or -cp when moving
instructions from the texinfo files to the website in 2001. From gcc 3.0
(https://gcc.gnu.org/onlinedocs/gcc-3.0/gcc_10.html):
Use `diff -c' to make your diffs. Diffs without context are hard for us
to install reliably. More than that, they make it hard for us to study
the diffs to decide whether we want to install them. Unidiff format is
better than contextless diffs, but not as easy to read as `-c' format.
If you have GNU diff, use `diff -cp', which shows the name of the
function that each change occurs in.
;-) (IMHO it depends on what the patch does if -c or -u is better, if
the _change_ is important -u might be better, if the new state is the
more interesting thing, -c is)
Ciao,
Michael.
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 4/3] Header file reduction - Tools for contrib
2015-10-06 12:02 ` Bernd Schmidt
` (3 preceding siblings ...)
2015-10-07 16:35 ` Andrew MacLeod
@ 2015-10-08 16:31 ` David Malcolm
4 siblings, 0 replies; 65+ messages in thread
From: David Malcolm @ 2015-10-08 16:31 UTC (permalink / raw)
To: Bernd Schmidt; +Cc: Bernd Schmidt, Andrew MacLeod, gcc-patches
On Tue, 2015-10-06 at 14:02 +0200, Bernd Schmidt wrote:
[...]
> > No commenting on the quality of python code... :-) I was
> > learning python on the fly. Im sure some things are QUITE awful.,
[...]
> > + def ii_base (iinfo):
> > + return iinfo[0]
> > +
> > + def ii_path (iinfo):
> > + return iinfo[1]
> > +
> > + def ii_include_list (iinfo):
> > + return iinfo[2]
> > +
> > + def ii_include_list_cond (iinfo):
> > + return iinfo[3]
> > +
> > + def ii_include_list_non_cond (iinfo):
> > + l = ii_include_list (iinfo)
> > + for n in ii_include_list_cond (iinfo):
> > + l.remove (n)
> > + return l
> > +
> > + def ii_macro_consume (iinfo):
> > + return iinfo[4]
> > +
> > + def ii_macro_define (iinfo):
> > + return iinfo[5]
> > +
> > + def ii_src (iinfo):
> > + return iinfo[6]
> > +
> > + def ii_src_line (iinfo):
> > + return iinfo[7]
>
> That's a lot of little functions with pretty much no clue for the reader
> what's going on. It looks like maybe there's an array where a struct
> should have been used?
FWIW, this kind of thing is often made a lot neater and easier to debug
by using "namedtuple" from within the "collections" module in the
standard library:
https://docs.python.org/2/library/collections.html#collections.namedtuple
which lets you refer e.g. to field 5 of the tuple as a "define"
attribute.
iinfo.define
and avoid all these accessor functions (and you can add methods and
properties, giving e.g. a "list_non_cond").
Not that I'm asking you to rewrite it; merely that namedtuple is one of
many gems in the python stdlib that are worth knowing about.
[...]
Hope this is constructive
Dave
^ permalink raw reply [flat|nested] 65+ messages in thread
* [BUILDROBOT] Bootstrap broken in Ada (was: [patch] header file re-ordering.)
2015-10-08 13:37 ` [patch] header file re-ordering Andrew MacLeod
2015-10-08 15:29 ` Jeff Law
@ 2015-10-11 20:58 ` Jan-Benedict Glaw
2015-10-11 22:27 ` [BUILDROBOT] Bootstrap broken in Ada Jeff Law
2015-10-12 8:04 ` [patch] header file re-ordering Jeff Law
` (2 subsequent siblings)
4 siblings, 1 reply; 65+ messages in thread
From: Jan-Benedict Glaw @ 2015-10-11 20:58 UTC (permalink / raw)
To: Jan Hubicka, Jeff Law; +Cc: gcc-patches, Andrew MacLeod
[-- Attachment #1: Type: text/plain, Size: 1559 bytes --]
On Thu, 2015-10-08 09:37:03 -0400, Andrew MacLeod <amacleod@redhat.com> wrote:
[...]
> Heres the patch for reordered headers. Building as we speak. Hard to fully
> verify since Ada doesn't seem to bootstrap on trunk at the moment:
>
> +===========================GNAT BUG DETECTED==============================+
> | 6.0.0 20151008 (experimental) (x86_64-pc-linux-gnu) GCC error: |
> | in gen_lowpart_common, at emit-rtl.c:1399 |
> | Error detected around s-regpat.adb:1029:22 |
>
> <...>
> raised TYPES.UNRECOVERABLE_ERROR : comperr.adb:423
> ../gcc-interface/Makefile:311: recipe for target 's-regpat.o' failed
Bulid log (native build on gcc76.fsffrance.org) can be found at
http://toolchain.lug-owl.de/buildbot/deliver_artifact.php?mode=view&id=4292383
for build
http://toolchain.lug-owl.de/buildbot/show_build_details.php?id=472655 .
So it's probably one of these two commits:
r228586 = 54ac7405ce75c141dae33532d491d5793fb583e3
Jan Hubicka <hubicka@ucw.cz>
Do not use TYPE_CANONICAL in useless_type_conversion
r228585 = 5b4ada2a11ab19842d77296fc4b75971ddb07434
Jeff Law <law@redhat.com>
[PATCH] Improve DOM's optimization of control statements
Haven't looked at the patches or what they're doing, but maybe you two
instantly recognize the issue?
MfG, JBG
--
Jan-Benedict Glaw jbglaw@lug-owl.de +49-172-7608481
Signature of: Wenn ich wach bin, träume ich.
the second :
[-- Attachment #2: Digital signature --]
[-- Type: application/pgp-signature, Size: 181 bytes --]
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [BUILDROBOT] Bootstrap broken in Ada
2015-10-11 20:58 ` [BUILDROBOT] Bootstrap broken in Ada (was: [patch] header file re-ordering.) Jan-Benedict Glaw
@ 2015-10-11 22:27 ` Jeff Law
2015-10-11 22:35 ` Jan Hubicka
0 siblings, 1 reply; 65+ messages in thread
From: Jeff Law @ 2015-10-11 22:27 UTC (permalink / raw)
To: Jan-Benedict Glaw, Jan Hubicka; +Cc: gcc-patches, Andrew MacLeod
On 10/11/2015 02:58 PM, Jan-Benedict Glaw wrote:
> On Thu, 2015-10-08 09:37:03 -0400, Andrew MacLeod <amacleod@redhat.com> wrote:
> [...]
>> Heres the patch for reordered headers. Building as we speak. Hard to fully
>> verify since Ada doesn't seem to bootstrap on trunk at the moment:
>>
>> +===========================GNAT BUG DETECTED==============================+
>> | 6.0.0 20151008 (experimental) (x86_64-pc-linux-gnu) GCC error: |
>> | in gen_lowpart_common, at emit-rtl.c:1399 |
>> | Error detected around s-regpat.adb:1029:22 |
>>
>> <...>
>> raised TYPES.UNRECOVERABLE_ERROR : comperr.adb:423
>> ../gcc-interface/Makefile:311: recipe for target 's-regpat.o' failed
>
> Bulid log (native build on gcc76.fsffrance.org) can be found at
> http://toolchain.lug-owl.de/buildbot/deliver_artifact.php?mode=view&id=4292383
> for build
> http://toolchain.lug-owl.de/buildbot/show_build_details.php?id=472655 .
>
> So it's probably one of these two commits:
>
> r228586 = 54ac7405ce75c141dae33532d491d5793fb583e3
> Jan Hubicka <hubicka@ucw.cz>
> Do not use TYPE_CANONICAL in useless_type_conversion
It's one of Jan's patches. I've got them reverted locally and Ada
builds fine.
jeff
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [BUILDROBOT] Bootstrap broken in Ada
2015-10-11 22:27 ` [BUILDROBOT] Bootstrap broken in Ada Jeff Law
@ 2015-10-11 22:35 ` Jan Hubicka
0 siblings, 0 replies; 65+ messages in thread
From: Jan Hubicka @ 2015-10-11 22:35 UTC (permalink / raw)
To: Jeff Law; +Cc: Jan-Benedict Glaw, Jan Hubicka, gcc-patches, Andrew MacLeod
> On 10/11/2015 02:58 PM, Jan-Benedict Glaw wrote:
> >On Thu, 2015-10-08 09:37:03 -0400, Andrew MacLeod <amacleod@redhat.com> wrote:
> >[...]
> >>Heres the patch for reordered headers. Building as we speak. Hard to fully
> >>verify since Ada doesn't seem to bootstrap on trunk at the moment:
> >>
> >>+===========================GNAT BUG DETECTED==============================+
> >>| 6.0.0 20151008 (experimental) (x86_64-pc-linux-gnu) GCC error: |
> >>| in gen_lowpart_common, at emit-rtl.c:1399 |
> >>| Error detected around s-regpat.adb:1029:22 |
> >>
> >><...>
> >>raised TYPES.UNRECOVERABLE_ERROR : comperr.adb:423
> >>../gcc-interface/Makefile:311: recipe for target 's-regpat.o' failed
> >
> >Bulid log (native build on gcc76.fsffrance.org) can be found at
> >http://toolchain.lug-owl.de/buildbot/deliver_artifact.php?mode=view&id=4292383
> >for build
> >http://toolchain.lug-owl.de/buildbot/show_build_details.php?id=472655 .
> >
> >So it's probably one of these two commits:
> >
> > r228586 = 54ac7405ce75c141dae33532d491d5793fb583e3
> > Jan Hubicka <hubicka@ucw.cz>
> > Do not use TYPE_CANONICAL in useless_type_conversion
> It's one of Jan's patches. I've got them reverted locally and Ada
> builds fine.
A proposed patch is https://gcc.gnu.org/ml/gcc-patches/2015-10/msg01011.html
Honza
>
> jeff
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch] header file re-ordering.
2015-10-08 13:37 ` [patch] header file re-ordering Andrew MacLeod
2015-10-08 15:29 ` Jeff Law
2015-10-11 20:58 ` [BUILDROBOT] Bootstrap broken in Ada (was: [patch] header file re-ordering.) Jan-Benedict Glaw
@ 2015-10-12 8:04 ` Jeff Law
2015-10-14 14:05 ` Andrew MacLeod
2015-10-16 19:52 ` config header file reduction patch checked in Andrew MacLeod
2015-10-22 21:07 ` [patch] header file re-ordering Jeff Law
2015-10-23 19:14 ` Jeff Law
4 siblings, 2 replies; 65+ messages in thread
From: Jeff Law @ 2015-10-12 8:04 UTC (permalink / raw)
To: Andrew MacLeod, gcc-patches
On 10/08/2015 07:37 AM, Andrew MacLeod wrote:
> On 10/07/2015 06:02 PM, Jeff Law wrote:
>> On 10/01/2015 08:33 PM, Andrew MacLeod wrote:
>>> these are all in the main gcc directory. 297 files total.
>>>
>>> Everything bootstraps on x86_64-pc-linux-gnu and
>>> powerpc64le-unknown-linux-gnu. All targets in config-list.mk still
>>> build. Regressions tests also came up clean.
>>>
>>> OK for trunk?
>> So as I look at this and make various spot checks, what really stands
>> out is how often something like alias.h gets included, often in places
>> that have absolutely no business/need to be looking at that file.
>> Cut-n-paste at its worst. It happens to many others, but alias.h
>> seems to have gotten its grubby self into just about everywhere for
>> reasons unkonwn.
>>
>> I find myself also wondering if a two step approach would make this
>> easier. Step #1 being ordering the headers, step #2 being removal of
>> the duplicates. As you note, the downside is two checkins that would
>> affect most files in the tree. I guess I'll keep slogging through the
>> patch as is...
>>
>> jeff
> Heres the patch for reordered headers. Building as we speak. Hard to
> fully verify since Ada doesn't seem to bootstrap on trunk at the moment:
>
> +===========================GNAT BUG
> DETECTED==============================+
> | 6.0.0 20151008 (experimental) (x86_64-pc-linux-gnu) GCC
> error: |
> | in gen_lowpart_common, at
> emit-rtl.c:1399 |
> | Error detected around
> s-regpat.adb:1029:22 |
>
> <...>
> raised TYPES.UNRECOVERABLE_ERROR : comperr.adb:423
> ../gcc-interface/Makefile:311: recipe for target 's-regpat.o' failed
>
>
> However, the tool has been run, and I've made the minor adjustments
> required to the source files to make it work. (ie, a few multi-line
> comments and the fact that mul-tables.c is generated on the tile* targets.
>
> So this is what it should look like. I used -cp. Other languages are
> bootstrapping, and I have yet to build all the targets... that'll just
> take a day. Be nice if ada worked tho.
>
> I can run the reduction tool over the weekend (its a long weekend here
> :-) on this if you want... the other patch is a couple of weeks out of
> date anyway now.
I find myself looking at the objc stuff and wondering if it was built.
For example objc-act.c calls functions prototyped in fold-const.h, but
that header is no longer included after your patch.
Similarly in objcp we remove tree.h from objcp-decl.c, but it uses TREE
macros and I don't immediately see where those macros would be coming
from if tree.h is no longer included.
In general, I'm worried about the objc/objcp stuff. That in turn makes
me wonder about the other stuff in a more general sense. Regardless, I
think I can take a pretty good stab at the config/ changes.
A pattern that seems to play out a lot in the target files is they liked
to include insn-config.h, insn-codes.h, & timevar.h. I can see how
those typically won't be needed. The first two are amazingly common. A
comment in the nds32 port indicates that insn-config.h may have been
needed by recog.h in the past. nds32 actually included insn-config
twice :-)
Interestingly enough m32r, mcore & pdp11 still need insn-config....
The strangest thing I saw was rs6000 dropping an include of emit-rtl.h.
But presumably various powerpc targets were built, so I guess it's
really not needed.
I'm slightly concerned about the darwin, windows and solaris bits. The
former primarily because Darwin has been a general source of pain, and
in the others because I'm not sure the cross testing will exercise that
code terribly much.
I'll go ahead and approve all the config/ bits. Please be on the
lookout for any fallout.
I'll try and get into more of the other patches tomorrow.
jeff
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch] header file re-ordering.
2015-10-12 8:04 ` [patch] header file re-ordering Jeff Law
@ 2015-10-14 14:05 ` Andrew MacLeod
2015-10-19 21:05 ` Jeff Law
2015-10-16 19:52 ` config header file reduction patch checked in Andrew MacLeod
1 sibling, 1 reply; 65+ messages in thread
From: Andrew MacLeod @ 2015-10-14 14:05 UTC (permalink / raw)
To: Jeff Law, gcc-patches
On 10/12/2015 04:04 AM, Jeff Law wrote:
>
>> <...>
>> raised TYPES.UNRECOVERABLE_ERROR : comperr.adb:423
>> ../gcc-interface/Makefile:311: recipe for target 's-regpat.o' failed
>>
>>
>> However, the tool has been run, and I've made the minor adjustments
>> required to the source files to make it work. (ie, a few multi-line
>> comments and the fact that mul-tables.c is generated on the tile*
>> targets.
>>
>> So this is what it should look like. I used -cp. Other languages are
>> bootstrapping, and I have yet to build all the targets... that'll just
>> take a day. Be nice if ada worked tho.
>>
>> I can run the reduction tool over the weekend (its a long weekend here
>> :-) on this if you want... the other patch is a couple of weeks out of
>> date anyway now.
> I find myself looking at the objc stuff and wondering if it was built.
> For example objc-act.c calls functions prototyped in fold-const.h, but
> that header is no longer included after your patch.
wait, what? I don't see any differences to objc-act.c in the
reordering patches....
Oh, you must be looking at the original combined patch?
fold-const.h is indirectly included by cp-tree.h, which gets it from
including c-common.h. some of the output from show-headers on
objc-act.c (indentation represents levels of including. The number in
parenthesis is the number of times that include has been seen so far in
the files include list. As you can see, we include ansidecl.h a lot
:-) Most of the time there isn't much we can do about those sorts of
things. :
cp-tree.h
tm.h (2)
hard-reg-set.h
function.h (1)
c-common.h
splay-tree.h
ansidecl.h (4)
cpplib.h
symtab.h (2)
line-map.h (2)
alias.h
tree.h (2)
fold-const.h
diagnostic-core.h (1)
bversion.h
I guess It could be a useful addition to show-headers to specify a
header file you are looking for and show you where it comes from if its
included...
I any case, there is some indirection here because none of the front end
files were flattened that much
incidentally, you may notice this is the second time tree.h is
included. The first occurrence of tree.h is included directly by
objc-act.c, but it needs to be left because something between that and
cp-tree.h needs tree.h to compile. This sort of thing is resolved by
using the re-order tool, but I did not run that tool on most of the objc
and objcp files as they have some complex conditionals in their include
list:
#include "tree.h"
#include "stringpool.h"
#include "stor-layout.h"
#include "attribs.h"
#ifdef OBJCPLUS
#include "cp/cp-tree.h"
#else
#include "c/c-tree.h"
#include "c/c-lang.h"
#endif
#include "c-family/c-objc.h"
#include "langhooks.h"
Its beyond the scope of the reorder tool to deal with re-positioning
this automatically... and happens so rarely I didn't even look into it.
So they are not optimal as far as ordering goes.
>
> Similarly in objcp we remove tree.h from objcp-decl.c, but it uses
> TREE macros and I don't immediately see where those macros would be
> coming from if tree.h is no longer included.
>
Again, thanks to no flattening of the front end files :-) It also comes
from cp-tree.h. The objcp source files don't specify the full patch of
cp/cp-tree.h like objc does, so the simplistic show-headers tool doesn't
know where to look for cp-tree.h to show you what it included like in
the above example. Maybe I'll tweak the tool to look in common header
directories.
> In general, I'm worried about the objc/objcp stuff. That in turn
> makes me wonder about the other stuff in a more general sense.
> Regardless, I think I can take a pretty good stab at the config/ changes.
>
So you can not worry about that. It builds fine.
>
> A pattern that seems to play out a lot in the target files is they
> liked to include insn-config.h, insn-codes.h, & timevar.h. I can see
> how those typically won't be needed. The first two are amazingly
> common. A comment in the nds32 port indicates that ii may have been
> needed by recog.h in the past. nds32 actually included insn-config
> twice :-)
>
>
> Interestingly enough m32r, mcore & pdp11 still need insn-config....
most ports get insn-config.h from optabs.h:
optabs.h
optabs-query.h
insn-opinit.h (1)
optabs-libfuncs.h
insn-opinit.h (2)
insn-config.h
I think those ports that still include it do not include optabs.h
>
> The strangest thing I saw was rs6000 dropping an include of
> emit-rtl.h. But presumably various powerpc targets were built, so I
> guess it's really not needed.
It gets emit-rtl.h from ira.h:
regs.h
ira.h
emit-rtl.h
recog.h
insn-codes.h (2)
>
> I'm slightly concerned about the darwin, windows and solaris bits.
> The former primarily because Darwin has been a general source of pain,
> and in the others because I'm not sure the cross testing will exercise
> that code terribly much.
>
Its easy enough to NOT do this for any of those files if were too
worried about them. Its also easy to revert a single file if it
appears to be an issue. Thats why I wanted to run as many of these
on the compile farm natively as I could... but alas, powerPC was the
only thing the farm really offered me.
> I'll go ahead and approve all the config/ bits. Please be on the
> lookout for any fallout.
even darwin, windows and solaris? :-)
Im going to tweak the show-headers tool to look in a few common places,
and look for requested headers, and then repost the tools change with
most of the documentation changes Bernd pointed out. I didn't do lot
to the code itself, but I did comment a few more things.
Perhaps using the tool when you have the above questions would show its
usefulness :-) Its particularly useful when combined with
included-by, so you can see how many files include specific ones, and it
does point out some silly things, but we cant fix all of them without
flattening EVERYTHING... I tried to just do the main parts in the
backend we were likely to care about. Of course, the tool is useless
unless the patch is applied :-P However, I can continue to answer
questions easily enough :-)
Andrew
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 4/3] Header file reduction - Tools for contrib - second cut
2015-10-07 16:35 ` Andrew MacLeod
@ 2015-10-14 15:14 ` Andrew MacLeod
2015-11-03 6:06 ` Jeff Law
0 siblings, 1 reply; 65+ messages in thread
From: Andrew MacLeod @ 2015-10-14 15:14 UTC (permalink / raw)
To: Bernd Schmidt, gcc-patches, law >> Jeff Law
[-- Attachment #1: Type: text/plain, Size: 932 bytes --]
Here's the latest version of the tools for a sub directory in contrib.
I've handled all the feedback, except I have not fully commented the
python code in the tools, nor followed any particular coding
convention... Documentation has been handled, and I've added some
additional comments to the places which were noted as being unclear. Ive
also removed all tabs from the source files.
Ive also updated show-headers slightly to be a little more
error-resistant and to put some emphasis on any header files specified
on the command as being of interest . (when there are 140 shown, it can
be hard to find the one you are looking for sometimes)
Do we wish to impose anything in particular on the source for tools
going into this sub-directory of contrib? The other tools in contrib
don't seem to have much in the way of coding standards. I also
wonder if anyone other than me will look at them much :-)
Andrew
[-- Warning: decoded text below may be mangled, UTF-8 assumed --]
[-- Attachment #2: header3.patch --]
[-- Type: text/x-patch; name="header3.patch", Size: 83193 bytes --]
headers/
* README : New File.
* count-headers : New File.
* gcc-order-headers : New File.
* graph-header-logs : New File.
* graph-include-web : New File.
* headerutils.py : New File.
* included-by : New File.
* reduce-headers : New File.
* replace-header : New File.
* show-headers : New File.
Index: headers/README
===================================================================
*** headers/README (revision 0)
--- headers/README (working copy)
***************
*** 0 ****
--- 1,283 ----
+ Quick start documentation for the header file utilities.
+
+ This isn't a full breakdown of the tools, just they typical use scenarios.
+
+ - Each tool accepts -h to show it's usage. Usually no parameters will also
+ trigger the help message. Help may specify additional functionality to what is
+ listed here.
+
+ - For all tools, option format for specifying filenames must have no spaces
+ between the option and filename.
+ ie.: tool -lfilename.h target.h
+
+ - Many of the tools are required to be run from the core gcc source directory
+ containing coretypes.h. Typically that is in gcc/gcc from a source checkout.
+ For these tools to work on files not in this directory, their path needs to be
+ specified on the command line.
+ ie.: tool c/c-decl.c lto/lto.c
+
+ - options can be intermixed with filenames anywhere on the command line
+ ie. tool ssa.h rtl.h -a is equivalent to
+ tool ssa.h -a rtl.h
+
+
+
+
+
+ gcc-order-headers
+ -----------------
+ This will reorder any primary backend headers files known to the tool into a
+ canonical order which will resolve any hidden dependencies they may have.
+ Any unknown headers will simply be placed after the recognized files, and
+ retain the same relative ordering they had.
+
+ This tool must be run in the core gcc source directory.
+
+ Simply execute the command listing any files you wish to process on the
+ command line.
+
+ Any files which are changed are output, and the original is saved with a
+ .bak extention.
+
+ ex.: gcc-order-headers tree-ssa.c c/c-decl.c
+
+ -s will list all of the known headers in their canonical order. It does not
+ show which of those headers include other headers, just the final canonical
+ ordering.
+
+ if any header files are included within a conditional code block, the tool
+ will issue a message and not change the file. When this happens, you can
+ manually inspect the file to determine if reordering it is actually OK. Then
+ rerun the command with the -i option. This will ignore the conditional error
+ condition and perform the re-ordering anyway.
+
+ If any #include line has the beginning of a multi-line comment, it will also
+ refuse to process the file until that is resolved by terminating the comment
+ on the same line, or removing it.
+
+
+ show-headers
+ ------------
+ This will show the include structure for any given file. Each level of nesting
+ is indented, and when any duplicate headers are seen, they have their
+ duplicate number shown
+
+ -i may be used to specify alternate search directories for headers to parse.
+ -s specifies headers to look for and emphasize in the output.
+
+ This tool must be run in the core gcc source directory.
+
+ ex.: show-headers -sansidecl.h tree-ssa.c
+ tree-ssa.c
+ config.h
+ auto-host.h
+ ansidecl.h (1) <<-------
+ system.h
+ safe-ctype.h
+ filenames.h
+ hashtab.h (1)
+ ansidecl.h (2) <<-------
+ libiberty.h
+ ansidecl.h (3) <<-------
+ hwint.h
+ coretypes.h
+ machmode.h (1)
+ insn-modes.h (1)
+ signop.h
+ <...>
+
+
+
+
+ count-headers
+ -------------
+ simply count all the headers found in the specified files. A summary is
+ printed showing occurrences from high to low.
+
+ ex.: count-headers tree*.c
+ 86 : coretypes.h
+ 86 : config.h
+ 86 : system.h
+ 86 : tree.h
+ 82 : backend.h
+ 80 : gimple.h
+ 72 : gimple-iterator.h
+ 70 : ssa.h
+ 68 : fold-const.h
+ <...>
+
+
+
+ included-by
+ -----------
+ This tool will search all the .c,.cc and .h files and output a list of files
+ which include the specified header(s).
+
+ A 4 level deep 'find' of all source files is performed from the current
+ directory and each of those is inspected for a #include of the specified
+ headers. So expect a little bit of slowness.
+
+ -i limits the search to only other header files.
+ -c limits the search to .c and .cc files.
+ -a shows only source files which include all specified headers.
+ -f allows you to specify a file which contains a list of source files to
+ check rather than performing the much slower find command.
+
+ ex: included-by tree-vectorizer.h
+ config/aarch64/aarch64.c
+ config/i386/i386.c
+ config/rs6000/rs6000.c
+ tree-loop-distribution.c
+ tree-parloops.c
+ tree-ssa-loop-ivopts.c
+ tree-ssa-loop.c
+
+
+
+
+ replace-header
+ --------------
+ This tool simply replaces a single header file with one or more other headers.
+ -r specifies the include to replace, and one or more -f options specify the
+ replacement headers, in the order they occur.
+
+ This is commonly used in conjunction with 'included-by' to change all
+ occurrences of a header file to something else, or to insert new headers
+ before or after.
+
+ ex: to insert #include "before.h" before every occurence of tree.h in all
+ .c and .cc source files:
+
+ replace-header -rtree.h -fbefore.h -ftree.h `included-by -c tree.h`
+
+
+
+
+ reduce-headers
+ --------------
+
+ This tool removes any header files which are not needed from a source file.
+
+ This tool must be run for the core gcc source directory, and requires either
+ a native build and sometimes target builds, depending on what you are trying
+ to reduce.
+
+ it is good practice to run 'gcc-order-headers' on a source file before trying
+ to reduce it. This removes duplicates and performs some simplifications
+ which reduce the chances of the reduction tool missing things.
+
+ start with a completely bootstrapped native compiler.
+
+ Any desired target builds should be built in one directory using a modified
+ config-list.mk file which does not delete the build directory when it is done.
+ any target directories which do not successfully complete a 'make all-gcc'
+ may cause the tool to not reduce anything.
+ (todo - provide a config-list.mk that leaves successful target builds, but
+ deletes ones which do not compile)
+
+ The tool will examine all the target builds to determine which targets build
+ the file, and include those targets in the testing.
+
+
+
+ The tool will analyze a source file and attempt to remove each non-conditional
+ header from last to first in the file.:
+ It will first attempt to build the native all-gcc target.
+ If that succeeds, it will attempt to build any target build .o files
+ If that succeeds, it will check to see if there are any conditional
+ compilation dependencies between this header file and the source file or
+ any header which have already been determined as non-removable.
+ If all these tests are passed, the header file is determined to be removable
+ and is removed from the source file.
+ This continues until all headers have been checked.
+ At this point, a bootstrap is attempted in the native build, and if that
+ passes the file is considered reduced.
+
+ Any files from the config subdirectory require target builds to be present
+ in order to proceed.
+
+ A small subset of targets has been determined to provide excellent coverage,
+ at least as of Aug 31/15 . They were found by reducing all the files
+ contained in libbackend.a oer a full set of targets(207). All conditions
+ which disallowed removal of a header file were triggered by one or more of
+ these targets. They are also known to the tool. When building targets it
+ will check those targets before the rest.
+ This coverage can be achieved by building config-list.mk with :
+ LIST="aarch64-linux-gnu arm-netbsdelf avr-rtems c6x-elf epiphany-elf hppa2.0-hpux10.1 i686-mingw32crt i686-pc-msdosdjgpp mipsel-elf powerpc-eabisimaltivec rs6000-ibm-aix5.1.0 sh-superh-elf sparc64-elf spu-elf"
+
+ -b specifies the native bootstrapped build root directory
+ -t specifies a target build root directory that config-list.mk was run from
+ -f is used to limit the headers for consideration.
+
+ example:
+
+ mkdir gcc // checkout gcc in subdir gcc
+ mdsir build // boostrap gcc in subdir build
+ mkdir target // create target directory and run config-list.mk
+ cd gcc/gcc
+
+ reduce-headers -b../../build -t../../targets -falias.h -fexpr.h tree*.c (1)
+ # This will attempt to remove only alias.h and expr.h from tree*.c
+
+ reduce-headers -b../../build -t../../targets tree-ssa-live.c
+ # This will attempt to remove all header files from tree-ssa-live.c
+
+
+ the tool will generate a number of log files:
+
+ reduce-headers.log : All compilation failures from attempted reductions.
+ reduce-headers.sum : One line summary of what happened to each source file.
+
+ (All the remaining logs are appended to, so if the tool is run multiple times
+ these files are just added to. You must physically remove them yourself in
+ order to reset the logs.)
+
+ reduce-headers-kept.log: List of all the successful compiles that were
+ ignored because of conditional macro dependencies
+ and why it thinks that is the case
+ $src.c.log : for each failed header removal, the compilation
+ messages as to why it failed.
+ $header.h.log: The same log is put into the relevant header log as well.
+
+
+ a sample output from ira.c.log:
+
+ Compilation failed:
+ for shrink-wrap.h:
+
+ ============================================
+ /gcc/2015-09-09/gcc/gcc/ira.c: In function âbool split_live_ranges_for_shrink_wrap()â:
+ /gcc/2015-09-09/gcc/gcc/ira.c:4839:8: error: âSHRINK_WRAPPING_ENABLEDâ was not declared in this scope
+ if (!SHRINK_WRAPPING_ENABLED)
+ ^
+ make: *** [ira.o] Error 1
+
+
+ the same message would be put into shrink-wrap.h.log.
+
+
+
+ graph-header-logs
+ -----------------
+ This tool will parse all the messages from the .C files, looking for failures
+ that show up in other headers... meaning there is a compilation dependency
+ between the 2 header files.
+
+ The tool will aggregate all these and generate a graph of the dependencies
+ exposed during compilation. Red lines indicate dependencies that are
+ present because a header file physically includes another file. Black lines
+ represent data dependencies causing compilation failures if the header is
+ not present.
+
+ ex.: graph-header-logs *.c.log
+
+
+
+ graph-include-web
+ -----------------
+ This tool can be used to visualize the include structure in files. It is
+ rapidly turned useless if you specify too many things, but it can be
+ useful for finding cycles and redundancies, or simply to see what a single
+ file looks like.
+
+ ex.: graph-include-web tree.c
Index: headers/count-headers
===================================================================
*** headers/count-headers (revision 0)
--- headers/count-headers (working copy)
***************
*** 0 ****
--- 1,58 ----
+ #! /usr/bin/python2
+ import os.path
+ import sys
+ import shlex
+ import re
+
+ from headerutils import *
+
+ usage = False
+ src = list ()
+ flist = { }
+ process_h = True
+ process_c = True
+ verbose = False
+ all_inc = True
+ level = 0
+
+ only_use_list = list ()
+
+ for x in sys.argv[1:]:
+ if x[0:2] == "-h":
+ usage = True
+ else:
+ src.append (x)
+
+
+ if not usage and len (src) > 0:
+ incl = { }
+ for fn in src:
+ src = readwholefile (fn)
+ dup = { }
+ for line in src:
+ d = find_pound_include (line, True, True)
+ if d != "" and d[-2:] ==".h":
+ if dup.get (d) == None:
+ if incl.get (d) == None:
+ incl[d] = 1
+ else:
+ incl[d] = incl[d]+ 1
+ dup[d] = 1
+
+ l = list ()
+ for i in incl:
+ l.append ((incl[i], i))
+ l.sort (key=lambda tup:tup[0], reverse=True)
+
+ for f in l:
+ print str (f[0]) + " : " + f[1]
+
+ else:
+ print "count-headers file1 [filen]"
+ print "Count the number of occurrences of all includes across all listed files"
+
+
+
+
+
+
Property changes on: headers/count-headers
___________________________________________________________________
Added: svn:executable
## -0,0 +1 ##
+*
\ No newline at end of property
Index: headers/gcc-order-headers
===================================================================
*** headers/gcc-order-headers (revision 0)
--- headers/gcc-order-headers (working copy)
***************
*** 0 ****
--- 1,397 ----
+ #! /usr/bin/python2
+ import os
+ import sys
+ import shlex
+ import re
+
+ from headerutils import *
+ import Queue
+
+ file_list = list ()
+ usage = False
+
+ ignore_conditional = False
+
+ order = [
+ "system.h",
+ "coretypes.h",
+ "backend.h",
+ "target.h",
+ "rtl.h",
+ "c-family/c-target.h",
+ "c-family/c-target-def.h",
+ "tree.h",
+ "cp/cp-tree.h",
+ "c-family/c-common.h", # these must come before diagnostic.h
+ "c/c-tree.h",
+ "fortran/gfortran.h",
+ "gimple.h",
+ "cfghooks.h",
+ "df.h",
+ "tm_p.h",
+ "gimple-iterators.h",
+ "ssa.h",
+ "expmed.h",
+ "optabs.h",
+ "regs.h",
+ "ira.h",
+ "ira-int.h",
+ "gimple-streamer.h"
+
+ ]
+
+ exclude_special = [ "bversion.h", "obstack.h", "insn-codes.h", "hooks.h" ]
+
+ # includes is a dictionary indexed by a header files basename.
+ # it consists of a 2 element tuple:
+ # [0] - Name of header file which included this header.
+ # [1] - vector of header file names included by this file.
+
+ includes = { }
+
+ # when a header is included multiple times, indexing this dictionary will
+ # return a vector of all the headers which included it.
+ dups = { }
+
+ # When creating the master list, do not descend into these files for what
+ # they include. Simply put the file itself in the list. This is primarily
+ # required because the front end files inlcude orders tend to be at odds with
+ # the order of middle end files, and its impossible to synchronize them.\
+ # They are ordered such that everything resolves properly.
+ exclude_processing = [ "tree-vectorizer.h" , "c-target.h", "c-target-def.h", "cp-tree.h", "c-common.h", "c-tree.h", "gfortran.h" ]
+
+ master_list = list ()
+ # where include file comes from in src
+ h_from = { }
+
+ # create the master ordering list... this is the desired order of headers
+ def create_master_list (fn, verbose):
+ if fn not in exclude_processing:
+ for x in includes[fn][1]:
+ create_master_list (x, verbose)
+ if not fn in master_list:
+ # Don't put diagnostic*.h into the ordering list. It is special since
+ # various front ends have to set GCC_DIAG_STYLE before including it.
+ # for each file, we'll tailor where it belongs by looking at the include
+ # list and determine its position appropriately.
+ if fn != "diagnostic.h" and fn != "diagnostic-core.h":
+ master_list.append (fn)
+ if (verbose):
+ print fn + " included by: " + includes[fn][0]
+
+
+
+ def print_dups ():
+ if dups:
+ print "\nduplicated includes"
+ for i in dups:
+ string = "dup : " + i + " : "
+ string += includes[i][0]
+ for i2 in dups[i]:
+ string += ", "+i2
+ print string
+
+
+ def process_known_dups ():
+ # rtl.h gets tagged as a duplicate includer for all of coretypes.h, but that
+ # is really for only generator files
+ rtl_remove = includes["coretypes.h"][1] + ["statistics.h", "vec.h"]
+ if dups:
+ for i in rtl_remove:
+ if dups[i] and "rtl.h" in dups[i]:
+ dups[i].remove("rtl.h")
+ if not dups[i]:
+ dups.pop (i, None)
+
+ # make sure diagnostic.h is the owner of diagnostic-core.h
+ if includes["diagnostic-core.h"][0] != "diagnostic.h":
+ dups["diagnostic-core.h"].append (includes["diagnostic-core.h"][0])
+ includes["diagnostic-core.h"] = ("diagnostic.h", includes["diagnostic-core.h"][1])
+
+ # This function scans back thorugh the list of headers which included other
+ # headers to determine what file in HEADER_LIST brought 'HEADER' in.
+ def indirectly_included (header, header_list):
+ nm = os.path.basename (header)
+ while nm and includes.get(nm):
+ if includes[nm][0] in header_list:
+ return includes[nm][0]
+ nm = includes[nm][0]
+
+ # diagnostic.h and diagnostic-core.h may not show up because we removed them
+ # from the header list to manually position in an appropriate place. They have
+ # specific requirements that they need to occur after certain FE files which
+ # may overide the definition of GCC_DIAG_STYLE.
+ # Check the dup list for whete they may have been included from and return
+ # that header.
+ if header == "diagnostic-core.h":
+ if dups.get("diagnostic-core.h"):
+ for f in dups["diagnostic-core.h"]:
+ if f in header_list:
+ return f
+ else:
+ if header in header_list:
+ return header
+ # Now check if diagnostics is included indirectly anywhere
+ header = "diagnostic.h"
+
+ if header == "diagnostic.h":
+ if dups.get("diagnostic.h"):
+ for f in dups["diagnostic.h"]:
+ if f in header_list:
+ return f
+ else:
+ if header in header_list:
+ return header
+
+ return ""
+
+
+ # This function will take a list of headers from a source file and return
+ # the desired new new order of the canonical headers in DESIRED_ORDER.
+ def get_new_order (src_h, desired_order):
+ new_order = list ()
+ for h in desired_order:
+ if h in master_list:
+ # Create the list of nested headers which included this file.
+ iclist = list ()
+ ib = includes[h][0]
+ while ib:
+ iclist.insert(0, ib)
+ ib = includes[ib][0]
+ if iclist:
+ for x in iclist:
+ # If header is in the source code, and we are allowed to look inside
+ if x in src_h and x not in exclude_processing:
+ if x not in new_order and x[:10] != "diagnostic" and h not in exclude_special:
+ new_order.append (x)
+ break;
+ else:
+ if h not in new_order:
+ new_order.append (h)
+
+ f = ""
+ if "diagnostic.h" in src_h:
+ f = "diagnostic.h"
+ elif "diagnostic-core.h" in src_h:
+ f = "diagnostic-core.h"
+
+
+ # If either diagnostic header was directly included in the main file, check to
+ # see if its already included indirectly, or whether we need to add it to the
+ # end of the canonically orders headers.
+ if f:
+ ii = indirectly_included (f, src_h)
+ if not ii or ii == f:
+ new_order.append (f)
+
+ return new_order
+
+
+
+ # stack of files to process
+ process_stack = list ()
+
+ def process_one (info):
+ i = info[0]
+ owner = info[1]
+ name = os.path.basename(i)
+ if os.path.exists (i):
+ if includes.get(name) == None:
+ l = find_unique_include_list (i)
+ # create a list which has just basenames in it
+ new_list = list ()
+ for x in l:
+ new_list.append (os.path.basename (x))
+ process_stack.append((x, name))
+ includes[name] = (owner, new_list)
+ elif owner:
+ if dups.get(name) == None:
+ dups[name] = [ owner ]
+ else:
+ dups[name].append (owner)
+ else:
+ # seed tm.h with options.h since it is a build file and won't be seen.
+ if not includes.get(name):
+ if name == "tm.h":
+ includes[name] = (owner, [ "options.h" ])
+ includes["options.h"] = ("tm.h", list ())
+ else:
+ includes[name] = (owner, list ())
+
+
+ show_master = False
+
+ for arg in sys.argv[1:]:
+ if arg[0:1] == "-":
+ if arg[0:2] == "-h":
+ usage = True
+ elif arg[0:2] == "-i":
+ ignore_conditional = True
+ elif arg[0:2] == "-v":
+ show_master = True
+ else:
+ print "Error: unrecognized option " + arg
+ elif os.path.exists(arg):
+ file_list.append (arg)
+ else:
+ print "Error: file " + arg + " Does not exist."
+ usage = True
+
+ if not file_list and not show_master:
+ usage = True
+
+ if not usage and not os.path.exists ("coretypes.h"):
+ usage = True
+ print "Error: Must run command in main gcc source directory containing coretypes.h\n"
+
+ # process diagnostic.h first.. it's special since GCC_DIAG_STYLE can be
+ # overridden by languages, but must be done so by a file included BEFORE it.
+ # so make sure it isn't seen as included by one of those files by making it
+ # appear to be included by the src file.
+ process_stack.insert (0, ("diagnostic.h", ""))
+
+ # Add the list of files in reverse order since it is processed as a stack later
+ for i in order:
+ process_stack.insert (0, (i, "") )
+
+ # build up the library of what header files include what other files.
+ while process_stack:
+ info = process_stack.pop ()
+ process_one (info)
+
+ # Now create the master ordering list
+ for i in order:
+ create_master_list (os.path.basename (i), show_master)
+
+ # handle warts in the duplicate list
+ process_known_dups ()
+ desired_order = master_list
+
+ if show_master:
+ print " Canonical order of gcc include files: "
+ for x in master_list:
+ print x
+ print " "
+
+ if usage:
+ print "gcc-order-headers [-i] [-v] file1 [filen]"
+ print " Ensures gcc's headers files are included in a normalized form with"
+ print " redundant headers removed. The original files are saved in filename.bak"
+ print " Outputs a list of files which changed."
+ print " -i ignore conditional compilation."
+ print " Use after examining the file to be sure includes within #ifs are safe"
+ print " Any headers within conditional sections will be ignored."
+ print " -v Show the canonical order of known headers"
+ sys.exit(0)
+
+
+ didnt_do = list ()
+
+ for fn in file_list:
+ nest = 0
+ src_h = list ()
+ src_line = { }
+
+ master_list = list ()
+
+ includes = { }
+ dups = { }
+
+ iinfo = process_ii_src (fn)
+ src = ii_src (iinfo)
+ include_list = ii_include_list (iinfo)
+
+ if ii_include_list_cond (iinfo):
+ if not ignore_conditional:
+ print fn + ": Cannot process due to conditional compilation of includes"
+ didnt_do.append (fn)
+ src = list ()
+
+ if not src:
+ continue
+
+ process_stack = list ()
+ # prime the stack with headers in the main ordering list so we get them in
+ # this order.
+ for d in order:
+ if d in include_list:
+ process_stack.insert (0, (d, ""))
+
+ for d in include_list:
+ nm = os.path.basename(d)
+ src_h.append (nm)
+ iname = d
+ iname2 = os.path.dirname (fn) + "/" + d
+ if not os.path.exists (d) and os.path.exists (iname2):
+ iname = iname2
+ if iname not in process_stack:
+ process_stack.insert (0, (iname, ""))
+ src_line[nm] = ii_src_line(iinfo)[d]
+ if src_line[nm].find("/*") != -1 and src_line[nm].find("*/") == -1:
+ # this means we have a multi line comment, abort!'
+ print fn + ": Cannot process due to a multi-line comment :"
+ print " " + src_line[nm]
+ if fn not in didnt_do:
+ didnt_do.append (fn)
+ src = list ()
+
+ if not src:
+ continue
+
+ # Now create the list of includes as seen by the source file.
+ while process_stack:
+ info = process_stack.pop ()
+ process_one (info)
+
+ for i in include_list:
+ create_master_list (os.path.basename (i), False)
+
+ new_src = list ()
+ header_added = list ()
+ new_order = list ()
+ for line in src:
+ d = find_pound_include (line, True, True)
+ if not d or d[-2:] != ".h":
+ new_src.append (line)
+ else:
+ if d == order[0] and not new_order:
+ new_order = get_new_order (src_h, desired_order)
+ for i in new_order:
+ new_src.append (src_line[i])
+ # if not seen, add it.
+ if i not in header_added:
+ header_added.append (i)
+ else:
+ nm = os.path.basename(d)
+ if nm not in header_added:
+ iby = indirectly_included (nm, src_h)
+ if not iby:
+ new_src.append (line)
+ header_added.append (nm)
+
+ if src != new_src:
+ os.rename (fn, fn + ".bak")
+ fl = open(fn,"w")
+ for line in new_src:
+ fl.write (line)
+ fl.close ()
+ print fn
+
+
+ if didnt_do:
+ print "\n\n Did not process the following files due to conditional dependencies:"
+ str = ""
+ for x in didnt_do:
+ str += x + " "
+ print str
+ print "\n"
+ print "Please examine to see if they are safe to process, and re-try with -i. "
+ print "Safeness is determined by checking whether any of the reordered headers are"
+ print "within a conditional and could be hauled out of the conditional, thus changing"
+ print "what the compiler will see."
+ print "Multi-line comments after a #include can also cause failuer, they must be turned"
+ print "into single line comments or removed."
+
+
+
+
Property changes on: headers/gcc-order-headers
___________________________________________________________________
Added: svn:executable
## -0,0 +1 ##
+*
\ No newline at end of property
Index: headers/graph-header-logs
===================================================================
*** headers/graph-header-logs (revision 0)
--- headers/graph-header-logs (working copy)
***************
*** 0 ****
--- 1,227 ----
+ #! /usr/bin/python2
+ import os.path
+ import sys
+ import shlex
+ import re
+
+ from headerutils import *
+
+ header_roots = { }
+ extra_edges = list()
+ verbose = False
+ verbosity = 0
+ nodes = list()
+
+ def unpretty (name):
+ if name[-2:] == "_h":
+ name = name[:-2] + ".h"
+ return name.replace("_", "-")
+
+ def pretty_name (name):
+ name = os.path.basename (name)
+ return name.replace(".","_").replace("-","_").replace("/","_").replace("+","_");
+
+ depstring = ("In file included from", " from")
+
+ # indentation indicates nesting levels of included files
+ ignore = [ "coretypes_h",
+ "machmode_h",
+ "signop_h",
+ "wide_int_h",
+ "double_int_h",
+ "real_h",
+ "fixed_value_h",
+ "hash_table_h",
+ "statistics_h",
+ "ggc_h",
+ "vec_h",
+ "hashtab_h",
+ "inchash_h",
+ "mem_stats_traits_h",
+ "hash_map_traits_h",
+ "mem_stats_h",
+ "hash_map_h",
+ "hash_set_h",
+ "input_h",
+ "line_map_h",
+ "is_a_h",
+ "system_h",
+ "config_h" ]
+
+ def process_log_file (header, logfile):
+ if header_roots.get (header) != None:
+ print "Error: already processed log file: " + header + ".log"
+ return
+ hname = pretty_name (header)
+ header_roots[hname] = { }
+
+ sline = list();
+ incfrom = list()
+ newinc = True
+ for line in logfile:
+ if len (line) > 21 and line[:21] in depstring:
+ if newinc:
+ incfrom = list()
+ newinc = False
+ fn = re.findall(ur".*/(.*?):", line)
+ if len(fn) != 1:
+ continue
+ if fn[0][-2:] != ".h":
+ continue
+ n = pretty_name (fn[0])
+ if n not in ignore:
+ incfrom.append (n)
+ continue
+ newinc = True
+ note = re.findall (ur"^.*note: (.*)", line)
+ if len(note) > 0:
+ sline.append (("note", note[0]))
+ else:
+ err_msg = re.findall (ur"^.*: error: (.*)", line)
+ if len(err_msg) == 1:
+ msg = err_msg[0]
+ if (len (re.findall("error: forward declaration", line))) != 0:
+ continue
+ path = re.findall (ur"^(.*?):.*error: ", line)
+ if len(path) != 1:
+ continue
+ if path[0][-2:] != ".h":
+ continue
+ fname = pretty_name (path[0])
+ if fname in ignore or fname[0:3] == "gt_":
+ continue
+ sline.append (("error", msg, fname, incfrom))
+
+ print str(len(sline)) + " lines to process"
+ lastline = "note"
+ for line in sline:
+ if line[0] != "note" and lastline[0] == "error":
+ fname = lastline[2]
+ msg = lastline[1]
+ incfrom = lastline[3]
+ string = ""
+ ofname = fname
+ if len(incfrom) != 0:
+ for t in incfrom:
+ string = string + t + " : "
+ ee = (fname, t)
+ if ee not in extra_edges:
+ extra_edges.append (ee)
+ fname = t
+ print string
+
+ if hname not in nodes:
+ nodes.append(hname)
+ if fname not in nodes:
+ nodes.append (ofname)
+ for y in incfrom:
+ if y not in nodes:
+ nodes.append (y)
+
+
+ if header_roots[hname].get(fname) == None:
+ header_roots[hname][fname] = list()
+ if msg not in header_roots[hname][fname]:
+ print string + ofname + " : " +msg
+ header_roots[hname][fname].append (msg)
+ lastline = line;
+
+
+ dotname = "graph.dot"
+ graphname = "graph.png"
+
+
+ def build_dot_file (file_list):
+ output = open(dotname, "w")
+ output.write ("digraph incweb {\n");
+ for x in file_list:
+ if os.path.exists (x) and x[-4:] == ".log":
+ header = x[:-4]
+ logfile = open(x).read().splitlines()
+ process_log_file (header, logfile)
+ elif os.path.exists (x + ".log"):
+ logfile = open(x + ".log").read().splitlines()
+ process_log_file (x, logfile)
+
+ for n in nodes:
+ fn = unpretty(n)
+ label = n + " [ label = \"" + fn + "\" ];"
+ output.write (label + "\n")
+ if os.path.exists (fn):
+ h = open(fn).read().splitlines()
+ for l in h:
+ t = find_pound_include (l, True, False)
+ if t != "":
+ t = pretty_name (t)
+ if t in ignore or t[-2:] != "_h":
+ continue
+ if t not in nodes:
+ nodes.append (t)
+ ee = (t, n)
+ if ee not in extra_edges:
+ extra_edges.append (ee)
+
+ depcount = list()
+ for h in header_roots:
+ for dep in header_roots[h]:
+ label = " [ label = "+ str(len(header_roots[h][dep])) + " ];"
+ string = h + " -> " + dep + label
+ output.write (string + "\n");
+ if verbose:
+ depcount.append ((h, dep, len(header_roots[h][dep])))
+
+ for ee in extra_edges:
+ string = ee[0] + " -> " + ee[1] + "[ color=red ];"
+ output.write (string + "\n");
+
+
+ if verbose:
+ depcount.sort(key=lambda tup:tup[2])
+ for x in depcount:
+ print " ("+str(x[2])+ ") : " + x[0] + " -> " + x[1]
+ if (x[2] <= verbosity):
+ for l in header_roots[x[0]][x[1]]:
+ print " " + l
+
+ output.write ("}\n");
+
+
+ files = list()
+ dohelp = False
+ edge_thresh = 0
+ for arg in sys.argv[1:]:
+ if arg[0:2] == "-o":
+ dotname = arg[2:]+".dot"
+ graphname = arg[2:]+".png"
+ elif arg[0:2] == "-h":
+ dohelp = True
+ elif arg[0:2] == "-v":
+ verbose = True
+ if len(arg) > 2:
+ verbosity = int (arg[2:])
+ if (verbosity == 9):
+ verbosity = 9999
+ elif arg[0:1] == "-":
+ print "Unrecognized option " + arg
+ dohelp = True
+ else:
+ files.append (arg)
+
+ if len(sys.argv) == 1:
+ dohelp = True
+
+ if dohelp:
+ print "Parses the log files from the reduce-headers tool to generate"
+ print "dependency graphs for the include web for specified files."
+ print "Usage: [-nnum] [-h] [-v[n]] [-ooutput] file1 [[file2] ... [filen]]"
+ print " -ooutput : Specifies output to output.dot and output.png"
+ print " Defaults to 'graph.dot and graph.png"
+ print " -vn : verbose mode, shows the number of connections, and if n"
+ print " is specified, show the messages if # < n. 9 is infinity"
+ print " -h : help"
+ else:
+ print files
+ build_dot_file (files)
+ os.system ("dot -Tpng " + dotname + " -o" + graphname)
+
+
Property changes on: headers/graph-header-logs
___________________________________________________________________
Added: svn:executable
## -0,0 +1 ##
+*
\ No newline at end of property
Index: headers/graph-include-web
===================================================================
*** headers/graph-include-web (revision 0)
--- headers/graph-include-web (working copy)
***************
*** 0 ****
--- 1,122 ----
+ #! /usr/bin/python2
+ import os.path
+ import sys
+ import shlex
+ import re
+
+ from headerutils import *
+
+ def pretty_name (name):
+ return name.replace(".","_").replace("-","_").replace("/","_").replace("+","_");
+
+
+ include_files = list()
+ edges = 0
+ one_c = False
+ clink = list()
+ noterm = False
+
+ def build_inclist (output, filen):
+ global edges
+ global one_c
+ global clink
+ global noterm
+ inc = build_include_list (filen)
+ if one_c and filen[-2:] == ".c":
+ pn = "all_c"
+ else:
+ pn = pretty_name(filen)
+ for nm in inc:
+ if pn == "all_c":
+ if nm not in clink:
+ if len(build_include_list(nm)) != 0 or not noterm:
+ output.write (pretty_name(nm) + " -> " + pn + ";\n")
+ edges = edges + 1
+ if nm not in include_files:
+ include_files.append(nm)
+ clink.append (nm)
+ else:
+ output.write (pretty_name(nm) + " -> " + pn + ";\n")
+ edges = edges + 1
+ if nm not in include_files:
+ include_files.append(nm)
+ return len(inc) == 0
+
+ dotname = "graph.dot"
+ graphname = "graph.png"
+
+ def build_dot_file (file_list):
+ global one_c
+ output = open(dotname, "w")
+ output.write ("digraph incweb {\n");
+ if one_c:
+ output.write ("all_c [shape=box];\n");
+ for x in file_list:
+ if x[-2:] == ".h":
+ include_files.append (x)
+ elif os.path.exists (x):
+ build_inclist (output, x)
+ if not one_c:
+ output.write (pretty_name (x) + "[shape=box];\n")
+
+ for x in include_files:
+ term = build_inclist (output, x)
+ if term:
+ output.write (pretty_name(x) + " [style=filled];\n")
+
+ output.write ("}\n");
+
+
+ files = list()
+ dohelp = False
+ edge_thresh = 0
+ for arg in sys.argv[1:]:
+ if arg[0:2] == "-o":
+ dotname = arg[2:]+".dot"
+ graphname = arg[2:]+".png"
+ elif arg[0:2] == "-h":
+ dohelp = True
+ elif arg[0:2] == "-a":
+ one_c = True
+ if arg[0:3] == "-at":
+ noterm = True
+ elif arg[0:2] == "-f":
+ if not os.path.exists (arg[2:]):
+ print "Option " + arg +" doesn't specify a proper file"
+ dohelp = True
+ else:
+ sfile = open (arg[2:], "r")
+ srcdata = sfile.readlines()
+ sfile.close()
+ for x in srcdata:
+ files.append(x.rstrip())
+ elif arg[0:2] == "-n":
+ edge_thresh = int (arg[2:])
+ elif arg[0:1] == "-":
+ print "Unrecognized option " + arg
+ dohelp = True
+ else:
+ files.append (arg)
+
+ if len(sys.argv) == 1:
+ dohelp = True
+
+ if dohelp:
+ print "Generates a graph of the include web for specified files."
+ print "Usage: [-finput_file] [-h] [-ooutput] [file1 ... [filen]]"
+ print " -finput_file : Input file containing a list of files to process."
+ print " -ooutput : Specifies output to output.dot and output.png."
+ print " defaults to graph.dot and graph.png."
+ print " -nnum : Specifies the # of edges beyond which sfdp is invoked. def=0."
+ print " -a : Aggregate all .c files to 1 file. Shows only include web."
+ print " -at : Aggregate, but don't include terminal.h to .c links."
+ print " -h : Print this help."
+ else:
+ print files
+ build_dot_file (files)
+ if edges > edge_thresh:
+ os.system ("sfdp -Tpng " + dotname + " -o" + graphname)
+ else:
+ os.system ("dot -Tpng " + dotname + " -o" + graphname)
+
+
Property changes on: headers/graph-include-web
___________________________________________________________________
Added: svn:executable
## -0,0 +1 ##
+*
\ No newline at end of property
Index: headers/headerutils.py
===================================================================
*** headers/headerutils.py (revision 0)
--- headers/headerutils.py (working copy)
***************
*** 0 ****
--- 1,523 ----
+ #! /usr/bin/python2
+ import os.path
+ import sys
+ import shlex
+ import re
+ import subprocess
+ import shutil
+ import pickle
+
+ import multiprocessing
+
+ def find_pound_include (line, use_outside, use_slash):
+ inc = re.findall (ur"^\s*#\s*include\s*\"(.+?)\"", line)
+ if len(inc) == 1:
+ nm = inc[0]
+ if use_outside or os.path.exists (nm):
+ if use_slash or '/' not in nm:
+ return nm
+ return ""
+
+ def find_system_include (line):
+ inc = re.findall (ur"^\s*#\s*include\s*<(.+?)>", line)
+ if len(inc) == 1:
+ return inc[0]
+ return ""
+
+ def find_pound_define (line):
+ inc = re.findall (ur"^\s*#\s*define ([A-Za-z0-9_]+)", line)
+ if len(inc) != 0:
+ if len(inc) > 1:
+ print "What? more than 1 match in #define??"
+ print inc
+ sys.exit(5)
+ return inc[0];
+ return ""
+
+ def is_pound_if (line):
+ inc = re.findall ("^\s*#\s*if\s", line)
+ if not inc:
+ inc = re.findall ("^\s*#\s*if[n]?def\s", line)
+ if inc:
+ return True
+ return False
+
+ def is_pound_endif (line):
+ inc = re.findall ("^\s*#\s*endif", line)
+ if inc:
+ return True
+ return False
+
+ def find_pound_if (line):
+ inc = re.findall (ur"^\s*#\s*if\s+(.*)", line)
+ if len(inc) == 0:
+ inc = re.findall (ur"^\s*#\s*elif\s+(.*)", line)
+ if len(inc) > 0:
+ inc2 = re.findall (ur"defined\s*\((.+?)\)", inc[0])
+ inc3 = re.findall (ur"defined\s+([a-zA-Z0-9_]+)", inc[0])
+ for yy in inc3:
+ inc2.append (yy)
+ return inc2
+ else:
+ inc = re.findall (ur"^\s*#\s*ifdef\s(.*)", line)
+ if len(inc) == 0:
+ inc = re.findall (ur"^\s*#\s*ifndef\s(.*)", line)
+ if len(inc) > 0:
+ inc2 = re.findall ("[A-Za-z_][A-Za-z_0-9]*", inc[0])
+ return inc2
+ if len(inc) == 0:
+ return list ()
+ print "WTF. more than one line returned for find_pound_if"
+ print inc
+ sys.exit(5)
+
+
+ # IINFO - this is a vector of include information. It consists of 7 elements.
+ # [0] - base name of the file
+ # [1] - path leading to this file.
+ # [2] - orderd list of all headers directly included by this file.
+ # [3] - Ordered list of any headers included within condionally compiled code.
+ # headers files are expected to have all includes one level deep due to
+ # the omnipresent guards at the top of the file.
+ # [4] - List of all macros which are consumed (used) within this file.
+ # [5] - list of all macros which may be defined in this file.
+ # [6] - The source code for this file, if cached.
+ # [7] - line number info for any headers in the source file. Indexed by base
+ # name, returning the line the include is on.
+
+ empty_iinfo = ("", "", list(), list(), list(), list(), list())
+
+ # This function will process a file and extract interesting information.
+ # DO_MACROS indicates whether macros defined and used should be recorded.
+ # KEEP_SRC indicates the source for the file should be cached.
+ def process_include_info (filen, do_macros, keep_src):
+ header = False
+ if not os.path.exists (filen):
+ return empty_iinfo
+
+ sfile = open (filen, "r");
+ data = sfile.readlines()
+ sfile.close()
+
+ # Ignore the initial #ifdef HEADER_H in header files
+ if filen[-2:] == ".h":
+ nest = -1
+ header = True
+ else:
+ nest = 0
+
+ macout = list ()
+ macin = list()
+ incl = list()
+ cond_incl = list()
+ src_line = { }
+ guard = ""
+
+ for line in (data):
+ if is_pound_if (line):
+ nest += 1
+ elif is_pound_endif (line):
+ nest -= 1
+
+ nm = find_pound_include (line, True, True)
+ if nm != "" and nm not in incl and nm[-2:] == ".h":
+ incl.append (nm)
+ if nest > 0:
+ cond_incl.append (nm)
+ if keep_src:
+ src_line[nm] = line
+ continue
+
+ if do_macros:
+ d = find_pound_define (line)
+ if d:
+ if d not in macout:
+ macout.append (d);
+ continue
+
+ d = find_pound_if (line)
+ if d:
+ # The first #if in a header file should be the guard
+ if header and len (d) == 1 and guard == "":
+ if d[0][-2:] == "_H":
+ guard = d
+ else:
+ guard = "Guess there was no guard..."
+ else:
+ for mac in d:
+ if mac != "defined" and mac not in macin:
+ macin.append (mac);
+
+ if not keep_src:
+ data = list()
+
+ return (os.path.basename (filen), os.path.dirname (filen), incl, cond_incl,
+ macin, macout, data, src_line)
+
+ # Extract header info, but no macros or source code.
+ def process_ii (filen):
+ return process_include_info (filen, False, False)
+
+ # Extract header information, and collect macro information.
+ def process_ii_macro (filen):
+ return process_include_info (filen, True, False)
+
+ # Extract header information, cache the source lines.
+ def process_ii_src (filen):
+ return process_include_info (filen, False, True)
+
+ # Extract header information, coolewc macro info and cache the source lines.
+ def process_ii_macro_src (filen):
+ return process_include_info (filen, True, True)
+
+
+ def ii_base (iinfo):
+ return iinfo[0]
+
+ def ii_path (iinfo):
+ return iinfo[1]
+
+ def ii_include_list (iinfo):
+ return iinfo[2]
+
+ def ii_include_list_cond (iinfo):
+ return iinfo[3]
+
+ def ii_include_list_non_cond (iinfo):
+ l = ii_include_list (iinfo)
+ for n in ii_include_list_cond (iinfo):
+ l.remove (n)
+ return l
+
+ def ii_macro_consume (iinfo):
+ return iinfo[4]
+
+ def ii_macro_define (iinfo):
+ return iinfo[5]
+
+ def ii_src (iinfo):
+ return iinfo[6]
+
+ def ii_src_line (iinfo):
+ return iinfo[7]
+
+ def ii_read (fname):
+ f = open (fname, 'rb')
+ incl = pickle.load (f)
+ consumes = pickle.load (f)
+ defines = pickle.load (f)
+ obj = (fname,fname,incl,list(), list(), consumes, defines, list(), list())
+ return obj
+
+ def ii_write (fname, obj):
+ f = open (fname, 'wb')
+ pickle.dump (obj[2], f)
+ pickle.dump (obj[4], f)
+ pickle.dump (obj[5], f)
+ f.close ()
+
+
+ # Find files matching pattern NAME, return in a list.
+ # CURRENT is True if you want to include the current directory
+ # DEEPER is True if you want to search 3 levels below the current directory
+ # any files with testsuite diurectories are ignored
+
+ def find_gcc_files (name, current, deeper):
+ files = list()
+ command = ""
+ if current:
+ if not deeper:
+ command = "find -maxdepth 1 -name " + name + " -not -path \"./testsuite/*\""
+ else:
+ command = "find -maxdepth 4 -name " + name + " -not -path \"./testsuite/*\""
+ else:
+ if deeper:
+ command = "find -maxdepth 4 -mindepth 2 -name " + name + " -not -path \"./testsuite/*\""
+
+ if command != "":
+ f = os.popen (command)
+ for x in f:
+ if x[0] == ".":
+ fn = x.rstrip()[2:]
+ else:
+ fn = x
+ files.append(fn)
+
+ return files
+
+ # find the list of unique include names found in a file.
+ def find_unique_include_list_src (data):
+ found = list ()
+ for line in data:
+ d = find_pound_include (line, True, True)
+ if d and d not in found and d[-2:] == ".h":
+ found.append (d)
+ return found
+
+ # find the list of unique include names found in a file.
+ def find_unique_include_list (filen):
+ data = open (filen).read().splitlines()
+ return find_unique_include_list_src (data)
+
+
+ # Create the macin, macout, and incl vectors for a file FILEN.
+ # macin are the macros that are used in #if* conditional expressions
+ # macout are the macros which are #defined
+ # incl is the list of incluide files encountered
+ # returned as a tuple of the filename followed by the triplet of lists
+ # (filen, macin, macout, incl)
+
+ def create_macro_in_out (filen):
+ sfile = open (filen, "r");
+ data = sfile.readlines()
+ sfile.close()
+
+ macout = list ()
+ macin = list()
+ incl = list()
+
+ for line in (data):
+ d = find_pound_define (line)
+ if d != "":
+ if d not in macout:
+ macout.append (d);
+ continue
+
+ d = find_pound_if (line)
+ if len(d) != 0:
+ for mac in d:
+ if mac != "defined" and mac not in macin:
+ macin.append (mac);
+ continue
+
+ nm = find_pound_include (line, True, True)
+ if nm != "" and nm not in incl:
+ incl.append (nm)
+
+ return (filen, macin, macout, incl)
+
+ # create the macro information for filen, and create .macin, .macout, and .incl
+ # files. Return the created macro tuple.
+ def create_include_data_files (filen):
+
+ macros = create_macro_in_out (filen)
+ depends = macros[1]
+ defines = macros[2]
+ incls = macros[3]
+
+ disp_message = filen
+ if len (defines) > 0:
+ disp_message = disp_message + " " + str(len (defines)) + " #defines"
+ dfile = open (filen + ".macout", "w")
+ for x in defines:
+ dfile.write (x + "\n")
+ dfile.close ()
+
+ if len (depends) > 0:
+ disp_message = disp_message + " " + str(len (depends)) + " #if dependencies"
+ dfile = open (filen + ".macin", "w")
+ for x in depends:
+ dfile.write (x + "\n")
+ dfile.close ()
+
+ if len (incls) > 0:
+ disp_message = disp_message + " " + str(len (incls)) + " #includes"
+ dfile = open (filen + ".incl", "w")
+ for x in incls:
+ dfile.write (x + "\n")
+ dfile.close ()
+
+ return macros
+
+
+
+ # extract data for include file name_h and enter it into the dictionary.
+ # this does not change once read in. use_requires is True if you want to
+ # prime the values with already created .requires and .provides files.
+ def get_include_data (name_h, use_requires):
+ macin = list()
+ macout = list()
+ incl = list ()
+ if use_requires and os.path.exists (name_h + ".requires"):
+ macin = open (name_h + ".requires").read().splitlines()
+ elif os.path.exists (name_h + ".macin"):
+ macin = open (name_h + ".macin").read().splitlines()
+
+ if use_requires and os.path.exists (name_h + ".provides"):
+ macout = open (name_h + ".provides").read().splitlines()
+ elif os.path.exists (name_h + ".macout"):
+ macout = open (name_h + ".macout").read().splitlines()
+
+ if os.path.exists (name_h + ".incl"):
+ incl = open (name_h + ".incl").read().splitlines()
+
+ if len(macin) == 0 and len(macout) == 0 and len(incl) == 0:
+ return ()
+ data = ( name_h, macin, macout, incl )
+ return data
+
+ # find FIND in src, and replace it with the list of headers in REPLACE.
+ # Remove any duplicates of FIND in REPLACE, and if some of the REPLACE
+ # headers occur earlier in the include chain, leave them.
+ # Return the new SRC only if anything changed.
+ def find_replace_include (find, replace, src):
+ res = list()
+ seen = { }
+ anything = False
+ for line in src:
+ inc = find_pound_include (line, True, True)
+ if inc == find:
+ for y in replace:
+ if seen.get(y) == None:
+ res.append("#include \""+y+"\"\n")
+ seen[y] = True
+ if y != find:
+ anything = True
+ # if find isnt in the replacement list, then we are deleting FIND, so changes.
+ if find not in replace:
+ anything = True
+ else:
+ if inc in replace:
+ if seen.get(inc) == None:
+ res.append (line)
+ seen[inc] = True
+ else:
+ res.append (line)
+
+ if (anything):
+ return res
+ else:
+ return list()
+
+
+ # pass in a require and provide dictionary to be read in.
+ def read_require_provides (require, provide):
+ if not os.path.exists ("require-provide.master"):
+ print "require-provide.master file is not available. please run data collection."
+ sys.exit(1)
+ incl_list = open("require-provide.master").read().splitlines()
+ for f in incl_list:
+ if os.path.exists (f+".requires"):
+ require[os.path.basename (f)] = open (f + ".requires").read().splitlines()
+ else:
+ require[os.path.basename (f)] = list ()
+ if os.path.exists (f+".provides"):
+ provide[os.path.basename (f)] = open (f + ".provides").read().splitlines()
+ else:
+ provide [os.path.basename (f)] = list ()
+
+
+ def build_include_list (filen):
+ include_files = list()
+ sfile = open (filen, "r")
+ data = sfile.readlines()
+ sfile.close()
+ for line in data:
+ nm = find_pound_include (line, False, False)
+ if nm != "" and nm[-2:] == ".h":
+ if nm not in include_files:
+ include_files.append(nm)
+ return include_files
+
+ def build_reverse_include_list (filen):
+ include_files = list()
+ sfile = open (filen, "r")
+ data = sfile.readlines()
+ sfile.close()
+ for line in reversed(data):
+ nm = find_pound_include (line, False, False)
+ if nm != "":
+ if nm not in include_files:
+ include_files.append(nm)
+ return include_files
+
+ # Get compilation return code, and compensate for a warning that we want to
+ # consider an error when it comes to inlined templates.
+ def get_make_rc (rc, output):
+ rc = rc % 1280
+ if rc == 0:
+ # This is not considered an error during compilation of an individual file,
+ # but it will cause an error during link if it isn't defined. If this
+ # warning is seen during compiling a file, make it a build error so we
+ # don't remove the header.
+ h = re.findall ("warning: inline function.*used but never defined", output)
+ if len(h) != 0:
+ rc = 1
+ return rc;
+
+ def get_make_output (build_dir, make_opt):
+ devnull = open('/dev/null', 'w')
+ at_a_time = multiprocessing.cpu_count() * 2
+ make = "make -j"+str(at_a_time)+ " "
+ if build_dir != "":
+ command = "cd " + build_dir +"; " + make + make_opt
+ else:
+ command = make + make_opt
+ process = subprocess.Popen(command, stdout=devnull, stderr=subprocess.PIPE, shell=True)
+ output = process.communicate();
+ rc = get_make_rc (process.returncode, output[1])
+ return (rc , output[1])
+
+ def spawn_makes (command_list):
+ devnull = open('/dev/null', 'w')
+ rc = (0,"", "")
+ proc_res = list()
+ text = " Trying target builds : "
+ for command_pair in command_list:
+ tname = command_pair[0]
+ command = command_pair[1]
+ text += tname + ", "
+ c = subprocess.Popen(command, bufsize=-1, stdout=devnull, stderr=subprocess.PIPE, shell=True)
+ proc_res.append ((c, tname))
+
+ print text[:-2]
+
+ for p in proc_res:
+ output = p[0].communicate()
+ ret = (get_make_rc (p[0].returncode, output[1]), output[1], p[1])
+ if (ret[0] != 0):
+ # Just record the first one.
+ if rc[0] == 0:
+ rc = ret;
+ return rc
+
+ def get_make_output_parallel (targ_list, make_opt, at_a_time):
+ command = list()
+ targname = list()
+ if at_a_time == 0:
+ at_a_time = multiprocessing.cpu_count() * 2
+ proc_res = [0] * at_a_time
+ for x in targ_list:
+ if make_opt[-2:] == ".o":
+ s = "cd " + x[1] + "/gcc/; make " + make_opt
+ else:
+ s = "cd " + x[1] +"; make " + make_opt
+ command.append ((x[0],s))
+
+ num = len(command)
+ rc = (0,"", "")
+ loops = num // at_a_time
+
+ if (loops > 0):
+ for idx in range (loops):
+ ret = spawn_makes (command[idx*at_a_time:(idx+1)*at_a_time])
+ if ret[0] != 0:
+ rc = ret
+ break
+
+ if (rc[0] == 0):
+ leftover = num % at_a_time
+ if (leftover > 0):
+ ret = spawn_makes (command[-leftover:])
+ if ret[0] != 0:
+ rc = ret
+
+ return rc
+
+
+ def readwholefile (src_file):
+ sfile = open (src_file, "r")
+ src_data = sfile.readlines()
+ sfile.close()
+ return src_data
+
Index: headers/included-by
===================================================================
*** headers/included-by (revision 0)
--- headers/included-by (working copy)
***************
*** 0 ****
--- 1,112 ----
+ #! /usr/bin/python2
+ import os.path
+ import sys
+ import shlex
+ import re
+
+ from headerutils import *
+
+
+
+ usage = False
+ src = list()
+ flist = { }
+ process_h = False
+ process_c = False
+ verbose = False
+ level = 0
+ match_all = False
+ num_match = 1
+
+ file_list = list()
+ current = True
+ deeper = True
+ scanfiles = True
+ for x in sys.argv[1:]:
+ if x[0:2] == "-h":
+ usage = True
+ elif x[0:2] == "-i":
+ process_h = True
+ elif x[0:2] == "-s" or x[0:2] == "-c":
+ process_c = True
+ elif x[0:2] == "-v":
+ verbose = True
+ elif x[0:2] == "-a":
+ match_all = True
+ elif x[0:2] == "-n":
+ num_match = int(x[2:])
+ elif x[0:2] == "-1":
+ deeper = False
+ elif x[0:2] == "-2":
+ current = False
+ elif x[0:2] == "-f":
+ file_list = open (x[2:]).read().splitlines()
+ scanfiles = False
+ elif x[0] == "-":
+ print "Error: Unknown option " + x
+ usage = True
+ else:
+ src.append (x)
+
+ if match_all:
+ num_match = len (src)
+
+ if not process_h and not process_c:
+ process_h = True
+ process_c = True
+
+ if len(src) == 0:
+ usage = True
+
+ if not usage:
+ if scanfiles:
+ if process_h:
+ file_list = find_gcc_files ("\*.h", current, deeper)
+ if process_c:
+ file_list = file_list + find_gcc_files ("\*.c", current, deeper)
+ file_list = file_list + find_gcc_files ("\*.cc", current, deeper)
+ else:
+ newlist = list()
+ for x in file_list:
+ if process_h and x[-2:] == ".h":
+ newlist.append (x)
+ elif process_c and (x[-2:] == ".c" or x[-3:] == ".cc"):
+ newlist.append (x)
+ file_list = newlist;
+
+ file_list.sort()
+ for fn in file_list:
+ found = find_unique_include_list (fn)
+ careabout = list()
+ output = ""
+ for inc in found:
+ if inc in src:
+ careabout.append (inc)
+ if output == "":
+ output = fn
+ if verbose:
+ output = output + " [" + inc +"]"
+ if len (careabout) < num_match:
+ output = ""
+ if output != "":
+ print output
+ else:
+ print "included-by [-h] [-i] [-c] [-v] [-a] [-nx] file1 [file2] ... [filen]"
+ print "find the list of all files in subdirectories that include any of "
+ print "the listed files. processed to a depth of 3 subdirs"
+ print " -h : Show this message"
+ print " -i : process only header files (*.h) for #include"
+ print " -c : process only source files (*.c *.cc) for #include"
+ print " If nothing is specified, defaults to -i -c"
+ print " -s : Same as -c."
+ print " -v : Show which include(s) were found"
+ print " -nx : Only list files which have at least x different matches. Default = 1"
+ print " -a : Show only files which all listed files are included"
+ print " This is equivilent to -nT where T == # of items in list"
+ print " -flistfile : Show only files contained in the list of files"
+
+
+
+
+
+
Property changes on: headers/included-by
___________________________________________________________________
Added: svn:executable
## -0,0 +1 ##
+*
\ No newline at end of property
Index: headers/reduce-headers
===================================================================
*** headers/reduce-headers (revision 0)
--- headers/reduce-headers (working copy)
***************
*** 0 ****
--- 1,596 ----
+ #! /usr/bin/python2
+ import os.path
+ import sys
+ import shlex
+ import re
+ import tempfile
+ import copy
+
+ from headerutils import *
+
+ requires = { }
+ provides = { }
+
+ no_remove = [ "system.h", "coretypes.h", "config.h" , "bconfig.h", "backend.h" ]
+
+ # These targets are the ones which provide "coverage". Typically, if any
+ # target is going to fail compilation, it's one of these. This was determined
+ # during the initial runs of reduce-headers... On a full set of target builds,
+ # every failure which occured was triggered by one of these.
+ # This list is used during target-list construction simply to put any of these
+ # *first* in the candidate list, increasing the probability that a failure is
+ # found quickly.
+ target_priority = [
+ "aarch64-linux-gnu",
+ "arm-netbsdelf",
+ "avr-rtems",
+ "c6x-elf",
+ "epiphany-elf",
+ "hppa2.0-hpux10.1",
+ "i686-mingw32crt",
+ "i686-pc-msdosdjgpp",
+ "mipsel-elf",
+ "powerpc-eabisimaltivec",
+ "rs6000-ibm-aix5.1.0",
+ "sh-superh-elf",
+ "sparc64-elf",
+ "spu-elf"
+ ]
+
+
+ target_dir = ""
+ build_dir = ""
+ ignore_list = list()
+ target_builds = list()
+
+ target_dict = { }
+ header_dict = { }
+ search_path = [ ".", "../include", "../libcpp/include" ]
+
+ remove_count = { }
+
+
+ # Given a header name, normalize it. ie. cp/cp-tree.h could be in gcc, while
+ # the same header could be referenced from within the cp subdirectory as
+ # just cp-tree.h
+ # for now, just assume basenames are unique
+
+ def normalize_header (header):
+ return os.path.basename (header)
+
+
+ # Adds a header file and its sub-includes to the global dictionary if they
+ # aren't already there. Specify s_path since different build directories may
+ # append themselves on demand to the global list.
+ # return entry for the specified header, knowing all sub entries are completed
+
+ def get_header_info (header, s_path):
+ global header_dict
+ global empty_iinfo
+ process_list = list ()
+ location = ""
+ bname = ""
+ bname_iinfo = empty_iinfo
+ for path in s_path:
+ if os.path.exists (path + "/" + header):
+ location = path + "/" + header
+ break
+
+ if location:
+ bname = normalize_header (location)
+ if header_dict.get (bname):
+ bname_iinfo = header_dict[bname]
+ loc2 = ii_path (bname_iinfo)+ "/" + bname
+ if loc2[:2] == "./":
+ loc2 = loc2[2:]
+ if location[:2] == "./":
+ location = location[2:]
+ if loc2 != location:
+ # Don't use the cache if it isnt the right one.
+ bname_iinfo = process_ii_macro (location)
+ return bname_iinfo
+
+ bname_iinfo = process_ii_macro (location)
+ header_dict[bname] = bname_iinfo
+ # now decend into the include tree
+ for i in ii_include_list (bname_iinfo):
+ get_header_info (i, s_path)
+ else:
+ # if the file isnt in the source directories, look in the build and target
+ # directories. If it is here, then aggregate all the versions.
+ location = build_dir + "/gcc/" + header
+ build_inc = target_inc = False
+ if os.path.exists (location):
+ build_inc = True
+ for x in target_dict:
+ location = target_dict[x] + "/gcc/" + header
+ if os.path.exists (location):
+ target_inc = True
+ break
+
+ if (build_inc or target_inc):
+ bname = normalize_header(header)
+ defines = set()
+ consumes = set()
+ incl = set()
+ if build_inc:
+ iinfo = process_ii_macro (build_dir + "/gcc/" + header)
+ defines = set (ii_macro_define (iinfo))
+ consumes = set (ii_macro_consume (iinfo))
+ incl = set (ii_include_list (iinfo))
+
+ if (target_inc):
+ for x in target_dict:
+ location = target_dict[x] + "/gcc/" + header
+ if os.path.exists (location):
+ iinfo = process_ii_macro (location)
+ defines.update (ii_macro_define (iinfo))
+ consumes.update (ii_macro_consume (iinfo))
+ incl.update (ii_include_list (iinfo))
+
+ bname_iinfo = (header, "build", list(incl), list(), list(consumes), list(defines), list(), list())
+
+ header_dict[bname] = bname_iinfo
+ for i in incl:
+ get_header_info (i, s_path)
+
+ return bname_iinfo
+
+
+ # return a list of all headers brought in by this header
+ def all_headers (fname):
+ global header_dict
+ headers_stack = list()
+ headers_list = list()
+ if header_dict.get (fname) == None:
+ return list ()
+ for y in ii_include_list (header_dict[fname]):
+ headers_stack.append (y)
+
+ while headers_stack:
+ h = headers_stack.pop ()
+ hn = normalize_header (h)
+ if hn not in headers_list:
+ headers_list.append (hn)
+ if header_dict.get(hn):
+ for y in ii_include_list (header_dict[hn]):
+ if normalize_header (y) not in headers_list:
+ headers_stack.append (y)
+
+ return headers_list
+
+
+
+
+ # Search bld_dir for all target tuples, confirm that they have a build path with
+ # bld_dir/target-tuple/gcc, and build a dictionary of build paths indexed by
+ # target tuple..
+
+ def build_target_dict (bld_dir, just_these):
+ global target_dict
+ target_doct = { }
+ error = False
+ if os.path.exists (bld_dir):
+ if just_these:
+ ls = just_these
+ else:
+ ls = os.listdir(bld_dir)
+ for t in ls:
+ if t.find("-") != -1:
+ target = t.strip()
+ tpath = bld_dir + "/" + target
+ if not os.path.exists (tpath + "/gcc"):
+ print "Error: gcc build directory for target " + t + " Does not exist: " + tpath + "/gcc"
+ error = True
+ else:
+ target_dict[target] = tpath
+
+ if error:
+ target_dict = { }
+
+ def get_obj_name (src_file):
+ if src_file[-2:] == ".c":
+ return src_file.replace (".c", ".o")
+ elif src_file[-3:] == ".cc":
+ return src_file.replace (".cc", ".o")
+ return ""
+
+ def target_obj_exists (target, obj_name):
+ global target_dict
+ # look in a subdir if src has a subdir, then check gcc base directory.
+ if target_dict.get(target):
+ obj = target_dict[target] + "/gcc/" + obj_name
+ if not os.path.exists (obj):
+ obj = target_dict[target] + "/gcc/" + os.path.basename(obj_name)
+ if os.path.exists (obj):
+ return True
+ return False
+
+ # Given a src file, return a list of targets which may build this file.
+ def find_targets (src_file):
+ global target_dict
+ targ_list = list()
+ obj_name = get_obj_name (src_file)
+ if not obj_name:
+ print "Error: " + src_file + " - Cannot determine object name."
+ return list()
+
+ # Put the high priority targets which tend to trigger failures first
+ for target in target_priority:
+ if target_obj_exists (target, obj_name):
+ targ_list.append ((target, target_dict[target]))
+
+ for target in target_dict:
+ if target not in target_priority and target_obj_exists (target, obj_name):
+ targ_list.append ((target, target_dict[target]))
+
+ return targ_list
+
+
+ def try_to_remove (src_file, h_list, verbose):
+ global target_dict
+ global header_dict
+ global build_dir
+
+ # build from scratch each time
+ header_dict = { }
+ summary = ""
+ rmcount = 0
+
+ because = { }
+ src_info = process_ii_macro_src (src_file)
+ src_data = ii_src (src_info)
+ if src_data:
+ inclist = ii_include_list_non_cond (src_info)
+ # work is done if there are no includes to check
+ if not inclist:
+ return src_file + ": No include files to attempt to remove"
+
+ # work on the include list in reverse.
+ inclist.reverse()
+
+ # Get the target list
+ targ_list = list()
+ targ_list = find_targets (src_file)
+
+ spath = search_path
+ if os.path.dirname (src_file):
+ spath.append (os.path.dirname (src_file))
+
+ hostbuild = True
+ if src_file.find("config/") != -1:
+ # config files dont usually build on the host
+ hostbuild = False
+ obn = get_obj_name (os.path.basename (src_file))
+ if obn and os.path.exists (build_dir + "/gcc/" + obn):
+ hostbuild = True
+ if not target_dict:
+ summary = src_file + ": Target builds are required for config files. None found."
+ print summary
+ return summary
+ if not targ_list:
+ summary =src_file + ": Cannot find any targets which build this file."
+ print summary
+ return summary
+
+ if hostbuild:
+ # confirm it actually builds before we do anything
+ print "Confirming source file builds"
+ res = get_make_output (build_dir + "/gcc", "all")
+ if res[0] != 0:
+ message = "Error: " + src_file + " does not build currently."
+ summary = src_file + " does not build on host."
+ print message
+ print res[1]
+ if verbose:
+ verbose.write (message + "\n")
+ verbose.write (res[1]+ "\n")
+ return summary
+
+ src_requires = set (ii_macro_consume (src_info))
+ for macro in src_requires:
+ because[macro] = src_file
+ header_seen = list ()
+
+ os.rename (src_file, src_file + ".bak")
+ src_orig = copy.deepcopy (src_data)
+ src_tmp = copy.deepcopy (src_data)
+
+ try:
+ # process the includes from bottom to top. This is because we know that
+ # later includes have are known to be needed, so any dependency from this
+ # header is a true dependency
+ for inc_file in inclist:
+ inc_file_norm = normalize_header (inc_file)
+
+ if inc_file in no_remove:
+ continue
+ if len (h_list) != 0 and inc_file_norm not in h_list:
+ continue
+ if inc_file_norm[0:3] == "gt-":
+ continue
+ if inc_file_norm[0:6] == "gtype-":
+ continue
+ if inc_file_norm.replace(".h",".c") == os.path.basename(src_file):
+ continue
+
+ lookfor = ii_src_line(src_info)[inc_file]
+ src_tmp.remove (lookfor)
+ message = "Trying " + src_file + " without " + inc_file
+ print message
+ if verbose:
+ verbose.write (message + "\n")
+ out = open(src_file, "w")
+ for line in src_tmp:
+ out.write (line)
+ out.close()
+
+ keep = False
+ if hostbuild:
+ res = get_make_output (build_dir + "/gcc", "all")
+ else:
+ res = (0, "")
+
+ rc = res[0]
+ message = "Passed Host build"
+ if (rc != 0):
+ # host build failed
+ message = "Compilation failed:\n";
+ keep = True
+ else:
+ if targ_list:
+ objfile = get_obj_name (src_file)
+ t1 = targ_list[0]
+ if objfile and os.path.exists(t1[1] +"/gcc/"+objfile):
+ res = get_make_output_parallel (targ_list, objfile, 0)
+ else:
+ res = get_make_output_parallel (targ_list, "all-gcc", 0)
+ rc = res[0]
+ if rc != 0:
+ message = "Compilation failed on TARGET : " + res[2]
+ keep = True
+ else:
+ message = "Passed host and target builds"
+
+ if keep:
+ print message + "\n"
+
+ if (rc != 0):
+ if verbose:
+ verbose.write (message + "\n");
+ verbose.write (res[1])
+ verbose.write ("\n");
+ if os.path.exists (inc_file):
+ ilog = open(inc_file+".log","a")
+ ilog.write (message + " for " + src_file + ":\n\n");
+ ilog.write ("============================================\n");
+ ilog.write (res[1])
+ ilog.write ("\n");
+ ilog.close()
+ if os.path.exists (src_file):
+ ilog = open(src_file+".log","a")
+ ilog.write (message + " for " +inc_file + ":\n\n");
+ ilog.write ("============================================\n");
+ ilog.write (res[1])
+ ilog.write ("\n");
+ ilog.close()
+
+ # Given a sequence where :
+ # #include "tm.h"
+ # #include "target.h" // includes tm.h
+
+ # target.h was required, and when attempting to remove tm.h we'd see that
+ # all the macro defintions are "required" since they all look like:
+ # #ifndef HAVE_blah
+ # #define HAVE_blah
+ # endif
+
+ # when target.h was found to be required, tm.h will be tagged as included.
+ # so when we get this far, we know we dont have to check the macros for
+ # tm.h since we know it is already been included.
+
+ if inc_file_norm not in header_seen:
+ iinfo = get_header_info (inc_file, spath)
+ newlist = all_headers (inc_file_norm)
+ if ii_path(iinfo) == "build" and not target_dict:
+ keep = True
+ text = message + " : Will not remove a build file without some targets."
+ print text
+ ilog = open(src_file+".log","a")
+ ilog.write (text +"\n")
+ ilog.write ("============================================\n");
+ ilog.close()
+ ilog = open("reduce-headers-kept.log","a")
+ ilog.write (src_file + " " + text +"\n")
+ ilog.close()
+ else:
+ newlist = list()
+ if not keep and inc_file_norm not in header_seen:
+ # now look for any macro requirements.
+ for h in newlist:
+ if not h in header_seen:
+ if header_dict.get(h):
+ defined = ii_macro_define (header_dict[h])
+ for dep in defined:
+ if dep in src_requires and dep not in ignore_list:
+ keep = True;
+ text = message + ", but must keep " + inc_file + " because it provides " + dep
+ if because.get(dep) != None:
+ text = text + " Possibly required by " + because[dep]
+ print text
+ ilog = open(inc_file+".log","a")
+ ilog.write (because[dep]+": Requires [dep] in "+src_file+"\n")
+ ilog.write ("============================================\n");
+ ilog.close()
+ ilog = open(src_file+".log","a")
+ ilog.write (text +"\n")
+ ilog.write ("============================================\n");
+ ilog.close()
+ ilog = open("reduce-headers-kept.log","a")
+ ilog.write (src_file + " " + text +"\n")
+ ilog.close()
+ if verbose:
+ verbose.write (text + "\n")
+
+ if keep:
+ # add all headers 'consumes' to src_requires list, and mark as seen
+ for h in newlist:
+ if not h in header_seen:
+ header_seen.append (h)
+ if header_dict.get(h):
+ consume = ii_macro_consume (header_dict[h])
+ for dep in consume:
+ if dep not in src_requires:
+ src_requires.add (dep)
+ if because.get(dep) == None:
+ because[dep] = inc_file
+
+ src_tmp = copy.deepcopy (src_data)
+ else:
+ print message + " --> removing " + inc_file + "\n"
+ rmcount += 1
+ if verbose:
+ verbose.write (message + " --> removing " + inc_file + "\n")
+ if remove_count.get(inc_file) == None:
+ remove_count[inc_file] = 1
+ else:
+ remove_count[inc_file] += 1
+ src_data = copy.deepcopy (src_tmp)
+ except:
+ print "Interuption: restoring original file"
+ out = open(src_file, "w")
+ for line in src_orig:
+ out.write (line)
+ out.close()
+ raise
+
+ # copy current version, since it is the "right" one now.
+ out = open(src_file, "w")
+ for line in src_data:
+ out.write (line)
+ out.close()
+
+ # Try a final host bootstrap build to make sure everything is kosher.
+ if hostbuild:
+ res = get_make_output (build_dir, "all")
+ rc = res[0]
+ if (rc != 0):
+ # host build failed! return to original version
+ print "Error: " + src_file + " Failed to bootstrap at end!!! restoring."
+ print " Bad version at " + src_file + ".bad"
+ os.rename (src_file, src_file + ".bad")
+ out = open(src_file, "w")
+ for line in src_orig:
+ out.write (line)
+ out.close()
+ return src_file + ": failed to build after reduction. Restored original"
+
+ if src_data == src_orig:
+ summary = src_file + ": No change."
+ else:
+ summary = src_file + ": Reduction performed, "+str(rmcount)+" includes removed."
+ print summary
+ return summary
+
+ only_h = list ()
+ ignore_cond = False
+
+ usage = False
+ src = list()
+ only_targs = list ()
+ for x in sys.argv[1:]:
+ if x[0:2] == "-b":
+ build_dir = x[2:]
+ elif x[0:2] == "-f":
+ fn = normalize_header (x[2:])
+ if fn not in only_h:
+ only_h.append (fn)
+ elif x[0:2] == "-h":
+ usage = True
+ elif x[0:2] == "-d":
+ ignore_cond = True
+ elif x[0:2] == "-D":
+ ignore_list.append(x[2:])
+ elif x[0:2] == "-T":
+ only_targs.append(x[2:])
+ elif x[0:2] == "-t":
+ target_dir = x[2:]
+ elif x[0] == "-":
+ print "Error: Unrecognized option " + x
+ usgae = True
+ else:
+ if not os.path.exists (x):
+ print "Error: specified file " + x + " does not exist."
+ usage = True
+ else:
+ src.append (x)
+
+ if target_dir:
+ build_target_dict (target_dir, only_targs)
+
+ if build_dir == "" and target_dir == "":
+ print "Error: Must specify a build directory, and/or a target directory."
+ usage = True
+
+ if build_dir and not os.path.exists (build_dir):
+ print "Error: specified build directory does not exist : " + build_dir
+ usage = True
+
+ if target_dir and not os.path.exists (target_dir):
+ print "Error: specified target directory does not exist : " + target_dir
+ usage = True
+
+ if usage:
+ print "Attempts to remove extraneous include files from source files."
+ print " "
+ print "Should be run from the main gcc source directory, and works on a target"
+ print "directory, as we attempt to make the 'all' target."
+ print " "
+ print "By default, gcc-reorder-includes is run on each file before attempting"
+ print "to remove includes. this removes duplicates and puts some headers in a"
+ print "canonical ordering"
+ print " "
+ print "The build directory should be ready to compile via make. Time is saved"
+ print "if the build is already complete, so that only changes need to be built."
+ print " "
+ print "Usage: [options] file1.c [file2.c] ... [filen.c]"
+ print " -bdir : the root build directory to attempt buiding .o files."
+ print " -tdir : the target build directory"
+ print " -d : Ignore conditional macro dependencies."
+ print " "
+ print " -Dmacro : Ignore a specific macro for dependencies"
+ print " -Ttarget : Only consider target in target directory."
+ print " -fheader : Specifies a specific .h file to be considered."
+ print " "
+ print " -D, -T, and -f can be specified mulitple times and are aggregated."
+ print " "
+ print " The original file will be in filen.bak"
+ print " "
+ sys.exit (0)
+
+ if only_h:
+ print "Attempting to remove only these files:"
+ for x in only_h:
+ print x
+ print " "
+
+ logfile = open("reduce-headers.log","w")
+
+ for x in src:
+ msg = try_to_remove (x, only_h, logfile)
+ ilog = open("reduce-headers.sum","a")
+ ilog.write (msg + "\n")
+ ilog.close()
+
+ ilog = open("reduce-headers.sum","a")
+ ilog.write ("===============================================================\n")
+ for x in remove_count:
+ msg = x + ": Removed " + str(remove_count[x]) + " times."
+ print msg
+ logfile.write (msg + "\n")
+ ilog.write (msg + "\n")
+
+
+
+
+
Property changes on: headers/reduce-headers
___________________________________________________________________
Added: svn:executable
## -0,0 +1 ##
+*
\ No newline at end of property
Index: headers/replace-header
===================================================================
*** headers/replace-header (revision 0)
--- headers/replace-header (working copy)
***************
*** 0 ****
--- 1,53 ----
+ #! /usr/bin/python2
+ import os.path
+ import sys
+ import shlex
+ import re
+
+ from headerutils import *
+
+
+ files = list()
+ replace = list()
+ find = ""
+ usage = False
+
+ for x in sys.argv[1:]:
+ if x[0:2] == "-h":
+ usage = True
+ elif x[0:2] == "-f" and find == "":
+ find = x[2:]
+ elif x[0:2] == "-r":
+ replace.append (x[2:])
+ elif x[0:1] == "-":
+ print "Error: unrecognized option " + x
+ usage = True
+ else:
+ files.append (x)
+
+ if find == "":
+ usage = True
+
+ if usage:
+ print "replace-header -fheader -rheader [-rheader] file1 [filen.]"
+ sys.exit(0)
+
+ string = ""
+ for x in replace:
+ string = string + " '"+x+"'"
+ print "Replacing '"+find+"' with"+string
+
+ for x in files:
+ src = readwholefile (x)
+ src = find_replace_include (find, replace, src)
+ if (len(src) > 0):
+ print x + ": Changed"
+ out = open(x, "w")
+ for line in src:
+ out.write (line);
+ out.close ()
+ else:
+ print x
+
+
+
Property changes on: headers/replace-header
___________________________________________________________________
Added: svn:executable
## -0,0 +1 ##
+*
\ No newline at end of property
Index: headers/show-headers
===================================================================
*** headers/show-headers (revision 0)
--- headers/show-headers (working copy)
***************
*** 0 ****
--- 1,138 ----
+ #! /usr/bin/python2
+ import os.path
+ import sys
+ import shlex
+ import re
+
+ from headerutils import *
+
+
+ tabstop = 2
+ padding = " "
+ seen = { }
+ output = list()
+ summary = list()
+ sawcore = False
+
+ # list of headers to emphasize
+ highlight = list ()
+
+ # search path for headers
+ incl_dirs = [".", "../include", "../../build/gcc", "../libcpp/include" ]
+ # extra search paths to look in *after* the directory the source file is in.
+ extra_dirs = [ "common", "c-family", "c", "cp", "config" ]
+
+ # append (1) to the end of the first line which includes INC in list INC.
+ def append_1 (output, inc):
+ for n,t in enumerate (output):
+ idx = t.find(inc)
+ if idx != -1:
+ eos = idx + len (inc)
+ t = t[:eos] + " (1)" + t[eos+1:]
+ output[n] = t
+ return
+
+ # These headers show up as duplicates in rtl.h due to conditional code arund the includes
+ rtl_core = [ "machmode.h" , "signop.h" , "wide-int.h" , "double-int.h" , "real.h" , "fixed-value.h" , "statistics.h" , "vec.h" , "hash-table.h" , "hash-set.h" , "input.h" , "is-a.h" ]
+
+ def find_include_data (inc):
+ global sawcore
+ for x in incl_dirs:
+ nm = x+"/"+inc
+ if os.path.exists (nm):
+ info = find_unique_include_list (nm)
+ # rtl.h mimics coretypes for GENERATOR FILES, remove if coretypes.h seen.
+ if inc == "coretypes.h":
+ sawcore = True
+ elif inc == "rtl.h" and sawcore:
+ for i in rtl_core:
+ if i in info:
+ info.remove (i)
+ return info
+ return list()
+
+ def process_include (inc, indent):
+ if inc[-2:] != ".h":
+ return
+ bname = os.path.basename (inc)
+ if bname in highlight:
+ arrow = " <<-------"
+ if bname not in summary:
+ summary.append (bname)
+ else:
+ arrow = ""
+ if seen.get(inc) == None:
+ seen[inc] = 1
+ output.append (padding[:indent*tabstop] + bname + arrow)
+ info = find_include_data (inc)
+ for y in info:
+ process_include (y, indent+1)
+ else:
+ seen[inc] += 1
+ if (seen[inc] == 2):
+ append_1(output, inc)
+ output.append (padding[:indent*tabstop] + bname + " ("+str(seen[inc])+")" + arrow)
+
+
+
+ blddir = [ "." ]
+ usage = False
+ src = list()
+
+ for x in sys.argv[1:]:
+ if x[0:2] == "-i":
+ bld = x[2:]
+ print "Build dir : " + bld
+ blddir.append (bld)
+ elif x[0:2] == "-s":
+ highlight.append (os.path.basename (x[2:]))
+ elif x[0:2] == "-h":
+ usage = True
+ else:
+ src.append (x)
+
+ if len(src) != 1:
+ usage = True
+ elif not os.path.exists (src[0]):
+ print src[0] + ": Requested source file does not exist.\n"
+ usage = True
+
+ if usage:
+ print "show-headers [-idir] [-sfilen] file1 "
+ print " "
+ print " Show a hierarchical visual format how many times each header file"
+ print " is included in a source file. Should be run from the source directory"
+ print " files from find-include-depends"
+ print " -s : search for a header, and point it out."
+ print " -i : Specifies 1 or more directories to search for includes."
+ sys.exit(0)
+
+
+ if len(blddir) > 1:
+ incl_dirs = blddir
+
+ x = src[0]
+ # if source is in a subdirectory, add the subdirectory to the search list
+ srcpath = os.path.dirname(x)
+ if srcpath:
+ incl_dirs.append (srcpath)
+ for yy in extra_dirs:
+ incl_dirs.append (yy)
+
+ output = list()
+ sawcore = False
+ incl = find_unique_include_list (x)
+ for inc in incl:
+ process_include (inc, 1)
+ print "\n" + x
+ for line in output:
+ print line
+
+ if highlight:
+ print " "
+ for h in summary:
+ print h + " is included by source file."
+ for h in highlight:
+ if h not in summary:
+ print h + " is not included by source file."
+
Property changes on: headers/show-headers
___________________________________________________________________
Added: svn:executable
## -0,0 +1 ##
+*
\ No newline at end of property
^ permalink raw reply [flat|nested] 65+ messages in thread
* config header file reduction patch checked in.
2015-10-12 8:04 ` [patch] header file re-ordering Jeff Law
2015-10-14 14:05 ` Andrew MacLeod
@ 2015-10-16 19:52 ` Andrew MacLeod
2015-10-16 20:17 ` Andrew MacLeod
2015-10-18 9:34 ` Iain Sandoe
1 sibling, 2 replies; 65+ messages in thread
From: Andrew MacLeod @ 2015-10-16 19:52 UTC (permalink / raw)
To: Jeff Law, gcc-patches
[-- Attachment #1: Type: text/plain, Size: 1180 bytes --]
On 10/12/2015 04:04 AM, Jeff Law wrote:
> On 10/08/2015 07:37 AM, Andrew MacLeod wrote:
>> On 10/07/2015 06:02 PM, Jeff Law wrote:
>
> I'm slightly concerned about the darwin, windows and solaris bits.
> The former primarily because Darwin has been a general source of pain,
> and in the others because I'm not sure the cross testing will exercise
> that code terribly much.
>
> I'll go ahead and approve all the config/ bits. Please be on the
> lookout for any fallout.
>
> I'll try and get into more of the other patches tomorrow.
>
>
OK, I've checked in the config changes. I rebuilt all the cross
compilers for the 200+ targets, and they still build.. as well as
bootstrapping on x86_64-pc-linux-gnu with no regressions.
So. If any one runs into a native build issue you can either add the
required header back in, or back out the file for your port, and I'll
look into why something happened. The only thing I can imagine is
files that have conditional compilation based on a macro that is only
ever defined on a native build command line or headers. Its unlikely...
but possible.
I've attached the latest version of the patch for the record.
Andrew
[-- Attachment #2: config2-final.patch.bz2 --]
[-- Type: application/x-bzip, Size: 8957 bytes --]
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: config header file reduction patch checked in.
2015-10-16 19:52 ` config header file reduction patch checked in Andrew MacLeod
@ 2015-10-16 20:17 ` Andrew MacLeod
2015-10-18 9:34 ` Iain Sandoe
1 sibling, 0 replies; 65+ messages in thread
From: Andrew MacLeod @ 2015-10-16 20:17 UTC (permalink / raw)
To: Jeff Law, gcc-patches
On 10/16/2015 03:49 PM, Andrew MacLeod wrote:
> On 10/12/2015 04:04 AM, Jeff Law wrote:
>> On 10/08/2015 07:37 AM, Andrew MacLeod wrote:
>>> On 10/07/2015 06:02 PM, Jeff Law wrote:
>>
>> I'm slightly concerned about the darwin, windows and solaris bits.
>> The former primarily because Darwin has been a general source of
>> pain, and in the others because I'm not sure the cross testing will
>> exercise that code terribly much.
>>
>> I'll go ahead and approve all the config/ bits. Please be on the
>> lookout for any fallout.
>>
>> I'll try and get into more of the other patches tomorrow.
>>
>>
>
> OK, I've checked in the config changes. I rebuilt all the cross
> compilers for the 200+ targets, and they still build.. as well as
> bootstrapping on x86_64-pc-linux-gnu with no regressions.
>
> So. If any one runs into a native build issue you can either add the
> required header back in, or back out the file for your port, and I'll
> look into why something happened. The only thing I can imagine is
> files that have conditional compilation based on a macro that is only
> ever defined on a native build command line or headers. Its
> unlikely... but possible.
>
btw, out of all the targets, the only one which didn't build before my
patch was i686-interix3OPT-enable-obsolete...
so that one isn't my fault :-)
Andrew
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: config header file reduction patch checked in.
2015-10-16 19:52 ` config header file reduction patch checked in Andrew MacLeod
2015-10-16 20:17 ` Andrew MacLeod
@ 2015-10-18 9:34 ` Iain Sandoe
2015-10-19 15:55 ` Andrew MacLeod
1 sibling, 1 reply; 65+ messages in thread
From: Iain Sandoe @ 2015-10-18 9:34 UTC (permalink / raw)
To: Andrew MacLeod
Cc: Jeff Law, gcc-patches List, Mike Stump, Dominique Dhumieres
Hi Andrew,
On 16 Oct 2015, at 20:49, Andrew MacLeod wrote:
> On 10/12/2015 04:04 AM, Jeff Law wrote:
>> On 10/08/2015 07:37 AM, Andrew MacLeod wrote:
>>> On 10/07/2015 06:02 PM, Jeff Law wrote:
>>
>> I'm slightly concerned about the darwin, windows and solaris bits. The former primarily because Darwin has been a general source of pain, and in the others because I'm not sure the cross testing will exercise that code terribly much.
>>
>> I'll go ahead and approve all the config/ bits. Please be on the lookout for any fallout.
>>
>> I'll try and get into more of the other patches tomorrow.
>>
>>
>
> OK, I've checked in the config changes. I rebuilt all the cross compilers for the 200+ targets, and they still build.. as well as bootstrapping on x86_64-pc-linux-gnu with no regressions.
>
> So. If any one runs into a native build issue you can either add the required header back in, or back out the file for your port, and I'll look into why something happened. The only thing I can imagine is files that have conditional compilation based on a macro that is only ever defined on a native build command line or headers. Its unlikely... but possible.
I've applied the following to fix Darwin native bootstrap.
AFAICT (from reading the other thread on the re-ordering tools) putting the diagnostics header at the end of the list is the right thing to do.
FWIW,
a) of course, Darwin exercises ObjC/ObjC++ in *both* NeXT and GNU mode - so those are pretty well covered by this too.
b) darwin folks will usually do their best to test any patch that you think is specifically risky - but you need to ask, because we have (very) limited resources in time and hardware ;-) ...
thanks for tidying things up!
(I, for one, think that improving the separation of things is worth a small amount of pain along the way).
cheers,
Iain
gcc/
+2015-10-18 Iain Sandoe <iain@codesourcery.com>
+
+ * config/darwin-driver.h: Adjust includes to add diagnostic-core.
+
2015-10-16 Trevor Saunders <tbsaunde+gcc@tbsaunde.org>
* lra-constraints.c (add_next_usage_insn): Change argument type
Index: gcc/config/darwin-driver.c
===================================================================
--- gcc/config/darwin-driver.c (revision 228938)
+++ gcc/config/darwin-driver.c (working copy)
@@ -23,6 +23,7 @@
#include "coretypes.h"
#include "tm.h"
#include "opts.h"
+#include "diagnostic-core.h"
#ifndef CROSS_DIRECTORY_STRUCTURE
#include <sys/sysctl.h>
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: config header file reduction patch checked in.
2015-10-18 9:34 ` Iain Sandoe
@ 2015-10-19 15:55 ` Andrew MacLeod
2015-10-23 17:02 ` Bernd Schmidt
0 siblings, 1 reply; 65+ messages in thread
From: Andrew MacLeod @ 2015-10-19 15:55 UTC (permalink / raw)
To: Iain Sandoe; +Cc: Jeff Law, gcc-patches List, Mike Stump, Dominique Dhumieres
On 10/18/2015 05:31 AM, Iain Sandoe wrote:
> Hi Andrew,
>
> On 16 Oct 2015, at 20:49, Andrew MacLeod wrote:
>
>> On 10/12/2015 04:04 AM, Jeff Law wrote:
>>> On 10/08/2015 07:37 AM, Andrew MacLeod wrote:
>>>> On 10/07/2015 06:02 PM, Jeff Law wrote:
>>> I'm slightly concerned about the darwin, windows and solaris bits. The former primarily because Darwin has been a general source of pain, and in the others because I'm not sure the cross testing will exercise that code terribly much.
>>>
>>> I'll go ahead and approve all the config/ bits. Please be on the lookout for any fallout.
>>>
>>> I'll try and get into more of the other patches tomorrow.
>>>
>>>
>> OK, I've checked in the config changes. I rebuilt all the cross compilers for the 200+ targets, and they still build.. as well as bootstrapping on x86_64-pc-linux-gnu with no regressions.
>>
>> So. If any one runs into a native build issue you can either add the required header back in, or back out the file for your port, and I'll look into why something happened. The only thing I can imagine is files that have conditional compilation based on a macro that is only ever defined on a native build command line or headers. Its unlikely... but possible.
> I've applied the following to fix Darwin native bootstrap.
> AFAICT (from reading the other thread on the re-ordering tools) putting the diagnostics header at the end of the list is the right thing to do.
>
> FWIW,
> a) of course, Darwin exercises ObjC/ObjC++ in *both* NeXT and GNU mode - so those are pretty well covered by this too.
>
> b) darwin folks will usually do their best to test any patch that you think is specifically risky - but you need to ask, because we have (very) limited resources in time and hardware ;-) ...
>
> thanks for tidying things up!
> (I, for one, think that improving the separation of things is worth a small amount of pain along the way).
>
> cheers,
> Iain
>
> gcc/
>
> +2015-10-18 Iain Sandoe <iain@codesourcery.com>
> +
> + * config/darwin-driver.h: Adjust includes to add diagnostic-core.
> +
interesting that none of the cross builds need diagnostics-core.h. I see
it used in 7 different targets. Must be something on the native build
command line that is defined which causes it to be needed.
Anyway, Thanks for fixing it.
btw, that should be darwin-driver.c not .h in the changelog right?
Andrew
> 2015-10-16 Trevor Saunders <tbsaunde+gcc@tbsaunde.org>
>
> * lra-constraints.c (add_next_usage_insn): Change argument type
> Index: gcc/config/darwin-driver.c
> ===================================================================
> --- gcc/config/darwin-driver.c (revision 228938)
> +++ gcc/config/darwin-driver.c (working copy)
> @@ -23,6 +23,7 @@
> #include "coretypes.h"
> #include "tm.h"
> #include "opts.h"
> +#include "diagnostic-core.h"
>
> #ifndef CROSS_DIRECTORY_STRUCTURE
> #include <sys/sysctl.h>
>
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch] header file re-ordering.
2015-10-14 14:05 ` Andrew MacLeod
@ 2015-10-19 21:05 ` Jeff Law
0 siblings, 0 replies; 65+ messages in thread
From: Jeff Law @ 2015-10-19 21:05 UTC (permalink / raw)
To: Andrew MacLeod, gcc-patches
On 10/14/2015 08:05 AM, Andrew MacLeod wrote:
> On 10/12/2015 04:04 AM, Jeff Law wrote:
>
> Oh, you must be looking at the original combined patch?
Possibly :-)
>
> fold-const.h is indirectly included by cp-tree.h, which gets it from
> including c-common.h. some of the output from show-headers on
> objc-act.c (indentation represents levels of including. The number in
> parenthesis is the number of times that include has been seen so far in
> the files include list. As you can see, we include ansidecl.h a lot
> :-) Most of the time there isn't much we can do about those sorts of
> things. :
>
> cp-tree.h
> tm.h (2)
> hard-reg-set.h
> function.h (1)
> c-common.h
> splay-tree.h
> ansidecl.h (4)
> cpplib.h
> symtab.h (2)
> line-map.h (2)
> alias.h
> tree.h (2)
> fold-const.h
> diagnostic-core.h (1)
> bversion.h
>
> I guess It could be a useful addition to show-headers to specify a
> header file you are looking for and show you where it comes from if its
> included...
Yea. Though I think it's probably easy enough to get it from the
current output.
>
> I any case, there is some indirection here because none of the front end
> files were flattened that much
And I think that's probably some source of the confusion on my part. I
thought we'd flattened the front-end .h files too. So I didn't look
deeply into the .h files to see if they were doing something undesirable
behind my back.
>
> incidentally, you may notice this is the second time tree.h is
> included. The first occurrence of tree.h is included directly by
> objc-act.c, but it needs to be left because something between that and
> cp-tree.h needs tree.h to compile. This sort of thing is resolved by
> using the re-order tool, but I did not run that tool on most of the objc
> and objcp files as they have some complex conditionals in their include
> list:
> #include "tree.h"
> #include "stringpool.h"
> #include "stor-layout.h"
> #include "attribs.h"
>
> #ifdef OBJCPLUS
> #include "cp/cp-tree.h"
> #else
> #include "c/c-tree.h"
> #include "c/c-lang.h"
> #endif
>
> #include "c-family/c-objc.h"
> #include "langhooks.h"
>
> Its beyond the scope of the reorder tool to deal with re-positioning
> this automatically... and happens so rarely I didn't even look into it.
> So they are not optimal as far as ordering goes.
Understood. This unholy sharing had me concerned as well.
> So you can not worry about that. It builds fine.
OK. I think the major source of confusion was the lack of flattening
for the front-ends. I'll go back to it with that in mind and probably
start using the tools when I get a WTF moment.
>
>>
>> I'm slightly concerned about the darwin, windows and solaris bits. The
>> former primarily because Darwin has been a general source of pain, and
>> in the others because I'm not sure the cross testing will exercise
>> that code terribly much.
>>
> Its easy enough to NOT do this for any of those files if were too
> worried about them. Its also easy to revert a single file if it
> appears to be an issue. Thats why I wanted to run as many of these
> on the compile farm natively as I could... but alas, powerPC was the
> only thing the farm really offered me.
>
>
>> I'll go ahead and approve all the config/ bits. Please be on the
>> lookout for any fallout.
>
> even darwin, windows and solaris? :-)
Yup. The changes are straighforward enough that if there's fallout (and
to some degree I expect minor fallout from native builds) it can be
easily fixed.
Jeff
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch] header file re-ordering.
2015-10-08 13:37 ` [patch] header file re-ordering Andrew MacLeod
` (2 preceding siblings ...)
2015-10-12 8:04 ` [patch] header file re-ordering Jeff Law
@ 2015-10-22 21:07 ` Jeff Law
2015-10-22 21:21 ` Andrew MacLeod
2015-10-23 19:14 ` Jeff Law
4 siblings, 1 reply; 65+ messages in thread
From: Jeff Law @ 2015-10-22 21:07 UTC (permalink / raw)
To: Andrew MacLeod, gcc-patches
On 10/08/2015 07:37 AM, Andrew MacLeod wrote:
> On 10/07/2015 06:02 PM, Jeff Law wrote:
>
> However, the tool has been run, and I've made the minor adjustments
> required to the source files to make it work. (ie, a few multi-line
> comments and the fact that mul-tables.c is generated on the tile* targets.
>
> So this is what it should look like. I used -cp. Other languages are
> bootstrapping, and I have yet to build all the targets... that'll just
> take a day. Be nice if ada worked tho.
>
> I can run the reduction tool over the weekend (its a long weekend here
> :-) on this if you want... the other patch is a couple of weeks out of
> date anyway now.
So I'm playing with this stuff a little. I was surprised to see that
the reordering script also removes duplicates.
For some dumb reason I thought that functionality was part of the header
file reducer, but that's only concerned with removing stuff that's
unnecessary.
Anyway, just surprised me. Not sure if it's worth splitting that
functionality out or making it conditional on a flag is worth it.
It certainly helps in that I won't look at the changes and expect that
headers are just reordered :-)
jeff
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch] header file re-ordering.
2015-10-22 21:07 ` [patch] header file re-ordering Jeff Law
@ 2015-10-22 21:21 ` Andrew MacLeod
2015-10-22 22:25 ` Jeff Law
0 siblings, 1 reply; 65+ messages in thread
From: Andrew MacLeod @ 2015-10-22 21:21 UTC (permalink / raw)
To: Jeff Law, gcc-patches
On 10/22/2015 04:55 PM, Jeff Law wrote:
> On 10/08/2015 07:37 AM, Andrew MacLeod wrote:
>> On 10/07/2015 06:02 PM, Jeff Law wrote:
>>
>> However, the tool has been run, and I've made the minor adjustments
>> required to the source files to make it work. (ie, a few multi-line
>> comments and the fact that mul-tables.c is generated on the tile*
>> targets.
>>
>> So this is what it should look like. I used -cp. Other languages are
>> bootstrapping, and I have yet to build all the targets... that'll just
>> take a day. Be nice if ada worked tho.
>>
>> I can run the reduction tool over the weekend (its a long weekend here
>> :-) on this if you want... the other patch is a couple of weeks out of
>> date anyway now.
> So I'm playing with this stuff a little. I was surprised to see that
> the reordering script also removes duplicates.
>
> For some dumb reason I thought that functionality was part of the
> header file reducer, but that's only concerned with removing stuff
> that's unnecessary.
>
> Anyway, just surprised me. Not sure if it's worth splitting that
> functionality out or making it conditional on a flag is worth it.
>
> It certainly helps in that I won't look at the changes and expect that
> headers are just reordered :-)
>
> jeff
Yeah, the reordering removes anything which is a duplicate. The way the
processing works, it was very natural to do it there, and trivial. And
it seemed silly to put 2copies in a row when it reordered something.
The reducer also gets an extra order of complexity when it has to deal
with duplicate header files.. ie, no longer does a #include become a
unique thing that I can hash and build a dictionary on... , it has to
remember whether it was the first or second or nth instance, and it was
just much much simpler to make it only have to deal with removing
#Include "header.h". The original version dealt with multiples OK, but
I eventually removed it on one of the iterations as being superfluous
with the addition of the ordering tool.
It was actually only at the 11th hour I decided to keep the ordering
tool and reducer as seperate tools.. they were going to be combined, but
it seemed better to leave them separate.
Andrew
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch] header file re-ordering.
2015-10-22 21:21 ` Andrew MacLeod
@ 2015-10-22 22:25 ` Jeff Law
0 siblings, 0 replies; 65+ messages in thread
From: Jeff Law @ 2015-10-22 22:25 UTC (permalink / raw)
To: Andrew MacLeod, gcc-patches
On 10/22/2015 03:07 PM, Andrew MacLeod wrote:
> On 10/22/2015 04:55 PM, Jeff Law wrote:
>> On 10/08/2015 07:37 AM, Andrew MacLeod wrote:
>>> On 10/07/2015 06:02 PM, Jeff Law wrote:
>>>
>>> However, the tool has been run, and I've made the minor adjustments
>>> required to the source files to make it work. (ie, a few multi-line
>>> comments and the fact that mul-tables.c is generated on the tile*
>>> targets.
>>>
>>> So this is what it should look like. I used -cp. Other languages are
>>> bootstrapping, and I have yet to build all the targets... that'll just
>>> take a day. Be nice if ada worked tho.
>>>
>>> I can run the reduction tool over the weekend (its a long weekend here
>>> :-) on this if you want... the other patch is a couple of weeks out of
>>> date anyway now.
>> So I'm playing with this stuff a little. I was surprised to see that
>> the reordering script also removes duplicates.
>>
>> For some dumb reason I thought that functionality was part of the
>> header file reducer, but that's only concerned with removing stuff
>> that's unnecessary.
>>
>> Anyway, just surprised me. Not sure if it's worth splitting that
>> functionality out or making it conditional on a flag is worth it.
>>
>> It certainly helps in that I won't look at the changes and expect that
>> headers are just reordered :-)
>>
>> jeff
> Yeah, the reordering removes anything which is a duplicate. The way the
> processing works, it was very natural to do it there, and trivial. And
> it seemed silly to put 2copies in a row when it reordered something.
Yup. Understood. I'll probably approve a goodly amount of the backend
files today... Roughly half are really easy. Making my 2nd pass
through right now.
Jeff
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 1/3] Header file reduction - backend files.
2015-10-02 2:33 ` [patch 1/3] Header file reduction - backend files Andrew MacLeod
2015-10-07 22:02 ` Jeff Law
@ 2015-10-22 22:33 ` Jeff Law
2015-10-22 22:36 ` Andrew MacLeod
2015-10-23 6:22 ` Jeff Law
1 sibling, 2 replies; 65+ messages in thread
From: Jeff Law @ 2015-10-22 22:33 UTC (permalink / raw)
To: Andrew MacLeod, gcc-patches
On 10/01/2015 08:33 PM, Andrew MacLeod wrote:
> these are all in the main gcc directory. 297 files total.
>
> Everything bootstraps on x86_64-pc-linux-gnu and
> powerpc64le-unknown-linux-gnu. All targets in config-list.mk still
> build. Regressions tests also came up clean.
>
> OK for trunk?
So just to be clear, I'm looking at the backend-reorder patch. Now that
I know it's going to be reordering & removing duplicates, I know better
what to expect.
Can you please look at data-streamer-{in,out}.c, data-streamer.c,
gimplify-me.c. Maybe its changed recently, but options.h is removed as
a duplicate by your patch, but on the trunk, I only see it showing up
once (using your tools)
I realize it's gotten a little stale, but IMHO once approved, if you
need to make adjustments due to changes since you originally produced
the patch those adjustments are pre-approved.
Given all this stuff should be independent, I'm going to start by
carving out those which I'm most confident are correct. Please go ahead
and commit the reorder/remove duplicate changes for the following:
auto-inc-dec.c
bt-load.c
caller-save.c
cfganal.c
cfgbuild.c
cfghooks.c
cfgloop.c
cfgloopanal.c
cfgloopmanip.c
cfgrtl.c
combine-stack-adj.c
compare-elim.c
cse.c
ddg.c
debug.c
dfp.c
dominance.c
double-int.c
dumpfile.c
emit-rtl.c
fixed-value.c
fwprop.c
ggc-common.c
gimple-expr.c
gimple-iterator.c
gimple-pretty-print.c
gimple-walk.c
godump.c
graph.c
graphite-poly.c
hw-dooploop.c
init-regs.c
ipa-comdats.c
ipa-inline-analysis.c
ipa-inline-transform.c
ipa-polyporphic-call.c
ipa-visibility.c
jump.c
lcm.c
lists.c
loop-doloop.c
loop-init.c
loop-iv.c
lower-subreg.c
lto-cgraph.c
lto-section-in.c
lto-section-out.c
lto-streamer.c
mode-switching.c
plugin.c
postreload-gcse.c
predict.c
realmpfr.c
reg-stack.c
regcprop.c
regrename.c
reorg.c
resource.c
rtl-chkp.c
rtl-error.c
rtlanal.c
rtlhooks.c
sched-ebb.c
sched-rgn.c
stack-ptr-mod.c
statistics.c
store-motion.c
stringpool.c
tracer.c
tree-call-cdce.c
tree-cfgcleanup.c
tree-chkp.c
tree-complex.c
tree-dfa.c
tree-diagnostic.c
tree-eh.c
tree-inline.c
tree-into-ssa.c
tree-iterator.c
tree-nested.c
tree-object-size.c
tree-outof-ssa.c
tree-phinodes.c
tree-sra.c
tree-ssa-address.c
tree-ssa-coalesce.c
tree-ssa-copy.c
tree-ssa-dce.c
tree-ssa-dom.c
tree-ssa-dse.c
tree-ssa-ifcombine.c
tree-ssa-live.c
tree-ssa-loop-im.c
tree-ssa-loop-ivopts.c
tree-ssa-loop-manip.c
tree-ssa-loop-unswitch.c
tree-ssa-loop.c
tree-ssa-phiopt.c
tree-ssa-phiprop.c
tree-ssa-pre.c
tree-ssa-propagate.c
tree-ssa-sccvn.c
tree-ssa-sink.c
tree-ssa-structalias.c
tree-ssa-ter.c
tree-ssa-threadedge.c
tree-ssa-uninit.c
tree-ssanames.c
tree-switch-conversion.c
tree-tailcall.c
tree-vect-loop-manip.c
tree-vectorizer.c
tree-vrp.c
var-tracking.c
web.c
While I was going through those, I'm pretty sure there's another sizable
number which I can get through on a second pass. So the following are
approved as well:
alias.c
attribs.c
auto-profile.c
cfg.c
cfgcleanup.c
cgraph.c
cgraphbuild.c
cgraphclones.c
convert.c
cprop.c
cselib.c
dbxout.c
dce.c
df-core.c
df-problems.c
df-scan.c
dojump.c
dse.c
dwarf2asm.c
dwarf2cfi.c
dwarf2out.c
fold-const.c
function.c
incpath.c
profile.c
Out of time for now. More to follow...
If you want to commit the stuff approved so far, be my guest.
Jeff
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 1/3] Header file reduction - backend files.
2015-10-22 22:33 ` [patch 1/3] Header file reduction - backend files Jeff Law
@ 2015-10-22 22:36 ` Andrew MacLeod
2015-10-23 6:22 ` Jeff Law
1 sibling, 0 replies; 65+ messages in thread
From: Andrew MacLeod @ 2015-10-22 22:36 UTC (permalink / raw)
To: Jeff Law, gcc-patches
On 10/22/2015 06:25 PM, Jeff Law wrote:
> On 10/01/2015 08:33 PM, Andrew MacLeod wrote:
>> these are all in the main gcc directory. 297 files total.
>>
>> Everything bootstraps on x86_64-pc-linux-gnu and
>> powerpc64le-unknown-linux-gnu. All targets in config-list.mk still
>> build. Regressions tests also came up clean.
>>
>> OK for trunk?
> So just to be clear, I'm looking at the backend-reorder patch. Now
> that I know it's going to be reordering & removing duplicates, I know
> better what to expect.
>
> Can you please look at data-streamer-{in,out}.c, data-streamer.c,
> gimplify-me.c. Maybe its changed recently, but options.h is removed
> as a duplicate by your patch, but on the trunk, I only see it showing
> up once (using your tools)
>
show-headers data-streamer.c -soptions.h
on trunk shows me:
alias.h
backend.h
tm.h
options.h (1) <<-------
flag-types.h
i386-opts.h
<...>
hard-reg-set.h
options.h (2) <<-------
fold-const.h
so the first occurrence is loaded via tm.h from backend.h
>
>
> I realize it's gotten a little stale, but IMHO once approved, if you
> need to make adjustments due to changes since you originally produced
> the patch those adjustments are pre-approved.
yeah. what I've been doing is apply the patch... and anything that fails
to apply, simply run the tool again on that file... its easier than
trying to fix things up :-)
Andrew
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 1/3] Header file reduction - backend files.
2015-10-22 22:33 ` [patch 1/3] Header file reduction - backend files Jeff Law
2015-10-22 22:36 ` Andrew MacLeod
@ 2015-10-23 6:22 ` Jeff Law
2015-10-23 12:26 ` Andrew MacLeod
1 sibling, 1 reply; 65+ messages in thread
From: Jeff Law @ 2015-10-23 6:22 UTC (permalink / raw)
To: Andrew MacLeod, gcc-patches
On 10/22/2015 04:25 PM, Jeff Law wrote:
> On 10/01/2015 08:33 PM, Andrew MacLeod wrote:
>> these are all in the main gcc directory. 297 files total.
[ ... ]
>
> Out of time for now. More to follow...
So a quirk of show-headers. Let's look at cgraphunit.c on the trunk:
[law@tor gcc]$ ../contrib/show-headers cgraphunit.c -sgimplify.h
[ ... ]
gimplify.h <<-------
[ ... ]
That's all it spits out. But...
#include "gimplify.h"
#include "gimplify.h"
I'm not sure if that's intentional, but it does make it harder to use
show-headers to help understand the changes being made by your scripts.
As for the rest of the backend files. They're OK for the trunk.
jeff
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 1/3] Header file reduction - backend files.
2015-10-23 6:22 ` Jeff Law
@ 2015-10-23 12:26 ` Andrew MacLeod
2015-10-23 15:15 ` Jeff Law
0 siblings, 1 reply; 65+ messages in thread
From: Andrew MacLeod @ 2015-10-23 12:26 UTC (permalink / raw)
To: Jeff Law, gcc-patches
On 10/23/2015 01:43 AM, Jeff Law wrote:
> On 10/22/2015 04:25 PM, Jeff Law wrote:
>> On 10/01/2015 08:33 PM, Andrew MacLeod wrote:
>>> these are all in the main gcc directory. 297 files total.
> [ ... ]
>>
>> Out of time for now. More to follow...
> So a quirk of show-headers. Let's look at cgraphunit.c on the trunk:
>
> [law@tor gcc]$ ../contrib/show-headers cgraphunit.c -sgimplify.h
> [ ... ]
> gimplify.h <<-------
> [ ... ]
>
> That's all it spits out. But...
> #include "gimplify.h"
> #include "gimplify.h"
>
> I'm not sure if that's intentional, but it does make it harder to use
> show-headers to help understand the changes being made by your scripts.
>
>
> As for the rest of the backend files. They're OK for the trunk.
>
Hmm. yeah, It was mostly intended to show you what structure you have
and are getting indirectly, not so much indicate that you actually
include things directly twice. It utilizes a common function which
returns unique header files (find_unique_include_list) and just
processes those. A quick tweak should resolve that to just linearly
parse and process.
It also occurs to me the reason you only found one options.h in the
previous note... show-headers probably didn't know where to look for
your build directory to delve into tm.h. It defaults to ../../build
which is what I use. I will also tweak that to attempt to automatically
find a build directory below ../.. and failing that, have you specify one.
Andrew
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 1/3] Header file reduction - backend files.
2015-10-23 12:26 ` Andrew MacLeod
@ 2015-10-23 15:15 ` Jeff Law
2015-10-23 16:30 ` Andrew MacLeod
0 siblings, 1 reply; 65+ messages in thread
From: Jeff Law @ 2015-10-23 15:15 UTC (permalink / raw)
To: Andrew MacLeod, gcc-patches
On 10/23/2015 06:24 AM, Andrew MacLeod wrote:
>
> Hmm. yeah, It was mostly intended to show you what structure you have
> and are getting indirectly, not so much indicate that you actually
> include things directly twice. It utilizes a common function which
> returns unique header files (find_unique_include_list) and just
> processes those. A quick tweak should resolve that to just linearly
> parse and process.
It may not be worth the trouble.
There were maybe a half-dozen files with repeated direct includes and
they stood out as not being as mechanical as the others (with respect to
reviewing the changes). I suspect that in the future it won't happen
much, if at all, and we won't be looking at 15k line patches to clean
things up.
>
> It also occurs to me the reason you only found one options.h in the
> previous note... show-headers probably didn't know where to look for
> your build directory to delve into tm.h. It defaults to ../../build
> which is what I use. I will also tweak that to attempt to automatically
> find a build directory below ../.. and failing that, have you specify one.
Yea, I almost asked about procedures for the tool to find tm.h as it was
relatively common for tm.h to include something indirectly and make
later stuff redundant. But after a while, I got pretty good at knowing
what was in tm.h and how that would effect what headers became redundant
as tm.h got included earlier (via target.h). Similarly for optabs.h
moving around and making other stuff redundant.
Jeff
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 1/3] Header file reduction - backend files.
2015-10-23 15:15 ` Jeff Law
@ 2015-10-23 16:30 ` Andrew MacLeod
0 siblings, 0 replies; 65+ messages in thread
From: Andrew MacLeod @ 2015-10-23 16:30 UTC (permalink / raw)
To: Jeff Law, gcc-patches
[-- Attachment #1: Type: text/plain, Size: 1964 bytes --]
On 10/23/2015 11:11 AM, Jeff Law wrote:
> On 10/23/2015 06:24 AM, Andrew MacLeod wrote:
>>
>> Hmm. yeah, It was mostly intended to show you what structure you have
>> and are getting indirectly, not so much indicate that you actually
>> include things directly twice. It utilizes a common function which
>> returns unique header files (find_unique_include_list) and just
>> processes those. A quick tweak should resolve that to just linearly
>> parse and process.
> It may not be worth the trouble.
>
> There were maybe a half-dozen files with repeated direct includes and
> they stood out as not being as mechanical as the others (with respect
> to reviewing the changes). I suspect that in the future it won't
> happen much, if at all, and we won't be looking at 15k line patches to
> clean things up.
>
>>
>> It also occurs to me the reason you only found one options.h in the
>> previous note... show-headers probably didn't know where to look for
>> your build directory to delve into tm.h. It defaults to ../../build
>> which is what I use. I will also tweak that to attempt to automatically
>> find a build directory below ../.. and failing that, have you
>> specify one.
> Yea, I almost asked about procedures for the tool to find tm.h as it
> was relatively common for tm.h to include something indirectly and
> make later stuff redundant. But after a while, I got pretty good at
> knowing what was in tm.h and how that would effect what headers became
> redundant as tm.h got included earlier (via target.h). Similarly for
> optabs.h moving around and making other stuff redundant.
>
Apply this diff to the contents of contrib/headers
Duplicates are now treated as they are seen instead of ignored.
Rather than hardcoding ../../build/gcc, the tool now makes an attempt to
find a "gcc/tm.h" from ../.. It will print a message if it fails to
find one. In which case you can use -i to specify a directory to look in
Andrew
[-- Attachment #2: H --]
[-- Type: text/plain, Size: 6412 bytes --]
*** headerutils.py 2015-10-23 11:55:46.652867427 -0400
--- /home/amacleod/headers/headerutils.py 2015-10-23 11:57:13.247023166 -0400
*************** def ii_write (fname, obj):
*** 216,221 ****
--- 216,258 ----
pickle.dump (obj[5], f)
f.close ()
+ # execute a system command which returns file names
+ def execute_command (command):
+ files = list()
+ f = os.popen (command)
+ for x in f:
+ if x[0:2] == "./":
+ fn = x.rstrip()[2:]
+ else:
+ fn = x.rstrip()
+ files.append(fn)
+ return files
+
+ # Try to locate a build directory from PATH
+ def find_gcc_bld_dir (path):
+ blddir = ""
+ # Look for blddir/gcc/tm.h
+ command = "find " + path + " -mindepth 2 -maxdepth 3 -name tm.h"
+ files = execute_command (command)
+ for y in files:
+ p = os.path.dirname (y)
+ if os.path.basename (p) == "gcc":
+ blddir = p
+ break
+ # If not found, try looking a bit deeper
+ # Dont look this deep initially because a lot of cross target builds may show
+ # up in the list before a native build... but those are better than nothing.
+ if not blddir:
+ command = "find " + path + " -mindepth 3 -maxdepth 5 -name tm.h"
+ files = execute_command (command)
+ for y in files:
+ p = os.path.dirname (y)
+ if os.path.basename (p) == "gcc":
+ blddir = p
+ break
+
+ return blddir
+
# Find files matching pattern NAME, return in a list.
# CURRENT is True if you want to include the current directory
*************** def find_gcc_files (name, current, deepe
*** 235,247 ****
command = "find -maxdepth 4 -mindepth 2 -name " + name + " -not -path \"./testsuite/*\""
if command != "":
! f = os.popen (command)
! for x in f:
! if x[0] == ".":
! fn = x.rstrip()[2:]
! else:
! fn = x
! files.append(fn)
return files
--- 272,278 ----
command = "find -maxdepth 4 -mindepth 2 -name " + name + " -not -path \"./testsuite/*\""
if command != "":
! files = execute_command (command)
return files
*** show-headers 2015-10-23 11:55:46.655867328 -0400
--- /home/amacleod/headers/show-headers 2015-10-23 11:53:43.668906942 -0400
*************** sawcore = False
*** 17,26 ****
# list of headers to emphasize
highlight = list ()
# search path for headers
! incl_dirs = [".", "../include", "../../build/gcc", "../libcpp/include" ]
# extra search paths to look in *after* the directory the source file is in.
- extra_dirs = [ "common", "c-family", "c", "cp", "config" ]
# append (1) to the end of the first line which includes INC in list INC.
def append_1 (output, inc):
--- 17,26 ----
# list of headers to emphasize
highlight = list ()
+ bld_dir = ""
# search path for headers
! incl_dirs = ["../include", "../libcpp/include", "common", "c-family", "c", "cp", "config" ]
# extra search paths to look in *after* the directory the source file is in.
# append (1) to the end of the first line which includes INC in list INC.
def append_1 (output, inc):
*************** def process_include (inc, indent):
*** 75,89 ****
! blddir = [ "." ]
usage = False
src = list()
for x in sys.argv[1:]:
if x[0:2] == "-i":
bld = x[2:]
! print "Build dir : " + bld
! blddir.append (bld)
elif x[0:2] == "-s":
highlight.append (os.path.basename (x[2:]))
elif x[0:2] == "-h":
--- 75,88 ----
! extradir = list()
usage = False
src = list()
for x in sys.argv[1:]:
if x[0:2] == "-i":
bld = x[2:]
! extradir.append (bld)
elif x[0:2] == "-s":
highlight.append (os.path.basename (x[2:]))
elif x[0:2] == "-h":
*************** if usage:
*** 104,129 ****
print " is included in a source file. Should be run from the source directory"
print " files from find-include-depends"
print " -s : search for a header, and point it out."
! print " -i : Specifies 1 or more directories to search for includes."
sys.exit(0)
- if len(blddir) > 1:
- incl_dirs = blddir
x = src[0]
- # if source is in a subdirectory, add the subdirectory to the search list
srcpath = os.path.dirname(x)
if srcpath:
! incl_dirs.append (srcpath)
! for yy in extra_dirs:
! incl_dirs.append (yy)
output = list()
sawcore = False
! incl = find_unique_include_list (x)
! for inc in incl:
! process_include (inc, 1)
print "\n" + x
for line in output:
print line
--- 103,142 ----
print " is included in a source file. Should be run from the source directory"
print " files from find-include-depends"
print " -s : search for a header, and point it out."
! print " -i : Specifies additonal directories to search for includes."
sys.exit(0)
+ if extradir:
+ incl_dirs = extradir + incl_dirs;
+
+ blddir = find_gcc_bld_dir ("../..")
+
+ if blddir:
+ print "Using build directory: " + blddir
+ incl_dirs.insert (0, blddir)
+ else:
+ print "Could not find a build directory, better results if you specify one with -i"
+
+ # search path is now ".", blddir, extradirs_from_-i, built_in_incl_dirs
+ incl_dirs.insert (0, ".")
+
+ # if source is in a subdirectory, prepend the subdirectory to the search list
x = src[0]
srcpath = os.path.dirname(x)
if srcpath:
! incl_dirs.insert (0, srcpath)
output = list()
sawcore = False
!
! data = open (x).read().splitlines()
! for line in data:
! d = find_pound_include (line, True, True)
! if d and d[-2:] == ".h":
! process_include (d, 1)
!
print "\n" + x
for line in output:
print line
*** README 2015-10-23 11:55:46.655867328 -0400
--- /home/amacleod/headers/README 2015-10-23 11:58:50.100841918 -0400
*************** show-headers
*** 62,68 ****
is indented, and when any duplicate headers are seen, they have their
duplicate number shown
! -i may be used to specify alternate search directories for headers to parse.
-s specifies headers to look for and emphasize in the output.
This tool must be run in the core gcc source directory.
--- 62,68 ----
is indented, and when any duplicate headers are seen, they have their
duplicate number shown
! -i may be used to specify additional search directories for headers to parse.
-s specifies headers to look for and emphasize in the output.
This tool must be run in the core gcc source directory.
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: config header file reduction patch checked in.
2015-10-19 15:55 ` Andrew MacLeod
@ 2015-10-23 17:02 ` Bernd Schmidt
2015-10-23 17:22 ` Mike Stump
0 siblings, 1 reply; 65+ messages in thread
From: Bernd Schmidt @ 2015-10-23 17:02 UTC (permalink / raw)
To: Andrew MacLeod, Iain Sandoe
Cc: Jeff Law, gcc-patches List, Mike Stump, Dominique Dhumieres
On 10/19/2015 05:53 PM, Andrew MacLeod wrote:
> interesting that none of the cross builds need diagnostics-core.h. I see
> it used in 7 different targets. Must be something on the native build
> command line that is defined which causes it to be needed.
I'm guessing it's the CROSS_DIRECTORY_STRUCTURE macro which is used by
darwin targets. It's also used for several other targets, so you may
want to double check those.
Bernd
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: config header file reduction patch checked in.
2015-10-23 17:02 ` Bernd Schmidt
@ 2015-10-23 17:22 ` Mike Stump
2015-10-23 17:26 ` Bernd Schmidt
0 siblings, 1 reply; 65+ messages in thread
From: Mike Stump @ 2015-10-23 17:22 UTC (permalink / raw)
To: Bernd Schmidt
Cc: Andrew MacLeod, Iain Sandoe, Jeff Law, gcc-patches List,
Dominique Dhumieres
On Oct 23, 2015, at 9:57 AM, Bernd Schmidt <bschmidt@redhat.com> wrote:
>
> I'm guessing it's the CROSS_DIRECTORY_STRUCTURE macro which is used by darwin targets. It's also used for several other targets, so you may want to double check those.
No, only darwin is special, as presently only darwin has the requisite support in the object file format to do what needs doing.
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: config header file reduction patch checked in.
2015-10-23 17:22 ` Mike Stump
@ 2015-10-23 17:26 ` Bernd Schmidt
2015-10-23 17:36 ` Andrew MacLeod
2015-10-23 17:39 ` Mike Stump
0 siblings, 2 replies; 65+ messages in thread
From: Bernd Schmidt @ 2015-10-23 17:26 UTC (permalink / raw)
To: Mike Stump
Cc: Andrew MacLeod, Iain Sandoe, Jeff Law, gcc-patches List,
Dominique Dhumieres
On 10/23/2015 07:15 PM, Mike Stump wrote:
> On Oct 23, 2015, at 9:57 AM, Bernd Schmidt <bschmidt@redhat.com>
> wrote:
>>
>> I'm guessing it's the CROSS_DIRECTORY_STRUCTURE macro which is used
>> by darwin targets. It's also used for several other targets, so you
>> may want to double check those.
>
> No, only darwin is special, as presently only darwin has the
> requisite support in the object file format to do what needs doing.
Not sure what you mean by "what needs doing", but grep shows a number of
uses of CROSS_DIRECTORY_STRUCTURE. Anything that uses it would
presumably cause a difference between a cross and host build which could
lead to an issue like the one Iain found.
Bernd
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: config header file reduction patch checked in.
2015-10-23 17:26 ` Bernd Schmidt
@ 2015-10-23 17:36 ` Andrew MacLeod
2015-10-23 17:49 ` Mike Stump
2015-10-23 17:39 ` Mike Stump
1 sibling, 1 reply; 65+ messages in thread
From: Andrew MacLeod @ 2015-10-23 17:36 UTC (permalink / raw)
To: Bernd Schmidt, Mike Stump
Cc: Iain Sandoe, Jeff Law, gcc-patches List, Dominique Dhumieres
On 10/23/2015 01:24 PM, Bernd Schmidt wrote:
> On 10/23/2015 07:15 PM, Mike Stump wrote:
>> On Oct 23, 2015, at 9:57 AM, Bernd Schmidt <bschmidt@redhat.com>
>> wrote:
>>>
>>> I'm guessing it's the CROSS_DIRECTORY_STRUCTURE macro which is used
>>> by darwin targets. It's also used for several other targets, so you
>>> may want to double check those.
>>
>> No, only darwin is special, as presently only darwin has the
>> requisite support in the object file format to do what needs doing.
>
> Not sure what you mean by "what needs doing", but grep shows a number
> of uses of CROSS_DIRECTORY_STRUCTURE. Anything that uses it would
> presumably cause a difference between a cross and host build which
> could lead to an issue like the one Iain found.
>
well, in the config directories, darwin-driver.c is the only source file
which uses the definition... the other places are all .h files.
A quick glance at those and virtually all the uses of the macro are to
change the definition of a macro... which is harmless as far as this
exercise goes.
darwin-driver.c had some code that depended on one of the include
files, bit no other part of the file needed it, so that was the issue
there. The only other place it seems could be an issue is with
collect2.c... so I'll monitor that one closely before checking anything
in.. and get the darwin guys to test it for me before committing.
Andrew
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: config header file reduction patch checked in.
2015-10-23 17:26 ` Bernd Schmidt
2015-10-23 17:36 ` Andrew MacLeod
@ 2015-10-23 17:39 ` Mike Stump
1 sibling, 0 replies; 65+ messages in thread
From: Mike Stump @ 2015-10-23 17:39 UTC (permalink / raw)
To: Bernd Schmidt
Cc: Andrew MacLeod, Iain Sandoe, Jeff Law, gcc-patches List,
Dominique Dhumieres
On Oct 23, 2015, at 10:24 AM, Bernd Schmidt <bschmidt@redhat.com> wrote:
> On 10/23/2015 07:15 PM, Mike Stump wrote:
>> On Oct 23, 2015, at 9:57 AM, Bernd Schmidt <bschmidt@redhat.com>
>> wrote:
>>>
>>> I'm guessing it's the CROSS_DIRECTORY_STRUCTURE macro which is used
>>> by darwin targets. It's also used for several other targets, so you
>>> may want to double check those.
>>
>> No, only darwin is special, as presently only darwin has the
>> requisite support in the object file format to do what needs doing.
>
> Not sure what you mean by "what needs doing", but grep shows a number of uses of CROSS_DIRECTORY_STRUCTURE. Anything that uses it would presumably cause a difference between a cross and host build which could lead to an issue like the one Iain found.
What needs doing, means the ability to pack two different architectures into one file. If you look at all the uses, you discover two things. All non-darwin ports use it in trivial ways. Only darwin uses it (doesn’t use it), in non-trivial ways that may impact headers. For the non-darwin targets, the use of it is universal, not related to any target. So, testing any cross (the condition under which things are different) and any non-cross will test most things. It is this notion of several other targets in your email that just doesn’t apply. Any target which is a cross, is the wording that would apply.
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: config header file reduction patch checked in.
2015-10-23 17:36 ` Andrew MacLeod
@ 2015-10-23 17:49 ` Mike Stump
0 siblings, 0 replies; 65+ messages in thread
From: Mike Stump @ 2015-10-23 17:49 UTC (permalink / raw)
To: Andrew MacLeod
Cc: Bernd Schmidt, Iain Sandoe, Jeff Law, gcc-patches List,
Dominique Dhumieres
On Oct 23, 2015, at 10:36 AM, Andrew MacLeod <amacleod@redhat.com> wrote:
>
> darwin-driver.c had some code that depended on one of the include files, bit no other part of the file needed it, so that was the issue there. The only other place it seems could be an issue is with collect2.c... so I'll monitor that one closely before checking anything in.. and get the darwin guys to test it for me before committing.
I’m fine with checking it in on the assumption darwin won’t break here. If it does, I think it will be trivial to clean it up. Just watch out for any darwin doesn’t build due to missing decls or some such and let them know what header needs to be included in darwin.h based upon the missing decls. I faced one of these recently with my port as the headers flexed. Annoying, but not too hard to recover from.
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch] header file re-ordering.
2015-10-08 13:37 ` [patch] header file re-ordering Andrew MacLeod
` (3 preceding siblings ...)
2015-10-22 21:07 ` [patch] header file re-ordering Jeff Law
@ 2015-10-23 19:14 ` Jeff Law
2015-10-23 19:28 ` Andrew MacLeod
4 siblings, 1 reply; 65+ messages in thread
From: Jeff Law @ 2015-10-23 19:14 UTC (permalink / raw)
To: Andrew MacLeod, gcc-patches
On 10/08/2015 07:37 AM, Andrew MacLeod wrote:
> On 10/07/2015 06:02 PM, Jeff Law wrote:
>> On 10/01/2015 08:33 PM, Andrew MacLeod wrote:
>>> these are all in the main gcc directory. 297 files total.
>>>
>>> Everything bootstraps on x86_64-pc-linux-gnu and
>>> powerpc64le-unknown-linux-gnu. All targets in config-list.mk still
>>> build. Regressions tests also came up clean.
>>>
>>> OK for trunk?
>> So as I look at this and make various spot checks, what really stands
>> out is how often something like alias.h gets included, often in places
>> that have absolutely no business/need to be looking at that file.
>> Cut-n-paste at its worst. It happens to many others, but alias.h
>> seems to have gotten its grubby self into just about everywhere for
>> reasons unkonwn.
>>
>> I find myself also wondering if a two step approach would make this
>> easier. Step #1 being ordering the headers, step #2 being removal of
>> the duplicates. As you note, the downside is two checkins that would
>> affect most files in the tree. I guess I'll keep slogging through the
>> patch as is...
>>
>> jeff
> Heres the patch for reordered headers. Building as we speak. Hard to
> fully verify since Ada doesn't seem to bootstrap on trunk at the moment:
>
> +===========================GNAT BUG
> DETECTED==============================+
> | 6.0.0 20151008 (experimental) (x86_64-pc-linux-gnu) GCC
> error: |
> | in gen_lowpart_common, at
> emit-rtl.c:1399 |
> | Error detected around
> s-regpat.adb:1029:22 |
>
> <...>
> raised TYPES.UNRECOVERABLE_ERROR : comperr.adb:423
> ../gcc-interface/Makefile:311: recipe for target 's-regpat.o' failed
>
>
> However, the tool has been run, and I've made the minor adjustments
> required to the source files to make it work. (ie, a few multi-line
> comments and the fact that mul-tables.c is generated on the tile* targets.
>
> So this is what it should look like. I used -cp. Other languages are
> bootstrapping, and I have yet to build all the targets... that'll just
> take a day. Be nice if ada worked tho.
>
> I can run the reduction tool over the weekend (its a long weekend here
> :-) on this if you want... the other patch is a couple of weeks out of
> date anyway now.
So I'll approve the reordering and duplicate removal for the front-ends
as well.
Now it's just removal of unnecessary crud and the scripts, right?
jeff
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch] header file re-ordering.
2015-10-23 19:14 ` Jeff Law
@ 2015-10-23 19:28 ` Andrew MacLeod
0 siblings, 0 replies; 65+ messages in thread
From: Andrew MacLeod @ 2015-10-23 19:28 UTC (permalink / raw)
To: Jeff Law, gcc-patches
On 10/23/2015 03:09 PM, Jeff Law wrote:
> On 10/08/2015 07:37 AM, Andrew MacLeod wrote:
>> On 10/07/2015 06:02 PM, Jeff Law wrote:
>>> On 10/01/2015 08:33 PM, Andrew MacLeod wrote:
>>>> these are all in the main gcc directory. 297 files total.
>>>>
>>>> Everything bootstraps on x86_64-pc-linux-gnu and
>>>> powerpc64le-unknown-linux-gnu. All targets in config-list.mk still
>>>> build. Regressions tests also came up clean.
>>>>
>>>> OK for trunk?
>>> So as I look at this and make various spot checks, what really stands
>>> out is how often something like alias.h gets included, often in places
>>> that have absolutely no business/need to be looking at that file.
>>> Cut-n-paste at its worst. It happens to many others, but alias.h
>>> seems to have gotten its grubby self into just about everywhere for
>>> reasons unkonwn.
>>>
>>> I find myself also wondering if a two step approach would make this
>>> easier. Step #1 being ordering the headers, step #2 being removal of
>>> the duplicates. As you note, the downside is two checkins that would
>>> affect most files in the tree. I guess I'll keep slogging through the
>>> patch as is...
>>>
>>> jeff
>> Heres the patch for reordered headers. Building as we speak. Hard to
>> fully verify since Ada doesn't seem to bootstrap on trunk at the moment:
>>
>> +===========================GNAT BUG
>> DETECTED==============================+
>> | 6.0.0 20151008 (experimental) (x86_64-pc-linux-gnu) GCC
>> error: |
>> | in gen_lowpart_common, at
>> emit-rtl.c:1399 |
>> | Error detected around
>> s-regpat.adb:1029:22 |
>>
>> <...>
>> raised TYPES.UNRECOVERABLE_ERROR : comperr.adb:423
>> ../gcc-interface/Makefile:311: recipe for target 's-regpat.o' failed
>>
>>
>> However, the tool has been run, and I've made the minor adjustments
>> required to the source files to make it work. (ie, a few multi-line
>> comments and the fact that mul-tables.c is generated on the tile*
>> targets.
>>
>> So this is what it should look like. I used -cp. Other languages are
>> bootstrapping, and I have yet to build all the targets... that'll just
>> take a day. Be nice if ada worked tho.
>>
>> I can run the reduction tool over the weekend (its a long weekend here
>> :-) on this if you want... the other patch is a couple of weeks out of
>> date anyway now.
> So I'll approve the reordering and duplicate removal for the
> front-ends as well.
>
> Now it's just removal of unnecessary crud and the scripts, right?
>
> jeff
Correct. I gave you a new patch for the backend removal stuff today.
the config stuff is all done. and I'll give you another patch the
beginning of next week for just the removals of headers in the front
ends.. I ran in into some build hiccups which delayed getting it ready
today as well.
Andrew
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 4/3] Header file reduction - Tools for contrib - second cut
2015-10-14 15:14 ` [patch 4/3] Header file reduction - Tools for contrib - second cut Andrew MacLeod
@ 2015-11-03 6:06 ` Jeff Law
2015-11-03 13:24 ` Andrew MacLeod
0 siblings, 1 reply; 65+ messages in thread
From: Jeff Law @ 2015-11-03 6:06 UTC (permalink / raw)
To: Andrew MacLeod, Bernd Schmidt, gcc-patches
On 10/14/2015 09:14 AM, Andrew MacLeod wrote:
> Here's the latest version of the tools for a sub directory in contrib.
> I've handled all the feedback, except I have not fully commented the
> python code in the tools, nor followed any particular coding
> convention... Documentation has been handled, and I've added some
> additional comments to the places which were noted as being unclear. Ive
> also removed all tabs from the source files.
>
> Ive also updated show-headers slightly to be a little more
> error-resistant and to put some emphasis on any header files specified
> on the command as being of interest . (when there are 140 shown, it can
> be hard to find the one you are looking for sometimes)
>
> Do we wish to impose anything in particular on the source for tools
> going into this sub-directory of contrib? The other tools in contrib
> don't seem to have much in the way of coding standards. I also
> wonder if anyone other than me will look at them much :-)
I'm certainly interested in them.
Do you have any sense of whether or not coverage of the tools has
improved over short time since we started squashing out conditional
compilation? I was running the header file reordering bits on the trunk
and was a bit surprised of how many things they're still changing. But
that would make sense if some files are now being processed that weren't
before because we've squashed out the conditional compilation.
It certainly is true that the total result is smaller than any of the
backend, config/ or languages changes that you posted, and I'm running
it across the entire source tree, but I'm still surprised at how much
churn I'm seeing.
If it weren't for the level of churn, I'd probably be suggesting we just
have this stuff run regularly (weekly, monthly, whatever) and commit the
result after a sanity looksie. I've yet to see this tool botch anything
and if we're not unnecessarily churning the sources, keeping us clean
WRT canononical ordering and duplicate removal automatically seems like
a good place to be.
Maybe do another commit of the reordering output and evaluate again in a
month?
I don't think we're quite there on the reducer and it obviously requires
more infrastructure in place to test. But it'd be nice to get to a
similar state on that tool.
Which reminds me, you ought to add a VMS target to your tests. The
reducer botched vmsdbgout.c.
alpha64-dec-vms
alpha-dec-vms
ia64-hp-vms
Covering any one of those ought to do the trick.
Thoughts?
jeff
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 4/3] Header file reduction - Tools for contrib - second cut
2015-11-03 6:06 ` Jeff Law
@ 2015-11-03 13:24 ` Andrew MacLeod
2015-11-03 14:00 ` Jeff Law
0 siblings, 1 reply; 65+ messages in thread
From: Andrew MacLeod @ 2015-11-03 13:24 UTC (permalink / raw)
To: Jeff Law, Bernd Schmidt, gcc-patches
On 11/03/2015 01:06 AM, Jeff Law wrote:
> On 10/14/2015 09:14 AM, Andrew MacLeod wrote:
>> Here's the latest version of the tools for a sub directory in contrib.
>> I've handled all the feedback, except I have not fully commented the
>> python code in the tools, nor followed any particular coding
>> convention... Documentation has been handled, and I've added some
>> additional comments to the places which were noted as being unclear. Ive
>> also removed all tabs from the source files.
>>
>> Ive also updated show-headers slightly to be a little more
>> error-resistant and to put some emphasis on any header files specified
>> on the command as being of interest . (when there are 140 shown, it can
>> be hard to find the one you are looking for sometimes)
>>
>> Do we wish to impose anything in particular on the source for tools
>> going into this sub-directory of contrib? The other tools in contrib
>> don't seem to have much in the way of coding standards. I also
>> wonder if anyone other than me will look at them much :-)
> I'm certainly interested in them.
>
> Do you have any sense of whether or not coverage of the tools has
> improved over short time since we started squashing out conditional
> compilation? I was running the header file reordering bits on the
> trunk and was a bit surprised of how many things they're still
> changing. But that would make sense if some files are now being
> processed that weren't before because we've squashed out the
> conditional compilation.
hmm. no, i dont have a feel for that. Anl to be fair, I didn't run the
tools on every file in trunk. I limited it to the ones in backend.h,
and took out even a few of those that were troublesome in some way or
other at some point. I wouldnt expect the conditional stuff to affect
reordering much. reducing... we might start to see things like tm.h or
target.h included less.
A further enhancement in line with that would be to teach the reducer
about a couple of special files.. like the relationship between
options.h, tm.h and target.h. sometimes target.h was included when in
fact options.h was the only thing actually needed.. During the
flayttening process I manually handled this by flattening tm.h out of
target.h and options.h into anything that included tm.h... so every
file had options.h, tm.h and target.h explicitly included, and then the
reducer would just pick the "minimum". of course, the reorder tool
works against this by combining them again :-)
however, the tool could be taught when it see target.h for instance, if
it can't be removed, it could try replacing it with options.h and if
that fails, tm.h.. That sort of thing could automatically remove
headers that arent needed because target macros have been turned into
hooks or something. I suppose that could even be generalized to trying
to replace each header that included other headers... I wonder how
safe that would be. hum.
>
> It certainly is true that the total result is smaller than any of the
> backend, config/ or languages changes that you posted, and I'm running
> it across the entire source tree, but I'm still surprised at how much
> churn I'm seeing.
>
> If it weren't for the level of churn, I'd probably be suggesting we
> just have this stuff run regularly (weekly, monthly, whatever) and
> commit the result after a sanity looksie. I've yet to see this tool
> botch anything and if we're not unnecessarily churning the sources,
> keeping us clean WRT canononical ordering and duplicate removal
> automatically seems like a good place to be.
>
it can botch one of the go files.. go has a backend.h of it's own...
which buggers things up quite nicely since it doesnt include a bunch of
the headers gcc's backend.h does :-)
The reordering tool is likely safer to run across the board.. especially
if we can determine the very small subset it shouldn't be run on.
Right now it triggers off the presence of system.h... if system.h is not
present, it wont do anything to the file. I haven't tried running it
against *.c to see if there are any other failures, perhaps thats not a
bad idea. That will also provide us with a list of files which have
headers included within conditional compliation... there are a few of
those :-P and maybe they could be fixed. by default it wou=nt do
anything to those either.
Anyway, if we run it against everything and check it in, then in theory
there isn't any reason you couldnt spot run it at some interval.. there
shouldn't be much churn then.
> Maybe do another commit of the reordering output and evaluate again in
> a month?
>
> I don't think we're quite there on the reducer and it obviously
> requires more infrastructure in place to test. But it'd be nice to
> get to a similar state on that tool.
>
yeah, the reducer still needs some tweaks to be generally runnable I
think. IN particular, how to deal with externally supplied macros it
cant really see. Im still thinking about that one.
> Which reminds me, you ought to add a VMS target to your tests. The
> reducer botched vmsdbgout.c.
Thats one of the reasons vmsdbgout.c wasn't in the list of things I
reduced :-)
#include "config.h"
#include "system.h"
#include "coretypes.h"
#ifdef VMS_DEBUGGING_INFO
#include "alias.h"
#include "tree.h"
#include "varasm.h"
#include "version.h"
#include "flags.h"
#include "rtl.h"
#include "output.h"
#include "vmsdbg.h"
#include "debug.h"
#include "langhooks.h"
#include "function.h"
#include "target.h"
You have to override it with -i in order to reorder them too.. There are
headers included within conditional compilation, and this is one of
those that is simply not safe... And yes, I guess adding a vms target
would cover that for the reducer at least...
I almost suggested moving all the includes out of the conditional...
which is not unreasonable... and would resolve the issue as well. It
would be good to remove all the conditionals in all the source files
like that.. some are easier than others tho.
back to reordering... the gen files are a bit of a pain too because of
the rtl.h conditional inclusions.. which I never really found a good
solution for... maybe we should have a brtl.h which is used in concert
with any source which uses bconfig.h.. brtl.h could verifies bconfig.h
has been included and then includes those headers it needs, followed by
rtl.h itself.. and the tool could confirm the right pairing of
config.h/rtl.h bconfig.h/brtl.h is used. hmm.
Andrew
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 4/3] Header file reduction - Tools for contrib - second cut
2015-11-03 13:24 ` Andrew MacLeod
@ 2015-11-03 14:00 ` Jeff Law
2015-11-03 14:19 ` Andrew MacLeod
0 siblings, 1 reply; 65+ messages in thread
From: Jeff Law @ 2015-11-03 14:00 UTC (permalink / raw)
To: Andrew MacLeod, Bernd Schmidt, gcc-patches
On 11/03/2015 06:24 AM, Andrew MacLeod wrote:
>>
>> Do you have any sense of whether or not coverage of the tools has
>> improved over short time since we started squashing out conditional
>> compilation? I was running the header file reordering bits on the
>> trunk and was a bit surprised of how many things they're still
>> changing. But that would make sense if some files are now being
>> processed that weren't before because we've squashed out the
>> conditional compilation.
>
> hmm. no, i dont have a feel for that. Anl to be fair, I didn't run the
> tools on every file in trunk. I limited it to the ones in backend.h,
> and took out even a few of those that were troublesome in some way or
> other at some point. I wouldnt expect the conditional stuff to affect
> reordering much. reducing... we might start to see things like tm.h or
> target.h included less.
Well, the reorder tool will punt if it sees conditional compilation in
the headers, so I was kind hoping that some of the churn would be
explainable by the ongoing removal of conditional compilation causing
files to be processed now that weren't before. But it appears its
other factors.
>
> A further enhancement in line with that would be to teach the reducer
> about a couple of special files.. like the relationship between
> options.h, tm.h and target.h. sometimes target.h was included when in
> fact options.h was the only thing actually needed.. During the
> flayttening process I manually handled this by flattening tm.h out of
> target.h and options.h into anything that included tm.h... so every
> file had options.h, tm.h and target.h explicitly included, and then the
> reducer would just pick the "minimum". of course, the reorder tool
> works against this by combining them again :-)
A fair amount of the churn was options.h related. I'll run it again and
look closer to see how much exactly.
>>
>> If it weren't for the level of churn, I'd probably be suggesting we
>> just have this stuff run regularly (weekly, monthly, whatever) and
>> commit the result after a sanity looksie. I've yet to see this tool
>> botch anything and if we're not unnecessarily churning the sources,
>> keeping us clean WRT canononical ordering and duplicate removal
>> automatically seems like a good place to be.
>>
> it can botch one of the go files.. go has a backend.h of it's own...
> which buggers things up quite nicely since it doesnt include a bunch of
> the headers gcc's backend.h does :-)
Cute.
>
> The reordering tool is likely safer to run across the board.. especially
> if we can determine the very small subset it shouldn't be run on.
go, the gen* files perhaps a few others. Blacklisting and running
regularly is probably the way to go then.
>
> Right now it triggers off the presence of system.h... if system.h is not
> present, it wont do anything to the file. I haven't tried running it
> against *.c to see if there are any other failures, perhaps thats not a
> bad idea. That will also provide us with a list of files which have
> headers included within conditional compliation... there are a few of
> those :-P and maybe they could be fixed. by default it wou=nt do
> anything to those either.
I didn't know it keyed on system.h. I'd manually blacklisted testsuite/
but otherwise let it run wild just for giggles. Knowing it keys on
system.h is helpful in that we don't have to blacklist nearly as much stuff.
And yes, there's a few files with conditional headers. It wasn't
terrible and makes a nice todo list for someone new to tackle.
>
> Anyway, if we run it against everything and check it in, then in theory
> there isn't any reason you couldnt spot run it at some interval.. there
> shouldn't be much churn then.
That's the idea and obviously the more automated the better.
> yeah, the reducer still needs some tweaks to be generally runnable I
> think. IN particular, how to deal with externally supplied macros it
> cant really see. Im still thinking about that one.
Well, the solution is obvious, we continue the move away from
conditionally compiled code so that those macros don't matter in the end :-)
>
>> Which reminds me, you ought to add a VMS target to your tests. The
>> reducer botched vmsdbgout.c.
>
> Thats one of the reasons vmsdbgout.c wasn't in the list of things I
> reduced :-)
Ahem, but vmsdbgout.c was part of the commit on Friday...
>
> back to reordering... the gen files are a bit of a pain too because of
> the rtl.h conditional inclusions.. which I never really found a good
> solution for... maybe we should have a brtl.h which is used in concert
> with any source which uses bconfig.h.. brtl.h could verifies bconfig.h
> has been included and then includes those headers it needs, followed by
> rtl.h itself.. and the tool could confirm the right pairing of
> config.h/rtl.h bconfig.h/brtl.h is used. hmm.
I think initially we could blacklist the gen* files. I'm less concerned
about the generators than I am the compiler proper.
jeff
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 4/3] Header file reduction - Tools for contrib - second cut
2015-11-03 14:00 ` Jeff Law
@ 2015-11-03 14:19 ` Andrew MacLeod
2015-11-10 21:16 ` Jeff Law
0 siblings, 1 reply; 65+ messages in thread
From: Andrew MacLeod @ 2015-11-03 14:19 UTC (permalink / raw)
To: Jeff Law, Bernd Schmidt, gcc-patches
On 11/03/2015 09:00 AM, Jeff Law wrote:
>
>> yeah, the reducer still needs some tweaks to be generally runnable I
>> think. IN particular, how to deal with externally supplied macros it
>> cant really see. Im still thinking about that one.
> Well, the solution is obvious, we continue the move away from
> conditionally compiled code so that those macros don't matter in the
> end :-)
>
yeah but in the meantime its an issue. I *think* I can simply provide
to tool with a set of macros to define on the build command whenever it
tries building a file.. we'll see.
It should also be possible to extract, after reduction, a list of
macros that were used in the source file in conditional compilation, but
which never saw a definition in any of the files. THat could also be
useful information. IN fact, that could be a stand alone analysis
pretty easily I think...
>
>>
>>> Which reminds me, you ought to add a VMS target to your tests. The
>>> reducer botched vmsdbgout.c.
>>
>> Thats one of the reasons vmsdbgout.c wasn't in the list of things I
>> reduced :-)
> Ahem, but vmsdbgout.c was part of the commit on Friday...
ahh opps. it snuck back in over time :-P sorry.
>
>>
>> back to reordering... the gen files are a bit of a pain too because of
>> the rtl.h conditional inclusions.. which I never really found a good
>> solution for... maybe we should have a brtl.h which is used in concert
>> with any source which uses bconfig.h.. brtl.h could verifies bconfig.h
>> has been included and then includes those headers it needs, followed by
>> rtl.h itself.. and the tool could confirm the right pairing of
>> config.h/rtl.h bconfig.h/brtl.h is used. hmm.
> I think initially we could blacklist the gen* files. I'm less
> concerned about the generators than I am the compiler proper.
>
yeah, its just annoying from a more abstract level (and results from a
few of the tools) for rtl.h to have to conditionally include a bunch of
stuff provided by coretypes.h.
> jeff
^ permalink raw reply [flat|nested] 65+ messages in thread
* Re: [patch 4/3] Header file reduction - Tools for contrib - second cut
2015-11-03 14:19 ` Andrew MacLeod
@ 2015-11-10 21:16 ` Jeff Law
0 siblings, 0 replies; 65+ messages in thread
From: Jeff Law @ 2015-11-10 21:16 UTC (permalink / raw)
To: Andrew MacLeod, Bernd Schmidt, gcc-patches
Andrew, can you go ahead and commit those changes into contrib? I think
in a subdirectory would be best so that you can include the README.
Make sure the permissions are set correctly. Applying them as a patch
kept mucking them up.
header-tools or somesuch should be a fine directory name to use.
Generally we haven't required the same level of rigor on the contrib/
bits that's required elsewhere. And I really don't want to lose these
tools and see them bitrot.
Jeff
^ permalink raw reply [flat|nested] 65+ messages in thread
end of thread, other threads:[~2015-11-10 21:16 UTC | newest]
Thread overview: 65+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2015-10-02 2:22 [patch 0/3] Header file reduction Andrew MacLeod
2015-10-02 2:33 ` [patch 3/3] Header file reduction - FE files Andrew MacLeod
2015-10-02 2:33 ` [patch 2/3] Header file reduction - config files Andrew MacLeod
2015-10-02 2:33 ` [patch 1/3] Header file reduction - backend files Andrew MacLeod
2015-10-07 22:02 ` Jeff Law
2015-10-07 23:09 ` Andrew MacLeod
2015-10-08 13:37 ` [patch] header file re-ordering Andrew MacLeod
2015-10-08 15:29 ` Jeff Law
2015-10-11 20:58 ` [BUILDROBOT] Bootstrap broken in Ada (was: [patch] header file re-ordering.) Jan-Benedict Glaw
2015-10-11 22:27 ` [BUILDROBOT] Bootstrap broken in Ada Jeff Law
2015-10-11 22:35 ` Jan Hubicka
2015-10-12 8:04 ` [patch] header file re-ordering Jeff Law
2015-10-14 14:05 ` Andrew MacLeod
2015-10-19 21:05 ` Jeff Law
2015-10-16 19:52 ` config header file reduction patch checked in Andrew MacLeod
2015-10-16 20:17 ` Andrew MacLeod
2015-10-18 9:34 ` Iain Sandoe
2015-10-19 15:55 ` Andrew MacLeod
2015-10-23 17:02 ` Bernd Schmidt
2015-10-23 17:22 ` Mike Stump
2015-10-23 17:26 ` Bernd Schmidt
2015-10-23 17:36 ` Andrew MacLeod
2015-10-23 17:49 ` Mike Stump
2015-10-23 17:39 ` Mike Stump
2015-10-22 21:07 ` [patch] header file re-ordering Jeff Law
2015-10-22 21:21 ` Andrew MacLeod
2015-10-22 22:25 ` Jeff Law
2015-10-23 19:14 ` Jeff Law
2015-10-23 19:28 ` Andrew MacLeod
2015-10-22 22:33 ` [patch 1/3] Header file reduction - backend files Jeff Law
2015-10-22 22:36 ` Andrew MacLeod
2015-10-23 6:22 ` Jeff Law
2015-10-23 12:26 ` Andrew MacLeod
2015-10-23 15:15 ` Jeff Law
2015-10-23 16:30 ` Andrew MacLeod
2015-10-05 13:55 ` [patch 0/3] Header file reduction Bernd Schmidt
2015-10-05 14:10 ` Richard Biener
2015-10-05 20:10 ` Andrew MacLeod
2015-10-05 20:37 ` Bernd Schmidt
2015-10-05 21:11 ` Andrew MacLeod
2015-10-06 3:03 ` [patch 0/3] Header file reduction. - unified patches Andrew MacLeod
2015-10-06 21:55 ` [patch 0/3] Header file reduction Jeff Law
2015-10-06 21:44 ` Jeff Law
2015-10-07 8:16 ` Richard Biener
2015-10-08 15:48 ` Michael Matz
2015-10-05 21:18 ` [patch 4/3] Header file reduction - Tools for contrib Andrew MacLeod
2015-10-06 10:27 ` Bernd Schmidt
2015-10-06 12:02 ` Bernd Schmidt
2015-10-06 14:04 ` Andrew MacLeod
2015-10-06 14:57 ` Bernd Schmidt
2015-10-06 19:19 ` Andrew MacLeod
2015-10-06 20:37 ` Bernd Schmidt
2015-10-06 21:30 ` Jeff Law
2015-10-06 22:43 ` Andrew MacLeod
2015-10-06 21:27 ` Jeff Law
2015-10-06 16:32 ` Joseph Myers
2015-10-06 19:18 ` Andrew MacLeod
2015-10-07 16:35 ` Andrew MacLeod
2015-10-14 15:14 ` [patch 4/3] Header file reduction - Tools for contrib - second cut Andrew MacLeod
2015-11-03 6:06 ` Jeff Law
2015-11-03 13:24 ` Andrew MacLeod
2015-11-03 14:00 ` Jeff Law
2015-11-03 14:19 ` Andrew MacLeod
2015-11-10 21:16 ` Jeff Law
2015-10-08 16:31 ` [patch 4/3] Header file reduction - Tools for contrib David Malcolm
This is a public inbox, see mirroring instructions
for how to clone and mirror all data and code used for this inbox;
as well as URLs for read-only IMAP folder(s) and NNTP newsgroup(s).