public inbox for newlib@sourceware.org
* Question about autoreconf to regenerate configuration files
@ 2022-01-21 12:32 Matthew Joyce
  2022-01-21 13:44 ` Corinna Vinschen
  2022-01-21 15:02 ` R. Diez
  0 siblings, 2 replies; 17+ messages in thread
From: Matthew Joyce @ 2022-01-21 12:32 UTC (permalink / raw)
  To: newlib

Hello,

I am working with Sebastian Huber on the previously-discussed 
thread-local objects configuration option. If I may, I'd like to ask a 
question about regenerating the configuration files.

I added a new source file which defines one of these tls objects in 
libc/errno/ and added the file to the LIB_SOURCES in errno/Makefile.am. 
I then ran autoreconf in the libc directory.
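
In concrete terms, the workflow was roughly this (the new file name below 
is only a placeholder, not the actual one):

$ cd newlib/libc
$ vi errno/Makefile.am   # append the new source, e.g. tls_foo.c, to LIB_SOURCES
$ autoreconf             # regenerates configure, Makefile.in, aclocal.m4, ...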

nm shows me that the new object is defined in the symbol table, but I 
have also unwittingly modified well over 100 files (numerous 
Makefile.in, aclocal.m4, and configure files).

I'd like to understand: Is this expected or desirable? Should I not be 
using autoreconf for this?

Thank you very much for your time!

Sincerely,

Matt

-- 
embedded brains GmbH
Herr Matthew JOYCE
Dornierstr. 4
82178 Puchheim
Germany



* Re: Question about autoreconf to regenerate configuration files
  2022-01-21 12:32 Question about autoreconf to regenerate configuration files Matthew Joyce
@ 2022-01-21 13:44 ` Corinna Vinschen
  2022-01-21 15:02 ` R. Diez
  1 sibling, 0 replies; 17+ messages in thread
From: Corinna Vinschen @ 2022-01-21 13:44 UTC (permalink / raw)
  To: newlib

On Jan 21 13:32, Matthew Joyce wrote:
> Hello,
> 
> I am working with Sebastian Huber on the previously-discussed thread-local
> objects configuration option. If I may, I'd like to ask a question about
> regenerating the configuration files.
> 
> I added a new source file which defines one of these tls objects in
> libc/errno/ and added the file to the LIB_SOURCES in errno/Makefile.am. I
> then ran autoreconf in the libc directory.
> 
> nm shows me that the new object is defined in the symbol table, but I have
> also unwittingly modified well over 100 files (numerous Makefile.in,
> aclocal.m4, and configure files).
> 
> I'd like to understand: Is this expected or desirable? Should I not be using
> autoreconf for this?

You should be able to do that, but maybe you used a different autoconf
version from the one used to generate the files in the first place.
I wouldn't worry about that right now, especially while Mike is still
turning our autotools setup upside down.
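
(If you want to check for a mismatch: the generated script records the
version that produced it, so something like

$ grep -m1 'Generated by GNU Autoconf' newlib/configure
$ autoconf --version | head -n1

should show whether the two agree.)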


Corinna



* Re: Question about autoreconf to regenerate configuration files
  2022-01-21 12:32 Question about autoreconf to regenerate configuration files Matthew Joyce
  2022-01-21 13:44 ` Corinna Vinschen
@ 2022-01-21 15:02 ` R. Diez
  2022-01-21 15:37   ` Joel Sherrill
  1 sibling, 1 reply; 17+ messages in thread
From: R. Diez @ 2022-01-21 15:02 UTC (permalink / raw)
  To: Matthew Joyce; +Cc: Newlib


> [...]
> but I have also unwittingly modified well over 100 files (numerous Makefile.in, aclocal.m4, and configure files).
> I'd like to understand: Is this expected or desirable? Should I not be using autoreconf for this?

This is another drawback of checking the files generated by the Autotools into the repository; see this e-mail:

require autoconf-2.69 exactly
Wed Jan 12 20:01:01 GMT 2022
https://sourceware.org/pipermail/newlib/2022/018866.html

See also the answer from Mike Frysinger about this problem:

require autoconf-2.69 exactly
Wed Jan 12 21:37:22 GMT 2022
https://sourceware.org/pipermail/newlib/2022/018867.html

Regards,
   rdiez


* Re: Question about autoreconf to regenerate configuration files
  2022-01-21 15:02 ` R. Diez
@ 2022-01-21 15:37   ` Joel Sherrill
  2022-01-21 16:09     ` R. Diez
  0 siblings, 1 reply; 17+ messages in thread
From: Joel Sherrill @ 2022-01-21 15:37 UTC (permalink / raw)
  To: R. Diez; +Cc: Matthew Joyce, Newlib

On Fri, Jan 21, 2022 at 9:03 AM R. Diez via Newlib
<newlib@sourceware.org> wrote:
>
>
> > [...]
> > but I have also unwittingly modified well over 100 files (numerous Makefile.in, aclocal.m4, and configure files).
> > I'd like to understand: Is this expected or desirable? Should I not be using autoreconf for this?
>
> This is another drawback of checking the files generated by the Autotools into the repository; see this e-mail:
>
> require autoconf-2.69 exactly
> Wed Jan 12 20:01:01 GMT 2022
> https://sourceware.org/pipermail/newlib/2022/018866.html
>
> See also the answer from Mike Frysinger about this problem:
>
> require autoconf-2.69 exactly
> Wed Jan 12 21:37:22 GMT 2022
> https://sourceware.org/pipermail/newlib/2022/018867.html

Over at RTEMS when we used autotools, we didn't commit most of the
output.  Our tools included the preferred autoconf/automake versions.

We did put the generated acinclude.m4 for the list of BSPs in git. But
even on that, we had issues because the order of BSPs would vary
depending on which host generated it. Ultimately, we eliminated that
file.

The bootstrap time after checkout was quite large compared to the
actual compilation time. The bootstrap time was large enough to
negatively impact our ability to do automated regression testing. We
have switched away from autotools.

One of our long standing concerns with letting users generate was
reproducibility. How do you know that two end users end up with the
same generated output?

It's a pain to put the generated output in git but at least it saves
generating it and ensures it is the same for all users building.

All solutions in this area seem to suck. It is only a matter of degree.

--joel

> Regards,
>    rdiez


* Re: Question about autoreconf to regenerate configuration files
  2022-01-21 15:37   ` Joel Sherrill
@ 2022-01-21 16:09     ` R. Diez
  2022-01-21 22:09       ` Mike Frysinger
  0 siblings, 1 reply; 17+ messages in thread
From: R. Diez @ 2022-01-21 16:09 UTC (permalink / raw)
  To: joel; +Cc: Matthew Joyce, Newlib


> [...]
> The bootstrap time was large enough to
> negatively impact our ability to do automated regression testing.

A very long bootstrap time could be an issue.

However, compilation time normally outweighs by far the Autotools regeneration step. Is that a problem in Newlib at the moment?


> One of our long standing concerns with letting users generate was
> reproducibility. How do you know that two end users end up with the
> same generated output?

Newlib users will use different compiler versions, different GNU Make versions, etc. I do not understand why fixing the Autotools versions would be an advantage for Newlib.

If you are trying to reproduce some weird build bug, you can ask the developer to use particular versions of the Autotools. If you are trying to get reproducible builds, your build script can make sure it is using the same Autotools versions every time. That is not hard to achieve with this script of mine:

https://github.com/rdiez/Tools/tree/master/Autotools
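
The basic idea (whatever the exact mechanics of that script) is just to
build private copies of the required versions and put them first in the
PATH; a minimal sketch, with the prefix only as an example:

$ wget https://ftp.gnu.org/gnu/autoconf/autoconf-2.69.tar.gz
$ tar xf autoconf-2.69.tar.gz && cd autoconf-2.69
$ ./configure --prefix="$HOME/autotools-newlib" && make && make install
$ export PATH="$HOME/autotools-newlib/bin:$PATH"   # repeat for automake, libtool, ...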


> It's a pain to put the generated output in git but at least it saves
> generating it and ensures it is the same for all users building.

Users would not normally download the Git Head, but some release tarball. That tarball should then have all the Autotools-generated files. This way, all users will build with the same Autotools-generated files.

Only Newlib developers working against Git Head would have to deal with the Autotools. And that is not normally a problem, if the build system is healthy, like it should be.

Checking the Autotools files into the repository has several drawbacks. We have seen 2 of them again recently, and I am sure that we will see more in the future. I understand that the developer doing the much-needed Autotools clean-up, Mike Frysinger, also advised against checking in those files.

If Newlib wishes to depart from best practice, it would be nice to know the concrete issues in the context of this project, and not just some general "all solutions in this area seem to suck" justification.

Regards,
   rdiez


* Re: Question about autoreconf to regenerate configuration files
  2022-01-21 16:09     ` R. Diez
@ 2022-01-21 22:09       ` Mike Frysinger
  2022-01-21 23:08         ` Joel Sherrill
                           ` (2 more replies)
  0 siblings, 3 replies; 17+ messages in thread
From: Mike Frysinger @ 2022-01-21 22:09 UTC (permalink / raw)
  To: R. Diez; +Cc: joel, Newlib, Matthew Joyce


On 21 Jan 2022 17:09, R. Diez via Newlib wrote:
> > [...]
> > The bootstrap time was large enough to
> > negatively impact our ability to do automated regression testing.
> 
> A very long bootstrap time could be an issue.
> 
> However, compilation time normally outweighs by far the Autotools regeneration step. Is that a problem in Newlib at the moment?

autotools (autoreconf really) doesn't run in parallel, so every subdir
with a configure script needs a separate serialized run of all the tools.
newlib has many many of these (arguably, too many).

on my quad core 4.2GHz AMD that is otherwise idle ...

$ time (cd newlib && autoreconf)
real    5m22.170s
user    3m13.709s
sys     0m12.332s

$ time (cd libgloss && autoreconf)
real    1m41.754s
user    0m43.505s
sys     0m3.618s
<this errored out, not sure why, so it might normally take even longer :p>

# Blackfin builds 8 copies (multilib) of newlib+libgloss by default.
$ time (cd build; ../configure --host=bfin-elf; make -j4)
real    1m40.950s
user    0m58.032s
sys     0m30.968s

so yeah, autotools generation here is significant.

> If Newlib wishes to depart from best practice, it would be nice to know the concrete issues in the context of this project, and not just some general "all solutions in this area seem to suck" justification.

again, this isn't "just newlib".  newlib is part of the historically combined
toolchain tree/ecosystem.  that means you can take binutils, gdb, gcc, newlib,
libgloss, cgen, sim, zlib, etc... and have a single monolithic source tree and
build them all at once.  the projects have separated a little bit in that they
have diff git repos, but the top-level dir and a few subdirs are still shared,
and some folks still hand merge them.  newlib is part of that ecosystem and as
such, follows its conventions.  changing newlib behavior would have a ripple
effect and is why consensus across all of them is desirable.  although usually
if you can convince gcc to change, the rest will follow to keep things simple.

i'm not advocating for this system, but i understand the trade-offs, and it's
been around longer than i've been a programmer.
-mike



* Re: Question about autoreconf to regenerate configuration files
  2022-01-21 22:09       ` Mike Frysinger
@ 2022-01-21 23:08         ` Joel Sherrill
  2022-01-22 21:20         ` R. Diez
  2022-01-26 10:02         ` Mike Frysinger
  2 siblings, 0 replies; 17+ messages in thread
From: Joel Sherrill @ 2022-01-21 23:08 UTC (permalink / raw)
  To: R. Diez, Joel Sherrill, Newlib, Matthew Joyce

On Fri, Jan 21, 2022 at 4:09 PM Mike Frysinger <vapier@gentoo.org> wrote:
>
> On 21 Jan 2022 17:09, R. Diez via Newlib wrote:
> > > [...]
> > > The bootstrap time was large enough to
> > > negatively impact our ability to do automated regression testing.
> >
> > A very long bootstrap time could be an issue.
> >
> > However, compilation time normally outweighs by far the Autotools regeneration step. Is that a problem in Newlib at the moment?
>
> autotools (autoreconf really) doesn't run in parallel, so every subdir
> with a configure script needs a separate serialized run of all the tools.
> newlib has many many of these (arguably, too many).
>
> on my quad core 4.2GHz AMD that is otherwise idle ...
>
> $ time (cd newlib && autoreconf)
> real    5m22.170s
> user    3m13.709s
> sys     0m12.332s
>
> $ time (cd libgloss && autoreconf)
> real    1m41.754s
> user    0m43.505s
> sys     0m3.618s
> <this errored out, not sure why, so it might normally take even longer :p>
>
> # Blackfin builds 8 copies (multilib) of newlib+libgloss by default.
> $ time (cd build; ../configure --host=bfin-elf; make -j4)
> real    1m40.950s
> user    0m58.032s
> sys     0m30.968s
>
> so yeah, autotools generation here is significant.

This is on the order of what we saw using autotools with RTEMS.

> > If Newlib wishes to depart from best practice, it would be nice to know the concrete issues in the context of this project, and not just some general "all solutions in this area seem to suck" justification.

Sorry. I meant that more as all build systems have tradeoffs. You just
have to be conscious of the pain points and who incurs them.

In the RTEMS use case, all our users build tools from source with
something similar to BSD ports, which fetches the sources and applies
patches. We rarely use the newlib release tarballs and instead track git.

I built an aarch64 toolchain yesterday and it took about 25 minutes on my
laptop. It's a 2.87 GHz i7 with 4 real cores. It's a few years old and far
from the fastest, but still faster than what I see users with when I teach
RTEMS classes. This would add 5 minutes to that, since I assume we can skip
libgloss as we don't use it.

If newlib goes to needing a bootstrap process, RTEMS will adjust. The question
is how many newlib users get binaries and how many build it themselves.
The burden goes on newlib developers, tool distributors, or end users building
from source. This is something for the newlib community to decide.

> again, this isn't "just newlib".  newlib is part of the historically combined
> toolchain tree/ecosystem.  that means you can take binutils, gdb, gcc, newlib,
> libgloss, cgen, sim, zlib, etc... and have a single monolithic source tree and
> build them all at once.  the projects have separated a little bit in that they
> have diff git repos, but the top-level dir and a few subdirs are still shared,
> and some folks still hand merge them.  newlib is part of that ecosystem and as
> such, follows its conventions.  changing newlib behavior would have a ripple
> effect and is why consensus across all of them is desirable.  although usually
> if you can convince gcc to change, the rest will follow to keep things simple.
>
> i'm not advocating for this system, but i understand the trade-offs, and it's
> been around longer than i've been a programmer.

That's all I was trying to point out. Every build system and approach to using
it has advantages and disadvantages.

> -mike


* Re: Question about autoreconf to regenerate configuration files
  2022-01-21 22:09       ` Mike Frysinger
  2022-01-21 23:08         ` Joel Sherrill
@ 2022-01-22 21:20         ` R. Diez
  2022-01-23  0:17           ` Joel Sherrill
  2022-01-23  7:29           ` Mike Frysinger
  2022-01-26 10:02         ` Mike Frysinger
  2 siblings, 2 replies; 17+ messages in thread
From: R. Diez @ 2022-01-22 21:20 UTC (permalink / raw)
  To: Mike Frysinger; +Cc: Newlib, Matthew Joyce, joel

Hello Mike:

Newlib's build system has been a major pain and I would like to take this opportunity to learn more about it, and if that also helps with the other toolchain components, it's a plus point.


> [...]
> autotools (autoreconf really) doesn't run in parallel, so every subdir
> with a configure script needs a separate serialized run of all the tools.
> newlib has many many of these (arguably, too many).
> 
> on my quad core 4.2GHz AMD that is otherwise idle ...
> 
> $ time (cd newlib && autoreconf)
> real    5m22.170s
> user    3m13.709s
> sys     0m12.332s

My system is not really a very fast machine for development purposes:
- A laptop with an Intel Core i5-8265U CPU @ 1.60GHz (4 cores + hyperthreading)
- 1TB consumer-grade SSD Crucial MX500
- Ubuntu 20.04
My result is:

$ ( cd newlib && time autoreconf --force --verbose )
real    3m47,685s
user    1m48,533s
sys     0m10,338s

That's actually a rather long time. But if I understand correctly, your target is to have just 2 'configure' scripts for newlib and libgloss, which means that the Autoconf regeneration time is going to get cut drastically. Is that right?

Is it actually a problem that regenerating the Autoconf files in Newlib takes 5 minutes?

Is the Newlib project doing Continuous Integration? If so, it should actually be doing the Autoconf regeneration anyway, in order to test it as well. In any case, would a 5-minute delay there be an issue?

This Autoconf-related delay would only apply to sources directly obtained from the Git repository. The Newlib tarball releases would already include the Autoconf files, so no delay there.

If somebody, like RTEMS, is tracking Newlib's Git repository directly, that's their choice. But even then, they have options. If they distribute a release with patches etc, they can regenerate the Autoconf files before packaging Newlib for their own release. If the problem lies in the Continuous Integration system, you can cache the Autoconf files: if last time you checked out the same Git commit hash, then you do not need to regenerate the Autotools files again.
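
A sketch of that caching idea (the cache directory is hypothetical, and it
assumes the build tree itself is kept between CI runs):

  rev=$(git rev-parse HEAD)
  if [ "$(cat "$CACHE_DIR/last-regen" 2>/dev/null)" != "$rev" ]; then
      ( cd newlib && autoreconf )
      ( cd libgloss && autoreconf )
      echo "$rev" > "$CACHE_DIR/last-regen"
  fi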


> again, this isn't "just newlib".  newlib is part of the historically combined
> toolchain tree/ecosystem.  that means you can take binutils, gdb, gcc, newlib,
> libgloss, cgen, sim, zlib, etc... and have a single monolithic source tree and
> build them all at once.  the projects have separated a little bit in that they
> have diff git repos, but the top-level dir and a few subdirs are still shared,
> and some folks still hand merge them.  newlib is part of that ecosystem and as
> such, follows its conventions.  changing newlib behavior would have a ripple
> effect and is why consensus across all of them is desirable.  although usually
> if you can convince gcc to change, the rest will follow to keep things simple.

OK, let me try to understand the details here.

I have been using the following makefile for years to build a cross-compiler GCC toolchain:

https://github.com/rdiez/JtagDue/blob/master/Toolchain/Makefile

That makefile builds GCC in 2 phases: a first, temporary GCC without a C library, and the final GCC with Newlib.

With that makefile, Newlib is configured, built and installed separately from GCC, so Newlib's build system does not matter at all in this scenario.
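
For readers not familiar with that flow, a heavily condensed sketch (each
configure runs in its own build directory, most flags are omitted, and the
target triplet is just an example):

$ ../gcc/configure --target=arm-none-eabi --prefix="$PREFIX" --without-headers --enable-languages=c
$ make all-gcc && make install-gcc      # phase 1: bare compiler, no libc yet
$ ../newlib/configure --target=arm-none-eabi --prefix="$PREFIX"
$ make && make install                  # newlib, built with the phase-1 compiler
$ ../gcc/configure --target=arm-none-eabi --prefix="$PREFIX" --with-newlib --enable-languages=c,c++
$ make && make install                  # phase 2: full compiler against the installed newlib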

I think that is the same strategy as the OpenWrt toolchain makefile:
   https://git.openwrt.org/
   toolchain/Makefile
This way, you can use another libc. I believe that OpenWrt tends to use Musl instead of glibc.


The other way to build such a cross-compiler toolchain is to merge Newlib's sources with GCC's etc. That is called a "combined tree" and is mentioned here:

https://gcc.gnu.org/wiki/Building_Cross_Toolchains_with_gcc

The instructions on that page appear outdated and do not seem complete.

If you are keeping the Newlib build system compatible with GCC's etc, you must be testing with some commands, or maybe a script, that merges the trees. Could you share those steps? Alternatively, do you know of a web page that describes the steps accurately?

The instructions on the page I mentioned above talk about GCC's files overwriting any files in Newlib etc. I wonder at what level that is supposed to happen. For example, gcc-10.3.0 has a top-level 'configure' script. Is that supposed to overwrite Newlib's top-level 'configure' script?

In the combined tree, GCC must know somehow that it should include the 'newlib' subdirectory. Is that what GCC's 'configure' option '--with-newlib' is supposed to do? I could not find any information about "--with-newlib" in the GCC documentation.


To what extent is build system compatibility in this GCC ecosystem a problem? Newlib has had very outdated Autoconf files for quite some time and I presume that it has been working with many different GCC versions over the years. I do not suppose that we maintain a table of build system version compatibility between Newlib and GCC releases, do we?

My guess is that GCC's build system is calling 'configure', 'make' etc. inside newlib/, and it does not really matter if the versions are a little different. Or is a lot of the build system really shared?

-- 
rdiez


* Re: Question about autoreconf to regenerate configuration files
  2022-01-22 21:20         ` R. Diez
@ 2022-01-23  0:17           ` Joel Sherrill
  2022-01-23 16:57             ` R. Diez
  2022-01-23  7:29           ` Mike Frysinger
  1 sibling, 1 reply; 17+ messages in thread
From: Joel Sherrill @ 2022-01-23  0:17 UTC (permalink / raw)
  To: R. Diez; +Cc: Mike Frysinger, Newlib, Matthew Joyce

On Sat, Jan 22, 2022, 3:20 PM R. Diez <rdiezmail-newlib@yahoo.de> wrote:

> Hello Mike:
>
> Newlib's build system has been a major pain and I would like to take this
> opportunity to learn more about it, and if that also helps with the other
> toolchain components, it's a plus point.
>
>
> > [...]
> > autotools (autoreconf really) doesn't run in parallel, so every subdir
> > with a configure script needs a separate serialized run of all the tools.
> > newlib has many many of these (arguably, too many).
> >
> > on my quad core 4.2GHz AMD that is otherwise idle ...
> >
> > $ time (cd newlib && autoreconf)
> > real    5m22.170s
> > user    3m13.709s
> > sys     0m12.332s
>
> My system is not really a very fast machine for development purposes:
> - A laptop with an Intel Core i5-8265U CPU @ 1.60GHz (4 cores +
> hyperthreading)
> - 1TB consumer-grade SSD Crucial MX500
> - Ubuntu 20.04
> My result is:
>
> $ ( cd newlib && time autoreconf --force --verbose )
> real    3m47,685s
> user    1m48,533s
> sys     0m10,338s
>
> That's actually a rather long time. But if I understand correctly, your
> target is to have just 2 'configure' scripts for newlib and libgloss, which
> means that the Autoconf regeneration time is going to get cut drastically.
> Is that right?
>
> Is it actually a problem that regenerating the Autoconf files in Newlib
> takes 5 minutes?
>
> Is the Newlib project doing Continuous Integration? If so, it should
> actually be doing the Autoconf regeneration anyway, in order to test it as
> well. In any case, would a 5-minute delay there be an issue?
>

The RTEMS project checks three times a day and builds a target for
Coverity.

We build full tool chains mated to three branches on about 8 OSes once a
week. With just under 20 targets, 3 branches and 8 VMs for the OSes, that's
240 builds a week for test builds. All are on the same physical computer.
We could be looking at 12 to 20 more hours of build time a week.

Would reducing the number of Makefile.am files help newlib? That worked for
RTEMS and also allowed make to see more parallelism: a portable libc, and one
per machine or sys directory. Just thinking out loud.


> This Autoconf-related delay would only apply to sources directly obtained
> from the Git repository. The Newlib tarball releases would already include
> the Autoconf files, so no delay there.
>
> If somebody, like RTEMS, is tracking Newlib's Git repository directly,
> that's their choice. But even then, they have options. If they distribute a
> release with patches etc, they can regenerate the Autoconf files before
> packaging Newlib for their own release. If the problem lies in the
> Continuous Integration system, you can cache the Autoconf files: if last
> time you checked out the same Git commit hash, then you do not need to
> regenerate the Autotools files again.
>

Yes that is possible. Not sure how this would work on our system but
possible.

>
>
> > again, this isn't "just newlib".  newlib is part of the historically
> combined
> > toolchain tree/ecosystem.  that means you can take binutils, gdb, gcc,
> newlib,
> > libgloss, cgen, sim, zlib, etc... and have a single monolithic source
> tree and
> > build them all at once.  the projects have separated a little bit in
> that they
> > have diff git repos, but the top-level dir and a few subdirs are still
> shared,
> > and some folks still hand merge them.  newlib is part of that ecosystem
> and as
> > such, follows its conventions.  changing newlib behavior would have a
> ripple
> > effect and is why consensus across all of them is desirable.  although
> usually
> > if you can convince gcc to change, the rest will follow to keep things
> simple.
>
> OK, let me try to understand the details here.
>
> I have been using the following makefile for years to build a
> cross-compiler GCC toolchain:
>
> https://github.com/rdiez/JtagDue/blob/master/Toolchain/Makefile
>
> That makefile builds GCC in 2 phases: a first, temporary GCC without a C
> library, and the final GCC with Newlib.
>
> With that makefile, Newlib is configured, built and installed separately
> from GCC, so Newlib's build system does not matter at all in this scenario.
>
> I think that is the same strategy as the OpenWrt toolchain makefile:
>    https://git.openwrt.org/
>    toolchain/Makefile
> This way, you can use another libc. I believe that OpenWrt tends to use
> Musl instead of glibc.
>
>
> The other way to build such a cross-compiler toolchain is to merge
> Newlib's sources with GCC's etc. That is called a "combined tree" and is
> mentioned here:
>
> https://gcc.gnu.org/wiki/Building_Cross_Toolchains_with_gcc
>
> The instructions on that page appear outdated and do not seem complete.
>

We do this one-tree style.

>
> If you are keeping the Newlib build system compatible with GCC's etc, you
> must be testing with some commands, or maybe a script, that merges the
> trees. Could you share those steps? Alternatively, do you know of a web
> page that describes the steps accurately?
>
> The instructions on the page I mentioned above talk about GCC's files
> overwriting any files in Newlib etc. I wonder at what level that is
> supposed to happen. For example, gcc-10.3.0 has a top-level 'configure'
> script. Is that supposed to overwrite Newlib's top-level 'configure' script?
>

We move or symlink the newlib/ directory under the top GCC directory.
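
(Roughly, with hypothetical path names:

$ cd gcc-src
$ ln -s ../newlib-src/newlib .    # likewise libgloss/ if it is wanted

and then configure and build from gcc-src as usual.)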

>
> In the combined tree, GCC must know somehow that it should include the
> 'newlib' subdirectory. Is that what GCC's 'configure' option
> '--with-newlib' is supposed to do? I could not find any information about
> "--with-newlib" in the GCC documentation.
>
>
> To what extent is build system compatibility in this GCC ecosystem a
> problem? Newlib has had very outdated Autoconf files for quite some time
> and I presume that it has been working with many different GCC versions
> over the years. I do not suppose that we maintain a table of build system
> version compatibility between Newlib and GCC releases, do we?
>

We've been doing it this way since the early 90s. No noticeable difference
in procedure since the old Cygnus one tree instructions.

>
> My guess is that GCC's build system is calling 'configure', 'make' etc.
> inside newlib/, and it does not really matter if the versions are a little
> different. Or is a lot of the build system really shared?
>

Based on how we always build, I'd say it is just invocation.

--joel

>
> --
> rdiez
>


* Re: Question about autoreconf to regenerate configuration files
  2022-01-22 21:20         ` R. Diez
  2022-01-23  0:17           ` Joel Sherrill
@ 2022-01-23  7:29           ` Mike Frysinger
  1 sibling, 0 replies; 17+ messages in thread
From: Mike Frysinger @ 2022-01-23  7:29 UTC (permalink / raw)
  To: R. Diez; +Cc: Newlib, Matthew Joyce, joel


On 22 Jan 2022 22:20, R. Diez wrote:
> Newlib's build system has been a major pain

it's no more painful than any other autotool based system.  if you mean
installing the specific version of autotools on your system is a major
issue, i don't think that's true.  it's pretty easy to download locally
each version and run them directly.

> > [...]
> > autotools (autoreconf really) doesn't run in parallel, so every subdir
> > with a configure script needs a separate serialized run of all the tools.
> > newlib has many many of these (arguably, too many).
> > 
> > on my quad core 4.2GHz AMD that is otherwise idle ...
> > 
> > $ time (cd newlib && autoreconf)
> > real    5m22.170s
> > user    3m13.709s
> > sys     0m12.332s
> 
> My system is not really a very fast machine for development purposes:
> - A laptop with an Intel Core i5-8265U CPU @ 1.60GHz (4 cores + hyperthreading)
> - 1TB consumer-grade SSD Crucial MX500
> - Ubuntu 20.04
> My result is:
> 
> $ ( cd newlib && time autoreconf --force --verbose )
> real    3m47,685s
> user    1m48,533s
> sys     0m10,338s

i was a bit lazy and didn't reset my disk caches, clear the autom4te.cache
trees, and make sure the set of system packages installing into the common
m4 tree (/usr/share/aclocal) was minimized.  so i would expect our times to
vary a bit.  i think the scale is still relevant though -- we aren't talking
O(<10 seconds) here, we're talking O(minutes).
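
(fwiw, a cleaner timing run would look something like this -- linux-specific,
and the drop_caches write needs root:

$ find . -type d -name autom4te.cache -prune -exec rm -rf {} +
$ sync; echo 3 | sudo tee /proc/sys/vm/drop_caches
$ time (cd newlib && autoreconf)
)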

> That's actually a rather long time. But if I understand correctly, your target
> is to have just 2 'configure' scripts for newlib and libgloss, which means that
> the Autoconf regeneration time is going to get cut drastically. Is that right?

the autoconf part will, but autoreconf is doing more than that.  i have not
split apart the m4 processing (aclocal), aux file processing, the libtool,
or the automake steps.

i know the names are similar ("autoconf" and "autoreconf") and can be a bit
confusing, but while "autoconf" does just one thing (turn configure.ac into
configure), "autoreconf" drives all possible autotools.

> Is it actually a problem that regenerating the Autoconf files in Newlib takes 5 minutes?

yes

> Is the Newlib project doing Continuous Integration?

the project itself is not, but there are a lot of downstream folks who are.
i.e. your redhats and your windrivers and such (i'm aware that the industry
has gone through a lot of shuffling/acquiring and me quoting companies that
have been fully bought out dates me -- the work is still being done somewhere).

> If so, it should actually be doing the Autoconf regeneration anyway, in order to test it as well.

sure, iff the autotools are touched.  if they aren't, and they often aren't
by most devs, then it's a waste of time.

> In any case, would a 5-minute delay there be an issue?

i think you're in the old world thinking of "CI is that big slow thing that
runs periodically and sends me feedback that i look at hours or days later".
we have systems now that can provide much faster feedback.  presubmit checks
can get down to <minute levels.

plus, telling your developers that build-system-generation takes 80% of their
time before they get to the 20% that actually matters to them is a pretty hard
sell.

yes, i grok that this step doesn't always need to be run.  e.g. if you do a
`git pull` and get updated tools, you'll need to refresh, but if you're just
hacking on source files, you won't.  keep in mind that toolchain devs often
deal with regressions which involve git bisect, and if that requires ~minutes
every time you check out a diff tree state, as well as possibly needing to pick
a diff version of autotools if the bisect spans multiple releases, that's quite
a lot of pain.

committing the generated files, as ugly as it is, avoids all of this.
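
(to make that concrete, every bisect step without committed files would look
roughly like this -- the endpoints and the test script are placeholders:

$ git bisect start <bad-commit> <good-commit>
$ git bisect run sh -c '(cd newlib && autoreconf) && ./build-and-test.sh'

and if the range crosses an autotools version bump, the regeneration half has
to switch tool versions mid-bisect as well.)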

> The other way to build such a cross-compiler toolchain is to merge Newlib's sources with GCC's etc.
> That is called a "combined tree" and is mentioned here:
> 
> https://gcc.gnu.org/wiki/Building_Cross_Toolchains_with_gcc
> 
> The instructions on that page appear outdated and do not seem complete.

they do look outdated.  i don't use this flow myself as i rarely hack on gcc.
i get most of my cross-compilers from gcc releases.  i develop binutils, gdb,
newlib, and the sim myself.

> If you are keeping the Newlib build system compatible with GCC's etc,
> you must be testing with some commands, or maybe a script, that merges the trees.
> Could you share those steps?

i just manually symlink or bind mount the project subdirs i care about.

> The instructions on the page I mentioned above talk about GCC's files overwriting any files in Newlib etc.
> I wonder at what level that is supposed to happen. For example, gcc-10.3.0 has a top-level 'configure' script.
> Is that supposed to overwrite Newlib's top-level 'configure' script?

there is only one set of top-level files.  they get manually synced from gcc
to the other projects by devs who notice & need the fixes.

> In the combined tree, GCC must know somehow that it should include the 'newlib' subdirectory.
> Is that what GCC's 'configure' option '--with-newlib' is supposed to do?
> I could not find any information about "--with-newlib" in the GCC documentation.

that's because the top-level scripts contain a superset of options, and many
of them do dynamic probing of subdirectories.  --with-newlib is an option in
every toolchain project -- grab a binutils, gdb, gcc, or newlib release, and
you'll see it in all of them.  it just doesn't make sense in most.

think of the top-level script as a "looks for projects to build, and then passes
all arguments down to them".  it doesn't really do much else.  it's a glorified
mux and is full of shell `case` statements as a result.

if you look at the top-level configure.ac, you'll see a block near the top like:
### To add a new directory to the tree, first choose whether it is a target
### or a host dependent tool.  Then put it into the appropriate list
### (library or tools, host or target), doing a dependency sort.

after that you can see all of the projects that are supported in a combined tree.

> To what extent is build system compatibility in this GCC ecosystem a problem?
> Newlib has had very outdated Autoconf files for quite some time and I presume
> that it has been working with many different GCC versions over the years.

the contract at build time is `./configure`, and even if newlib had its script
generated with an old version of autoconf, it doesn't really change the API.
the top-level Makefile knows to `cd newlib && ./configure ...` and that will
work regardless of the autoconf version.
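
(i.e. for the newlib subdir it boils down to roughly this, heavily simplified
and with placeholder paths:

$ mkdir -p <objdir>/<target>/newlib && cd <objdir>/<target>/newlib
$ <srcdir>/newlib/configure --host=<target> --target=<target> ...
$ make && make install
)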

> I do not suppose that we maintain a table of build system version compatibility
> between Newlib and GCC releases, do we?

probably not.  i don't think anyone maintains a large matrix of binutils/gdb/gcc
and the newlib/glibc libraries.  usually the answer is "pick versions that are
released around the same time", and the larger you try to skew that window, the
more "it's your problem" to make them work.

> My guess is that GCC's build system is calling 'configure', 'make' etc. inside
> newlib/, and it does not really matter if the versions are a little different.
> Or is a lot of the build system really shared?

the use of `./configure ...` and `make ...` are the most important.  the multilib
multiplexing logic with libraries makes things a bit messier too.
-mike



* Re: Question about autoreconf to regenerate configuration files
  2022-01-23  0:17           ` Joel Sherrill
@ 2022-01-23 16:57             ` R. Diez
  2022-01-26 10:19               ` Mike Frysinger
  0 siblings, 1 reply; 17+ messages in thread
From: R. Diez @ 2022-01-23 16:57 UTC (permalink / raw)
  To: joel; +Cc: Mike Frysinger, Newlib, Matthew Joyce


> [...]
> We do this one-tree style.

Combining the repositories has drawbacks. Mixing things at least increases the chances of confusion.

What are the advantages of using the "combined tree" versus separate directories?

With a "combined tree", do you still need to build GCC in 2 phases, or can you do it in one pass?

My first thought is that a "combined tree" may allow for more parallelism when building, but my makefile already builds all components in parallel too. The components of the toolchain I am building are:

Binutils
GMP
MPFR
MPC
GCC
Newlib
GDB

Binutils, GMP, MPFR and MPC probably need to be completely built before the GCC cross-compiler, so the "combined tree" is not going to parallelise that.

GDB needs MPFR and GMP, but does not need the GCC cross-compiler, for it is a host-based tool.

Therefore, at first sight, it looks like a "combined tree" may only increase parallelism when building GCC with an integrated Newlib, as opposed to building Newlib completely before building GCC. But I wonder if the "combined tree" build system serialises those steps (building GCC and building its Newlib) anyway.

In any case, there may be an advantage of combining the GCC and Newlib source trees, but I cannot see yet how that could benefit the other components.


> [...]
> We move or symlink the newlib/ directory under the top GCC directory.

Just the newlib/ directory? That means that top-level files in the Newlib Git repository, like 'configure' next to newlib/ and libgloss/ , are not used at all for your builds.

Consider top-level file README, which starts like this:

"This directory contains various GNU compilers, assemblers, linkers,
  debuggers, etc., plus their support routines, definitions, and documentation."

This file does not seem to come from Newlib. Why do we need it in Newlib?

The "combined tree" instructions on this page:

https://gcc.gnu.org/wiki/Building_Cross_Toolchains_with_gcc

state the following:

"with the GCC files overriding the binutils/gdb/newlib files when there's a conflict."

But the way you are symlinking just "newlib/" does not create any conflicts, does it?

By the way, are you aware of any public full example (a script or makefile) somewhere which I could use as a reference for creating and building with a combined tree?


> Mike Frysinger wrote:
> [...]
> there is only one set of top-level files.  they get manually synced from
> gcc to the other projects by devs who notice & need the fixes.

OK, I gather that copying those files across from GCC is the standard practice. But why do we need to keep doing that in Newlib? According to the (outdated) cross toolchain build instructions, when combining the tree the GCC versions should prevail.

It is not like anybody would take Newlib without GCC and use those top-level files (the 'configure' script) to build say just Newlib and Binutils, is it?


>> My guess is that GCC's build system is calling 'configure',
>> 'make' etc. inside newlib/, and it does not really matter
>> if the versions are a little different.
>> Or is a lot of the build system really shared?

> Joel Sherrill wrote:
> Based on how we always build, I'd say it is just invocation.

> Mike Frysinger wrote:
> the contract at build time is `./configure`, and even if newlib had its script
> generated with an old version of autoconf, it doesn't really change the API.
> the top-level Makefile knows to `cd newlib && ./configure ...` and that will
> work regardless of the autoconf version.

There seems to be consensus that the Autoconf version in Newlib does not matter, and that the "combined tree" building system interface is just invoking 'configure' and 'make' under newlib/.

Why do we need to fix the Autoconf version to 2.69 then? We could use the newer 2.71 for Newlib.

How about the Automake version? The Automake version is not checked like the Autoconf version is, is it? The checked-in "Makefile.in" files were generated with Automake 1.15.1 from 2017-06-19, and the latest is 1.16.5 from 2021-10-03.

If we want to strictly keep in sync with GCC, shouldn't we check the exact Automake version too?
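
(Those version numbers are easy to check, because the generated files record
the tool that produced them; for example:

$ grep -m1 'generated by automake' newlib/libc/errno/Makefile.in
$ automake --version | head -n1
)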


> Mike Frysinger wrote:
> the multilib multiplexing logic with libraries makes things a bit messier too.

Does the multilib multiplexing logic affect the interface between the combined tree build system and Newlib? Or does it require something special inside newlib/ that would conflict with newer Autotools versions?


Thanks for the information,
   rdiez

-- 
rdiez


* Re: Question about autoreconf to regenerate configuration files
  2022-01-21 22:09       ` Mike Frysinger
  2022-01-21 23:08         ` Joel Sherrill
  2022-01-22 21:20         ` R. Diez
@ 2022-01-26 10:02         ` Mike Frysinger
  2022-02-17  5:18           ` Mike Frysinger
  2 siblings, 1 reply; 17+ messages in thread
From: Mike Frysinger @ 2022-01-26 10:02 UTC (permalink / raw)
  To: R. Diez, joel, Newlib, Matthew Joyce


On 21 Jan 2022 17:09, Mike Frysinger wrote:
> On 21 Jan 2022 17:09, R. Diez via Newlib wrote:
> > > [...]
> > > The bootstrap time was large enough to
> > > negatively impact our ability to do automated regression testing.
> > 
> > A very long bootstrap time could be an issue.
> > 
> > However, compilation time normally outweighs by far the Autotools regeneration step. Is that a problem in Newlib at the moment?
> 
> autotools (autoreconf really) doesn't run in parallel, so every subdir
> with a configure script needs a separate serialized run of all the tools.
> newlib has many many of these (arguably, too many).
> 
> on my quad core 4.2GHz AMD that is otherwise idle ...
> 
> $ time (cd newlib && autoreconf)
> real    5m22.170s
> user    3m13.709s
> sys     0m12.332s
> 
> $ time (cd libgloss && autoreconf)
> real    1m41.754s
> user    0m43.505s
> sys     0m3.618s
> <this errored out, not sure why, so it might normally take even longer :p>
> 
> # Blackfin builds 8 copies (multilib) of newlib+libgloss by default.
> $ time (cd build; ../configure --host=bfin-elf; make -j4)
> real    1m40.950s
> user    0m58.032s
> sys     0m30.968s

updated timings on my system after recent work to delete many configure scripts
$ time (cd newlib && autoreconf)
real    1m0.619s
user    0m45.249s
sys     0m1.535s

$ time (cd libgloss && autoreconf -I$PWD -I$PWD/.. -I$PWD/../config)
real    0m32.662s
user    0m15.858s
sys     0m1.205s

$ time (cd build; ../configure --host=bfin-elf; make -j4)
real    1m2.337s
user    0m44.987s
sys     0m26.708s

so it's def better, but autotool generation still takes longer than actually
compiling newlib+libgloss 8 times :).
-mike



* Re: Question about autoreconf to regenerate configuration files
  2022-01-23 16:57             ` R. Diez
@ 2022-01-26 10:19               ` Mike Frysinger
  2022-01-30 22:22                 ` R. Diez
  0 siblings, 1 reply; 17+ messages in thread
From: Mike Frysinger @ 2022-01-26 10:19 UTC (permalink / raw)
  To: R. Diez; +Cc: joel, Newlib, Matthew Joyce


On 23 Jan 2022 17:57, R. Diez wrote:
> > Mike Frysinger wrote:
> > [...]
> > there is only one set of top-level files.  they get manually synced from
> > gcc to the other projects by devs who notice & need the fixes.
> 
> OK, I gather that copying those files across from GCC is the standard practice.
> But why do we need to keep doing that in Newlib? According to the (outdated)
> cross toolchain build instructions, when combining the tree the GCC versions should prevail.

i don't understand what you're asking.  the top-level provides files that
newlib uses.  they need to be kept up-to-date.  see AC_CONFIG_AUX_DIR use
throughout the tree.
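
(e.g. from the top of the tree:

$ grep -rn AC_CONFIG_AUX_DIR newlib libgloss
)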

> It is not like anybody would take Newlib without GCC and use those top-level files (the
> 'configure' script) to build say just Newlib and Binutils, is it?

why not ?  i have a gdb/binutils dir that i link just newlib & libgloss
into.  the GNU simulator is part of those trees, and it pulls ABI info
out of newlib & libgloss.  being able to iterate on those parts without
rebuilding gcc is helpful.

> >> My guess is that GCC's build system is calling 'configure',
> >> 'make' etc. inside newlib/, and it does not really matter
> >> if the versions are a little different.
> >> Or is a lot of the build system really shared?
> 
> > Joel Sherrill wrote:
> > Based on how we always build, I'd say it is just invocation.
> 
> > Mike Frysinger wrote:
> > the contract at build time is `./configure`, and even if newlib had its script
> > generated with an old version of autoconf, it doesn't really change the API.
> > the top-level Makefile knows to `cd newlib && ./configure ...` and that will
> > work regardless of the autoconf version.
> 
> There seems to be consensus that the Autoconf version in Newlib does not matter,
> and that the "combined tree" building system interface is just invoking 'configure' and 'make' under newlib/.
> 
> Why do we need to fix the Autoconf version to 2.69 then? We could use the newer 2.71 for Newlib.

the top-level config/ pins to that version.  it also contains macros we utilize
(like the multilib logic).  if no one else is testing newer versions, it sounds
like pointless landmines for newlib.

conversely, what does autoconf 2.71 get us ?  i'm not aware of functionality in
there that we need or otherwise fixes/improves things for newlib specifically.

> How about the Automake version? The Automake version is not checked like the
> Autoconf version is, is it? The checked-in "Makefile.in" files were generated
> with Automake 1.15.1 from 2017-06-19, and the latest is 1.16.5 from 2021-10-03.
> 
> If we want to strictly keep in sync with GCC, shouldn't we check the exact Automake version too?

we pin the min version via AM_INIT_AUTOMAKE to 1.15.1.  you're right we don't
currently pin the maximum, it's more through everyone agreeing to use that
version.  if gcc et al had a macro to pin the automake version, we'd prob
leverage it too.

pinning the versions in general is meant to keep generated noise/conflicts
down between devs working on these projects, and to make expectations more
obvious to users.  if diff devs were using autoconf 2.68 & 2.69 to regen
and push changes, there would be a ton more noise, and conflicts would be
a lot more frequent when pulling git updates.  users tend to not be savvy
with autotools ... they just run whatever until things compile.  getting
support requests like "i tried to run autoconf-2.50 and it didn't work"
(or conversely, "i tried to run autoconf-2.100 and it didn't work") is a
waste of valuable dev time.  putting "we used xxx versions" into README
files has historically proven to be insufficient.

> > Mike Frysinger wrote:
> > the multilib multiplexing logic with libraries makes things a bit messier too.
> 
> Does the multilib multiplexing logic affect the interface between the combined
> tree build system and Newlib? Or does it require something special inside newlib/
> that would conflict with newer Autotools versions?

gcc's multilib logic expects a certain layout that newlib provides, and this is
simplified by using the same multilib configure+make logic.  this is space that
hasn't really been documented anywhere that i'm aware of, and can sometimes be
a bit fragile.  i've run into this at various times with diff targets and when
maintaining toolchain packages in distros.
-mike



* Re: Question about autoreconf to regenerate configuration files
  2022-01-26 10:19               ` Mike Frysinger
@ 2022-01-30 22:22                 ` R. Diez
  0 siblings, 0 replies; 17+ messages in thread
From: R. Diez @ 2022-01-30 22:22 UTC (permalink / raw)
  To: Mike Frysinger; +Cc: Newlib, Matthew Joyce, joel


>>> [...]
>>> there is only one set of top-level files.  they get manually synced from
>>> gcc to the other projects by devs who notice & need the fixes.
>>
>> OK, I gather that copying those files across from GCC is the standard practice.
>> But why do we need to keep doing that in Newlib? According to the (outdated)
>> cross toolchain build instructions, when combining the tree the GCC versions should prevail.
> 
> i don't understand what you're asking.  the top-level provides files that
> newlib uses.  they need to be kept up-to-date.  see AC_CONFIG_AUX_DIR use
> throughout the tree.

So the Newlib build system is a mix of something specific to Newlib, and some bits shared with GCC. But I would still like to understand what files are shared between Newlib, GCC, GDB, Binutils, etc.

I gathered that all AC_CONFIG_AUX_DIR around Newlib point to the top-level directory, which is not a good idea. Auxiliary files should be placed in a subdirectory in order to reduce clutter in the top-level project directory. This also prevents the configuration script from looking for helper scripts outside the project, in ../ and ../../ , possibly finding older versions or incompatible tools with the same name.

But I understand that we are sharing at least the multilib logic with GCC, probably config/multi.m4 , and those shared files perhaps expect AC_CONFIG_AUX_DIR to point to the top-level directory. So we shouldn't change it in Newlib.

Talking about config/multi.m4: this file is almost identical to the one in gcc-10.3.0, so it has fallen only a little behind. But other files like config/override.m4 have changed more. Maybe it is time to upgrade the copies in Newlib then, if they are truly shared and used?
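
That comparison is easy to repeat with a GCC release tree unpacked next to
the Newlib checkout (directory names only as an example):

$ diff -u newlib-src/config/multi.m4    gcc-10.3.0/config/multi.m4
$ diff -u newlib-src/config/override.m4 gcc-10.3.0/config/override.m4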

Does Newlib have to include copies of those files anyway? If you are building multilibs, is that not specific to GCC? Wouldn't you need the GCC sources at hand anyway?

Does Newlib have to have files like include/gdb/sim-arm.h ? I thought the simulators lived with GDB, so is that not part of GDB? If you are building the simulators, do you not need the GDB sources anyway?

Does it make sense to have a MAINTAINERS file in Newlib talking about directories like binutils/ which do not exist? Shouldn't we have a different one called MAINTAINERS.Newlib? And the same with ChangeLog.

Why do we need to have our own copy of configure.ac at top-level? If you will be building in a "combined tree", then the other configure.ac copy should override ours, be it GCC, GDB or Binutils. If we are building just Newlib for another compiler or outside a "combined tree", would we be using that same configure.ac at top level? Or do you have to configure newlib/ and libgloss/ separately anyway?

Is it worth sharing the Autotools files with GCC at all? On the one hand, we can reuse GCC's multilib implementation, but on the other hand, I can imagine that dealing with integration problems as GCC etc. evolve is problematic. How do other libc libraries like Musl cope? Do they share the same Autoconf logic with GCC, or do they implement their own multilib logic?


>> It is not like anybody would take Newlib without GCC and use those top-level files (the
>> 'configure' script) to build say just Newlib and Binutils, is it?
> 
> why not ?  i have a gdb/binutils dir that i link just newlib & libgloss
> into.  the GNU simulator is part of those trees, and it pulls ABI info
> out of newlib & libgloss.  being able to iterate on those parts without
> rebuilding gcc is helpful.

Newlib alone is not going to suffice, because it is a "leaf" component. You will always need something else, like GCC or the GDB simulators. GDB brings its own configure.ac, config/ , etc. Should these not override the ones in Newlib when building in this smaller "combined tree"? Or does that overriding rule not apply here?


The main problem is that you need too much undocumented information in order to be able to maintain Newlib, to create a new release, or simply to contribute or to troubleshoot it. Or maybe I just haven't found the place where all this is documented yet.

I would have expected to see documentation like this:

---------8<---------8<---------8<---------

README-Newlib.txt

In order to create a Newlib release:

1) Sync the build files shared with GCC.

The Newlib sources are designed to be used inside a "combined tree" with GCC and/or GDB etc. in order to build several toolchain components at the same time.

The files to synchronise from GCC (the master copy) are:

xxx
config/
yyy/
zzz

2) Regenerate the Autoconf files, and check the new versions in.

The exact Autotools versions that should be used are listed in the file README-maintainer-mode.

Use these commands (or better, provide a script):
( cd newlib && time autoreconf --force --verbose --warnings=all )
...

3) Test building in a "combined tree" with recent GCC and GDB simulator versions.

4) The maintainer (or the user?) can rebuild the Newlib documentation like this:

5) ...

---------8<---------8<---------8<---------


>> If we want to strictly keep in sync with GCC, shouldn't we check the exact Automake version too?
> 
> we pin the min version via AM_INIT_AUTOMAKE to 1.15.1.  you're right we don't
> currently pin the maximum, it's more through everyone agreeing to use that
> version.  if gcc et al had a macro to pin the automake version, we'd prob
> leverage it too.
> 
> pinning the versions in general is meant to keep generated noise/conflicts
> down between devs working on these projects, and to make expectations more
> obvious to users.
> [...]
> putting "we used xxx versions" into README

The implementation is inconsistent. We pin the exact Autoconf version in order to minimise the noise/conflicts when regenerating those files, and state that a README has proven to be insufficient, but we do not pin the exact Automake version. Why not then?

Regards,
   rdiez


* Re: Question about autoreconf to regenerate configuration files
  2022-01-26 10:02         ` Mike Frysinger
@ 2022-02-17  5:18           ` Mike Frysinger
  2022-02-17  6:56             ` Sebastian Huber
  2022-02-20  9:51             ` R. Diez
  0 siblings, 2 replies; 17+ messages in thread
From: Mike Frysinger @ 2022-02-17  5:18 UTC (permalink / raw)
  To: R. Diez, joel, Newlib, Matthew Joyce


On 26 Jan 2022 05:02, Mike Frysinger wrote:
> On 21 Jan 2022 17:09, Mike Frysinger wrote:
> > On 21 Jan 2022 17:09, R. Diez via Newlib wrote:
> > > > [...]
> > > > The bootstrap time was large enough to
> > > > negatively impact our ability to do automated regression testing.
> > > 
> > > A very long bootstrap time could be an issue.
> > > 
> > > However, compilation time normally outweighs by far the Autotools regeneration step. Is that a problem in Newlib at the moment?
> > 
> > autotools (autoreconf really) doesn't run in parallel, so every subdir
> > with a configure script needs a separate serialized run of all the tools.
> > newlib has many many of these (arguably, too many).
> > 
> > on my quad core 4.2GHz AMD that is otherwise idle ...
> > 
> > $ time (cd newlib && autoreconf)
> > real    5m22.170s
> > user    3m13.709s
> > sys     0m12.332s
> > 
> > $ time (cd libgloss && autoreconf)
> > real    1m41.754s
> > user    0m43.505s
> > sys     0m3.618s
> > <this errored out, not sure why, so it might normally take even longer :p>
> > 
> > # Blackfin builds 8 copies (multilib) of newlib+libgloss by default.
> > $ time (cd build; ../configure --host=bfin-elf; make -j4)
> > real    1m40.950s
> > user    0m58.032s
> > sys     0m30.968s
> 
> updated timings on my system after recent work to delete many configure scripts
> $ time (cd newlib && autoreconf)
> real    1m0.619s
> user    0m45.249s
> sys     0m1.535s
> 
> $ time (cd libgloss && autoreconf -I$PWD -I$PWD/.. -I$PWD/../config)
> real    0m32.662s
> user    0m15.858s
> sys     0m1.205s
> 
> $ time (cd build; ../configure --host=bfin-elf; make -j4)
> real    1m2.337s
> user    0m44.987s
> sys     0m26.708s
> 
> so it's def better, but autotool generation still takes longer than actually
> compiling newlib+libgloss 8 times :).

things are looking up.  with all my pending changes, we have 1 configure script
in newlib and no recursive makes.

$ time (cd newlib && autoreconf)
real    0m8.740s
user    0m7.524s
sys     0m0.193s

i'm not sure if i'll "finish" libgloss.  there's still a lot of subdirs not
even using automake, so while i can kill most configure scripts, i prob won't
do them all, and i prob won't convert more to automake or non-recursive make.
the libgloss arches have a lot hairier logic in them that i don't care to try
to unpack, especially since i converted the dirs i most care about.

$ time (cd libgloss && autoreconf -I$PWD -I$PWD/.. -I$PWD/../config)
real    0m8.313s
user    0m5.015s
sys     0m0.259s

we can see that killing excessive configure scripts & recursive makes helps
with compilation times too.

$ time (cd build; ../configure --host=bfin-elf; make -j4)
real    0m28.831s
user    0m34.828s
sys     0m23.093s

generating autotools is now slightly faster than compiling 8 copies of
newlib+libgloss :).  not that i'm advocating for changing anything :P.
-mike



* Re: Question about autoreconf to regenerate configuration files
  2022-02-17  5:18           ` Mike Frysinger
@ 2022-02-17  6:56             ` Sebastian Huber
  2022-02-20  9:51             ` R. Diez
  1 sibling, 0 replies; 17+ messages in thread
From: Sebastian Huber @ 2022-02-17  6:56 UTC (permalink / raw)
  To: newlib

Hello Mike,

On 17/02/2022 06:18, Mike Frysinger wrote:
> we can see that killing excessive configure scripts & recursive makes helps
> with compilation times too.
> 
> $ time (cd build; ../configure --host=bfin-elf; make -j4)
> real    0m28.831s
> user    0m34.828s
> sys     0m23.093s
> 
> generating autotools is now slightly faster than compiling 8 copies of
> newlib+libgloss :).  not that i'm advocating for changing anything :P.

thanks a lot for cleaning up the build system. I always had trouble 
understanding how it works and your changes are really helpful (for example 
the removal of the libtool support). Speeding up the build is nice.

-- 
embedded brains GmbH
Herr Sebastian HUBER
Dornierstr. 4
82178 Puchheim
Germany
email: sebastian.huber@embedded-brains.de
phone: +49-89-18 94 741 - 16
fax:   +49-89-18 94 741 - 08

Court of registration: Amtsgericht München
Registration number: HRB 157899
Managing directors authorized to represent: Peter Rasmussen, Thomas Dörfler
Our privacy policy can be found here:
https://embedded-brains.de/datenschutzerklaerung/


* Re: Question about autoreconf to regenerate configuration files
  2022-02-17  5:18           ` Mike Frysinger
  2022-02-17  6:56             ` Sebastian Huber
@ 2022-02-20  9:51             ` R. Diez
  1 sibling, 0 replies; 17+ messages in thread
From: R. Diez @ 2022-02-20  9:51 UTC (permalink / raw)
  To: Mike Frysinger; +Cc: Newlib, Matthew Joyce, joel

> [...]
> things are looking up.  with all my pending changes, we have 1 configure script
> in newlib and no recursive makes.
> 
> $ time (cd newlib && autoreconf)
> real    0m8.740s


I am grateful that you are doing this work. The old build system was horrible.


> generating autotools is now slightly faster than compiling 8 copies of
> newlib+libgloss :).  not that i'm advocating for changing anything :P.


Well, that's a pity. If we are down to 16 seconds for both newlib and libgloss together, I would say that performance is no longer a good reason to check in all those Autotools files.

Regards,
   rdiez


end of thread

Thread overview: 17+ messages
2022-01-21 12:32 Question about autoreconf to regenerate configuration files Matthew Joyce
2022-01-21 13:44 ` Corinna Vinschen
2022-01-21 15:02 ` R. Diez
2022-01-21 15:37   ` Joel Sherrill
2022-01-21 16:09     ` R. Diez
2022-01-21 22:09       ` Mike Frysinger
2022-01-21 23:08         ` Joel Sherrill
2022-01-22 21:20         ` R. Diez
2022-01-23  0:17           ` Joel Sherrill
2022-01-23 16:57             ` R. Diez
2022-01-26 10:19               ` Mike Frysinger
2022-01-30 22:22                 ` R. Diez
2022-01-23  7:29           ` Mike Frysinger
2022-01-26 10:02         ` Mike Frysinger
2022-02-17  5:18           ` Mike Frysinger
2022-02-17  6:56             ` Sebastian Huber
2022-02-20  9:51             ` R. Diez
