public inbox for fortran@gcc.gnu.org
From: Richard Biener <richard.guenther@gmail.com>
To: mckinstry@debian.org
Cc: Toon Moene <toon@moene.org>,
	Jerry DeLisle <jvdelisle@charter.net>,
		Damian Rouson <damian@sourceryinstitute.org>,
	Thomas Koenig <tkoenig@netcologne.de>,
		"Stubbs, Andrew" <ams@codesourcery.com>,
	Janne Blomqvist <blomqvist.janne@gmail.com>,
		GCC Patches <gcc-patches@gcc.gnu.org>,
	"fortran@gcc.gnu.org" <fortran@gcc.gnu.org>
Subject: Re: OpenCoarrays integration with gfortran
Date: Thu, 27 Sep 2018 12:29:00 -0000	[thread overview]
Message-ID: <CAFiYyc32WpRH2TZiOyLXN7zzqo=4fKtw3-JUUZHJaBa0-daTxQ@mail.gmail.com>
In-Reply-To: <b320042f-33ac-b683-ac66-3e5e146b77e0@debian.org>

On Mon, Sep 24, 2018 at 12:58 PM Alastair McKinstry
<mckinstry@debian.org> wrote:
>
>
> On 23/09/2018 10:46, Toon Moene wrote:
> > On 09/22/2018 01:23 AM, Jerry DeLisle wrote:
> >
> > I just installed opencoarrays on my system at home (Debian Testing):
> >
> > root@moene:~# apt-get install libcoarrays-openmpi-dev
> > ...
> > Setting up libcaf-openmpi-3:amd64 (2.2.0-3) ...
> > Setting up libcoarrays-openmpi-dev:amd64 (2.2.0-3) ...
> > Processing triggers for libc-bin (2.27-6) ...
> >
> > [ previously this led to apt errors, but not now. ]
> >
> > and moved my own installation of the OpenCoarrays-2.2.0.tar.gz out of
> > the way:
> >
> > toon@moene:~$ ls -ld *pen*
> > drwxr-xr-x 6 toon toon 4096 Aug 10 16:01 OpenCoarrays-2.2.0.opzij
> > drwxr-xr-x 8 toon toon 4096 Sep 15 11:26 opencoarrays-build.opzij
> > drwxr-xr-x 6 toon toon 4096 Sep 15 11:26 opencoarrays.opzij
> >
> > and recompiled my stuff:
> >
> > gfortran -g -fbacktrace -fcoarray=lib random-weather.f90
> > -L/usr/lib/x86_64-linux-gnu/open-coarrays/openmpi/lib -lcaf_mpi
> >
> > [ Yes, the location of the libs is quite experimental, but OK for the
> > "Testing" variant of Debian ... ]
> >
> > I couldn't find cafrun, but mpirun works just fine:
> >
> > toon@moene:~/src$ echo ' &config /' | mpirun --oversubscribe --bind-to
> > none -np 20 ./a.out
> > Decomposition information on image    7 is    4 *    5 slabs with   23
> > *   18 grid cells on this image.
> > Decomposition information on image    6 is    4 *    5 slabs with   23
> > *   18 grid cells on this image.
> > Decomposition information on image   11 is    4 *    5 slabs with   23
> > *   18 grid cells on this image.
> > Decomposition information on image   15 is    4 *    5 slabs with   23
> > *   18 grid cells on this image.
> > Decomposition information on image    1 is    4 *    5 slabs with   23
> > *   18 grid cells on this image.
> > Decomposition information on image   13 is    4 *    5 slabs with   23
> > *   18 grid cells on this image.
> > Decomposition information on image   12 is    4 *    5 slabs with   21
> > *   18 grid cells on this image.
> > Decomposition information on image   20 is    4 *    5 slabs with   21
> > *   18 grid cells on this image.
> > Decomposition information on image    9 is    4 *    5 slabs with   23
> > *   18 grid cells on this image.
> > Decomposition information on image   14 is    4 *    5 slabs with   23
> > *   18 grid cells on this image.
> > Decomposition information on image   16 is    4 *    5 slabs with   21
> > *   18 grid cells on this image.
> > Decomposition information on image   17 is    4 *    5 slabs with   23
> > *   18 grid cells on this image.
> > Decomposition information on image   18 is    4 *    5 slabs with   23
> > *   18 grid cells on this image.
> > Decomposition information on image    2 is    4 *    5 slabs with   23
> > *   18 grid cells on this image.
> > Decomposition information on image    4 is    4 *    5 slabs with   21
> > *   18 grid cells on this image.
> > Decomposition information on image    5 is    4 *    5 slabs with   23
> > *   18 grid cells on this image.
> > Decomposition information on image    3 is    4 *    5 slabs with   23
> > *   18 grid cells on this image.
> > Decomposition information on image    8 is    4 *    5 slabs with   21
> > *   18 grid cells on this image.
> > Decomposition information on image   10 is    4 *    5 slabs with   23
> > *   18 grid cells on this image.
> > Decomposition information on image   19 is    4 *    5 slabs with   23
> > *   18 grid cells on this image.
> >
> > ... etc. (see http://moene.org/~toon/random-weather.f90).
> >
> > I presume other Linux distributors will follow shortly (this *is*
> > Debian Testing, which can be a bit testy at times - but I have trusted
> > my main business at home to it for over 15 years now).
> >
> > Kind regards,
> >
> Thanks, good to see it being tested (I'm the Debian/Ubuntu packager).
>
> caf/cafrun have been dropped (for the moment?) in favour of mpirun,
> but I've added pkg-config caf packages so that becomes an option.
>
>     $ pkg-config caf-mpich --libs
>
>     -L/usr/lib/x86_64-linux-gnu/open-coarrays/mpich/lib -lcaf_mpich -Wl,-z,relro -lmpich -lm -lbacktrace -lpthread -lrt
>
> (My thinking is that, for libraries in particular, the user need not know
> whether CAF is being used; if lib foobar uses CAF, then adding a:
>
>      Requires: caf
>
> line to the pkg-config file gives you the correct linking transparently.)
>
> The "strange" paths are due to Debian's multiarch: it is possible to
> install libraries for multiple architectures simultaneously. This works
> OK with pkg-config, cmake, etc. (which allow you to set
> PKG_CONFIG_PATH and have multiple pkgconfig files for different libs
> simultaneously), but it currently breaks wrappers such as caf / cafrun.
>
> I can add a new package for caf / cafrun but would rather not. (We
> currently don't do non-MPI CAF builds.)
>
> There are currently pkg-config files 'caf-mpich' and 'caf-openmpi' for
> testing, and I'm adding a default alias caf -> caf-$(default-MPI)

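The `Requires: caf` idea quoted above can be sketched as a hypothetical pkg-config file for a library that uses CAF internally; the `foobar` name, paths, and version below are illustrative only, not an actual Debian package:

```
# foobar.pc -- hypothetical pkg-config file for a library "foobar"
# that uses coarrays internally (name and paths are illustrative)
prefix=/usr
libdir=${prefix}/lib/x86_64-linux-gnu
includedir=${prefix}/include

Name: foobar
Description: Example library that uses CAF internally
Version: 1.0
Requires: caf
Libs: -L${libdir} -lfoobar
Cflags: -I${includedir}
```

With this in place, `pkg-config foobar --libs` would pull in the caf link flags transitively, so users of foobar link correctly without ever naming libcaf_mpi themselves.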
So I've tried packaging of OpenCoarrays for SUSE and noticed a few things:

 - caf by default links libcaf_mpi statically (why?)
 - the build system makes the libcaf_mpi SONAME dependent on the compiler
   version(?); I once got libcaf_mpi2 and once libcaf_mpi3 (gcc7 vs. gcc8)

Different SONAMEs definitely make packaging difficult.  Of course, given
the first point, I may very well elide the shared library altogether...?

Other than that it seems to "work" (OBS home:rguenther/OpenCoarrays).

Richard.

> regards
>
> Alastair
>
>
>
>
> --
> Alastair McKinstry, <alastair@sceal.ie>, <mckinstry@debian.org>, https://diaspora.sceal.ie/u/amckinstry
> Misentropy: doubting that the Universe is becoming more disordered.
>


Thread overview: 25+ messages
     [not found] <024e798b9539b765a1259cfc9cb2f1dc480b24ca.1536144068.git.ams@codesourcery.com>
2018-09-05 16:54 ` Fwd: [PATCH 08/25] Fix co-array allocation Toon Moene
2018-09-05 17:02   ` Bernhard Reutner-Fischer
2018-09-05 18:07   ` Janne Blomqvist
2018-09-19 16:24     ` Andrew Stubbs
2018-09-19 21:18       ` Damian Rouson
2018-09-19 22:30         ` Andrew Stubbs
2018-09-19 23:09           ` Damian Rouson
2018-09-20 20:02         ` Thomas Koenig
2018-09-20 20:56           ` Damian Rouson
2018-09-21  7:33           ` Toon Moene
2018-09-23 11:40             ` Janne Blomqvist
2018-09-21 16:25           ` OpenCoarrays integration with gfortran Jerry DeLisle
2018-09-21 19:13             ` Janne Blomqvist
2018-09-21 19:37             ` Richard Biener
2018-09-21 20:17             ` Damian Rouson
2018-09-21 23:23               ` Jerry DeLisle
2018-09-23  9:47                 ` Toon Moene
2018-09-23 16:48                   ` Bernhard Reutner-Fischer
2018-09-23 19:17                     ` Toon Moene
2018-09-23 20:19                       ` Bernhard Reutner-Fischer
2018-09-24 10:58                   ` Alastair McKinstry
2018-09-27 12:29                     ` Richard Biener [this message]
2018-09-27 13:32                       ` Jorge D'Elia
2018-09-20 15:56       ` [PATCH 08/25] Fix co-array allocation Janne Blomqvist
2018-09-20 16:23         ` Andrew Stubbs
