From mboxrd@z Thu Jan 1 00:00:00 1970
From: "muecker at gwdg dot de"
To: gcc-bugs@gcc.gnu.org
Subject: [Bug c/111808] [C23] constexpr with excess precision
Date: Wed, 18 Oct 2023 06:54:22 +0000

https://gcc.gnu.org/bugzilla/show_bug.cgi?id=111808

--- Comment #8 from Martin Uecker ---

There are certainly other similar portability issues, e.g.:

enum : long { X = 0xFFFFFFFFUL };

https://godbolt.org/z/hKsqPe9c1

BTW: Are there better examples where we have similar build failures also in
pre-C2X? (not counting explicit compile-time tests for sizes or limits) Most
simple C expressions do not seem to produce a hard error when switching
between 64-bit and 32-bit archs, e.g. 
exceeding the range in an initializer of an enum does not produce hard
errors without -pedantic-errors before C2X. That we now seem to have such
issues worries me a little bit.

In any case, I would argue that issues related to the size of integers are
much better understood by programmers, while excess precision is rather
obscure and also has many more implementation-defined degrees of freedom.
The behavior of integers is more or less fixed by their width, but with what
precision 1. / 3. is computed on any specific platform is not restricted.
The use of such a thing in a constexpr initializer then makes the program
inherently non-portable, and I do not believe programmers are aware of this.

Debugging such issues after the fact because a package fails to build on,
for example, 3 of 20 architectures in Debian is generally a huge pain.

On the other hand, maybe excess precision on i386 is obscure and i386 will
go away and we should not worry?