Florian Weimer writes:

> * Jason Merrill:
>
>> On Fri, May 12, 2023 at 11:03 AM Florian Weimer wrote:
>>>
>>> * Joseph Myers:
>>>
>>> > On Wed, 10 May 2023, Eli Zaretskii via Gcc wrote:
>>> >
>>> >> That is not the case we are discussing, AFAIU. Or at least no one has
>>> >> yet explained why accepting those old K&R programs will adversely
>>> >> affect the ability of GCC to compile C2x programs.
>>> >
>>> > At block scope,
>>> >
>>> > auto x = 1.5;
>>> >
>>> > declares x to have type double in C2x (C++-style auto), but type int in
>>> > C89 (and is invalid for versions in between). In this case, there is an
>>> > incompatible semantic change between implicit int and C++-style auto.
>>> > Giving an error before we make -std=gnu2x the default seems like a
>>> > particularly good idea, to further alert anyone who has been ignoring the
>>> > warnings about implicit int that semantics will change incompatibly.
>>>
>>> Obviously makes sense to me.
>>
>> Agreed. But we could safely continue to accept
>>
>> static x = 42;
>>
>> or even
>>
>> auto x = 42; // meaning of 'auto' changes, meaning of the declaration does not
>>
>> We might make -Wimplicit-int an error by default only if the
>> initializer has a type other than 'int'.
>
> Based on what I saw fixing Fedora, these cases are not very common.
> Sure, sometimes a common program such as valgrind has an instance,
> but that's really an exception.
>
> Implicit int is common as the return type of main (especially in
> autoconf tests), and due to a missing declaration list entry in an
> old-style function definition. The main case could be treated as an
> exception. The old-style function definition case is a common source
> of bugs and therefore worth fixing. The addition of unnamed function
> parameters as an extension actually created a new class of bugs here
> (a typo in the type name of a single unnamed parameter results in an
> old-style function definition by accident).
>
>>> > In cases where the standard requires a diagnostic, some are errors, some
>>> > are pedwarns-by-default or unconditional pedwarns, some are
>>> > pedwarns-if-pedantic - the choice depending on how suspicious the
>>> > construct in question is and whether it corresponds to a meaningful
>>> > extension (this is not making an automatic choice for every such situation
>>> > in the standard, it's a case-by-case judgement by maintainers). By now,
>>> > the cases discussed in this thread are sufficiently suspicious -
>>> > sufficiently likely to result in unintended execution at runtime (not, of
>>> > course, reliably detected because programs with such dodgy code are very
>>> > unlikely to have thorough automated tests covering all their code) - that
>>> > it is in the interests of users for them to be errors by default (for C99
>>> > and later modes, in the cases that were valid in C89).
>>>
>>> Just to recap, those are controlled by
>>> -Wimplicit-function-declaration, -Wimplicit-int, -Wint-conversion, and
>>> -Wincompatible-pointer-types, roughly in increasing order of
>>> compatibility impact with old sources.
>>
>> What would the impact be of making -Wint-conversion an error by
>> default only if the types are different sizes?
>
> From a distribution perspective, it does not change anything because
> we build everything on 64-bit anyway. Unlike e.g. Fedora, Debian
> doesn't require all builds to succeed before the new package can be
> installed, but given that the primary targets are 64 bit, I don't
> think a restricted -Wint-conversion error would be much of a
> simplification. The target-dependent nature of the warning is
> probably more confusing.

I don't see us really gaining anything from restricting it. Like you
said, the cases in the wild are actually all of the same "class".
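
To make the size point concrete, a rough sketch of the sort of case a
"different sizes only" rule would split on; the function names here
are made up and the misdeclaration is deliberate:

  /* Legacy interface: declared as returning int even though the real
     definition elsewhere hands back a pointer.  */
  int legacy_alloc(void);

  char *grab_buffer(void)
  {
      /* -Wint-conversion: initializing 'char *' from 'int'.  On an
         ILP32 target, int and the pointer have the same width, so a
         size-based rule would keep this a warning; on an LP64 target
         the pointer is wider, and the same line would become a hard
         error.  */
      char *p = legacy_alloc();
      return p;
  }

So the stricter behaviour would only show up on the targets we already
build everything on, and the diagnostics would differ between 32-bit
and 64-bit builds of the same package.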
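
And going back to the old-style definition point earlier in your
message, that bug class is easy to demonstrate; a minimal sketch, with
a made-up function name and "boool" as the deliberate typo for "bool":

  /* Intended: void set_flag(bool) { ... } with a single unnamed
     parameter (the unnamed-parameter extension mentioned above).
     With the type name misspelled, pre-C2x modes do not reject this:
     it parses as an old-style (K&R) definition whose parameter is
     *named* boool and whose type defaults to int, so the only default
     diagnostic comes from -Wimplicit-int.  */
  void set_flag(boool)
  {
      (void)boool;
  }

  int main(void)
  {
      set_flag(1);
      return 0;
  }

Before the switch to errors, that compiles with just a warning, and
the parameter silently ends up with a type the author never wrote.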