From mboxrd@z Thu Jan 1 00:00:00 1970
From: dewar@gnat.com
To: pfk@fuchs.offl.uni-jena.de, zack@codesourcery.com
Cc: gcc@gcc.gnu.org
Subject: Re: Proposal
Date: Wed, 19 Sep 2001 00:06:00 -0000
Message-id: <20010919070548.B0DC9F2B64@nile.gnat.com>
X-SW-Source: 2001-09/msg00749.html

In Ada, underscores can only be used to separate digits, and the
extension is thus a simple one from a syntactic definition point of view.
The actual excerpt from the Ada grammar is as follows:

    2  decimal_literal ::= numeral [.numeral] [exponent]
    3  numeral ::= digit {[underline] digit}
    4  exponent ::= E [+] numeral | E - numeral

I don't see any language or definition issue in introducing exactly this
same restricted form into C.

    16777216_UL
    16777216U_L
    0._1234
    0x_1234
    12.34_e+56
    12.34e_+56

All these should be illegal if the above approach is followed, and I think
that is the right choice. The only legitimate use of the underscore is to
separate digits in a long sequence of digits.