From mboxrd@z Thu Jan 1 00:00:00 1970
Received: (qmail 2709 invoked by alias); 2 May 2006 17:53:29 -0000
Received: (qmail 2247 invoked by uid 48); 2 May 2006 17:53:24 -0000
Date: Tue, 02 May 2006 17:53:00 -0000
Message-ID: <20060502175324.2246.qmail@sourceware.org>
X-Bugzilla-Reason: CC
Subject: [Bug tree-optimization/27394] double -> char conversion varies with optimization level
Reply-To: gcc-bugzilla@gcc.gnu.org
To: gcc-bugs@gcc.gnu.org
From: "amylaar at gcc dot gnu dot org"
Mailing-List: contact gcc-bugs-help@gcc.gnu.org; run by ezmlm
Precedence: bulk
Sender: gcc-bugs-owner@gcc.gnu.org
X-SW-Source: 2006-05/txt/msg00199.txt.bz2

------- Comment #1 from amylaar at gcc dot gnu dot org  2006-05-02 17:53 -------
In 3.x, double -> char/int conversion was done consistently with the
documented behaviour of integer -> signed integer type conversion:
http://gcc.gnu.org/onlinedocs/gcc-4.1.0/gcc/Integers-implementation.html#Integers-implementation

In 4.1, fold-const.c:fold_convert_const_int_from_real implements Java
semantics.  I think that, for consistency, when folding a floating-point
value to an integer type of smaller width than integer_type_node, it
should be converted to integer_type_node first, and from there to the
target type using the language-specific semantics.


-- 

amylaar at gcc dot gnu dot org changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
   Known to fail   |                            |4.1.0 4.2.0
   Known to work   |                            |3.2.3


http://gcc.gnu.org/bugzilla/show_bug.cgi?id=27394