From mboxrd@z Thu Jan 1 00:00:00 1970
Return-Path:
Received: (qmail 13400 invoked by alias); 5 Apr 2004 13:31:38 -0000
Mailing-List: contact gcc-bugs-help@gcc.gnu.org; run by ezmlm
Precedence: bulk
List-Archive:
List-Post:
List-Help:
Sender: gcc-bugs-owner@gcc.gnu.org
Received: (qmail 13278 invoked by uid 48); 5 Apr 2004 13:31:33 -0000
Date: Mon, 05 Apr 2004 13:31:00 -0000
Message-ID: <20040405133133.13277.qmail@sources.redhat.com>
From: "uros at kss-loka dot si"
To: gcc-bugs@gcc.gnu.org
In-Reply-To: <20040220113842.14224.terpstra@ito.tu-darmstadt.de>
References: <20040220113842.14224.terpstra@ito.tu-darmstadt.de>
Reply-To: gcc-bugzilla@gcc.gnu.org
Subject: [Bug target/14224] GCC generates pessimizes code for integer division
X-Bugzilla-Reason: CC
X-SW-Source: 2004-04/txt/msg00409.txt.bz2
List-Id:

------- Additional Comments From uros at kss-loka dot si  2004-04-05 13:31 -------

The problem here is that the maximum quotient can be at most 2^32-1, because it has to fit in the 32-bit EAX register. Consider the case where the dividend is 0x0000 0001 0000 0000 and the divisor is 0x0000 0001: the quotient does not fit in 32 bits, and the division raises a #DE exception.

In your case the divisor is 0xC000 0001, so with your assembly a #DE exception is raised whenever x*y is greater than or equal to 0xC000 0001 0000 0000 (= 13835058059577131008).

I suggest marking this bug as invalid, because it is not possible for gcc to know the maximum value of (x*y).

--

http://gcc.gnu.org/bugzilla/show_bug.cgi?id=14224
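
To make the register constraint concrete, here is a minimal sketch in GNU C inline assembly of the 64-bit-by-32-bit divide being discussed. It is not taken from the original report; the helper name div64_by_32 and the test values are made up for illustration, and it assumes an x86/x86-64 target compiled with GCC.

    /* Sketch (x86/x86-64, GNU C): "divl" divides the 64-bit value in
     * EDX:EAX by a 32-bit operand.  If the quotient does not fit in the
     * 32-bit EAX register, the CPU raises #DE (delivered as SIGFPE on
     * Linux).  Names and values here are illustrative only. */
    #include <stdint.h>
    #include <stdio.h>

    /* Divide a 64-bit dividend by a 32-bit divisor with a single divl.
     * Safe only when the quotient is known to fit in 32 bits. */
    static uint32_t div64_by_32(uint64_t dividend, uint32_t divisor)
    {
        uint32_t quotient, remainder;
        __asm__ ("divl %4"
                 : "=a" (quotient), "=d" (remainder)
                 : "a" ((uint32_t) dividend),         /* low 32 bits -> EAX  */
                   "d" ((uint32_t) (dividend >> 32)), /* high 32 bits -> EDX */
                   "rm" (divisor));
        return quotient;
    }

    int main(void)
    {
        uint32_t x = 1000000000u, y = 7u;

        /* x*y = 7000000000; quotient 7000000000 / 0xC0000001 = 2 fits in EAX. */
        printf("%u\n", div64_by_32((uint64_t) x * y, 0xC0000001u));

        /* Dividend 0x0000000100000000, divisor 0x00000001: the quotient
         * would be 2^32, which does not fit in EAX, so divl raises #DE.
         * Uncomment to see the trap (SIGFPE on Linux):
         * printf("%u\n", div64_by_32(0x100000000ull, 1u));
         */
        return 0;
    }

This is exactly why the compiler cannot emit a single divl for a general (uint64_t)(x*y) / constant expression: it would have to prove that the quotient always fits in 32 bits, which it cannot do without knowing the range of x*y.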