From mboxrd@z Thu Jan 1 00:00:00 1970
Return-Path:
Received: (qmail 30578 invoked by alias); 19 Aug 2004 21:31:05 -0000
Mailing-List: contact gcc-bugs-help@gcc.gnu.org; run by ezmlm
Precedence: bulk
List-Archive:
List-Post:
List-Help:
Sender: gcc-bugs-owner@gcc.gnu.org
Received: (qmail 30565 invoked by uid 48); 19 Aug 2004 21:31:04 -0000
Date: Thu, 19 Aug 2004 21:31:00 -0000
Message-ID: <20040819213104.30564.qmail@sourceware.org>
From: "mmitchel at gcc dot gnu dot org"
To: gcc-bugs@gcc.gnu.org
In-Reply-To: <20040723171525.16693.paulg@chiark.greenend.org.uk>
References: <20040723171525.16693.paulg@chiark.greenend.org.uk>
Reply-To: gcc-bugzilla@gcc.gnu.org
Subject: [Bug middle-end/16693] [3.4/3.5 regression] Bitwise AND is lost when used within a cast to an enum of the same precision
X-Bugzilla-Reason: CC
X-SW-Source: 2004-08/txt/msg02014.txt.bz2
List-Id:

------- Additional Comments From mmitchel at gcc dot gnu dot org  2004-08-19 21:31 -------
This is a duplicate of some other PR -- I'm just not sure which.

In C++, the compiler can assume that there will be no values of the enum
greater than 0xf, given the declaration.  That is why the compiler omits
the bitwise-and.

--

           What|Removed                     |Added
----------------------------------------------------------------------------
         Status|NEW                         |RESOLVED
     Resolution|                            |INVALID


http://gcc.gnu.org/bugzilla/show_bug.cgi?id=16693