From: dewar@gnat.com (Robert Dewar)
To: aoliva@redhat.com, dewar@gnat.com
Cc: gcc@gcc.gnu.org, rth@redhat.com, toon@moene.indiv.nluug.nl
Subject: Re: possible gcse failure: not able to eliminate redundant loads
Date: Sat, 21 Dec 2002 20:36:00 -0000
Message-Id: <20021221193853.CE494F2941@nile.gnat.com>

> At the expense of getting nice crashes at points where NULL is
> dereferenced, which often makes it easy to find bugs? Oh, and there's

Umm ... optimization is about favoring fast execution of code over
debuggability, so yes, that is exactly the trade-off. Note that there is
absolutely NOTHING to stop a compiler from generating code that diagnoses
dereferences of null pointers in fully optimized code if that's what you
want (it is, for example, required in Ada!).
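
As a rough illustration (not part of the original message), the kind of
check a compiler could insert before each dereference might look like the
C sketch below; the helper name and diagnostic text are invented for this
example, and an Ada compiler would emit an equivalent test that raises
Constraint_Error on a null access value:

    /* Sketch of a compiler-inserted null check; names are illustrative. */
    #include <stdio.h>
    #include <stdlib.h>

    static int load_checked(const int *p, const char *where)
    {
        if (p == NULL) {                       /* inserted test           */
            fprintf(stderr, "null dereference at %s\n", where);
            abort();                           /* diagnose, not UB        */
        }
        return *p;                             /* the original load       */
    }

    int main(void)
    {
        int *q = NULL;
        return load_checked(q, "main:q");      /* aborts with a message   */
    }

Because the test is ordinary code, an optimizer must keep it (and the
diagnosis it provides) even at full optimization, unless it can prove the
pointer is never null at that point.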