Message-ID: <4AD7ACF7.9040503@sebabeach.org>
Date: Thu, 15 Oct 2009 23:15:00 -0000
From: Doug Evans
To: Dmitry Eremin-Solenikov
CC: cgen
Subject: Re: equals -> equal? broke format table building

Dmitry Eremin-Solenikov wrote:
> Hello,
>
> I've just tried the cgen-20091001 snapshot with my M68HC08 code, only to
> find that the generated code was a bit broken: when creating the iformat
> table, most of the instructions were glued together into one iformat entry.
>
> Reverting patches one by one led me to the equals -> equal? change by
> Doug Evans. Reverting that patch on top of the 20091001 snapshot let me
> generate working code again (with a correct iformat table).
>
> If you need any additional information (such as the .cpu file), I can
> easily provide it.
>
> Thanks.

The equals -> equal? change seems rather innocuous. If you could send me
your .cpu file, that would be great. Thanks.

[It would also help to include correct versions of the generated files.
Just send them to me, no need to cc the list if they're large files.]
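
[Editor's note: the failure mode described above -- most instructions being
glued into one iformat entry -- is what you'd expect if the predicate used to
deduplicate formats started answering "equal" too eagerly. The sketch below is
hypothetical Python, not cgen's actual Scheme code; the function and field
names are invented for illustration. It shows how swapping the comparison
predicate in a format-table builder changes how many entries survive.]

```python
# Hypothetical sketch of iformat-table deduplication (NOT cgen's real code).
# Each instruction carries a field layout: a list of (name, start, length)
# tuples. Instructions whose layouts compare equal share one iformat entry.

def build_iformat_table(insns, same_format):
    """Group instructions into format entries using the given predicate."""
    table = []
    for name, fields in insns:
        for entry in table:
            if same_format(entry["fields"], fields):
                entry["insns"].append(name)  # reuse the existing format
                break
        else:
            table.append({"fields": fields, "insns": [name]})
    return table

insns = [
    ("add", [("rd", 0, 4), ("rs", 4, 4)]),
    ("sub", [("rd", 0, 4), ("rs", 4, 4)]),   # same layout as "add"
    ("ldi", [("rd", 0, 4), ("imm", 4, 8)]),  # different layout
]

# Structural comparison (the behavior of Scheme's equal?) keeps the two
# distinct layouts apart:
good = build_iformat_table(insns, lambda a, b: a == b)

# A predicate that reports equality too readily (here: "same field count")
# glues every instruction into a single entry -- the symptom reported above:
bad = build_iformat_table(insns, lambda a, b: len(a) == len(b))
```

With the structural predicate the table keeps two entries (one per layout);
with the over-eager predicate all three instructions collapse into one, which
is why a subtle change in equality semantics can corrupt the generated table
even though the surrounding code is untouched.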