Subject: Re: Testing GCC & OSDL
From: Laurent GUERBY
To: "Joseph S. Myers"
Cc: "Randy.Dunlap", Hans-Peter Nilsson, gcc@gcc.gnu.org
Date: Fri, 17 Sep 2004 12:08:00 -0000

On Fri, 2004-09-17 at 10:43, Joseph S. Myers wrote:
> A regression tester that tests every mainline commit rather than batching
> them would be feasible on a small cluster of machines (about 30 commits a
> day to mainline, a bit more if you want to test release branches as well),

BTW, is there a script that assumes a local CVS repository (rsync'ed) and
gives you the list of CVS dates in between commits?
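The date-clustering half could be sketched like this (a hypothetical
`checkout_dates` helper; it assumes the per-file revision timestamps have
already been extracted, e.g. by parsing "cvs log" output from the rsync'ed
repository):

```python
from datetime import timedelta

def checkout_dates(timestamps, window=timedelta(seconds=60)):
    """Group per-file CVS revision timestamps into commits and return
    one date per commit, suitable for "cvs co -D <date>".

    CVS has no changesets: one logical commit shows up as several
    per-file revisions a few seconds apart, so timestamps closer
    together than `window` are folded into the same commit.
    """
    groups = []                # [first, last] timestamp of each commit
    for t in sorted(set(timestamps)):
        if groups and t - groups[-1][1] <= window:
            groups[-1][1] = t  # same commit, extend its time range
        else:
            groups.append([t, t])
    # one second past each commit's last revision isolates that tree
    return [last + timedelta(seconds=1) for _first, last in groups]
```

The one-minute window is a guess; anything comfortably larger than the
slowest multi-file commit and smaller than the gap between commits works.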
(so that "cvs co -D X" for X in this list gives the interesting list of
sources)

> but keeping all the installed compilers from such a process would take
> about 9GB a day (and if you want them available long-term to identify the
> exact commit at which any newly discovered regression came in, this
> storage does need to be backed up).

A few choices can mitigate that, since we're just caching sequential builds:

1. Keep only one out of every N builds: you still speed up the binary
search, but have to finish it by building a few compilers yourself.

2. If you have N disks, install build number I on disk I modulo N; this
way, when you lose one disk, the situation only degrades a little.

3. Keep more of the recent (or recently needed) installs, since they're
more likely to be useful.

4. Compress and/or use binary deltas (a documentation, language, platform
or library patch will affect only one part of the installed files), but
you will have to uncompress before use (not much of a problem given the
CPU/disk performance ratio these days).

Laurent
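P.S. Choice 1 above amounts to a binary search over the sparse cache of
installed compilers; a minimal sketch (the `narrow_regression` name, the
good-first/bad-last precondition, and the `is_bad` callback are assumptions,
not an existing tool):

```python
def narrow_regression(n_builds, is_bad, keep_every):
    """Bisect over a sparse cache of installed compilers.

    Only builds 0, keep_every, 2*keep_every, ... are kept, so this
    narrows the first bad build down to a window of `keep_every`
    builds, which must then be rebuilt one by one to finish the hunt.
    Assumes build 0 is good and the last cached build is bad;
    `is_bad(i)` would, e.g., run the failing test case against the
    installed compiler for build i.
    """
    cached = list(range(0, n_builds, keep_every))
    lo, hi = 0, len(cached) - 1
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if is_bad(cached[mid]):
            hi = mid
        else:
            lo = mid
    # the first bad build lies in (cached[lo], cached[hi]]
    return cached[lo], cached[hi]
```

With about 30 commits a day and N = 5, the search stays logarithmic in the
number of cached builds and costs at most N - 1 fresh builds at the end.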