From mboxrd@z Thu Jan 1 00:00:00 1970
Return-Path:
Received: (qmail 32447 invoked by alias); 22 Sep 2005 07:01:45 -0000
Mailing-List: contact gsl-discuss-help@sources.redhat.com; run by ezmlm
Precedence: bulk
List-Subscribe:
List-Archive:
List-Post:
List-Help: ,
Sender: gsl-discuss-owner@sources.redhat.com
Received: (qmail 31539 invoked by uid 22791); 22 Sep 2005 07:00:39 -0000
Received: from ppp-217-133-14-75.cust-adsl.tiscali.it (HELO server.enernova.it) (217.133.14.75) by sourceware.org (qpsmtpd/0.30-dev) with ESMTP; Thu, 22 Sep 2005 07:00:39 +0000
Received: from localhost (localhost [127.0.0.1]) by server.enernova.it (Postfix) with ESMTP id 35E551ED7E; Thu, 22 Sep 2005 09:01:50 +0200 (CEST)
Received: from server.enernova.it ([127.0.0.1]) by localhost (server.enernova.it [127.0.0.1]) (amavisd-new, port 10024) with ESMTP id 13775-07; Thu, 22 Sep 2005 09:01:48 +0200 (CEST)
Received: from localhost.localdomain (unknown [192.168.0.3]) by server.enernova.it (Postfix) with ESMTP id 43D071ED78; Thu, 22 Sep 2005 09:01:46 +0200 (CEST)
To: James Bergstra
Cc: gsl-discuss@sources.redhat.com
Subject: Re: question on GSL development
References: <20050920151713.GA27057@chopin.iro.umontreal.ca>
From: Francisco Yepes Barrera
Date: Thu, 22 Sep 2005 07:01:00 -0000
In-Reply-To: <20050920151713.GA27057@chopin.iro.umontreal.ca>
Message-ID:
User-Agent: Gnus/5.09 (Gnus v5.9.0) Emacs/21.4
MIME-Version: 1.0
Content-Type: text/plain; charset=us-ascii
X-SW-Source: 2005-q3/txt/msg00098.txt.bz2

James Bergstra writes:

> I am working on an extension to facilitate building and training
> neural networks (among other things).
>
> I have some code in cvs at savannah under the project name "Montreal
> Scientific Library" for designing neural networks.
>
> Your comments on my approach would be greatly appreciated!

Thanks, James. I've downloaded the code.

> I was thinking about an approach for coding genetic algorithms, and I
> concluded (IMHO!) that the cleanest way to provide generic tools for
> solving GA problems and other problems in combinatorial optimization
> would be to establish a framework for optimizing a function on a
> *tensor*, the way the gsl_multimin_* routines optimize a function on a
> vector space.
>
> Gibbs sampling would be one algorithm for this, a GA with given
> recombination policies would be another, dynamic programming another,
> and gradient-descent algorithms could be used too, when the values of
> the tensor elements are highly correlated in neighbourhoods.
>
> Maybe you, or someone else, would like to comment on these ideas,
> especially if you have some background in combinatorial optimization :)
>
> James

Can you give some more detail on this approach? I have worked mainly on
steady-state GAs, used to solve combinatorial problems in computational
chemistry.

Paco